NASASpaceFlight.com Forum
General Discussion => New Physics for Space Technology => Topic started by: Mark7777777 on 07/07/2019 10:16 am

A new article has been posted in the Journal of Space Exploration regarding a Quantised Inertia FTL travel theory.
Author:
Michael Edward McCulloch
University of Plymouth, Plymouth, PL4 8AA, UK
https://www.tsijournals.com/articles/superluminaltravelfromquantisedinertia.pdf
Abstract
Special relativity predicts that the inertial mass of an object is infinite at the speed of light (c), causing zero acceleration and producing a cosmic speed limit. Here, a new model for inertia is presented that challenges this. The model (quantised inertia) assumes that inertia is caused by Unruh radiation made inhomogeneous in space by relativistic horizons. Quantised inertia is consistent with standard physics at normal accelerations, but predicts a new loss of inertia at very low accelerations, predicting galaxy rotation without dark matter and a minimum acceleration of 2c^2/Θ ~ 2 × 10^-10 m/s^2 (where Θ is the comoving Hubble diameter) which is equal to the cosmic acceleration and that persists even at the speed of light. This implies that the speed of light limit can be broken, albeit with this tiny acceleration, and that this relativity-proof acceleration could be boosted by setting up a causal horizon around the ship.
Keywords: Unruh radiation; Quantised inertia; Faster than light travel

acceleration of 2c^2/Θ ~ 2 × 10^-10 m/s^2 (where Θ is the comoving Hubble diameter) which is equal to the cosmic acceleration and that persists even at the speed of light. This implies that the speed of light limit can be broken, albeit with this tiny acceleration, and that this relativity-proof acceleration could be boosted by setting up a causal horizon around the ship.
Units matter.
At 2 × 10^-10 m/s^2, that is roughly 160 years to gain 1 m/s.
To change speed by 1% of c would take nearly half a billion years.
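For concreteness, a quick back-of-envelope check of those timescales (a rough sketch; Θ = 8.8e26 m is the cosmic diameter value McCulloch quotes elsewhere in this thread):

```python
# Order-of-magnitude check of the QI minimum acceleration a_min = 2c^2/Theta
# and how long it takes to change speed at that rate.
c = 299_792_458.0    # speed of light, m/s
theta = 8.8e26       # cosmic (Hubble) diameter, m (McCulloch's quoted value)

a_min = 2 * c**2 / theta          # ~2e-10 m/s^2
year = 365.25 * 24 * 3600.0       # seconds per Julian year

print(f"a_min                = {a_min:.2e} m/s^2")
print(f"time to gain 1 m/s   = {1.0 / a_min / year:.0f} years")
print(f"time to gain 0.01 c  = {0.01 * c / a_min / year:.2e} years")
```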

Now that there is a separate thread on this, I will list some of the many problems with McCulloch's theory (only some, because there are a ton)
The basic inertial mass equation in his theory divides by acceleration, which has obvious divide-by-zero problems. Moreover, for any acceleration less than 2*10^-10 m/s^2, it implies negative inertial mass.
Setting force to 0 in his equation 11 also brings up problems with this, causing the acceleration to be equal to a fixed 2*10^-10 m/s^2. This makes the inertial mass in his theory 0, which, by special relativity, means the particle must be moving at the speed of light.
I have not seen anything where he reconciles basic conservation laws with his claims. Considering that he claims to explain propellantless thrusters, which break conservation of energy and momentum, the answer seems to be that it doesn't.
On a similar note, general relativity is based on the principle that inertial and gravitational mass are the same. I have not seen any explanation of how to handle GR effects under McCulloch's theory, nor any demonstration that McCulloch's theory can replicate the standard tests of GR.
He says "The consequences of the FTL discussed here for causality are complex, and have not yet been considered." The consequences are actually quite simple: he does not modify the time dilation parts of special relativity, so this would simply break causality. Another thread on here has gone into the FTL causality problem in detail, and it basically boils down to needing to define a universal reference frame that FTL is restricted to be relative to. McCulloch's theory does not appear to contain any such thing.
McCulloch's theory has essentially already been falsified due to his many claims of things it is supposed to explain. Notably, this includes the Pioneer anomaly. He has an old blog post where he complains about the resolution of the Pioneer anomaly, but all he does in it is demonstrate his ignorance of thermal modelling.
Other predictions include the emDrive and Mach effect, which aren't looking so good (see relevant threads to discuss each of those.)
In general, McCulloch fails to provide numerical predictions to accompany his claims, which in itself is a red flag. One reason he doesn't provide numbers may be found in the next bullet.
McCulloch likes to claim his theory doesn't have adjustable parameters, but it contains one number which is not a fundamental constant: the cosmic diameter (why not radius?). Many of his claims are based on the undefined nature of how to calculate that parameter, and it seems he basically handwaves changes to it any time he needs a different result.
Also, as initially pointed out by others in the emDrive thread before this topic was split off, McCulloch's paper only addresses circular accelerators, not linear ones, and he has provided no numbers to support his claim on Twitter that the acceleration profile in linear accelerators is such that the effects of his theory would not be seen. In a linear accelerator, the acceleration is only in the direction of motion, so his previous calculations apply, which say that the acceleration would not stop and the speed of light would be exceeded. Even in the example in the paper, he does not do the math completely: the LHC beams are around 0.99999999 c (~3 m/s short of c), so the relativistic mass increase is huge, and even a small proportional decrease in mass could be significant. Also, he comes to his conclusions based on a Taylor series that is not a good approximation near c. I have attached a plot from Wolfram Alpha showing that the curved blue dotted line diverges significantly from the correct expression as v/c → 1. https://www.wolframalpha.com/input/?i=sqrt(1-x%5E2)+taylor+x%3D0
Also, as an example of how McCulloch changes constants in his theory in an arbitrary way: for something like the emDrive, and in his proposals to improve the FTL drive, shielding magically makes the divide-by-the-size-of-the-universe scale factor change to something helpful. He does not apply this same reasoning to particle accelerators, which are sealed metal vacuum tubes, because then it would be clear how wrong his theory is.
In summary: McCulloch's theory is completely inconsistent, and seems to me to be beyond hope of rescue.
Edit: typos, and fixed the link
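To make the first bullet concrete, here is a minimal sketch of the divide-by-zero / negative-mass behaviour, assuming the commonly quoted MiHsC inertial-mass form m_i = m(1 - 2c^2/(|a|Θ)) from McCulloch's earlier papers (the exact form of equation 11 may differ):

```python
# Sketch of the modified-inertia pathology described above, assuming
# the MiHsC form m_i = m * (1 - 2c^2 / (|a| * Theta)).
c = 299_792_458.0
theta = 8.8e26
a_min = 2 * c**2 / theta      # the QI minimum acceleration, ~2e-10 m/s^2

def qi_inertial_mass(m, a):
    """QI-modified inertial mass; diverges to -inf as |a| -> 0."""
    return m * (1.0 - a_min / abs(a))   # ZeroDivisionError at a = 0

print(qi_inertial_mass(1.0, 10.0 * a_min))   # above a_min: positive (~0.9)
print(qi_inertial_mass(1.0, 0.5 * a_min))    # below a_min: negative mass
```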

Reply to meberbs' critique. Note that these are brief replies (all I can give at the moment):
- The negative inertial mass would never be reached because as the QI-predicted inertia approaches zero the acceleration increases again. The result is an acceleration that asymptotes to 2x10^-10 m/s^2.
- Setting F=0 in equation 11 predicts the minimum acceleration of 2c^2/Theta, which is the main prediction of QI. It is due to the attraction of the cosmic horizon, is fueled by information (see below) and is supported empirically by the supernovae data (Perlmutter and Riess, 1999). It is the value of the observed cosmic acceleration.
- QI does not conserve mass-energy, but it does conserve mass-energy-information. In QI information can be converted to mass-energy and vice versa. This is directly related to Landauer's principle, which has been verified.
- I am not so worried about agreement with GR because it has been falsified. I.e., it has never predicted a single galactic rotation correctly. I am working on a paper using QI to predict the bending of star light round the sun.
- As I said, I have not considered causality yet. QI does not depend on reference frames because it uses accelerations. These are independent of ref frame.
- The Pioneer anomaly has not been falsified. The modelling of known anomalies with a complex model is a disease of modern physics. Dark matter is the same.
- 'Fails to provide numerical predictions'. Are you joking? I always make numerical predictions.
- The cosmic horizon is not an adjustable parameter. Sure it has an allowed range, but that is small and known.
- In QI it does not matter whether the acceleration is linear or circular. I performed the calculation for circular accelerators, but linear ones still accelerate at a huge rate so the effect of QI will be tiny.
- The Unruh waves for particles in accelerator tubes will not interact with the tubes in the same way as for the emdrive, because in the emdrive the microwaves are finely tuned so that the Unruh waves are exactly the right size to resonate in the cavity.
- In summary: QI is not complete and I am not claiming for sure that every anomaly I have looked at is valid, but QI predicts the more conclusive ones, e.g. galaxy rotation, in a simpler, more elegant manner than any other theory.

- The negative inertial mass would never be reached because as the QI-predicted inertia approaches zero the acceleration increases again. The result is an acceleration that asymptotes to 2x10^-10 m/s^2.
I don't think you meant to type that the acceleration "increases again", the asymptotic comment is the correct description of taking the limit of equation 11 as force approaches 0.
- Setting F=0 in equation 11 predicts the minimum acceleration of 2c^2/Theta, which is the main prediction of QI. It is due to the attraction of the cosmic horizon, is fueled by information (see below) and is supported empirically by the supernovae data (Perlmutter and Riess, 1999). It is the value of the observed cosmic acceleration.
You are missing the point: acceleration in what direction? If there is a force, the acceleration is in the direction of the force; if there is none, then there is no definition in your theory for the direction of acceleration. The accelerating expansion of the universe is an entirely different type of phenomenon than the local acceleration of a given mass that the equation you used describes.
- QI does not conserve mass-energy, but it does conserve mass-energy-information. In QI information can be converted to mass-energy and vice versa. This is directly related to Landauer's principle, which has been verified.
If that is the case, please provide a rigorous explanation of how that works. (It seems to me this would not work for various reasons, but rather than speculating I should see what the details are of your explanation, in case you did find some way that I can't think of.) This only covers energy conservation, which is tied to time symmetry in Noether's theorem; you also need something to cover momentum conservation, which your theory also clearly violates.
- I am not so worried about agreement with GR because it has been falsified. I.e., it has never predicted a single galactic rotation correctly. I am working on a paper using QI to predict the bending of star light round the sun.
GR has not been falsified; your assertions that it has are part of what quickly loses you credibility when you talk to other scientists. GR predicts galactic rotation curves just fine under Lambda-CDM, which is a model that has been tested, including things such as the evolution of the universe over time to form the galaxies we currently observe. Also, before you mention it, wide binaries have previously been discussed on this site. A recent paper shows that the data is actually consistent with GR; the pattern in the data is related to projection effects.
Anyway, as I said, you confirm here that you have not actually compared your theory with any of the standard tests of GR yet. The various claims you make promoting your theory make it sound like it has wider applicability than GR, but in actuality you have not shown it matches any of the experimental tests of GR. I (and probably most other people with a relevant background) find this strange, because if someone comes up with a new theory of gravity, the very first question is whether it can predict the results of the various experiments confirming general relativity. Since your theory contradicts the most fundamental assumption in GR, this is even more important, because it is difficult to see how such a radically different theory can replicate all of the successes of GR.
- As I said, I have not considered causality yet. QI does not depend on reference frames because it uses accelerations. These are independent of ref frame.
Your statement about being independent of reference frame is the basis of why I said it is simple: Your theory breaks causality.
Your claim of frame independence actually brings up one of the issues I didn't address before:
acceleration on its own is not a relativistic invariant; you need to consider the four-acceleration, which you do not address in your paper.
- The Pioneer anomaly has not been falsified. The modelling of known anomalies with a complex model is a disease of modern physics. Dark matter is the same.
The Pioneer anomaly is not resolved with complex physics, but with simple physics which has been known and tested for a long time: the momentum carried by electromagnetic waves, and the T^4 power law of black-body radiation.
I previously read a blog post from you on the Pioneer anomaly, and you basically had 2 points:
1. The thermal model had a couple of adjustable parameters correlated to the spacecraft data.
This is standard practice in thermal modelling of spacecraft: some details of reality are not going to be accurately predicted by the model, so you adjust these to match the as-measured temperatures. Since these are correlated with temperature, not with the thrust, this in no way invalidates their predictions. It actually supports the validity.
2. You complain about there being "thousands" of finite elements, which only shows that you don't understand the basics of this well-used and effective modelling technique.
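For what it's worth, an order-of-magnitude estimate shows why thermal recoil is a plausible explanation. The power and anisotropy numbers below are illustrative assumptions on my part, not the values from the published thermal model:

```python
# Rough photon-recoil estimate for the Pioneer anomaly (~8.7e-10 m/s^2).
# A few percent of the craft's waste heat radiated preferentially in one
# direction is enough to produce an acceleration of the right size.
c = 299_792_458.0
m = 258.0          # approximate Pioneer 10 mass, kg
P_waste = 2000.0   # RTG + electronics waste heat, W (illustrative)
f_aniso = 0.035    # net fore-aft emission asymmetry (illustrative)

a_recoil = f_aniso * P_waste / (m * c)   # radiation carries momentum P/c
print(f"a_recoil ~ {a_recoil:.1e} m/s^2")
```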
- 'Fails to provide numerical predictions'. Are you joking? I always make numerical predictions.
You mean like the predictions you have provided for the upcoming tests of your theory?
https://twitter.com/memcculloch/status/1127923530799222785
Actually, I couldn't find those predictions, and your tweet is backwards anyway. If you make a prediction for both, then unless both produce thrust consistent with your predictions, your theory is wrong. Claiming you are validated if either produces any thrust is a form of cherry-picking.
Let's look at a previous list of "predictions" you made:
http://physicsfromtheedge.blogspot.com/2016/04/predictions-of-mihsc.html
1, 4, 5, 6, 7, 11, 12, 15, 16 don't have numbers. That is more than half. You can argue that there are reasons for some of them, but even predictions like numbers 2 and 3, which do have an acceleration in them, are actually incredibly generic, describing a qualitative effect with no real-world examples provided.
- The cosmic horizon is not an adjustable parameter. Sure it has an allowed range, but that is small and known.
You hypothesize changes to it all the time when you want a different result. When you compare the size of the observable universe and the speed of light, it is not surprising when you end up with a number related to the rate of expansion of the universe. Please point to a rigorous derivation of why the size of the universe is the relevant number; you ignored the question in my post on why it is the diameter rather than the radius.
- In QI it does not matter whether the acceleration is linear or circular. I performed the calculation for circular accelerators, but linear ones still accelerate at a huge rate so the effect of QI will be tiny.
Again, you provide no numbers to support your claim even after being directly prompted for them. Towards the end of the accelerator, the inertial mass would increase reducing the acceleration.
- The Unruh waves for particles in accelerator tubes will not interact with the tubes in the same way as for the emdrive, because in the emdrive the microwaves are finely tuned so that the Unruh waves are exactly the right size to resonate in the cavity.
Take a look at bullet point number 11 from your post I linked to. You specifically claimed the waves in an accelerator would be shortened, which would increase the effect of your theory. Again, you are just picking and choosing assumptions to fit whatever suits you at the moment.
- In summary: QI is not complete and I am not claiming for sure that every anomaly I have looked at is valid, but QI predicts the more conclusive ones, e.g. galaxy rotation, in a simpler, more elegant manner than any other theory.
No physics theory is complete; we do not have a theory of everything. However, you claim to be replacing GR, and have yet to successfully replicate a single one of its predictions. Your theory currently does not appear to be fully rigorous or consistent, so calling it "simpler" is misleading. Not everything in the universe is simple, and insisting on a simple explanation of inherently complicated things just means being wrong. There is nothing elegant about breaking causality, which is what FTL means unless a universal reference frame is defined.
Also, I am cautious about bringing this up, since I don't want you to take this as an invitation to talk about some of the things you say on social media, which are counterproductive to having a reasoned conversation. I did some basic research on the journal you published papers in. A known problem with modern scientific publishing is that there are a large number of predatory journals out there that take advantage of the pressure on academics to publish and charge high fees for publication, often not performing proper peer review even if they claim to. Someone went to the trouble to compile a list, and it was no surprise to find that the one you are using is on it: https://scholarlyoa.com/publishers/
As I pointed out (but you failed to address), you have a major mathematical error in your paper where you apply a low-velocity Taylor series approximation of relativity at extremely high velocities, where the series is not close to converging for the terms you used. Anyone with a mathematical background should be able to notice this error, and the approximation is so common in relativity that any competent reviewer with an appropriate background would immediately notice it. This indicates that the journal you are using at best only pretends to do peer review. I recommend doing some research and finding an actually respected journal to get published in if you care to be taken seriously. (Due to that error alone, your paper as currently written could make a good test of the peer review quality of relevant journals.)
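The divergence is easy to see numerically. A short sketch comparing sqrt(1 - (v/c)^2) with its two-term Taylor expansion about v = 0 (the Wolfram Alpha plot linked earlier shows the same thing):

```python
# The low-velocity approximation sqrt(1 - b^2) ~ 1 - b^2/2 (b = v/c)
# is excellent for small b but wildly wrong as b -> 1.
import math

for beta in (0.1, 0.5, 0.9, 0.99):
    exact = math.sqrt(1.0 - beta**2)
    taylor = 1.0 - beta**2 / 2.0
    print(f"v/c = {beta}: exact = {exact:.5f}, "
          f"2-term Taylor = {taylor:.5f}, "
          f"relative error = {abs(taylor - exact) / exact:.2%}")
```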

I'm not a physicist, so here's my (limited) understanding of this:
You are missing the point: acceleration in what direction? If there is a force, the acceleration is in the direction of the force; if there is none, then there is no definition in your theory for the direction of acceleration. The accelerating expansion of the universe is an entirely different type of phenomenon than the local acceleration of a given mass that the equation you used describes.
From my understanding, the acceleration is in the direction of movement. And that direction could be in any reference frame you are using, like a gravitational orbit. If theoretically there were no frame of reference, then the direction of acceleration would be random, like a quantum fluctuation. (This is my understanding at least.)
If that is the case, please provide a rigorous explanation of how that works. (It seems to me this would not work for various reasons, but rather than speculating I should see what the details are of your explanation, in case you did find some way that I can't think of.) This only covers energy conservation, which is tied to time symmetry in Noether's theorem; you also need something to cover momentum conservation, which your theory also clearly violates.
QI redefines momentum as Unruh radiation pressure on the Rindler horizon. So if the universe expands, the radiation pressure decreases, and momentum also decreases. QI ties momentum to the Rindler horizon and the universe's scale. When the universe's size approaches zero, momentum approaches infinity, and when the distance between any particles in the universe approaches infinity, momentum approaches zero. Please correct me if I'm wrong.
GR has not been falsified; your assertions that it has are part of what quickly loses you credibility when you talk to other scientists. GR predicts galactic rotation curves just fine under Lambda-CDM, which is a model that has been tested, including things such as the evolution of the universe over time to form the galaxies we currently observe. Also, before you mention it, wide binaries have previously been discussed on this site. A recent paper shows that the data is actually consistent with GR; the pattern in the data is related to projection effects.
GR was never proven at the galactic scale. GR is still an unproven theory at the galactic scale. (If it's proven, why do scientists spend billions on dark matter searches? To prove what?)
Anyway, as I said, you confirm here that you have not actually compared your theory with any of the standard tests of GR yet. The various claims you make promoting your theory make it sound like it has wider applicability than GR, but in actuality you have not shown it matches any of the experimental tests of GR. I (and probably most other people with a relevant background) find this strange, because if someone comes up with a new theory of gravity, the very first question is whether it can predict the results of the various experiments confirming general relativity. Since your theory contradicts the most fundamental assumption in GR, this is even more important, because it is difficult to see how such a radically different theory can replicate all of the successes of GR.
As MikeMcCulloch has put it, the challenge of #QI to GR is not at high accelerations, but at extremely low accelerations, where GR has failed to predict galaxy rotations without arbitrary additions of Dark Matter. This Dark Matter is a purely hypothetical substance with no predictable properties. It does not fit the standard model, does not interact with anything (and I don't mean only light; anything, any matter. DM detector experiments were all failures), it's at this point a figment of imagination at best (that is, if you're a normal human, not a scientist, and use basic logic here)
Your statement about being independent of reference frame is the basis of why I said it is simple: Your theory breaks causality.
Your claim of frame independence actually brings up one of the issues I didn't address before:
acceleration on its own is not a relativistic invariant; you need to consider the four-acceleration, which you do not address in your paper.
Would you provide some arguments to these claims? I'm not a scientist so I would appreciate if you'd detail more what exactly are you referring to here.

meberbs. I have replied to a few of your comments, but as you will see, I stopped when I noticed that your comments were straying towards attacking my motivation & the state of modern journals. If you can't attack QI itself with solid physical evidence and you have to start digging around for other ways to undermine it, then I've learned from many long and painful experiences that the debate quickly deteriorates.
1. The direction of the extra QI acceleration is in the direction of acceleration.
2. The conservation of mass-energy-information is explained in this paper by myself and J. Gine: https://www.worldscientific.com/doi/abs/10.1142/S0217732317501486
3. GR has been falsified. We have to call a spade a spade, and GR has never, ever predicted a galaxy rotation correctly without 'tuning' of the data after the fact (i.e. dark matter). GR is wrong. It is as simple as that. I've published three papers on galaxy rotation now, and I'll be publishing another paper soon on wide binaries that confirms that.
4. The Pioneer anomaly has been modelled with a complex thermal model with over 2000 finite elements and two adjustable parameters. That's complex in my book.
5. Yes, I have made predictions, but you have to read my papers to see most of them.
6. The 8.8x10^26 m diameter of the cosmos is the currently accepted value for its size, given the need to also include inflation to explain the flatness problem. It is widely accepted; see e.g. Bars and Terning, 2009, Extra Dimensions of Space and Time, Springer.

From my understanding, the acceleration is in the direction of movement. And that direction could be in any reference frame you are using, like a gravitational orbit. If theoretically there were no frame of reference, then the direction of acceleration would be random, like a quantum fluctuation. (This is my understanding at least.)
This would contradict McCulloch's statement further down that his theory is reference frame independent. Predicting different acceleration direction depending on what frame you are looking from means your theory is simply inconsistent.
QI redefines momentum as Unruh radiation pressure on the Rindler horizon. So if the universe expands, the radiation pressure decreases, and momentum also decreases. QI ties momentum to the Rindler horizon and the universe's scale. When the universe's size approaches zero, momentum approaches infinity, and when the distance between any particles in the universe approaches infinity, momentum approaches zero. Please correct me if I'm wrong.
This does nothing to address my question, and if accurate demonstrates the lack of conservation of momentum.
GR was never proven at the galactic scale. GR is still an unproven theory at the galactic scale. (If it's proven, why do scientists spend billions on dark matter searches? To prove what?)
Falsified means proven to be wrong. "Not proven" does not equal "falsified." GR with dark matter successfully explains galactic-scale gravity. The properties of dark matter are such that we are certain it is difficult to detect. It possibly only interacts via gravity, in which case it will never be detected except through its gravitational influence. We don't know if this is the case or not, so scientists look for it. (Also, I haven't heard of any dark matter searches costing billions; there are few science projects that get that level of funding, like the LHC or NASA's great observatories, and those generally have other purposes. Sure, physicists hope that they might get lucky and find a candidate dark matter particle at the LHC, but it is primarily designed around resolving questions about the standard model of particle physics, which scientists are quite sure is at best incomplete, but experiments keep stubbornly agreeing with it, because so far none have found the scale where it breaks down.)
As MikeMcCulloch has put it, the challenge of #QI to GR is not at high accelerations, but at extremely low accelerations, where GR has failed to predict galaxy rotations without arbitrary additions of Dark Matter.
If that is the goal, then McCulloch's theory amounts to nothing other than an empirical model. The actual useful goal is a theory that has a larger region of applicability than existing theories.
This Dark Matter is a purely hypothetical substance with no predictable properties. It does not fit the standard model, does not interact with anything (and I don't mean only light; anything, any matter. DM detector experiments were all failures), it's at this point a figment of imagination at best
All wrong. Lambda-CDM has specific properties of dark matter specified that predict not just galactic rotation, but also the evolution of the universe and the formation of galaxies. There is also evidence that not all galaxies have the same amount of dark matter in them, though most are very similar, as would be expected. As explained above, there are reasons for scientists to believe the standard model is incomplete from within the standard model, not because of dark matter. Also as explained above, dark matter interacts with all matter through gravity, but there is no reason to believe that it interacts through any other force (and there are inherent reasons that it definitely does not interact through electrodynamics, so it is not surprising that it has not been detected directly.)
(that is, if you're a normal human, not a scientist, and use basic logic here)
Science is based on basic logic. What you are doing here is effectively claiming that every scientist on the planet is an idiot (at best). One warning: conspiracy theories and insults are not acceptable forms of conversation on this site.
Your statement about being independent of reference frame is the basis of why I said it is simple: Your theory breaks causality.
Your claim of frame independence actually brings up one of the issues I didn't address before:
acceleration on its own is not a relativistic invariant; you need to consider the four-acceleration, which you do not address in your paper.
Would you provide some arguments to these claims? I'm not a scientist so I would appreciate if you'd detail more what exactly are you referring to here.
The fact that FTL breaks causality is a topic covered in any introductory class on special relativity. Details have been discussed in a separate thread on this site:
https://forum.nasaspaceflight.com/index.php?topic=43385.0
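For anyone wanting the one-line version of that argument: under an unmodified Lorentz transform, any signal faster than c arrives before it departs in some subluminal frame. A minimal numerical sketch (units with c = 1; the specific speeds are just examples):

```python
# FTL breaks causality under the standard Lorentz transform
# t' = gamma * (t - v x / c^2): for a signal of speed u > c, any
# observer moving at v > c^2/u sees the arrival *before* the departure.
import math

c = 1.0
u = 2.0 * c       # FTL signal speed in frame S
v = 0.75 * c      # observer speed (> c^2/u = 0.5 c)

x, t = 1.0, 1.0 / u                        # arrival event in S (sent from origin at t = 0)
gamma = 1.0 / math.sqrt(1.0 - (v / c)**2)
t_prime = gamma * (t - v * x / c**2)       # arrival time in S'

print(f"arrival in S : t  = {t:+.3f}")
print(f"arrival in S': t' = {t_prime:+.3f}  (negative: arrival precedes departure)")
```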

meberbs. I have replied to a few of your comments, but as you will see, I stopped when I noticed that your comments were straying towards attacking my motivation & the state of modern journals.
I did not attack your motivation, period. Anywhere you think I did so, you misread the intent of what I wrote. I would appreciate if you could let me know specifically where this happened, preferably via PM to keep the thread on topic, and I can work to make sure such confusion does not happen again, and modify the original statement if appropriate.
The comment about journals is to inform you of some information that would help you (among other ways, by not sending money to probable scammers). The existence of predatory journals is a well known fact. Here is the most recent example (https://www.theguardian.com/australia-news/2014/nov/25/journal-accepts-paper-requesting-removal-from-mailing-list) I have seen. (warning: contains repeated profanity)
If you can't attack QI itself with solid physical evidence and you have to start digging around for other ways to undermine it, then I've learned from many long and painful experiences that the debate quickly deteriorates.
I gave direct criticisms of your theory. A few points I made also include suggestions on what you can do to improve the credibility of your theory. As far as the conversation deteriorating when you go to nontechnical attacks, I suggest you review some of what you have posted on social media in light of that. Those specific comments are not appropriate for discussion here. (PM me if you can't figure out what I am talking about.)
1. The direction of the extra QI acceleration is in the direction of acceleration.
When there is no force, this is a tautology and undefined.
2. The conservation of mass-energy-information is explained in this paper by myself and J. Gine: https://www.worldscientific.com/doi/abs/10.1142/S0217732317501486
Thank you, I will review when I have time.
3. GR has been falsified. We have to call a spade a spade, and GR has never, ever predicted a galaxy rotation correctly without 'tuning' of the data after the fact (i.e. dark matter). GR is wrong. It is as simple as that. I've published three papers on galaxy rotation now, and I'll be publishing another paper soon on wide binaries that confirms that.
You can publish infinite papers on galaxy rotation, but unless you show that galaxy rotation cannot be explained by the existence of matter with a specific set of properties (which includes not interacting electrodynamically), your statements about falsifying GR are just wrong. As I stated in my previous post, there are other reasons the standard model is believed to be incomplete, so not being in the standard model doesn't matter.
Also, before publishing any papers on wide binaries, you should read this paper:
https://arxiv.org/abs/1810.13397
It shows quite clearly that the available data is consistent with GR. The approximation of the relationship between azimuthal velocity and separation as seen from Earth breaks down due to projection effects. Not all available pairs have the radial data needed to correct for the projection effects, and given that the shape of the corrected prediction graph depends on the uncertainty in the radial data (which is variable), it seems unlikely that further meaningful tests of whether the data is consistent with GR can be done anytime soon. (Though maybe someone can come up with a clever statistical method to show something.)
4. The Pioneer anomaly has been modelled with a complex thermal model with over 2000 finite elements and two adjustable parameters. That's complex in my book.
There is no indication that this model is any more complex than the physical geometry and thermal properties of the spacecraft itself. As I said above, the adjustable parameters exist to allow better modelling when the complexity of the real spacecraft exceeds the model's capability to reliably predict. No matter how many times you ask for it, modelling the spacecraft as a spherical cow will not give you meaningful results.
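For anyone unfamiliar with the technique: a thermal finite-element model is, at heart, a big sparse linear (or nonlinear) system of heat balances. The following toy sketch is nothing like the ~2000-element Pioneer model (all values are illustrative, invented for this example), but it shows the class of model being discussed: a chain of nodes joined by conductive links, solved for nodal temperatures.

```python
import numpy as np

# Toy steady-state thermal network (illustrative values only):
# node 0 dissipates Q watts; node N-1 is held at a fixed sink temperature.
N = 5          # number of nodes
G = 2.0        # conductance of each link, W/K
Q = 10.0       # power dissipated at node 0, W
T_sink = 140.0 # sink temperature, K

# Assemble the conductance matrix for the linear system K @ T = q
K = np.zeros((N, N))
q = np.zeros(N)
for i in range(N - 1):
    K[i, i] += G
    K[i + 1, i + 1] += G
    K[i, i + 1] -= G
    K[i + 1, i] -= G
q[0] = Q

# Fix the sink node's temperature (Dirichlet boundary condition)
K[N - 1, :] = 0.0
K[N - 1, N - 1] = 1.0
q[N - 1] = T_sink

T = np.linalg.solve(K, q)
# In steady state all of Q flows through every link, so each link drops
# Q/G = 5 K, giving T = [160, 155, 150, 145, 140].
```

A real spacecraft model has thousands of nodes, radiative (nonlinear) couplings, and a handful of tuned parameters for properties that cannot be measured directly, but the structure is the same.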
5. Yes, I have made predictions, but you have to read my papers to see most of them.
Can you share some? A particularly relevant one I am pretty sure isn't in any paper yet published is the one for the tweet I linked to. It would be good to see your predictions before any experiment is run.
6. The 8.8x10^26 m diameter of the cosmos is the currently accepted value for its size, given the need to also include inflation to explain the flatness problem. It is widely accepted, see e.g. Bars and Terning (2009), Extra Dimensions in Space and Time, Springer.
I am not asking about the experimentally measured value of the parameter. I am asking for the derivation of why it shows up in your theory in the form that it does.

You can publish infinite papers on galaxy rotation...
For all of you galaxy rotaters out there:
http://www.illustrisproject.org/media/

meberbs. To take your points backwards, because why not:
6. The cosmic size shows up as a diameter in the derivations, but it does not matter. All I'm saying is that the longest allowed Unruh wave must be twice the cosmic diameter, or 4 times the radius. This is basic cavity physics.
5. For the Madrid experiment the prediction is ~1microN. The Dresden experiment has not yet been built. We expect more like a mN but it will depend on the exact configuration.
4. It is, shall we say, not ideal, to use complex computer models to fit known results. Also, a Pioneer anomaly of the same size was also seen in the Ulysses spacecraft which had a different shape.
3. Wide binary stars simply cannot be modelled with GR, dark matter or MoND. So these hypotheses have been falsified. Also, the galactic problems all start at the exact galactic radius where Unruh waves reach the cosmic scale. That is clear evidence for QI.
2. Do read the paper.
1. When there is no force then I would expect the anomalous acceleration to be towards the cosmic edge.

meberbs. To take your points backwards, because why not:
6. The cosmic size shows up as a diameter in the derivations, but it does not matter. All I'm saying is that the longest allowed Unruh wave must be twice the cosmic diameter, or 4 times the radius. This is basic cavity physics.
Can you point me to a paper where you do this derivation or not?
5. For the Madrid experiment the prediction is ~1microN. The Dresden experiment has not yet been built. We expect more like a mN but it will depend on the exact configuration.
Thank you
4. It is, shall we say, not ideal, to use complex computer models to fit known results. Also, a Pioneer anomaly of the same size was also seen in the Ulysses spacecraft which had a different shape.
You still don't seem to understand: the only fitting done was to known spacecraft temperatures. Besides which, using a model comparably complex to the spacecraft itself is the only way to get real answers. Models like these are used for the design of all spacecraft; they work.
As for the Ulysses spacecraft, using numbers quoted from your blog:
Ulysses: 12 ± 3 × 10^-10 m/s^2 towards the Sun
Pioneer anomaly of 8.74 ± 1.33 × 10^-10 m/s^2
12 is roughly a third larger than 8.74. That is a reasonable difference for different spacecraft with different shapes, masses, and temperature distributions. Meanwhile, do you have calculations showing what you predict for these anomalies?
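As a quick sanity check on the comparison of the two quoted anomalies (using the central values above, ignoring the error bars):

```python
ulysses = 12e-10    # m/s^2, quoted anomalous acceleration for Ulysses
pioneer = 8.74e-10  # m/s^2, quoted Pioneer anomaly

ratio = ulysses / pioneer
print(f"Ulysses anomaly is {100 * (ratio - 1):.0f}% larger than Pioneer's")
# Same order of magnitude, but clearly not identical, which is what you
# would expect for two thermally different spacecraft.
```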
3. Wide binary stars simply cannot be modelled with GR, dark matter or MoND. So these hypotheses have been falsified. Also, the galactic problems all start at the exact galactic radius where Unruh waves reach the cosmic scale. That is clear evidence for QI.
So I take it you did not read the paper I linked. Wide binary stars were modeled in GR with no issues. Your statements here are wrong. You don't even acknowledge what I wrote, and just repeat your incorrect claims as if it will become true if you say it often enough.
2. Do read the paper.
I will, though so far you aren't showing equivalent consideration on your end. Go read the paper I linked, before you dig yourself into a deeper whole with your false assertions.
1. When there is no force then I would expect the anomalous acceleration to be towards the cosmic edge.
That is a non-statement, the "cosmic edge" is in all directions.
Meanwhile, I have now mentioned twice a simple, straightforward mathematical error in your paper (the part about Taylor series). I have not considered the consequences of doing the math correctly, since that is your job; it might even support your point. But you have completely ignored two mentions of it. Between this and your ignoring the paper I linked to, you aren't showing much interest in having a conversation or considering that you might actually be wrong.

A deeper whole...
We all do get to tease you from time to time.

meberbs. There's nothing wrong with my binomial approximation. For v=0.9c the error is about 30%, but that is compared with a predicted difference of 22 orders of magnitude, or 10^24 %!!!

meberbs. There's nothing wrong with my binomial approximation. For v=0.9c the error is about 30%, but that is compared with a predicted difference of 22 orders of magnitude, or 10^24 %!!!
30% is well outside the range where a Taylor series is considered a useful approximation.
You state in your paper "when v is higher still, the effect of QI decreases even further." Here you are taking the limit as the error in the approximation you are using is approaching either 100% or infinity % depending on which way you define your error. As I pointed out, the relevant velocity in the LHC should be somewhere around 0.99999999 c. At that point the Taylor series is off by orders of magnitude.
There doesn't seem to be any actual point to your taking of a Taylor series, since sqrt(1-x^2) is not a difficult formula to calculate. I have no clue why you seem so resistant to fixing this simple mistake. Also, it seems strange that you would take the ratio of the relativistic contribution and your contribution to determine detectability. The only thing that should matter is the magnitude of your effect. Your modification is multiplied by the total relativistic mass, which directly changes the energy and momentum, and that would potentially be detectable when they do the careful energy and momentum balances of the results of particle collisions, or possibly through issues with the beam timing. I don't know if it would be detectable or not since you don't actually consider the sensitivity of possible detection mechanisms relative to the magnitude of your effect.
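The growth of the binomial-approximation error with velocity is easy to check directly (a pure-stdlib sketch; the velocities are the ones discussed above):

```python
import math

def exact(beta):
    # the exact factor sqrt(1 - beta^2), beta = v/c
    return math.sqrt(1.0 - beta**2)

def binomial(beta):
    # first-order binomial (Taylor) approximation of sqrt(1 - beta^2)
    return 1.0 - beta**2 / 2.0

for beta in (0.1, 0.9, 0.99999999):
    e, b = exact(beta), binomial(beta)
    rel_err = abs(b - e) / e
    print(f"beta = {beta}: exact = {e:.4g}, approx = {b:.4g}, "
          f"relative error = {rel_err:.3g}")
# At beta = 0.1 the approximation is excellent; at 0.9 it is off by ~37%;
# at 0.99999999 it is off by more than three orders of magnitude.
```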

I have now gone through the paper on quantized inertia and conservation of energy:
I find issues with the justification section for the new form of the energy balance, but I won't address those directly since a better justification can be written if it turns out the proposed new conservation law works as advertised.
The proposal is in equation 2:
m_{1}c^{2} + kTN_{1} ln 2 = m_{2}c^{2} + kTN_{2} ln 2
I assume that the m_{1} and m_{2} terms are relativistic mass rather than rest mass, to be a full account of mass-energy, including kinetic energy.
An interesting thing to note about the above equation is the k*N*ln(2) portion of the energy released into entropy terms. This can be rearranged to say k*ln(2^{N}) = k*ln(Ω) = S. In this equation Ω is simply the number of available microscopic states, and S is simply the total entropy of the system. The first step is clearly the origin of Landauer's principle, as all it does is convert the number of states into units of bits. The resulting equation is literally the definition of entropy. This is where it begins to look like there may be a major problem: since entropy can never decrease, mass-energy must continuously be decreasing in the universe. This is disturbing, but let's work an example to see how this works out and see if it can make sense.
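The algebraic identity used in that rearrangement is worth confirming numerically (N = 100 bits is an illustrative value; the temperature factor is left out since it multiplies both sides):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
N = 100           # number of bits (illustrative)

landauer = k * N * math.log(2)   # the k*N*ln2 term from the paper's Eq. 2
entropy  = k * math.log(2**N)    # k*ln(Omega), with Omega = 2^N microstates

# The two expressions are identical: Landauer's term is just the
# statistical definition of entropy written in units of bits.
assert math.isclose(landauer, entropy)
```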
First I will rewrite the above equation based on the definition of entropy. This simplifies things, so I can just reference standard delta-entropy calculations and avoid the cosmological horizon method of calculating entropy used in the paper (which I have doubts about, but again will skip over since it does not affect what I am doing.)
m_{1}c^{2}+ S_{1}T = m_{2}c^{2}+ S_{2}T
As in McCulloch's equivalent of this equation I simply will assume that temperature is a constant everywhere for the purposes of this example.
Now to setup the system:
Let's take a simple "rocket" that is 2 masses, a large mass M_{1} and a much smaller mass M_{2}, separated by a (nearly massless) spring, which for good measure we attach to the large mass. When a latch is released, they push away from each other with a constant force F for a distance d (the spring has a very special design so that I don't have to deal with variable acceleration).
The energy stored in the compressed spring, F*d, is added to the rest mass of the large mass: (M_{1}+F*d/c^{2})*c^{2}
Using standard physics, the result is as follows: (skip ahead for the version using McCulloch's theory)
initial energy:
(M_{1}+F*d/c^2)*c^{2} + M_{2}*c^{2}
The equation of motion using special relativity is given here (https://xphysics.wordpress.com/2010/11/07/relativisticaccelerationduetoaconstantforce/). I could skip this, since to make McCulloch's effect significant a low-acceleration and low-velocity limit is appropriate. Additionally, the m_{0} in the linked equations for the large mass should actually account for the decreasing rest mass due to the release of the potential energy in the spring. This I will neglect, for similar reasons to the spring itself originally being essentially massless (I don't want to have to account for the different parts of the spring having different velocities).
defining A = F/(M_{1}*c), and B = F/(M_{2}*c)
You can solve for t (https://www.wolframalpha.com/input/?i=d+%3D+(c%2FA)*(sqrt(1%2BA%5E2+t%5E2)1)%2B(c%2FB)*(sqrt(1%2BB%5E2+t%5E2)1)+solve+for+t)
d = (c/A)*(sqrt(1+A^2 t^2)-1)+(c/B)*(sqrt(1+B^2 t^2)-1)
The result is a bit messy in the full relativistic form, but it is what it is. At this point, it is a bit easier to treat t as the input value, with d being calculated from t as above.
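If you do want t as a function of d, the messy closed form can be avoided entirely by inverting the relation numerically. A sketch with illustrative values (units with c = 1; A and B defined as F/(M_{1}c) and F/(M_{2}c), consistent with the displayed formula):

```python
import math

c = 1.0                      # work in units where c = 1
M1, M2, F = 5.0, 1.0, 0.3    # illustrative masses and spring force
A, B = F / (M1 * c), F / (M2 * c)

def d_of_t(t):
    # total separation after time t, each mass under constant force F
    return (c / A) * (math.sqrt(1.0 + (A * t)**2) - 1.0) + \
           (c / B) * (math.sqrt(1.0 + (B * t)**2) - 1.0)

def t_of_d(d, lo=0.0, hi=1e6):
    # d_of_t is strictly increasing in t, so plain bisection suffices
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if d_of_t(mid) < d:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

t = t_of_d(2.5)
assert abs(d_of_t(t) - 2.5) < 1e-9   # round-trips back to the input d
```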
The momentum of each mass after a time t is simply p=F*t; both masses have equal and opposite momentum, as expected due to momentum conservation.
The total energy of the system after acceleration is:
sqrt(p^{2}c^{2}+M_{1}^{2} c^{4})+sqrt(p^{2}c^{2}+M_{2}^{2} c^{4})
This looks fairly different from the original expression for the energy, but let's plug the expression for d in terms of t into the original expression. Multiplying through by F/c^2 causes terms equal to M_{1} c^{2} and M_{2} c^{2} to nicely cancel out:
M_{1} c^{2}sqrt(1+(F t/(M_{1} c))^{2}) + M_{2} c^{2}sqrt(1+(F t/(M_{2} c))^{2})
Pulling the m c^{2} factors into the square roots, you can see you get the same energy as the final result.
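That cancellation can also be confirmed numerically; a sketch with illustrative values (units with c = 1):

```python
import math

c = 1.0
M1, M2, F, t = 5.0, 1.0, 0.3, 7.0   # illustrative masses, force, and time

def dist(m):
    # relativistic distance travelled under constant force F after time t
    return (m * c**2 / F) * (math.sqrt(1.0 + (F * t / (m * c))**2) - 1.0)

d = dist(M1) + dist(M2)              # total spring extension

# initial energy: rest masses plus the spring's potential energy F*d
E_initial = (M1 + F * d / c**2) * c**2 + M2 * c**2

# final energy: relativistic energy of each mass with momentum p = F*t
p = F * t
E_final = math.sqrt(p**2 * c**2 + M1**2 * c**4) + \
          math.sqrt(p**2 * c**2 + M2**2 * c**4)

# standard physics conserves energy exactly, as the algebra above showed
assert math.isclose(E_initial, E_final, rel_tol=1e-12)
```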
Now let's try that again using McCulloch's theory
First problem: what is the starting energy?
This is basically a great big divide-by-zero error. McCulloch gets around this by deciding that there is a minimum acceleration, in some direction he has yet to define. At this minimum acceleration value, the factor (1-2c^{2}/(aΘ)) is equal to 0. McCulloch multiplies this by relativistic mass in his recent paper, so it is quite clear that this multiplies the portion of the rest mass of the large mass that is due to the potential energy of the spring. That makes the initial energy very easy to calculate:
T*S_{1}
With no rest mass, there is no kinetic energy, and there is no potential energy either when dealing with relativity since then you would have rest mass.
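For scale, the numbers behind that modified-inertia factor are easy to evaluate; a back-of-the-envelope sketch assuming the Θ = 8.8x10^26 m quoted earlier in the thread:

```python
c = 2.998e8       # speed of light, m/s
Theta = 8.8e26    # cosmic (Hubble) diameter quoted earlier in the thread, m

a_min = 2.0 * c**2 / Theta   # QI minimum acceleration, ~2e-10 m/s^2

def factor(a):
    # QI inertia modification factor (1 - 2c^2/(a*Theta))
    return 1.0 - 2.0 * c**2 / (a * Theta)

print(f"a_min = {a_min:.2e} m/s^2")
print(factor(a_min))   # essentially zero: the modified inertia vanishes
print(factor(9.8))     # essentially 1: no measurable effect at 1 g
```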
Since in this state there is an acceleration happening in an unknown direction, let's move on to the instant after the latches are released, before the system starts accumulating any significant velocity:
Now the energy in the system is:
(1-2c^{2}/(a_{1}Θ))*(M_{1}+F*d/c^2)*c^{2} + (1-2c^{2}/(a_{2}Θ))*M_{2}*c^{2} + T*S_{1}
We can assume the latches are small and reversible, so entropy hasn't changed yet, though the spring is about to start doing finite work in finite time, which will increase entropy. Already this proposal for a conservation law fails; it has just been shown to describe a non-conserved quantity.
[I will come back later to address the state of the system during the last instant of acceleration; for now, note that the extra factor applied to the potential energy of the spring will prevent the cancellation that made the case come out nicely in standard physics. Also, note that it now matters significantly which mass the nearly massless spring is glued to, because it has a different acceleration environment in each case.]
Moving on to just after the last instant of acceleration, the rest masses again drop back to 0; however, the total mass-energy-information is now:
T*S_{2}
This is increased from the original state, again demonstrating that Equation 2 of the paper is wrong and does not describe a conservation law.

To quote meberbs: "This is where it begins to look like there may be a major problem, since entropy can never decrease, massenergy must continuously be decreasing in the universe".
I appreciate the thought that went into the derivation, but it is wrong from the beginning because you assumed that entropy is solely related to the arrangement of bits on the horizon (the kTNln2 term). This is wrong, because the entropy that has been proven to increase is the entropy of the massenergy part that we can see/detect (the other term), and your derivation ignores its contribution to entropy completely. The derivation is otherwise interesting, but needs rehashing.

To quote meberbs: "This is where it begins to look like there may be a major problem, since entropy can never decrease, massenergy must continuously be decreasing in the universe".
I appreciate the thought that went into the derivation, but it is wrong from the beginning because you assumed that entropy is solely related to the arrangement of bits on the horizon (the kTNln2 term). This is wrong, because the entropy that has been proven to increase is the entropy of the massenergy part that we can see/detect (the other term), and your derivation ignores its contribution to entropy completely. The derivation is otherwise interesting, but needs rehashing.
What you just stated is inconsistent with the definition of N in your paper. You defined it as: "for a system that changes its information content from N_{1} to N_{2}". The information content of the system is related to its actual physical entropy, and it is not related to the number of Planck areas on the cosmological horizon. This is obvious because a system of 2 atoms will generally have much lower information content than a system of 1000 or 6.022*10^23 atoms. The information content actually within the system is at least a reasonable definition of something to try to fit into a conservation law; the new definition you just proposed couldn't even be applied to the situation I just described, because different accelerations are present within the system, so there is no single way to do an equivalent of what you do in the paper and apply it to the system.
I still intend to show the step I skipped over, since I believe that will be interesting, but it seems clear at this point that you don't actually have a solution to conservation laws (energy or momentum) within your theory. Your paper essentially seems to attempt to re-derive your theory using a different method, and even then, you only get something similar, not the same. What you actually need to do is a calculation like the ones I did to show that you have working conservation laws. (Actually a general proof would be preferred, but a good example is a good place to start.) Ultimately this is your responsibility to show, as it is your theory. I see no reasonable way to fix your theory, so I can't help with that.
There is a simple prerequisite in science for new theories: they have to be consistent with what we already know. That means for a theory like yours, it has to match conservation laws, and it has to be able to replicate the standard tests of GR as absolute minimum requirements (with multiple gravitational wave detections confirmed now, that is another test of GR that will become standard). You have been promoting your theory for a while but you haven't even started to meet these.

"There is a simple prerequisite in science for new theories: they have to be consistent with what we already know. "
Politely disagree. There is a prerequisite in science for new theories to be consistent with the observed universe, nothing else.

"There is a simple prerequisite in science for new theories: they have to be consistent with what we already know. "
Politely disagree. There is a prerequisite in science for new theories to be consistent with the observed universe, nothing else.
I don't understand the point of disagreement. I thought context implied "things we know about the observed universe" rather than "things we know about politics," which clearly would be irrelevant. Meanwhile just saying "consistent with the observed universe" seems like it would be too difficult to show, because there are things within our powers of observation that we don't "know" meaning that there is room for experimental uncertainty, or just that we don't know how to consistently combine certain experiments in any theory. The minimum should only be as good as current theories, not better, though you can't expect existing theories to be abandoned unless the new one is clearly better.
Do you have any issue with the specific examples I gave for this case? If not, then we are probably just using different words to communicate the same idea. I went with giving a specific example, rather than trying to precisely define what I meant by "know" in this context.

Theories are only as good as their success at describing what we can observe in the universe; nothing else matters. As an obvious example, GR is not consistent with Quantum Mechanics, but both describe aspects of the universe to great fidelity. It is not a requirement that any new theory is consistent with current theory, only that it is a better description of the universe. Should such a theory emerge, it would make clear where older theories were in error. To assume current theories are the yardstick against which new theories should be measured is not useful.

"There is a simple prerequisite in science for new theories: they have to be consistent with what we already know. "
Politely disagree. There is a prerequisite in science for new theories be consistent with the observed universe, nothing else.
"what we know" and "what we have observed" are basically the same thing.
A scientific Theory is not the same thing as a "theory" to a layperson, where a "theory" is synonymous with a guess, or speculation.
A scientific Theory is a framework that is used to explain and understand what is observed in that field of science.
So if someone wants to create a new Theory to replace an existing one, then it has to produce results that are, at the very least, consistent with what the previous Theory produces (that is to say, its predictions for known values must match), or it must produce results that more accurately match observations than the previous Theory's results. If the predictions made by the new theory do not match what is observed, then the new theory is probably not useful.

"So if someone wants to create a new Theory to replace an existing one, then it has to produce results that are, at the very least, consistent with what the previous Theory produces (that is to say, its predictions for known values must match),"
No
" or it must produce results that more accurately match observations than the previous Theory's results. "
Yes
"If the predictions made by the new theory do not match what is observed, then the new theory is probably not useful."
Yes

"So if someone wants to create a new Theory to replace an existing one, then it has to produce results that are, at the very least, consistent with what the previous Theory produces (that is to say, its predictions for known values must match),"
No
*snip*
Yes.
A new theory must, at the very least, produce results as good as an existing Theory. If it is producing results that do not match established values, we can be pretty confident in saying it is not correct.

This is becoming repetitive so this will be my final comment on the topic. There is no requirement for a new theory to meet the predictions of any existing theory. It is only necessary for it to describe the universe more accurately.

This is becoming repetitive so this will be my final comment on the topic. There is no requirement for a new theory to meet the predictions of any existing theory. It is only necessary for it to describe the universe more accurately.
I can think of several examples where that's not true, but OK.
For example, there have been plenty of theories which would eliminate Dark Matter and better match observations, but fail because they do not accurately describe some other well-established aspect of physics.

It is not a requirement that any new theory is consistent with current theory, only that it is a better description of the universe.
No one has said anything different than that. New theories don't have to be consistent with the previous theories, but they do have to be consistent with the predictions of the previous theories that have been experimentally confirmed. Actually there is a slight problem with your statement, since you are leaving the threshold at "better," when a theory only needs to be "as good."
"So if someone wants to create a new Theory to replace an existing one, then it has to produce results that are, at the very least, consistent with what the previous Theory produces (that is to say, its predictions for known values must match),"
No
" or it must produce results that more accurately match observations than the previous Theory's results. "
Yes
You still don't seem to be understanding, since you just contradicted yourself. How can a theory more accurately match observations, if it doesn't make predictions that are consistent with the previous theory in regimes where there is no discernible difference between the previous theory and observations?
As far as I can tell we are in violent agreement, but you haven't recognized yet that we are using different words to say the same thing.
Also, often the simplest way to show that a theory matches known results is to show that it simply reduces to the previous theory in some limit. This can be done for special relativity, for example, which reduces to Newtonian mechanics in the low-velocity limit, covering basically all experimental results before it was developed. For something like McCulloch's theory, which is clearly inconsistent with the basic principles of GR, it instead takes a lot more work to show that his theory can predict results such as the precession of Mercury and other things that GR predicts to within experimental error.

Again, since you keep misinterpreting this: it is NOT saying it has to meet these predictions solely because GR makes them. It is required because we have experimental and observational evidence of these phenomena. The reference to GR is for 2 reasons: 1. "Standard tests of GR" is a convenient shorthand that anyone in the field would recognize without detailed description, and 2. it limits the requirement to things that we already have a good theory for. A new theory does not have to explain dark energy, inconsistencies in measurements of the cosmological constant, or other observations that we don't have an accepted explanation for. It is only expected to be as good as current theories; better is nice but should not be required.

I disagree with meberbs' comment in the above debate that, to slightly condense it: "new theories have to be consistent with what we already know. That means a new theory has to match conservation laws". I agree with SteveKelsey that conservation laws are not empirical data. Take, for example, heat. At first the generation of heat through friction seemed to break the conservation laws, until those laws were extended. QI suggests we need to extend the conservation laws to include information, and the data (e.g. QI predicts galaxy rotation) backs that addition.
The only important criterion for selecting theories is that they predict nature 'before the observations are made'. Otherwise they are not useful. They have to be predictive. General relativity does not do that in low-acceleration regimes (most of the cosmos). For example, you have to use GR to predict galaxy rotation speeds, then notice how wrong it is (usually by a factor of ten!) and add the right amount of invisible dark matter arbitrarily to make it work. No clear-minded scientist can be happy with that. GR cannot predict most of nature without looking at it first! However, GR does predict well at high acceleration, and of course QI has to compete there as well, e.g. light bending by the Sun. I am working to see if QI also does that for a workshop in Prague, oddly enough.

I disagree with meberbs' comment in the above debate that, to slightly condense it: "new theories have to be consistent with what we already know. That means a new theory has to match conservation laws". I agree with SteveKelsey that conservation laws are not empirical data. Take, for example, heat. At first the generation of heat through friction seemed to break the conservation laws, until those laws were extended. QI suggests we need to extend the conservation laws to include information, and the data (e.g. QI predicts galaxy rotation) backs that addition.
You claim to be disagreeing with me, but nothing you said actually disagrees with my point. You need conservation laws, no one said they have to look exactly the same (I would use electromagnetism and special relativity as an example, if there was ever any confusion about friction and thermal energy, it happened long enough ago that no one really cares anymore.) Noether's theorem is the mathematical proof that a reasonable consistent theory should obey some form of conservation laws. As I already demonstrated in this thread, your theory currently does not provide a working alternative to conservation of energy, let alone momentum. The fact that you attempted to find a way to rescue conservation of energy shows that you have at least some understanding of the importance of maintaining some form of conservation laws.
The only important criterion for selecting theories is that they predict nature 'before the observations are made'.
That is a terrible criterion, under either of the two readings of it I can think of.
Under the first way I see it, it means that your theory fails always and forever, because you came up with it in response to observations of galaxy rotation and not the other way around. This obviously shouldn't be an actual criterion, because theories being modified or developed in response to unexpected observations is a completely valid order of events.
The other way, which seems to be what you meant based on the rest of your post, isn't any better. No theory exists that can make predictions without observational data on the system. You can't predict the motion of an electron in an electric field without knowing its charge/mass ratio. You can't predict the orbit of the Moon around the Earth without first measuring the mass of the Earth, even allowing you to know the starting positions and velocities.

This is a direct analogy with dark matter. Our best estimates of the mass of the Earth (and other large bodies in the Solar system) come from measuring orbits. If there is additional matter that can't easily be seen around galaxies, then of course you have to measure that somehow. Even if it was just regular old dust and gas, the best way to measure the total amount would still end up being observing its gravitational effects. For dark matter, we know there would have to be enough to at least have some detectable effect on the light passing through if it was regular matter, but even imagining an alternate universe where it was regular matter, we wouldn't be able to measure the total amount very well except through gravitational effects.

Meanwhile, you still haven't pointed me to an actual derivation of how the Hubble diameter ends up in your equation, which leaves me with nothing but your word that it wasn't picked to tune the numbers to match the acceleration you wanted. (This matters less for the validity of your theory than for the potential hypocrisy in your complaints about GR.)
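The Earth-mass point is concrete: Kepler's third law gives the central mass directly from the orbit, i.e. the mass is measured from the orbit rather than predicted beforehand. A back-of-the-envelope sketch with approximate orbital values for the Moon:

```python
import math

# Moon's orbit (approximate mean values)
a = 3.844e8            # semi-major axis, m
T = 27.32 * 86400.0    # sidereal period, s
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2

GM = 4.0 * math.pi**2 * a**3 / T**2   # Kepler's third law: GM = 4*pi^2*a^3/T^2
M_earth = GM / G

print(f"M_earth ~ {M_earth:.2e} kg")  # ~6e24 kg (neglecting the Moon's own mass)
```

This is exactly the kind of inference used for dark matter: the gravitating mass is whatever the measured orbits require.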
However, GR does predict well at high acceleration, and of course QI has to compete there as well, e.g. light bending by the Sun. I am working to see if QI also does that for a workshop in Prague, oddly enough.
The only odd thing is how many years you have spent on your theory without doing such a basic test. After that, there is gravitational redshift (In GR this also covers time dilation as it affects GPS satellites), and the precession of Mercury before you can reasonably start claiming that you have a possible alternative theory. Of course you really should also cover more modern tests like frame dragging and gravitational waves as well.
Actually I do have some idea why you haven't shown your theory actually matches the above listed experiments, it is because you seem to spend all of your time and energy trying to discredit GR, with what amounts to really weak arguments, rather than trying to actually demonstrate that your theory works. The way you replace a theory in physics is by showing that the new one is better, not that the old one is wrong. I know someone who really likes the quote "all models are wrong; some are useful." You have yet to actually demonstrate that yours is useful. Scientists already know that GR is wrong at some level (though generally it isn't in many of the ways you try to claim.) At this point you have dug yourself into a hole, especially with things like the Pioneer anomaly. (No really, read my previous posts about it again, and don't bother complaining about the resolution unless you are willing to first learn something about standard thermal modelling techniques.)

“No theory exists that can make predictions without observational data on the system.”
Politely disagree.
Dirac predicted the existence of antimatter as a consequence of his proposed theory of relativistic quantum mechanics. The ‘negative energy’ aspect of his theory fell out of the math. Contemporaries of Dirac thought this disproved his theory because no such phenomena had been observed, but Dirac insisted the math was so elegant it must be correct.
“After ruling out the possibility that this particle was simply the proton – which has a hugely greater mass – Dirac predicted the existence of a new particle with the same mass of the electron but with a charge that was positive rather than negative.
That particle was found experimentally on 2 August 1930. Carl Anderson was observing the trails produced in the particle shower that was created in his cloud chamber when cosmic rays passed through it. His observations included a particle with the same mass as the electron but the opposite charge – its track bent in the “wrong” direction in a magnetic field. Anderson coined the name “positron” for his new discovery.
In 1933 Dirac went on to predict the existence of the antiproton, the counterpart to the proton. It was discovered in 1955 by Emilio Segrè and Owen Chamberlain at the University of California, Berkeley.”
From https://www.iop.org/resources/topic/archive/antimatter/index.html

To quote meberbs "You claim to be disagreeing with me, but nothing you said actually disagrees with my point. You need conservation laws, no one said they have to look exactly the same"
OK, we agree then. We can change the conservation laws.
"if there was ever any confusion about friction and thermal energy, it happened long enough ago that no one really cares anymore."
I was trying to make a historical analogy, but you apparently don't like history. You should, because history provides real data on which scientific attitudes work and which don't.
"That is a terrible criteria"
No, predictability is the best criterion we have. If science does not predict what we do not yet know then it is worthless.
"You can't predict the orbit of the moon around the Earth without first measuring the mass of the Earth"
Most of your points are misunderstandings of what I am saying. I'm saying a theory should be able to predict observations before we have them. This means a theory should be able to predict the orbit of the Moon without us having to observe the orbit of the moon. Of course, we need to know some things such as the Earth's mass, but that is not what we are trying to predict!
"The only odd thing is how many years you have spent on your theory without doing such a basic test"
I started out by focusing on the observations that physics could not predict, all of them at low accelerations, such as galaxy rotation. This is quite a normal empirical attitude, and I assumed that GR was still valid in high acceleration regimes, for which the effects of QI are predicted to vanish. I am moving to the view that GR is wrong in principle even at high accelerations.
"You have yet to actually demonstrate that yours is useful"
QI is already useful. It predicts galaxy rotation whereas no other theory can (without fudging). Of course, that use is quite academic I suppose, and QI is not yet directly useful in the sense that it can power a car or propel a spacecraft, but that is a future possibility.

“No theory exists that can make predictions without observational data on the system.”
Politely disagree.
Dirac predicted the existence of antimatter as a consequence of his proposed theory of relativistic quantum mechanics. ...
You seem to have missed the point I was making. We could have a long detailed discussion about whether or not the special case you brought up counts as an exception to my statement. This would generally be a waste of time though, because none of the reasons that you could consider that an exception apply to any of the theories that are on topic for this thread.

To quote meberbs "You claim to be disagreeing with me, but nothing you said actually disagrees with my point. You need conservation laws, no one said they have to look exactly the same"
OK, we agree then. We can change the conservation laws.
And I would be quite interested if you can propose some new conservation laws that you can demonstrate actually equate to something being conserved. (At some point when I have time, I intend to go back to that step I skipped in my previous post, but that is mostly for fun; it is already clear that your proposal doesn't work as a conservation law.)
"if there was ever any confusion about friction and thermal energy, it happened long enough ago that no one really cares anymore."
I was trying to make a historical analogy, but you apparently don't like history. You should, because history provides real data on which scientific attitudes work and which don't.
And now you move into personal attacks, falsely claiming that I don't like history, even after I suggested a historical example that I felt was more relevant than yours. My statement about debates about friction and thermal energy was a general one, not a personal one. Of all of the history lessons that got embedded in science classes (and there were quite a few), debate about friction and thermal energy never made the cut, so it doesn't seem to be one that is generally considered notable.
"That is a terrible criteria"
No, predictability is the best criterion we have. If science does not predict what we do not yet know then it is worthless.
Cool, GR met that criterion easily; your theory doesn't. The criterion I gave should be a lower bar than yours taken literally.
"You can't predict the orbit of the moon around the Earth without first measuring the mass of the Earth"
Most of your points are misunderstandings of what I am saying. I'm saying a theory should be able to predict observations before we have them. This means a theory should be able to predict the orbit of the Moon without us having to observe the orbit of the moon. Of course, we need to know some things such as the Earth's mass, but that is not what we are trying to predict!
You think I am misunderstanding? You seem to have a major misunderstanding: we don't know the Earth's mass independently. The best measurement we have of it backs the mass out from gravitational effects, not the other way around. You are asking for the impossible.
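To illustrate the point being made here: in practice the Earth's mass is obtained by backing it out of gravitational measurements via Newton's law. A simplified sketch (the real determination uses satellite orbits and laboratory measurements of G, but the one-line version M = gR²/G, with textbook approximate values, lands on the accepted figure):

```python
# Backing the Earth's mass out of gravitational measurements,
# illustrating that M is not known independently of gravity.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
g = 9.81        # surface gravitational acceleration, m/s^2
R = 6.371e6     # mean Earth radius, m

M = g * R**2 / G
print(f"Earth mass ~ {M:.2e} kg")   # ~5.97e24 kg
```

The point is that g and R are the measured inputs; M is the output, so "predict the Moon's orbit without gravitational data" is circular.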
"The only odd thing is how many years you have spent on your theory without doing such a basic test"
I started out by focusing on the observations that physics could not predict, all of them at low accelerations, such as galaxy rotation. This is quite a normal empirical attitude, and I assumed that GR was still valid in high acceleration regimes, for which the effects of QI are predicted to vanish. I am moving to the view that GR is wrong in principle even at high accelerations.
A key word you just used: "empirical." Your theory, at the end of the day, is purely empirical. That means it describes galaxy rotation curves, but it does not predict them. I have lost count of how many posts I have written that implicitly or explicitly asked you for a derivation of the constant term in your equation. You have yet to provide anything resembling a derivation. Based on your repeated refusal to provide this derivation, for now I can only assume that this is just an invented number that you tied to some properties of the universe that gave about the right answer.
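For readers following along, the "constant term" at issue is the minimum acceleration quoted in the paper's abstract, a_min = 2c²/Θ. A quick order-of-magnitude check (assuming Θ is the co-moving Hubble diameter, roughly 8.8e26 m; that value is an assumption here, not something stated in this thread):

```python
# Order-of-magnitude check of the minimum acceleration from the QI abstract:
# a_min = 2*c^2 / Theta, with Theta the co-moving Hubble diameter.
c = 2.998e8      # speed of light, m/s
theta = 8.8e26   # co-moving Hubble diameter, m (assumed value)

a_min = 2 * c**2 / theta
print(f"a_min = {a_min:.2e} m/s^2")   # ~2e-10 m/s^2
```

Whether one reads this as a derived constant or a number tuned to match the observed cosmic acceleration is exactly the dispute in this exchange.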
Your theory is sufficiently different from GR that for your theory to be correct, your statement about GR being wrong in principle at "high accelerations" would have to be correct. The problem is that GR has in fact made a number of predictions that have been confirmed by later experiments. Until and unless you can at least show that your theory can also replicate those results, all of your talk about GR being wrong and your theory being great amounts to nothing but misleading rhetoric.
"You have yet to actually demonstrate that yours is useful"
QI is already useful. It predicts galaxy rotation whereas no other theory can (without fudging). Of course, that use is quite academic I suppose, and QI is not yet directly useful in the sense that it can power a car or propel a spacecraft, but that is a future possibility.
You seem to be missing what useful was intended to mean in this context. It doesn't need to have immediate practical application (little of GR does), but it needs to provide value as a theory. Empirical models have their uses too, but they are different from a theory. Your claims so far are more at the level of an empirical model than a theory. You continuing to refer to dark matter as a "fudge factor" amounts to simply ignoring the details of the actual current model (lambda-CDM), which does a lot more than just predict galaxy rotation curves. Your theory honestly looks like just a fudge factor to me. I have already listed the basic expectations if you want to move your work from the level of limited empirical model to general theory; it is up to you to do them.*
*Clarification on this point for any who may need it: What I have asked from McCulloch is something that should be expected to take a while if he hasn't already done the work (which it appears he hasn't.) The only thing that should reasonably be expected in the near term is for him to tone down his claims until he has actually done the work to support them.

I think we need to give Dr McCulloch time and space to develop his theory. To expect him to deliver a theory that instantly meets all the criteria that fully developed theories satisfy is to set too high a standard. After all, even Einstein could not meet this standard.
Special relativity was published in September 1905 after many years of effort, including the contributions of Poincare and Lorentz. It was an incomplete description as it omitted gravity, and GR followed ten years later in 1915. Despite its beauty and power it was also incomplete. In fact it had a serious problem in that energy was not conserved. This is a flaw that you might have been tempted to use to reject GR had you been an interested engineer in the 1900s. As you know, Emmy Noether rescued GR in 1919.
So it took from circa 1900 to 1919 for GR to emerge as the description of reality that we understand today.
In a way we are very lucky. Via the internet we can observe the development of new theories as they are formed, something that Einstein and Noether were not exposed to.
While I am at it, my Dirac example is not as special a case as you suggest. I note below a brief list of phenomena that “fell out of the math” and could not be based on observation as, like antimatter, they were not described by earlier theories and were completely unanticipated by them.
Special Relativity:
- length contraction
- time dilation
- relativistic mass
- mass–energy equivalence
- a universal speed limit
General Relativity added:
- black holes
- singularities
- event horizons
- wormholes
- closed timelike curves
I will not make a list but just note that solving that irritating blackbody radiation problem at the end of the 19th Century led to a vast array of quantum mechanical phenomena that had not been observed before.
I think you might find it rewarding to read more about how science actually advances.
You may bridle at his criticism of GR, but I think he is no more critical than the more enquiring physicists were of the luminous aether theories that thrived before Special Relativity consigned them to history. In fact, if you read the history of that time there are strong parallels between the luminous aether debate and dark matter now. Dr McCulloch is far from alone in believing dark matter is an unhelpful cul-de-sac. Dr McCulloch's instincts I find to be sound. He is exploring the territory that current physics cannot explain in the search for a better theory. This is an attempt to be encouraged and, of course, critiqued; criticism is an essential part of the scientific method. But let's be realistic about what we can expect at this stage of the theory's development.
As for relevance, we are in the New Physics for Space Propulsion thread discussing Dr McCulloch’s theory. I think all the contributions are relevant.

I was asked to post the following link here as well, as it is relevant for this thread.
FYI
https://www.researchgate.net/publication/334987450_A_sceptical_analysis_of_Quantized_Inertia

I was asked to post the following link here as well, as it is relevant for this thread.
FYI
https://www.researchgate.net/publication/334987450_A_sceptical_analysis_of_Quantized_Inertia
It should be noted that Dr. McCulloch referred to that paper even on Twitter, when it was first published on viXra. That was an act of intellectual honesty.
Also, I have had the original article on QI superluminal travel in my hands and was asked for my opinion on it for the JSE. Dr. McCulloch included that implication of QI even though he did not need to. Who would put the FTL implication of his theory in print during its development? And which journal would publish such an implication? This approach deserves to be researched further. Mainstream theories have their issues too. This framework deserves research!
Regards

It should also be noted that he was the reviewer of that paper, so if he had wanted to he could have reviewed it negatively (blocking the publication of this paper in that journal) or radically influenced the content of the paper and its wording (that actually could have been good, as the paper includes two adjustments of the theory, which can influence it in special circumstances; but in the introduction section it says that they may invalidate the theory, which does not seem to be true; it is an exaggeration).

Also, I have had the original article on QI superluminal travel in my hands and was asked for my opinion on it for the JSE. Dr. McCulloch included that implication of QI even though he did not need to. Who would put the FTL implication of his theory in print during its development? And which journal would publish such an implication? This approach deserves to be researched further. Mainstream theories have their issues too. This framework deserves research!
Any honest scientist who actually has a theory that includes FTL would say so when they publish it; of course most would triple check their theory first, as the presence of FTL likely means that it does not match reality. Certain forms of FTL could potentially exist, and it is known that GR allows for such (though generally in unachievable situations), so journals would not necessarily reject a paper for this if it was addressed in an appropriate manner. Of course in this case, as I mentioned upthread, McCulloch has been using a journal known to act as a predatory journal, probably having no peer review even if they claim to. Publishing in known predatory journals is not recommended; it is just a way to fund scammers.

It should also be noted that he was the reviewer of that paper, so if he had wanted to he could have reviewed it negatively (blocking the publication of this paper in that journal) or radically influenced the content of the paper and its wording (that actually could have been good, as the paper includes two adjustments of the theory, which can influence it in special circumstances; but in the introduction section it says that they may invalidate the theory, which does not seem to be true; it is an exaggeration).
No, reviewers do not generally have the level of power you ascribe to them. You may be confused with editors, or people who actually work at the journal in question.
The paper is fully justified in claiming "Such flaws, if they do not invalidate, at least will require a major rethinking of the whole theory". This, by the way, is from the conclusion, not the abstract or the introduction. They found very fundamental problems, and results showing that things McCulloch has been treating as basic principles of his theory, such as a "minimum acceleration", do not actually exist. Based on this, absolutely no claim from McCulloch to date can be considered valid; all of the work needs to be redone.
What they show actually has the potential to resolve some of the issues with the theory that I brought up in this thread, as it does away with the fundamentally illogical situations caused by the insistence on a minimum acceleration. Among other things, it seems much more likely to be possible to write down a sensible set of conservation laws in this situation. It still remains McCulloch's responsibility to go through this work and update his theory and claims.

I did not say that they formally have that power, but they have that power in practice, because editors usually do not publish negatively reviewed papers. So, as I said, if he had wanted to he could have blocked this publication in that journal by reviewing it negatively. He did not do that, though.
He is aware that he needs to implement some corrections:
https://twitter.com/memcculloch/status/1175442826169069568
He claims they do not invalidate QI:
https://twitter.com/memcculloch/status/1159026047406546944
https://twitter.com/memcculloch/status/1157685617339379712

Also, I have had the original article on QI superluminal travel in my hands and was asked for my opinion on it for the JSE. Dr. McCulloch included that implication of QI even though he did not need to. Who would put the FTL implication of his theory in print during its development? And which journal would publish such an implication? This approach deserves to be researched further. Mainstream theories have their issues too. This framework deserves research!
Any honest scientist who actually has a theory that includes FTL would say so when they publish it; of course most would triple check their theory first, as the presence of FTL likely means that it does not match reality. Certain forms of FTL could potentially exist, and it is known that GR allows for such (though generally in unachievable situations), so journals would not necessarily reject a paper for this if it was addressed in an appropriate manner. Of course in this case, as I mentioned upthread, McCulloch has been using a journal known to act as a predatory journal, probably having no peer review even if they claim to. Publishing in known predatory journals is not recommended; it is just a way to fund scammers.
One of the editors of that journal is Tajmar; you have some guts to call him a scammer.

as the presence of FTL likely means that it does not match reality.
I am not surprised that he derived nonlocality, because his theory has its basis in QM and the Casimir effect. I am a proponent of the explanation of the Casimir effect by van der Waals forces (e.g. Nikolic (2016) @ https://arxiv.org/abs/1605.04143) because it works from a microscopic perspective and is a relativistic process. But when you apply the uncertainty principle to photons, locality should be violated at small distances, so I stay open.
Zlatan

Of course in this case, as I mentioned upthread, McCulloch has been using a journal known to act as a predatory journal, probably having no peer review even if they claim to. Publishing in known predatory journals is not recommended; it is just a way to fund scammers.
One of the editors of that journal is Tajmar; you have some guts to call him a scammer.
Are you claiming that Tajmar is an editor for Trade Science Inc journals, the place McCulloch has published his papers? I have not seen a list of their editors, but I doubt that. I am not the one who originally identified it as predatory, I just found a list:
A known problem with modern scientific publishing is that there are a large number of predatory journals out there that take advantage of the pressure for academics to publish and charge high fees for publication, often not performing proper peer review even if they claim to. Someone went to the trouble to compile a list and it was no surprise to find the one you are using is one of them: https://scholarlyoa.com/publishers/
I did not say that they formally have that power, but they have that power in practice, because editors usually do not publish negatively reviewed papers. So, as I said, if he had wanted to he could have blocked this publication in that journal by reviewing it negatively. He did not do that, though.
"He didn't actively do something extremely dishonest" is not high praise. If he had any criticisms of the paper he should have shared them during peer review, that is what it is for. Corrections during peer review are expected, not automatic rejection.
He is aware that he needs to implement some corrections:
...
He claims they do not invalidate QI:...
Those tweets do not give me confidence that he actually understands the conclusion of the paper. It doesn't even make sense to claim that the corrections are "untested." They are corrections of significant mathematical flaws in his theory, and they contradict things he has been claiming as fundamental results. His theory itself has already been experimentally disproven. This would be a great chance for him to reset, throw out all of his old predictions, and start over with something that has a non-zero chance of matching reality, but I get the impression he has no intention of doing so.

as the presence of FTL likely means that it does not match reality.
I am not surprised that he derived nonlocality, because his theory has its basis in QM and the Casimir effect. I am a proponent of the explanation of the Casimir effect by van der Waals forces (e.g. Nikolic (2016) @ https://arxiv.org/abs/1605.04143) because it works from a microscopic perspective and is a relativistic process. But when you apply the uncertainty principle to photons, locality should be violated at small distances, so I stay open.
Zlatan
Quantum nonlocality has special caveats that prevent it from being "true" FTL. McCulloch's replies in this thread make it clear that his FTL is frame independent, which means it absolutely is in the category of breaking causality. He is clearly not talking about quantum effects, but actual FTL motion of real particles.

:D
it absolutely is in the category of breaking causality.
Yes. So far all of his theory's achievements fall with this implication. Even with his shift to information theory I do not see how he could handle this problem, but I will continue to follow his research.
Zlatan

Are you claiming that Tajmar is an editor for Trade Science Inc journals, the place McCulloch has published his papers? I have not seen a list of their editors, but I doubt that.
Yes, Tajmar is an editor of the Journal of Space Exploration:
https://www.tsijournals.com/journals/journalofspaceexplorationeditors.html
I believe that McCulloch shared his criticism during peer review, although perhaps not sufficiently, IMHO.
It does make sense to claim that the corrections are "untested" from his point of view, because he said in one of his tweets that if his theory is corrected with Renda's corrections and does not agree with the data, then those corrections are wrong.

It does make sense to claim that the corrections are "untested" from his point of view, because he said in one of his tweets that if his theory is corrected with Renda's corrections and does not agree with the data, then those corrections are wrong.
That statement is simply untrue. His theory is wrong today, both because these corrections are not present and because it does not match experimental data. (I have listed other issues in this thread that have not been addressed as well.)
If his theory does not match experimental data when the corrections are applied, it does not mean that the corrections are wrong; it means that his theory is wrong. Concluding that the corrections are wrong is only possible by pointing out errors in the math of the corrections.

If his theory is wrong then why "the observed cutoff acceleration of galaxy rotation makes it obvious that quantised inertia is the cause. It's a smoking gun obvious to all who look at the data: http://physicsfromtheedge.blogspot.co.uk/2016/06/asmokinggunineverygalaxy.html "
https://twitter.com/memcculloch/status/1236339527779827714
https://twitter.com/memcculloch/status/1185567678003662848
His views on maths:
https://twitter.com/memcculloch/status/1219317672862736389
https://twitter.com/memcculloch/status/1237287901995700224
https://twitter.com/memcculloch/status/1146484641903386624

If his theory is wrong then why "the observed cutoff acceleration of galaxy rotation makes it obvious that quantised inertia is the cause. It's a smoking gun obvious to all who look at the data: http://physicsfromtheedge.blogspot.co.uk/2016/06/asmokinggunineverygalaxy.html "
Physics is a graveyard of models that can describe one phenomenon while getting everything else in the universe wrong. If the model explains one thing but requires physically and observationally false descriptions of reality to work, the model is inadequate. Confident assertions are not facts.

Does it get everything else in the Universe wrong?

Does it get everything else in the Universe wrong?
Frame-independent theories may as well have; reference frames are an experimental fact.

If his theory is wrong then why "the observed cutoff acceleration of galaxy rotation makes it obvious that quantised inertia is the cause. It's a smoking gun obvious to all who look at the data: http://physicsfromtheedge.blogspot.co.uk/2016/06/asmokinggunineverygalaxy.html "
Because that is nothing but a meaningless and unscientific assertion. It doesn't matter if QI fits that data, because so does lambda-CDM. His theory is wrong because it can't predict the lack of the Pioneer Anomaly. (He hasn't actually shown whether it matches for many other important tests, but that is irrelevant at this point.)
His views on maths:
In that first quote he is arguing against Galileo. One of these people is more generally respected than the other.
While it is important to keep in mind that math is just a tool, it is not just "invented"; there are things about it that are very fundamental. Making arguments like those in those tweets is something I have seen when someone doesn't like that math can be used to undeniably prove fundamental flaws in their pet "theory."

https://twitter.com/memcculloch/status/1238392132865572864
https://twitter.com/memcculloch/status/1238459942136356865

https://twitter.com/memcculloch/status/1238392132865572864
This is simply a lie. As he should know very well, the standard model in cosmology, lambda-CDM, does in fact describe things like dark matter and dark energy without conflict with observation. There are still things to learn about dark matter and a lot about dark energy, but his assertions that the standard model is falsified are simply baseless and wrong.
Meanwhile he just ignores that QI does not correctly predict the observed behavior of the Pioneer spacecraft, has not been compared with standard tests of GR, and cannot even be said to describe the behavior of balls on a pool table due to the lack of any consistent definition of basic conservation laws.
https://twitter.com/memcculloch/status/1238459942136356865
The user who provided a partial quote of my post blatantly misrepresented what I said. I was obviously not denying the experimental data, but rather the absurd assertion that it somehow proves QI, despite the fact that galaxy rotation curves are perfectly well predicted by lambda-CDM. Given this fact, it is simply unscientific to assert that "QI is the cause." It is logic equivalent to "the sky is blue" + "blueberries are blue" = "the sky is made of blueberries."
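For readers without the background, the rotation-curve discrepancy everyone here is arguing about is that the Newtonian circular speed, v = sqrt(G·M/r), falls off with radius once you are outside most of the visible mass, while observed galactic rotation curves stay roughly flat. A toy illustration (the mass value is a made-up round number, not data from any real galaxy):

```python
import math

# Toy illustration of the galaxy rotation problem (made-up numbers):
# Newtonian circular speed from the enclosed visible mass falls off
# roughly as 1/sqrt(r) outside the bulk of the mass, whereas observed
# rotation curves stay roughly flat out to large radii.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_visible = 1e41   # kg, rough visible mass of a galaxy (assumed)

for r_kpc in (5, 10, 20, 40):
    r = r_kpc * 3.086e19                 # kiloparsecs -> metres
    v = math.sqrt(G * M_visible / r)     # Newtonian circular speed
    print(f"r = {r_kpc:3d} kpc  ->  v = {v/1000:6.1f} km/s")
```

Dark matter, MOND, and QI are all attempts to close that gap; the dispute in this thread is over which attempt has earned the label "prediction."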

Anyone who considers dark matter and dark energy to be real things (despite them being only an ad-hoc hypothesis) is not being serious, and therefore further discussion with such a zealot is pointless, because nothing would change his/her mind.

Anyone who considers dark matter and dark energy to be real things (despite them being only an ad-hoc hypothesis) is not being serious, and therefore further discussion with such a zealot is pointless, because nothing would change his/her mind.
Dark energy is basically a placeholder: a single variable in GR, in a spot where Einstein himself thought there might be a free parameter (and then rejected, as his original reasoning for it was not good or supported by data, unlike the modern case for dark energy). Dark matter is far more detailed than that and explains much more than just galactic rotation curves (for example, gravitational lensing by galaxies).
Anyone who outright rejects hypotheses without even properly understanding what they are, and calls entire groups of scientists "zealots" (as you just did with astronomers and astrophysicists) clearly has no interest in or understanding of anything related to science.

Dark matter and dark energy are really pseudoscience now. They only pretend it is science, because this is their livelihood (a sinecure).
These "entities" have not been found despite decades of searching, and there is some evidence that falsifies dark matter:
http://physicsfromtheedge.blogspot.com/2017/10/darkmatterdoesnotexist.html
So there is no reason to believe that they are real. If we assume that they are not real, then how long would you want to wait for a detection of non-existing things? You have already been waiting for half a century; would 100 years suffice, or 200 perhaps?

Dark matter and dark energy are really pseudoscience now. They only pretend it is science, because this is their livelihood (a sinecure).
These "entities" have not been found despite decades of searching, and there is some evidence that falsifies dark matter:
The only thing you just proved is your ignorance. There is no reason to expect dark matter to be detectable other than through gravitational effects, so a lack of detection says nothing. That part about livelihood possibly explains McCulloch's behavior, but astronomers and astrophysicists would have MORE work to do and research to perform if some theory like QI were shown to be useful, making assertions like yours here self-defeating. It doesn't stop similar statements coming from every crackpot with a perpetual motion machine, though.
http://physicsfromtheedge.blogspot.com/2017/10/darkmatterdoesnotexist.html
That blog does not provide actual evidence; it provides hand-waving assertions, bad logic, and otherwise incorrect statements, some of which have already been addressed here, such as the incorrect claim about wide binaries.
So there is no reason to believe that they are real. If we assume that they are not real, then how long would you want to wait for a detection of non-existing things? You have already been waiting for half a century; would 100 years suffice, or 200 perhaps?
No reason except for the fact that lambda-CDM fits the data, and that despite years of supposedly working on it, McCulloch has failed to show that his theory is applicable anywhere outside of very specific cases, or to address the fundamental problems with it that I have described in this thread. On the other hand, a variety of observations, such as gravitational lensing, support the existence of dark matter. More evidence isn't really needed, though it would be nice to be able to point to a box filled with dark matter; its known properties are such that such a thing may forever be out of human capability, but that is no reason to deny the existing evidence. So how long are you going to continue to zealously defend the lone person yelling on Twitter about how scientists are in some sort of massive conspiracy against him, when he can't even demonstrate his theory passing widely accepted basic tests of matching reality?