
The Universe is Deterministic

The detectors can be 20 lightyears apart when they measure the particles,
I'm not willing to grant you that this has actually been tested, as far as the 20ly. ;-)

Nevertheless, if the Universe once was the size in the ballpark of one Planck length, and fully deterministic, the behavior of either detector would be a consequence of events that ultimately lead back to the BB. This would also hold true for detectors billions of lyrs apart, actually.

What if the Universe is some giant sort of a fractal structure?

Not saying that the Universe is ultimately deterministic, but a little skeptical of the claim that QM + Relativity disprove determinism.
 
Could you expand on this a bit?

Well, a pretty central element of logic is cause and effect. Certainly it's crucial to physics as we understand it. Physics is formulated in terms of differential equations, which you can think of as a machine that cranks the film forward by one frame. You give it the state of things now, it tells you what will be the state of things an instant later - and by re-applying it many times, the state of things at any amount of time later.
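That frame-cranking picture can be sketched in a few lines of code (my own toy example, not from the thread): a differential equation like the harmonic oscillator x'' = -x is advanced one instant at a time, and the same initial state always produces the same future.

```python
# Toy sketch of determinism in a differential equation: the harmonic
# oscillator x'' = -x, stepped forward one "frame" at a time.
# (Semi-implicit Euler; the function names and step size are my own choices.)

def step(x, v, dt=0.001):
    """Advance the state (x, v) by one instant dt."""
    v = v - x * dt      # acceleration: v' = -x
    x = x + v * dt      # velocity:     x' = v
    return x, v

def evolve(x, v, n_steps):
    for _ in range(n_steps):
        x, v = step(x, v)
    return x, v

# Re-applying the rule many times gives the state at any later time,
# and identical initial states always yield identical futures.
a = evolve(1.0, 0.0, 10_000)
b = evolve(1.0, 0.0, 10_000)
print(a == b)  # True: same input, same output
```

The point of the sketch is only that the "machine" is a pure function of the current state; nothing about the past or future enters except through that state.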

If information can travel ftl, then it can also travel backwards in time in relativistic theories (due to the relativity of simultaneity). But if info can travel back in time, the structure described in the paragraph above dissolves into a puddle of celluloid, because if the present can affect the past it can no longer determine the future (kill your own grandfather yada yada yada).

I'm not willing to grant you that this has actually been tested, as far as the 20ly. ;-)

Fair enough! It has been tested over meter distances, and the principle is the same regardless of distance. But I'll grant you that it's possible QM could be wrong somehow on long scales.

Nevertheless, if the Universe once was the size in the ballpark of one Planck length, and fully deterministic, the behavior of either detector would be a consequence of events that ultimately lead back to the BB. This would also hold true for detectors billions of lyrs apart, actually.

Actually even though the universe was once highly curved, according to standard cosmology it's not true that distant points on the sky are connected by a common past. It's basically because the highly curved and small phase lasted for very little time, and the smaller it got the less time it lasted, so even in the limit light didn't move very far.

But that's a bit of a derail, and we can't be sure what happened near or at the BB. So, question: is it possible that everything we measure today was determined by a past causal connection, including violations of Bell's inequality?

Perhaps so, yes - if the detectors and their operators always choose to align themselves along the right axis so that the results make it appear that QM is correct even though it's not.... it's like saying that everything we think we know about probability could be wrong, because it might all be planned out in advance. Or that evolution never happened, because god planted the fossils to make it look like it did. In other words I think such a thing is possible, but it requires some kind of freakishly precise predetermination, vastly more than in classical physics (because not only are things predetermined, they are predetermined in precisely the right way to make it look like Bell's inequality is violated).

In fact I'm not sure even this option is possible - there are versions of Bell's experiment where one single measurement suffices to rule out determinism (as opposed to his original treatment, which required violating a statistical inequality) - but I'll have to think about it.

What if the Universe is some giant sort of a fractal structure?

Not sure how that helps.

Not saying that the Universe is ultimately deterministic, but a little skeptical of the claim that QM + Relativity disprove determinism.

It doesn't - many worlds is the prime example. But it does add a lot to the debate by greatly restricting the options.
 
Perhaps so, yes - if the detectors and their operators always choose to align themselves along the right axis so that the results make it appear that QM is correct even though it's not....
I used Moire patterns as an analogy.
it's like saying that everything we think we know about probability could be wrong,
How does one prove "true randomness" of any particular data set? I used Moire patterns as an analogy, because they only become evident as the result of structured data ("stuff following rules") superimposed on the structure of a measuring device ("stuff following rules").

If "probability being wrong" merely means that, on the quantum scale, it is impossible to measure stuff at e.g. truly "random" intervals, i.e. that randomness on that level is an illusion... I'd rather get rid of probability than get rid of sanity.
because it might all be planned out in advance.
I'm afraid I'm losing you here... in the same way that a chemical reaction "might all be planned out in advance" ???
Or that evolution never happened, because god planted the fossils to make it look like it did.
Neither do I see how straw persons would be of any help here.
In other words I think such a thing is possible, but it requires some kind of freakishly precise predetermination, vastly more than in classical physics
Well, if determinism it is, then it isn't just somewhat accurate, but rather f'ing outrageously freakishly precise - by definition.

Moire pattern.
 
I'm afraid I'm losing you here... in the same way that a chemical reaction "might all be planned out in advance" ???

No, it's nothing like that.

Consider an experiment: an elementary particle decays into three photons which fly off along paths separated by 120 degrees. After traveling for 10 years through empty space each of the three photons gets detected. The experiment can be repeated, with an identical particle, identical detectors, etc. Each detector has a switch with two positions that determines what property of the photon is measured, so there are actually 8 (2*2*2) possible experiments corresponding to the 8 possible switch settings. Let's perform all 8 and collect the results.

What Bell and his successors demonstrated is that QM makes predictions for the results of those experiments that are incompatible with the following claim: that the results of the measurements are determined by the switch setting of each detector plus anything else about the state of the detector and photon it measures.

There is one caveat, however: we need to assume that when we repeat it, each instance of the experiment is identical except for possibly different switch settings.* In other words we have to assume that it is possible to actually repeat the experiment - to set up the same initial state and do the experiment over again, perhaps with the same switch settings or perhaps not.

If that's not the case then we cannot ever learn anything from doing any experiment, because every time we think we've repeated it we're actually doing something different. We could never know if things are deterministic, because we could never do the same thing twice and see if we get the same result both times. But if that were true, why is it that when we repeat experiments we do get consistent results?

That would already be very hard to swallow. But what's even worse is that whatever is actually determining the results must also be determining the choices we make for the various switches, because otherwise we'd find results inconsistent with QM and with previous experiments (and we don't when this experiment is done). So this mysterious force of yours is not only changing the rules for each individual experiment, it's also determining the choices of these people 20 lightyears apart, and correlating them in precisely the right way to get the results to appear to be consistent with QM and inconsistent with local determinism.
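The three-photon setup described above is a GHZ-type experiment (Mermin's version of Bell's argument), and it can be checked by brute force. The following sketch assumes the standard textbook choices - a GHZ state and X or Y measured at each detector depending on the switch - and compares the quantum predictions against every possible local deterministic assignment of outcomes:

```python
# GHZ/Mermin sketch: quantum predictions for a three-particle state vs.
# every local-deterministic assignment of outcomes. The state and the
# measured operators are the standard textbook choices, not from the thread.
import numpy as np
from itertools import product

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def kron3(a, b, c):
    return np.kron(np.kron(a, b), c)

# GHZ state (|000> + |111>)/sqrt(2)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

# Quantum expectation values for four of the joint switch settings.
settings = {"XXX": kron3(X, X, X), "XYY": kron3(X, Y, Y),
            "YXY": kron3(Y, X, Y), "YYX": kron3(Y, Y, X)}
qm = {name: np.real(ghz.conj() @ op @ ghz) for name, op in settings.items()}
print(qm)  # XXX -> +1, the other three -> -1

# Local determinism: each detector's outcome is a fixed +/-1 for each of
# its two switch positions. Enumerate all 2^6 possible assignments.
def matches_qm(x1, y1, x2, y2, x3, y3):
    return (x1 * x2 * x3 == 1 and x1 * y2 * y3 == -1
            and y1 * x2 * y3 == -1 and y1 * y2 * x3 == -1)

hits = [v for v in product([1, -1], repeat=6) if matches_qm(*v)]
print(len(hits))  # 0: no local-deterministic assignment reproduces QM
```

The contradiction is algebraic: for any fixed ±1 values, the product (x1·y2·y3)(y1·x2·y3)(y1·y2·x3) equals x1·x2·x3, so the three settings cannot all give -1 while XXX gives +1. That is the sense in which a single run, not just a statistical inequality, conflicts with local determinism.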

Neither do I see how straw persons would be of any help here.

As I've tried to explain, it's not a straw person at all. The degree of conspiracy necessary to fool us into believing QM when in fact everything was determined in the past is FAR beyond the degree required to plant dinosaur fossils, because there are many more data points for physics experiments than there are dino fossils.

So in other words yes, it's possible, but only if we live in the matrix.

*Actually they don't have to be perfectly identical, but they do have to be close enough.
 
<snip>



It does allow for superluminal info transfer. Not even its creator disputes that, as you can see e.g. here, section IV. Cramer thinks that's OK and doesn't lead to inconsistencies, but I think he's wrong.

Apparently no more information than the phase velocity of a wave packet can also carry superluminally. What inconsistencies do you think he is wrong about?

There is an analogy here to the situation with the group and phase velocities of electromagnetic waves in a wave guide [15]: The phase velocity can exceed c but cannot carry macroscopic information; the group velocity represents the speed of travel of macroscopic information but is never greater than c. If the phase of the wave can be considered to carry microscopic information (e.g., phase information which will affect interference phenomena) then its velocity represents a violation of strong causality. In any case, weak causality is not violated. We wish to emphasize that while there is abundant experimental evidence in support of the principle of weak causality, there is at present no experimental evidence for strong causality. Thus, strong-causality violations are not a compelling reason for rejecting any particular approach.



Also one thing to bear in mind in all this is that we already have a relativistic formulation of QM that works extraordinarily well, and it's not consistent with the transactional interpretation. I don't see any mention of quantum field theory anywhere on Cramer's site (perhaps I missed it), and that's extremely suspicious. QFT is the most successful theory in the history of science and it bears directly on this question; it cannot be simply ignored in these discussions.

Interesting point and I can’t say that I have noted any specific references to QFT in his writings. Could you be more specific about what aspects of QFT are not consistent with the transactional interpretation, other than the already mentioned problems of free emission and self interaction?
 
Is many worlds floated as something that's even remotely testable?

Just lost a long reply I wrote to this.... short answer is potentially yes, using mesoscopic systems like quantum computers plus a better understanding of decoherence on the theory/numerical side.

Apparently no more information than the phase velocity of a wave packet can also carry superluminally.

Not so. Superluminal phase velocity carries zero info superluminally, because the wavefront (the leading edge of the wave) always propagates below or at light speed. It makes no difference what the pattern of peaks and valleys behind the leading edge is doing if the leading edge is moving at lightspeed.
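The waveguide case Cramer refers to can be checked numerically. With the standard waveguide dispersion relation ω² = ω_c² + c²k², the phase velocity exceeds c while the group velocity stays below it, and their product is exactly c². A small sketch (my own, with arbitrary parameter values):

```python
# Waveguide dispersion sketch: omega^2 = omega_c^2 + (c*k)^2.
# Phase velocity omega/k exceeds c; group velocity d(omega)/dk stays below c;
# their product is exactly c^2. Parameter values here are arbitrary.
import math

c = 1.0          # work in units where the speed of light is 1
omega_c = 2.0    # waveguide cutoff frequency (made-up value)

def omega(k):
    return math.sqrt(omega_c**2 + (c * k)**2)

k = 3.0
v_phase = omega(k) / k           # analytic phase velocity
v_group = c**2 * k / omega(k)    # analytic group velocity d(omega)/dk

print(v_phase > c)   # True: the pattern of peaks moves faster than light
print(v_group < c)   # True: the signal (the wavefront) does not
print(abs(v_phase * v_group - c**2) < 1e-12)  # True: v_p * v_g = c^2
```

Only the wavefront speed matters for information transfer, which is why the superluminal v_phase here carries nothing.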

But the TI says waves really propagate back in time.

What inconsistencies do you think he is wrong about?

He thinks microscopic causality violation is OK because it doesn't lead to macroscopic violations. I think that's almost certainly wrong.

Interesting point and I can’t say that I have noted any specific references to QFT in his writings. Could you be more specific about what aspects of QFT are not consistent with the transactional interpretation, other than the already mentioned problems of free emission and self interaction?

QFT doesn't have waves going back in time. Such things are neither allowed nor necessary. The TI is based on a pre-QFT idea of Feynman and Wheeler, which they abandoned because it doesn't work, and because QFT does.
 
Not so. Superluminal phase velocity carries zero info superluminally, because the wavefront (the leading edge of the wave) always propagates below or at light speed. It makes no difference what the pattern of peaks and valleys behind the leading edge is doing if the leading edge is moving at lightspeed.

But the TI says waves really propagate back in time.

Yes it does, but doesn't non-locality also dictate some apparently instantaneous transfer of information, at least at the micro level?


He thinks microscopic causality violation is OK because it doesn't lead to macroscopic violations. I think that's almost certainly wrong.

Well, I do respect your opinion, Sol, but I am a skeptic. Certainly violations of conservation of momentum and energy are not unheard of on the scale of uncertainty. Likewise, from my understanding, causality tends to become a bit blurred in virtual interactions and on the Planck scale. Particularly in the scenario where an electron encounters a virtual positron, producing a photon of gamma radiation that then becomes a virtual positron-electron pair; that electron then being the real one which may eventually be detected, and the virtual positron being the cause of the original annihilation event.


QFT doesn't have waves going back in time. Such things are neither allowed nor necessary. The TI is based on a pre-QFT idea of Feynman and Wheeler, which they abandoned because it doesn't work, and because QFT does.

So what specifically precludes them such that they are not allowed? Non-locality seems to have put the kibosh on at least some micro information not being able to at least apparently travel faster than light, and sum over histories along with virtual interactions seems to have done the same for no apparent micro causality violations.
 
Yes it does, but doesn't non-locality also dictate some apparently instantaneous transfer of information, at least at the micro level?

Yes, that's my point. QFT (and QM as its non-relativistic limit) are strictly and absolutely local theories.

Certainly violations of conservation of momentum and energy are not unheard of on the scale of uncertainty.

What?? You're mistaken. As far as we know experimentally, and in all theories of physics, energy and momentum are always conserved at every point all the time.

Likewise, from my understanding, causality tends to become a bit blurred in virtual interactions and on the Planck scale. Particularly in the scenario where an electron encounters a virtual positron, producing a photon of gamma radiation that then becomes a virtual positron-electron pair; that electron then being the real one which may eventually be detected, and the virtual positron being the cause of the original annihilation event.

That's not a valid process.

So what specifically precludes them such that they are not allowed? Non-locality seems to have put the kibosh on at least some micro information not being able to at least apparently travel faster than light, and sum over histories along with virtual interactions seems to have done the same for no apparent micro causality violations.

I don't understand the question. You seem to be under the misapprehension that QM/QFT is non-local. It's not.
 
What?? You're mistaken. As far as we know experimentally, and in all theories of physics, energy and momentum are always conserved at every point all the time.


This. I can understand where people get the idea that it isn't, but read
http://math.ucr.edu/home/baez/physics/Quantum/virtual_particles.html
part 3

Some descriptions of this phenomenon instead say that the energy of the system becomes uncertain for a short period of time, that energy is somehow "borrowed" for a brief interval. This is just another way of talking about the same mathematics. However, it obscures the fact that all this talk of virtual states is just an approximation to quantum mechanics, in which energy is conserved at all times. The way I've described it also corresponds to the usual way of talking about Feynman diagrams, in which energy is conserved, but virtual particles can carry amounts of energy not normally allowed by the laws of motion.

That is precisely why I referred to “on the scale of uncertainty” but perhaps I should have worded it better.


Yes, that's my point. QFT (and QM as its non-relativistic limit) are strictly and absolutely local theories.


I don't understand the question. You seem to be under the misapprehension that QM/QFT is non-local. It's not.

Precisely my point: "QFT (and QM as its non-relativistic limit) are strictly and absolutely local theories", but they give up determinism for a probabilistic approach. As I said before, from my understanding (in some interpretations) causality breaks down at the Planck scale, and QFT fails to incorporate the influence of gravity on that scale. A non-local (or at least somewhat non-local) theory that permits micro causality violations may overcome this problem as long as macro causality can be maintained, and thus QFT above the Planck scale and QM above the non-relativistic limit.

As I said, I respect your opinion and agree that this violation of micro causality while maintaining macro causality might "almost certainly be wrong". However, it does provide an alternative, and we may find that we need to give up a bit of both the deterministic and local-only aspects to finally find a fully coherent theory.

I had once proposed an intermediate Absorber/Emitter principle that incorporates uncertainty and virtual interactions, hopefully resolving the issues of self-interaction and free emission for absorber theory and bringing a more 'local' aspect (if you will) to the non-locality inherent in absorber theory. However, I still need to develop a more formal interpretation of that principle, as well as some, well, formalism. Again, that too might "almost certainly be wrong".



That's not a valid process.

Well that was my understanding of a possible virtual particle self interaction for a traveling electron, but again I could be wrong.

I think that dang crow stole my hat again.
 
As I said before, from my understanding (in some interpretations) causality breaks down at the Planck scale, and QFT fails to incorporate the influence of gravity on that scale. A non-local (or at least somewhat non-local) theory that permits micro causality violations may overcome this problem as long as macro causality can be maintained, and thus QFT above the Planck scale and QM above the non-relativistic limit.

The transactional interpretation has nothing to do with gravity. Moreover, the only theory of quantum gravity we have - string theory - is actually just as local as QFT and causal both micro- and macroscopically, and it's inconsistent with the transactional interpretation for the same reasons QFT is.

However it does provide an alternative and we may find that we need to give up a bit of both deterministic and local only aspect to finally find a fully coherent theory.

What is it you find not "coherent" in our current description?

Well that was my understanding of a possible virtual particle self interaction for a traveling electron, but again I could be wrong.

I guess the diagram you have in mind is this one? That's just an electron that emits and then later reabsorbs a virtual photon.
 
The transactional interpretation has nothing to do with gravity. Moreover, the only theory of quantum gravity we have - string theory - is actually just as local as QFT and causal both micro- and macroscopically, and it's inconsistent with the transactional interpretation for the same reasons QFT is.

Where did I say the transactional interpretation specifically refers to gravity? String theory is “the only theory of quantum gravity we have”? Sorry, I must have missed that announcement.


What is it you find not "coherent" in our current description?

A lack of coherence between two of those descriptions, general relativity and QFT.




I guess the diagram you have in mind is this one? That's just an electron that emits and then later reabsorbs a virtual photon.

Nope, it was specifically as I described before. I'm sure it was in QED, but I lent my copy to someone and have not gotten it back. I have seen it referenced elsewhere on the web but unfortunately cannot seem to find it; if I can, I will post it.
 
String theory is “the only theory if quantum gravity we have”? Sorry, I must have missed that announcement.

Announcement?

A theory of quantum gravity needs to have two properties:

(1) it should be quantum; i.e. it should contain a parameter, h, which determines the degree to which quantities like angular momentum are quantized.

(2) it should be gravity; i.e. in the limit h->0 it should reduce to general relativity, or something very similar (like GR plus some other stuff).

The only theory which has both properties is string theory. "Loop quantum gravity" doesn't satisfy (2), and canonical quantum gravity doesn't satisfy (1).

A lack of coherence between two of those descriptions, general relativity and QFT.

I really don't think we need to bring gravity into the question asked in the OP. We're perfectly free to discuss a world in which Newton's constant is zero.

Nope, it was specifically as I described before. I'm sure it was in QED, but I lent my copy to someone and have not gotten it back. I have seen it referenced elsewhere on the web but unfortunately cannot seem to find it; if I can, I will post it.

OK.
 
If you want to make the universe both causal and consistent with QM, Bell proved you must do one of two things:

(1) allow instantaneous interactions between particles. Any such theory can and will allow transmission of information faster than light. Therefore if relativity is correct, it can also be used to transmit information back in time, and the theory is not consistent. If relativity is not correct it might be OK.

(2) give up a more fundamental postulate about reality; for example, give up the idea that there was only one result to the experiment. My favorite example, and the one I think is probably correct, is the so-called "Many Worlds Interpretation".

Is the MWI deterministic? It is in the sense that the wavefunction (which is the full description of the world) evolves deterministically. But on the other hand it is (almost?) completely impossible to predict the result you will observe in some experiments. So in any functional sense it is non-deterministic. Perhaps it's best thought of as analogous to chaos: deterministic but not fully predictive.
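That chaos analogy can be made concrete with a standard example (my sketch, not from the thread): the logistic map is a fully deterministic rule, yet two starting points differing by 10⁻¹⁰ soon disagree completely, so determinism buys essentially no predictive power.

```python
# Deterministic but not predictive: the logistic map x -> r*x*(1-x) at r = 4
# is fully chaotic. Two nearly identical starting points soon disagree,
# even though each trajectory is exactly determined by its initial value.

def logistic(x, n, r=4.0):
    for _ in range(n):
        x = r * x * (1.0 - x)
    return x

x0 = 0.3
a = logistic(x0, 100)
b = logistic(x0 + 1e-10, 100)

# Each run is perfectly repeatable (determinism)...
print(logistic(x0, 100) == a)   # True
# ...but prediction from slightly imperfect knowledge fails (chaos):
print(abs(a - b))               # typically order 1, nothing like ~1e-10
```

The parallel is loose but useful: like the MWI wavefunction, the rule is deterministic; what you can actually forecast from accessible information is not.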

Isn't there a third alternative -- Bohmian mechanics? (Or would Bohm fall under your alternative (1)?)
http://plato.stanford.edu/entries/qm-bohm/#o
 
I guess the diagram you have in mind is this one? That's just an electron that emits and then later reabsorbs a virtual photon.

But a positron can be seen as an electron going back in time, right? And if we want to "explain" where the photon came from, we could argue that it was from the collision of an electron and a positron. Likewise, we could say that the electron and positron were created by gamma ray pair production. Of course we might say this explanation should be ignored since it seems to violate causality, but is it totally useless? Why else would a photon (even a virtual one) appear if not from an annihilation event?

- Dr. Trintignant
 
Isn't there a third alternative -- Bohmian mechanics? (Or would Bohm fall under your alternative (1)?)
http://plato.stanford.edu/entries/qm-bohm/#o

As you say, it falls under (1).

But a positron can be seen as an electron going back in time, right?

No. There is a formal similarity between antiparticles and particles going back in time, due to the CPT symmetry of particle physics (T is time reversal, CP turns left handed particles into right handed anti-particles). But an electron is an electron and a positron is a positron, they both have positive energy and they both propagate forward in time.

Of course we might say this explanation should be ignored since it seems to violate causality, but is it totally useless?

Yes, I would say so.

Why else would a photon (even a virtual one) appear if not from an annihilation event?

If a real electron annihilates with a real positron, it cannot produce one photon: it must produce at least two.

Apart from that, the fundamental interaction in QED is the emission or absorption of photons by electrons and positrons. While it's true that the same interaction plays a role in annihilation, that's not really what it is. It's just the fact that charge is a source for EM fields.
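The "at least two photons" claim follows from four-momentum conservation: in the center-of-mass frame the electron-positron pair has zero total momentum but energy 2mc², while any single photon must satisfy E = |p|c and so has zero invariant mass. A quick numerical check (my own sketch, natural units):

```python
# Why real e+ e- annihilation cannot yield a single photon: four-momentum
# bookkeeping in the center-of-mass frame, natural units (c = 1), with the
# electron mass set to m = 1 for simplicity.
import math

m = 1.0
# Electron and positron at rest in the CM frame: total four-momentum
E_total, p_total = 2 * m, 0.0

# The system's invariant mass is sqrt(E^2 - p^2) = 2m, but a photon's
# invariant mass is always exactly 0, so one photon can never match.
system_invariant_mass = math.sqrt(E_total**2 - p_total**2)
print(system_invariant_mass)   # 2.0, not 0: a single photon is impossible

# Two back-to-back photons of energy m each conserve both E and p:
E1, p1 = m, +m
E2, p2 = m, -m
print(E1 + E2 == E_total and p1 + p2 == p_total)  # True
```

The same bookkeeping is why a *virtual* photon in a diagram can be a single line: internal lines are not required to sit on the E = |p|c mass shell.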
 
But a positron can be seen as an electron going back in time, right? And if we want to "explain" where the photon came from, we could argue that it was from the collision of an electron and a positron. Likewise, we could say that the electron and positron were created by gamma ray pair production. Of course we might say this explanation should be ignored since it seems to violate causality, but is it totally useless? Why else would a photon (even a virtual one) appear if not from an annihilation event?

- Dr. Trintignant

That is basically how I recall it being explained in Feynman’s QED book; the virtual positron is (or was, in that example) considered an electron moving backwards in time.


If a real electron annihilates with a real positron, it cannot produce one photon: it must produce at least two.

Would that also be the case for a virtual positron real electron annihilation?

http://www.vectorsite.net/tpqm_13.html

Please see the diagram titled “Feynman Diagrams / Charge Conjugation” in the above link.

ETA:
Here is a reference to the virtual pair real electron interaction relating to the latter part of the example I mentioned: that the virtual electron of the virtual pair becomes the real electron due to the virtual positron's annihilation with the original real electron.

http://en.wikipedia.org/wiki/Electron#Virtual_particles

A comparable shielding effect is seen for the mass of the electron. The total rest energy of the electron consists of the mass-energy of the "bare" particle plus the energy of the surrounding electromagnetic field. In classical physics, the energy of the field is dependent upon the size of the charged object, which, for a dimensionless particle, results in an infinite energy. However, because of vacuum fluctuations, allowance must be made for the interaction with virtual electron–positron pairs, when the virtual positron annihilates the original electron causing the virtual electron to become a real electron. This interaction creates a negative energy imbalance that counteracts the radius-dependency of the electric field.[77] The total mass is referred to as the renormalized mass, because a mathematical technique called renormalization is used by physicists to relate the observed mass and bare mass of the electron. This method replaces the terms used to compute the mass with the actual mass found experimentally, thereby avoiding problems with mathematical divergences in the formulas.[78]

The example I mentioned before was a fairly common example in Feynman diagrams, particularly referring to the path integral, ten or so years ago, but I guess that has changed.
 
Would that also be the case for a virtual positron real electron annihilation?

No.

The example I mentioned before was a fairly common example in Feynman diagrams, particularly referring to the path integral, ten or so years ago, but I guess that has changed.

Nothing has changed, but the process we're discussing is not acausal, nor does it involve anything going back in time.

In fact I couldn't find the text you quoted anywhere. Perhaps the wiki has changed? The closest I found was this, which is accurate:

While an electron-positron virtual pair is in existence, the coulomb force from the ambient electric field surrounding an electron causes a created positron to be attracted to the original electron, while a created electron experiences a repulsion. This causes what is called vacuum polarization. In effect, the vacuum behaves like a medium having a dielectric permittivity more than unity. Thus the effective charge of an electron is actually smaller than its true value, and the charge decreases with increasing distance from the electron.[74][75] This polarization was confirmed experimentally in 1997 using the Japanese TRISTAN particle accelerator.[76] Virtual particles cause a comparable shielding effect for the mass of the electron.[77]
 

So then the annihilation of a virtual positron and real electron can produce only one photon?


Nothing has changed, but the process we're discussing is not acausal, nor does it involve anything going back in time.

Well then, since the processes “we're discussing” all depend upon virtual particles, in this case virtual electron positron pair production, what then causes that specific virtual pair production?

In fact I couldn't find the text you quoted anywhere. Perhaps the wiki has changed? The closest I found was this, which is accurate:

Certainly I do not see the fact that the article, and specifically the text I quoted, was changed just this morning as being ‘acausal‘. Nor do I find your claim that you “couldn't find the text you quoted anywhere” to be significant, since you could have simply looked at the previous version even after this morning's revision.

http://en.wikipedia.org/w/index.php?title=Electron&diff=313168826&oldid=313131522

Or you could even just view the reference cited (#77) by this morning's revision and the previous version of that article. On the top of page three, the diagram to the right is precisely the interaction I was referring to as

Another contribution to the electron self-energy due to the Fluctuation of the vacuum.

And that reference gives the significance of this self energy contribution much the same as the wiki article before this morning's revision.

Therefore, the classical electrodynamics indeed does hit its limit of applicability at this distance scale, much earlier than 2.8 × 10^-13 cm as was exhibited by the problem of the fine cancellation above. Given this vacuum fluctuation process, one should also consider a process where the electron sitting in the vacuum by chance annihilates with the positron and the photon in the vacuum fluctuation, and the electron which used to be a part of the fluctuation remains instead as a real electron (Fig. 2, right). V. Weisskopf [5] calculated this contribution to the electron self-energy, and found that it is negative and cancels the leading piece in the Coulomb self-energy exactly:


http://arxiv.org/PS_cache/arxiv/pdf/0709/0709.3041v1.pdf


Indeed nothing has changed from when I first mentioned that particular interaction even if some wiki article has, but thanks anyway, Sol, for helping me find the diagram I was referring to.
 
