EPR, Aspect, Bell, and Understanding Quantum Weirdness

Perhaps you missed my point. There is no wave "created" by the particle. The wave is the particle, and vice versa. Anything the wave does, the particle does. Regardless of the arguments Schneibster is having with the others, this is one thing I think he has right. He calls them "quanta", which is not strictly accurate (the word basically means "a discrete amount"); I prefer the word "wavicle". The point is, you have to stop thinking about particles and waves. You have to think of something that can behave as either a particle or a wave depending on the situation.

I suspect telling people about sum over histories could be helpful in getting their heads around the wave-icle bit of quantum physics. Okay, it's a little mind-bending but not so bad.
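For what it's worth, here's a toy numerical sketch of the sum-over-histories picture for a double slit (all the geometry numbers are invented for illustration): each possible path contributes a complex phase, and summing the phases gives the interference pattern.

```python
import numpy as np

# Toy "sum over histories" for a double slit. Each history is a straight
# path from a slit to a point on the screen; each contributes a phase
# exp(i * k * path_length). All lengths are in arbitrary made-up units.
wavelength = 1.0
k = 2 * np.pi / wavelength
slit_sep = 5.0                    # distance between the two slits
L = 100.0                         # slit-to-screen distance
x = np.linspace(-30, 30, 601)     # positions along the screen

# Path lengths from each slit to each screen position (the equal
# source-to-slit legs only add a common phase, so they are dropped).
r1 = np.sqrt(L**2 + (x - slit_sep / 2) ** 2)
r2 = np.sqrt(L**2 + (x + slit_sep / 2) ** 2)

amplitude = np.exp(1j * k * r1) + np.exp(1j * k * r2)  # sum over the two histories
intensity = np.abs(amplitude) ** 2

# The intensity oscillates between roughly 0 and 4x a single path's:
# the interference fringes, from nothing but adding phases.
print(round(intensity.max(), 2), round(intensity.min(), 2))
```

With more paths than just two straight ones, the same phase-summing idea reproduces diffraction as well; two paths is just the minimal case.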

Best not to tell them that precise position is only one way of looking at what's going on out of... er... several... ways of looking at that, though.

And of course, if this experiment is correct you have to think of them as something that can behave as both at the same time.

Going by the Afshar experiment, we will never have to think this. ;)

The experiment is roughly equivalent to finding a particle on the left of the interference pattern on the screen in the two slit experiment and then claiming we know which slit the particle went through because if it's on the left, it must have gone through the left slit.

And this is obviously not the case.
 
If a wave/photon is given off by something, it will only be given off for a short time; this is what I mean by discrete. Radiation is not given off continuously; it is given off in short pulses, and each of these pulses can be called a photon or wave train/packet, depending on how you look at it. The length of the wave train is simply the time it is emitted multiplied by the speed of light.


Yes, but what about this is specific to quantum mechanics? In classical electromagnetism too, an object can emit radiation for a short period of time and then stop emitting it.

3 is finite. 4 is finite. But 3.5 is finite, too. And so is the square root of 2, and pi, etc. Do you see what I mean?

Space can have as many dimensions as it likes, but light, and all particles and waves, travel in straight lines in one of them at any given time. In fact, light requires at least 3 dimensions since it is composed of oscillating electric and magnetic fields. The fields are at right angles to each other and are both at right angles to the direction of travel. Any less than 3 dimensions and this couldn't work.


Waves travel in one direction at any given time?

A wave in three dimensions spreads out. It travels in lots of different directions, all at once.
 
Going by the Afshar experiment, we will never have to think this. ;)

The experiment is roughly equivalent to finding a particle on the left of the interference pattern on the screen in the two slit experiment and then claiming we know which slit the particle went through because if it's on the left, it must have gone through the left slit.

And this is obviously not the case.


Yes, I agree. That's how it looks to me, too.
 
The experiment is roughly equivalent to finding a particle on the left of the interference pattern on the screen in the two slit experiment and then claiming we know which slit the particle went through because if it's on the left, it must have gone through the left slit.

And this is obviously not the case.
I don't want to take a position on what's happening, but I don't think it's necessarily obvious.

I also recognize that the traditional understanding is that the photon/wave are a duality, i.e., that they exist simultaneously. But, Afshar's experiment makes me wonder if the tradition may be about to change.

We know that the photons are sufficiently far from each other (10 km), that no two photons are between the plane of the slit openings and the wire grid simultaneously (unless we presume that the photon splits and enters both slits simultaneously). And we know that each photon hits only one detector (although we can't predict which one in advance -- which is why things are still uncertain). But, in between the plane of the slit screen and the lens, there is definitely an interference pattern -- otherwise there would be a scattering effect from the grid which is not recorded with both slits open.

So, either (1) the photon enters one uncertain slit, while the wave enters both, thereby creating the interference pattern, passes relatively undisturbed through the wire grid, is refocused by the lens and then eventually strikes only one detector, or (2) the photon/wave enters both slits, creates the interference pattern, passes relatively undisturbed through the wire grid, is refocused by the lens, and eventually collapses on one uncertain detector.

Either way, the uncertainty remains, but I'm thinking that there is something new to be learned from what is taking place between the plane of the slit openings and the detectors.
 
Waves travel in one direction at any given time?

A wave in three dimensions spreads out. It travels in lots of different directions, all at once.

It is often convenient to use plane waves as a basis set for electromagnetic waves (meaning anything else can be expressed as a superposition of such plane waves), and each of these plane wave bases do have only a single direction (though their superposition need not). But you can also form basis sets using spherical harmonics, for example, which propagate outwards in many directions. On a practical level, whether or not you want to associate a single propagation vector with a particular photon depends quite a bit on what basis you want to work in.
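To put a small number on that basis-choice point (a 2D sketch with invented parameters): summing unit-amplitude plane waves, each with a single propagation direction, uniformly over all angles gives a field that depends only on distance from the origin, i.e., something spreading in every direction at once, built entirely out of single-direction waves.

```python
import numpy as np

# Superpose plane waves exp(i k . r), each with one propagation
# direction, uniformly over all directions in 2D. The result is
# isotropic (proportional to the Bessel function J0(k*r)), even though
# every component wave travels in a single direction.
k = 2 * np.pi
angles = np.linspace(0, 2 * np.pi, 360, endpoint=False)
kx, ky = k * np.cos(angles), k * np.sin(angles)

def field(x, y):
    # mean over all plane-wave components at the point (x, y)
    return np.exp(1j * (kx * x + ky * y)).mean()

# Same distance from the origin, three different directions:
a = field(1.0, 0.0)
b = field(0.0, 1.0)
c = field(1 / np.sqrt(2), 1 / np.sqrt(2))
print(np.allclose([a, b], [b, c], atol=1e-9))  # the three values agree
```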
 
Heck, I like you all. I am glad to read what Schneibster writes and just as glad to read what Pragmatist, Ziggurat and Cuddles write. Multiple opinions seem best for QM.

Where is Epekeke?
 
As I am not concerned with historical accuracy, I will limit the issues to physical ones and give my perspective on the Afshar experiment. Since some of the issues being discussed involve modern and ongoing research, many questions do not have definite answers.

The quantum theory is a theory that energy doesn't just flow out continuously from objects that emit it, or into objects that absorb it; instead, it comes in little "packets," of a determined size, one at a time.
See next.
So the quantum theory is the theory that all matter and all energy are made up of indivisible elementary particles called "quanta," and that all of physics can be explained in terms of interactions of these quanta.
Quanta are not elementary particles; rather, the energy of all elementary particles must possess whole-number units of energy. Quanta are therefore not emitted one at a time but in whole-number units. Some theoretical work involves extending this idea of quantization to space and time. Although defining Planck units of space and time is trivial, the relevance is empirically limited at the moment. One promising approach is "Doubly Special Relativity".

OK, so Heisenberg proposed that there were certain parameters of quanta that could not be simultaneously measured.
The uncertainty relation does not say you can't measure conjugate parameters simultaneously. It states that if you measure one of the variables beyond a certain accuracy, you are limited in the accuracy with which you can know the other variable. Michael Hall actually reformulated this principle with a stronger relation called "Exact Uncertainty". He was even able to derive the Schrödinger equation directly from it. Under the Copenhagen interpretation (CI) and the collapse of the wave function, you are not supposed to be able to observe both wave and particle aspects at the same time.

At first, he was a proponent of the idea that this was because the measurement of one quantity would disturb the measurement of another, but soon the math told him and others that in fact, those other values simply didn't exist. It had to be that way.
Though it is true that the measurement was not the source of the uncertainty, that does not mean that the conjugate variable does not exist. Under the CI, which is an empirically valid ontology attached to QM, these conjugate variables don't have values before being measured. The ontology is not the same as the theory and is not needed by QM to make predictions. There are other ontologies that work also. There is an exact quantitative analogy in classical conjugate variables, such as the frequency spectrum of sound at a single moment in time.
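That classical analogy can be made quantitative with a quick sketch (arbitrary units): a pulse confined to a short time necessarily has a broad frequency spectrum, with Gaussian pulses saturating the bound Δt·Δf = 1/(4π).

```python
import numpy as np

# RMS widths in time and frequency for Gaussian pulses of various
# durations. Classically, dt * df >= 1/(4*pi), an exact analogue of the
# quantum uncertainty relation for the conjugate pair (time, frequency).
def widths(sigma_t, n=2**14, T=200.0):
    t = np.linspace(-T / 2, T / 2, n, endpoint=False)
    pulse = np.exp(-t**2 / (2 * sigma_t**2))
    spec = np.abs(np.fft.fft(pulse)) ** 2
    freq = np.fft.fftfreq(n, d=T / n)
    p_t = pulse**2 / np.sum(pulse**2)     # normalized intensity in time
    p_f = spec / np.sum(spec)             # normalized power spectrum
    dt = np.sqrt(np.sum(p_t * t**2))      # rms duration
    df = np.sqrt(np.sum(p_f * freq**2))   # rms bandwidth
    return dt, df

for sigma in (0.5, 1.0, 2.0):
    dt, df = widths(sigma)
    print(round(dt * df, 4))   # ~0.0796 = 1/(4*pi), for every width
```

Squeeze the pulse in time (smaller sigma) and the spectrum broadens in exact compensation; the product never drops below 1/(4π).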

This debate continued until John Bell came on the scene in the 1960s.
The debate continues to this day which is why experimental attempts like the Afshar experiment are important. QM alone predicts the Afshar experiment perfectly, it's the interpretation that is in question.

Bell considered the situation in EPR carefully and realized that everyone had missed something important. What he showed was that although the spin on a second axis was not completely dependent upon the spin on the first, the probability of a certain spin was different if the other spin had some known value than if it did not have any value.
Again, the question was not whether conjugate variables had any value, but whether they had definite values that exist independent of the measurement. What EPR showed was empirical consistency with the CI. Again, other ontologies also work. Physicists are pragmatists (no pun intended), so unless or until new empirical information can be found, most will not care what ontology is used. The CI therefore remains the standard.

In 1982, the time was finally ripe. Alain Aspect had an idea for a test of the CHSH inequality derived from Bell's Theorem. He enacted it, and proved beyond reasonable doubt that in fact the distribution of values on the measured axis was inconsistent with the existence of a real value consistent with the value measured on the other photon for that axis. The value did not exist, in other words; and "spooky action at a distance" was in fact reality.
The proof that it "was inconsistent with the existence of a real value" depends on the assumptions of the CI. This is the interpretation that uses the idea that values are not real until measured to show that there is no "spooky action at a distance". It is the alternative interpretations that seem to introduce "spooky action at a distance", by giving conjugate variables real values.

But their assumptions were these:
1. Locality, or relativistic causality; that is, that local causes have local effects, that no cause can have an effect outside its light cone.
2. Local realism; that is, that all parameters have actual values whether they can be measured or not; that reality has independent existence apart from whether it is (or can be) observed. This is very closely related to counter-factual definiteness.
3. Completeness; that is, that quantum mechanics is a complete description of quantum reality.
Yes, nice list of assumptions. We'll need them.
EPR, of course, hoped to prove that the last was inconsistent with the facts; however, Bell opened the way to differentiate between the first and second assumptions, and the third, and test them. And what Aspect showed, using CHSH's inequalities and Bell's test, was that either the first or second had to be wrong. Either influences could reach out beyond the light cone, or unmeasurable complementary values of measured parameters were not merely unmeasurable but non-existent.
The third assumption by definition can never be proved, so the way it should have read is: "Bell opened the way to differentiate between the first and second assumptions, given the two assumptions about the third, and test them". Now let's look at the assumptions that go into the either/or proof.

{1} If @3 is true then;
"unmeasurable complementary values of measured parameters were not merely unmeasurable but non-existent". (Note that the "non-existent" condition is specifically invoked by the CI to avoid "spooky action at a distance"; QM neither needs it nor includes it. Again, "unmeasurable" doesn't belong in this context; see above.)
{2} If @3 is false then;
"influences could reach out beyond the light cone". (Note this is the very definition of "spooky action at a distance".)

By default, @{1} assumes the CI in the definition of "complete". As stated above, it is the CI that definitionally excludes "spooky action at a distance". This is also why making @3 false is said to prove "spooky action at a distance". This proof using @3=false depends only on removing the CI definition of nonexistent values prior to measurement, and ignores other assumptions in QM about statistical ensembles and what a state of the system might actually mean. Just searching for EPR should point out all kinds of explanations that don't fit the above proof, such as: here. It should be noted that EPR correlations are frame dependent. Most will never care without new empirical information :cool: .
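For concreteness, here is a small numerical sketch of the CHSH quantity being argued about (standard textbook measurement settings, nothing specific to Afshar): QM's singlet-state correlation gives |S| = 2√2, while a simple local hidden-variable toy model stays at the classical bound of 2.

```python
import numpy as np

# Sketch of the CHSH quantity. For the singlet state, QM predicts
# correlation E(a, b) = -cos(a - b) between spin measurements along
# angles a and b; the settings below are the standard optimal ones.
def E_qm(a, b):
    return -np.cos(a - b)

a1, a2 = 0.0, np.pi / 2            # Alice's two settings
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two settings

S = E_qm(a1, b1) - E_qm(a1, b2) + E_qm(a2, b1) + E_qm(a2, b2)
print(round(abs(S), 3))  # 2.828 = 2*sqrt(2), above the classical bound of 2

# A simple local hidden-variable toy model (a shared random angle lam;
# each side's outcome is the sign of cos(setting - lam), anti-correlated)
# stays within |S| <= 2, up to sampling noise:
rng = np.random.default_rng(0)
lam = rng.uniform(0, 2 * np.pi, 200_000)

def E_lhv(x, y):
    return np.mean(np.sign(np.cos(x - lam)) * -np.sign(np.cos(y - lam)))

S_lhv = E_lhv(a1, b1) - E_lhv(a1, b2) + E_lhv(a2, b1) + E_lhv(a2, b2)
print(round(abs(S_lhv), 2))
```

The hidden-variable model here is one made-up example, not Bell's general proof; the point is only that this particular local model cannot reach 2√2.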

More detail;
QM defines the evolution of a system by a statistical wavefunction. These statistics define the probability of finding a real particle somewhere. It has also been shown that real particles do things they could only do if they were waves, not the real particles we assumed them to be, yet they still act like real particles when we check.

When classical thermodynamics was sufficiently generalized to encompass both classical and quantum effects, it was done by redefining the ensemble. Instead of an ensemble consisting of N parts, like a gas, we have an ensemble defined by a single part in N different states. This seems trivial, but it is not. It makes sense with a die, as all the states are represented by the number of sides it can land on. In classical systems this is still empirically and logically reducible to definite states for dice or gases. Quantum systems have thus far been empirically irreducible. All outcomes are precisely equivalent to what would be expected if the statistics were the reality, except that we still see individual particles at the end.
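Just to illustrate the classical side of that ensemble redefinition with a trivial simulation (which of course does not capture the quantum irreducibility being described): N dice rolled once and one die rolled N times give identical statistics.

```python
import numpy as np

# Two classical ensemble pictures: many parts each sampled once, versus
# one part sampled over many states. For a fair die they are
# statistically indistinguishable; quantum ensembles resist this
# kind of reduction to definite underlying states.
rng = np.random.default_rng(42)
n = 600_000
many_dice_once = rng.integers(1, 7, n)  # N dice, one roll each
one_die_many = rng.integers(1, 7, n)    # one die, N rolls

for rolls in (many_dice_once, one_die_many):
    freqs = np.bincount(rolls, minlength=7)[1:] / n
    print(np.allclose(freqs, 1 / 6, atol=0.005))  # both ~uniform over 1..6
```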

Nice little double slit video: http://video.google.com/videoplay?docid=-4237751840526284618

The Afshar experiment is another in a long line of experiments trying to squeeze out some empirical information about the nature of this wave-particle duality. It in no way changes anything about QM, as QM predicts the experiment perfectly. Afshar was designed to distinguish between the various interpretations. In that regard it appears partially successful to me.

Personal opinion;
The idea that the lens collapsed the wavefunction after the interference is difficult under the CI, but the CI, being a theory of ontology separate from yet seemingly consistent with QM, leaves wiggle room. There are ways to extend Afshar :D but it's still no magic bullet :boggled:, due to the QM-consistency and new-empirical-content issues. IMO the wave nature of QM is real; the collapse of the wavefunction is an artifact of the ensemble representing not just the actual state but all possible states, and of the particle representation of measurements. You can see an analogy of single-particle interference here. This alone does not help when the very properties we identify with particles, such as position and momentum, are conjugate variables like frequency and time in classical mediums. This would require assuming that the particle and its properties are part of the wave, like a soliton. EPR correlations need not do any communicating at all if those properties were inherent to the particle from the time of emission. The confusion comes from mistaking the measurement of conjugate properties as the same underlying principle as the statistical description. Statistical ensembles can define an arbitrary number of parameters in a single entity. Classical statistics is in general a method of trading microscopic information for macroscopic information. IMO the Afshar experiment is just a tiny step toward demonstrating my opinion but falls a bit short of actually doing so. Of course there remains a wide range of issues my opinion did not cover.

About my personal opinion;
Everything I stated before the personal opinion is fully open to attack. I will only defend my opinion in a very general way, as it was perspective that was asked for. In principle people could demand more info about my opinion than I am willing to give :boxedin:. This is neither the time nor the place, and it is not fully defensible without some unique empirical backing :D.
 
My_wan, thank you for a nice example of how a scientific discussion should be conducted. :)

Quanta are not elementary particles; rather, the energy of all elementary particles must possess whole-number units of energy. Quanta are therefore not emitted one at a time but in whole-number units. Some theoretical work involves extending this idea of quantization to space and time. Although defining Planck units of space and time is trivial, the relevance is empirically limited at the moment. One promising approach is "Doubly Special Relativity".

I would disagree with one statement here, and that is the one that the energy of all elementary particles must possess whole number units of energy. I think I know what you meant to say, but as it stands the statement is misleading and needs qualification.

The energy of elementary particles can (as far as we know) take on any value, the energy is not set at any specific whole number of units of energy. There is no fundamental quantum of energy and so there is nothing fixed for there to be whole number units of.

Certain fundamental qualities are quantized. These include action and angular momentum (which are aspects of the same thing), electric charge, magnetic flux, etc. Each quantized property has some fundamental "quantum". For action and angular momentum it is Planck's constant (h); for electric charge it is (e), the charge on an electron or a proton; for magnetic flux it is the fluxoid quantum; etc. And whilst it is true in general that these properties can only take on integer amounts of their respective quanta, there are exceptions. For example, action can occur in units of h/(4*pi), the electric charge of quarks can be 1/3 e or 2/3 e, subquantum fluxes are possible, etc. It's also worth mentioning that quarks never appear in isolation, so we never see fractional charges occurring naturally.

Quantization of one property can lead to apparent quantization of another that does not in itself necessarily have any known quantization. An example is the quantization of angular momentum in particle spin, which can cause the particle to assume discrete and apparently quantized spatial orientations (space quantization).

In this way, energy is quantized (secondarily) in, for example, the excitation states of atoms. The energies of electrons in atomic "orbits" can only take on discrete values, but this does not imply that there is some fundamental quantum of energy, because an electron removed from an atom can take on any value of kinetic energy it likes, just by changing its velocity (although the "rest mass" or fundamental internal energy of an electron is fixed). A photon can have any frequency it likes and hence any energy.
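As a concrete illustration of that "secondary" quantization (using the usual Rydberg value; treat this as a sketch): hydrogen's discrete emission energies come from the level structure E_n = -13.6 eV / n², while a free photon's energy is unconstrained.

```python
# Photon energies emitted by hydrogen. The discreteness comes from the
# atom's level structure E_n = -13.6 eV / n^2, not from any fundamental
# quantum of energy.
RYDBERG_EV = 13.605693  # hydrogen ionization energy, in eV

def photon_energy_ev(n_upper, n_lower):
    # energy released when the electron drops from n_upper to n_lower
    return RYDBERG_EV * (1 / n_lower**2 - 1 / n_upper**2)

# First three Balmer lines (transitions down to n = 2), in eV:
for n in (3, 4, 5):
    print(round(photon_energy_ev(n, 2), 3))  # ~1.89, 2.551, 2.857
```

A different atom would give a different set of lines, and a free photon between atoms can sit anywhere on the continuum: the discreteness lives in the emitter, not in light itself.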
 
I also recognize that the traditional understanding is that the photon/wave are a duality, i.e., that they exist simultaneously. But, Afshar's experiment makes me wonder if the tradition may be about to change.

The short version is that there is an overall picture of a particle moving from one place to another by all sorts of possible journeys which are interfering constructively and destructively with each other. This is how we get wave effects from a single particle.

Wave/particle duality is an early, rough rule about being careful when extracting useful but potentially misleading pseudo-classical descriptions of what is going on from the overall quantum description of an experiment, an approach left unfinished and vague until the much more recent and seemingly widely misunderstood consistent histories approach. In brief, that says we must be careful what we say and not just ram together incompatible descriptions extracted from the overall description. Failing to do so will end in inconsistency and paradoxes.

Not putting together extracted wave descriptions and particle descriptions is the old, well-known rule that is wave/particle duality. Another example of these rules is about not just ramming together extracted incompatible descriptions for particle spins, for example, or we might end up thinking measuring one spin instantaneously affects the spin of another correlated ("entangled") particle at any distance. ;)

Anyhow, the flaw in Afshar's interpretation of his experiment is that he and his fellow authors think we go from talking about waves to talking about particles during the course of the experiment when it's actually waves all the way from the holes to the detectors. To take the old wave function collapse viewpoint, just because diagrams of the Afshar experiment make it look as if the wave function has collapsed at some point past the wires doesn't mean it has!

The Afshar experiment is useful in that it again highlights the hazards of helpful diagrams and descriptions when dealing with this subject.
 
Yes, but what about this is specific to quantum mechanics? In classical electromagnetism too, an object can emit radiation for a short period of time and then stop emitting it.

3 is finite. 4 is finite. But 3.5 is finite, too. And so is the square root of 2, and pi, etc. Do you see what I mean?

I have absolutely no idea what you think you mean. Classical EM is in no way discrete; all classical systems are continuous. That is why quantum mechanics has the word "quantum" in it: because it is quantised. As I said:
Radiation is not given off continuously; it is given off in short pulses, and each of these pulses can be called a photon or wave train/packet, depending on how you look at it.
Finite has nothing to do with it, I used the word "discrete" for a reason. In classical mechanics light can be given off for any length of time with any energy. In quantum mechanics it can only be given off one photon/wave packet at a time and the photons can only have certain energies.


Waves travel in one direction at any given time?

A wave in three dimensions spreads out. It travels in lots of different directions, all at once.

Nope. A wavefront travels in three dimensions. When a particle emits a photon/wave that photon/wave travels in one direction and one direction only. If you have lots of particles all emitting radiation randomly it will go in all directions, but each particle will always emit only one photon at a time. This can be, and has been, measured.
 
Finite has nothing to do with it, I used the word "discrete" for a reason.


I agree that "finite" has nothing to do with it. You used the word "discrete", but you also used the word "finite". That's why I got confused. (See post #28.)

In classical mechanics light can be given off for any length of time with any energy. In quantum mechanics it can only be given off one photon/wave packet at a time and the photons can only have certain energies.


A photon can have any amount of energy.

What amount of energy do you think no photon can have?

A particular kind of atom can emit photons of only certain energies. This discreteness is due to the nature of the atom, not the nature of photons in general. Other kinds of atom can emit photons of other energies.

Nope. A wavefront travels in three dimensions. When a particle emits a photon/wave that photon/wave travels in one direction and one direction only. If you have lots of particles all emitting radiation randomly it will go in all directions, but each particle will always emit only one photon at a time. This can be, and has been, measured.


I don't see how a photon can be thought of as travelling in one direction and one direction only. If a photon is emitted from a point on one side of a double slit and is subsequently detected somewhere on the other side, did it initially travel in the direction from the emission point toward slit A or in the direction from the emission point toward slit B?
 
A photon can have any amount of energy.

What amount of energy do you think no photon can have?

He may have gotten a little sloppy about the way he phrased it, but in quantum mechanics, electromagnetic radiation of a given frequency can only carry energy in discrete increments. This is not true classically, where the energy carried by a field can vary continuously down to arbitrarily low values, even at a fixed frequency, and I think that's what he was getting at.
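A quick way to state that numerically (treating the field as a collection of photons and ignoring zero-point energy): at a fixed frequency f the allowed energies are E_n = n·h·f, so the field's energy changes in steps of h·f, whereas a classical field at the same frequency can carry any energy at all.

```python
# Allowed radiation energies at a fixed frequency, E_n = n * h * f
# (zero-point energy ignored). A classical field has no such steps.
H = 6.62607015e-34  # Planck's constant, J*s

def allowed_energies(freq_hz, n_max):
    return [n * H * freq_hz for n in range(n_max + 1)]

# Green light near 6e14 Hz: energy comes in steps of about 4e-19 J.
steps = allowed_energies(6e14, 3)
gaps = [b - a for a, b in zip(steps, steps[1:])]
print(all(abs(g - H * 6e14) < 1e-30 for g in gaps))  # every step is h*f
```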

A particular kind of atom can emit photons of only certain energies. This discreteness is due to the nature of the atom, not the nature of photons in general. Other kinds of atom can emit photons of other energies.

Only if they change frequency as well: that part IS a property of photons, and it does not exist in classical electromagnetism. I think that's his point.

I don't see how a photon can be thought of as travelling in one direction and one direction only.

You are correct on that point. The wave function for a photon can indeed propagate outwards in many directions: subsequent detection at one point doesn't change that.
 
My_wan, thank you for a nice example of how a scientific discussion should be conducted. :)



I would disagree with one statement here, and that is the one that the energy of all elementary particles must possess whole number units of energy. I think I know what you meant to say, but as it stands the statement is misleading and needs qualification.

The energy of elementary particles can (as far as we know) take on any value, the energy is not set at any specific whole number of units of energy. There is no fundamental quantum of energy and so there is nothing fixed for there to be whole number units of.

Certain fundamental qualities are quantized. These include action and angular momentum (which are aspects of the same thing), electric charge, magnetic flux, etc. Each quantized property has some fundamental "quantum". For action and angular momentum it is Planck's constant (h); for electric charge it is (e), the charge on an electron or a proton; for magnetic flux it is the fluxoid quantum; etc. And whilst it is true in general that these properties can only take on integer amounts of their respective quanta, there are exceptions. For example, action can occur in units of h/(4*pi), the electric charge of quarks can be 1/3 e or 2/3 e, subquantum fluxes are possible, etc. It's also worth mentioning that quarks never appear in isolation, so we never see fractional charges occurring naturally.

Quantization of one property can lead to apparent quantization of another that does not in itself necessarily have any known quantization. An example is the quantization of angular momentum in particle spin, which can cause the particle to assume discrete and apparently quantized spatial orientations (space quantization).

In this way, energy is quantized (secondarily) in, for example, the excitation states of atoms. The energies of electrons in atomic "orbits" can only take on discrete values, but this does not imply that there is some fundamental quantum of energy, because an electron removed from an atom can take on any value of kinetic energy it likes, just by changing its velocity (although the "rest mass" or fundamental internal energy of an electron is fixed). A photon can have any frequency it likes and hence any energy.

Yes, the statement was an overgeneralization. I didn't want to delve too deeply into the formalism, as it would have created a lot of explanation, muddying the point. I am aware of how the formalism defines the field as operators acting on quantum states, making the term "energy" much more complex than indicated. I am also aware of lots of attempts at generalizing quantization, especially with regard to Relativity, both special and general. That's why I mentioned "Doubly Special Relativity". In GR there are people working on geometric quantization using Weyl geometry. Most such attempts don't scale to standard Planck units in a clear manner.

<Rant> (just because I like it)
Personally, my present approach is to drop the geometric description of GR and leave a relativistic interval operator from which a geometric topology can be defined. This seems to provide a simpler mechanism for transforming between various mathematical formalisms. It does set the whole theory within an abstract linear space and time that doesn't correspond to anything measurable, not even the coordinates. To match the empirical information, a transformation defined by the interval operator is used. Curvature in one formalism translates directly to force in another, and to an interval in my intermediate approach. By giving this interval the identity of Planck units it seems, in a very general way and specifically in some cases, to empirically match a Quantum Field Theory that includes gravity. A full formalism basically involves dimensionalizing and defining the probability function in terms of these intervals. The intervals themselves have statistical constraints. Since the interval is relativistically defined, it is by definition Lorentz invariant. Although I envision this describing not only the field but also the particles in it, my motivation is primarily one of simplicity. I can point at conjugate variables etc. for justification. Predictions, predictions... :D
</Rant>
 
<Rant> (just because I like it)
Personally, my present approach is to drop the geometric description of GR and leave a relativistic interval operator from which a geometric topology can be defined. This seems to provide a simpler mechanism for transforming between various mathematical formalisms. It does set the whole theory within an abstract linear space and time that doesn't correspond to anything measurable, not even the coordinates. To match the empirical information, a transformation defined by the interval operator is used. Curvature in one formalism translates directly to force in another, and to an interval in my intermediate approach. By giving this interval the identity of Planck units it seems, in a very general way and specifically in some cases, to empirically match a Quantum Field Theory that includes gravity. A full formalism basically involves dimensionalizing and defining the probability function in terms of these intervals. The intervals themselves have statistical constraints. Since the interval is relativistically defined, it is by definition Lorentz invariant. Although I envision this describing not only the field but also the particles in it, my motivation is primarily one of simplicity. I can point at conjugate variables etc. for justification. Predictions, predictions... :D
</Rant>
OK, your comments are incomprehensibly far over my head, but you seem to know what you're talking about, so let me ask you about this experiment that you provided a link to earlier:

http://www.physorg.com/news78650511.html

In the article, the authors state that because of the macro-nature of the experiment, they are able to observe the droplet passing through one slit, while the wave passes through both.

But, it's what the authors don't say that I think is the most interesting. Their macroscopic experiment uses a fluid to create a medium which carries the wave through both slits. Whereas in the quantum double-slit experiment, there's no medium, unless we bring back the concept of the "ether," or characterize the wave as being in some other dimension/universe, etc.

This is what makes me wonder if Afshar is close to proving something unexpected, i.e., the existence of a carrier medium for the wave in what we would ordinarily observe as being empty space.
 

The link to the physorg site is an analogy not to be taken too literally, just a general mechanism. I am not hostile to an ether theory in general. My previously stated "opinion" referring to quantum probability waves representing real waves alluded to this. I can be quite hostile to a rehash of the Lorentz Ether Theory (LET) as somehow superior to Einstein's relativity, and don't even start with the one-way speed of light BS. You wouldn't be off base to call my <rant> an ether theory. It is not a classical ether theory with absolute space and time, unless you want to call the abstract linear space a Newtonian space. It requires transforms due to the simple fact that space-time is not absolute. It is dimensionalized and scalable, which by itself makes it an ether of sorts. Ultimately you want to know if it is reducible to parts like a classical medium. The answer is sort of; it depends on the ontology you want to filter it through.

Ontology suffers from the fact that two or more mutually exclusive ontologies can often be empirically identical. Witness the different interpretations of QM. Different mathematical formalisms often have this same property, but they can't be considered mutually exclusive; they are complementary. It was removing the ontology that gave Einstein the advantage and breadth of sight over LET. The lack of a good ontology can also create confusion when trying to tie together seemingly unrelated concepts. This is customarily done by analogy, and it is a mechanistic analogy that I use as the basis for my ensembles. That does not make it The Truth, and perhaps not even right.
 
Well, it would be if it was at all accurate...
If it had been my intent to be absolutely perfectly technically accurate, no one without deep physics knowledge would understand it, in particular the person it was written for, and it would take weeks if not years to write. So basically, what you are doing is applying the standard of absolute technical accuracy to a document that was never intended to be absolutely technically accurate, so that you can say "wrong" twelve-some-odd times.

There is a name for this behavior: harassment. That is what is happening here, and if the post I reply to here were not sufficient proof, the one that follows it certainly is; and the capper is on the next page. There is no question as to your intent, you have stated it aloud; there is no question of that intent's appropriateness (or, more to the point, lack thereof) in this thread, or for the user who asked the question that made this thread appear, or, in my opinion, on this site.

You have been conclusively demonstrated to be wrong twice now, by me, and it is apparent that you are out for revenge, and it is your intent to harass. It is inappropriate here, and you are about to be proven conclusively wrong yet again.

Actually, that article says precisely what I said: Planck showed that energy must be emitted in packets (quanta), but never showed that it was also absorbed the same way. But the first is essentially the quantum theory; Einstein merely substantiated it, named it, and completed it. Planck discovered it. Note as well that Planck was awarded the Nobel Prize in 1919 (link in the next response), and it is specifically stated on the Nobel Committee's web site that this was for the discovery of quanta.

Wrong. Einstein was the first to propose quanta and particularly with regard to the photoelectric effect.
Tell it to the Nobel Committee. Look here. Please note the following prominently displayed phrase: "in recognition of the services he rendered to the advancement of Physics by his discovery of energy quanta." (Bold mine.) That is the official statement of the Nobel Committee regarding the reason for the award. If you disagree, you are free to do so; but it is clearly nothing but your opinion, as opposed to that of the Nobel Committee.

Furthermore, what Planck believed at the time is immaterial; by creating the formula that utilized his constant, he had implicitly proposed quanta. Whether he realized that implication or not, that is what he had done. That is what the Nobel Committee decided, as they indicated in plain words.

Wrong. Schrodinger's work came 20 years later.
Schroedinger proposed his wave equation in 1925, though it was not published in Annalen der Physik until 1926; de Broglie proposed quantization of matter in 1923 in his PhD thesis, but this remained obscure until around the same time (1925), when Einstein began to talk about it, and unproven until 1927, when Davisson and Germer confirmed the hypothesis with their electron scattering experiment in a nickel crystal. That doesn't sound like twenty years to me. It sounds like two.

And Schrodinger's work was based on De Broglie's.
This, at least, is correct.

Wrong. There are no elementary particles called "quanta" - quantization is a process in which we find that dynamic systems change according to discrete states rather than continuously.
So, then photons, being elementary particles, must not be quanta of the electromagnetic field. (For reference, please note that each of the previous is a separate link to a different source; NASA, medical science, Princeton etymological, and Wikipedia definitions are given of the photon, each of which states that it is a quantum, each of which states that it is a particle, and three of which state that it is an elementary particle.) And Planck must not have discovered them, even though he was awarded the Nobel Prize for doing so in 1919.

This is nonsense. A quantum mechanics is a system of mechanics in which certain "properties" of the system appear to be quantized into discrete states. There are no magical entities known as "quanta".
See above. It appears from this statement that you do not believe in photons.

Wrong. Heisenberg proposed that certain qualities or properties of matter could not be simultaneously measured with absolute precision, there was an inherent uncertainty in all measurements which was related to the quantum of action.
For reference:
OK, so Heisenberg proposed that there were certain parameters of quanta that could not be simultaneously measured. At first, he was a proponent of the idea that this was because the measurement of one quantity would disturb the measurement of another, but soon the math told him and others that in fact, those other values simply didn't exist. It had to be that way.
This was intended to be a relatively non-technical discussion. If you want to (from the viewpoint of a relatively non-technical reader) quibble about the quantum of action, you are welcome to do so; but please do it somewhere else. The majority of readers here will not be interested in the action principle in the first place, nor are they interested in plowing through the derivation of the Hamiltonian to understand its application to quantum mechanics and why action is quantized. If you'd like to discuss the action principle, have the courtesy to start your own thread, or at least explain what you are talking about so that non-technical readers can understand.

I argue that my description is close enough for the non-technical to grasp the underlying idea; should I have felt that more detail was needed to get to Afshar, then I would have provided it. To state that this is "wrong" merely because I avoided a concept that a) is not necessary to understand what is happening, and b) is highly abstruse, is not a correction for accuracy's sake, but simple harassment.

He was also always a proponent of the idea that measurement of one quantity would disturb the measurement of another, but his argument also went beyond that.
You have implied that I said that measurement of one such quantity could not disturb the measurement of another; that is not what I said. What I said is, that is not what the uncertainty principle means. Nor is it what Heisenberg thought it meant; the gamma-ray microscope is a pedagogical tool, and Heisenberg himself said so. The following site contradicts you, in Heisenberg's own words, and it is the official historical site of the American Institute of Physics for the history of Heisenberg's uncertainty principle. See this page at the bottom: "So far the experiments all confirm Heisenberg's conviction that there is no 'real' microscopic classical collision at the bottom."

Furthermore, Heisenberg developed matrix mechanics; Schroedinger later showed that the wave equation was equivalent. Heisenberg's original formulation involved not the wave equation but matrices and the concept of the "quantum jump"; to state that Heisenberg believed that measurement of one quantity would disturb the value of another, in the face of quantum jumps and matrices, is patently ludicrous.

Now, I have brought matrices into the conversation; let me explain for the non-technical what is involved. Heisenberg deliberately avoided discussing any sort of description of the "orbit" of an electron around an atom. Max Born read his paper, and realized immediately that Heisenberg's formulation could be expressed mathematically using a technique called "matrices." He wrote a paper showing how, and Heisenberg and he (and Pascual Jordan, a student of Born's who had assisted Born on his paper and shared credit) then released another paper jointly on it the next year.

Matrix mechanics was not immediately accepted; matrices were not well-known in physics, and had not been widely studied. The technique was seen as very abstruse mathematics, whereas Schroedinger's wave equations were seen as much more concrete representations. Furthermore, matrix mechanics went far beyond stating that the position and momentum of a particle were conjugate under uncertainty; its most obvious feature, the quantum jump, implied that the position of an electron could vary discontinuously; that is, that an electron could be found at point A at one time, and point B at a later time, without having traversed the space between. It was not until Schroedinger showed that matrix mechanics and his wave equation were equivalent mathematically that matrices were widely accepted.

Whether his idea "went beyond that" or not, you have both imputed something false about what I said, and made a false statement about what Heisenberg believed according not merely to his own words, but to the very technical basis of his (and Born's) theory. Again, this is harassment, not valid criticism.

The math never "told him and others that in fact, those other values simply didn't exist", that it is simply one interpretation that might explain what is actually observed.
On this page, Heisenberg is quoted as follows: "In the sharp formulation of the law of causality--'if we know the present exactly, we can calculate the future'--it is not the conclusion that is wrong but the premise." The implication is that we cannot know the present exactly: that at least some of the values of parameters literally do not exist. Further, the clear implication of the quantum jump is that intermediate positions between the starting and finishing positions do not exist. And that is precisely what I said. You do not know the difference between matrix mechanics and wave mechanics.

Wrong. Einstein, Podolsky and Rosen set out to show that the wavefunction could not describe physical reality, they proposed an argument based on measurements of the position and momentum of a pair of unspecified particles. They did not mention photons or spin.
I preceded this with, "here is a generalized and simplified explanation." If you don't like it, I suggest you start your own thread. I chose photons; you chose harassment. It is more important that your opponent be "wrong" than that you bother to write something informative to those who may have less understanding than you. It is this that I find "wrong" with you.

The reason this is easier to explain with spin is because spin is discrete. The arguments also apply to continuous variables, but are much more difficult to understand. Note that understanding it there requires the action principle and the quantization of action; I have already covered why I did not believe that that level of detail was needed. This is a consistent pedagogical approach that avoids matters that are unnecessary to understand Aspect, Afshar, and the DCQE. I or someone else may later choose to add the action principle in another discussion; you may do so yourself, if you choose. But again, this is not a correction; it is harassment, plain and simple. Your intent is not to inform; if it were, you would have explained the action principle, and its quantization. That you did not do so proves conclusively that your intent was only to be able to write "wrong" twelve times, not to inform others. And that is harassment plain and simple, no question about it.

Wrong. Bohr's argument is complex and is based on unavoidable disturbances between the systems under measurement. Some parts of the later, refined argument do imply the non-simultaneous existence of the quantities but it was not explicit in the original argument. Bohr also did not ignore the idea of non-locality, he was explicitly opposed to it.
Again, simplified and generalized. "Wrong" here is again harassment, which is your entire goal.

To go into Bohr's argument here is another waste of time, another side-track that will do nothing but confuse the reader. That argument is based on an interpretation of quantum mechanics that most physicists no longer believe is consistent with the facts; specifically, the original unvarnished Copenhagen Interpretation. Even proponents of CI say that it doesn't make sense without decoherence; and decoherence was not to be developed for decades at the time of this argument. I have presented a version of the modern argument; it is easier to understand, does not involve either matrices or bra-ket Dirac notation, and gets to the heart of the matter.

Wrong. The debate began before EPR, EPR did not lead to the debate.
While true, it is (again) a simplified and generalized description. Every point you have made while technically true cannot help the layman understand the situation; in fact, all you have done is succeeded in derailing the thread and making it one (to judge from the comments that follow) that is incomprehensible to the individual who asked the question in the first place. You have not only harassed me, you have rendered the discussion of the subject useless to the person who it was intended for. This behavior is unjustifiable; it is harassment, and personal attack, nothing else but. You have no place here if your only motivation is attack; it is against the rules of this site. By engaging in it, you have not only broken the rules, but rendered a discussion that you did not understand the need for incomprehensible to the person for whom it was intended, as that individual has clearly indicated in as many words:

Well, this is just great. Everyone's managed to offend each other, but no one's bothered to discuss my question, so I'll offer it again...

Or, do I just have no idea of what I'm talking about (a genuine possibility)?
Good job, Pragmatist. I'm sure that an exhaustive discussion of the details of the action principle is precisely what kjkent wanted. S/he seems to have found it very useful and to have gained full understanding. If you were pursuing an agenda of sharing knowledge, this would not have happened; but your agenda can be judged by your actions, and it has nothing to do with sharing, nor with politeness, nor even with correct understanding on the part of the layman. Your agenda is to harass.

And this is not the "measurement problem" - which is concerned with the resolution of superposition of states as well as the effect of disturbances on the system.
And here we have yet another side-track, to go with the action principle and the quantization of action: superposition. It is unnecessary to a layman's understanding of Afshar and the DCQE.

Superposition is a concept that I have explained elsewhere; its applicability to wave mechanics is that it is a mathematical method of describing waves by decomposition into simpler waves. The Schroedinger wave equation that describes a propagating (i.e., moving) particle can be decomposed into simpler components that are called "states."

The implication of wave mechanics is that a particle ordinarily exists as a combination, that is, a superposition, of these states while it is propagating in free space; but when the particle is detected, the superposition collapses into a single state, and that state interacts with the state of the detecting particle. This is the collapse of the wave function, which is also the quantum jump of matrix mechanics. The notation invented by Paul Dirac to describe these states is called "bra-ket" notation.

Decoherence proposes that after this interaction, the descriptions of the two particles decohere back into their separate (or combined, if they happen to stick together and form a system) wave equations, which are again superpositions of states, and remain so until the next detection/interaction.
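To make the superposition-and-collapse idea concrete, here is a quick Python sketch of my own (an illustration, not anything from the papers under discussion): a normalized state written as a sum of two basis states, decomposed back into its components by inner products, with the Born rule giving the probability of "collapsing" into each one.

```python
import numpy as np

# Orthonormal basis states |0> and |1>
basis_0 = np.array([1.0, 0.0])
basis_1 = np.array([0.0, 1.0])

# A superposition psi = a|0> + b|1>, normalized since a^2 + b^2 = 1
a, b = 0.6, 0.8
psi = a * basis_0 + b * basis_1

# Decompose back into components via inner products
c0 = np.dot(basis_0, psi)   # amplitude of |0>
c1 = np.dot(basis_1, psi)   # amplitude of |1>

# Born rule: probability of detecting each single state is |amplitude|^2
p0, p1 = abs(c0) ** 2, abs(c1) ** 2
print(p0, p1)   # roughly 0.36 and 0.64; they sum to 1
```

The point of the sketch is only that "states" are literally components in a decomposition, and that detection picks exactly one of them, with probabilities fixed by the amplitudes.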

This is harassment, plain and simple, nothing else but. You prove it every time you say "wrong." Feel free to explain it so that people who don't want to have to understand every last detail can comprehend it. That is the intent of this thread. You have failed miserably; kjkent has no description that s/he can take away as to what is going on. All you have succeeded at is harassment.

This is nonsense. Bell's argument had nothing to do with mutual dependence of spin states on different axes.
OK, then feel free to explain it in terms that the original inquirer can understand. I have already done so; all you have done is confuse with a bunch of unnecessary detail and technical quibbles in pursuit of a personal attack agenda. I'll show precisely why in my next response.

Wrong. Bell set out the inequalities in his original paper and argued them through to a complete conclusion. Bell presented his paper as a complete proof in itself against EPR.
He did; however, explaining his conclusion, which is based on pure mathematics, to a layman is a daunting task. It is considerably eased by folding in CHSH and the use of spin singlet states, as CHSH, and later Aspect, did to physically show the inequality's underlying idea by a comprehensible (not to mention unambiguously experimentally testable) example.
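For readers who want to see the CHSH version in numbers, here is a short sketch of my own (standard textbook material, not Bell's original 1964 derivation): for a spin singlet, quantum mechanics predicts a correlation E(a, b) = -cos(a - b) between measurements at analyzer angles a and b, and the usual angle choices push the CHSH combination past the classical bound.

```python
import math

# QM prediction for the singlet-state correlation at analyzer angles a, b
def E(a, b):
    return -math.cos(a - b)

# Standard angle choices (radians) that maximize the quantum violation
a1, a2 = 0.0, math.pi / 2               # Alice: 0 and 90 degrees
b1, b2 = math.pi / 4, 3 * math.pi / 4   # Bob: 45 and 135 degrees

# CHSH combination of the four correlations
S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))

# Any local hidden-variable theory obeys S <= 2; QM gives 2*sqrt(2)
print(S)   # about 2.828, violating the classical bound of 2
```

That gap between 2 and 2*sqrt(2) is exactly what Aspect-type experiments measure.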

I could, I suppose, have plagiarized Greene's Mulder and Scully magic box idea; but I am guessing anyone interested enough to ask in the first place will eventually wind up reading that, or may have already done so, so I prefer to provide another way to think about it so that Greene's explanation will hopefully prove more revealing when it is encountered. Of course, this explanation must also stand on its own; and it must be brief enough to fit in a post, yet descriptive enough for the idea to be relatively clear.

What you have succeeded in here is not to illuminate; it is to obfuscate. And harass, your original intent. You have basically made the conversation incomprehensible to all but an elite few. You have illuminated nothing, but you have succeeded in harassing.

Wrong. Aspect proved nothing beyond reasonable doubt.
Current repetitions of Aspect, or more accurately of experiments based on Aspect's idea, give beyond-six-sigma certainty; this is, technically speaking, the equivalent of "beyond a reasonable doubt." See Charles Weiss' arguments here. I am surprised that you would maintain this position when I have every reason to believe that you have already been exposed to this information when we were discussing "framing" and science in skeptigirl's recent thread.

Aspect is pretty much a done deal. There are a few people still arguing against it, but by and large the majority of the physics community accepts those results as definitively disproving either local realism or locality (and there is, I believe, still a majority who reject locality violations on the same grounds on which Hawking made the Chronology Protection Conjecture; this is my position on the matter as well). Six sigma, for those not familiar with statistics, corresponds to a chance of roughly two in a billion that the result is a statistical fluke, a certainty of about 99.9999998%. I have seen it stated that recent instantiations have put this beyond nine sigma, but I cannot provide a reference; on the other hand, originally in January of 2003, and most recently updated May 24, 2006, Richard Gill provides data in the second appendix to his paper, "Time, Finite Statistics, and Bell's Fifth Position," that show six-sigma results, here; Weihs' results are available here; and finally you can look here, where 242-sigma (no, that is not a typo: two hundred and forty-two standard deviations) results are presented. I have chosen preprints so that the arguments are available for inspection by those who do not have access to Physics, Nature, and other expensive literature of the physics profession.

So much for "Aspect proved nothing beyond a reasonable doubt." I have shown a rigorous definition of the level of scientific experimental certainty consistent with "beyond a reasonable doubt," and shown that Aspect experiments have raised the certainty beyond that level. If you are merely saying that Aspect's original experiment didn't put it at that level, I'm not sure that's even true; but even if it is, the long explanation involved in explaining it is not worthwhile in a post of reasonable length on a non-physics board.
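For those curious where these certainty figures come from, here is a small Python sketch of my own (not from any of the linked papers): for a Gaussian, the two-sided probability of a fluctuation beyond n standard deviations is erfc(n / sqrt(2)), so the "certainty" is one minus that.

```python
import math

# Convert an n-sigma result to a confidence level, assuming a Gaussian
# and counting fluctuations on both tails
def certainty(n_sigma):
    p_fluke = math.erfc(n_sigma / math.sqrt(2))
    return 1.0 - p_fluke

print(certainty(3))   # about 0.9973, the familiar "three sigma"
print(certainty(5))   # about 0.9999994
print(certainty(6))   # about 0.999999998, i.e. ~2-in-a-billion odds of a fluke
```

At 242 sigma the fluke probability underflows anything meaningful; the point is simply that six sigma is already far beyond any reasonable doubt.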

And finally, if that's the only quibble you have, it certainly isn't worth a "wrong," unless you're looking to harass someone.

The experiment has been contested many times and repeated many times with increasing accuracy, although it's probable the results are valid, there are still outstanding and unanswered objections to it. And the argument about the reality of the values is a different one to the non-locality argument.
There is a great deal of proof in the three papers above; there is also a refutation of several loopholes in Gill's paper.

Now, I will not state that the majority of qualified physicists still questioning this are woos; in fact, very much the opposite. It is their task to attempt to find defects in the theory. If they succeed, they will make an inestimable contribution to the progress of physics. They will also be famous, and will likely win money, neither of which is to be sneezed at, but in most cases neither of which is a primary motivation. Still, it does add some spice to the pie. Furthermore, collectively, they will make that inestimable contribution even if they fail, because the more folks try and fail to disprove it, the more sure we can be that it is a useful theory.

On the other hand, anyone who is not a qualified physicist, who expresses an opinion in contradiction of the mainstream, and who engages in pre-emptive harassment of perceived opponents, IS a woo. No question about it, we can all go look at an Evilution thread on this very site to see numerous examples.

So I have to ask you straight out, Pragmatist, what is your position on proof by Aspect of counter-factual local realism as opposed to counter-factual locality? And do you know the difference between local realism and locality? Can you state it? Because this, you see, is absolutely the core of the discussion of Aspect, EPR, the DCQE, and Afshar. And if you don't know the difference, then you do not have sufficient knowledge to discuss this subject technically.

You may well be looking to prove locality violations; and whereas Aspect can be interpreted as either a breakdown of locality or a breakdown of local realism, the DCQE has no such alternative explanation that does not involve explicit violation of the Chronology Protection Conjecture. Where Aspect violates locality only (if it does, i.e., if you interpret it that way) in that it shows "spooky action at a distance," which merely implies a causality violation, if the DCQE is violating locality, it is a directly perceivable causality violation, i.e., a violation of the time ordering of cause and effect. Of course, either can be explained by a breakdown of local realism, which requires neither the implicit nor the explicit causality violation; but woos generally like locality violations rather than the more prosaic explanation that there are no local hidden variables. And that's why I ask. I want to know where you stand, and whether this is all just an exercise prompted by defense of belief in "nonlocal phenomena." This last has unfortunately become a faux justification for all sorts of mummery, including psychic phenomena, ghosts, and other tripe.

In the immortal words of Pauli - so bad it's "Not even wrong".
I think this merely adds grist to the mill of speculation; the fact that you went on and tried (and, by the way, completely failed) to anticipate my counter-arguments shows a pattern of behavior: the pre-emptive strike. The post where you did this is here.

Why is it necessary to attempt to pre-emptively discredit an opponent, Pragmatist? What is the point? Do you even care that what you have done has not helped anyone understand anything? Is it of any importance to you? If it is, why have you done this? What is your motivation? I will not even ask whether you have a justification, since it has to be obvious to any observer that nothing can justify rendering this subject incomprehensible to the person the thread was started for, and that that person is in fact not helped is beyond question. I'll leave judgement of the ethical implications of this to the reader.
 
As I am not concerned with historical accuracies I will limit the issues to physical ones and give my perspective of the Afshar experiment. Since some of the issues being discussed involve modern and ongoing research many questions do not have definite answers.
Well, to some extent, that is so; but I'll wait to see what precisely you have in mind before I state a definite position.

Quanta are not elementary particles
Well, according to four sources I produced, photons are quanta, and photons are particles, and according to three of them, they are elementary particles as well. I think you have a definition problem here; I expect that you will make clear what you mean, but since you seem to feel that technical accuracy is important, I will take you at your own evaluation and state that this is technically inaccurate. Quanta are the fundamental entities of which our universe is composed. Quantization does not always mean the rendering of a parameter into elementary particles; for example, the quantum of action, hbar, does not (as far as we can tell) have direct physical existence as an elementary particle. Nevertheless, some quanta have real physical existence, so much so that we can see individual spots on a phosphor screen, or a CCD chip, that mark their positions of impact.

but rather the energy of all elementary particles must possess whole number units of energy. Quanta are therefore not emitted one at a time but in whole number units.
Quantization is not limited to energy. Not only that, but this gives the impression that energy values cannot be a continuous spectrum. This is not true. The energy of a photon can have any value; it is merely that energy is emitted in discrete packets, quanta, photons. In fact, in semiconductor theory, we find the idea of a band of permitted energies. The orbitals of adjacent atoms in a substance distort one another by their electromagnetic interaction, and form a structure in which absorption and emission of photons of slightly different wavelengths takes place. There is nothing that says that within that band, the exact energy of a photon cannot take up any value within the band; but when energy is emitted by any individual atom in the substance, it is emitted as a whole quantum, with a particular amount of energy determined by the precise shape of the orbital, which is determined not only by the structure of the atom, but by the atom's interactions with its neighbors.
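To put numbers on the point that a photon's energy can take any value while still being emitted in whole quanta, here is a small sketch of my own (the 532 nm wavelength and 1 mW power are purely illustrative):

```python
h = 6.626e-34      # Planck's constant, J*s
c = 2.998e8        # speed of light, m/s

wavelength = 532e-9            # green laser light, m (illustrative choice)
E_photon = h * c / wavelength  # energy of ONE quantum, E = h*f = h*c/lambda

# A 1 mW beam delivers a huge but *integer* number of such quanta per second
energy_per_second = 1e-3               # joules emitted in one second
n_photons = energy_per_second / E_photon

print(E_photon)    # roughly 3.7e-19 J per photon
print(n_photons)   # roughly 2.7e15 photons per second
```

Change the wavelength and the quantum's energy slides continuously; what stays discrete is the count of quanta, not the menu of possible energies.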

As far as whether quantization is limited to energy, the existence of a quantum of action shows that this cannot be so. Further, spin is also quantized. Not only that, but so is charge.

Some theoretical work involves extending this idea of quantization to space and time. Although defining Planck units of space and time is trivial, the relevance is empirically limited at the moment. One promising approach is "Doubly Special Relativity".
Wow, you went a long way there; I have not yet explored Doubly Special Relativity, so I cannot comment, but I have no reason to believe that you are wrong. Quantization of spacetime is definitely a concept that is in play in the physics community now, and has been for a couple of decades.
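For readers wondering what "defining Planck units is trivial" means in practice, here is a quick sketch of my own: the Planck length and time fall straight out of dimensional analysis on hbar, G, and c.

```python
import math

hbar = 1.0546e-34   # reduced Planck constant, J*s
G    = 6.674e-11    # Newton's gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8      # speed of light, m/s

# The unique length and time you can build from these three constants
l_planck = math.sqrt(hbar * G / c**3)   # about 1.6e-35 m
t_planck = l_planck / c                 # about 5.4e-44 s

print(l_planck, t_planck)
```

Writing the units down really is that easy; the hard part, as noted, is connecting scales twenty orders of magnitude below anything measurable to empirical physics.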

The Uncertainty relation does not say you can't measure conjugate parameters simultaneously. It states that if you try to measure one of the variables beyond a certain accuracy, you are limited in the accuracy to which you can know the other variable. Michael Hall actually reformulated this principle with a stronger relation called "Exact Uncertainty". He was even able to derive the Schrödinger equation directly from it. Under the Copenhagen interpretation (CI) and the collapse of the wave function you are not supposed to be able to observe both wave and particle aspects at the same time.
While this is true of continuous parameters, my intent was to use spin, since that is the parameter used in Aspect and the DCQE. And because spin is discrete, it is in fact correct to state that in the case of spin, you do not know it to some arbitrary precision; you know it to absolute precision, or you do not know it at all. And spin on two axes is conjugate under uncertainty, so if you know the spin on one axis, you know it absolutely, and therefore can know nothing of it on any other axis. Exact Uncertainty is very interesting, but outside the scope of the conversation.
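This claim about spin can be checked directly with the standard Pauli matrices; the following is my own illustration, not anything specific to Aspect's setup. A particle prepared spin-up along z (an eigenstate of sigma_z) turns out to be an exactly 50/50 superposition of the definite-spin states along x, so knowing the z spin absolutely means knowing nothing at all about the x spin.

```python
import numpy as np

# Pauli spin matrix for the x axis
sigma_x = np.array([[0.0, 1.0],
                    [1.0, 0.0]])

up_z = np.array([1.0, 0.0])   # spin-up along z, eigenstate of sigma_z

# Eigenvectors of sigma_x are the states with definite spin along x
eigvals, eigvecs = np.linalg.eigh(sigma_x)

# Born-rule probability of each x outcome, given a definite z spin
probs = [abs(np.dot(eigvecs[:, i], up_z)) ** 2 for i in range(2)]
print(probs)   # [0.5, 0.5]: the x spin is completely undetermined
```

The same calculation with any other pair of axes gives probabilities that depend only on the angle between them, which is exactly the structure the Aspect correlations probe.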

Though it is true that the measurement was not the source of the uncertainty that does not mean that the conjugate variable does not exist.
If the variable is spin, it does.

Under the CI which is an empirically valid ontology attached to QM these conjugate variables don't have values before being measured. The ontology is not the same and is not needed by QM to make predictions. There are other ontologies that work also. There is an exact quantitative analogy in classical conjugate variables such as the frequency spectrum of sound at a single moment in time.
This is only true for continuous variables, not discrete variables such as spin.

The debate continues to this day which is why experimental attempts like the Afshar experiment are important. QM alone predicts the Afshar experiment perfectly, it's the interpretation that is in question.
Correct. It's also why the DCQE and Aspect are important; they might serve in combination to rule out some interpretations.

Again the question was not if conjugate variables had any value but whether or not they had definite values that exist independent of the measurement. What EPR showed was empirical consistency with the CI. Again, other ontologies also work. Physicists are pragmatists (no pun intended), so unless or until new empirical information can be found, most will not care what ontology is used. The CI therefore remains the standard.
Actually, the most popular current idea is decoherent CI. Another popular one is Consistent Histories. And the Many Worlds Interpretation has its adherents as well. John Cramer's Transactional Interpretation is also not ruled out, nor is Bohm's (though you will find some who believe that Bohm's idea is poppycock; I am not among those, but they exist). The problem with CI is the measurement problem; decoherence solves that problem, or at least most physicists seem to think it does. Finally, if spin is the variable in question, then it has to be clear that the argument that it has a definite value independent of measurement is ruled out by uncertainty; this is why a discrete variable gives the answer, where a continuous variable leaves doubt.

The proof that it "was inconsistent with the existence of a real value" depends on the assumptions of the CI. This is the interpretation that uses the idea that the values are not real until measured to show there is no "spooky action at a distance". It is the alternative interpretations that seem to introduce "spooky action at a distance", by giving conjugate variables real values.
This is actually quite deep; here we encounter the conflicting options of contra-factual locality vs. contra-factual local reality. Most physicists take the position that Aspect proves the latter; but all admit that it cannot rule out the former. It is my contention that when combined with the DCQE, if one accepts the Chronology Protection Conjecture, the former is ruled out; however, I point out in fairness that the CPC is only a conjecture, which has no formal proof at this point.

Yes, nice list of assumptions. We'll need them.
Thank you.

The third assumption by definition can never be proved, so the way it should have read is, "Bell opened the way to differentiate between the first and second assumptions, given the two assumptions about the third, and test them". Now let's look at the assumptions that go into the either/or proof.
Now we can talk about the difference between locality and local realism. I fear, however, that I will have to revisit this for kjkent's benefit; still, c'est la vie. You bring an interesting different viewpoint to the conversation, and I don't want to exclude you.

{1} If @3 is true then;
"unmeasurable complementary values of measured parameters were not merely unmeasurable but non-existent". (Not that the "non-existent" condition is specifically inviked by CI the avoid "spooky action at a distance" and QM neither needs it or includes it. Again unmeasurable doesn't belong in this context. see above)
Ah, but see my point above: what about discrete parameters, i.e. spin?

This is the case of contra-factual local reality, the most widely espoused interpretation.

{2} If @3 is false then;
"influences could reach out beyond the light cone". (Note this is the very definition of "spooky action at a distance".)
Indeed. No argument from me here. This is contra-factual locality; some interpret it as causality violation (and in fact, this is one horn of the dilemma EPR intended to impale Bohr on).

By default @{1} assumes the CI in the definition of complete. As stated above, it is the CI that definitionally excludes "spooky action at a distance". This is also why making @3 false is said to prove "spooky action at a distance". This proof using @3=false depends only on removing the CI definition of nonexistent values prior to measurement, and ignores other assumptions in QM about statistical ensembles and what a state of the system might actually mean.
Well, again, if we're talking about spin, then there are only two eigenvalues; if it decomposes, its state corresponds to one of them. And for the spin on another axis to have definite values (whether measurable or not) would imply, in Aspect, through the CHSH form of Bell's inequality, that we should see a different probability distribution of measurements on an entangled particle than we actually do.
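Here is the arithmetic behind that claim, as a sketch of my own (standard textbook angles, not anything from the experiment itself): for a spin singlet, QM predicts the correlation E(a, b) = -cos(a - b) between analyzers set at angles a and b. Any local hidden variable model, i.e. one in which the spin on every axis has a definite pre-existing value, bounds the CHSH combination by |S| <= 2, but the quantum prediction at these settings reaches 2√2, which is what Aspect measured.

```python
import numpy as np

def E(a, b):
    """Quantum correlation of spin measurements at analyzer angles a, b (singlet state)."""
    return -np.cos(a - b)

# Standard CHSH analyzer settings for Alice (a, a2) and Bob (b, b2)
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

# Local hidden variables require |S| <= 2; QM predicts 2*sqrt(2) here
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # ~2.828 > 2
```

The violation of the bound of 2 is precisely the "different probability distribution" referred to above: definite pre-existing values on all axes cannot reproduce it.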

Now, you can propose contra-factual conservation of angular momentum; or you can propose "spooky action at a distance," i.e. non-locality; or you can propose contra-factual local reality, i.e., the spin on another axis does not have any value.

Just searching EPR should point out all kinds of explanations that don't fit the above proof, such as: here. It should be noted that EPR correlations are frame dependent. Most will never care without new empirical information :cool: .
I propose that if the Chronology Protection Conjecture can be substantiated, it would show through Aspect and the DCQE that real physics is not locally real. Heh.

More detail;
QM defines the evolution of a system by a statistical wavefunction. These statistics define the probability of finding a real particle somewhere. It has also been shown that real particles do things they could only do if they were waves, not the real particles we assumed them to be, yet they still act like real particles when we check.
Yes, approximately.

When classical thermodynamics was sufficiently generalized to encompass both classical and quantum effects, it was done by redefining the ensemble. Instead of an ensemble consisting of N parts, like a gas, we have an ensemble defined by a single part in N different states. Seems trivial, but it is not. This makes sense with a die, as all states are represented by the number of sides it can land on. In classical systems this is still empirically and logically reducible to definite states for dice or gases. Quantum systems have thus far been empirically irreducible. All outcomes are precisely equivalent to what would be expected if the statistics were the reality, except that we still see individual particles at the end.
I'm going to have to think about this a while. Keep in mind that Boltzmann's results used a model, Maxwell-Boltzmann statistics, that fits neither of the sets of statistics used in QM: Bose-Einstein and Fermi-Dirac. This may well account for the Fluctuation Theorem. It does, in my opinion; but it is only an opinion, and it will take someone a career in physics to prove me right or wrong on that particular conjecture. Certainly we have seen some very odd and apparently 2LOT-violating behavior just recently, and it is clear that QM is required as an underpinning for the FT.
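For readers following along, the three statistics differ only in the mean occupation of a single-particle state. A hedged sketch of my own (units with k_B = 1 and chemical potential set to zero, purely for illustration): bosons bunch above the classical value, fermions are pushed below it, and all three agree when the energy is far above the temperature.

```python
import numpy as np

def maxwell_boltzmann(E, T):
    """Classical mean occupation: the Boltzmann factor."""
    return np.exp(-E / T)

def bose_einstein(E, T):
    """Quantum statistics for bosons (photons etc.)."""
    return 1.0 / (np.exp(E / T) - 1.0)

def fermi_dirac(E, T):
    """Quantum statistics for fermions (electrons etc.)."""
    return 1.0 / (np.exp(E / T) + 1.0)

E, T = 2.0, 1.0
print(maxwell_boltzmann(E, T), bose_einstein(E, T), fermi_dirac(E, T))
# Fermi-Dirac < Maxwell-Boltzmann < Bose-Einstein at moderate E/T;
# at E >> T all three collapse onto the classical Boltzmann factor.
```

The ±1 in the denominators is the entire quantum correction to the classical ensemble, which is why arguments built on Maxwell-Boltzmann counting do not automatically carry over to QM.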

The Afshar experiment is another in a long line of experiments that try to squeeze some empirical information out about the nature of this wave-particle duality. It in no way changes anything about QM, as QM predicts the experiment perfectly. Afshar was designed to distinguish between the various interpretations. In that regard it appears partially successful to me.

Personal opinion;
The idea that the lens collapsed the wavefunction after the interference is difficult under the CI, but the CI, being a theory of ontology separate from but seemingly consistent with QM, leaves wiggle room. There are ways to extend Afshar :D but it's still no magic bullet :boggled:, due to QM consistency and the new-empirical-content issue.
It ignores decoherence; after collapse, a new wave function must govern the propagation. Just as is the case with stacked polarizers, you are not prevented from measuring the spin first this way, then that; you are merely prevented from stating what the spin measured on the first axis is after measuring on the second. That's why inserting a third polarizer between a stack of two that are oriented orthogonally permits light to pass when it will not with only the two. After the new second measurement, the spin on the original axis is no longer definite.
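The stacked-polarizer behavior I'm describing is easy to check numerically. A minimal sketch, mine and classical (Malus's law rather than single photons, but the transmission fractions are the same): two crossed polarizers pass nothing, yet inserting a third at 45 degrees between them passes a quarter of the light, because each filter re-prepares the polarization state, destroying the definite value from the previous axis.

```python
import numpy as np

def transmit(intensity, pol_angle, filter_angle):
    """Malus's law: pass light through one polarizer, returning
    (transmitted intensity, new polarization angle)."""
    return intensity * np.cos(filter_angle - pol_angle) ** 2, filter_angle

I0, theta0 = 1.0, 0.0  # unit intensity, polarized at 0 degrees

# Crossed pair only: 0 then 90 degrees -> nothing gets through
I2, th = transmit(I0, theta0, 0.0)
I2, th = transmit(I2, th, np.pi / 2)
print(I2)  # ~0.0

# Same pair with a 45-degree polarizer inserted between them -> 25% gets through
I3, th = transmit(I0, theta0, 0.0)
I3, th = transmit(I3, th, np.pi / 4)
I3, th = transmit(I3, th, np.pi / 2)
print(I3)  # ~0.25
```

Adding a filter can only remove light classically, yet here it opens a channel that was closed; that is the signature of measurement re-preparing the state, which is the same logic I'm applying to the lens in Afshar.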

IMO the wave nature of QM is real; the collapse of the wavefunction is an artifact of the ensemble representing not just the actual state but all possible states, together with the particle representation of measurements.
While I agree the wave nature of quanta is real, I disagree that the collapse is an artifact of the ensemble. I don't disagree that it's an artifact of something, however; I am not of the opinion that it has real existence, and this is my quibble with strict CI. But this is only my opinion, just as yours is yours; and in fact, they may turn out to be equivalent. You do have a point about the ensemble, since experiments that show interference in single particles do so only by demonstrating that an ensemble of such particles shows it.
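That ensemble point can be sketched in a few lines (my own toy model with an idealized fringe pattern, not data from any real experiment): each particle lands at exactly one position, and the interference fringes only exist in the statistics of many such single detections.

```python
import numpy as np

rng = np.random.default_rng(0)

# Idealized two-slit probability density: cos^2 fringes across the screen
x = np.linspace(-1.0, 1.0, 2001)
prob = np.cos(8 * np.pi * x) ** 2
prob /= prob.sum()

# Each "particle" is detected at exactly one position on the screen
hits = rng.choice(x, size=100_000, p=prob)

# One hit tells you nothing about fringes; the histogram of many hits does
counts, edges = np.histogram(hits, bins=40)
print(counts)  # strongly modulated bins: the pattern lives in the ensemble
```

A single draw from `hits` is just a dot on the screen; only the histogram shows the wave, which is exactly why single-particle interference experiments must be run as ensembles.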

You can see an analogy of the single-particle interference here. This alone does not help when the very properties we identify with particles, such as position and momentum, are conjugate variables like frequency and time in classical media. It would require assuming that the particle and its properties are part of the wave, like a soliton. EPR correlations need not do any communicating at all if those properties were inherent to the particle from the time of emission.
Ahhh, but what properties precisely when we are talking about a discrete parameter? It's this or it's that, there is no in-between. An ensemble then gives so many this and so many that. Each one has only one value. CAN have only one value, at least at the time it is measured. And lest we forget, spin is quantized. A wave with discrete values is a difficult thing to imagine.

The confusion comes from mistaking the measurement of conjugate properties for the same underlying principle as the statistical description. Statistical ensembles can define an arbitrary number of parameters in a single entity. Classical statistics is in general a method of trading microscopic information for macroscopic information. IMO the Afshar experiment is a tiny step toward demonstrating my opinion but falls a bit short of actually doing so. Of course there remains a wide range of issues my opinion did not cover.
I still say you are ignoring discrete variables, but I'm not prepared to defend my position yet. I'd like to think about it some, and see what you have to say.

About my personal opinion;
Everything I stated before the personal opinion is fully open to attack. I will only defend my opinion in a very general way, as it was perspective that was asked for. In principle people could demand more info about my opinion than I am willing to give :boxedin:. This is neither the time nor the place, and it is not fully defensible without some unique empirical backing :D.
Heh, likewise over here, with respect to the meaning of these experimental results.
 
kjkent, I'm sorry but this derail has cost me all the time I had; it had to be addressed, though, if you were ever to get an answer you could sink your teeth into. I'll try to get you something next week. I wrestled with it last week, but was unsuccessful in coming up with a good path to show you, not to mention distracted by the derail.
 
Your response to my_wan was useful (as much as I could comprehend). Your argument with pragmatist seems destined to continue the present battle. If getting back on track is what you're after, I fear you will not accomplish your goal.
 
Non-locality is trivial if we are in a simulation.

The weirdness of QM, with regard to "non-locality", seems to me to have potential explanation through the ideas of the "digital physics" type of culture.

If you think in terms of this "reality" of ours being computed on a digital computer, like a simulation (not talking about weak SF ideas like The Matrix), then the computational substrate of this reality could account for non-locality.

For instance, think of some object on the screen of your computer. It could be generated in your graphics card based on an object defined in the program running on your CPU. Multiple instantiations of the object could be present on your screen where they could all change parameters simultaneously - this would seemingly break the relativistic constraints on speed of information travel across space (across the screen). ...similar to the non-locality weirdness of QM. But, in this computer analogy, this is not at all weird. Maybe our concept of space-time is too simple.

Could a photon that is split into opposite polarizations and sent in opposite directions really be two instantiations of the same object? ... with mutually exclusive parameters (opposite polarization for a photon, or opp. spin for a particle)?

Our view of reality has been greatly challenged by QM weirdness, and so I personally think we should keep an open mind for a new paradigm that seems will be necessary to receive an "understanding".

It seems to me that the existence of this "substrate" is already inferred from QM and general relativity. The problem for many is that it implies a transcendent realm, which seems to be (from what I can tell) blasphemous among physicalist-materialists... and this induces emotional responses in discussions around here, I've already learned.
 
