• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you do still see problems, let me know.

Merged Puzzling results from CERN

Well, look at it this way - if that could happen, we'd expect to observe such speed violations very often (not to mention all sorts of other weird and inexplicable effects).


This makes sense to a lay person such as me, but...


(Please note - I'm not saying it isn't Euclidean, or that this result should be taken seriously.)


This kind of attitude doesn't. Your position is that they shouldn't have come out like they did with this result, and that their results should be impossible according to current knowledge. Note that they acknowledge this and are careful not to claim anything beyond their measurements; they are hoping others will find the potential mistakes, because they couldn't find any themselves. They have already answered many questions in the Q&A, and other scientists in the audience seemed impressed with the quality of their work.

I agree that it would've been better to do this more "behind the scenes". Then again, maybe they consulted enough people outside of their own group to think it was time to have this presentation and Q&A in public - I don't know the answer to that; maybe you do. In any case, it's out in public now, and speculating without having seen the video is not impressive at all.
 
Someone else has probably said this - isn't this a classic example that shows how silly the claims are about how the scientific community suppresses new ideas and anything that threatens "THE CONSENSUS"?

That's what I was thinking. According to Anders, ALL experiments that show the speed of light is a constant relative to the observer are suspect, but this one result from CERN should be accepted without question. WTF...?
 
This kind of attitude doesn't.

It was a bit of an overstatement. What I meant is that we shouldn't start throwing out the accumulated knowledge of the last century and trying to think of ways to explain this - not until it is far more firmly established, if that ever happens.

Your position is that they shouldn't have come out like they did with this result

My position is that they shouldn't have started with a press release. They should have gone around and given talks, given the community a chance to debug this, and - if no flaws were found after some months - they could go public. Instead, going to the press first is (in my opinion) irresponsible and attention-seeking.

In any case, it's out in public now and speculating things without seeing the video is not impressive at all.

I saw the presentation live, Kuko, if that last was directed at me.
 
My position is that they shouldn't have started with a press release. They should have gone around and given talks, given the community a chance to debug this, and - if no flaws were found after some months - they could go public. Instead, going to the press first is (in my opinion) irresponsible and attention-seeking.

I have a slightly different take on this: surely almost as soon as they'd started giving such talks the press would have got hold of it, and (to avoid only pure nonsense appearing in the press) they would have essentially been forced to make a press release. So it seems sensible to do both at the same time.

But given that they've apparently done 6 months of checking, why not wait slightly longer and release the detailed analysis at the same time? I would have expected something like the current paper plus a 100-page (or whatever) appendix giving details of the kinds of things people asked about at the presentation, and giving enough detail of the statistics etc. to give people a chance to check things.

I saw the presentation live, Kuko, if that last was directed at me.

If you mean you were actually there (or if you know anyway), could you PM me the name of the person in the audience who invited (right at the end, after questions) people there to e-mail him if they had questions?
 
If you mean you were actually there (or if you know anyway), could you PM me the name of the person in the audience who invited (right at the end, after questions) people there to e-mail him if they had questions?


It was live on the internet too. His first name was Antonio, so my guess is that he was Antonio Ereditato ("a physicist at the University of Bern in Switzerland and OPERA's spokesman"). Either way, that's a good place to send your questions.
 
Is it possible that the detected neutrinos actually travelled less distance than the direct path from the source to the detector? I know neutrinos are known for being able to pass through matter with little interaction, but the fact that they can be detected means they will sometimes produce an interaction.

So .. what if a neutrino is occasionally absorbed by an atom or particle within the earth and re-emitted? Due to quantum effects, can the time between absorption and re-emission be less than the time it would take light to travel the distance from where the particle was absorbed to where it was re-emitted? Could a series of "quantum leaps" or tunnelling explain the time difference?

Note ... this post comes from someone who has, perhaps, only a little better grasp of physics than Lindman. :-)

-- Roger
 
I've tried to split out the CT stuff that I'd erroneously merged into this thread, and merged them with the related thread in CT here. If I've missed any, or moved any that shouldn't have been moved, please feel free to PM me. My apologies for the inconvenience.
Posted By: LashL
 
My position is that they shouldn't have started with press release. They should have gone around and given talks, given the community a chance to debug this, and - if no flaws were found after some months - they could go public. Instead, going to the press first is (in my opinion) irresponsible and attention-seeking.

Seems to happen a lot lately, no ?

And thanks to LashL for getting the crazy out of this thread.
 
Is it possible that the detected neutrinos actually travelled less distance than the direct path from the source to the detector? I know neutrinos are known for being able to pass through matter with little interaction, but the fact that they can be detected means they will sometimes produce an interaction.

There is some serious speculation that one of the reasons that gravity is so weak (under a quantum view of gravity) is that gravitons are tunneling through dimensions that are too small for most particles to go through, and that neutrinos are doing this, too, and the path is slightly shorter than what you would measure macroscopically.

It doesn't matter much overall, though, because it's the signal velocity that is important. If this is a real effect, then the value of c would have to be revised upward (which, in SI, would mean that the meter would have to be slightly longer, because c is a defined constant and the meter is defined in terms of c and the second). This would mean that if you recalculated the distance using the revised definition of the meter, it would come out long enough.
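To put a rough number on that "revise c upward" argument, here's a back-of-envelope sketch using the publicly reported OPERA figures (an arrival roughly 60 ns early over a roughly 730 km baseline); both of those round numbers are my assumptions, not taken from this post:

```python
# How much would c (and hence the SI metre) have to shift to absorb
# a ~60 ns early arrival over a ~730 km baseline? Illustrative only.
C = 299_792_458.0       # m/s, the defined SI value of c
baseline_m = 730_000.0  # assumed CERN -> Gran Sasso straight-line distance
early_s = 60e-9         # assumed reported early-arrival time

distance_gap_m = early_s * C               # how far "ahead" the neutrinos ran
fractional_excess = distance_gap_m / baseline_m

print(f"apparent gap: {distance_gap_m:.1f} m")
print(f"v/c - 1 ~ {fractional_excess:.2e}")
# Revising c upward by this fraction stretches the metre by the same
# fraction - about 25 micrometres per metre.
```

The gap comes out to about 18 m, i.e. a fractional excess of a few parts in 100,000, which is the size of the revision being talked about.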

So .. what if a neutrino is occasionally absorbed by an atom or particle within the earth and re-emitted? Due to quantum effects, can the time between absorption and re-emission be less than the time it would take light to travel the distance from where the particle was absorbed to where it was re-emitted? Could a series of "quantum leaps" or tunnelling explain the time difference?

I've always found the absorption/re-emission metaphor a bit on the quasi-classical side, and I find it better to think in terms of QED, where the maximum probability path is wiggly and therefore a bit longer. Anyway, the result is the same, and yes, this happens. When a particle goes faster than the signal velocity of light in a medium, then you get Cherenkov radiation. I don't know how they are detecting the neutrinos, but one of the ways is to have them pass through a medium in which the signal speed of light is slower and measure the Cherenkov radiation.
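The threshold condition behind that last point (a particle outrunning the phase velocity of light, c/n, in a medium) is easy to sketch; the water refractive index below is an illustrative assumption, not a claim about how OPERA actually detects its neutrinos:

```python
# Cherenkov threshold: a charged particle radiates in a medium when its
# speed exceeds the phase velocity of light there, c/n.
C = 299_792_458.0  # m/s, speed of light in vacuum

def cherenkov_threshold(n: float) -> float:
    """Minimum particle speed (m/s) for Cherenkov emission in a medium
    with refractive index n."""
    return C / n

# Water (n ~ 1.33): the threshold is about 75% of c.
beta_min = cherenkov_threshold(1.33) / C
print(f"threshold speed in water: {beta_min:.3f} c")
```

So in a water detector, anything above roughly three-quarters of c lights up, which is why Cherenkov light makes a convenient speed trigger.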

Still, however, under relativity the signal speed would have to be less than c. Note that people call c the speed of light in a vacuum, but a vacuum isn't empty. It's boiling with short-lived particles, and it's been known for some time that when some of these particles are removed (as with Casimir plates), light goes just a hair faster than in an ordinary vacuum. This would mean that our measurements to date of c are just a hair too low, which I could believe.

As I've pointed out, actually violating relativity would call the GPS measurements into question, because they rely on relativity. However, some drifting of GPS has been observed, and this is normally corrected by signals from the ground. There has been some speculation about whether these drifts may be caused by small irregularities in spacetime. This wouldn't violate relativity; it would just mean that we don't have a full understanding of all the sources of energy/momentum.
 
There is some serious speculation that one of the reasons that gravity is so weak (under a quantum view of gravity) is that gravitons are tunneling through dimensions that are too small for most particles to go through, and that neutrinos are doing this, too, and the path is slightly shorter than what you would measure macroscopically.

"Tunneling"? Which theories are you thinking of?

Still, however, under relativity the signal speed would have to be less than c. Note that people call c the speed of light in a vacuum, but a vacuum isn't empty. It's boiling with short-lived particles, and it's been known for some time that when some of these particles are removed (as with Casimir plates), light goes just a hair faster than in an ordinary vacuum. This would mean that our measurements to date of c are just a hair too low, which I could believe.

It doesn't actually go faster - that's a misstatement of the Scharnhorst effect. The signal velocity is exactly the same (it's c, as measured by light's propagation through vacuum without any Casimir plates).
 
So .. what if a neutrino is occassionally absorbed by an atom or particle within the earth and re-emitted? Due to quantum effects, can the time between absorbtion and re-emission be less than the time it would take light to travel the distance from where the particle was absorbed then re-emitted? Could a serious of "quantum leaps" or tunnelling explain the time difference?

Such interactions should slow down the propagation - never speed it up.
 
As I've pointed out, actually violating relativity would call the GPS measurements into question, because they rely on relativity. However, some drifting of GPS has been observed, and this is normally corrected by signals from the ground. There has been some speculation about whether they may be caused by small irregularities in spacetime. This wouldn't violate relativity; it would just mean that we don't have a full understanding of all the sources of energy/momentum.
(my bolding)

Do you have any cites for the bolded part? I'm not aware of any GPS drift that isn't explainable by far more mundane processes (e.g. clock drift, solar pressure, thruster leakage, outgassing, and limitations on measurement accuracies).
 
...
Note that people call c the speed of light in a vacuum, but a vacuum isn't empty. It's boiling with short-lived particles, and it's been known for some time that when some of these particles are removed (as with Casimir plates), light goes just a hair faster than in an ordinary vacuum.

Ooh - that's a Hackenthorpe Vacuum; who needs the Casimir Effect - just take an ordinary vacuum and suck all the vacuum out... Aye thang yew - I'm here all week :D

An interesting comment on Matt Strassler's blog - apparently there's a ~60ns difference between the geodetic (surface) path as given by GPS and the direct (chordal) path between source & detector... wouldn't that be an embarrassing mistake! :jaw-dropp
 
An interesting comment on Matt Strassler's blog - apparently there's a ~60ns difference between the geodetic (surface) path as given by GPS and the direct (chordal) path between source & detector... wouldn't that be an embarrassing mistake! :jaw-dropp

First off, I disagree with the number - I just checked, and I got closer to 400m (60ns is about 20m). Second, I understood that they obtained 3D coordinates for both endpoints from GPS. I cannot imagine they wouldn't have computed the Cartesian distance between the two points and compared.

So I don't think that's the answer.
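For anyone who wants to check that arithmetic, here's a quick sketch of the surface-arc vs. straight-chord difference, using assumed round numbers (mean Earth radius, ~730 km chord), not OPERA's actual survey data:

```python
import math

# For a ~730 km chord through the Earth, how much longer is the
# great-circle arc along the surface? chord = 2 R sin(theta/2).
R = 6_371_000.0      # mean Earth radius, m (assumed)
chord_m = 730_000.0  # assumed straight-line CERN -> Gran Sasso distance
C = 299_792_458.0    # m/s

half_angle = math.asin(chord_m / (2 * R))  # theta/2, radians
arc_m = 2 * R * half_angle                 # arc = R * theta
extra_m = arc_m - chord_m
extra_ns = extra_m / C * 1e9               # light-travel time of the excess

print(f"arc - chord = {extra_m:.0f} m (~{extra_ns:.0f} ns of light travel)")
```

This gives roughly 400 m of excess, i.e. over a microsecond of light travel, which supports the "closer to 400m" figure above: confusing arc with chord would produce an error far bigger than 60 ns.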
 
I cannot imagine they wouldn't have computed the Cartesian distance between the two points and compared.

So I don't think that's the answer.


For something like this we shouldn't be confined to imagining what they did or didn't do. We're getting a lot of speculation from people sufficiently removed from the problem that one of them has a chance to hit on something everyone else thinks is too obvious.

What we need is an active list of every speculated cause of the error and a real analysis to eliminate that cause as a possible contributor. If nothing else, this will act as a filter to keep the same speculations from being asked over and over.
 
