I don't think space is expanding.

Status
Not open for further replies.
Start with special relativity. Minkowski spacetime.

Then say that photons start out on the null geodesic, v = c (*edit* nope, that can no longer be true; it ends on the null geodesic near the observer), but every starting point for a photon has its own unique path through spacetime, according to v = c/(1+HD)².

This makes all the connections between significantly distant events not strictly light-like.

Your profound ignorance strikes again.

You can't make light travel on time-like paths without that affecting EVERYTHING from chemistry to nuclear physics. Do you think that a star in which light traveled at only 1/2 c would look anything like a star in which light traveled at c? No. They would be completely different in every way, from size to temperature to spectra. None of it would match up.

You can't just tweak one setting and expect nothing else to change. Reality doesn't work that way.
 
Yes! 100%

If the photon and the graviton fall off with distance twice (once by the traditional inverse-square law, and a second time by the Hubble inverse-square law), then the maximum electromagnetic and gravitational intensity will always be where you are.

This explains the flatness of the universe.

You really don't get it. The universe that this describes is quite explicitly NOT flat. Flatness requires homogeneity, and you've made it as inhomogeneous as possible. Plus, this explicitly contradicts your OTHER nonsense interpretation that light travels on time-like paths instead of null paths.
 
But every observer would view their Hubble volume the same way we view ours.
 
Yes it is.

Every observer would perceive the distant universe as redshifted.

Nothing you say makes any sense. You say light starts out traveling at less than c and then speeds up as it travels. But if that's the case everywhere, then light emitted AND detected locally should be traveling at less than c. But it isn't.

This fails even worse, and more obviously, than your original idea. That's quite an accomplishment.
 

It doesn't matter where the light is emitted.

It matters how far the light is from you.

If you shoot a laser beam, that light will decelerate after traveling for millions of years.

If it were to warp around a black hole and come flying back this way, it would gain speed and be moving at c when it reaches you.
 
Can't we just say, "I measured my meter stick today, and it's one meter long, same as last year, so space can't be expanding?"

That's actually less wrong than all these ad hoc varying light speed models.
 
One wonders what would happen if the force of expansion increased to the point where it overrode local gravity. Let alone local molecular bonds. If expansion was stretching your meter stick, you'd probably have much more serious problems to worry about. Like your own gradual but inevitable disintegration.

And the fact that your planet is probably already out of any stable orbit. And your sun is about to dissipate.
 

You have two contradicting theories here. Does light slow down, or does it speed up? Maybe it depends on the phase of the moon, or the direction of the wind. Let us hope it is not north-northwest.
 
So light always travels at c here and at less than c away from here? What’s so special about here?
 
The theory is the same.

v = c/(1+HD)²
If D increases, v slows down. If it decreases, v goes up.
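As a quick numerical illustration of the formula being proposed (a sketch only: the thread never specifies units, so I assume D is measured in Hubble radii c/H, which makes HD dimensionless):

```python
# Sketch of the proposed speed law v = c / (1 + H*D)^2.
# Assumption (mine, not the thread's): D is expressed in units of the
# Hubble radius c/H, so the product H*D is a pure number.
C = 299_792_458.0  # speed of light in vacuum, m/s

def proposed_speed(d_in_hubble_radii: float) -> float:
    """Speed the proposed formula assigns to light at distance D."""
    return C / (1.0 + d_in_hubble_radii) ** 2

# D = 0 gives v = c; at D = 1 Hubble radius the formula gives v = c/4.
for d in (0.0, 0.1, 0.5, 1.0):
    print(f"D = {d:>4} c/H  ->  v = {proposed_speed(d) / C:.3f} c")
```

Note that under this formula v depends only on the photon's current distance from the observer, not on where it was emitted, which is exactly the claim being disputed above.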
 
No observer is special.

If two observers are at the same place but going at different relative speeds, then the speed of light is still c to both observers.
 

That is a postulate of SR which you have abandoned. Plus nothing is moving with respect to anything else in your scheme. So I ask again. If light speed is c here and less than c elsewhere, what is special about here?
 