So when I said
"If you fired a gun horizontally, how far would the shell drop in 80 seconds?"
And you replied
"Depends on Muzzle Velocity for the most part."
And then you immediately followed up by quoting me:
"Since you've demontrated your mathematical incompetence already..."
And responded
"Really?? Go ahead and show where....?"
Basically you answered your own question.
If you are struggling to understand what's funny about that, maybe we should go through it in baby steps. Sorry to everyone else who already gets it.
When you drop something, it falls to the ground. I imagine you're familiar with that idea.
A guy called Isaac Newton came up with some maths which describes the effect in terms of a force of attraction acting between the object you drop and the ground.
Now the rate at which the falling object speeds up is predictable and regular, if we can ignore the resistance of the air. So we can calculate how far it will fall in a chosen amount of time. It's pretty accurate if it's something dense like a rock, but not so good if it's something light like a feather or a piece of paper because of the wind resistance. Stop me if I'm going too fast.
Now I don't know if they taught this in sniper school, but if you roll a ball off the side of a table it takes exactly the same time to hit the floor as if you dropped it from the same height as the table. Although one ball is moving sideways and the other just goes straight down, the time they take to drop is identical. The same goes for things moving sideways a lot faster than a rolled ball. A bullet, for example. And I'm damned sure they would have taught that in sniper school.
Even if the earth were flat, an object launched exactly sideways at enormous speed takes just as long to drop to the ground as a similar object which is simply allowed to fall from the same height.
So the distance it drops in a particular time does not depend on muzzle velocity. Therefore you were wrong.
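If you'd rather see it with numbers, here's a little Python sketch you can run yourself (my own illustration, not anything you posted; it ignores air resistance and the curvature of the earth, and the 2,500 m/s is just your 5,600 mph converted):

g = 9.8    # acceleration due to gravity, m/s^2
t = 80.0   # time of fall, seconds

# Vertical drop under gravity, ignoring drag: s = 1/2 * g * t^2.
# Notice the horizontal (muzzle) velocity never appears in the formula.
for label, muzzle_velocity in [("simply dropped", 0.0),
                               ("fired flat at ~2,500 m/s", 2500.0)]:
    drop = 0.5 * g * t ** 2
    print(f"{label}: falls {drop:,.0f} m in {t:.0f} s")

Both lines print the same figure, because the sideways speed isn't in the calculation.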
Pressing on though, if the object was a rail gun projectile and travelled at 5,600 mph (I believe that's the figure you quoted), it would take 80 seconds to travel 125 miles to its target (again neglecting the drag of the air slowing the projectile down, just as a first approximation).
By the time it arrived, even if the earth were perfectly flat, the projectile would have dropped below the original point of aim by a distance you can calculate from the simple formula:

s = ½ a t²

where
s is the distance,
a is the acceleration due to gravity, and
t is the time in seconds.

If I tell you that a is approximately 9.8 m/s² or, if you prefer, 32 ft per second per second, you can work it out for yourself.
No spoilers but if you get an answer near 20 miles then we probably agree.
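And here's the same arithmetic done end to end, purely as a sketch (drag ignored, flat earth assumed, unit conversions are mine):

MPH_TO_MPS = 0.44704        # miles per hour -> metres per second
METRES_PER_MILE = 1609.344

a = 9.8                     # acceleration due to gravity, m/s^2
t = 80.0                    # time of flight, seconds
v = 5600 * MPH_TO_MPS       # muzzle velocity, roughly 2,500 m/s

range_miles = v * t / METRES_PER_MILE       # distance travelled downrange
drop_metres = 0.5 * a * t ** 2              # s = 1/2 * a * t^2
drop_miles = drop_metres / METRES_PER_MILE

print(f"range: {range_miles:.0f} miles, drop: {drop_miles:.1f} miles")

That comes out at roughly 124 miles downrange and a drop of just under 20 miles.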
So having flown 125 miles in 80 seconds the projectile would have dropped quite a long way. A sniper would know all that, of course. How high was that bulge the rail gun had to shoot over again?
