
How do physicists think about Zeno's arrow?

In other words, the definition of instantaneous velocity requires one to consider the motion of the arrow over a finite duration of time, as Aristotle pointed out.

Infinitesimals are well defined nowadays. No need to sweep them under the rug, as limit notation does. They're also much easier to think about, especially when it comes to double integrals: you can just think about a little square, or a little box, as most physicists always have done, including Newton and Leibniz.
 
Ultimately all of those paradoxes manifest themselves because the notion of continuity must first be defined in a way that avoids paradox.

The meta-problem is what justifies abandoning paradox? It seems to rely on the mathematics having some force to override what may occur in the physical world. That is, that dividing by zero should be "disallowed" on mathematical grounds. I think this is a dangerous game. Don't physicists run into this with QM in the other direction, where "infinities" must be normalized?

It seems like, when reality and applied mathematics collide, reality ought to win. I'm not saying this is so in this case, just that "mathematically impossible" might not be enough to toss aside all the issues. In this view, paradoxes are more about our own psychology than anything "real."
 
To recap the original question: if one arrow is standing still and the other is moving, what property distinguishes the moving arrow from the still one when we freeze time?

The problem is that you say 'freeze time', but the concept of motion doesn't make any sense without time.
If we define velocity as the change in position divided by the change in time, and the change in time is zero, then you're dividing by zero.
Bang, you're dead. Paradox galore.
This is why we use differential equations: they neatly bypass this problem.
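The divide-by-zero trap and its escape can be seen numerically: the difference quotient is undefined at Δt = 0, but it settles toward a definite value as Δt shrinks. A minimal Python sketch, assuming a hypothetical arrow in free fall:

```python
# Numerical illustration: the difference quotient (x(t+dt) - x(t)) / dt
# approaches a definite value as dt shrinks, even though at dt = 0
# the expression itself is the undefined 0/0.

def x(t):
    """Hypothetical position of an arrow in free fall: x = (1/2) g t^2."""
    g = 9.8  # m/s^2
    return 0.5 * g * t**2

t = 1.0
for dt in (0.1, 0.01, 0.001, 1e-6):
    v_approx = (x(t + dt) - x(t)) / dt
    print(f"dt = {dt:g}: v ~ {v_approx:.6f}")

# The quotients converge toward g * t = 9.8 m/s: the instantaneous
# velocity, defined without ever actually setting dt to zero.
```

The limit is never reached by substitution; it is the value the quotients approach, which is exactly how the derivative sidesteps the frozen-instant formulation.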
Ultimately all of those paradoxes manifest themselves because the notion of continuity must first be defined in a way that avoids paradox.

I don’t think the math breaking down is the problem. Mathematically you can still define velocity as the time delta approaches zero, with no limit on how close to zero you take it.

Taking a step back to generalize the problem a bit, what you get is that we have a model of the universe that lets us look at where an object is at any point in time, but we can’t tell how it gets from one point to the next. We can divide the space between the points into more points and see where the object is at each of those, but we still have the same problem. The math allows us to keep dividing space infinitely but never explains how the object gets from one point to the next.
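The endless-subdivision worry can at least be made concrete on the mathematical side: infinitely many sub-distances can still sum to a finite whole. A small sketch (the 1-metre total distance is an arbitrary assumption):

```python
# Zeno-style subdivision: cross half the remaining distance each step.
# Infinitely many sub-distances, but their sum converges to the total.

total = 1.0  # hypothetical distance in metres
covered = 0.0
piece = total / 2
for step in range(60):
    covered += piece
    piece /= 2

print(covered)  # after 60 halvings, indistinguishable from 1.0 in floats
# Mathematically: sum over n >= 1 of 1/2**n equals 1 exactly.
```

This answers "do the pieces add up?" but, as the post says, not "how does the object traverse them?", which is the part the model leaves open.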

Ultimately it isn’t the math that breaks, as you are suggesting, but that model of the universe. Once you divide down to quantum scales, velocity, position, time, etc. become dependent on each other, and the preconditions of the paradox violate these dependencies. The “paradox” is actually using a set of conditions that isn’t physically possible.

As you look at smaller and smaller slices of time, energy becomes less and less certain.

As you look at a more and more exact location, momentum becomes less and less certain.

For constant mass (i.e., non-relativistic velocities) both of these translate into velocity becoming less and less certain.

Furthermore, the best current understanding is that this isn’t just an observational phenomenon, but the physical property of the object itself that is uncertain. If you looked at both arrows in an infinitely small slice of time, they would both have all possible velocities so asking which one was moving is nonsensical. If you look at longer periods of time the range of velocities gets smaller. If you look at a long enough period of time for the effect not to matter, you are already in territory where you can use classical physics and techniques to figure out which arrow is moving and how fast.
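To put rough numbers on the scale argument above, here is an order-of-magnitude sketch of the minimum velocity uncertainty for a macroscopic arrow, using illustrative, assumed values for the mass and the position uncertainty:

```python
# Order-of-magnitude check (assumed, illustrative numbers): how uncertain
# is the velocity of a macroscopic arrow whose position is pinned down?
# Heisenberg: dx * dp >= hbar / 2, so dv >= hbar / (2 * m * dx).

hbar = 1.054571817e-34  # J*s, reduced Planck constant
m = 0.03                # kg, a hypothetical 30 g arrow
dx = 1e-3               # m, position assumed known to a millimetre

dv_min = hbar / (2 * m * dx)
print(f"minimum velocity uncertainty: {dv_min:.3e} m/s")
# ~1.8e-30 m/s: utterly negligible for an arrow, which is why classical
# reasoning works at this scale -- but the bound diverges as dx -> 0,
# which is exactly the regime the frozen-instant formulation demands.
```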
 
Infinitesimals are well defined nowadays. No need to sweep them under the rug, as the limit notation does.

Yes, but an infinitesimal takes you out of the realm of classical physics, and one must respect quantum rules as well. The formulation in the OP violates the uncertainty principle by demanding knowledge of position and momentum simultaneously. To stay within the classical realm you’d need to look at the arrow long enough that it would be obvious which one was moving.
 
(some snipped)
Furthermore, the best current understanding is that this isn’t just an observational phenomenon, but the physical property of the object itself that is uncertain. If you looked at both arrows in an infinitely small slice of time, they would both have all possible velocities so asking which one was moving is nonsensical. If you look at longer periods of time the range of velocities gets smaller. If you look at a long enough period of time for the effect not to matter, you are already in territory where you can use classical physics and techniques to figure out which arrow is moving and how fast.

This bit is particularly nice. Best approach I've read yet.
 
The meta-problem is what justifies abandoning paradox? It seems to rely on the mathematics having some force to override what may occur in the physical world. That is, that dividing by zero should be "disallowed" on mathematical grounds. I think this is a dangerous game. Don't physicists run into this with QM in the other direction, where "infinities" must be normalized?

It seems like, when reality and applied mathematics collide, reality ought to win. I'm not saying this is so in this case, just that "mathematically impossible" might not be enough to toss aside all the issues. In this view, paradoxes are more about our own psychology than anything "real."

Surely reality winning is what happened? The original problem described an unreal situation, assuming the mathematical thinking of the time was correct. The creation/discovery of infinitesimals enabled mathematics to describe and explain more accurately what was observed to be happening in reality.
 
Infinitesimals are well defined nowadays. No need to sweep them under the rug, as limit notation does. They're also much easier to think about, especially when it comes to double integrals: you can just think about a little square, or a little box, as most physicists always have done, including Newton and Leibniz.

It's a fair point that infinitesimals can be well defined nowadays, though most physicists don't use infinitesimals of the well-defined variety, and Newton and Leibniz certainly didn't. Nearly all of the reasoning with "infinitesimals" I've seen in physics is really just reasoning with the lowest-order expansion of a Taylor series in some parameter, which is assumed to be extremely small, followed at the end by taking the parameter → 0 limit. I don't see physicists using hyperreals in practice.
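That habit can be illustrated with the simplest case, expanding (x + ε)²: the physicist keeps the term linear in ε and drops the ε² term, because the dropped term shrinks faster than the kept one. A small numerical sketch:

```python
# The typical physicist's "infinitesimal" is really the lowest-order term
# of a Taylor expansion in a small parameter eps, with higher orders
# dropped because they vanish faster as eps -> 0.
# Example: (x + eps)**2 = x**2 + 2*x*eps + eps**2.
# The "infinitesimal change" kept is 2*x*eps; the discarded eps**2 term
# becomes negligible *relative to it* as eps shrinks.

x = 3.0
for eps in (0.1, 0.01, 0.001):
    first_order = 2 * x * eps      # term kept
    remainder = eps**2             # term dropped
    print(f"eps = {eps:g}: kept {first_order:.4f}, "
          f"dropped/kept = {remainder / first_order:.2e}")

# The ratio dropped/kept scales like eps itself, which is what licenses
# taking the eps -> 0 limit at the end of the calculation.
```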
 
Surely reality winning is what happened? The original problem described an unreal situation, assuming the mathematical thinking of the time was correct. The creation/discovery of infinitesimals enabled mathematics to describe and explain more accurately what was observed to be happening in reality.

I think the explanation involving small scale uncertainty is better. That escapes the infinitesimal altogether, replacing it with a probabilistic viewpoint. But you are correct in that "getting the math that fits" is the way to go.

It is interesting, though, when our situation mirrors that of the ancient Greeks: no ability to capture the essential reality because the experiment cannot be done. In that case, while we look for "physicality," we don't sit back and wait; we spin mathematical constructs to propose ideas which might "fit." Pretty much what they did too.

And, of course, there are those physicists who assert the universe is "made of math." Surprisingly, not in a vague sense, but in a very concrete way. Here's an example, with Max Tegmark interviewed on the Rationally Speaking podcast: http://rationallyspeakingpodcast.or...-on-the-mathematical-universe-hypothesis.html
 
+schrodingasdawg The problem, I think, is that most treatments of hyperreals come from set theory. I've used dual numbers before in a computer program to implement automatic differentiation, so you could write a function only once in code and automatically get its derivative "for free". Dual numbers are like complex numbers, except that the extra unit squares to zero instead of minus one. This can be seen as a representation of an infinitesimal; the approach is called smooth infinitesimal analysis. It is a more number-centric approach to infinitesimal quantities, which I like more than the set-theoretic one.

I was researching how to "do" the reciprocals of the infinitesimals so I can get them to cancel to a finite quantity. This approach could then represent hyperreals completely. The number has a scalar part, the null vectors from Conformal Geometric Algebra (the ones that represent origin and infinity), and the wedge product between them: a·s + b·no + c·ni + d·(no∧ni). In this way I was able to work out the multiplication table.

Not sure where this leads though ...
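For anyone curious what dual-number automatic differentiation looks like, here is a minimal toy sketch in Python (my own illustration, not the poster's actual program): overloading + and * so that evaluating a function at a + 1·d carries the derivative along in the coefficient of d.

```python
# Minimal forward-mode automatic differentiation with dual numbers:
# values of the form a + b*d, where the extra unit d satisfies d*d = 0.
# Writing f once and evaluating it at Dual(x, 1) yields f(x) and f'(x).

class Dual:
    def __init__(self, real, infinitesimal=0.0):
        self.re = real            # ordinary value
        self.eps = infinitesimal  # coefficient of d, with d*d = 0

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.re + other.re, self.eps + other.eps)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b d)(c + e d) = ac + (ae + bc) d, since d*d = 0
        return Dual(self.re * other.re,
                    self.re * other.eps + self.eps * other.re)

    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1  # written once, no derivative code

result = f(Dual(2.0, 1.0))
print(result.re, result.eps)  # f(2) = 17, f'(2) = 14
```

The derivative falls out of the algebra because the d² terms, the "higher-order infinitesimals", vanish identically rather than being dropped by hand.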
 
It's a fair point that infinitesimals can be well defined nowadays, though most physicists don't use infinitesimals of the well-defined variety, and Newton and Leibniz certainly didn't. Nearly all of the reasoning with "infinitesimals" I've seen in physics is really just reasoning with the lowest-order expansion of a Taylor series in some parameter, which is assumed to be extremely small, followed at the end by taking the parameter → 0 limit. I don't see physicists using hyperreals in practice.

It depends what you mean by 'in practice'. The mathematics of 'hyper reals' has always been part of calculus. The definitions of hyper reals just summarize all the rules of calculus that for the most part were determined empirically. The convergence of Taylor series is basically justified with hyperreals.

I don't see physicists, engineers or draftsmen using 'Euclidean geometry' in practice. Draftsmen were using some of the postulates of Euclidean geometry hundreds of years before Euclid compiled and formalized the rules of geometry.

Look at some of the Bronze Age and Stone Age construction. I conjecture that the engineers who built Stonehenge used compasses and straight edges to keep those stones in the form of a regular polygon. At any rate, they didn't need Euclid to tell them how to construct a circle. I am almost sure that the engineers who built the pyramids knew about inclined planes. They didn't need Archimedes to tell them what an inclined plane does.

That being said, Euclid and Archimedes advanced mathematics by bringing these rules together. Logical proof has its uses.

Physicists may occasionally use hyperreals when the calculus leads to paradoxes. The use of infinitesimals has always been associated with apparent paradoxes, beginning with Zeno. Without a formal framework, one can calculate physically absurd conclusions.

I think formal mathematics is often used as a tie-breaker when examining ambiguous conclusions from empirical data. If one analyzes the data using infinitesimals, one comes to important conclusions, but sometimes those conclusions are ambiguous. If one's intuition leads to ambiguous conclusions, then it is time to bring out the hyperreals.

So I think that to a physicist, even to an engineer, hyperreals are useful: they reduce the ambiguity brought on by empirically derived mathematics. I think of them as a tie-breaker, though. If one is happy with ambiguous solutions, then you don't need formal mathematics.
 
The definitions of hyper reals just summarize all the rules of calculus that for the most part were determined empirically. The convergence of Taylor series is basically justified with hyperreals.

...

I think formal mathematics is often used as a tie-breaker when examining ambiguous conclusions from empirical data.

...

If one is happy with ambiguous solutions, then you don't need formal mathematics.

Your post makes no sense to me. Calculus had already been put on a rigorous basis before hyperreals were postulated. Taylor series can be shown to converge (uniformly) within some radius of convergence using classical techniques for demonstrating convergence (e.g., the Weierstrass M-test), techniques that only involve functions from subsets of reals to reals. Hyperreals are absolutely unnecessary for a formal, rigorous treatment of this. I really hope you don't think real analysis is somehow ambiguous or merely "empirical."
 
Your post makes no sense to me. Calculus had already been put on a rigorous basis before hyperreals were postulated. Taylor series can be shown to converge (uniformly) within some radius of convergence using classical techniques for demonstrating convergence (e.g., the Weierstrass M-test), techniques that only involve functions from subsets of reals to reals. Hyperreals are absolutely unnecessary for a formal, rigorous treatment of this. I really hope you don't think real analysis is somehow ambiguous or merely "empirical."

Sorry. I was wrong, or at least I stated my point too strongly.

I am reading a book about the history of infinitesimals. I was noting that some mathematicians were adding and subtracting infinitesimals almost like they were real numbers. They used the concept of 'limit' but only as a last step.

Many of these mathematicians going back to classical times described their work as probing 'the structure of the continuum'. They treated infinitesimal quantities almost like they were inflatable atoms. They described infinitesimals as 'holes' in the real number line.

Hyperreals seem more geometric to me than infinite sequences; infinite sequences are more algebraic. The concept of hyperreals evokes in my imagination the image of 'holes in the number line'. The inequality relations used to prove convergence aren't easily associated with 'holes'.

The work on infinitesimals long preceded the Taylor series. Infinitesimals were an extrapolation of Euclidean geometry (though standing outside of it). The mathematicians justified such language in terms of the empirical properties of physical materials, which is strange, since an infinitesimal quantity can never be written as a real number.

The idea of infinitesimals really arose from geometry, so rigor for infinitesimals means being consistent with Euclidean geometry. The idea of limits really arose from symbolic algebra, so rigor for the concept of limit means being consistent with algebra. There is a great deal of overlap between the two, of course. However, Euclidean geometry long preceded symbolic algebra.

Galileo did a lot of work with infinitesimals. His followers developed many proofs showing how the work with infinitesimals was consistent with, but not equivalent to, Euclidean geometry. Euclidean geometry assumes a finite number of simple operations, where 'simple' designates straight edge and compass. Symbolic algebra wasn't used very much; Newton himself did not use it much, yet Newton is somehow called the father of calculus.

It helps when reading this book to think about infinitesimals in a geometric way. Therefore, I think more of rigor in the geometric sense than in the algebraic sense. I think that the rigorous algebraic formalism is important too. However, many of the early mathematicians were firmly rooted in geometry.
 
What book is this?
I've recommended this book so many times on this forum!

Well, once more:

‘Infinitesimal: How a Dangerous Mathematical Theory Shaped the Modern World’ by Amir Alexander (OneWorld Books, 2014)

New paperback edition, May 2015.

Amazon link:

http://www.amazon.com/Infinitesimal-Dangerous-Mathematical-Theory-Shaped/dp/0374534993
In Infinitesimal, the award-winning historian Amir Alexander exposes the deep-seated reasons behind the rulings of the Jesuits and shows how the doctrine persisted, becoming the foundation of calculus and much of modern mathematics and technology. Indeed, not everyone agreed with the Jesuits. Philosophers, scientists, and mathematicians across Europe embraced infinitesimals as the key to scientific progress, freedom of thought, and a more tolerant society. As Alexander reveals, it wasn't long before the two camps set off on a war that pitted Europe's forces of hierarchy and order against those of pluralism and change.


I quoted this book as an example of where an apparent 'paradox' was debated for many years, leading to breakthroughs in science and technology.
 
It's a very small book. You get it by dividing a regular book in half and repeating the operation many times. You won't end up with the book you want, but you can get as close to it as you desire.

There are invisible pages that lie between each paper page. Indeed, they lie between each letter in the book. These pages can be 'inflated' to any size one wants to read, but they remain invisible. The invisible pages are the best part of the book! :)
 
It's a very small book. You get it by dividing a regular book in half and repeating the operation many times. You won't end up with the book you want, but you can get as close to it as you desire.

There are invisible pages that lie between each paper page. These pages can be 'inflated' to any size one wants to read, but they remain invisible.

I suggest you read the invisible pages. The invisible pages are the best part of the book! :)
 
Since there are many well-trained (even if not degreed) physics aficionados on this forum, I'd like to know, not necessarily what the resolution to the original "paradox" is, but how someone with a better understanding of physics (and more schooling) than me views the situation - how do you think about it?

It is time someone answered the OP's question. I can answer authoritatively only for myself, a physicist. However, I will also provide an authoritative reference that discusses this issue in more detail than I can.

I am a degreed physicist (PhD). I have often worked with calculus, which is built on the concept of limits, so I have worked with conundrums similar, although not always identical, to Zeno's paradox. This comes up in what one would call applied math, not what most people call philosophy.

Physicists work with a smorgasbord of mathematical concepts which are often related to each other but superficially appear different. Hence the concept of limit gets expressed in a number of different ways, and there is more than one mathematically valid way to resolve Zeno's paradox. The use of any specific approach depends both on the application and on the aesthetic prejudices of the physicist or mathematician.

The concept of infinitesimals is one used a great deal by some physicists. Someone working with a solid may refer to differential quantities; a fluid dynamicist may refer to infinitesimals as fluid elements, for example. I am currently interested in the concept of infinitesimals, which has a long history in human thought. There are other ways to handle the concept of limits, such as topology, which involves inequality relations. However, I have a nice reference that could answer your questions.

My current preference for dealing with Zeno's paradox is through the concept of infinitesimals. My preferences are not fixed, but that is what I am currently into; another physicist could look at Zeno differently. However, my current state of mind considers things in a way analogous to the way the 'infinitesimal' mathematicians did, as described in the following reference.

I recommend the following book especially for Marsplot and TheAdversary. The book describes the history of the infinitesimal idea starting with the paradoxes of Zeno. Although it introduces the topic of infinitesimals with Zeno, it continues a bit farther.

The book follows the infinitesimal concept up to the time of Newton, who used it in the Principia. Some historians credit him with inventing the idea, but it was a controversial subject for the two centuries before the Principia, and the infinitesimal concept had to be commonly accepted in order for the Principia to be formally accepted. At any rate, Newton is the quintessential theoretical physicist of all time, so I think this will partly answer Marsplot's question, 'What do physicists think about Zeno's paradox?'





‘Infinitesimal: How a Dangerous Mathematical Theory Shaped the Modern World’ by Amir Alexander (OneWorld Books, 2014)

New paperback edition, May 2015.

Amazon link:

http://www.amazon.com/Infinitesimal-.../dp/0374534993


I would love to discuss the question of Zeno's paradox with someone else who finishes this book. A lot of things have advanced since Newton, so I think it would be interesting. The old physics could provide some illumination of the controversies in modern physics. So I highly recommend this book.
 
‘Infinitesimal: How a Dangerous Mathematical Theory Shaped the Modern World’ by Amir Alexander (OneWorld Books, 2014)

New paperback edition, May 2015.

Amazon link: http://www.amazon.com/Infinitesimal-.../dp/0374534993

I would love to discuss the question of Zeno's paradox with someone else who finishes this book. A lot of things have advanced since Newton, so I think it would be interesting. The old physics could provide some illumination of the controversies in modern physics. So I highly recommend this book.


I have yet to read Amir Alexander's book, but hope to find the time to do so. I love those sort of books and get swept up in the historical drama for sometimes weeks on end.

But there is already a very good book on that subject, which Mr. Alexander will be hard pressed to top: math historian Carl B. Boyer's classic on the history and conceptual development of the calculus. It's an outstanding book; as good as it gets!

"The History of the Calculus and Its Conceptual Development"
Carl B. Boyer (1959)
Amazon link: Here

The book can also be read online (for free) at the following links:

PDF Format: Here
Text Format: Here
 
I have yet to read Amir Alexander's book, but hope to find the time to do so. I love those sort of books and get swept up in the historical drama for sometimes weeks on end.

But there is already a very good book on that subject, which Mr. Alexander will be hard pressed to top: math historian Carl B. Boyer's classic on the history and conceptual development of the calculus. It's an outstanding book; as good as it gets!

"The History of the Calculus and Its Conceptual Development"
Carl B. Boyer (1959)
Amazon link: Here

The book can also be read online (for free) at the following links:

PDF Format: Here
Text Format: Here

I actually met Professor Carl Boyer when I was in college. A very knowledgeable man.

I will read this book eventually; I see it also covers the history of calculus.


So this is the second recommendation I can give the OP, aka Marsplot.

I recommend that everyone read 'The History of the Calculus and Its Conceptual Development' by Carl Boyer in addition to 'Infinitesimal' by Amir Alexander. They are both fairly short books, and each provides a different view of how Zeno's paradox has influenced science and history. If you read both, you will get an idea of how physicists view Zeno's paradox.

I conjecture that these books probably complement each other nicely. If anyone reads both books, he or she should tell us what they learned!
 
