Deeper than primes

You are right, jsfisher.

Being distinct points along 1-dimensional space is possible only if, given any arbitrary pair of points, they are not adjacent.

Please define adjacent in this context. I bet you can't, because it's meaningless to say that two points are adjacent. They can't be. They are either distinct or the same.
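One way to see why, for points on the real line: between any two distinct points there is always a third point strictly between them, so no two distinct points can sit "next to" each other:

[latex]x < y \implies x < \frac{x+y}{2} < y[/latex]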
 
I'm looking at a calculus textbook right now that agrees with me. It has this written almost exactly:

[latex]a + ar + ar^{2} + ... + ar^{n-1} + ... = \displaystyle\sum\limits_{n=1}^{\infty}ar^{n-1}=\lim_{n \to \infty}\displaystyle\sum\limits_{k=1}^{n}ar^{k-1}=\frac{a}{1-r}[/latex]

Substitute in 1 for a and 1/2 for r to get:

[latex]1 + 1/2 + 1/4 + ... + (1/2)^{n-1} + ... = \displaystyle\sum\limits_{n=1}^{\infty}(1/2)^{n-1}=\lim_{n \to \infty}\displaystyle\sum\limits_{k=1}^{n}(1/2)^{k-1}=\frac{1}{1-1/2}=2[/latex]

Subtract 1 from both sides and you have:

[latex]1/2 + 1/4 + 1/8 + ... = 1[/latex]

An analysis text I have from 1980 also uses the + ... to indicate the limit of the sequence of partial sums. It's not as precise as using the Sigma notation or using a limit symbol, but it's clear that it indicates an infinite summation, which is defined as the limit of the sequence of partial sums, which is 1 in this case. Hence, 1/2 + 1/4 + 1/8 + 1/16 + ... < 1 is nonsense.
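To spell out the step that the "+ ..." notation compresses: the nth partial sum has a closed form, and the sum of the series is by definition its limit (a one-line sketch, valid for |r| < 1):

[latex]\sum\limits_{k=1}^{n}ar^{k-1} = \frac{a(1-r^{n})}{1-r}, \qquad |r|<1 \implies \lim_{n \to \infty}\frac{a(1-r^{n})}{1-r} = \frac{a}{1-r}[/latex]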

If Doron defines 1/2 + 1/4 + 1/8 + 1/16 + ... in a nonstandard way, then he needs to specify that, which is what I've been asking him.

You grossly misunderstand the issue here. Your excerpt introduces the limit into the whole computation as soon as possible to arrive at a result. Without including the limit there would be no way to do so. By a common agreement among mathematicians, the limit is considered the sum of a convergent series, even though it is not so, because the "margin of error" is absolutely negligible and no one has ever complained about any inconsistencies when such a series becomes part of some broader proposition. And so it often happens that the reference to the limit is omitted by publishing pseudo-mathematicians, such as Dr. Michon and others. Doron sees it and draws his own conclusion about "traditional mathematics", as he did in this particular case.

The appeal to authority on which you entirely depend results in your calling the expression 1/2 + 1/4 + 1/8 + 1/16 + ... < 1 nonsense. That particular expression, as Doron wrote it, doesn't mention any limit, and there is no rule that would prohibit it from standing alone. Therefore the inequality is correct, as is simply proved by

1 - 1/2^n < 1 where n → ∞
-1/2^n < 0
-1 < 0

Of course, Doron regards this inequality, which is correct but useless for any purpose, as his proof that traditional math is riddled with humongous inconsistencies and contradictions, and that only OM can save the computational future from a total collapse.
 
You grossly misunderstand the issue here. Your excerpt introduces the limit into the whole computation as soon as possible to arrive at a result. Without including the limit there would be no way to do so.

Well that's quite interesting considering you can establish that 1/2 + 1/4 + 1/8 + ... = 1 without using limits.

Consider the following:

[latex]S = 1/2 + 1/4 + 1/8 + ...[/latex]

and

[latex]S/2 = 1/4 + 1/8 + 1/16 + ...[/latex]

What happens when you subtract the second from the first?

[latex]S - S/2 = (1/2 + 1/4 + 1/8 + ...) - (1/4 + 1/8 + 1/16 + ...)[/latex]

[latex]S/2 = 1/2 + (1/4 + 1/8 + 1/16 + ...) - (1/4 + 1/8 + 1/16 + ...)[/latex]

[latex]S/2 = 1/2[/latex]

[latex]S = 1[/latex]

Hence,

[latex] 1/2 + 1/4 + 1/8 + ... = 1[/latex]

Of course, this proof would be unacceptable in higher-level mathematics because, if you disregard limits, 1/2 + 1/4 + 1/8 + ... doesn't truly have a precise meaning: summation by itself is only defined for finite collections of numbers.
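For what it's worth, the same telescoping trick can be carried out rigorously on the finite partial sums, with the limit entering only at the very last step; a sketch:

[latex]S_n = \sum\limits_{k=1}^{n}\frac{1}{2^{k}}, \qquad S_n - \frac{S_n}{2} = \frac{1}{2} - \frac{1}{2^{n+1}} \implies S_n = 1 - \frac{1}{2^{n}} \implies \lim_{n \to \infty} S_n = 1[/latex]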

By a common agreement among mathematicians, the limit is considered the sum of a convergent series, even though it is not so, because the "margin of error" is absolutely negligible and no one has ever complained about any inconsistencies when such a series becomes part of some broader proposition.

That's an interesting claim. What "margin of error" can possibly exist between your so-called true value of 1/2 + 1/4 + 1/8 + ... and 1? No matter how close to 1 you choose a number x < 1, I'll always be able to set n large enough such that the partial sum of the series exceeds x. This is a proven fact, and it's exactly what the definition of a limit says. So, we know that the sum of the infinite series cannot be less than 1. We also know that the sum cannot be greater than 1 because no matter how large we choose n the nth partial sum will always be less than 1. This leaves one option: the sum of the infinite series is 1.
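To make that explicit, here is a sketch of the standard argument: given any x < 1, put ε = 1 - x and pick n larger than log2(1/ε); the nth partial sum then already exceeds x.

[latex]\varepsilon = 1 - x > 0, \quad n > \log_2\frac{1}{\varepsilon} \implies \frac{1}{2^{n}} < \varepsilon \implies 1 - \frac{1}{2^{n}} > 1 - \varepsilon = x[/latex]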

The appeal to authority on which you entirely depend results in your calling the expression 1/2 + 1/4 + 1/8 + 1/16 + ... < 1 nonsense. That particular expression, as Doron wrote it, doesn't mention any limit, and there is no rule that would prohibit it from standing alone.

It is nonsense solely for the reason that neither you nor Doron attempts to give this expression a precise meaning that differs from the traditional one, namely the limiting value of the partial sums of the series, yet you make an assertion about it.

1 - 1/2^n < 1 where n → ∞
-1/2^n < 0
-1 < 0

And this is wrong. 1 - 1/2^n < 1 is true for any natural number n, not as n approaches infinity. As n approaches infinity, we end up with 1 = 1. This holds true for the definition of the limit as the mathematical world has used it since the time of Weierstrass. If you mean something else by n approaching infinity then you must specify so precisely.
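In symbols, the two statements being run together are:

[latex]1 - \frac{1}{2^{n}} < 1 \ \text{ for every natural } n, \qquad \text{yet} \qquad \lim_{n \to \infty}\left(1 - \frac{1}{2^{n}}\right) = 1[/latex]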
 
jsfisher said:
At any rate, given A as a point on line L and B as a point on line L, there is no implicit requirement that A and B be two different points.
In that case there is a collection of exactly one point with more than one name; it can actually have uncountably many names, which does not change the fact that we have a collection of one and only one point in this case.

Since we are explicitly dealing with a collection of more than a single point, such that each point of that collection is explicitly distinct, then given points A and B along a given line, they are distinct only if there is a distance > 0 between them.

This abstract fact is invariant, no matter what arbitrary pair of points is taken, along a given line.

Furthermore, the collection of distinct points is complete (no point is missing) and yet, being a distinct point of that complete collection means that, given an arbitrary pair of points along a given line, they are distinct only if there is a distance > 0 between them.

In other words, the collection of all distinct points along a given line, can't completely cover it.
 
In that case there is a collection of exactly one point with more than one name; it can actually have uncountably many names, which does not change the fact that we have a collection of one and only one point in this case.

Since we are explicitly dealing with a collection of more than a single point, such that each point of that collection is explicitly distinct, then given points A and B along a given line, they are distinct only if there is a distance > 0 between them.

This abstract fact is invariant, no matter what arbitrary pair of points is taken, along a given line.

Furthermore, the collection of distinct points is complete (no point is missing) and yet, being a distinct point of that complete collection means that, given an arbitrary pair of points along a given line, they are distinct only if there is a distance > 0 between them.

In other words, the collection of all distinct points along a given line, can't completely cover it.

And yet, somehow, they do, nor can you show anywhere on the line that there is no point.
 
And yet, somehow, they do, nor can you show anywhere on the line that there is no point.
zooterkin, this is the beauty in this case: I do not have to show anywhere on the line that there is no point, because any given point is already a distinct member of the collection of all points along a given line.

zooterkin, this collection is complete (no point is missing) and yet all the points of that collection are distinct, such that given any arbitrary pair of this complete collection, there exists a distance > 0 between them.

Any attempt to force (distinction) AND (distance = 0 between them) is a logical contradiction.

So, your "yet, somehow, they do" is a contradiction and no one has to show anywhere on the line that there is no point, because the collection of distinct point along a given line is already complete (no point is missing).
 
Please define adjacent in this context. I bet you can't, because it's meaningless to say that two points are adjacent. They can't be. They are either distinct or the same.
It is easy: distance > 0

They are either distinct or the same.
No laca, they are distinct because they are not the same.

You can't use "They are" as a part of the definition of a single point, and this is exactly what you do in your

They are either distinct or the same.

invalid proposition.
 
No matter how close to 1 you choose a number x < 1, I'll always be able to set n large enough such that the partial sum of the series exceeds x. This is a proven fact, and it's exactly what the definition of a limit says.
EDIT:

Yet it does not mean that x=1.

The definition of the limit is based on contradiction if it is claimed that x=1.

All we need is a collection of distinct points and a line, in order to prove it.

Theorem: The collection of all distinct points of [0,1] can't completely cover [0,1].

Proof:

0 ≤ x ≤ 1

x=1 is trivially true, because x=1 is a collection of a single distinct point, which completely covers itself.

Proposition A: "Point x and point 1 are distinct"

Proposition B: "The distance between point x and point 1 is 0"

Proposition C: "A AND B is a contradiction"

Proposition D: "A AND ~B is a tautology"

Since C AND D is true, then the collection of all distinct points of [0,1] can't completely cover [0,1].

Q.E.D
 
No HatRack,

It is exactly the opposite.

The definition of the limit is based on contradiction, and all we need is a collection of distinct points and a line, in order to prove it:

Theorem: The collection of all distinct points of [0,1] can't completely cover [0,1].

Proof:

0 ≤ x ≤ 1

x=1 is trivially true, because x=1 is a collection of a single distinct point, which completely covers itself.

Proposition A: "Point x and point 1 are distinct"

Proposition B: "The distance between point x and point 1 is 0"

Proposition C: "A AND B is a contradiction"

Proposition D: "A AND ~B is a tautology"

Since C AND D is true, then the collection of all distinct points of [0,1] can't completely cover [0,1].

Q.E.D

If Proposition A is true, proposition B is not true.
 
And this is exactly the reason why A AND B is a contradiction, and proposition C is true.

I'm not disagreeing with that, either. What is ridiculous is your 'conclusion'.

If A is true, then B, by definition, cannot be true, because if the arbitrary point you chose is not 1, then it is distinct, and its distance from 1 must be >0. The fact that B is not true doesn't prove a thing.

ETA: Your argument would work if points took up space; then it would make sense to talk of distances between points. But points take up no space; if the distance between two points is zero, they are the same point and not distinct.
 
EDIT:

The fact that B is not true doesn't prove a thing.

Exactly, that's why A AND B, A AND ~B and C AND D are also used in the proof, and not just B.

ETA: Your argument would work if points took up space; then it would make sense to talk of distances between points.
No, all we need is distinction between more than one 0-dimensional space.

But points take up no space;
It does not matter; all we care about is the distinction between more than one 0-dimensional space.

if the distance between two points is zero, they are the same point and not distinct.
Be careful: it is a mistake to speak about a single distinct point by using words like "they are the same point", and this is exactly why A AND B is a contradiction.

My conclusion is true. You simply do not grasp it yet, because you think that I claim that 0-dimensional space is greater than 0-dimensional space.

I do not claim such nonsense.
 
And this is wrong. 1 - 1/2^n < 1 is true for any natural number n, not as n approaches infinity. As n approaches infinity, we end up with 1 = 1. This holds true for the definition of the limit as the mathematical world has used it since the time of Weierstrass. If you mean something else by n approaching infinity then you must specify so precisely.
Do you mean that natural numbers 1, 2, 3, 4, ... have an upper bound, and the sequence ceases to be convergent and therefore cannot approach infinity? What kind of miraculous treatment does natural 'n' have to undergo to be allowed to approach infinity? You are placing nonsensical restrictions on 'n': Let 'n' be any natural number that doesn't approach infinity . . . :confused:

That's amusing: the "limitless" and obviously incorrect assumptions regarding S and S/2 based on 2^-1 + 2^-2 + 2^-3 + ... rely on negative integers as exponents that are allowed to approach negative infinity, but 1 - 1/2^n, which is the algebraic equivalent of the series in question, is not allowed to inherit the class of 'n' all the way through, according to your treatment of mathematics, and must again be specified.

I wonder if you and Doron would arrive at the same limit in

1 + 1/2 + 1/4 + 1/27 + 1/256 + 1/3125 + 1/46656 + ...

Try it out according to Weierstrass.
 
jsfisher said:
At any rate, given A as a point on line L and B as a point on line L, there is no implicit requirement that A and B be two different points.

In that case there is a collection of exactly one point with more than one name; it can actually have uncountably many names, which does not change the fact that we have a collection of one and only one point in this case.

Yeah, so? If A is a member of a singleton set and B is a member of the same singleton set, then A = B. Again you belabor the obvious and trivial.

Since we are explicitly dealing with a collection of more than a single point, such that each point of that collection is explicitly distinct

It wasn't always explicit, but it isn't really relevant, either.

...then given points A and B along a given line, they are distinct only if there is a distance > 0 between them.

Well, given that a point's position is its only relevant property in this discussion, you again cite the trivial.

This abstract fact is invariant, no matter what arbitrary pair of points is taken, along a given line.

Yes. And where are you going with this trivial observation? Given points A and B, they are distinct if and only if they have a non-zero distance between them.

Furthermore, the collection of distinct points is complete (no point is missing)

Yep.

...and yet, being a distinct point of that complete collection means that, given an arbitrary pair of points along a given line, they are distinct only if there is a distance > 0 between them.

Why are you repeating this triviality, yet again?

In other words, the collection of all distinct points along a given line, can't completely cover it.

Oh, and you were doing so well up to this point. It was a bit like an echo chamber, but at least you were making sane statements. Now, this: a baseless leap to a false conclusion.

The uncountably infinite still escapes you, doesn't it? That's one of the many reasons most students are encouraged to continue school beyond kindergarten.
 
EDIT:

Yet it does not mean that x=1.

The definition of the limit is based on contradiction if it is claimed that x=1.

All we need is a collection of distinct points and a line, in order to prove it.

Theorem: The collection of all distinct points of [0,1] can't completely cover [0,1].

Proof:

0 ≤ x ≤ 1

x=1 is trivially true, because x=1 is a collection of a single distinct point, which completely covers itself.

Proposition A: "Point x and point 1 are distinct"

Proposition B: "The distance between point x and point 1 is 0"

Proposition C: "A AND B is a contradiction"

Proposition D: "A AND ~B is a tautology"

Since C AND D is true, then the collection of all distinct points of [0,1] can't completely cover [0,1].

Q.E.D

Your conclusion doesn't follow from propositions C and D both being true. All you can establish from that is that if a point X is different from point 1, then point X is different from point 1.

Few will be impressed by this.

You continue to conflate adjacency with completeness. The continuum doesn't work that way.
 
It is easy: distance > 0

Adjacent means distance > 0? Surely you are kidding. Using that definition, any two distinct points are adjacent.

No laca, they are distinct because they are not the same.

You can't use "They are" as a part of the definition of a single point, and this is exactly what you do in your

They are either distinct or the same.

invalid proposition.

I'm clearly talking about two points. Try to read for comprehension next time. If you're capable of it, of course.
 
Do you mean that natural numbers 1, 2, 3, 4, ... have an upper bound, and the sequence ceases to be convergent and therefore cannot approach infinity? What kind of miraculous treatment does natural 'n' have to undergo to be allowed to approach infinity? You are placing nonsensical restrictions on 'n': Let 'n' be any natural number that doesn't approach infinity . . . :confused:

That's amusing: the "limitless" and obviously incorrect assumptions regarding S and S/2 based on 2^-1 + 2^-2 + 2^-3 + ... rely on negative integers as exponents that are allowed to approach negative infinity, but 1 - 1/2^n, which is the algebraic equivalent of the series in question, is not allowed to inherit the class of 'n' all the way through, according to your treatment of mathematics, and must again be specified.

I wonder if you and Doron would arrive at the same limit in

1 + 1/2 + 1/4 + 1/27 + 1/256 + 1/3125 + 1/46656 + ...

Try it out according to Weierstrass.

Your use of infinity-related terminology is not precise. When dealing with a sequence like we are in the case of {1/2, 3/4, 7/8, ..., 1 - 1/2^n, ...}, a mathematician uses the phrase "let n approach infinity" to refer to the limit of this sequence, which in this case is 1. It has no precise meaning otherwise.
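For reference, the Weierstrass-style definition being invoked here is the usual one:

[latex]\lim_{n \to \infty} a_n = L \iff \forall \varepsilon > 0\ \ \exists N \in \mathbb{N}\ \ \forall n \geq N: \ |a_n - L| < \varepsilon[/latex]

With a_n = 1 - 1/2^n and L = 1, any N > log2(1/ε) works, which is why the limit of that sequence is exactly 1.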

Your issue seems to be that you don't think the sum of an infinite series should be defined as the limit of the sequence of its partial sums. I demonstrated earlier that it is nonsensical to define it to be anything but 1, for given an x < 1, you can always find an n such that 1 - 1/2^n > x. If you don't believe this to be true, then your issue lies with the Least Upper Bound Property of the real number system, which is itself derivable from the mere existence of the rational numbers as first shown by Dedekind and Cantor.
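If it helps, here is a quick Python sketch of that "you can always find an n" claim (the helper name n_exceeding is just my illustrative choice; nothing in the argument depends on the code):

[code]
import math

def n_exceeding(x):
    """Smallest n with 1 - 1/2**n > x, for a given x < 1 (illustration only)."""
    n = math.floor(math.log2(1 / (1 - x))) + 1
    assert 1 - 0.5 ** n > x  # the nth partial sum of 1/2 + 1/4 + ... exceeds x
    return n

for x in (0.9, 0.999, 0.999999):
    print(x, n_exceeding(x))  # n = 4, 10 and 20, respectively
[/code]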

Natural numbers do not have an upper bound. It is true that I can find arbitrarily large natural numbers, but that doesn't make them infinite. No matter what natural number you, I, or anyone else in the world produces, it will be small compared to what we know as infinity. The entire concept of infinity when dealing with numerical sequences thus has no precise meaning without the concept of a limit.

If you take limits out of the picture when trying to deal with the sum of an infinite series, then you need to precisely specify what you mean by an infinite series to produce any meaningful results. For the last time, that is why 1/2 + 1/4 + 1/8 + ... < 1 is nonsense. It is at odds with the traditional precise meaning of the sum of an infinite series, yet no one here has specified what it means otherwise.
 
If you take limits out of the picture when trying to deal with the sum of an infinite series, then you need to precisely specify what you mean by an infinite series to produce any meaningful results. For the last time, that is why 1/2 + 1/4 + 1/8 + ... < 1 is nonsense. It is at odds with the traditional precise meaning of the sum of an infinite series, yet no one here has specified what it means otherwise.

I've already shown you: when you convert the series into its functional equivalent f(x) = (2^x - 1)/2^x where x → ∞, you see immediately that the 1 in the numerator has to be the limit. If the inequality that you call nonsense holds, then indeed 1 is the limit.
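Writing that substitution out:

[latex]\lim_{x \to \infty}\frac{2^{x} - 1}{2^{x}} = \lim_{x \to \infty}\left(1 - \frac{1}{2^{x}}\right) = 1[/latex]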

Do this

1 + 1/2 + 1/4 + 1/27 + 1/256 + 1/3125 + 1/46656 + ...

step by step, so I would understand exactly what you mean when you talk about limits and infinite series.
 
When N opens its door, these guys get out: 1, 2, 3, 4, ...
They are distinct and there is no gap between them.
Well, Doron, that's the proof, and there is nothing you can do about it, as indeed you haven't.
The velocity of a projectile measured at time t_1, t_2, t_3, ...
 