
I've proved that 1 = 2.

Re: Re: Re: Re: I've proved that 1 = 2.

Crossbow said:


I have read what other people have been writing, and I do not think that business about dividing by zero is correct; because what is happening in this problem is that it is a case of dividing zero into zero which should produce a quotient of one.
0/0 is not defined.

And even if that is not the case, it still does not change the fact that the expansion offered in the original proof is incorrect, therefore the proof is still incorrect.
Agreed.
 
And even if that is not the case, it still does not change the fact that the expansion offered in the original proof is incorrect, therefore the proof is still incorrect.
You're talking about step 5, right?

4. Subtract x² from both sides
x² - x² = x² - x²

5. Factor both sides to get
x(x-x) = (x+x)(x-x)

This step is correct, actually, as is your variation on that same step.
In general, (a+b)(a-b) = a<sup>2</sup>-b<sup>2</sup> for all real numbers a and b, so the original step is correct.

If we actually multiply (x+x)(x-x) out, we get:

x<sup>2</sup> - x<sup>2</sup> + x<sup>2</sup> - x<sup>2</sup>

From here we can either combine like terms to get your alternate step:

2x<sup>2</sup> - 2x<sup>2</sup>

or, we can cancel the first two terms to get the original step:

x<sup>2</sup> - x<sup>2</sup>
 
Neither expansion is incorrect.

x<sup>2</sup>-x<sup>2</sup>=0

So any expansion that yields 0 is correct:

x<sup>2</sup>-x<sup>2</sup>=(x+x)(x-x)=(x+x)(x-x)=x<sup>4</sup>-x<sup>4</sup>=(x-x)(x+x+taffy)

Walt
 
Walter Wayne said:
Neither expansion is incorrect.

x<sup>2</sup>-x<sup>2</sup>=0

So any expansion that yields 0 is correct:

x<sup>2</sup>-x<sup>2</sup>=(x+x)(x-x)=(x+x)(x-x)=x<sup>4</sup>-x<sup>4</sup>=(x-x)(x+x+taffy)

Walt

No. Wrong. Period. Full stop.

Any expansion that equals zero also equals zero. Nothing more.
 
TeaBag420 said:


No. Wrong. Period. Full stop.

Any expansion that equals zero also equals zero. Nothing more.
Yes, any expansion that equals zero equals zero, and it follows that it then equals any other expansion that equals zero.

x<sup>2</sup>-x<sup>2</sup>=0
(x-x)(x+x)=0(2x)=0
x<sup>4</sup>-x<sup>4</sup>=0
(x-x)(x+x+taffy)=0(2x+taffy)=0

Yup, all look equal to each other.
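For what it's worth, a quick numeric check (just a sketch; the value of x and "taffy" are arbitrary) confirms that every expression in the chain above really does come out to zero:

```python
# Each of the four "expansions" above contains a factor that is
# identically zero, so all of them evaluate to 0 for any x.
x = 3.7
taffy = 42.0  # "taffy" is just another arbitrary number

expressions = [
    x**2 - x**2,
    (x - x) * (x + x),
    x**4 - x**4,
    (x - x) * (x + x + taffy),
]

print(expressions)  # every entry is 0
assert all(e == 0 for e in expressions)
```

Of course, all this shows is that 0 = 0, which is the whole point of the objection that follows.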
 
Since people love thinking about what "...." means, here is another proof that 1=2 making use of it:

Consider the series S=sum_{i=0}^infinity (-1)^i

S=1-1+1-1+1-1+...
=(1-1)+(1-1)+(1-1)+...
=0+0+0...
=0

So S=0.

But:

S=1-1+1-1+1-1-...
=1-(1-1)-(1-1)-(1-1)-...
=1-0-0-0...
=1

Therefore S+1=0+1=1+1 and so 1=2
 
Is either of those really a correct evaluation of the sum? You moved parentheses around, but shouldn't you be using calculus? You seem to be saying that the series converges to two different sums. My gut tells me that if the series converges at all, it sums to 1, but I haven't done the math.


Tez said:
Since people love thinking about what "...." means, here is another proof that 1=2 making use of it:

Consider the series S=sum_{i=0}^infinity (-1)^i

S=1-1+1-1+1-1+...
=(1-1)+(1-1)+(1-1)+...
=0+0+0...
=0

So S=0.

But:

S=1-1+1-1+1-1-...
=1-(1-1)-(1-1)-(1-1)-...
=1-0-0-0...
=1

Therefore S+1=0+1=1+1 and so 1=2
 
x^2 = x + x + ... + x (x times)
d(x^2)/dx = d(x + x + ... + x)/dx
2x = 1 + 1 + ... + 1 (x times)
2x = x
2 = 1
Could someone translate this to "strings", preferably "unlimited", so I can play too?
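The catch in this one, by the way, is that the number of summands in "x + x + ... + x (x times)" is itself x, so it isn't a fixed sum you can differentiate term by term. A small numeric sketch shows the true derivative of x² is 2x, not x:

```python
# The step d(x + x + ... + x)/dx = 1 + 1 + ... + 1 treats the NUMBER
# of terms as a constant, but that count is x and varies too.
# A central-difference derivative of f(x) = x^2 recovers the true slope.
def numerical_derivative(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda t: t * t
x = 5.0
slope = numerical_derivative(f, x)
print(slope)  # close to 2*x = 10, not x = 5
assert abs(slope - 2 * x) < 1e-3
```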
 
At least most of you guys are now seeing that the real flaw in the proof is the expansion done in Step 5 as opposed to the divide by zero issue.

I still think that zero into zero equals one, but I may be wrong; so I will see what I can find out and get back with a good answer.
 
Crossbow said:
I still think that zero into zero equals one, but I may be wrong; so I will see what I can find out and get back with a good answer.
This is most definitely incorrect. It can equal one, but it can also equal 0.1 or a million and six. 0/0 is indeterminate.

From the Dr. Math Archive:
>2/ What is the value of 0/0 ? (is it really undefined or are there an
> infinite number of values)
>
There's a special word for stuff like this, where you could conceivably
give it any number of values. That word is "indeterminate." It's not the
same as undefined. It essentially means that if it pops up somewhere,
you don't know what its value will be in your case. For instance, if
you have the limit as x->0 of x/x and of 7x/x, the expression will have
a value of 1 in the first case and 7 in the second case. Indeterminate.
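The x/x versus 7x/x example in the quote is easy to check numerically (a sketch, using a few shrinking values of x):

```python
# x/x and 7*x/x both take the form 0/0 as x -> 0, yet they approach
# different limits (1 and 7). The form 0/0 alone does not determine
# the value -- which is exactly what "indeterminate" means.
for x in [0.1, 0.01, 0.001, 1e-9]:
    print(x, x / x, 7 * x / x)
    assert x / x == 1.0
    assert abs(7 * x / x - 7.0) < 1e-12
```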
 
Crossbow said:
At least most of you guys are now seeing that the real flaw in the proof is the expansion done in Step 5 as opposed to the divide by zero issue.

I still think that zero into zero equals one, but I may be wrong; so I will see what I can find out and get back with a good answer.
See the following Wikipedia articles:

Division by zero

Invalid proofs

If nothing else, try dividing 0 by 0 on your calculator. The answer won't be 1.
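In the same spirit as the calculator test, here is what a computer actually does with indeterminate forms (a small sketch in Python):

```python
import math

# Python refuses 0/0 outright:
try:
    0 / 0
except ZeroDivisionError as e:
    print("0 / 0 raised:", e)

# IEEE-754 floating point (what calculators and CPUs implement)
# yields NaN ("not a number") for indeterminate forms such as inf - inf:
result = float("inf") - float("inf")
print(result)  # nan
assert math.isnan(result)
```

Either way, the machine never answers "1".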
 
The Dr. Math Archive is wrong. Division by zero is undefined. It is in fact DEFINED as undefined. Why not indeterminate? Because 0/0 = 0 * 1/0 and 1/0 is undefined. Proof? Divide each side by one. Actually I see at least one flaw in that proof. Perhaps multiplying each side by zero is a better way to go.

Please present a case where 0/0 = one million and six. Thank you.


Cecil said:
This is most definitely incorrect. It can equal one, but it can also equal 0.1 or a million and six. 0/0 is indeterminate.

From the Dr. Math Archive:
 
The Dr. Math Archive is wrong. Division by zero is undefined. It is in fact DEFINED as undefined. Why not indeterminate? Because 0/0 = 0 * 1/0 and 1/0 is undefined. Proof? Divide each side by one. Actually I see at least one flaw in that proof. Perhaps multiplying each side by zero is a better way to go.
No. 1/0 is defined as undefined, but 0/0 is defined as indeterminate.
TeaBag420 said:
Please present a case where 0/0 = one million and six. Thank you.
Limit (x->0) 1000006x/x. If you don't like limits, then here's an even easier example to understand:

6*7 = 42, so 6 = 42/7.
Similarly,
1000006*0 = 0, so 1000006 = 0/0

More support for my position is located at the following sites.

http://documents.wolfram.com/v4/MainBook/3.1.8.html
The program Mathematica labels 0/0 as indeterminate.

http://www.mathmojo.com/interestinglessons/division_by_zero/division_by_zero_1.html
Especially the quote:
In a nutshell, the answers to your questions are:
#1) 0/0 = indeterminate
#2) 0/1 = 0
#3) 1/0 = undefined


Crossbow, here's a proof that 0/0 != 1, using Reductio Ad Absurdum.

Assume 0/0 = 1
(0+0) / 0 = 1
0/0 + 0/0 = 1
1 + 1 = 1 (substitution from assumption)
2 = 1.

Or maybe this is just another proof that I'm right!!! Crossbow, if you accept that 0/0 = 1 then you must join me in my new world order where equality is guaranteed for ALL integers, regardless of size. Together, we shall end this senseless discrimination!! :D
 
(S) said:
Apply Pope proof. It shows that, in a set containing Cecil and me, there is actually only one element. Therefore, I am Cecil, and I concede that 1 != 2.
Therefore, I am you and I just emptied your bank account. Now I'm rich and have no further need for this silly 2=1 debate.
 
Division by zero is undefined over the real numbers, including 0/0. In calculus, 0/0 is an indeterminate form.
 
I always thought that this was a bit backwards. You could divide by zero, but if that were "allowed" then this kind of proof would be valid. Therefore, by agreement, dividing by zero is not allowed. Just a convention. So, IF you were using a number system where division by zero were allowed, THEN 1=2=3=1,771,561. This would be mathematically correct, IN THAT NUMBER SYSTEM.

Now, since the use of such a number system gives answers that seem unrealistic in the real world, we have agreed to use a number system where division by zero is simply not allowed. This gives us a system where 1+1=2 and nothing else. This gives answers that relate to what we perceive to be reality.

Remember, mathematics is not real. It is simply a model we agree upon to represent reality, and at its higher strata it is still being developed as a tool that reflects reality. Such levels of development are beyond me, as I am not so highly educated in that field. However, I can understand the generality that mathematics is not a perfect model, nor will it ever be.

Now, can we all agree on one model? The generally accepted one is where division by zero is not allowed.
 
EternalUniverse said:
Actually, I haven't. I encountered a fun math puzzle that purportedly proves that 1 = 2. What is wrong with the proof? Assuming there is one ;)

1. Let
x = 1

2. It follows that
x = x

3. Square both sides to get
x<sup>2</sup> = x<sup>2</sup>

4. Subtract x<sup>2</sup> from both sides
x<sup>2</sup> - x<sup>2</sup> = x<sup>2</sup> - x<sup>2</sup>

5. Factor both sides to get
x(x-x) = (x+x)(x-x)

6. Divide both sides by (x-x)
x=(x+x)

7. Since x = 1
1=(1+1)

8. So
1 = 2

Call me naive, but after step 4, how is it that x can equal both one and zero?
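One way to see what happens at steps 5 and 6 is to evaluate both sides with x = 1 (just a sketch):

```python
x = 1
lhs = x * (x - x)        # left side of step 5
rhs = (x + x) * (x - x)  # right side of step 5
print(lhs, rhs)          # 0 0 -- the equation in step 5 is true
# Step 6 then divides both sides by (x - x), which is 0, and
# dividing by zero is where the "proof" goes wrong: 0 = 0 does
# not let you conclude that the other factors are equal.
assert lhs == 0 and rhs == 0 and (x - x) == 0
```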
 
TeaBag420 said:
Is either of those really a correct evaluation of the sum? You moved parentheses around, but shouldn't you be using calculus? You seem to be saying that the series converges to two different sums. My gut tells me that if the series converges at all, it sums to 1, but I haven't done the math.
The sum "S=1-1+1-1+1-1...." does not converge; it is divergent. For a series to converge -- roughly -- it must become closer and closer to its limit as more and more terms are added. You must be able to take any arbitrary distance from the limit -- 1/2 or 1/5 or anything -- and find a number of terms to add so that the sum will always be within that distance after that many terms have been added. That does not happen with this sum, so it does not converge.
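You can see the failure to converge by just listing partial sums (a quick sketch):

```python
# Partial sums of S = 1 - 1 + 1 - 1 + ... oscillate between 1 and 0
# forever, so they never settle within an arbitrarily small distance
# of any single limit -- the series diverges.
partial_sums = []
s = 0
for i in range(10):
    s += (-1) ** i
    partial_sums.append(s)
print(partial_sums)  # [1, 0, 1, 0, 1, 0, 1, 0, 1, 0]
assert partial_sums[::2] == [1] * 5 and partial_sums[1::2] == [0] * 5
```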
 
(S) said:

The sum "S=1-1+1-1+1-1...." does not converge; it is divergent. For a series to converge -- roughly -- it must become closer and closer to its limit as more and more terms are added. You must be able to take any arbitrary distance from the limit -- 1/2 or 1/5 or anything -- and find a number of terms to add so that the sum will always be within that distance after that many terms have been added. That does not happen with this sum, so it does not converge.

Thank you for the explanation. Note my use of the word "if".

Here's a simpler way to deal with the problem:

assume 1 = 2

2 - 1 = ? must be zero, right?

1- 2 = ? must be zero, right?

1 +1 = ? perhaps 2?

1 + 0 = ? 1, maybe not, maybe 2

There is no need for any number other than zero. It just goes on and on. This is a silly discussion. The original "proof" was wrong. --- I DO recognize it was presented in the spirit of "hey, look at this!"
 
