Deeper than primes

Status
Not open for further replies.

No epix, you have the joy of being against, no matter what, why, and how.

Here is the relevant part, taken from http://www.math.toronto.edu/mathnet/questionCorner/geomsum.html

It is this limit which we call the "value" of the infinite sum.

How do we find this value?

If we assume it exists and just want to find what it is, let's call it S. Now

S = 1 + 1/2 + 1/4 + 1/8 + · · ·
so, if we multiply it by 1/2, we get
(1/2) S = 1/2 + 1/4 + 1/8 + 1/16 + · · ·
Now, if we subtract the second equation from the first, the 1/2, 1/4, 1/8, etc. all cancel, and we get S - (1/2)S = 1 which means S/2 = 1 and so S = 2.

...

In your example, the finite sums were

1 = 2 - 1/1
3/2 = 2 - 1/2
7/4 = 2 - 1/4
15/8 = 2 - 1/8

and so on; the nth finite sum is 2 - 1/2^n. This converges to 2 as n goes to infinity, so 2 is the value of the infinite sum.
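The closed form for those finite sums is easy to check numerically. Here is a minimal Python sketch (the function name partial_sum is mine, not from the quoted page):

```python
from fractions import Fraction

def partial_sum(n):
    """S_n = 1 + 1/2 + 1/4 + ... + 1/2**n, computed exactly."""
    return sum(Fraction(1, 2**k) for k in range(n + 1))

# The closed form 2 - 1/2**n from the quoted answer holds for every n checked:
for n in range(10):
    assert partial_sum(n) == 2 - Fraction(1, 2**n)

print(partial_sum(3))          # 15/8, matching the table above
print(float(partial_sum(50)))  # very close to 2, the value of the infinite sum
```

Using Fraction keeps the arithmetic exact, so the check against 2 - 1/2**n is not blurred by floating-point rounding.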

1) Traditional Math explicitly uses '=', such that S - (1/2)S = 1, which means S/2 = 1 and so S = 2.

2) The term value is explicitly used at the end of the answer.
 
Once again you simply show your inability to understand that a space is not independent of its sub-spaces.

Once again your inability to distinguish between

(
1(),0()
)

and

(
1(0())
)

shows your trivial reasoning.
 
1) Traditional Math explicitly uses '=', such that S - (1/2)S = 1, which means S/2 = 1 and so S = 2.

2) The term value is explicitly used at the end of the answer.
Yes. Unlike OM, math uses the symbol '=' often, because it is essential to mathematics. But the symbol is always accompanied by various terms that '=' holds identical, and that escaped your attention. In the particular case above, the writer replied to a 12th grader under the assumption that anyone who has made it to the 12th grade would understand on first reading what I highlighted in my next-to-last post, that is
It is this limit which we call the "value" of the infinite sum.
That sum S is the limit, not just the sum of the infinite sequence. In other words, if Limit = S and S = 2, then Limit = 2. And, according to the definition translated for the slower students, the limit is a number that another number approaches but can never reach, the same way infinity can't be reached.

Here is an explanation of the limit for those who have never heard the word. The explanation stresses from the beginning that the limit can't be reached, and it uses full expressive means that don't omit any term, as often happens in mathematician-to-mathematician correspondence, which proved to be the source of a great deal of confusion for you.
http://www.themathpage.com/acalc/limits.htm

Of course, the explanation assumes a normal fetal development on the part of the reader . . .
 

I was thinking the same thing. According to information entropy, the expansion of primes would have the highest entropy, not the lowest, I think. For example, if zip compression were used on the sets describing the expansion of numbers (see below), the sets for the primes would probably be the least compressible (the more repetitive order, the more compression).


2
---
{1,1}


3
---
{1,1,1}
{2,1}


4
---
{1,1,1,1}
{2,1,1}
{2,2}
{3,1}


5
---
{1,1,1,1,1}
{2,1,1,1}
{2,2,1}
{3,1,1}
{3,2}
{4,1}


6
---
{1,1,1,1,1,1}
{2,1,1,1,1}
{2,2,1,1}
{2,2,2}
{3,1,1,1}
{3,2,1}
{3,3}
{4,1,1}
{4,2}
{5,1}


7
---
{1,1,1,1,1,1,1}
{2,1,1,1,1,1}
{2,2,1,1,1}
{2,2,2,1}
{3,1,1,1,1}
{3,2,1,1}
{3,2,2}
{3,3,1}
{4,1,1,1}
{4,2,1}
{4,3}
{5,1,1}
{5,2}
{6,1}

...
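The zip-compression idea above can be sketched in Python. The names partitions, expansion, and compressed_size are mine; "expansion" here means the partitions of n into at least two parts, matching the listings:

```python
import zlib

def partitions(n, max_part=None):
    """Yield all partitions of n, parts in non-increasing order."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

def expansion(n):
    """The partitions of n into at least two parts, as in the listing above."""
    return [p for p in partitions(n) if len(p) >= 2]

def compressed_size(n):
    """zlib-compressed size (bytes) of the textual listing of n's expansion."""
    text = "\n".join(",".join(map(str, p)) for p in expansion(n))
    return len(zlib.compress(text.encode()))

# Compare primes (2, 3, 5, 7) with composites (4, 6):
for n in range(2, 8):
    print(n, len(expansion(n)), compressed_size(n))
```

This only measures zlib's deflate compression on one textual encoding, so it is a rough proxy for compressibility, not a measurement of entropy itself.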
 
It is this limit which we call the "value" of the infinite sum.

Traditional Math is very clear about this subject, as written at the end of the answer (by using the term value and not "value").


By the way, the writer of http://www.themathpage.com/acalc/limits.htm ( http://www.themathpage.com/ARITH/bio.htm ) is not a good example of a traditional professional mathematician.

This is how professional mathematicians understand this subject:

http://www.youtube.com/watch?v=o7GLWMXq7jo ( http://www.merlot.org/merlot/viewMember.htm;jsessionid=7585E9CEEA22E24FC6F2544E69990553?id=26748 )
 
Hey doron, I understand that all the close-minded mathematicians don't want to read your gibberish. But surely you have found some application of your "theory", haven't you? Something like that would certainly attract attention. Or is posting on this forum all you do? Doesn't that seem silly to you?
 
I was thinking the same thing. According to information entropy, the expansion of primes would have the highest entropy, not the lowest, I think. For example, if zip compression were used on the sets describing the expansion of numbers, the sets for the primes would probably be the least compressible (the more repetitive order, the more compression).
It depends on the interpretation of collections of distinct and non-distinct elements.

In my model, more distinction is less entropy, and less distinction is more entropy.

Prime numbers are characterized by more distinction among the elements of a given partition, and therefore they are characterized by less entropy.
You are right, the sets for the primes would be the least compressible and therefore have the least entropy.
 
Traditional Math is very clear about this subject, as written at the end of the answer (by using the term value and not "value").
You need to understand something about how the quotation marks are used in this case. The writer didn't quote anyone by putting the word value into quotation marks; the writer used them to indicate the 'as if' sense that quotation marks are sometimes used for. After that, the quotation marks are no longer required if the word is repeated. Remember that the writer is replying to a 12th grader who writes essays and knows English grammar well. You picked an example of mathematician-to-nonmathematician correspondence as an example of what traditional math claims, but that may not work well for you, given the informality of a text that isn't written in your native language. It's a Catch-22: if you pick mathematician-to-mathematician text instead to demonstrate alleged fallacies, you are likely to misinterpret it, because you are not a "traditional mathematician."


By the way, the writer of http://www.themathpage.com/acalc/limits.htm ( http://www.themathpage.com/ARITH/bio.htm ) is not a good example of a traditional professional mathematician.
Really? What seems to be the problem?
 
You need to understand something about how the quotation marks are used in this case.

No epix, you need to understand something about how the quotation marks are not used in the final part of this case, exactly as shown in http://www.internationalskeptics.com/forums/showpost.php?p=6532043&postcount=12241 .


This time please look at this movie which shows how professional mathematicians understand this subject:

http://www.youtube.com/watch?v=o7GLWMXq7jo ( http://webalt.math.helsinki.fi/content/about/people/seppala/index_eng.html )

Really? What seems to be the problem?
He does not have a Ph.D. in Mathematics.
 
It depends on the interpretation of collections of distinct and non-distinct elements.

In my model, more distinction is less entropy, and less distinction is more entropy.

Prime numbers are characterized by more distinction among the elements of a given partition, and therefore they are characterized by less entropy.

You are right, the sets for the primes would be the least compressible and therefore have the least entropy.

Maybe I see what you mean: the less distinction there is, the more 'noise' there is, which is a form of disorder, which in turn is related to entropy.

It depends on how entropy is defined, I guess. Using Shannon entropy, the less compressible some information is, the more Shannon entropy it has.
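That compressibility point is easy to illustrate with Python's zlib. This is just a sketch; the fixed seed only makes the "random" bytes reproducible:

```python
import random
import zlib

# Maximally ordered data: one repeated symbol.
ordered = b"0" * 10_000

# Reproducible noise: 10,000 pseudo-random byte values.
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(10_000))

# zlib squeezes the ordered data to a few dozen bytes,
# while the noisy data barely shrinks at all.
print(len(zlib.compress(ordered)))
print(len(zlib.compress(noisy)))
```

The ordered run compresses to a tiny fraction of its size; the noise stays close to 10,000 bytes, i.e. it carries more Shannon entropy per byte.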
 
It depends on how entropy is defined I guess. Using Shannon entropy, the less compressible some information is, the more Shannon entropy it has.
But if information is less compressible, it means there is more distinct information and therefore less entropy.
 
Maybe this has already been mentioned in this thread. It's a long thread, and I haven't read much of it. But I think it's important to look at complexity. What is complexity, first of all? Complexity seems tricky to define. Generally, complexity is order, but not simple order.

Primes are in a way perhaps more complex than other numbers (natural-number primes, that is, not complex numbers in the sense of having real and imaginary parts).
 
But if information is less compressible it means that there is more distinct information and therefore less entropy.

Shannon entropy is probably not similar to entropy in physics. For example, a CD filled with random 1s and 0s has a very high Shannon entropy. A CD filled with music also has a high Shannon entropy. So Shannon entropy cannot distinguish between information that is noise/random/disorder and information that is very ordered and has a lot of structure in it.

An example of very low Shannon entropy would be a CD filled with only zeros.
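For concreteness, byte-level Shannon entropy can be computed like this (a sketch; shannon_entropy is my name for it, and a real music-vs-noise comparison would of course need actual CD data):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    counts = Counter(data)
    n = len(data)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

all_zeros = bytes(1000)           # the "CD filled with only zeros"
uniform = bytes(range(256)) * 4   # every byte value equally frequent

print(shannon_entropy(all_zeros))  # 0.0
print(shannon_entropy(uniform))    # 8.0, the maximum for bytes
```

Note this measures only the distribution of single byte values; as the post says, it cannot tell structured data from noise when both have flat byte histograms.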
 
No epix, you need to understand something about how the quotation marks are not used at the final part of this case, exactly as shown in http://www.internationalskeptics.com/forums/showpost.php?p=6532043&postcount=12241 .
I told you so . . .
It is this limit which we call the "value" of the infinite sum.

How do we find this value?
The writer will not repeat the quotation marks once again. The 12th grader understands that very well.



This time please look at this movie which shows how professional mathematicians understand this subject:

http://www.youtube.com/watch?v=o7GLWMXq7jo ( http://webalt.math.helsinki.fi/content/about/people/seppala/index_eng.html )
"Professional mathematicians" don't use this infamous informal proof. He has the audacity to verbalize(!!!) the "proof" by saying: "As 'n' grows, the blue domain also grows and eventually covers the whole square." Blah-blah. Show me the mathematical proof. He can't go beyond YouTube with this crap.

He does not have a Ph.D. in Mathematics.
Stop talking nonsense all the time. He does.
http://www.themathpage.com/ARITH/bio.htm

That mental case on YouTube doesn't.
 
Shannon entropy is probably not similar to entropy in physics. For example, a CD filled with random 1s and 0s has a very high Shannon entropy. A CD filled with music also has a high Shannon entropy. So Shannon entropy cannot distinguish between information that is noise/random/disorder and information that is very ordered and has a lot of structure in it.

An example of very low Shannon entropy would be a CD filled with only zeros.
Complexity is indeed a main concept of this thread, which is measured by an Uncertainty × Redundancy matrix; for example, look at http://www.scribd.com/doc/17039028/OMDP .
 
Stop talking nonsense all the time. He does.
http://www.themathpage.com/ARITH/bio.htm
Wrong; a master's degree is less than a Ph.D.

That mental case on YouTube doesn't.

Dr. Mika Seppälä ( That mental case on YouTube, as you call him ) is a professor of computer-aided mathematics at the University of Helsinki and Professor of Mathematics at Florida State University. He was the Finnish representative in the Database Committee of the European Mathematical Council and later in the various Euromath projects (1984 – 1994). He is the secretary of the OpenMath Society and he was the coordinator of the first OpenMath project (1994 – 1996) to get community support. Seppälä has extensive experience in creating educational on-line materials for calculus.
 
Complexity is indeed a main concept of this thread, which is measured by an Uncertainty × Redundancy matrix; for example, look at http://www.scribd.com/doc/17039028/OMDP .

Whoa! Forgive me for saying it, but the explanation on that page looks kind of wacky or maybe it's something clever, but it's way over my head.

In simple terms, how could uncertainty increase complexity?
 
Whoa! Forgive me for saying it, but the explanation on that page looks kind of wacky or maybe it's something clever, but it's way over my head.

In simple terms, how could uncertainty increase complexity?


If you think about Complexity in terms of Information, I think that the useful energy that is involved depends on our ability to deal with uncertain and/or redundant conditions.

Such conditions are usually considered as "white noise" that has to be reduced in order to get some desired results by using minimum energy.

I think that the tendency of "getting maximum results by using minimum energy" can quickly lead us into triviality and atrophy if redundancy and uncertainty are not taken as natural and challenging conditions, which stimulate us to research beyond our current cozy spotlight.

I have found that the mainstream of mathematical development over the past 3,000 years is based on a step-by-step serial view of the researched (abstract or not) things, and as a result it uses techniques that are tuned to deal with strict-only information.

Take for example the intuitive concept of Set, which is based on a collection of distinct things such that, for example, {a,a,b}={a,b}.

It means that a property like redundancy, as found in {a,a,b}, is not taken as an essential case of information but as a case that has to be reduced into {a,b} in order to get the optimal conditions for analytic research, which is basically a step-by-step serial way of thinking.

By using only analytic methods, our ability to also use parallel thinking is neither stimulated nor developed.

Uncertainty and Redundancy are actually natural results of parallel thinking, which deals with information at-once, where order or strict distinction are not among its natural characteristics.

Take for example the concept of Superposition of opposites, which is understood in terms of addition of values that are summed to 1, for example: 0.5True + 0.5False .

This approach weakens the involved opposites in order to avoid logical contradiction, where this logical contradiction is a direct result of step-by-step thinking, which can't deal at-once with opposites like True and False.

But under parallel thinking, 1True + 1False is simply uncertain information of "strong" superposition, which may or may not collapse into a certain 1True or a certain 1False.

For more details about this non-standard view, please look at http://www.scribd.com/doc/18453171/International-Journal-of-Pure-and-Applied-Mathematics-Volume-49 .
 
Dr. Mika Seppälä ( That mental case on YouTube, as you call him ) is a professor of computer-aided mathematics at the University of Helsinki and Professor of Mathematics at Florida State University.
That's the sorrowful state of affairs. That nut from Finland goes into stuff where things approach zero, and shows a square on YouTube that is allegedly closing the sequence. He wants the viewer to discern a distance like 0.0000000000000000000000000000000001 inches and calls it a proof.

Well, if you believe him through an appeal to authority, and he represents traditional math, then traditional math must be in trouble, as you said. That Finn claims that the sum of that sequence is 1 and that it has nothing to do with the limit, but you said that this can't be true. So what kind of mathematician is Dr. Mika Seppälä, the secretary of the OpenMath Society, when YOU -- a person who publishes in "Religion & Philosophy" -- found a mistake in his computations? That's mighty contradictory, ain't it? LOL.

In conclusion: the traditional math that Mika Seppälä speaks for, according to your and only your choice, isn't good, because Doron found that Mika Seppälä, Ph.D., got it wrong and the sum of the sequence cannot equal 1. Well, other mathematicians hold the sum as a limit and can explain it to others, while Seppälä can't.

You can repeat that traditional math is in trouble and point to Seppälä as the evidence ad infinitum, but who do you think is going to believe you? Sorry, but you are just chewing on your own recipe for a hearty appeal to authority. LOL. You just created the Mother of All Contradictions.
 
The writer will not repeat the quotation marks once again. The 12th grader understands that very well.


Aye! and there's the rub. As Doron has professed some time ago, he aspires to have the mathematical capacity of a kindergarten student. It continues to be an uphill battle.
 