Deeper than primes

Status
Not open for further replies.
(2,1)=(1,2)
(2,0)=(0,2)
(1,0)=(0,1)

because order has no significance at this fundamental level, as can be seen by the following trees:

Code:
(AB,AB) (AB,A)  (AB,B)  (AB)    (A,A)   (B,B)   (A,B)   (A)     (B)     ()

A * *   A * *   A * .   A * .   A * *   A . .   A * .   A * .   A . .   A . .
  | |     | |     | |     | |     | |     | |     | |     | |     | |     | |
B *_*   B *_.   B *_*   B *_.   B ._.   B *_*   B ._*   B ._.   B *_.   B ._.

Also please look at http://www.scribd.com/doc/21967511/...considerations-of-Some-Mathematical-Paradigms .

OK, for example, (2,1) means uncertainty 2 and redundancy 1, which leads to the distinction states (AB,A) and (AB,B).

Code:
(AB,A)

A * *
  | |
B *_.

(AB,B)

A * .
  | |
B *_*

The uncertainty in (2,1) is 2 (the first number) and appears in two ways: i) as two distinction sets, and ii) as the superposition state AB in each distinction set.

The redundancy in (2,1) is 1 (the second number) and appears in the possible states (A,A) and (B,B).

In the diagram, * means identity; more than one identity along the x-axis means redundancy, and more than one identity along the y-axis means uncertainty.
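The ten distinction states in the tree can also be generated mechanically: they are exactly the multisets of size 0, 1, or 2 drawn from the three possible identities A, B, and the superposition AB. A minimal Python sketch (my own illustration, not part of the original post):

```python
from itertools import combinations_with_replacement

# The three identities a slot may hold: the superposition AB,
# or the definite states A and B.
symbols = ["AB", "A", "B"]

# Collect every multiset of size 2, 1, or 0 over these identities.
states = []
for size in (2, 1, 0):
    for combo in combinations_with_replacement(symbols, size):
        states.append("(" + ",".join(combo) + ")")

print(states)
print(len(states))  # 6 two-slot states + 3 one-slot states + () = 10
```

The count matches the tree above: six two-slot states, three one-slot states, and the empty frame ().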

I haven't looked at the part about Zeno's paradox yet, but if I remember correctly it's about how no motion is possible if a length can be zero (for the motion to start, the first step needs to be larger than zero).
 
Yeah, I haven't read all of the original post. I will give it another try.
Before you do, focus on the arguments listed in "Notion #1," in particular on the conclusion. In the premise, Doron leaves his extension of the definition of "entropy" to the reader, but you can extract it through the partition of integers. But then, Doron concludes:
As can be seen, Prime numbers have the least entropy, from this point of view.

His conclusion represents the non sequitur fallacy that is most frequent in Doron's logical constructs. Non sequitur means a disconnection between the premise and the conclusion. To say that prime numbers have the least entropy with respect to the premise that locally defines the word entropy, one should refer to the composition of digits in prime numbers. For example, the prime number 104729 has no entropy, because all its digits are distinct. But when Doron says that prime numbers have the least entropy, he refers to the partitions of prime numbers, not to primes as numbers. In this particular case, you get the chance to rephrase the conclusion and find out what Doron actually meant when concluding, but in the majority of cases, Doron doesn't give you that chance and continues to build his sand castle, as becomes very clear when his "Notion #2" dumps more sand.
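Under this local reading, where "entropy" tracks repeated digits, the claim about 104729 is easy to check. A quick sketch (my own; `digit_redundancy` is a hypothetical helper name, not the thread's terminology):

```python
def digit_redundancy(n):
    """Count repeated digits: total digits minus distinct digits.
    Under the post's local reading, 0 means "no entropy"."""
    digits = str(n)
    return len(digits) - len(set(digits))

print(digit_redundancy(104729))  # prime from the post; all digits distinct -> 0
print(digit_redundancy(100003))  # also prime, yet the digit 0 repeats -> 3
```

The second example shows why the digit reading and the partition reading come apart: 100003 is prime but far from digit-distinct.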
 
Before you do, focus on the arguments listed in "Notion #1," in particular on the conclusion. In the premise, Doron leaves his extension of the definition of "entropy" to the reader, but you can extract it through the partition of integers. But then, Doron concludes:


His conclusion represents the non sequitur fallacy that is most frequent in Doron's logical constructs. Non sequitur means a disconnection between the premise and the conclusion. To say that prime numbers have the least entropy with respect to the premise that locally defines the word entropy, one should refer to the composition of digits in prime numbers. For example, the prime number 104729 has no entropy, because all its digits are distinct. But when Doron says that prime numbers have the least entropy, he refers to the partitions of prime numbers, not to primes as numbers. In this particular case, you get the chance to rephrase the conclusion and find out what Doron actually meant when concluding, but in the majority of cases, Doron doesn't give you that chance and continues to build his sand castle, as becomes very clear when his "Notion #2" dumps more sand.

Yes, I meant the partitions of the primes, not the primes themselves. Although I do believe the primes themselves can be seen as being less redundant in a sense.
 
Could it be that information containing many primes is less 'noisy' than information containing fewer primes, when both strings of information have the same information entropy, and that the string with more primes has more complexity?

For example, that would mean that a 1 megabyte file of random bits would (in general) have fewer primes than a 1 megabyte MP3 file containing music.
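One way to test this speculation (my own sketch; nothing below comes from the thread) is to read a byte stream as 16-bit words and measure the fraction that are prime. For uniformly random data that fraction just reflects the density of primes below 65536 (about 10%), so any systematic surplus in compressed music would have to show up against that baseline:

```python
import random

def is_prime(n):
    """Trial division; adequate for 16-bit values."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def prime_fraction(data):
    """Fraction of the 16-bit big-endian words in `data` that are prime."""
    words = [int.from_bytes(data[i:i + 2], "big")
             for i in range(0, len(data) - 1, 2)]
    return sum(1 for w in words if is_prime(w)) / len(words)

random.seed(0)
noise = bytes(random.randrange(256) for _ in range(4096))
print(round(prime_fraction(noise), 3))  # close to 0.1 for random data
```

Running the same function over an MP3 file and over random bytes of the same length would directly test the conjecture.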
 
The concept of uncertainty and redundancy for numbers could perhaps be useful in quantum computing. As I understand it, a quantum computer has uncertain bits that are entangled with each other, so that it's difficult using ordinary math to deal with quantum computers with more than a few bits.
 
His conclusion represents the non sequitur fallacy that is most frequent in Doron's logical constructs.

1) epix, it is a good thing to look in the mirror from time to time, in order to realize how you are using reasoning. From your last posts about the Limit concept and the sum of convergent series like 1/2+1/4+1/8+..., it seems that you do not understand Traditional Math's claims about this subject.

2) We have here a professional mathematician who calls himself jsfisher. jsfisher has a Ph.D. in mathematics.

3) I have asked jsfisher in http://www.internationalskeptics.com/forums/showpost.php?p=6536242&postcount=12263 to explain to you the claims and proofs of Traditional Math about convergent series like 1/2+1/4+1/8+... = 1 (where 1 is the limit of that series).

4) I suggest that you ask jsfisher what the result of 1/2+1/4+1/8+... is according to Traditional Math, and please ask him to provide the rigorous proof that stands at the basis of his answer.

5) There is also a professional physicist here, called The Man, so please also ask him about the result of 1/2+1/4+1/8+... according to Traditional Math, and please ask him to provide the rigorous proof that stands at the basis of his answer.
 
Could it be that information containing many primes is less 'noisy' than information containing fewer primes, when both strings of information have the same information entropy, and that the string with more primes has more complexity?

For example, that would mean that a 1 megabyte file of random bits would (in general) have fewer primes than a 1 megabyte MP3 file containing music.
Your conclusion suggests the premise that the MP3 compression algorithm somehow avoids indexing data expressed as prime numbers. But a query regarding this possibility returns zero results -- not even a hint. So it's hard to tell from Internet sources, the same way Doron's terminology is not a part of the huge Internet database.
 
Could it be that information containing many primes is less 'noisy' than information containing fewer primes, when both strings of information have the same information entropy, and that the string with more primes has more complexity?

For example, that would mean that a 1 megabyte file of random bits would (in general) have fewer primes than a 1 megabyte MP3 file containing music.
The concept of uncertainty and redundancy for numbers could perhaps be useful in quantum computing. As I understand it, a quantum computer has uncertain bits that are entangled with each other, so that it's difficult using ordinary math to deal with quantum computers with more than a few bits.

Let us use the example of 2-Uncertainty x 2-Redundancy matrix.

Organic Mathematics extends this question and asks:

Can given information forms be reduced to the particular case of (A,B) under Frame (1,1), without losing the information's diversity?

Please be aware of the fact that (A,B) is just a particular case of Frame (1,1) under 2-Redundancy x 2-Uncertainty Distinction Tree, as follows:

Code:
(AB,AB) (AB,A)  (AB,B)  (AB)    (A,A)   (B,B)   (A,B)   (A)     (B)     ()

A * *   A * *   A * .   A * .   A * *   A . .   A * .   A * .   A . .   A . .
  | |     | |     | |     | |     | |     | |     | |     | |     | |     | |
B *_*   B *_.   B *_*   B *_.   B ._.   B *_*   B ._*   B ._.   B *_.   B ._.

(2,2) = (AB,AB)
(2,1) = (AB,A),(AB,B)
(2,0) = (AB)
(1,1) = (A,A),(B,B),(A,B)
(1,0) = (A),(B)
(0,0) = ()
 
Your conclusion suggests the premise that the MP3 compression algorithm somehow avoids indexing data expressed as prime numbers. But a query regarding this possibility returns zero results -- not even a hint. So it's hard to tell from Internet sources, the same way Doron's terminology is not a part of the huge Internet database.

The idea was that the result (say 1 megabyte) of the MP3 compression will generally have more primes if the original file has meaningful structure in it (such as recorded music) compared to a 1 megabyte file of random bits.

It's also possible to use MP3 compression on random noise. Then the premise is that this MP3 file will have fewer primes in it than an MP3 file of the same size made from an original file with meaningful information in it.

So the MP3 algorithm itself does not make a distinction between primes and non-primes.

The term 'meaningful' is not very well defined, but anyway we can usually tell the difference between noise/disorder/randomness and meaningful data.
 
Your conclusion suggests the premise that the MP3 compression algorithm somehow avoids indexing data expressed as prime numbers. But a query regarding this possibility returns zero results -- not even a hint. So it's hard to tell from Internet sources, the same way Doron's terminology is not a part of the huge Internet database.
Novel notions are naturally not a part of the huge Internet database on a given subject.

So we have to check whether these novel notions provide fruitful conditions for further research and development, or not.

To unconditionally ignore novel notions just because they do not fit the currently accepted knowledge base is not a wise thing.
 
Let us use the example of 2-Uncertainty x 2-Redundancy matrix.

Organic Mathematics extends this question and asks:

Can given information forms be reduced to the particular case of (A,B) under Frame (1,1), without losing the information's diversity?

Please be aware of the fact that (A,B) is just a particular case of Frame (1,1) under 2-Redundancy x 2-Uncertainty Distinction Tree, as follows:

Code:
(AB,AB) (AB,A)  (AB,B)  (AB)    (A,A)   (B,B)   (A,B)   (A)     (B)     ()

A * *   A * *   A * .   A * .   A * *   A . .   A * .   A * .   A . .   A . .
  | |     | |     | |     | |     | |     | |     | |     | |     | |     | |
B *_*   B *_.   B *_*   B *_.   B ._.   B *_*   B ._*   B ._.   B *_.   B ._.

(2,2) = (AB,AB)
(2,1) = (AB,A),(AB,B)
(2,0) = (AB)
(1,1) = (A,A),(B,B),(A,B)
(1,0) = (A),(B)
(0,0) = ()

What I don't really understand is the (0,0) = () part. Zero uncertainty and redundancy would mean (A,B) or (B,A) rather than (A,B) and (B,A). Does () mean that the actual precise local states of the 2-bit system in this case are known?
 
What I don't really understand is the (0,0) = () part. Zero uncertainty and redundancy would mean (A,B) or (B,A) rather than (A,B) and (B,A). Does () mean that the actual precise local states of the 2-bit system in this case are known?
(0,0) is the absence of information of any kind, according to this model.

I think that it is clearly expressed by:
Code:
(AB,AB) (AB,A)  (AB,B)  (AB)    (A,A)   (B,B)   (A,B)   (A)     (B)     ()

A * *   A * *   A * .   A * .   A * *   A . .   A * .   A * .   A . .   A . .
  | |     | |     | |     | |     | |     | |     | |     | |     | |     | |
B *_*   B *_.   B *_*   B *_.   B ._.   B *_*   B ._*   B ._.   B *_.   B ._.

Please see http://www.internationalskeptics.com/forums/showpost.php?p=6540703&postcount=12288 .
 
(0,0) is the absence of information of any kind, according to this model.

I think that it is clearly expressed by:
Code:
(AB,AB) (AB,A)  (AB,B)  (AB)    (A,A)   (B,B)   (A,B)   (A)     (B)     ()

A * *   A * *   A * .   A * .   A * *   A . .   A * .   A * .   A . .   A . .
  | |     | |     | |     | |     | |     | |     | |     | |     | |     | |
B *_*   B *_.   B *_*   B *_.   B ._.   B *_*   B ._*   B ._.   B *_.   B ._.

Please see http://www.internationalskeptics.com/forums/showpost.php?p=6540703&postcount=12288 .

Then a dot (.) in the diagram means no identity at all. Hmm... I can understand that information can be uncertain, but no information at all? For example, in a quantum computer with two bits, the information can be uncertain, I guess, but can there be no information at all in the 2-bit system?
 
epix, you can add Dr. Gérard P. Michon ( http://www.numericana.com/ , http://www.numericana.com/answer/ ) to your "mental cases" list.

http://www.numericana.com/answer/sets.htm#infinity

Grasping Infinity

Mathematicians routinely study things whose infinite versions turn out to be much simpler than the finite ones. One example is the sum (properly called a "series"):

1/2 + 1/4 + 1/8 + 1/16 + 1/32 + 1/64 + 1/128 + 1/256 + ...

This sum is equal to 1-2^-n when carried out only to its n-th term. It's simply equal to 1 if all of the infinitely many terms are added up.

When the ancient Greeks were still wrestling with the concept of infinity, the above sum was underlying something called Zeno's paradox: Before an arrow reaches its target it must first travel half of the distance to it (1/2), then half of what's left (1/4), half of what's left after that (1/8), and so forth. Although there are infinitely many such "steps" the arrow does reach its target... (Try it!)
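The partial-sum formula in the quoted passage is easy to verify numerically; a short sketch (mine, not from the quoted page):

```python
# The n-th partial sum of 1/2 + 1/4 + 1/8 + ... equals 1 - 2**-n,
# so the sums climb toward 1 but no finite number of steps reaches it.
for n in (1, 2, 4, 8, 16):
    partial = sum(2.0 ** -k for k in range(1, n + 1))
    print(n, partial, 1 - 2.0 ** -n)  # the two columns agree
```

Because every term is an exact binary fraction, the computed partial sums match 1 - 2**-n exactly for moderate n, and they approach 1 as n grows, which is all that "the series sums to 1" means in Traditional Math.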
 
Then a dot (.) in the diagram means no identity at all. Hmm... I can understand that information can be uncertain, but no information at all? For example a quantum computer with two bits, then the information can be uncertain I guess but can there be no information at all in the 2 bit system?
You can think about the "content" of the empty set, or a radio that is switched off.
 
You can think about the "content" of the empty set, or a radio that is switched off.

OK, but the bits in a quantum computer, for example, are always on, I think (I could be wrong; I don't know how a quantum computer works). And then empty information would not be a valid state. EDIT: At least not a state where one bit is off and another bit is on at the same time.
 
Novel notions are naturally not a part of the huge Internet database on a given subject.
MP3 is not a "novel notion"; it's a data compression method that has been around for quite a while, so if there were an idea regarding MP3, there would be a reference to it.

Your "novel notions" too often frequent the part of your definitions reserved for definiens that must be made of the established terminology, so the meaning of the novel term could be understood. And so, an unsuspecting reader may look for assistance on the web, not knowing that you are addicted to the violation of every rule that has been and will be.
 
