Deeper than primes

If you think about Complexity in terms of Information, then I think the useful energy involved depends on our ability to deal with uncertain and/or redundant conditions.

Such conditions are usually treated as "white noise" that has to be reduced in order to get the desired results with minimum energy.

I think that the tendency of "getting maximum results by using minimum energy" can quickly lead us into triviality and atrophy if redundancy and uncertainty are not taken as natural and challenging conditions, which stimulate us to research beyond our current cozy spotlight.

I have found that the mainstream of mathematical development over the past 3,000 years is based on a step-by-step, serial view of the researched things (abstract or not), and as a result it uses techniques that are tuned to deal only with strict information.

Take, for example, the intuitive concept of a Set, which is a collection of distinct things, so that, for example, {a,a,b} = {a,b}.

It means that a property like the redundancy found in {a,a,b} is not treated as an essential case of information, but as a case that has to be reduced to {a,b} in order to get optimal conditions for analytic research, which is basically a step-by-step, serial way of thinking.
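In Python terms, the same contrast can be seen between a set, which discards the repetition, and a multiset (collections.Counter), which keeps it; a small sketch:

Code:
# The same contrast in Python: a set discards the repetition in {a,a,b},
# while a multiset (collections.Counter) keeps it as countable information.

from collections import Counter

items = ["a", "a", "b"]

print(set(items))      # {'a', 'b'}  -- the redundancy is reduced away
print(Counter(items))  # Counter({'a': 2, 'b': 1})  -- the redundancy is preserved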

By using only analytic methods, our ability to also use parallel thinking is neither stimulated nor developed.

Uncertainty and Redundancy are actually natural results of parallel thinking, which deals with information all at once; order and strict distinction are not its natural characteristics.

Take, for example, the concept of superposition of opposites, which is usually understood in terms of values that sum to 1, for example: 0.5True + 0.5False.

This approach weakens the involved opposites in order to avoid logical contradiction, a contradiction that is a direct result of step-by-step thinking, which cannot deal with opposites like True and False all at once.

But under parallel thinking, 1True + 1False is simply the uncertain information of a "strong" superposition, which may or may not collapse into a certain 1True or a certain 1False.
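A rough Python sketch of this reading (an illustration only, not Doron's formal definition), in which both opposite values are held at full weight until a collapse picks one of them:

Code:
# A rough illustration only (not Doron's formal definition): a "strong"
# superposition holds both opposite values at full weight at once, and a
# collapse may yield either one of them as a certain value.

import random

class Superposition:
    def __init__(self, *values):
        self.values = set(values)                    # e.g. {True, False}, each at "weight 1"

    def collapse(self):
        return random.choice(sorted(self.values))    # a certain True or a certain False

s = Superposition(True, False)
print(s.values)       # both values held at once -- uncertain information
print(s.collapse())   # one certain value after collapse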

For more details about this non-standard view, please look at http://www.scribd.com/doc/18453171/International-Journal-of-Pure-and-Applied-Mathematics-Volume-49 .

Ok, that's similar to my guess in another post that what you mean by entropy is related to redundancy which can be seen as a kind of noise.

Yeah, that could be true. And redundancy is also related to uncertainty, at least as far as I can see. For example, if we have the strings ab, aab, aaab, aaaab and bab, then only the strings ab and bab are free from "noise", so to speak, and thereby free from uncertainty in some sense.
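A small Python sketch of that intuition, counting adjacent repeated symbols as the "noise", so that ab and bab score zero while the other strings do not:

Code:
# One way to read the example: count adjacent repeated symbols as "noise",
# so ab and bab score 0 while aab, aaab and aaaab do not.

def adjacent_redundancy(s):
    return sum(1 for x, y in zip(s, s[1:]) if x == y)

for s in ["ab", "aab", "aaab", "aaaab", "bab"]:
    print(s, adjacent_redundancy(s))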
 
Complexity is indeed a main concept of that thread; it is measured by an Uncertainty x Redundancy matrix. For example, look at http://www.scribd.com/doc/17039028/OMDP .

Ah! Wait a minute. Now I see the point here. I didn't understand what 0-dim and 1-dim meant. From the text: "For example, by using direct perception we immediately understand that the claim that infinitely many 0-dim elements can completely cover a 1-dim element, is a false claim."

It says that an infinite number of zero-length dots cannot cover any length. Correct. Space is discrete. So there is a limit to how small actual dots can be. This limit could be the Planck length, or something even smaller that is yet to be discovered, yet not zero length.
 
Aye! and there's the rub. As Doron has professed some time ago, he aspires to have the mathematical capacity of a kindergarten student. It continues to be an uphill battle.

jsfisher, please help epix.

He claims that traditional professional mathematicians actually prove that the sum of an infinite series like 1/2 + 1/4 + 1/8 + 1/16 + ... is less than 1.

He also claims that traditional professional mathematicians actually prove that the sum of an infinite series like 0.9 + 0.09 + 0.009 + ... (in base 10) is less than 1.

As a professional mathematician, what is your answer to epix's claims?
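For reference, a quick numerical check of the series in question: every finite partial sum is strictly less than 1, while the sum of the infinite series, meaning the limit of the partial sums, is exactly 1.

Code:
# Partial sums of 1/2 + 1/4 + 1/8 + ... and 0.9 + 0.09 + 0.009 + ...
# Every finite partial sum stays below 1; the limit of the partial sums is 1.

partial = 0.0
for k in range(1, 31):
    partial += 1 / 2**k
print(partial)                                  # just below 1 after 30 terms

print(sum(9 / 10**k for k in range(1, 16)))     # also just below 1 after 15 terms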
 
Ah! Wait a minute. Now I see the point here. I didn't understand what 0-dim and 1-dim meant. From the text: "For example, by using direct perception we immediately understand that the claim that infinitely many 0-dim elements can completely cover a 1-dim element, is a false claim."

It says that an infinite number of zero-length dots cannot cover any length. Correct. Space is discrete. So there is a limit to how small actual dots can be. This limit could be the Planck length, or something even smaller that is yet to be discovered, yet not zero length.

Please look at this diagram:
[image: http://farm5.static.flickr.com/4068/4711080643_d71d687fc4_b.jpg]

Let us understand Non-locality and Locality as building blocks, and consider how collections use them.

Self-reference (whether it is True or False) is the building-block of certain ids, because it is the simultaneity of no more than one value.

Non self-reference (whether it is True or False) is the building-block of uncertain ids because it is the simultaneity of more than one value.

Since the considered framework is at least the linkage of Self-reference AND Non Self-reference, we get these different states:

a) A collection (which is actually based on both building-blocks) has an asymmetrical character if we are focused on the certain id's aspect (the simultaneity of no more than one value) of the linkage (we are able to define direction that is based on some order among certain id's).

b) A collection (which is actually based on both building-blocks) has symmetrical character if we are focused on the uncertain id's aspect (the simultaneity of more than one value) of the linkage (we are unable to define direction that is based on some order among uncertain id's, which is exactly the uncertainty of id's superposition).

c) A collection (which is actually based on both building-blocks) has several degrees of symmetry between Symmetry (superposition of ids) and Asymmetry (non-superposition of ids).

d) Any given form is both a global AND a local state of the considered framework, for example:

Jsfisher and you clearly demonstrate that the number of given things cannot fully capture the non-trivial meaning of what Number is, simply because the traditional meaning of Number is based only on a clear distinction of the things involved.

By using the Non-locality/Locality linkage as the qualitative foundation of Number, clear distinction is simply one of the options. For example:

The quantity 2 (and it does not matter whether it is a whole number or the two places of some 0.xx fraction) can't be used unless there is at least a connector/connected linkage, where the connector has a non-local quality and the connected has a local quality.

From this qualitative foundation, Uncertainty and Redundancy are the fabric of the mathematical space that enables:

1) Strong symmetric observation of the linkage, which results from a superposition of identities (uncertain ids, for example: (AB)).

2) Weak symmetric observation, which results from non-distinct replacement among clear ids (redundant ids, for example: (A,A), (B,B), (AB,AB); in the last case the AB superposition is ignored and 'AB' is taken as a notation for a clear id that represents the superposition).

3) Asymmetric observation, which results from clear ids (for example: (A,B)).

By the way, the explanation in 1), 2), 3) above itself uses the asymmetric observation (3), but again, none of these options has any privilege, and we, as participators (and not only observers) of this mathematical universe, actually design it for our purposes.

The coherence of this mathematical universe is guaranteed by its qualitative foundation, the Non-local/Local linkage, where Non-locality and Locality complement each other into one complex fabric.
 
Yes, yes, doron, all very nice and colored. But what's the use for it? You know, like "conventional" mathematics enables us to, let's say, calculate areas, volumes, solve differential equations and such.

Let's say I accept that your marvelous "theory" is consistent and sensical. OK. Next step? Keep in mind that "conventional" mathematics has been proven to work. They're using it all over physics for example. Reality agrees with the results. Show us something that's the result of OM and reality agrees with it. We'll take it from there.
 
Can you identify where on that line there is no point?

I think it's more correct to think of a position on a line as having a minimum level of uncertainty that is always greater than zero, so that a position on a line can only be determined to a certain degree of accuracy, not with infinite accuracy, even in theory.
 
Please look at this diagram:
[image: http://farm5.static.flickr.com/4068/4711080643_d71d687fc4_b.jpg]
...
a) A collection (which is actually based on both building-blocks) has an asymmetrical character if we are focused on the certain id's aspect (the simultaneity of no more than one value) of the linkage (we are able to define direction that is based on some order among certain id's).

b) A collection (which is actually based on both building-blocks) has symmetrical character if we are focused on the uncertain id's aspect (the simultaneity of more than one value) of the linkage (we are unable to define direction that is based on some order among uncertain id's, which is exactly the uncertainty of id's superposition).

I don't fully understand the diagram. And symmetry (S) vs asymmetry (A), is it like the below partitions?

2
---
{1,1} S


3
---
{1,1,1} S
{2,1} A


4
---
{1,1,1,1} S
{2,1,1}
{2,2} S
{3,1} A
 

Exactly, where {2,1,1} (for example) is some degree between S and A.

But I go deeper than the first level of partitions by using recursion over the previous partitions as a form of the currently considered partition; for example, look at http://www.scribd.com/doc/16542245/OMPT pages 18-19.
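A small Python sketch of the first level only (the recursion over previous partitions described in the linked paper is not reproduced here), labelling a partition S when all parts are equal, A when all parts are distinct, and "between S and A" otherwise:

Code:
# First-level sketch only: label a partition S when all parts are equal,
# A when all parts are distinct, and "between S and A" otherwise.
# (The recursion over previous partitions in the linked paper is not shown.)

def partitions(n, max_part=None):
    """Generate the integer partitions of n as non-increasing tuples."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

def label(p):
    if len(set(p)) == 1:
        return "S"                  # e.g. (1,1,1,1) or (2,2)
    if len(set(p)) == len(p):
        return "A"                  # e.g. (3,1)
    return "between S and A"        # e.g. (2,1,1)

for p in partitions(4):
    if len(p) > 1:                  # skip the trivial one-part partition {4}
        print(p, label(p))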
 
I like the idea of redundancy and uncertainty related to information. I don't understand how it relates to natural numbers but I can see it has a meaning for information in general.

For example, using information (Shannon) entropy, both a string of random information and a string of complex information give a high information entropy value, which kind of sucks because it would be good to be able to calculate a distinction between random information (noise, disorder) and ordered complex information.

Random information is a result of uncertainty. Complex information is on the other hand related to low redundancy. It would be useful therefore, probably even useful in practical applications (such as effective noise filters), to have some mathematical way of measuring the amount of uncertainty in digital information. That measurement in combination with information entropy would show the noise-complexity ratio for any source of digital information.
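A minimal Python sketch of that kind of measurement, combining per-symbol Shannon entropy with a crude zlib compression ratio as a stand-in for redundancy; this is only an illustration, not any standard named measure:

Code:
# A minimal sketch: per-symbol Shannon entropy plus a crude redundancy
# estimate taken from the zlib compression ratio.

import math
import zlib
from collections import Counter

def shannon_entropy(s):
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compression_ratio(s):
    raw = s.encode()
    return len(zlib.compress(raw)) / len(raw)   # lower ratio = more redundancy found

for s in ["abababababababab", "aaaaaaaaaaaaaaaa", "qzkxwvmjtrpnhgfd"]:
    print(s, round(shannon_entropy(s), 3), round(compression_ratio(s), 3))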
 
Exactly, where {2,1,1} (for example) is some degree between S and A.

But I go deeper than the first level of partitions by using recursion over the previous partitions as a form of the currently considered partition; for example, look at http://www.scribd.com/doc/16542245/OMPT pages 18-19.

Holy mackerel! I sort of grasp a bit of the local - non-local idea, but the part about cybernetic kernels was like two miles above my head. :D But I will read a bit more of the first part of the paper to see if I can understand some more.
 
I like the idea of redundancy and uncertainty related to information. I don't understand how it relates to natural numbers but I can see it has a meaning for information in general.

For example, using information (Shannon) entropy, both a string of random information and a string of complex information give a high information entropy value, which kind of sucks because it would be good to be able to calculate a distinction between random information (noise, disorder) and ordered complex information.

Random information is a result of uncertainty. Complex information is on the other hand related to low redundancy. It would be useful therefore, probably even useful in practical applications (such as effective noise filters), to have some mathematical way of measuring the amount of uncertainty in digital information. That measurement in combination with information entropy would show the noise-complexity ratio for any source of digital information.

Take, for example, the Fibonacci numbers (http://en.wikipedia.org/wiki/Fibonacci_number), which can be found at the basis of many natural phenomena.

By not ignoring Uncertainty and Redundancy, we are able to construct many variations of the Fibonacci numbers, based on a mixture of uncertain + redundant + strict information forms, as represented by the following partial example:
[image: http://farm5.static.flickr.com/4090/5163344885_a9eae21a88_b.jpg]
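For reference, a minimal Python version of the standard Fibonacci recurrence that such variations would start from; the uncertain/redundant variants shown in the image are not reproduced here:

Code:
# The standard Fibonacci recurrence, for reference only; the uncertain and
# redundant variants Doron refers to in the image are not reproduced here.

def fib(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([fib(k) for k in range(10)])   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]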
 

The math is too complicated for me. Is it possible to use organic mathematics for binary information? For example, I can imagine that an organic bit would have both locality and non-locality at the same time.

A traditional binary string is simply 1s and 0s at exact positions, such as 011000010101. But an organic bit would be a superposition so that it could be both 0 and 1 at the same time (non-locality for the bit's value) and be in all positions 1 to 12 in the example at the same time (non-locality for the bit's position in the binary string).
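A rough Python sketch of that picture (my own modelling of the question, not an established "organic bit" definition), where the bit carries a set of possible values and a set of possible positions instead of one exact value at one exact position:

Code:
# A rough model of the question above (my own modelling, not an established
# "organic bit" definition): the bit carries a set of possible values and a
# set of possible positions, instead of one exact value at one exact position.

from dataclasses import dataclass

@dataclass(frozen=True)
class OrganicBit:
    values: frozenset     # e.g. frozenset({0, 1}) -- value held in superposition
    positions: frozenset  # e.g. frozenset(range(12)) -- position held in superposition

classical = OrganicBit(frozenset({1}), frozenset({0}))           # ordinary bit: value 1 at position 0
organic = OrganicBit(frozenset({0, 1}), frozenset(range(12)))    # "non-local" in both value and position

print(classical)
print(organic)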
 

In order to avoid confusion, let's use A,B instead of 0,1.

For a binary system with a 2-Uncertainty x 2-Redundancy matrix we have 10 distinct forms:

Code:
(AB,AB) (AB,A)  (AB,B)  (AB)    (A,A)   (B,B)   (A,B)   (A)     (B)     ()

A * *   A * *   A * .   A * .   A * *   A . .   A * .   A * .   A . .   A . .
  | |     | |     | |     | |     | |     | |     | |     | |     | |     | |
B *_*   B *_.   B *_*   B *_.   B ._.   B *_*   B ._*   B ._.   B *_.   B ._.

(2,2) has (AB,AB) form
(2,1) has (AB,A),(AB,B) forms
(2,0) has (AB) form
(1,1) has (A,A),(B,B),(A,B) forms
(1,0) has (A),(B) forms
(0,0) has () form

(AB,AB), which has the 2-Uncertainty x 2-Redundancy degree, has "room" for all of the other forms, which are: (AB,A),(AB,B),(AB),(A,A),(B,B),(A,B),(A),(B),()

() does not have "room" for any form.

Between these extremes there are the (AB,A),(AB,B),(AB),(A,A),(B,B),(A,B),(A),(B) forms, which are ordered by their ability to be "rooms" for other forms.
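A small Python sketch that reproduces the count of 10 forms, treating each member of a collection as a non-empty subset of {A,B} (with "AB" standing for the superposed member) and comparing collections without regard to order:

Code:
# Reproduces the count of 10 forms: collections of at most two members,
# each member a non-empty subset of {A, B} ("AB" standing for the superposed
# member), compared without regard to order.

from itertools import combinations_with_replacement

members = ("AB", "A", "B")

forms = [()]                                              # the empty form ()
forms += [(m,) for m in members]                          # one-member forms
forms += list(combinations_with_replacement(members, 2))  # two-member forms

for f in forms:
    print("(" + ",".join(f) + ")")

print(len(forms))   # 10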
 

(2,2) I take it means maximum uncertainty and maximum redundancy for a two bit system. And the lesser value pairs for the uncertainty and redundancy in this case are: (2,1) (1,2) (2,0) (0,2) (1,0) (0,1) and (0,0). But that's 8 combinations and you listed only 6 combinations. I must be mistaken or have missed something.
 
(2,1)=(1,2)
(2,0)=(0,2)
(1,0)=(0,1)

because order has no significance at this fundamental level, as can be seen by the following trees:

Code:
(AB,AB) (AB,A)  (AB,B)  (AB)    (A,A)   (B,B)   (A,B)   (A)     (B)     ()

A * *   A * *   A * .   A * .   A * *   A . .   A * .   A * .   A . .   A . .
  | |     | |     | |     | |     | |     | |     | |     | |     | |     | |
B *_*   B *_.   B *_*   B *_.   B ._.   B *_*   B ._*   B ._.   B *_.   B ._.

Also please look at http://www.scribd.com/doc/21967511/...considerations-of-Some-Mathematical-Paradigms .
 
Yes, yes, doron, all very nice and colored. But what's the use for it? You know, like "conventional" mathematics enables us to, let's say, calculate areas, volumes, solve differential equations and such.
Hold the horses! Pause and think when you decide to compute the volume of a cylinder, for example. According to Doron, traditional math is asking for trouble by not realizing that 3-dim space is independent of its subspaces. So if you think that the volume of the cylinder depends on the area of the circle (2-dim space) that makes the cylinder's bottom, and the circle area depends on the length of the radius (1-dim space), then you think wrong. The reason why car engines work when they leave the drawing board has been a case of a sheer, unimaginable coincidence coupled with a direct divine intervention. But God needs to go to the bathroom once a millennium and when it happens . . . Hold the horses for real.
 
But that's 8 combinations and you listed only 6 combinations. I must be mistaken or have missed something.
I would suggest reading the intro to Doronetics once again:
http://www.internationalskeptics.com/forums/showthread.php?t=125220

7
---
{1,1,1,1,1,1,1}
{2,1,1,1,1,1}
{2,2,1,1,1}
{2,2,2,1}
{3,1,1,1,1}
{3,2,1,1}
{3,2,2}
{4,1,1,1}
{4,2,1}
{5,1,1}
{5,2}
{6,1}

As you see two partitions are missing; namely {3,3,1} and {4,3}. It's like a harbinger of things to come . . .

In the intro, Doron extends the meaning of "entropy" to the case where a collection of identical items can be reorganized, but there is no way to prove that the reorganization took place, as opposed to the case involving the reorganization of distinct items.

Since prime numbers cannot be divided by any numbers other than themselves and 1, the partitions with the "highest entropy", such as {3,3,3,...}, {5,5,5,...}, and so on, are absent from the partitions of primes. That might lead to the conclusion which Doron made. But the conclusion really depends on the incidence of the "lowest entropy" cases (the blue partitions with distinct blue elements). In this respect, the primes actually display the common property of all odd numbers, and there is no statistically significant difference between primes and odd numbers in this respect, based on a sample much larger than Doron's.
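A quick Python check of the "two partitions are missing" remark: the partitions of 7 with more than one part number 14 in total, so a list of 12 is indeed short by exactly two:

Code:
# Check of the count above: partitions of 7 with more than one part number
# 14 in total, so a list of 12 is short by exactly two ({3,3,1} and {4,3}).

def partitions(n, max_part=None):
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

nontrivial = [p for p in partitions(7) if len(p) > 1]
print(len(nontrivial))    # 14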
 

Yeah, I haven't read all of the original post. I will give it another try.
 