
Deeper than primes

Status
Not open for further replies.
I had high hopes....but, as expected, my head now hurts from trying to unscramble the pseudo-scientific babble that is prevailing.

I should know better, but it's simply too hard to look away.
 
The aim of the organic paradigm of the mathematical science is to understand what enables us to define things, where distinct definition is only a particular case of Distinction (if Distinction is a first-order property of the mathematical science).
 
The aim of the organic paradigm of the mathematical science is to understand what enables us to define things, where distinct definition is only a particular case of Distinction (if Distinction is a first-order property of the mathematical science).


Doron,
That is not the topic of this thread. If you now drift into your organic non-mathematics non-science, then expect this thread to find its way to the Philosophy forum where you will likely abandon it.
 
Distinction is the relation between the certain and the uncertain.

The certain and the uncertain complement each other.

It means that neither is defined in terms of the other.

For example, if the uncertain is darkness, you cannot use light (the certain) in order to research the darkness (the uncertain) because by using light you change the researched subject (darkness, in this case).

The idea is to define the common property that stands at the basis of both darkness and light, and then you are able to grasp their relations, and each one of them, without changing them by your research.

By using this knowledge we can develop new methods in order to improve the relations between complementary states (any cardinality of complementary states can be found; it does not matter), in addition to their property of contradicting (preventing) each other.

A multi-set is the result of things that simultaneously complement AND prevent each other, and the best knowledge is based on the non-trivial relation between the certain and the uncertain.

Please look again at ONN5 represented by Penrose tiling:

[qimg]http://www.geocities.com/complementarytheory/Penrose.jpg[/qimg]

ONN5 is one thing that is simultaneously defined as several states of Distinction.

Please let your mind get the perception of this distinction.

You can look into each state and directly get how your perception spontaneously defines its distinction (it can clearly be seen if you compare the first top-left case (which has maximum entropy) with the last bottom-right case (which has minimum entropy)).

But please do not forget that this comparison is nothing but the particular case of clear distinction.

The current paradigm of the mathematical science is limited to clear distinction as its first-order property.

Under the organic paradigm, Distinction is itself a first-order property of the mathematical science, and it is not limited to any particular case of it.

Let’s make something clear.

Distinction is the ability to point out the property that makes 2 different elements different.

If there is no distinction (you can't tell any difference) between {1,2,3} and {2,2,2},
why the heck should you say that {1,2,3} has no entropy and that {2,2,2} has full entropy, the same as {3,3}, WHEN YOU CAN'T TELL THE DIFFERENCE?

WHAT MAKES THE DIFFERENCE, AND HOW DO YOU CLASSIFY THOSE DIFFERENCES!!!!!!
 
WHAT MAKES THE DIFFERENCE, AND HOW DO YOU CLASSIFY THOSE DIFFERENCES!!!!!!

The complementation between the common AND the difference.

Neither of them alone is the general state of the mathematical science.

Again, knowing the difference is nothing but some particular case of the mathematical science, where entropy is both an inherent state of some multi-set and a distinct result based on comparing different multi-sets (which is the particular case of clear distinction).
 
The complementation between the common AND the different.

Neither of them alone is the general state of the mathematical science.

To be able to tell the difference is WHY WE HAVE MATH. To tell what odds and evens are, what properties make them different, what they imply, etc., etc., etc.
To define your terms IS THE FIRST STEP IN MATH.

And since you are missing quite a fundamental element, your work is COMPLETELY USELESS.

Get it through your head.
 
The complementation between the common AND the difference.

Three more undefined terms.

Again, knowing the difference is nothing but some particular case of the mathematical science, where entropy is both an inherent state of some multi-set and a distinct result based on comparing different multi-sets (which is the particular case of clear distinction).

And three more undefined terms.

If I save up enough of these, can I trade them in for green stamps or something?
 
The complementation between the common AND the difference.

Neither of them alone is the general state of the mathematical science.

Again, knowing the difference is nothing but some particular case of the mathematical science, where entropy is both an inherent state of some multi-set and a distinct result based on comparing different multi-sets (which is the particular case of clear distinction).


Doron,
Your post is off topic. You should expect that many people will report your posts to the moderators as off-topic.

Please return to the topic you, yourself, posed first in this thread. The topic has to do with a mathematical consideration of an Information Theory-like concept of entropy and its relationship to prime numbers.

A very good place to return to the topic at hand would be with the definition for your version of the term entropy. So far, you have presented examples to show it to be a partial ordering. You need to flesh out the rules for that ordering.
 
To be able to tell the difference is WHY WE HAVE MATH.
Here is some example:

Let us say that we compare things as long as they are between "{" and "}"

{a,a,a} has an entropy because no difference can be found between the members of this multi-set.

{a,b,c} has no entropy because a difference can be found between the members of this multi-set.

{{a,a,a},{a,b,c}} has no entropy because a difference can be found between the members of this multi-set.

So as you see, to know the difference is the particular case of no entropy, where Mathematics is not any of its particular cases.
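Taken literally, the rule stated above (a multi-set "has an entropy" exactly when no difference can be found between its members) can be sketched in a few lines. This is my own reading of the examples, not a definition Doron has supplied; inner multi-sets are written as tuples just to make the nested example comparable:

```python
def doron_entropy(multiset):
    """Return "entropy" if no difference can be found between the
    members (all equal), else "no entropy" -- the rule as stated."""
    members = list(multiset)
    if all(m == members[0] for m in members):
        return "entropy"
    return "no entropy"

print(doron_entropy(["a", "a", "a"]))  # entropy
print(doron_entropy(["a", "b", "c"]))  # no entropy
# Nested case: the two members (themselves multi-sets) differ.
print(doron_entropy([("a", "a", "a"), ("a", "b", "c")]))  # no entropy
```

Note that by this literal reading a singleton like [1] comes out as "entropy", which is exactly the contradiction with earlier examples pointed out later in the thread.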
 
Here is some example:

Let us say that we compare things as long as they are between "{" and "}"

{a,a,a} has an entropy because no difference can be found between the members of this multi-set.

{a,b,c} has no entropy because a difference can be found between the members of this multi-set.

You still haven't learned the difference between "no entropy" and "entropy zero," have you, despite many repetitions.

Entropy is a NUMBER, not just a property. (Except possibly in your framework, where it may be a partial ordering, but it's hard to tell because you still haven't given us a definition of entropy in your framework. It may be a partial order, or it may be complete happy horse manure. I'm now taking bets on which.)
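For contrast, here is entropy as a NUMBER in the standard Information-Theory sense, applied to a multi-set's empirical distribution. This is my own illustration of the conventional notion, not necessarily any of the functions proposed in posts #75 and #82, and note that it runs opposite to Doron's usage: all-equal gives 0, all-distinct gives the maximum log2(n):

```python
import math
from collections import Counter

def shannon_entropy(multiset):
    """Shannon entropy (in bits) of the empirical distribution of a
    multiset's members: 0 when all members are equal, log2(n) when
    all n members are distinct."""
    items = list(multiset)
    counts = Counter(items)
    n = len(items)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy(["a", "a", "a"]))  # 0.0
print(shannon_entropy(["a", "b", "c"]))  # log2(3), about 1.585
print(shannon_entropy([1, 1, 2]))        # about 0.918
```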
 
I don't think she said that.

I think she said that the magnitude of the difference between two elements is unimportant, as long as they differ. I.e. the quasi-entropy of [2,1,1]* is the same as the quasi-entropy of [5,1,1] or of [Coke, Pepsi, Pepsi] as long as we're dealing with a multiset with one singleton element and one pair.

Similarly, I don't remember her saying that the value of the quasi-entropy isn't important. She's merely demonstrated a complete inability (or more charitably unwillingness) to quantify her notion of quasi-entropy so that we can actually calculate that value.


(*) See, DDT, I'm using your multiset notation. Happy?
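That reading can be made concrete: if the magnitudes don't matter, the quasi-entropy can only depend on the multiplicity signature (the sorted multiplicities of the distinct members). A sketch, where `multiplicity_signature` is my own name for it:

```python
from collections import Counter

def multiplicity_signature(multiset):
    """Sorted multiplicities of a multiset's distinct members.
    On this reading, quasi-entropy depends only on this signature."""
    return sorted(Counter(multiset).values(), reverse=True)

# One pair plus one singleton in every case, so all three agree:
print(multiplicity_signature([2, 1, 1]))                   # [2, 1]
print(multiplicity_signature([5, 1, 1]))                   # [2, 1]
print(multiplicity_signature(["Coke", "Pepsi", "Pepsi"]))  # [2, 1]
```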

Heh, I don't know -- that's why I'm asking Doron. It's unwise to speculate about what Doron means, beyond his being confused :)

Btw Doron's a he, not a she, in spite of what one might conclude from the name. (DDT's provided conclusive evidence that Doron's some manager at Tahal, some Israeli construction concern.)
 
That you keep reiterating that order is not important, even though multisets (and sets) are unordered, leads me to suspect you do not know what a multiset (or set) is. So, let's start with some basic questions:

Do you agree or disagree that [a,a,b] and [a,b,a] are the same multiset?

Do you agree or disagree that [a,a,b] and [a,b,a] are different representations of the same multiset?

You've not responded to these questions. Do you not know what a multiset is?
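The point can be checked mechanically: Python's `collections.Counter` behaves as a multiset whose equality ignores order, recording only members and their multiplicities, so [a,a,b] and [a,b,a] are two representations of the same multiset:

```python
from collections import Counter

# Same members, same multiplicities, different listing order:
print(Counter(["a", "a", "b"]) == Counter(["a", "b", "a"]))  # True
# Different multiplicities, so a genuinely different multiset:
print(Counter(["a", "a", "b"]) == Counter(["a", "b", "b"]))  # False
```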
 
Here is some example:
You're mostly repeating examples you already gave.

What about commenting with insight on the three entropy functions proposed in this thread?


Let us say that we compare things as long as they are between "{" and "}"
I see another "a set is the union of its members" trainwreck coming on.

Please also use square brackets [ ] for multisets.

{a,a,a} has an entropy because no difference can be found between the members of this multi-set.

{a,b,c} has no entropy because a difference can be found between the members of this multi-set.
Not "no entropy" - entropy 0.

{{a,a,a},{a,b,c}} has no entropy because a difference can be found between the members of this multi-set.
What's different about this than about the multiset [Pepsi, Coke] ? Both multisets have two members that are different. Whether those members are multisets themselves is irrelevant. You want to make matters more complicated when you already have no grasp of what you're dealing with now? Sick.

So as you see, to know the difference is the particular case of no entropy, where Mathematics is not any of its particular cases.
Is that English?

To add: Doron, what about your statement "Hilbert was wrong". Are you going to retract that or what? Ditto for "Gödel was wrong".
 
Here is some example:

Let us say that we compare things as long as they are between "{" and "}"

{a,a,a} has an entropy because no difference can be found between the members of this multi-set.

{a,b,c} has no entropy because a difference can be found between the members of this multi-set.

{{a,a,a},{a,b,c}} has no entropy because a difference can be found between the members of this multi-set.

So as you see, to know the difference is the particular case of no entropy, where Mathematics is not any of its particular cases.


Ok, so, according to your (ab-)use of the term entropy, it is a property that a multi-set either has or doesn't have. You continue to skirt examples like [1,1,2], but by your description, above, [1,1,2] has "no entropy" because a difference can be found between the members.

By the way, your description contradicts previous examples. In particular, [1] has "an entropy".

So, where are we? Well, you now have provided some meaning to your non-standard use of the term entropy. Unfortunately, it renders meaningless just about all of your Notion #1.

That doesn't leave you with very much to discuss in this thread.
 
Distinction as a first-order property, is important.

Refrain from using any more terms you don't understand or define. "Distinction" and "first-order property" are off-limits for you now. Otherwise, you might find a herring head on your pillow. (*)

To add: Doron, what about your statement "Hilbert was wrong". Are you going to retract that or what? Ditto for "Gödel was wrong".

(*) see my custom title.
 
For example:

a = 0

b = 1

a < c < b

and we get {a,b,c} that is some case with no entropy.
Why do you re-introduce some kind of order? For all we know, there is no ordering on a, b, and c; we can only compare them for (in)equality. This example is totally irrelevant. And, we do get it now.

Where's your definition of entropy? Any of the proposed ones to your liking?

To add: Doron, what about your statement "Hilbert was wrong". Are you going to retract that or what? Ditto for "Gödel was wrong".
 
As long as you ignore Distinction as multi-set's first-order property, you do not get my idea.

What idea? The OP? Despite all the help offered, we've come not an inch closer to anything resembling real math. Why would that be?

Oh yes, you profoundly misunderstood the two proposed definitions in posts #75 and #82. You didn't comment on TMiguel's one. Now, who here doesn't get basic math?

To add: Doron, what about your statement "Hilbert was wrong". Are you going to retract that or what? Ditto for "Gödel was wrong".
 
Here is some example:

Let us say that we compare things as long as they are between "{" and "}"
This is not useful at all.

{a,a,a} has an entropy because no difference can be found between the members of this multi-set.
Ok?

{a,b,c} has no entropy because a difference can be found between the members of this multi-set.
What difference? Don’t answer that, because it is obvious what it is you failed to mention (which is that they have different elements, not merely that "there is a difference"), but we are getting somewhere.

So as you see, to know the difference is the particular case of no entropy, where Mathematics is not any of its particular cases.
Evens are different from odds; that has nothing to do with entropy.

Before:
{x} = Full entropy
{x} = Intermediate entropy
{x} = No entropy
You mentioned 3 different states, but you only explained (very badly) 2.

Now how do you express that mathematically?
Let me give you a hint:
Let's try this one.
Let “A(n,k)” be a partition k of a number n.
Let F(s) be the counting function for the distinct (non-repeated) elements within a multi-set.
If F(A(n,k)) = 1, then the multi-set is said to be un-distinct (in your case, max entropy).
Else, if #A(n,k) - F(A(n,k)) = 0, it is said to be fully-distinct (in your case, no entropy).
Else, it is said to be semi-distinct (in your case, intermediate entropy).
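For what it's worth, that hint is easy to run. A sketch assuming the partition is passed directly as a list of its parts (the A(n,k) indexing is elided), with F realized as a count of distinct parts and # as the total number of parts:

```python
from collections import Counter

def classify(partition):
    """Classify a partition (a multi-set of parts) per the scheme above."""
    distinct = len(Counter(partition))  # F: number of distinct parts
    size = len(partition)               # #: total number of parts
    if distinct == 1:
        return "un-distinct"     # "max entropy": all parts equal
    if size - distinct == 0:
        return "fully-distinct"  # "no entropy": all parts different
    return "semi-distinct"       # "intermediate entropy"

# The three partitions of 6 into 3 parts used as examples in this thread:
print(classify([2, 2, 2]))  # un-distinct
print(classify([1, 2, 3]))  # fully-distinct
print(classify([1, 1, 4]))  # semi-distinct
```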
 
