• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you still see problems, let me know.

Deeper than primes

Status
Not open for further replies.
Since order is not important, any multiset's elements have to be defined in parallel (not by step-by-step serial research).

It is easier to get it by using a parallel view.


Define the elements of a multiset? What are you smoking?

ETA: Doron, what about your statement "Hilbert was wrong". Are you going to retract that or what? Ditto for "Gödel was wrong".
 
Last edited:
Each multi-set defines its own entropy according to the distinction of its elements and its cardinality (the number of its elements).
No, the entropy is defined for each multiset. Not the other way round.

{1} has no entropy because everything is distinct.

{1,2} or {1,3} have no entropy

No entropy? So 'entropy' is not defined for each multiset? You certainly mean "entropy zero" or "entropy 0".

{1,1} has an entropy and {1,1,1} has more entropy than {1,1}.

{2,2,2} has the same entropy as {0,0,0} etc...

ETA(2):
In the OP, you say of both [1,1] and [1,1,1] they have "full entropy", suggesting their entropy values are the same.

So, are you absolutely sure now your idea of entropy is as stated above? Do you solemnly vow never to change that opinion again during this thread?​


Have you looked at posts #75 and #82?

ETA: Doron, what about your statement "Hilbert was wrong". Are you going to retract that or what? Ditto for "Gödel was wrong".
 
Each multi-set defines its own entropy according to the distinction of its elements and its cardinality (the number of its elements).

No, as ddt pointed out, there is one definition for entropy -- entropy is defined for each multi-set.

You still haven't defined it, though. Remember that entropy is a number in both thermodynamics and information theory; it's not sufficient to simply say "x has more entropy than y."

{} has the highest entropy because nothing is distinct.

What does this mean? Does this mean that {} has infinite entropy?

{1} has no entropy because everything is distinct.

Again, as ddt pointed out, every multiset has an entropy. Do you mean {1} has entropy zero (0)?


{1,2} or {1,3} have no entropy.

Ditto.

{1,1} has an entropy and {1,1,1} has more entropy than {1,1}.

Well, of course {1,1} has an entropy. Every multiset has an entropy if any of them do.

But remember, entropy is a number. Consider the following sequence of multisets:
{1},{1,1},{1,1,1},{1,1,1,1},...

Which of those is the first multiset that has entropy >= 1.0? Which is the first one with entropy >= 10.0?

You still haven't defined what the "entropy" is that you're working with, despite two attempts by ddt to offer definitions.

And under the post #75 definition, for example, the sets {1,1} and {1,1,1,1} have identical entropy.
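Post #75 itself isn't quoted above, but the observation that [1,1] and [1,1,1,1] come out identical is exactly what the standard Shannon entropy of a multiset's element-frequency distribution gives. A minimal sketch of that (guessed) reading - the function name is mine, not from post #75:

```python
import math
from collections import Counter

def shannon_entropy(ms):
    """Shannon entropy (in bits) of the element-frequency distribution
    of a multiset. A guessed reading of the post #75 definition,
    not a quote of it."""
    n = len(ms)
    return -sum((c / n) * math.log2(c / n) for c in Counter(ms).values())

# Multisets whose elements are all identical carry zero information,
# so [1,1] and [1,1,1,1] get the same value (zero):
print(shannon_entropy([1, 1]) == shannon_entropy([1, 1, 1, 1]) == 0.0)
print(shannon_entropy([1, 2]))  # two equally likely values: 1 bit
```

Note this runs opposite to Doron's claim: under it, [1,1,1] does not have more entropy than [1,1] - both are zero.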
 
The size of the difference between the elements is not important.

So, it's an ordering. Is it partial or total? Given two multisets A and B, what is the procedure for ordering them by Entropy?
 
Each multi-set defines its own entropy according to the distinction of its elements and its cardinality (the number of its elements).

{} has the highest entropy because nothing is distinct.

{1} has no entropy because everything is distinct.

{1,2} or {1,3} have no entropy.

{1,1} has an entropy and {1,1,1} has more entropy than {1,1}.

{2,2,2} has the same entropy as {0,0,0} etc...

The size of the difference between the elements is not important.


You seem to be avoiding cases like {1,3,1}. Why is that?

Does {1,1,1,1,1,1,1,1,1,1,1,1,3} have more or less entropy than {1,1,1,1}?


And how does one determine the entropy of a given multi-set? You need to clarify this.
 
What does this mean? Does this mean that {} has infinite entropy?
Again, as ddt pointed out, every multiset has an entropy. Do you mean {1} has entropy zero (0)?
Well, of course {1,1} has an entropy. Every multiset has an entropy if any of them do.

But remember, entropy is a number. Consider the following sequence of multisets:
{1},{1,1},{1,1,1},{1,1,1,1},...

You seem to be avoiding cases like {1,3,1}. Why is that?

Does {1,1,1,1,1,1,1,1,1,1,1,1,3} have more or less entropy than {1,1,1,1}?
Please, let's use the [x, y, z] notation. Or something else, but not the same notation as sets. Before you know it, Doron uses {} again for sets without saying so, and we spend another fruitless 100 or so posts clarifying that.

And under the post #75 definition, for example, the sets {1,1} and {1,1,1,1} have identical entropy.
Indeed. But the one from post #82 (without the -1) gives them different values (1/2 resp. 1/24).
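The post #82 definition isn't shown here either, but one formula that reproduces ddt's values of 1/2 and 1/24 is the fraction of all n! orderings that are distinguishable, i.e. (number of distinct arrangements)/n!, which simplifies to 1 over the product of the multiplicity factorials. A reconstruction under that assumption (function name mine):

```python
import math
from collections import Counter
from fractions import Fraction

def ordering_fraction(ms):
    """(number of distinct arrangements) / n!  ==  1 / prod(m_i!)
    over the multiplicities m_i. Only a reconstruction that matches
    the values ddt quotes, not the actual post #82 text."""
    denom = math.prod(math.factorial(m) for m in Counter(ms).values())
    return Fraction(1, denom)

print(ordering_fraction([1, 1]))         # 1/2
print(ordering_fraction([1, 1, 1, 1]))   # 1/24
print(ordering_fraction([1, 2, 3]))      # 1 (all elements distinct)
```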
 
Let me give you an example of what we want.
Consider “F(S)”, a function that counts the number of distinct elements within a set S.
Ex. “F({1,2,3,4})=4” (4 different entities), “F({1,2,2,4})=3” (because the entity 2 is repeated, it only counts as one).
And your definition of the entropy of a partition “A(n,k)” (partition k of the number n) would be “#A(n,k)-F(A(n,k))”.

This is an example of a definition: we now have a way to quantify it, assign it an order relation, bash it, and finally kill it.

If you can’t come up with something like this, I suggest you stop wasting our time.
 
Doron: When you talk about Symmetry and multisets what are the transformations that you are calling the multisets symmetrical in respect to?
Please give examples, e.g. take {3,1,1} and list the transformations that you used to determine its symmetry.
 
Let me give you an example of what we want.
Consider “F(S)”, a function that counts the number of distinct elements within a set S.
Ex. “F({1,2,3,4})=4” (4 different entities), “F({1,2,2,4})=3” (because the entity 2 is repeated, it only counts as one).
And your definition of the entropy of a partition “A(n,k)” (partition k of the number n) would be “#A(n,k)-F(A(n,k))”.

This is an example of a definition: we now have a way to quantify it, assign it an order relation, bash it, and finally kill it.

If you can’t come up with something like this, I suggest you stop wasting our time.

We have multi-sets and we use Distinction as their first-order property.

It means that if all we care about is to define each multi-set by a distinct value, then we are actually closed under the particular case where our researched objects are translated to another multi-set of distinct elements.

In other words, non-distinct results are perfectly valid and do not have to be translated to some distinct result, because under that limitation Distinction cannot be considered a first-order property.

TMiguel uses, for example, this function:

F({1,2,3,4})=4 , F({1,2,2,4})=3 , etc …

By this method he simply wants to reduce multi-sets to their distinct case, but the whole idea here is to save the non-distinct case of 2,2 because Distinction is a first-order property of multi-sets.


Again: {1,1,1} is non-distinct and {1,1,1,1} is more non-distinct if they are compared to each other.

We can get some distinct result of this comparison, which is nothing but some particular case of clearly distinct result.

In other words, if Distinction is a first-order property of the researched framework, then it is not limited to any particular case of Distinction.


If Distinction is a first-order property of our framework, then our framework is not limited to any particular case of it, and each researched case can be both some particular case and general case of the entire framework.

Here is a diagram of Distinction:

[image: icmfig4.jpg (diagram of Distinction)]


As can be seen in this diagram, we are using the particular case of clearly distinct identification as a general viewpoint of the entire system, but any other case which is not a distinct viewpoint, can be used as a general viewpoint of the entire system as well.
 
Multi-sets do not have Distinction as their first-order property. That is not in their definition.
P.S. Please give a list of the second-order properties of multi sets. What about the third-order properties of multi-sets?

Have you thought about the fact that multisets do not have ordered members, have no Symmetry and so they all have minimum Entropy?
 
We have multi-sets and we use Distinction as their first-order property.

Now stop the crap and keep to the topic. You wanted a definition of entropy. You're not able to come up with one yourself. Other posters have come up with three possible definitions: two from me (posts #75 and #82) and one from TMiguel (post #107).

Now would you show the decency to comment on those?

And stop the crap with your nonsensical beating about the bush. And stop with those silly pictures.

ETA: Doron, what about your statement "Hilbert was wrong". Are you going to retract that or what? Ditto for "Gödel was wrong".
 
Multi-sets do not have Distinction as their first-order property. That is not in their definition.
P.S. Please give a list of the second-order properties of multi sets. What about the third-order properties of multi-sets?

Have you thought about the fact that multisets do not have ordered members, have no Symmetry and so they all have minimum Entropy?
Yes they have, for example:

{1,2,3} = {3,2,1} is a multi-set of distinct members (order is not important).

{1,1,2} = {1,2,1} is a multiset of intermediate-distinct members (order is not important).

{1,1,1} = {1,1,1} is a multiset of non-distinct members (order is not important).
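Doron's three labels here apparently depend only on how many distinct values the multiset contains versus its total size. A sketch of that reading (the function name is mine; note the empty multiset, which Doron treats separately, would fall into the "distinct" branch here):

```python
def distinction_class(ms):
    """Classify a multiset by Doron's apparent criterion:
    compare the count of distinct values with the total size."""
    d, n = len(set(ms)), len(ms)
    if d == n:
        return "distinct"             # e.g. {1,2,3}
    if d == 1:
        return "non-distinct"         # e.g. {1,1,1}
    return "intermediate-distinct"    # e.g. {1,1,2}

print(distinction_class([1, 2, 3]))  # distinct
print(distinction_class([1, 1, 2]))  # intermediate-distinct
print(distinction_class([1, 1, 1]))  # non-distinct
```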
 
Now stop the crap and keep to the topic. You wanted a definition of entropy. You're not able to come up with one yourself. Other posters have come up with three possible definitions: two from me (posts #75 and #82) and one from TMiguel (post #107).

Now would you show the decency to comment on those?

And stop the crap with your nonsensical beating about the bush. And stop with those silly pictures.

#82:

Permutation is not the case since order is not important.

#75:

Probability is not the case since your result is some distinct value (some particular case of Distinction).
 
Yes they have, for example:

{1,2,3} = {3,2,1} is a multi-set of distinct members (order is not important).

{1,1,2} = {1,2,1} is a multiset of intermediate-distinct members (order is not important).

{1,1,1} = {1,1,1} is a multiset of non-distinct members (order is not important).
I am glad you agree - order is not important and so all multisets have the same Symmetry. Thus all multisets have the same Entropy by your very own definition. My guess is this is minimum Entropy.
 
I am glad you agree - order is not important and so all multisets have the same Symmetry. Thus all multisets have the same Entropy by your very own definition. My guess is this is minimum Entropy.

No.

{a,a,b} has the same entropy as {a,b,a} (order is not important).

{a,a,b} has the same entropy as {a,c,b} (order is not important).

{a,b,a} does not have the same entropy as {a,b,c} (and again order is not important but distinction is important, in the case of entropy).
 
No.

{a,a,b} has the same entropy as {a,b,a} (order is not important).

{a,a,b} has the same entropy as {a,c,b} (order is not important).

{a,b,a} does not have the same entropy as {a,b,c} (and again order is not important but distinction is important, in the case of entropy).
Distinction is not important in the usual definitions of entropy; please provide a citation.

Also what is the difference between distinction and order?

ETA: This post implies that you now have a definition for entropy that gives numerical values, so that you can state that "{a,b,a} does not have the same entropy as {a,b,c}". Please give that definition.
 
#82:

Permutation is not the case since order is not important.

#75:

Probability is not the case since your result is some distinct value (some particular case of Distinction).

Epic fail. You obviously didn't understand either of them. I estimate you're the only poster in this thread who doesn't - I guess all other posters not only understand both of them, but could also explain them.

Go back to start, you don't collect $200.

As for your math: go back to grade 1 and study math from there.

ETA: Doron, what about your statement "Hilbert was wrong". Are you going to retract that or what? Ditto for "Gödel was wrong".
 
No.

{a,a,b} has the same entropy as {a,b,a} (order is not important).

{a,a,b} has the same entropy as {a,c,b} (order is not important).

{a,b,a} does not have the same entropy as {a,b,c} (and again order is not important but distinction is important, in the case of entropy).

There's a hole in your reasoning here you can drive a truck through.

Order is not important - you keep repeating that - to be more precise: elements of a multiset have no order.

So the entropy of [a,a,b] is the same as that of [a,b,a], since they are the same multiset.

And the entropy of [a,b,c] is the same as that of [a,c,b], since they are the same multiset too.

Do you see the contradiction with what you wrote above?

ETA: Doron, what about your statement "Hilbert was wrong". Are you going to retract that or what? Ditto for "Gödel was wrong".
 
Yes they have, for example:

{1,2,3} = {3,2,1} is a multi-set of distinct members (order is not important).

{1,1,2} = {1,2,1} is a multiset of intermediate-distinct members (order is not important).

{1,1,1} = {1,1,1} is a multiset of non-distinct members (order is not important).


Please stop saying "order is not important" every time. We are talking about multi-sets, so of course order is not important. There is no reason to belabor the obvious.


Now, and I will adopt ddt's recommendation for multi-set notation, please note that [1] is a multiset of non-distinct members. Every element of [1] is exactly the same as every other member. So, why did you claim [1] had no (presumably meaning 0) entropy?

You have also said [1,1,1] has more entropy than [1,1]. How would you quantify that?

How does [1,1,1,3] compare to [1,1]? Does it have more or less entropy?

Does [1] have more, less, or the same entropy as [1,2,3]?
 
Distinction is not important in the usual definitions of entropy: Please provide a citation.

Also what is the difference between distinction and order?
I have the impression that "distinction" in Doron-speak here means: a measure of the number of distinct elements in a multiset. At least, with that, doron's posts make a little sense to me. In fact, it's a synonym for his earlier term "entropy". This is SOP for Doron, to introduce multiple words with the same meaning. Or, tomorrow, the intended meaning of either of the words may be different. That's also SOP for Doron.

ETA: This post implies that you now have a definition for entropy that gives numerical values, so that you can state that "{a,b,a} does not have the same entropy as {a,b,c}". Please give that definition.
Doron is too little versed in mathematics to understand even the simple possible definitions given by others. Don't hold your breath waiting for him to come up with a definition of his own.

ETA: Doron, what about your statement "Hilbert was wrong". Are you going to retract that or what? Ditto for "Gödel was wrong".
 
