
Deeper than primes

Status
Not open for further replies.
Actually, there is no problem in using "{" and "}", because I am looking for the relations between sets and multisets as the framework of this research into Entropy.

By using Symmetry I find Entropy as a state that is measured by its invariance under exchange, for example:

By using cardinal 4 multisets {1,1,1,1} or {2,2} are invariant under their members' exchange, and in this case we have maximum entropy.

The entropy of {1,1,1,1} is greater than {2,2} because more members are invariant under exchange.

On the contrary {3,1} has no entropy because it is variant (and asymmetric) under exchange.

In my opinion, by using this preliminary idea, we can unify Thermodynamics and Information theory under one method, based on Symmetry as the common measurement tool.

A lot of work has to be done in order to develop this idea, so before any further effort in this direction, I wish to know what you think about this preliminary idea.




When we are talking about partitions of natural numbers we have to use notations like (1,1,1,1) or (2,2) instead of {1,1,1,1} or {2,2}, but in both cases Symmetry is used as invariance under elements' exchange.

You still haven’t answered my question.
What does it really account for?
Which of these multisets has more entropy: {4,2,1,1,1} or {3,2,2,1,1}?
How do you actually compute it?
 
Then the whole (new) idea is wrong.
Entropy is a continuous numeric value (a real number).
Measures of Symmetry are discrete numeric values (integers), unless you are talking about objects that have infinite symmetry, e.g. a sphere. This is certainly not the case here. The "partitions" that you talk about do not have infinite symmetry.
Thus Entropy cannot be defined in terms of Symmetry.

R set is based on distinct values, and so is N set.

I am not talking here about sets, but about the relations between sets and multisets in terms of distinction, measured by symmetry (superposition of identities) and asymmetry (distinct identities).

My first post is just an outline of this idea.
 
Please show where Shannon defines Symmetry as a common framework for Thermodynamics and Information theory research about Entropy?

Symmetry is defined as invariance under manipulation, where Entropy is the most symmetrical state (no new information can be found under any manipulation, and no work can be done, in terms of thermodynamic equilibrium).
Entropy is not the most symmetrical state. The most symmetrical state is the state with the largest number of "invariance under manipulation".
The best that you can say is that there may be a definition of entropy that is a measure of the number of symmetries that an object has. So now all you have to do is provide us with that definition.

Just in case you have not got it yet:
Symmetry is a property of a mathematical object, i.e. when the object undergoes a transformation and that transformation gives you the original object then the object is symmetrical under that operation. It is not a number. It is what happens to the object when transformed.

Entropy is a number.
 
R set is based on distinct values, and so is N set.

I am not talking here about sets, but about the relations between sets and multisets in terms of distinction, measured by symmetry (superposition of identities) and asymmetry (distinct identities).

My first post is just an outline of this idea.
R set is based on distinct continuous values, and N set is based on distinct values.
Symmetry is not a "superposition of identities". It is the result of a transformation of an object. Asymmetry is not "distinct identities". It is a result of a transformation of an object.
 
jsfisher said:
Wow! You've only just started this post, and already you are using the term, entropy, differently than first implied. I suppose that is the best thing about never defining anything. You are free to misuse your terms any way that pleases you.

When we are talking about partitions of natural numbers we have to use notations like (1,1,1,1) or (2,2) instead of {1,1,1,1} or {2,2}, but in both cases Symmetry is used as invariance under elements' exchange.


Two things jump out at me, doron. First, why did you edit my post you quoted? Admittedly, it was a trivial change - an italics tag pair was dropped - but why would you do it?

Second, your response has absolutely nothing to do with my post. My comment is on your inconsistent use of your own terms, and you go off on a tangent about an unnecessary notation shift that you have to use, but didn't.
 
By using cardinal 4 multisets {1,1,1,1} or {2,2} are invariant under their members' exchange, and in this case we have maximum entropy.

Are you saying the multi-set {2,2} has cardinality of 4? What did you really mean?

The entropy of {1,1,1,1} is greater than {2,2} because more members are invariant under exchange.

By exchange you mean rearrange the members of the multi-set, right? Since multi-sets, just like sets, are unordered, your statement has no meaning.

On the contrary {3,1} has no entropy because it is variant (and asymmetric) under exchange.

Are you trying to say {3,1} and {1,3} are different multi-sets?
 
Doron: Wikipedia is your friend (since you do not seem to have access to mathematical textbooks): Multiset.

(Emphasis added.)
Don't assume that. Wikipedia is what led doron to believe a set is the union of its members - a belief rigidly held to this very day.
 
Please show where Shannon defines Symmetry as a common framework for Thermodynamics and Information theory research about Entropy?

Symmetry is defined as invariance under manipulation, where Entropy is the most symmetrical state (no new information can be found under any manipulation, and no work can be done, in terms of thermodynamic equilibrium).
Entropy in thermodynamics and information theory is proportional to the logarithm of the number of possible states of a system; it has nothing to do with invariance, and especially not symmetry.
And you still haven't answered my question!
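For reference, the two textbook formulas behind that claim can be sketched in a few lines (a minimal illustration; the function names are mine, not anything posted in this thread):

```python
import math

# For W equally likely microstates, the standard definitions are:
#   Boltzmann (thermodynamics):  S = k_B * ln(W)
#   Shannon (information theory): H = log2(W) bits
# Both grow with the number of possible states; neither counts symmetries.

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    return k_B * math.log(W)

def shannon_entropy_uniform(W):
    return math.log2(W)

print(shannon_entropy_uniform(8))  # 3.0 (bits)
print(boltzmann_entropy(8))        # ~2.871e-23 (J/K)
```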
 
Are you saying the multi-set {2,2} has cardinality of 4? What did you really mean?
My mistake. I meant the sum 4.


By exchange you mean rearrange the members of the multi-set, right? Since multi-sets, just like sets, are unordered, your statement has no meaning.
I am talking about Distinction. A set is asymmetric since each member is distinct. This is not the case in a multiset.

Are you trying to say {3,1} and {1,3} are different multi-sets?

{3,1} is a set. {3,3} is a multiset.

In both cases (set or multiset) I am talking about their internal structure, in terms of symmetry.
 
Please show where Shannon defines Symmetry as a common framework for Thermodynamics and Information theory research about Entropy?

See? You're doing it again.

I offer some suggestions, and you react with hostility.


Symmetry is defined as invariance under manipulation, where Entropy is the most symmetrical state

Really? To paraphrase you, "please show me where anyone (it doesn't have to be Shannon) has defined entropy as the most symmetrical state"?

I even offered you a chance to be the one who had created that definition, and you couldn't. So again I offer you the chance to define your terms:

Please complete the following phrase: The entropy of a (multi)set S is given by the expression:

(Please bear in mind that, as pointed out, entropy is generally a continuous variable, and multisets are discrete; also bear in mind that to be useful as a definition of "entropy" there should be a set of limiting cases where traditional measures such as H = -Σ p log₂ p and your formulation give the same answer.)
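For concreteness, the traditional measure mentioned above can be computed with a short Python sketch (the function name is mine):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) over the distribution, skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))         # 1.0 (a fair coin: one bit)
print(shannon_entropy([1.0]))              # 0.0 (a certain outcome)
print(shannon_entropy([0.5, 0.25, 0.25]))  # 1.5
```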
 
{3,1} is a set. {3,3} is a multiset.

All sets are multisets, or more formally, a multiset is a generalization of a set.

In both cases (set or multiset) I am talking about their internal structure, in terms of symmetry.

So you're suggesting that the sets {3,1} and {1,3} are distinct?
 
Actually, there is no problem to use "{" and "}" because I am looking for the relations between sets and multisets as the framework of Entropy's research.
There's no problem in that if your objective is to muddy the waters. Using sets and multisets and denoting both with { } braces makes the whole thing unreadable. But then, this is your standard MO, isn't it?

The entropy of {1,1,1,1} is greater than {2,2} because more members are invariant under exchange.
In your OP, you claimed both multisets had "full entropy". So what is it: are their entropies the same or are they different? Goes to show that you can't keep your story straight within a handful of posts.

When we are talking about partitions of natural numbers we have to use notations like (1,1,1,1) or (2,2) instead of {1,1,1,1} or {2,2}, but in both cases Symmetry is used as invariance under elements' exchange.
You're back again at your "symmetry" hobbyhorse? Another word in a long line of words you don't understand.

I have a melody but I don't know how to write the notes for some group of musicians.
In addition to drkitten's response: You already have some false notes in there. See above.

I go to some musician who knows how to do it. He will do it if he thinks that this melody has the potential to become a nice piece of music.
But you know it better than the musician, don't you? You've shunned every piece of advice you got in earlier threads. You couldn't even answer a simple question like whether you use classical logic or not.

Forget about me, open your own thread. I believe that ddt, jsfisher, PixMisa, Nathan and more people that really know Mathematics, will be glad to help you.
What a silly suggestion. Wasn't your objective with this thread to get answers to your questions (whatever they may be)? The idea is also that you at least have some understanding of the question you ask in the first place.

At least thanks for the compliment. Does this sentence also imply an admission that you don't know math yourself?

What about, then, actually trying to learn math yourself?

What about a retraction of your earlier statements like "Gödel was wrong", or "Hilbert was wrong"?

What about solemnly promising you stop with your symmetry delusion? Or with your X\Y complementation crap?
 
(Please bear in mind that, as pointed out, entropy is generally a continuous variable, and multisets are discrete; also bear in mind that to be useful as a definition of "entropy" there should be a set of limiting cases where traditional measures such as H = - sigma (p lg p) and your formulation should get the same answer.)

Well, I'll beat Doron to it, if it must.

View the multiset as a probability distribution. Map, e.g., the multiset [1, 1, 2, 3] to the probability distribution that belongs to drawing from an urn with 4 marbles marked with the numbers 1, 1, 2 and 3 resp. Then apply Shannon's formula.

That gives a result consistent with the OP.
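The urn mapping described above can be sketched in a few lines of Python (a minimal illustration; the helper name is mine):

```python
import math
from collections import Counter

def urn_entropy(multiset):
    """Treat the multiset as an urn of equally likely marbles, derive
    the drawing distribution, and apply Shannon's formula in bits."""
    n = len(multiset)
    counts = Counter(multiset)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(urn_entropy([1, 1, 2, 3]))  # 1.5 (p = 1/2, 1/4, 1/4)
print(urn_entropy([3, 1]))        # 1.0 (uniform over two values)
print(urn_entropy([2, 2]))        # 0.0 (a single repeated value)
```

Note that this measure assigns zero bits to any multiset whose members are all equal, such as [2,2], since drawing from such an urn has only one possible outcome.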
 
My mistake. I meant the sum 4.

That clears up one problem. But you still haven't cleared up exactly what you mean by entropy.

Is entropy a measure of some characteristic of a multi-set? Does it have a numeric value? What is it?

I am talking about Distinction. A set is asymmetric since each member is distinct. This is not the case in a multiset.

{3,1} is a set. {3,3} is a multiset.

In both cases (set or multiset) I am talking about their internal structure, in terms of symmetry.

As drkitten has also pointed out, {3,1} is also a multi-set. Be that as it may, it seems you are using asymmetric for (multi-)set to mean all members are distinct. Is that what you meant by "I am talking about Distinction"?

If so, what about {1,3,1}: is it symmetric or asymmetric? Is {1,3,1} distinct from {1,1,3} or {3,1,1}?
 
Well, I'll beat Doron to it, if it must.

View the multiset as a probability distribution. Map, e.g., the multiset [1, 1, 2, 3] to the probability distribution that belongs to drawing from an urn with 4 marbles marked with the numbers 1, 1, 2 and 3 resp. Then apply Shannon's formula.

That gives a result consistent with the OP.

Well done. Five points to Ravenclaw. (Although I had a different approach in mind in terms of permutation-transformations and topological fixed points, and I haven't taken the time or trouble to verify that it actually produces a useful entropy-like measure.)

Doron, wanna go double or quits on a proof that a "symmetric" multiset maximizes entropy using the definition that ddt has so kindly supplied?
 
That gives a result consistent with the OP.


Not quite. Doron also said this at the tail end of his Notion #1:

As can be seen, Prime numbers have the least entropy, from this point of view.


So, while Claude Shannon would be very proud of how you rationalized doron's handwaving with multi-sets in Notion #1, it is still unclear what sort of entropy doron meant for integers.
 
All sets are multisets, or more formally, a multiset is a generalization of a set.



So you're suggesting that the sets {3,1} and {1,3} are distinct?

No, by set theory {3,1} = {1,3}.

I am talking about the internal structure of distinction, where each member is distinct (order is not important).

This is not the case in a "complete" multiset (for example: {a,a,a,a,a,...}), where there is no distinction.

By this model (continuous or not) a "complete" multiset has maximum entropy and a "complete" set has minimum entropy.

As much as I know, this is a new idea about entropy.
 
Two things jump out at me, doron. First, why did you edit my post you quoted? Admittedly, it was a trivial change - an italics tag pair was dropped - but why would you do it?
[ot]There may be an innocent reason for this. I sometimes copy responses into Notepad for various reasons including instability of my network connection. If I'm copying wysiwyg text (as opposed to marked-up text), then everything is converted to plain text because I'm not copying the tags. Hence, it could be a simple mistake or oversight. Don't attribute to malice what is explainable by the stupid ways computers sometimes work. :)[/ot]
 