
Deeper than primes

If we use partitions in order to define Entropy, then a multiset (a repetition of the same identity) has an entropy that is equivalent to the number of repetitions that exist within it.

Ok, please do so: Define entropy in terms of partitions. We'll wait.

Since a set has no repetitions, it has no entropy.

Minor nit-pick: "Zero" and "no" are not precise synonyms.

...<snip>...
As can be seen, Prime numbers have the least entropy, from this point of view.

Wow! You've only just started this post, and already you are using the term, entropy, differently than first implied. I suppose that is the best thing about never defining anything. You are free to misuse your terms any way that pleases you.

Here, you began with entropy being a property of a multi-set having something to do with element repetitions; now, it has somehow become a property of numbers (with no hint of how you got there).


By the way: A multi-set is not a repetition of the same identity. Please try to get at least something right.
 
Let us look at your first example (note that I have replaced the { and } with [ and ] so we are not confused with the standard notation for sets):
How is [1,1] found from the number 2?
How did you determine that [1,1] has a maximum entropy given that entropy is only a "notion" and you have no definition for it?
What is the numeric value that you determined for the entropy of [1,1]?

ETA: Just had a thought: Maybe you mean the definition of partitions in number theory? But these are not sets of numbers but the different ways that a positive integer can be written as the sum of positive integers, and they include the actual integer itself, e.g. for 4:

4, 3 + 1, 2 + 2, 2 + 1 + 1, 1 + 1 + 1 + 1

Notice the + signs between the integers.
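
A minimal sketch of that definition, in case it helps (my own illustration, not part of the original post; the function name and the choice of 4 are arbitrary):

[code]
def partitions(n, max_part=None):
    # Yield the number-theoretic partitions of n as non-increasing tuples.
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

for p in partitions(4):
    print(" + ".join(str(part) for part in p))
# prints each partition on its own line: 4, 3 + 1, 2 + 2, 2 + 1 + 1, 1 + 1 + 1 + 1
[/code]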

I do hope that you are not redefining a standard definition in mathematics once again.

Actually, there is no problem with using "{" and "}", because I am looking for the relations between sets and multisets as the framework for research on Entropy.

By using Symmetry I find Entropy as a state that is measured by its invariance under exchange, for example:

Using cardinality 4, the multisets {1,1,1,1} or {2,2} are invariant under exchange of their members, and in this case we have maximum entropy.

The entropy of {1,1,1,1} is greater than that of {2,2} because more members are invariant under exchange.

On the contrary, {3,1} has no entropy because it is variant (and asymmetric) under exchange.
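
One concrete reading of "more members are invariant under exchange" (my reading, not necessarily the poster's) is to count the pairwise swaps that leave the multiset unchanged. A quick sketch:

[code]
from itertools import combinations

def invariant_swaps(ms):
    # Count the pairs of positions whose swap leaves the multiset unchanged,
    # i.e. pairs of equal members.
    return sum(1 for i, j in combinations(range(len(ms)), 2) if ms[i] == ms[j])

for ms in [(1, 1, 1, 1), (2, 2), (3, 1)]:
    print(ms, invariant_swaps(ms))
# (1, 1, 1, 1) -> 6, (2, 2) -> 1, (3, 1) -> 0
[/code]

On that count the claimed ordering does come out as 0 < 1 < 6, but it is a count of symmetries, not entropy in any standard sense.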

In my opinion, by using this preliminary idea, we can unify Thermodynamics and Information theory under one method based on Symmetry as the common measurement tool.

A lot of work has to be done in order to develop this idea, so before any further effort in this direction, I wish to know what you think about this preliminary idea.


jsfisher said:
Wow! You've only just started this post, and already you are using the term, entropy, differently than first implied. I suppose that is the best thing about never defining anything. You are free to misuse your terms any way that pleases you.

When we are talking about partitions of natural numbers, we have to use notation like (1,1,1,1) or (2,2) instead of {1,1,1,1} or {2,2}, but in both cases Symmetry is used as invariance under exchange of elements.
 
By using Symmetry I find Entropy as a state that is measured by its invariance under exchange, for example:

Using cardinality 4, the multisets {1,1,1,1} or {2,2} are invariant under exchange of their members, and in this case we have maximum entropy.

The entropy of {1,1,1,1} is greater than that of {2,2} because more members are invariant under exchange.

The astonishing thing is that I think this could be worked and formalized into an insightful definition of structural information. Of course, I also suspect that Our Favorite Nutcase has neither the ability nor the interest to actually formalize it, and I suspect that any attempt to so formalize it will meet with her active hostility.

I know that I'm not going to lift a pen to work it out, certainly.

A lot of work has to be done in order to develop this idea, so before any further effort in this direction, I wish to know what you think about this preliminary idea.

I think in your hands it's gibberish.
 
In my opinion, by using this preliminary idea, we can unify Thermodynamics and information theory under one method based on Symmetry as the common measurement tool.

A lot of work has to be done in order to develop this idea, so before any further effort in this direction, I wish to know what you think about this preliminary idea.

We[1] think it's so bad it's not even wrong.

[1] and I think I can speak for many people here on this matter.
 
The astonishing thing is that I think this could be worked and formalized into an insightful definition of structural information.
Go for it. You can do it in another thread; I promise not to interrupt you. I believe that it is a very good exercise for me just to watch how other people formalize this preliminary idea.
 
Go for it. You can do it in another thread;

I already have done so in another thread, and I see no reason to try it again.

Mathematics is not a spectator sport. If you want to learn how to do mathematics, do mathematics -- and then LISTEN to the corrections you get.

I will, however, at least give you one strong hint.

You use the word "entropy" a lot, and you are claiming, for example, that {3,1} < {2,2} < {1,1,1,1}.

Prove it. Define "entropy" (the entropy of a (multi)set S is given by the following expression : [arglebargle]) and show the claim above to be numerically true.
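
For what it is worth, here is one candidate expression (my suggestion only; nothing of the sort has been offered in this thread): read a partition of n as a probability distribution with p_i = part_i / n and take its Shannon entropy. Under that particular reading the claimed ordering does come out numerically:

[code]
from math import log2

def partition_entropy(parts):
    # Shannon entropy (in bits) of a partition of n,
    # read as the distribution p_i = part_i / n.
    n = sum(parts)
    return -sum((p / n) * log2(p / n) for p in parts)

for parts in [(3, 1), (2, 2), (1, 1, 1, 1)]:
    print(parts, round(partition_entropy(parts), 3))
# (3, 1) -> 0.811, (2, 2) -> 1.0, (1, 1, 1, 1) -> 2.0
[/code]

Whether that is what is meant is anyone's guess; without some definition of this kind the inequality is not a mathematical claim at all.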
 
Actually, there is no problem with using "{" and "}", because I am looking for the relations between sets and multisets as the framework for research on Entropy.
We will add this to the OT as Notion #3.

By using Symmetry I find Entropy as a state that is measured by its invariance under exchange, for example:

Using cardinality 4, the multisets {1,1,1,1} or {2,2} are invariant under exchange of their members, and in this case we have maximum entropy.

The entropy of {1,1,1,1} is greater than that of {2,2} because more members are invariant under exchange.

On the contrary, {3,1} has no entropy because it is variant (and asymmetric) under exchange.
Then we may as well just use the words symmetric and asymmetric. It has nothing to do with entropy.

In my opinion, by using this preliminary idea, we can unify Thermodynamics and information theory under one method based on Symmetry as the common measurement tool.

A lot of work has to be done in order to develop this idea, so before any further effort in this direction, I wish to know what you think about this preliminary idea.
No, we cannot, since thermodynamics is a lot more than entropy.

That is why I think that they have the minimum Entropy. In this case (when we are talking about partitions of natural numbers, we have to use notation like (1,1,1,1) or (2,2) instead of {1,1,1,1} or {2,2}, but in both cases Symmetry is used as invariance under exchange of elements).
This is just symmetry again - nothing to do with entropy.

You should really learn what entropy actually is, and the thermodynamic and informational definitions of it. It is not a measure of symmetry.
Theoretical relationship
Despite all that, there is an important difference between the two quantities. The information entropy H can be calculated for any probability distribution (if the "message" is taken to be that the event i which had probability p_i occurred, out of the space of the events possible). But the thermodynamic entropy S refers to thermodynamic probabilities p_i specifically.
Furthermore, the thermodynamic entropy S is dominated by different arrangements of the system, and in particular its energy, that are possible on a molecular scale. In comparison, information entropy of any macroscopic event is so small as to be completely irrelevant.
However, a connection can be made between the two, if the probabilities in question are the thermodynamic probabilities p_i: the (reduced) Gibbs entropy σ can then be seen as simply the amount of Shannon information needed to define the detailed microscopic state of the system, given its macroscopic description. Or, in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more". To be more concrete, in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.
Furthermore, the prescription to find the equilibrium distributions of statistical mechanics, such as the Boltzmann distribution, by maximising the Gibbs entropy subject to appropriate constraints (the Gibbs algorithm), can now be seen as something not unique to thermodynamics, but as a principle of general relevance in all sorts of statistical inference, if it is desired to find a maximally uninformative probability distribution, subject to certain constraints on the behaviour of its averages.
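
To make the quoted relationship concrete, a small sketch (my own, assuming the same probabilities p_i are used in both formulas) showing that the reduced Gibbs entropy is just ln 2 times the Shannon entropy in bits:

[code]
from math import log, log2

p = [0.5, 0.25, 0.25]  # an arbitrary example distribution

H = -sum(q * log2(q) for q in p)     # Shannon entropy, in bits
sigma = -sum(q * log(q) for q in p)  # reduced Gibbs entropy, sigma = S / k_B

print(H, sigma, sigma / H)  # sigma / H == ln(2), about 0.693
[/code]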
 
I already have done so in another thread, and I see no reason to try it again.

Mathematics is not a spectator sport. If you want to learn how to do mathematics, do mathematics -- and then LISTEN to the corrections you get.

I will, however, at least give you one strong hint.

You use the word "entropy" a lot, and you are claiming, for example, that {3,1} < {2,2} < {1,1,1,1}.

Prove it. Define "entropy" (the entropy of a (multi)set S is given by the following expression : [arglebargle]) and show the claim above to be numerically true.

Let us think about my idea like this:

I have a melody but I don't know how to write the notes for some group of musicians.

I go to some musician who knows how to do it. He will do it if he thinks that this melody has the potential to become a nice piece of music.

If you think that my melody (preliminary idea) can be edited (can be addressed formally) in order to get a nice piece of music (some interesting mathematical work), then please go for it.
 
Let us think about my idea like this:

I have a melody but I don't know how to write the notes for some group of musicians.

No, let's not. You don't have a melody. You have a random collection of notes.


If you think that my melody (preliminary idea) can be edited (can be addressed formally) in order to get a nice piece of music (some interesting mathematical work), then please go for it.

No, thanks. As I pointed out, I tried that already in your previous thread, and got roundly abused for my suggestions.
 
No, let's not. You don't have a melody. You have a random collection of notes.
Maybe.

But you wrote this:

drkitten said:
The astonishing thing is that I think this could be worked and formalized into an insightful definition of structural information.
So there is something in your mind; please go for it.
drkitten said:
No, thanks. As I pointed out, I tried that already in your previous thread, and got roundly abused for my suggestions.
Forget about me, open your own thread. I believe that ddt, jsfisher, PixMisa, Nathan, and more people who really know Mathematics will be glad to help you.
 
This is the whole (new) idea: to get Entropy in terms of Symmetry.

I told you that you don't have a melody, just a random collection of notes.

This basically proves it. You're trying to "get" an undefined term in terms of another undefined term. Based on this, I think I can "get" Margstrap in term of Bletcherality.

(But it must be important, because I capitalized both Margstrap and Bletcherality, right?)
 
jsfisher said:
By the way: A multi-set is not a repetition of the same identity. Please try to get at least something right.

A=A (self id), 1=1 (self id)

{A,A,A}, {1,1,1,1}
 
Forget about me, open your own thread. I believe that ddt, jsfisher, PixMisa, Nathan, and more people who really know Mathematics will be glad to help you.

I neither need nor want their help; I don't think that the definition you propose will be useful enough to justify spending my effort on it. And I think it will be completely unable to help with your locality/nonlocality theology that you keep trying to push.

I do, however, find it amusing that you finally get one sentence (in what, 1500 posts?) that MIGHT be meaningful, and you refuse to explore the meaning further.

If you think your OP was meaningful, define your terms. Prove me wrong in my belief that you have NOTHING of interest to offer.....
 
I neither need nor want their help; I don't think that the definition you propose will be useful enough to justify spending my effort on it. And I think it will be completely unable to help with your locality/nonlocality theology that you keep trying to push.

I do, however, find it amusing that you finally get one sentence (in what, 1500 posts?) that MIGHT be meaningful, and you refuse to explore the meaning further.

If you think your OP was meaningful, define your terms. Prove me wrong in my belief that you have NOTHING of interest to offer.....
You wrote "The astonishing thing is that I think this could be worked and formalized into an insightful definition of structural information." at the beginning of http://www.internationalskeptics.com/forums/showpost.php?p=4088292&postcount=44 .

Don't you think that it would be a good idea to formalize what you (again, forget about me or my ideas) think is an insightful definition of structural information?
 
You wrote "The astonishing thing is that I think this could be worked and formalized into an insightful definition of structural information." at the beginning of http://www.internationalskeptics.com/forums/showpost.php?p=4088292&postcount=44 .

Don't you think that it would be a good idea to formalize what you (again, forget about me or my ideas) think is an insightful definition of structural information?

No, I don't, because I think the insights are already available through other channels. (Thank you, Claude Shannon.)

What is astonishing is the fact that you wrote it, since I think it's the first mathematical observation you've made in 1500 posts that might be meaningful.

If you define your terms properly, which you will not.
 
This is the whole (new) idea, to get Entropy in terms of Symmetry.
Then the whole (new) idea is wrong.
Entropy is a continuous numeric value (a real number).
Measures of Symmetry are discrete numeric values (integers) unless you are talking about objects that have infinite symmetry, e.g. a sphere. This is certainly not the case here. The "partitions" that you talk about do not have infinite symmetry.
Thus Entropy cannot be defined in terms of Symmetry.
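
To illustrate the point: a natural symmetry measure for a multiset is the number of permutations of its members that leave it unchanged, which is always an integer, whereas entropy takes real values. A sketch (mine; the symmetry count here is the product of the factorials of the multiplicities):

[code]
from collections import Counter
from math import factorial, prod

def symmetry_count(ms):
    # Number of permutations of the members that leave the multiset unchanged.
    return prod(factorial(m) for m in Counter(ms).values())

for ms in [(1, 1, 1, 1), (2, 2), (3, 1)]:
    print(ms, symmetry_count(ms))
# (1, 1, 1, 1) -> 24, (2, 2) -> 2, (3, 1) -> 1: always integers
[/code]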
 
No, I don't, because I think the insights are already available through other channels. (Thank you, Claude Shannon.)

What is astonishing is the fact that you wrote it, since I think it's the first mathematical observation you've made in 1500 posts that might be meaningful.

If you define your terms properly, which you will not.

Please show where Shannon defines Symmetry as a common framework for Entropy research in Thermodynamics and Information theory?

Symmetry is defined as invariance under manipulation, where Entropy is the most symmetrical state (no new information can be found under any manipulation, and no work can be done, in terms of thermodynamic equilibrium).
 