
Entropy is counterintuitive to me - help?

jmercer

It's the question of order and chaos.

The universe is ruled by entropy; entropy ultimately results in order in the sense that eventually everything will reach the same high-order low-energy state uniformly, etc. I understand that entropy will also cause highly ordered complex systems to progressively decompose into higher-order, lower-energy systems.

Where I get a bit weirded out is when I think about the universe as it started.

The universe started in a high-energy state that quickly started being affected by entropy and began to drop to a lower-energy state. However, the universe at that time was essentially a chaotic environment, so the only thing entropy directly affected was to reduce the high-energy state of the universe as it began the march toward the ultimate high-order low-energy state some time in the future.

As I look out my window, I instantly perceive complex systems that are high-order and - while not as high-energy as billions of years ago - are still clearly high-energy systems.

Why wouldn't the universe have simply "cooled off" and never formed such complex high-order systems? Sure, initially it may have started in a high-order high-energy state before the "big bang" - but the moment the bang happened, it became a low-order high-energy state. So why would stars and planets form? Why would life form?

Why wouldn't entropy prevent all of these things from happening?

I appreciate anyone's time in helping me understand this. It's bugged me for years. :)
 
The universe is ruled by entropy; entropy ultimately results in order in the sense that eventually everything will reach the same high-order low-energy state uniformly, etc.
This is technically incorrect. Highly concentrated energy is simpler and more orderly; as this energy becomes more evenly spread throughout space, it becomes more disorderly.
 
The universe started in a high-energy state that quickly started being affected by entropy and began to drop to a lower-energy state.

To start off with, entropy is NOT about higher or lower energy. It's about disorder or, more precisely, the number of available microscopic states for a given macroscopic state (often measured in terms of the log of the number of states).

To give a concrete example, imagine an ice cube sitting in a cup of hot water. The ice cube has low entropy: water molecules can wiggle around a little, but since they can't move around freely, the number of possible states is (relatively) small. The water has high entropy, since water molecules are zipping all over the place in lots of different possible states. The number of states available to the entire system is the product of the two.
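
To make that "product of the two" point concrete, here is a trivial sketch of my own (the microstate counts are completely made up): because the combined count is a product, its log - the entropy - simply adds.

Code:
# Toy illustration with invented numbers: if the ice-like subsystem has
# omega_ice microstates and the hot water has omega_water, the combined
# system has omega_ice * omega_water, so the log (the entropy) adds.
from math import log

omega_ice = 10**5        # made-up, small count for the constrained subsystem
omega_water = 10**20     # made-up, much larger count for the hot water

S_ice, S_water = log(omega_ice), log(omega_water)
S_total = log(omega_ice * omega_water)
print(S_total, S_ice + S_water)   # the two agree: entropy is additive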

What happens now, as I'm sure you know, is that the ice will melt. The energy of the ice cube increases, but the energy of the hot water decreases, so there is NO change in the energy of the entire system. The energy doesn't flow because it's going from high energy to low energy (that's often the case, but it doesn't have to be); it flows because the system as a whole is trying to maximize its entropy. The hot water decreases its entropy as it cools, and the ice cube increases its entropy as it melts. But since the TOTAL number of states for the system is the product of the number of states for the hot water and the ice (so the entropies add), this balancing act increases the entropy of the system as a whole (in other words, melting the ice increases entropy more than cooling the hot water decreases it).
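
To put rough numbers on that balancing act, here is a back-of-the-envelope sketch of my own, using the thermodynamic relation dS = Q/T for heat exchanged at roughly constant temperature; the figures are generic textbook values, not anything from this thread.

Code:
# Rough illustration: 10 g of ice at 0 C melts by drawing heat from a large
# reservoir of hot water at 90 C.
L_fusion = 334.0          # J/g, latent heat of fusion of ice
m = 10.0                  # g of ice melted
Q = m * L_fusion          # heat transferred from the water to the ice, in J

T_ice = 273.15            # K: the melting ice stays at 0 C while it melts
T_water = 363.15          # K: the hot water, treated as a big reservoir

dS_ice = +Q / T_ice       # entropy gained by the melting ice   (~ +12.2 J/K)
dS_water = -Q / T_water   # entropy lost by the cooling water   (~  -9.2 J/K)
print(dS_ice + dS_water)  # net change is positive, about +3 J/K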

The relationship between energy, entropy, and temperature is often straightforward (more energy = more entropy = hotter temperature), but not always. For a paramagnet in a magnetic field, for example (so only considering the energy/entropy of the magnetic moments), there's a hard limit to the amount of energy and entropy the system can have (maximum energy occurs when moments anti-align with the field, maximum entropy occurs when spins point in completely random directions).

The temperature of the system, however, can in principle go infinite or even negative (though negative temperatures don't mean what people usually think they mean: they're unstable high-energy states which correspond to what we intuitively think of as hot, not cold). This is because temperature is NOT a measure of the thermal energy or even the entropy of a system, but instead is a measure of how entropy changes as you change energy.

For a paramagnet, if you align the moments antiparallel to the field, you reduce the number of possible states (decrease entropy) but increase energy. That's how you get negative temperatures (positive temperature corresponds to increasing entropy with increasing energy), and that's also obviously unstable: let the magnetic moments go and they'll start pointing with the field, and arrive back in the positive-temperature regime.
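
A quick numerical sketch of that last point (my own toy model: N two-level moments with energy -1 when aligned with the field and +1 when anti-aligned, k_B = 1), reading the temperature off 1/T = dS/dE:

Code:
# Toy two-level paramagnet: entropy is the log of the number of microstates,
# and 1/T = dS/dE is estimated with a finite difference.
from math import comb, log

N = 1000
E_and_S = []
for n_anti in range(N + 1):               # number of anti-aligned moments
    E = -(N - n_anti) + n_anti            # total energy, running from -N to +N
    S = log(comb(N, n_anti))              # entropy = log(multiplicity)
    E_and_S.append((E, S))

for i in (50, N // 2, N - 50):            # low, middle, and high energy
    (E_lo, S_lo), (E_hi, S_hi) = E_and_S[i - 1], E_and_S[i + 1]
    beta = (S_hi - S_lo) / (E_hi - E_lo)  # 1/T
    print(f"E = {E_and_S[i][0]:+5d}   1/T = {beta:+.3f}")

# 1/T is positive below E = 0, passes through zero there (T -> infinity), and
# is negative above E = 0 - the unstable negative-temperature regime.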
 
The popular literature is littered with articles, papers, books, and various & sundry other sources, filled to overflowing with prosaic explanations of entropy. But it should be remembered that entropy, an idea born from classical thermodynamics, is a quantitative entity, and not a qualitative one. That means that entropy is not something that is fundamentally intuitive, but something that is fundamentally defined via an equation, via mathematics applied to physics. Remember in your various travails, that entropy is what the equations define it to be. There is no such thing as an "entropy", without an equation that defines it.
http://www.tim-thompson.com/entropy1.html
 
Why wouldn't the universe have simply "cooled off" and never formed such complex high-order systems?
It did. We call it the CMB - the Cosmic Microwave Background.

However, there were apparently some minor fluctuations in the early Big Bang. This is still considered a mystery in science, though there are several explanations put forth for the phenomenon. I know quantum vacuum fluctuations have been proposed and I would imagine that fluctuation theory has been proposed also (minor reduction of entropy on small, temporary scales).

In a sense, these fluctuations contained a great deal of negative entropy, and separated the galaxies from the CMB, so to speak. And here we are.
 
To all (especially Ziggy): Thanks! You've cleared up a lot of misconceptions on my part. :) I've read explanations of entropy before, but gotten confused; now I've got it, I think. :)

So... let me "clarify and confirm" my new understanding.

Entropy isn't about energy, it's about order and disorder. Entropy is the process of a more ordered state transitioning into a less ordered state. Entropy tends to a maximum in any closed system. (Can't have outside influences involved or it all goes out the window, right?) And at some point, maximum entropy will have been achieved and the closed system will become static in terms of entropy.

The universe is a closed system (as far as we know), and so it is, like all closed systems, in the process of maximizing its entropy. Because the universe started out very small, hot and highly ordered, when the event that started expansion occurred, the universe began to cool down. This is the rough equivalent of Ziggy's example of hot water cooling down, with the expansion of the universe acting as the "ice cube".

The cooling process has created things like nebulae, dark matter, stars, planets, etc., so entropy - while happening continually - is really incidental to these items, even though they aren't immune to the process of entropy.

The reason we can show examples of ordered systems in today's universe is because entropy hasn't yet been maximized, and these systems aren't really closed: they can receive input from other systems. (E.g., plants get sunlight and use the energy to support order in their systems, temporarily staving off entropy.)

Is this a reasonable (albeit highly simplified) summary?
 
Yes. Think of water behind a dam - you can use the water to do work because the level behind the dam and the level in front of the dam are unequal. As the water flows from one side to the other, the flow can be put to use - at least, until the levels are equal.
 
Ah - so essentially, entropy can be viewed as an observable effect due to energy equalization throughout a given system?
 
It's about the availability of energy in a system.

No. The definition of entropy (and there is a hard mathematical definition) comes from the number of available microscopic states (which corresponds to disorder), NOT from the energy of the system. Energy does not enter directly into the definition of entropy at all. In situations where high energy = high entropy = high temperature (usually the case but not always), you can get away with your pseudo-definition, but it's still not correct. For example, in the previous example I gave of a paramagnet in a magnetic field, there are three possible macroscopic states:

1) all spins parallel to the field
2) all spins randomly oriented
3) all spins antiparallel to the field

If you consider the entropy of these states, you will find that (1) and (3) are both low-entropy states, while (2) is a high-entropy state. If you consider the energy of each state, you will find that (1) has low energy, (2) has intermediate energy, and (3) has high energy. If you consider the temperature of each state, you will find that (1) has low positive temperature, (2) has high temperature, and (3) has low negative temperature. Not very intuitive, and probably not the answers you'd guess from using your rule of thumb.

It doesn't work to try to treat entropy as simply the amount of available energy, because it's not. The amount of available energy has a lot to do with how much you can CHANGE entropy (both (1) and (3) may have equally low entropy, but you can increase the entropy of a heat bath by putting (3) into contact with it, while you cannot do the same with (1)), but that's not quite the same thing, and the difference can matter. Again, entropy is defined in terms of the number of microscopic states available to the system. This is often dependent on energy, but not always in the same way, and it's not part of the definition itself.
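
If it helps, the counting behind (1)-(3) is very short. A toy sketch of my own (N spins, each either with or against the field, k_B = 1, energy -1 per aligned spin and +1 per anti-aligned spin):

Code:
# Count microstates, entropy, and energy for the three macroscopic states.
from math import comb, log

N = 100

def S(n_up):
    return log(comb(N, n_up))        # entropy = log(number of arrangements)

def E(n_up):
    return -n_up + (N - n_up)        # -1 per aligned spin, +1 per anti-aligned

for label, n_up in [("(1) all parallel    ", N),
                    ("(2) random, half up ", N // 2),
                    ("(3) all antiparallel", 0)]:
    print(label, "  S =", round(S(n_up), 1), "  E =", E(n_up))

# (1) and (3) both have S = 0 (only one arrangement each) but opposite
# energies; (2) has by far the largest entropy at intermediate (zero) energy.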
 
Ah - so essentially, entropy can be viewed as an observable effect due to energy equalization throughout a given system?
No, that's not a good way to think of it either. I keep coming back to the example of a paramagnet, because it's a simple system in which many of our intuitive ideas about this stuff break down. "Energy equalization" is one of them.

Consider a paramagnet in a field, placed in thermal contact with an ideal gas. The two systems equilibrate to some equal temperature: the gas molecules are bouncing around with some average energy per molecule, and the spins are partly aligned in the field but wiggling slightly away from parallel, each of them also having some average energy. Now, are the average energies of each molecule and each spin the same? No, generally speaking, they are not. And that's rather easy to figure out. First, consider the paramagnet: the maximum energy for each spin occurs when it anti-aligns with the field. There is, therefore, an absolute limit to the energy you can input into the system. For an ideal gas, the thermal energy of each molecule turns out to be essentially linear with temperature. That means you can easily increase the temperature of the combined system until the average energy of the gas molecules exceeds the maximum energy of the spins in the paramagnet. At that point, further heating basically dumps energy into the gas, and not the paramagnet. The two systems will, however, maintain thermal equilibrium: the temperature of the paramagnet can keep increasing without limit, but the energy it absorbs in the process drops towards zero (its heat capacity drops towards zero).

So it's not about energy equilibration. That, in fact, is why heat capacities for different systems are NOT the same: it's not about equilibrating energy, but temperature. Temperature isn't simply energy (as we saw in the negative-temperature case); its definition comes from the energy-derivative of entropy (that is, how entropy changes with energy). Equilibrating the temperatures of two systems ensures that you've maximized the entropy of the combined systems; there's no requirement that energy be equilibrated between them.
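
Here is a small sketch of the heat-capacity point (my own illustration, using the standard two-level "Schottky" form for the paramagnet with k_B = 1 and level splitting 1, versus the constant 3/2 k_B per atom of a monatomic ideal gas):

Code:
# Heat capacity per particle versus temperature: the paramagnet's falls
# towards zero at high T, while the ideal gas's stays at 3/2.
import math

def c_paramagnet(T):
    x = 1.0 / T                                   # level splitting / (k_B * T)
    return x * x * math.exp(x) / (1.0 + math.exp(x)) ** 2

def c_ideal_gas(T):
    return 1.5                                    # 3/2 k_B per atom, any T

for T in (0.5, 1.0, 2.0, 5.0, 20.0, 100.0):
    print(f"T = {T:6.1f}   paramagnet C = {c_paramagnet(T):.4f}   gas C = {c_ideal_gas(T):.2f}")

# At ever higher shared temperatures, almost all added heat ends up in the
# gas, because the paramagnet can only hold a finite amount of energy.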
 
The definition of entropy (and there is a hard mathematical definition) comes from the number of available microscopic states (which corresponds to disorder)
There are many ways to define entropy mathematically, depending on whether one is defining it under classical thermodynamics, statistical mechanics, chemistry, or information theory.

NOT from the energy of the system. Energy does not enter directly into the definition of entropy at all.
Let's look at Clausius's definition from classical thermodynamics:

S = Q/T

Where S is the entropy, Q is the heat content of the system, and T is the temperature of the system.

And we can look at how (or whether) "energy" enters into this definition.
 
*Sits VERY still and reads the ongoing discussion in hopes of finally understanding entropy*
 
There are many ways to define entropy mathematically, depending on whether one is defining it under classical thermodynamics, statistical mechanics, chemistry, or information theory.

Stat Mech and information theory definitions are basically equivalent. As for chemistry, I'm not aware that they have a separate way to rigorously define entropy - they may talk about it from a slightly different perspective (especially when it comes to how to measure entropy experimentally, since there's plenty of ways to do that), but that's not the same thing as having its own definition.

Let's look at Clausius's definition from classical thermodynamics:

S = Q/T

Where S is the entropy, Q is the heat content of the system, and T is the temperature of the system.

And we can look at how (or whether) "energy" enters into this definition.

First off, if this is how you define entropy, then you're treating temperature as a more fundamental quantity, and the question just gets pushed back a step. And while it might be OK for 19th century scientists to define temperature in terms of ideal gas laws or somesuch, that doesn't constitute a rigorous definition. As far as I'm aware, the only way to rigorously define temperature on a first-principles basis is in terms of entropy, not the other way around.

And second, that equation is properly written as dS = dQ/T. And the distinction matters: a paramagnet in a magnetic field can only absorb a finite amount of energy. As you increase temperature towards infinity, then, using your form of the equation, entropy will have to approach zero. That clearly doesn't make any sense. If you use the differential form, that problem disappears: when T approaches infinity, the differential goes to zero, but you get S from the integral (where T now properly depends on Q, so your integral includes heat energy added when T is small), and S saturates at some fixed finite value, as expected.

Working from the differential form, we also see that 1/T = dS/dQ, which means that this formula is just as good at defining temperature in terms of entropy as it is at defining entropy in terms of temperature. And that's really the preferable way to do it, since again, I don't know of any fundamental way to define temperature without reference to entropy, but the reverse is easy.
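
As a quick numerical check of the saturation claim (my own sketch, the same sort of toy two-level paramagnet as before, with k_B = 1 and level splitting 1): integrating dS = dQ/T while stepping the temperature up gives an entropy that levels off near N ln 2, whereas the naive S = Q/T keeps shrinking as T grows.

Code:
# Integrate dS = dQ/T for N two-level moments (levels 0 and 1) as the
# temperature is stepped up from near zero to T = 200, and compare the
# result with N*ln(2).
import math

N = 1000

def energy(T):
    # Mean energy of N two-level moments at temperature T
    return N / (math.exp(1.0 / T) + 1.0)

S = 0.0
T_prev = 0.05      # below this the toy model holds negligible energy/entropy
T = T_prev
while T < 200.0:
    T += 0.001                                                # small step in T
    S += (energy(T) - energy(T_prev)) / (0.5 * (T + T_prev))  # dQ / T
    T_prev = T

print(S, N * math.log(2))      # both come out close to 693: S saturates
print(energy(200.0) / 200.0)   # naive S = Q/T instead keeps shrinking (~2.5)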
 
Stat Mech and information theory definitions are basically equivalent.
As long as we're talking about entropy as Shannon used it, rather than the slightly different way Kolmogorov used it.

As for chemistry, I'm not aware that they have a separate way to rigorously define entropy - they may talk about it from a slightly different perspective
Yes. They use different terms (enthalpy and free energy), but the math works out the same.

First off, if this is how you define entropy, then you're treating temperature as a more fundamental quantity, and the question just gets pushed back a step.
I agree. It is assumed that the system is in thermodynamic equilibrium, so temperature is treated as a constant.

I don't know of any fundamental way to define temperature without reference to entropy, but the reverse is easy.
The challenge here is to define entropy in a way jmercer can understand. I would prefer to do that using thermodynamics (or at least start there and discuss the way the term evolved from there) while you appear to favor starting right in with statistical mechanics. Part of the problem seems to arise from the plethora of ways the term is used, from the rigorous and mathematically defined to the broad and metaphorical. Some of the latter applications of the term clearly qualify as "abuse". Though entropy is rigorously defined under both thermodynamics and statistical mechanics, I propose that "order" and "disorder" remain subjective under either (Shannon information theory enjoys the luxury of having this well defined through an implicit assumption: that a message in a communications channel was at 100% of its information capacity at the source).
 
I agree. It is assumed that the system is in thermodynamic equilibrium, so temperature is treated as a constant.
...
The challenge here is to define entropy in a way jmercer can understand.

OK. But even if we're just trying to get jmercer a definition he can think about and aren't worried about whether it's rigorous or not, I'm not sure a classical thermo approach is really any better than a stat-mech approach. His original question was about the evolution of the universe, and that's been a very non-equilibrium process, with temperature change a rather important component in all of that.
 
