
Entropy and entropy

Thabiguy

There is an interesting thread here:
http://www.physicsforums.com/showthread.php?t=361792

The question is whether there is a difference in mass between empty and full memory. There seems to be a consensus that 0s and 1s can have different energies, and that this can cause a mass difference, but no consensus on whether the entropy of the data is a factor.

It'd be interesting to hear what the local science gurus have to say about it. :) So what do you think about the issues raised in that thread?

Can you weigh information? Are there two entropies (physical and information-theoretical), or only one? Can entropy of binary data be measured, or is it all in the eyes of the beholder? Is physical entropy observer dependent? What do the laws of physics say about write operations? Is there a minimum energy change? Can the system return to the same state? Does the mass of a hard disk depend on how easily one can describe its contents? Is a full disk less ordered? ...

So many questions, and so little agreement...!
 
Here's my uninformed, non-mathematical take on it:

It shouldn't make any difference. Entropy is a measure of disorder, which essentially takes into account the variety of different states a system can be in (more than happy to have better brains than mine correct me). Even though energy affects systems and increases entropy, the universe is a closed system, with no energy in or out, and yet its entropy increases. So why would the entropy of a small open system suddenly bear any relationship to its mass?

A sloppy response, I know. There certainly could be a reason. But my first thoughts on the matter say I wouldn't think there is a dependent relationship between entropy and mass.

Athon
 
The question is whether there is a difference in mass between empty and full memory. There seems to be a consensus that 0s and 1s can have different energies, and that this can cause a mass difference, but no consensus on whether the entropy of the data is a factor.

Entropy, in the statistical mechanics sense, is most definitely NOT a factor here. The entropy of any given memory configuration (counting only the 0s and 1s) is zero, because the number of accessible states is 1. The memory configuration can't change on its own.
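Spelling that out in Boltzmann's formula (my gloss, with Ω the number of accessible microstates):

```latex
S = k_B \ln \Omega = k_B \ln 1 = 0
```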

Can you weigh information?

No.

Are there two entropies (physical and information-theoretical), or only one?

One can define entropy in different ways, and those different definitions need not be equivalent. Under the stat mech definition, every memory configuration has the same entropy: zero.

Can entropy of binary data be measured

Yes. But how you measure it depends on your definition.

What do the laws of physics say about write operations? Is there a minimum energy change?

Not exactly. From quantum mechanics, there is a relationship between the energies involved and the timescales needed to perform the operation: if you want to do an operation within a certain time period, then there's a minimum energy needed. If you're willing to wait longer, you can do it with less energy. We're not close to this quantum limit, however.
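For the curious, one standard form of this quantum speed limit is the Margolus-Levitin bound (my addition, not something stated in the thread; E is the average energy above the ground state, τ the time for one operation):

```latex
\tau \;\ge\; \frac{\pi \hbar}{2E}
\qquad\Longleftrightarrow\qquad
E \;\ge\; \frac{\pi \hbar}{2\tau}
```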
 
I agree with Zig that the weight of the hard drive need not depend on its state. On the other hand it certainly could, not because of its information content but simply because some configurations are more energetic than others (that's probably the case in reality; I would guess the hard drive's magnetic medium likes to be aligned, so whatever logical string that corresponds to would weigh the least).

But as for write operations, I don't entirely agree. To write, one must erase information, thereby reducing the entropy, and by the laws of thermodynamics that does cost energy. If the hard drive is in contact with a heat bath, the energy cost to erase one bit is kT ln 2. It should be the same to write one bit. That cost is unavoidable so long as the temperature is held fixed, regardless of how slowly the operation is carried out.
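To put a rough number on kT ln 2, here's a quick back-of-the-envelope sketch (assuming room temperature, T = 300 K):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact since the 2019 SI)
T = 300.0           # assumed room temperature in kelvin

# Landauer limit: minimum heat dissipated to erase one bit at temperature T
E_bit = k_B * T * math.log(2)
print(f"kT ln 2 at {T:.0f} K = {E_bit:.3e} J")  # about 2.87e-21 J
```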
 
But as for write operations, I don't entirely agree. To write, one must erase information, thereby reducing the entropy, and by the laws of thermodynamics that does cost energy. If the hard drive is in contact with a heat bath, the energy cost to erase one bit is kT ln 2. It should be the same to write one bit. That cost is unavoidable so long as the temperature is held fixed, regardless of how slowly the operation is carried out.

That's one of the controversial issues. Is the entropy reduction real (observable to others), or is it just a way of saying that the writer has lost one bit of uncertainty, and the energy cost corresponds to the writer's compensatory increase in entropy, as dictated by the second law of thermodynamics? Would reading the bit also incur such energy cost?

ETA: Let's say I read the bit, and then, based on its value, decide to either leave it be or flip it - does that erase information, considering that I can simply do the same operation again to restore the original value?
 
But as for write operations, I don't entirely agree. To write, one must erase information

Generally speaking, yes, although there are exceptions. In particular, if you swap information between two different storage locations, in principle you don't need to erase any information when you write (though in practice, all our storage formats require an erase in order to write).

If the hard drive is in contact with a heat bath, the energy cost to erase one bit is kT ln 2.

Yes, but as with the time scale limit, there is (in principle) no lower limit to this.
 
That's one of the controversial issues.

It shouldn't be.

Is the entropy reduction real (observable to others), or is it just a way of saying that the writer has lost one bit of uncertainty, and the energy cost corresponds to the writer's compensatory increase in entropy, as dictated by the second law of thermodynamics?

No, it really costs energy. At some point in the not-so-distant future this may become a practical consideration, at which time we will switch to reversible computing (a form of computing where little or no erasing is necessary).
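As an illustration of what "reversible" means here, a toy sketch (my example, not anyone's actual hardware) of the Toffoli gate, a universal reversible logic gate: it's a bijection on 3-bit states and is its own inverse, so applying it never destroys information.

```python
def toffoli(a, b, c):
    """Controlled-controlled-NOT: flip c only when a and b are both 1."""
    return a, b, c ^ (a & b)

# The gate is its own inverse, so no information is ever lost:
for state in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
    assert toffoli(*toffoli(*state)) == state
```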

Would reading the bit also incur such energy cost?

No.

ETA: Let's say I read the bit, and then, based on its value, decide to either leave it be or flip it - does that erase information, considering that I can simply do the same operation again to restore the original value?

No. It's only irreversible processes (erasing, or formatting) that are subject to this energy cost.

Generally speaking, yes, although there are exceptions. In particular, if you swap information between two different storage locations, in principle you don't need to erase any information when you write (though in practice, all our storage formats require an erase in order to write).

Agreed.

Yes, but as with the time scale limit, there is (in principle) no lower limit to this.

Also agreed.
 
No. It's only irreversible processes (erasing, or formatting) that are subject to this energy cost.

Okay, so if I get that right, setting a bit to 1 costs energy. But reading the bit, checking its value, and flipping it if it's 0 or doing nothing if it's 1 doesn't cost energy.

What am I missing?
 
Okay, so if I get that right, setting a bit to 1 costs energy. But reading the bit, checking its value, and flipping it if it's 0 or doing nothing if it's 1 doesn't cost energy.

What am I missing?

"flipping it if it's 0" involves erasing it, unless you do it very carefully (essentially, you need to have an existing pool of ones, one of which gets swapped for the zero and preserved).
 
Here's a different question, that may make the issue clearer. We have a memory full of data, and it receives a single command: erase everything. So it does that, spending the necessary energy from its internal reserves. This means an increase in entropy, right? So does the entropy of the memory as a whole drop, rise, or not change at all? If erasing data means reducing entropy, where do we see the reduction of entropy?
 
Here's a different question, that may make the issue clearer. We have a memory full of data, and it receives a single command: erase everything. So it does that, spending the necessary energy from its internal reserves. This means an increase in entropy, right? So does the entropy of the memory as a whole drop, rise, or not change at all? If erasing data means reducing entropy, where do we see the reduction of entropy?

The entropy of the system as a whole includes both the (information-theoretic) entropy of the memory contents and the entropy of the energy source used to power it.

Erasing the memory costs energy, which ends up as heat, which increases the entropy of the power system.

So the information-theoretic entropy goes down while the thermodynamic entropy goes up, in the same way that the thermodynamic entropy goes up when you run down a battery.
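The bookkeeping for erasing N bits in the ideal, Landauer-limited case (my summary of the above):

```latex
\Delta S_{\text{memory}} = -N k_B \ln 2, \qquad
\Delta S_{\text{bath}} \;\ge\; \frac{Q}{T} = +N k_B \ln 2, \qquad
\Delta S_{\text{total}} \;\ge\; 0
```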
 
Okay, so if I get that right, setting a bit to 1 costs energy. But reading the bit, checking its value, and flipping it if it's 0 or doing nothing if it's 1 doesn't cost energy.

What am I missing?

What you describe erases the previous value. The state of the system after your operation does not contain information about the state of the system before your operation, so that information was erased.
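Written out as maps (my rendering of the point), the conditional flip sends both inputs to the same output, so no inverse exists; a plain NOT, by contrast, is a bijection:

```python
conditional_flip = {0: 1, 1: 1}  # "flip if 0, leave if 1": both inputs -> 1
plain_not        = {0: 1, 1: 0}  # bijective: every output has one preimage

# From the conditional flip's output alone you can't recover the input,
# so one bit of information about the past has been destroyed.
assert len(set(conditional_flip.values())) < len(conditional_flip)
assert len(set(plain_not.values()))        == len(plain_not)
```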
 
The entropy of the system as a whole includes both the (information-theoretic) entropy of the memory contents and the entropy of the energy source used to power it.

But doesn't this contradict what Zig said earlier - that the entropy of any given memory configuration is zero?

You seem to be saying that the memory contents do have some non-zero entropy, which can change, and that this entropy counts towards the entropy of the system.

Wouldn't this imply that if we have two memories that differ only in their contents, and the entropy of those contents is different (a possibility you seem to allow by saying that the entropy of the contents can change), then there is a difference in entropy, and therefore a difference in energy/mass?

What you describe erases the previous value. The state of the system after your operation does not contain information about the state of the system before your operation, so that information was erased.

I have trouble reconciling this with sol's previous remark that the operation I describe doesn't erase information. I understand that the information is not in the system, but I have it, and can put it back, so is it really erased?

If the system's not containing information about the previous state means that information was erased, wouldn't that imply that every operation that changes the memory erases information, and that there are no reversible operations? After all, looking only at the system, how could someone else tell whether I've done a reversible operation or not, and thus tell what the previous state was?
 
Maybe it would help to have a physical example. Take a rectangular box that's divided into two equal cubical volumes by a partition. The box contains a single air molecule. If it's located to the left of the partition that's 0, if it's to the right, 1.

Now suppose we want to erase that bit (1->0 and 0->0). We can do that by removing the partition, and then compressing with a piston until the piston reaches where the partition was; then we re-insert the partition and pull the piston back. By a very standard thermodynamic calculation, that process costs an average of kT ln 2 of energy if the box is in equilibrium at temperature T. There is an energy cost because it's an irreversible process.
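For anyone who wants that "very standard thermodynamic calculation" spelled out: a single molecule at temperature T exerts an average pressure p = k_B T / V', so isothermally compressing it from V to V/2 takes work

```latex
W \;=\; \int_{V/2}^{V} \frac{k_B T}{V'}\, dV' \;=\; k_B T \ln\frac{V}{V/2} \;=\; k_B T \ln 2
```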

Instead, consider an operation that flips the bit (1->0 and 0->1). We could accomplish that by rotating the partition like a revolving door. If you work that out, it will cost zero energy on average (assuming an ideal system with no friction other than collisions with the air molecule).

Does that help?
 
Does that help?

Yes, it helps, thanks.

But from that model, it seems to me that reading the bit should also have a minimum energy cost.

Consider this: we have the rectangular box you describe, divided by a partition. We read the bit, finding out whether it's 0 or 1. We then insert a piston into the empty half of the box, remove the partition, and let the air molecule do work on the piston as we pull it back. When we're done, we reinsert the partition, read the bit again... and repeat ad nauseam. We created a cyclic process directly converting environmental heat into mechanical work, a.k.a. a perpetuum mobile of the second kind.

It seems to me that the only way to prevent that is to require that any process that can read the bit - determine which half of the box the air molecule is in - has a minimum energy cost. Is that correct?
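For reference, the work the molecule would deliver in each expansion stroke of that cycle (the classic Szilard-engine figure, the same integral as the compression above, run in reverse):

```latex
W_{\text{out}} \;=\; \int_{V/2}^{V} \frac{k_B T}{V'}\, dV' \;=\; k_B T \ln 2 \quad \text{per cycle}
```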
 
Yes, it helps, thanks.

But from that model, it seems to me that reading the bit should also have a minimum energy cost.

Consider this: we have the rectangular box you describe, divided by a partition. We read the bit, finding out whether it's 0 or 1. We then insert a piston into the empty half of the box, remove the partition, and let the air molecule do work on the piston as we pull it back. When we're done, we reinsert the partition, read the bit again... and repeat ad nauseam. We created a cyclic process directly converting environmental heat into mechanical work, a.k.a. a perpetuum mobile of the second kind.

It seems to me that the only way to prevent that is to require that any process that can read the bit - determine which half of the box the air molecule is in - has a minimum energy cost. Is that correct?

Yes, more or less - but it's not exactly reading the bit that costs energy, it's writing it somewhere (which is necessary, since you need the value of the bit to determine which piston to push in). Storing the bit costs energy, because storing it somewhere is irreversible (it requires erasing whatever was there before). That's also the reason Maxwell's demon doesn't violate the laws of thermo, by the way.

If you could come up with a process that "reads" the bit without remembering it (like my revolving door) but acts on it in the way you specified, you'd have a true perpetual motion machine.
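So the books balance exactly: whatever the engine extracts per cycle is paid back when the recorded bit is eventually erased (this is the standard Landauer/Bennett accounting, my gloss):

```latex
W_{\text{net}} \;=\; W_{\text{extracted}} - W_{\text{erase}}
\;\le\; k_B T \ln 2 - k_B T \ln 2 \;=\; 0
```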
 