
Is all information encoded?

Paul C. Anagnostopoulos said:
Perhaps this is the right thing to say:

A physical object has characteristics. Biological organisms and artifacts can derive information from the characteristics of the object when it impinges on the organism or artifact.

~~ Paul

I'd substitute "energy" for "it" in line two. Otherwise I have no problem at all with that statement. It's the inherent nature of information part that has me worried. It has a spooky feel.
 
Soapy said:
I'm curious about how you would compare this to the information you say caused this secondary information to be created. One set is stored in a plastic disc. One set is stored in a rock. (Blast. I forgot the gunshot!) Not only are they clearly different in every way, but set one is information about a rock inherent in that rock, while set two is information about a rock inherent in a plastic disc. There is, presumably, a third set of information, about the disc, in the disc?
Yes.

Imagine a cave with a wall and a few holes opposite the wall that let in sunlight. Over time, a rough "photograph" of the holes appears on the wall due to light coming through the holes and bleaching the wall.

Now imagine a traditional photograph taken with light-sensitive film and a crude camera.

Now imagine a digital photograph.

Where does the "characteristics of physical objects" end and "information about physical objects" begin?

I'd substitute "energy" for "it" in line two. Otherwise I have no problem at all with that statement. It's the inherent nature of information part that has me worried. It has a spooky feel.
I was thinking of photons when I wrote "... characteristics of the object when it impinges ...", so I think we agree here.

Now we just need a theory about how organisms and their artifacts deduce information from characteristics. For example, when I look at a table, where do the characteristic-oriented processes end and the information-oriented processes begin?

~~ Paul
 
Paul C. Anagnostopoulos said:

Where does the "characteristics of physical objects" end and "information about physical objects" begin?

~~ Paul
Well, if I stick to my guns, I have to say the physical characteristics really exist in the object, while the information only exists in the head of the observer.


I was thinking of photons when I wrote "... characteristics of the object when it impinges ...", so I think we agree here.

Now we just need a theory about how organisms and their artifacts deduce information from characteristics. For example, when I look at a table, where do the characteristic-oriented processes end and the information-oriented processes begin?


I'd stick with the above till shown the error of my ways, but I agree it's harder to maintain the position about an artifact. Clearly, information describing an artifact exists at the fabrication stage. That might be coded and written down, but I still don't think it's coded in the artifact itself, except in the sense of the levels of information, i.e. a label might have a label on it, reading "This is a label".

I'm going to bed, now. If I die before morning at least I won't wake up with a headache.
 
Okay, forget artifacts.

Now we just need a theory about how organisms deduce information from characteristics. For example, when I look at a table, where do the characteristic-oriented processes end and the information-oriented processes begin?

The photon has characteristics. It enters my eye. At what point in the visual process does the information arise?

~~ Paul
 
If we define information to be a change in uncertainty, as Shannon does, then we can limit information to certain contexts. I'm not sure this is the right thing to do, but it might work.

~~ Paul
 
Soapy Sam said:
Well, if I stick to my guns, I have to say the physical characteristics really exist in the object, while the information only exists in the head of the observer.
But, by disconnecting information from the object you are asserting no relationship between the object and the information derived from it. Process is the key word in play. By connecting the object, its observer and the observer's ability to process, information emerges. Our desire to attribute information strictly to the object or strictly to the observer is our fundamental error. It is like asking which of the three line segments contributed the "triangleness." It is the relationships amongst the line segments and their constraints that contributed the "triangleness."
 
So an object has attributes that, by a process of interaction with a person*, produce information in the brain of the person by reducing uncertainty relative to the object.

How's that?

~~ Paul

* And other living organisms. And certain kinds of machines?
 
BillHoyt said:
But, by disconnecting information from the object you are asserting no relationship between the object and the information derived from it.

If I do that, then I'm clearly in error, but I don't think I do.

When photons bounce off a rock they are altered and patterned in a way isomorphic to the surface of the rock. There is a mapping between the surface and the pattern.

But is that what we call information?

I recently examined some kitchen work surfaces designed to look like polished granite. These are getting very good, very convincing (so long as you don't touch them).
I suspect that the photon swarm coming to my eye from that surface would be very similar indeed to one coming from a slab of polished granite. Even an expert might be fooled.

But I knew it wasn't polished granite. That information was not in the worktop, or in any signal from the worktop: It was in the fact that I was in a DIY shop and knew you can't buy a 2m x 600mm x 38mm slab of polished granite for £38.00. That information was in my head when I entered the store, before I even saw the worktop.

My argument remains, in summary:
Incident energy from a surface is not information. It is energy capable of generating information in the right environment (usually a brain). I accept that the distinction is subtle. I can see that for some applications (notably communications technology), the distinction is functionally irrelevant.

But I think the distinction has value, nonetheless.


ETA: Paul, I have not the tiniest glimmer of a clue how information is created in a brain. But you knew that. :)
 
Paul C. Anagnostopoulos said:
So an object has attributes that, by a process of interaction with a person*, produce information in the brain of the person by reducing uncertainty relative to the object.

How's that?

~~ Paul

* And other living organisms. And certain kinds of machines?

The answer, I think, is "yes." How? Start with a Braitenberg vehicle, with very simple "processing" of ambient light information. Here's a Braitenberg simulation as an example you can fiddle with. It's pretty cool and applicable to our discussion of information and processing.
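A minimal sketch of the idea, for anyone who wants to play without the simulation mentioned above (this is my own toy code, not that simulation): two light sensors drive two wheel motors, and crossing the wiring is all it takes to turn a light-avoider into a light-seeker. Nothing in it stores "information about the light"; the behaviour falls out of how the sensor signals are processed.

[code]
# Toy Braitenberg vehicle (Python) -- my own sketch, not the simulation above.
# Two light sensors feed two wheel motors; crossed wiring (vehicle 2b) makes
# the vehicle turn toward the light, straight wiring (2a) makes it turn away.
import math

LIGHT = (5.0, 5.0)  # position of a single light source (arbitrary choice)

def sensor_reading(pos):
    """Light intensity falling off with squared distance (simple model)."""
    dx, dy = LIGHT[0] - pos[0], LIGHT[1] - pos[1]
    return 1.0 / (1.0 + dx * dx + dy * dy)

def step(x, y, heading, crossed=True, dt=0.1):
    """Advance one time step; crossed=True wires left sensor to right motor."""
    offset = 0.2  # sensors sit slightly to the left/right of the heading
    left  = (x + offset * math.cos(heading + 0.5), y + offset * math.sin(heading + 0.5))
    right = (x + offset * math.cos(heading - 0.5), y + offset * math.sin(heading - 0.5))
    s_left, s_right = sensor_reading(left), sensor_reading(right)

    # Crossed coupling: the sensor nearer the light speeds up the opposite
    # wheel, which turns the vehicle toward the light.
    m_left, m_right = (s_right, s_left) if crossed else (s_left, s_right)

    heading += (m_right - m_left) * 2.0 * dt   # differential drive
    speed = (m_left + m_right) / 2.0
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading

x, y, h = 0.0, 0.0, 0.0
for _ in range(2000):
    x, y, h = step(x, y, h)
print(f"final position: ({x:.2f}, {y:.2f}), light at {LIGHT}")
[/code]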
 
Soapy Sam said:
The rest are because meaning has been conveyed perfectly.;)

You mean like the Movie Wallace's reply to Longshanks? :D
 
Soapy Sam said:
My argument remains, in summary:
Incident energy from a surface is not information. It is energy capable of generating information in the right environment (usually a brain). I accept that the distinction is subtle. I can see that for some applications (notably communications technology), the distinction is functionally irrelevant.

Again, this depends on the meaning of "information". Shannon does not require a user, but defines an abstract quantity of information that can be extracted from a given channel.

By that reasoning, the manufactured rock that looks like real rock would be sending the same kind of information via the VISUAL route as the real rock.

You have OTHER, additional information that allows you to distinguish, information not derived from your visual senses. Of course, the reflected light also has information that your eye cannot detect, so one must then define a perceptual entropy as well as an information-theoretic entropy. Definitions of perceptual entropy have been made for very, very limited cases: Johnston, J. D., "Estimation of perceptual entropy using noise masking criteria," ICASSP '88 Record, 1988, pp. 2524-2527 is one such example. (Apologies for citing myself, but it's at hand.) But in general the problem is not simple.

How this relates to "is it all encoded" is still completely a question of semantics, I think.
 
JJ, what the heck is perceptual entropy?

~~ Paul


I don't know either. But I think we may be up to our knees in it.:D
 
Soapy Sam said:
JJ, what the heck is perceptual entropy?

~~ Paul


I don't know either. But I think we may be up to our knees in it.:D

An attempt to measure the amount of information in a single-channel audio signal THAT THE EAR CAN ACTUALLY DETECT, which is a much smaller amount of information than actually 'exists'. It's an estimate, and only an "ok" one, at that, because it depends on models that are (*&(* all to verify the accuracy of.

Entropy ::= information via Shannon's interpretation.
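Not the actual method (the Johnston paper works from hearing-model masking thresholds), but here is a toy illustration of the gap between the information in a signal and the information a listener can resolve: measure the same samples at a fine step and at a coarse step standing in for a detection threshold, and compare the entropies.

[code]
# Toy illustration only -- not the perceptual entropy estimate from the
# Johnston paper. Entropy of a signal at fine resolution ("what exists")
# versus after coarse quantization ("what a listener could resolve").
import math
import random
from collections import Counter

def entropy_bits(samples):
    """Shannon entropy, in bits per sample, of the empirical distribution."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
signal = [random.gauss(0.0, 1.0) for _ in range(100_000)]

fine   = [round(s, 2) for s in signal]  # fine-grained measurement
coarse = [round(s, 1) for s in signal]  # coarser "just noticeable" step (toy)

print(f"entropy at fine resolution:   {entropy_bits(fine):.2f} bits/sample")
print(f"entropy at coarse resolution: {entropy_bits(coarse):.2f} bits/sample")
[/code]

The coarse version carries fewer bits per sample, which is the sense in which the detectable information is "much smaller" than what actually exists in the raw signal.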
 
Oh, you mean perceptual uncertainty.

information = uncertainty_before - uncertainty_after

Avoid the word entropy like the plague.

The story goes that Shannon didn't know what to call his measure so he asked von Neumann, who said `You should call it entropy ... [since] ... no one knows what entropy really is, so in a debate you will always have the advantage'

---M. Tribus
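A minimal numerical sketch of that subtraction, using an example of my own (a fair die, then a message saying only that the roll came up even):

[code]
# information = uncertainty_before - uncertainty_after, in bits.
# Toy example: a fair six-sided die, then a report that the outcome was even.
import math

def uncertainty_bits(probs):
    """Shannon uncertainty of a probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

before = [1/6] * 6                  # any face equally likely
after  = [0, 1/3, 0, 1/3, 0, 1/3]   # "it was even": three faces remain

print(f"uncertainty before: {uncertainty_bits(before):.3f} bits")   # log2(6)
print(f"uncertainty after:  {uncertainty_bits(after):.3f} bits")    # log2(3)
print(f"information gained: {uncertainty_bits(before) - uncertainty_bits(after):.3f} bits")  # 1 bit
[/code]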
 
Paul C. Anagnostopoulos said:
Oh, you mean perceptual uncertainty.

information = uncertainty_before - uncertainty_after

Avoid the word entropy like the plague.

The story goes that Shannon didn't know what to call his measure so he asked von Neumann, who said `You should call it entropy ... [since] ... no one knows what entropy really is, so in a debate you will always have the advantage'

---M. Tribus
I disagree with throwing out entropy because it is poorly understood or defined. Personally, I believe its expanded definition is very appropriate to the discussion about information.

The term emerged in the late 19th century after 400 years of trying to create a perpetual motion machine. By then, virtually everyone was resigned to the fact that energy was lost to the environment (through friction and/or heat loss). (As I recall, credit was given to Rudolf Clausius, in the 1860s, for defining the term.)

By the mid-20th century, the concept was well understood. Here's Erwin Schrödinger's version:
"An isolated system or a system in a uniform environment ... increases its entropy and more or less rapidly approaches the inert state of maximum entropy. We now recognize this fundamental law of physics to be just the natural tendency of things to approach the chaotic state ..." - from Schrödinger's What is Life? (Schrödinger, 1944. pg. 78)
Not only do things tend to go from order to disorder (chaos), but information inherent within a system tends to be lost:
"In a closed system, there is a tendency for organization to change into disorganization, or for the amount of information available about the system to become smaller as time goes on." (Rothman, 1963. pg. 144)
Examples abound.

After beating an egg, what information remains about the yolk and the egg white? Nothing visually. The well ordered separation and organization of the egg is gone.

Look at the tread on any automobile tire. As the car is driven tens of thousands of miles, the tread is worn off, leaving a "bald" tire. At that point, what remains to identify the original tread design? Only the manufacturer's coding molded on the side of the tire. All of the inherent information about the tread design is gone. (The manufacturer's coding is the surviving reference only because it has avoided the wear and tear which the tread endured. The tread suffered high friction, high entropy, and high information loss, while the molded code had relatively low friction, negligible entropy, and no appreciable information loss.)

My favorite example is a fire made of logs. Suppose that four cedar logs are burning and each log has two knots. We could even have the knots in specific orientations, on opposite sides or one below the other. Also, as with all logs, tree rings which identify growth can be seen. Now, as fire consumes the logs, the knots and the tree rings disappear. The number and placement of knots, as well as the number of tree rings indicating the tree's age, are lost forever. Eventually, even the fact that four logs have been consumed is gone. From the residual ash, you may be able to determine that cedar wood was burned, but how many logs and how many knots existed on each log, as well as what pattern of tree rings existed, would be totally obliterated.

All of the above examples highlight the "decreasing information" facet of entropy: with increased entropy, information is lost.

Entropy and information can be directly correlated in a reciprocal relationship - as one increases, the other decreases, and vice versa.
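A toy count in the Boltzmann spirit puts a number on the beaten-egg example (my own illustration, not from the post above): divide the bowl into 16 cells, 8 of yolk and 8 of white. "Separated" pins the yolk to the left half; "beaten" lets the yolk cells sit anywhere. The logarithm of the number of arrangements is the entropy, and the extra bits are exactly the information about the original layout that the beating destroys.

[code]
# Toy multiplicity count for the beaten-egg example.
from math import comb, log2

cells, yolk = 16, 8

w_separated = 1                  # yolk confined to the left half: one arrangement
w_beaten    = comb(cells, yolk)  # yolk cells anywhere: C(16, 8) = 12870

print(f"arrangements, separated: {w_separated}")
print(f"arrangements, beaten:    {w_beaten}")
print(f"entropy gained / information lost: {log2(w_beaten) - log2(w_separated):.1f} bits")
[/code]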
 
Originally posted by JAK
I disagree with throwing out entropy because it is poorly understood or defined
I disagree with your disagreement. If you don't understand the difference between the way entropy is defined under thermodynamics (or statistical mechanics) and the way it is defined under Shannon information theory, then your best bet may be to avoid it. Of course, that tip may not be very helpful if you don't even understand that you don't understand.

Entropy is what the equations define it to be
 
Here are a couple of articles that I read which relate to this conversation. I've cropped them both, as you can probably tell.



Information in the Holographic Universe
Theoretical results about black holes suggest that the universe could be like a gigantic hologram
By Jacob D. Bekenstein
August 2003 issue

Ask anybody what the physical world is made of, and you are likely to be told "matter and energy."

Yet if we have learned anything from engineering, biology and physics, information is just as crucial an ingredient. The robot at the automobile factory is supplied with metal and plastic but can make nothing useful without copious instructions telling it which part to weld to what and so on. A ribosome in a cell in your body is supplied with amino acid building blocks and is powered by energy released by the conversion of ATP to ADP, but it can synthesize no proteins without the information brought to it from the DNA in the cell's nucleus. Likewise, a century of developments in physics has taught us that information is a crucial player in physical systems and processes. Indeed, a current trend, initiated by John A. Wheeler of Princeton University, is to regard the physical world as made of information, with energy and matter as incidentals.

This viewpoint invites a new look at venerable questions. The information storage capacity of devices such as hard disk drives has been increasing by leaps and bounds. When will such progress halt? What is the ultimate information capacity of a device that weighs, say, less than a gram and can fit inside a cubic centimeter (roughly the size of a computer chip)? How much information does it take to describe a whole universe? Could that description fit in a computer's memory? Could we, as William Blake memorably penned, "see the world in a grain of sand," or is that idea no more than poetic license?

Remarkably, recent developments in theoretical physics answer some of these questions, and the answers might be important clues to the ultimate theory of reality. By studying the mysterious properties of black holes, physicists have deduced absolute limits on how much information a region of space or a quantum entity of matter and energy can hold. Related results suggest that our universe, which we perceive to have three spatial dimensions, might instead be "written" on a two-dimensional surface, like a hologram. Our everyday perceptions of the world as three-dimensional would then be either a profound illusion or merely one of two alternative ways of viewing reality. A grain of sand may not encompass our world, but a flat screen might.



Black Hole Computers
In keeping with the spirit of the age, researchers can think of the laws of physics as computer programs and the universe as a computer
By Seth Lloyd and Y. Jack Ng
November 2004 issue

BLACK HOLE COMPUTER may sound absurd but is proving to be a useful conceptual tool for researchers studying cosmology and fundamental physics. And if physicists are able to create black holes in particle accelerators--as some predict will be possible within a decade--they may actually observe them perform computation.

What is the difference between a computer and a black hole? This question sounds like the start of a Microsoft joke, but it is one of the most profound problems in physics today. Most people think of computers as specialized gizmos: streamlined boxes sitting on a desk or fingernail-size chips embedded in high-tech coffeepots. But to a physicist, all physical systems are computers. Rocks, atom bombs and galaxies may not run Linux, but they, too, register and process information. Every electron, photon and other elementary particle stores bits of data, and every time two such particles interact, those bits are transformed. Physical existence and information content are inextricably linked. As physicist John Wheeler of Princeton University says, "It from bit."

Black holes might seem like the exception to the rule that everything computes. Inputting information into them presents no difficulty, but according to Einstein's general theory of relativity, getting information out is impossible. Matter that enters a hole is assimilated, the details of its composition lost irretrievably. In the 1970s Stephen Hawking of the University of Cambridge showed that when quantum mechanics is taken into account, black holes do have an output: they glow like a hot coal. In Hawking's analysis, this radiation is random, however. It carries no information about what went in. If an elephant fell in, an elephant's worth of energy would come out--but the energy would be a hodgepodge that could not be used, even in principle, to re-create the animal.

Box: Overview/Cosmic Computers
* Merely by existing, all physical systems store information. By evolving dynamically in time, they process that information. The universe computes.
* If information can escape from black holes, as most physicists now suspect, a black hole, too, computes. The size of its memory space is proportional to the square of its computation rate. The quantum-mechanical nature of information is responsible for this computational ability; without quantum effects, a black hole would destroy, rather than process, information.
* The laws of physics that limit the power of computers also determine the precision with which the geometry of spacetime can be measured. The precision is lower than physicists once thought, indicating that discrete "atoms" of space and time may be larger than expected.

edit: some more snips from the black hole article

Analyzing the universe in terms of bits and bytes does not replace analyzing it in conventional terms such as force and energy, but it does uncover new and surprising facts. In the field of statistical mechanics, for example, it unknotted the paradox of Maxwell's demon, a contraption that seemed to allow for perpetual motion. In recent years, we and other physicists have been applying the same insights to cosmology and fundamental physics: the nature of black holes, the fine-scale structure of spacetime, the behavior of cosmic dark energy, the ultimate laws of nature. The universe is not just a giant computer; it is a giant quantum computer. As physicist Paola Zizzi of the University of Padova says, "It from qubit."

AND

To calculate the total memory capacity of conventional matter, such as atoms, one can apply the standard methods of statistical mechanics and cosmology. Matter can embody the most information when it is converted to energetic, massless particles, such as neutrinos or photons, whose entropy density is proportional to the cube of their temperature. The energy density of the particles (which determines the number of operations they can perform) goes as the fourth power of their temperature. Therefore, the total number of bits is just the number of operations raised to the three-fourths power. For the whole universe, that amounts to 10^92 bits. If the particles contain some internal structure, the number of bits might be somewhat higher. These bits flip faster than they intercommunicate, so the conventional matter is a highly parallel computer, like the ultimate laptop and unlike the black hole.

note: the black hole is said to process like a serial computer
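A quick arithmetic check of the quoted scaling (the operation count below is simply back-solved from the article's 10^92 figure, not an independently sourced number):

[code]
# bits ~ ops**(3/4), so solving backwards from the article's 10^92 bits
# implies an operation count of roughly 10^122 to 10^123.
bits = 1e92
ops = bits ** (4 / 3)               # invert bits = ops**(3/4)
print(f"implied operations: {ops:.1e}")
print(f"check, ops**(3/4):  {ops ** 0.75:.1e} bits")
[/code]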

AND

What is the universe computing? As far as we can tell, it is not producing a single answer to a single question, like the giant Deep Thought computer in the science-fiction classic The Hitchhiker's Guide to the Galaxy. Instead the universe is computing itself. Powered by Standard Model software, the universe computes quantum fields, chemicals, bacteria, human beings, stars, and galaxies. As it computes, it maps out its own spacetime geometry to the ultimate precision allowed by the laws of physics. Computation is existence.
 
Dymanic said:
I disagree with your disagreement. If you don't understand the difference between the way entropy is defined under thermodynamics (or statistical mechanics) and the way it is defined under Shannon information theory, then your best bet may be to avoid it. Of course, that tip may not be very helpful if you don't even understand that you don't understand.

Entropy is what the equations define it to be

However, "disorder" was a crutch, i.e., it was a contrived support for visualization rather than a fundamental physical or theoretical cause for a higher entropy value. Others followed Boltzmann's lead; Helmholtz in 1882 called entropy "Unordnung" (disorder) (5), and Gibbs Americanized that description with "entropy as mixed-up-ness", a phrase found posthumously in his writings (6) and subsequently used by many authors.

Most general chemistry texts today still lean on this conceptual crutch of order-disorder either slightly with a few examples or as a major support that too often fails by leading to extreme statements and over extrapolation. The most egregious errors in the past century of associating entropy with disorder have occurred simply because disorder is a common language word with non-scientific connotations. Whatever Boltzmann meant by it, there is no evidence that he used disorder in any sense other than strict application to molecular energetics. But over the years, popular authors have learned that scientists talked about entropy in terms of disorder, and thereby entropy has become a code word for the "scientific" interpretation of everything disorderly from drunken parties to dysfunctional personal relationships,5 and even the decline of society.6

Of course, chemistry instructors and authors would disclaim any responsibility for such absurdities. They would insist that they never have so misapplied entropy, that they used disorder only as a visual or conceptual aid for their students in understanding the spontaneous behavior of atoms and molecules, entropy-increasing events.
- http://www.entropysite.com/cracked_crutch.html
"Disorder" may be a misapplication of the term "entropy," but using it to define "communication" and "information" may also be a misapplication. It appears that Shannon bent the definition of "entropy" for his own purposes (from the Tim Thompson site):
...
Shannon proves his Theorem 2, that this Boltzmann entropy is the only function which satisfies the requirements for a function to measure the uncertainty in a message
...
I'm not yet convinced that Clausius was concerned with "uncertainty" when defining entropy. It appears that it has evolved in that direction to determine whether a chemical reaction will spontaneously occur or not:
...
The significance of this equation is that it is the value of delta-F which tells you whether any given chemical reaction will go forward spontaneously, or whether it needs to be pumped.
...
Further, why is this definition, used by chemists, preferred?
...
Much more important for us here is the bearing on the statistical concept of order and disorder, a connection that was revealed by the investigations of Boltzmann and Gibbs in statistical physics. This too is an exact quantitative connection, and is expressed by
entropy = k log D, where k is the so-called Boltzmann constant (= 3.2983 × 10^-24 cal./°C), and D a quantitative measure of the atomistic disorder of the body in question.
...
- Nobel Prize-winning physicist Erwin Schrödinger, from Heredity and the Quantum Theory

So, is this going to be a "pissing contest" over who has the best authoritative resource?

I believe the application of "entropy" to uncertainty in communication may not be completely serving our needs in defining communication or information. IMO, "order" and "disorder" should not be discarded from the discussion. Even Thompson gave off-handed respect to the relationship:
...
Entropy is not "disorder", although the two can be related to one another.
...
 
Originally posted by JAK
I'm not yet convinced that Clausius was concerned with "uncertainty" when defining entropy.
I wouldn't try to convince you of that. It was Shannon, not Clausius, who used it that way.

It appears that it has evolved in that direction to determine whether a chemical reaction will spontaneously occur or not:
In that context, I don't see how it has anything whatsoever to do with 'uncertainty'.

I believe the application of "entropy" to uncertainty in communication may not be completely serving our needs in defining communication or information. IMO, "order" and "disorder" should not be discarded from the discussion.
Maybe we should be talking about Kolmogorov complexity instead of Shannon information.
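For what it's worth, Kolmogorov complexity itself is uncomputable, but compressed length is a crude, computable stand-in that makes the order/disorder intuition concrete: a highly regular string has a short description, while typical random data does not. A quick sketch:

[code]
# Compressed length as a rough proxy for Kolmogorov complexity.
import os
import zlib

ordered = b"ab" * 50_000       # highly regular: a very short program prints it
random_ = os.urandom(100_000)  # typical random bytes: essentially incompressible

print(f"ordered data compresses to {len(zlib.compress(ordered))} bytes")
print(f"random data compresses to  {len(zlib.compress(random_))} bytes")
[/code]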
 
