Information
( taken from an e-mail wherein I try to explain briefly "information" to an English major. )
Information, in math and science, is formalized as a collection of answers to yes-no questions. Usually we talk about binary strings of bits like "01011011101", and say that "1" means "yes" and "0" means "no". Information is not meaning. "01011011101" could be a series of random coin tosses, or it could encode the password to your computer. How information is interpreted gives it meaning.[1]
Information, as a series of yes-no questions, can be transformed to do useful things, such as controlling a robot or displaying pixels on your computer monitor. Translating a series of yes-no questions (bits) into the action of a machine can be done mechanically and automatically. This transformation could be considered the act of interpreting, or understanding the meaning of, the information.[1]
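To make that concrete, here is a small Python sketch (my own addition, not part of the original e-mail) that takes the same bit string and reads it three different ways: as a number, as answers to some made-up yes-no questions, and as a row of monochrome pixels. Each reading is an arbitrary convention, which is exactly the point.

    # One bit string, three interpretations. The bits don't "mean" any of
    # these things on their own; each reading is a convention we impose.
    bits = "01011011101"

    # 1. As a binary number.
    as_number = int(bits, 2)   # 733

    # 2. As answers to a list of (made-up) yes-no questions.
    questions = ["Is it alive?", "Is it bigger than a breadbox?", "Can it fly?"]
    answers = {q: ("yes" if b == "1" else "no") for q, b in zip(questions, bits)}

    # 3. As a row of monochrome pixels on a display.
    as_pixels = "".join("#" if b == "1" else "." for b in bits)   # .#.##.###.#

    print(as_number)
    print(answers)
    print(as_pixels)

A display driver really does something like the third reading; the translation from bits to lit pixels is entirely mechanical.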
While it's clear that human communication involves conveying information, we don't understand natural language as well as we understand computer communication. Studies suggest that English transmits about 0.5 bits per letter[1]. In other words, if I told you I was thinking of a sensible English phrase 100 letters long, you could probably guess what it was in about 50 yes-no questions. It is much harder to say whether or not a human understands the meaning of information. There has been a lot of recent research on "information" within the brain.
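That 50-questions arithmetic is just the rule that each yes-no question can, at best, cut the remaining possibilities in half. Here is a quick Python sketch of the guessing game (my own illustration, not from the e-mail; the word list is made up):

    import math

    # Each yes-no question can at best halve the remaining possibilities,
    # so pinning down one item out of N takes about log2(N) questions.
    # If there were roughly 2**50 sensible 100-letter phrases, 50 questions
    # would be enough -- the "0.5 bits per letter" figure in reverse.
    print(math.ceil(math.log2(2**50)))   # 50

    def guess(secret, candidates):
        """Find `secret` in a sorted list using only yes-no comparisons."""
        lo, hi = 0, len(candidates) - 1
        asked = 0
        while lo < hi:
            mid = (lo + hi) // 2
            asked += 1
            # The question: "Does the secret come after candidates[mid]?"
            if secret > candidates[mid]:
                lo = mid + 1
            else:
                hi = mid
        return candidates[lo], asked

    words = sorted(["apple", "banana", "cherry", "date",
                    "elderberry", "fig", "grape", "honeydew"])
    print(guess("fig", words))   # ('fig', 3): 8 candidates, log2(8) = 3 questions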
The idea of information is pretty trippy. Take DNA for example :
DNA encodes information called a gene. The gene is the information, not the DNA encoding it. Since we can sequence DNA, we can store the gene inside a computer, or on some printouts, or carved into stone as a collection of scratches. The cellular machinery that reads DNA and produces life is the only system that can completely understand the meaning of DNA, so we still need to take that gene, synthesize it back into DNA, and insert it into a living cell in order to really understand what the gene does ( the same gene may do different things in different species, or different things within different cells of the same organism ). This has been a major hurdle following the Human Genome Project : a mass of information without meaning. [2]
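As a side note on storing a gene inside a computer: each DNA base is one of four symbols, so it carries two bits, and packing a sequence into bits takes only a few lines of Python. This sketch is my own illustration, and the sequence is made up:

    # Pack a DNA sequence into 2 bits per base and unpack it again.
    TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}
    FROM_BITS = {v: k for k, v in TO_BITS.items()}

    def encode(dna):
        return "".join(TO_BITS[base] for base in dna)

    def decode(bits):
        return "".join(FROM_BITS[bits[i:i+2]] for i in range(0, len(bits), 2))

    seq = "ATGGCATTA"
    bits = encode(seq)
    print(bits)                  # 001110100100111100
    print(decode(bits) == seq)   # True

The bits round-trip perfectly, but nothing in them says what the gene does; for that you still need the cell.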
Information and Entropy :
Since information is not inherently meaningful, there is a lot of meaningless information floating around out there. It's no accident that the units for measuring entropy are identical to the units for measuring information: systems with maximum entropy store the greatest possible amount of information. Some methods of compressing computer files are called "entropy encoding"; they take long files with low information per byte and transform them into shorter files with higher information per byte. So the statement that "entropy increases in the universe" is actually the same thing as "information increases in the universe". The increase in entropy in the universe is related to the fact that the present contains information about what happened in the past, and there is an ever-increasing amount of past. [2]
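To watch entropy coding make exactly that trade, here is a small Python sketch (my own addition, not from the post) that measures the byte-frequency entropy of some repetitive text before and after zlib compression:

    import math
    import zlib
    from collections import Counter

    def bits_per_byte(data):
        """Shannon entropy of the byte-frequency distribution, in bits per byte."""
        counts = Counter(data)
        total = len(data)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    original = ("the quick brown fox jumps over the lazy dog " * 200).encode()
    compressed = zlib.compress(original)

    print(len(original), round(bits_per_byte(original), 2))      # long, few bits per byte
    print(len(compressed), round(bits_per_byte(compressed), 2))  # short, more bits per byte

The compressed file is far shorter, and its bytes look much more like random noise, so each byte carries more information.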
Sources :
[2] Some conversations with Keegan, possibly, or I just made it up.
>> So the statement that "entropy increases in the universe" is actually the same thing as "information increases in the universe".
Nitpick: increasing entropy is decreasing information.
There are really two competing intuitions about entropy you are talking about here.
1. Entropy as a general concept relates to how close to uniform some quantity is distributed (in the universe's case, energy over space).
2. Entropy when you are discussing the probability distribution for some random variable conditioned on your prior knowledge represents the amount of information you have. It relates to the first definition in that a probability distribution is one kind of distribution.
I'm not convinced "increasing entropy is decreasing information": Samples drawn from distributions of higher entropy convey more information, not less.