Information theory

Information theory is a framework for understanding the transmission of information from sender to receiver, and the effects of complexity and interference on these transmissions. Creationists, in an attempt to coat their myths with a veneer of science, have co-opted the idea of information theory to use as a plausible-sounding attack on evolution. Essentially, the claim is that the genetic code is like a language and thus transmits information, and due to the usual willful misunderstandings of the second law of thermodynamics (which is about energy, not information), they maintain that information can never be increased.[1] Therefore, the changes they cannot outright deny are defined as "losing information", while changes they disagree with are defined as "gaining information", which by their definition is impossible. Note that at no point do creationists actually specify what information is; often they deliberately decline to define the concept. When asked what information is, for example, Dr. Werner Gitt, the leading creationist "expert" on information, says "That is not possible [to define information] because information is by nature a very complex entity. The five-level model [that Gitt developed] indicates that a simple formulation for information will probably never be found."[2] Creationists tend to change their meaning on an ad hoc basis depending on the argument, relying on colloquial, imprecise definitions of information rather than quantifiable ones -- or worse, switching interchangeably between different definitions depending on the context of the discussion or argument.

Shannon information

Claude Shannon developed a model of digitized information transmission in terms of information entropy; it was originally intended to describe the transfer of digitized information through a noisy channel.

Digitized information consists of symbols drawn from a discrete set of allowed values. (Computers typically use a binary system, with 0 or 1 as the allowed values; genetic information can be thought of as digitized, with A, C, G, and T as the allowed values.) If each position can take only one specific value, the signal can be said to have low information entropy, or in more colloquial terms, high information content. As more values become possible at each point in the signal, the information entropy increases. Thus, a television signal with static has higher information entropy than the original signal.
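
As a rough illustration (a minimal sketch, not part of the original article), Shannon's entropy for a source whose symbols occur with probabilities p_i is H = -Σ p_i log2 p_i, measured in bits per symbol. The short Python sketch below estimates it from the observed symbol frequencies of a few toy strings:

    import math
    from collections import Counter

    def shannon_entropy(sequence):
        """Entropy in bits per symbol: H = -sum(p_i * log2(p_i))."""
        counts = Counter(sequence)
        total = len(sequence)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    print(shannon_entropy("AAAAAAAA"))  # 0.0 bits -- only one value ever occurs
    print(shannon_entropy("AAAAAAAG"))  # about 0.54 bits -- a little uncertainty
    print(shannon_entropy("ACGTACGT"))  # 2.0 bits -- four equally likely values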

Shannon developed his theory to provide a rigorous model of the transmission of information. Importantly, information entropy provides an operational and mathematical way to describe the amount of information that is transmitted.

In genetics, a point mutation increases the information entropy of a DNA base pair. However, natural selection counteracts this increase by eliminating organisms with harmful mutations and the consequent higher information entropy (or, colloquially, lower information content).[3] While information theory does not describe how a sequence of DNA bases is expressed into features during development, it does give a mathematical account of how genetic information is transmitted from one generation to the next. Any feature of a string that preserves fitness will have lower information entropy, or higher information content, than a random string. Richard Dawkins's weasel program, which investigates cumulative selection, shows exactly this lowering of information entropy (a minimal sketch of such a program is given below).[4]
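
To make the weasel example concrete, here is a hedged sketch of such a program; the target phrase, mutation rate, and population size are the usual illustrative choices, not anything prescribed by the article. Each generation, the best of the parent and its imperfect copies is kept, and the string converges on the target far faster than blind chance would allow:

    import random
    import string

    TARGET = "METHINKS IT IS LIKE A WEASEL"
    ALPHABET = string.ascii_uppercase + " "
    MUTATION_RATE = 0.05   # chance that each character is randomised in a copy
    OFFSPRING = 100        # copies made per generation

    def fitness(candidate):
        """Number of characters that already match the target."""
        return sum(a == b for a, b in zip(candidate, TARGET))

    def mutate(parent):
        """Copy the parent, randomising each character with a small probability."""
        return "".join(
            random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
            for c in parent
        )

    parent = "".join(random.choice(ALPHABET) for _ in TARGET)
    generation = 0
    while parent != TARGET:
        generation += 1
        # Cumulative selection: keep the fittest of the parent and its copies.
        candidates = [parent] + [mutate(parent) for _ in range(OFFSPRING)]
        parent = max(candidates, key=fitness)

    print("Reached the target in", generation, "generations.")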

While there are similarities between thermodynamic entropy and information entropy, the former refers exclusively to the distribution of energy. In thermodynamics, entropy increases in a closed system according to the Second Law, but it is unclear what a closed system would even be for genetic information. At the very least, natural selection provides feedback that acts on the information entropy.

Creationists really don't like this stuff, but won't say why. It's likely because they don't want an actual definition of information that can be argued against.[5]

Kolmogorov complexity

Kolmogorov complexity (also known as Chaitin information, or algorithmic information) measures how far a piece of information can be compressed: roughly, the length of the shortest algorithm that can reproduce it.[6] Computer scientists developed the idea to discuss how to compress data in the most efficient way possible, so as to take up less disk space.

The Kolmogorov complexity depends on the number of steps that an algorithm would need to take to reproduce the information (sometimes called the "edit distance"). Thus "A20" can be thought of as a compression of "AAAAAAAAAAAAAAAAAAAA," and "(AB)9" can be thought of as a compression of "ABABABABABABABABAB." Any instruction including insertion, repeating, deletion, etc. can change the Kolmogorov complexity. Thus, the Kolmogorov complexity can be thought of as the maximum amount of information "in the string" or "in the sequence."
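
Kolmogorov complexity itself is uncomputable, but an off-the-shelf compressor gives a rough stand-in for the idea: highly repetitive strings compress to very short descriptions, while random-looking strings do not. A minimal sketch (an illustration only, using Python's zlib module) follows:

    import random
    import zlib

    def compressed_size(text):
        """Length of the zlib-compressed string -- a crude upper bound on
        the length of the shortest description of the text."""
        return len(zlib.compress(text.encode("ascii")))

    repetitive = "A" * 1000                                   # like "A1000"
    periodic = "AB" * 500                                     # like "(AB)500"
    random_dna = "".join(random.choice("ACGT") for _ in range(1000))

    print(compressed_size(repetitive))   # very small -- a short description exists
    print(compressed_size(periodic))     # also small
    print(compressed_size(random_dna))   # much larger -- little structure to exploit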

The Kolmogorov complexity depends entirely on the algorithm used. Hence, while there are uses in genetics, determining the change in Kolmogorov complexity would require a description of all the processes used to reproduce the developmental information from the DNA sequence; one cannot tell the amount of information (or Kolmogorov complexity) just by looking at a string of letters, symbols, or DNA. (This is part of the reason why the amount of information in the words "car" and "vehicle" cannot be compared: it depends on the algorithms of linguistic interpretation, and the number of letters is beside the point.) Notably, because the processes "change" (or "mutate") and "delete" can be thought of as additional algorithmic steps, they can increase the Kolmogorov complexity (or information content). More significantly, they can change the content.[7]

Note that any comparison of information "in the string" used by creationists (in the guise of meaning) is Kolmogorov complexity, while the "increase of noise" or "information loss by loss of DNA sequence fidelity" by mutations usually refers to Shannon's information entropy. The two cannot be used interchangeably.[8]

Word analogies

Word analogies are tricky when using concepts of information theory.

Any change to a string of text, simply by virtue of being a change, is an increase of information entropy (or a loss of information content). This is true whether the string is a word ("rational" changed to "rasional") or nonsense ("alkfd" to "alkfg"). (However, a proofreader, acting as an agent of natural selection, could reject erroneous copies of a text and so keep the information entropy from rising, as in the sketch below.)
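
As a toy illustration of that last point (a sketch under assumed error rates, not a model of real proofreading), repeated copying with a small per-character error rate degrades a string unless a proofreader rejects imperfect copies:

    import random
    import string

    ALPHABET = string.ascii_lowercase
    ERROR_RATE = 0.02     # chance that each character is miscopied
    GENERATIONS = 200

    def copy_with_errors(text):
        """Copy a string, occasionally substituting a random character."""
        return "".join(
            random.choice(ALPHABET) if random.random() < ERROR_RATE else c
            for c in text
        )

    original = "rational"

    # Without a proofreader: errors accumulate and copying fidelity drops.
    unchecked = original
    for _ in range(GENERATIONS):
        unchecked = copy_with_errors(unchecked)

    # With a proofreader acting as selection: erroneous copies are rejected.
    checked = original
    for _ in range(GENERATIONS):
        candidate = copy_with_errors(checked)
        if candidate == original:
            checked = candidate

    print("without proofreading:", unchecked)
    print("with proofreading:   ", checked)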

In terms of Kolmogorov complexity, changes in letters can supply more or less information, but this depends on the linguistic structure. The number of processes required to interpret the word through an algorithm may or may not depend on the number of letters and the identity of the letters, and hence "more" or "less" has little meaning. In the same way, mutations in genetics can potentially change how an organism develops, but without a complete understanding of the processes of development, a mutation is not "more" or "less" information.

An attempt to interpret a word analogy through both concepts at the same time fails because the two concepts are neither independent nor the same. It can be true that a change of a letter ("lost" to "post") represents a loss of copying fidelity (increased information entropy) and yet changes some linguistic meaning (a different Kolmogorov complexity).

Information theory and genetics, evolution, and development

The relationship between biology and information theory given above, as well as other approaches in the literature, suggests that the terms "biological information", "developmental information", and "genetic information" are ambiguous without clarification. Even then, there will be ambiguity:

In biology the term information is used with two very different meanings. The first is in reference to the fact that the sequence of bases in DNA codes for the sequence of amino acids in proteins. In this restricted sense, DNA contains information, namely about the primary structure of proteins. The second use of the term information is an extrapolation: it signifies the belief or expectation that the genome somehow also codes for the higher or more complex properties of living things. It is clear that the second type of information, if it exists, must be very different from the simple one-to-one cryptography of the genetic code. This extrapolation is based, loosely, on information theory. But to apply information theory in a proper and useful way it is necessary to identify the manner in which information is to be measured (the units in which it is to be expressed in both sender and receiver, and the total amount of information in the system and in a message), and it is necessary to identify the sender, the receiver and the information channel (or means by which information is transmitted). As it is, there exists no generally accepted method for measuring the amount of information in a biological system, nor even agreement of what the units of information are (atoms, molecules, cells?) and how to encode information about their number, their diversity, and their arrangement in space and time.[9]

See also

  • Gene expression describes how a DNA sequence, "genetic information", is expressed in an organism.

Footnotes

  1. Notably, this is a change from the tactic that there can be "no beneficial mutations." Because information theory is more difficult for the layman to understand, it is easy to hide behind it without really understanding it. It is also intimately related to the "evolution couldn't possibly have made eyes, wings, or flagella" arguments.
  2. http://www.answersingenesis.org/articles/itbwi/questions-about-information-concept
  3. http://nar.oxfordjournals.org/cgi/reprint/28/14/2794
  4. See the Wikipedia article on weasel program.
  5. Creationists always seem to assume that science-y folks are using Shannon information, and then say it's wrong, such as in these exchanges between PZ Myers and Michael Egnor. Of course, PZ Myers wasn't talking about Shannon information in the first place.
  6. A fairly simple explanation of this form of information by Chaitin himself is here.
  7. See this blog post about this point.
  8. For an explanation of how creationists confuse different types of information, see this talkorigins letter.
  9. Nijhout, H. F. BioEssays, September 1990, vol. 12, no. 9, p. 443.

