Computing Reviews

Cover Quote: June 1970

Consider any one relay. At any given time, say a millisecond, it can be in either of two states; that is, it can be transmitting one signal or none. Hence, two independent, or unconnected, relays can be in any one of four states; three in one of eight, four in one of sixteen, etc. That is, of n neurons, the number of possible states is 2^n. It takes one signal, or one unit of information, to determine in which of two states any one relay is in any one relay time. The a priori, or logical, probability that a neuron is in a particular state at a particular time is one-half; that two are in a given state, one-fourth; and so on. Hence, information is exactly the logarithm to the base 2 of the reciprocal of the probability of the state. But this has a peculiarly familiar sound. Gibbs had defined entropy as the logarithm of the probability of the state. In Wiener’s words, entropy measures chaos, and information is negative entropy. So, corresponding to the second law of thermodynamics, that entropy must always increase, we can write for any computing machine the corresponding law—information can never increase. This ensures that no machine can operate on the future but must derive its information from the past. It can never do anything with this information except corrupt it. The transmission of signals over ordinary networks of communication always follows the law that deduction obeys, that there can be no more information in the output than there is in the input. The noise, and only the noise, can increase. Therefore, if we are to deal with knowers that are computing machines, we can state this much about them. Each is a device, however complicated, which can only corrupt revelation.



- Warren S. McCulloch
Through the Den of the Metaphysician, 1948
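McCulloch's arithmetic in the quote above can be checked directly: n two-state relays admit 2^n joint states, and the information of a state is log2 of the reciprocal of its probability. A minimal sketch in Python (the function names are illustrative, not from the original):

```python
import math

def num_states(n: int) -> int:
    """Number of possible joint states of n two-state relays: 2^n."""
    return 2 ** n

def information_bits(probability: float) -> float:
    """Information of a state, as log base 2 of the reciprocal of its probability."""
    return math.log2(1 / probability)

# One relay: two states, a priori probability 1/2 for each -> 1 bit.
print(num_states(1), information_bits(1 / 2))    # 2 states, 1.0 bit

# Four relays: sixteen states, probability 1/16 for a given joint state -> 4 bits.
print(num_states(4), information_bits(1 / 16))   # 16 states, 4.0 bits
```

This reproduces the progression in the quote: two relays give four states (2 bits), three give eight (3 bits), and so on, with information growing as the logarithm of the count of equiprobable states.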
Reproduction in whole or in part without permission is prohibited.   Copyright 1999-2024 ThinkLoud®