# Information Theory

![rw-book-cover](https://images-na.ssl-images-amazon.com/images/I/51V2P3DvA5L._SL200_.jpg)

## Metadata
- Author: [[James V Stone]]
- Full Title: Information Theory
- Category: #books

## Highlights
- Thus, any form of inequality in the frequency of your words, or any form of redundancy in your words, means that you would generate information at a rate which is less than your channel capacity. ([Location 27965](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=27965))
  - Tags: [[pink]]
- It is almost as if Morse had recognised, long before Shannon’s formal proofs existed, that the number of possible messages far exceeds the number of messages actually used, and that this is the key to efficient communication. ([Location 37576](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=37576))
  - Tags: [[pink]]
- Shannon’s information entropy is a measure of information, whereas thermodynamic entropy is a measure of the number of states a physical system (like a jar of gas) can adopt. These two different conceptualisations of entropy do not seem to be obviously related. But they are, and the relationship between them matters because thermodynamic entropy can be used to measure the energy cost of Shannon’s information entropy. If this were not true then it would be possible to use a hypothetical being, known as Maxwell’s demon, to run power stations on pure information. ([Location 81703](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=81703))
  - Tags: [[pink]] [[favorite]]
- So we now have to talk about what we mean by disorder and what we mean by order.... Suppose we divide the space into little volume elements. If we have black and white molecules, how many ways could we distribute them among the volume elements so that white is on one side and black is on the other? On the other hand, how many ways could we distribute them with no restriction on which goes where? Clearly, there are many more ways to arrange them in the latter case. We measure “disorder” by the number of ways that the insides can be arranged, so that from the outside it looks the same. The logarithm of that number of ways is the entropy. The number of ways in the separated case is less, so the entropy is less, or the “disorder” is less. Feynman R, 1964 15. ([Location 82139](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=82139))
  - Tags: [[pink]] [[favorite]]
- Evidence for the existence of the Landauer limit has been obtained by Bérut et al (2012) 7. Key point. No matter how efficient any physical device is (e.g. a computer or a brain), it can acquire one bit of information only if it expends at least 0.693kT joules of energy. ([Location 85198](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=85198))
  - Tags: [[pink]] [[favorite]]
- In order to know when to open and shut the door, the demon must use information about the speed and direction of each molecule. However, we already know that each bit of information cannot cost less than the Landauer limit of 0.693kT joules/bit (Equation 8.16). ([Location 86073](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=86073))
  - Tags: [[favorite]] [[pink]]
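The 0.693kT figure in the two highlights above is just kT ln 2 (ln 2 ≈ 0.693). A minimal Python sketch of the arithmetic, assuming room temperature; the choice of T = 300 K is my own illustration, not a figure from the book:

```python
import math

# Landauer limit: acquiring or erasing one bit costs at least k*T*ln(2) joules.
k = 1.380649e-23            # Boltzmann constant, J/K
T = 300.0                   # assumed room temperature, K (illustrative choice)

energy_per_bit = k * T * math.log(2)   # the "0.693 kT" in the highlight

print(f"ln 2 = {math.log(2):.3f}")
print(f"Landauer limit at {T:.0f} K: {energy_per_bit:.3e} J/bit")
# ~2.9e-21 J per bit: tiny, but non-zero, which is what prevents Maxwell's
# demon from turning pure information into net energy.
```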
- In other words, there is no net gain to be had from using information to accumulate fast-moving molecules in one half of a container, and using the resultant temperature difference to generate electricity. This is important because it provides a fundamental link between the notion of Shannon’s information entropy, as defined in information theory, and thermodynamic entropy, as defined in physics. Indeed, within three years of the publication of Shannon’s theory, Gabor 18 declared: We cannot get anything for nothing, not even an observation. Gabor D, 1951. ([Location 86509](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=86509))
  - Tags: [[pink]]
- Most remarkable of all is that information has a definite lowest cost which can be measured in joules per bit. More than any other, this fact establishes the existence of a fundamental link between Shannon’s information entropy and Boltzmann–Gibbs’ thermodynamic entropy, between information and the disorderly arrangement of molecules in a jar of gas. ([Location 87382](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=87382))
  - Tags: [[pink]]
- The standard set of methods used to remove spatial and temporal redundancy are collectively called MPEG (Moving Picture Expert Group). Whilst these methods are quite complex, they all rely heavily on a core method called the cosine transform. In essence, this decomposes the data into image features of different sizes. ([Location 88258](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=88258))
  - Tags: [[pink]]
- Discarding certain spatial and temporal frequencies and recoding intensity/colour data means that data recorded at a rate of 1,500 million binary digits/s can be compressed and transmitted through a communication satellite channel with a capacity of 19.2 million bits/s, and then decoded to present you (the TV viewer) with 1,500 million binary digits of data per second (with some loss of information but no visible loss of quality). This represents an effective compression factor of about 78 (≈ 1500/19.2), so it looks as if we can communicate 78 times more data than the channel capacity would suggest. ([Location 88694](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=88694))
  - Tags: [[pink]]
- According to the analysis summarised above, we now have a candidate for one of these laws: between successive generations, the collective ([Location 90879](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=90879))
  - Tags: [[pink]]
- The grand Question which every naturalist ought to have before him when dissecting a whale or classifying a mite, a fungus or an infusorian is “What are the Laws of Life?”. Darwin C, B Notebook, 1837. ([Location 90879](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=90879))
  - Tags: [[pink]]
- genome of a species should maximise the Shannon information acquired about its environment for each joule of expended energy. This general idea of efficient evolution can be considered to be an extension of the efficient coding hypothesis normally applied to brains. ([Location 91314](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=91314))
  - Tags: [[pink]]
- This general approach has been championed by Horace Barlow, who has been largely responsible for the genesis of the resultant efficient coding hypothesis 5. Put simply, this states that data acquired by the eye should be encoded as efficiently as possible before being communicated to the brain. ([Location 92190](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=92190))
  - Tags: [[pink]]
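The MPEG highlight above attributes the heavy lifting to the cosine transform, which decomposes a signal into components of different sizes (frequencies) so that the small high-frequency ones can be discarded. A minimal 1-D sketch in Python, with sample values and threshold of my own choosing; real codecs work on 2-D blocks and add quantisation, motion compensation and entropy coding:

```python
import numpy as np
from scipy.fft import dct, idct

# An 8-sample block of pixel intensities (values chosen for illustration).
block = np.array([52, 55, 61, 66, 70, 61, 64, 73], dtype=float)

# Decompose the block into cosine components of increasing spatial frequency.
coeffs = dct(block, norm="ortho")

# Lossy step: discard small high-frequency coefficients.
threshold = 10.0
kept_coeffs = np.where(np.abs(coeffs) >= threshold, coeffs, 0.0)

# Reconstruct an approximation of the block from the surviving coefficients.
approx = idct(kept_coeffs, norm="ortho")

print(f"kept {np.count_nonzero(kept_coeffs)} of {block.size} coefficients")
print("original:     ", block)
print("reconstructed:", np.round(approx, 1))
```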
- Using data collected from mechanical receptors in the cricket, Warland et al (1992) found that neurons have an entropy of about 600 bits/s. However, it was found that only about half of this entropy is related to the neuron’s input, and the rest is noise. These neurons therefore transmit information about their inputs at a rate of about 300 bits/s, which represents a coding efficiency of about 0.5 (i.e. 300/600). ([Location 93500](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=93500))
  - Tags: [[pink]]
- Laughlin’s experiment represents one of the first tests of an information-theoretic optimality principle within the brain (i.e. the efficient coding hypothesis). This general approach has been vindicated in tests on other organisms (including humans 38;51;55) and on other sense modalities (e.g. olfaction 29 and audition 42). Indeed, Karl Friston’s free-energy theory 17;47 assumes that an organising principle for all behaviour consists of minimising the sum total of all future surprises (i.e. sensory entropy). ([Location 95247](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=95247))
  - Tags: [[pink]]
- The experiments and analyses described above suggest that the brain’s ability to process information is about as efficient as it possibly can be. More importantly, information-theoretic analyses of such experiments have led to the general conclusion that, within sensory systems: ... information rates are very large, close to the physical limits imposed by the spike train entropy. Rieke et al, 1997 43. Without information theory, we would have no way of telling how well neurons perform, because we would have little idea of what it means to measure neuronal information processing performance in absolute terms. And so we would not be able to tell that the answer to the question, “are brains good at processing information?” is yes. More importantly, we could not know that, within the constraints imposed by their physical structure, brains operate close to the limits defined by Shannon’s mathematical theory of communication. ([Location 96558](https://readwise.io/to_kindle?action=open&asin=B07CBM8B3B&location=96558))
  - Tags: [[pink]]
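The Warland et al figures above follow the usual decomposition: the information a spike train carries about its input is its total entropy minus its noise entropy. A small sketch plugging in the quoted numbers (not the original analysis):

```python
# Figures quoted in the Warland et al (1992) highlight above.
total_entropy_rate = 600.0    # bits/s: entropy of the spike train
noise_entropy_rate = 300.0    # bits/s: entropy unrelated to the input ("noise")

# Information about the input = total entropy - noise entropy.
info_rate = total_entropy_rate - noise_entropy_rate
coding_efficiency = info_rate / total_entropy_rate

print(f"information rate:  {info_rate:.0f} bits/s")   # 300 bits/s
print(f"coding efficiency: {coding_efficiency:.2f}")  # 0.50
```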