
The Information Age

The Information Age, so familiar today, was in fact shaped by the ideas and discoveries made after World War II by scientists from all around the world. The tools developed by Shannon in A Mathematical Theory of Communication [106] were so powerful that, multiple decades later, they are still used in many fields. This section presents three different applications of information theory: wireless communication, space exploration and neuroscience.


1. Communication Technology

Today’s communication technology is heavily based on [106]. Shannon provided crisp upper bounds on the most efficient way of communicating information, regardless of the content of the message or the channel used, as explained here.

        The review paper [54] proposes a nice illustration of the impact of information theory on communication technology. The authors focus on the heritage of the capacity formula and the chain rule.


Channel capacity formula 

The channel capacity formula (see section 3.3 for more details) can be adapted to this context. For a band-limited signal, the following expression is used:

C = B_RF · log2(1 + S / (N0 · B_RF)),

where C denotes the capacity, S / (N0 · B_RF) the signal-to-noise ratio (SNR), B_RF the single-sided bandwidth (1) and N0 the one-sided noise power density.
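As a quick numerical illustration, the band-limited capacity formula is easy to evaluate directly. The sketch below uses arbitrary example values for the bandwidth, noise density and signal power (none of them come from [54]):

```python
import math

def capacity_bandlimited(S, N0, B_RF):
    """Shannon capacity (bit/s) of a band-limited AWGN channel:
    C = B_RF * log2(1 + S / (N0 * B_RF))."""
    return B_RF * math.log2(1 + S / (N0 * B_RF))

# Arbitrary example values (not from [54]).
B_RF = 1e6            # single-sided bandwidth, Hz
N0 = 1e-9             # one-sided noise power density, W/Hz
S = 15 * N0 * B_RF    # signal power chosen so the SNR is 15

C = capacity_bandlimited(S, N0, B_RF)
print(f"SNR = {S / (N0 * B_RF):.0f}, capacity = {C / 1e6:.2f} Mbit/s")
```

With an SNR of 15 the capacity is exactly B_RF · log2(16) = 4 Mbit/s; any rate above that frontier is impossible, which is precisely the Shannon bound discussed below.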

      The above formula is really helpful because it allows one to determine the power-bandwidth plane for a signal and the Shannon bound. Engineers became more efficient in their work because they knew the frontier between what is achievable and what is impossible. This bound is drawn in Figure 24.

Fig. 24 Digital power-bandwidth plane, featuring various modulation types, and the Shannon bound.

Chain rule

The chain rule is really important in modern wireless communications. The chain rule is a theorem that decomposes the joint entropy of random variables into a sum of conditional entropies:

H(X1, X2, ..., Xn) = H(X1) + H(X2|X1) + ... + H(Xn|X1, ..., Xn−1).

This allows one to process the vectors in an optimal way, as a “chain”.
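The decomposition can be checked numerically on a toy joint distribution; the probabilities below are arbitrary and serve only to illustrate the identity:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Arbitrary joint distribution of two binary variables X1, X2.
p_joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginal distribution of X1.
p_x1 = {x1: sum(p for (a, _), p in p_joint.items() if a == x1) for x1 in (0, 1)}

# Conditional entropy H(X2|X1) = sum over x1 of p(x1) * H(X2 | X1 = x1).
h_x2_given_x1 = sum(
    p_x1[x1] * entropy({x2: p_joint[(x1, x2)] / p_x1[x1] for x2 in (0, 1)})
    for x1 in (0, 1)
)

# Chain rule: H(X1, X2) = H(X1) + H(X2|X1).
lhs = entropy(p_joint)
rhs = entropy(p_x1) + h_x2_given_x1
print(f"H(X1,X2) = {lhs:.4f}  vs  H(X1) + H(X2|X1) = {rhs:.4f}")
```

The two sides agree to machine precision, which is exactly what lets a receiver decode a vector of symbols one component at a time without losing optimality.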

Thanks to the chain rule, it is possible to process a Multiple Access Channel (MAC) optimally. The idea is that each source n is a source of noise for the source n−1 as illustrated in Figure 25.

Fig. 25 Schema of a Multiple Access Channel as presented by Huber and Fischer [54]

But with Successive Interference Cancellation (SIC), engineers are able to determine the best rates at which multiple sources can communicate information efficiently through the channel.
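A minimal sketch of the SIC idea for a two-user Gaussian multiple access channel (the powers and noise level below are arbitrary assumptions, not values from [54]): decode one user while treating the other as noise, subtract its contribution, then decode the second user against the noise alone.

```python
import math

def rate(signal, interference_plus_noise):
    """Achievable rate (bit/channel use) when the interference is treated
    as additional Gaussian noise: 1/2 * log2(1 + SINR)."""
    return 0.5 * math.log2(1 + signal / interference_plus_noise)

P1, P2, N = 4.0, 2.0, 1.0  # arbitrary user powers and noise power

# Without SIC: each user is decoded while treating the other as noise.
r1_plain = rate(P1, P2 + N)
r2_plain = rate(P2, P1 + N)

# With SIC: decode user 1 first (user 2 acts as noise), cancel it,
# then decode user 2 against the noise alone.
r1_sic = rate(P1, P2 + N)
r2_sic = rate(P2, N)

print(f"sum rate without SIC: {r1_plain + r2_plain:.3f} bit/use")
print(f"sum rate with SIC:    {r1_sic + r2_sic:.3f} bit/use")
```

With SIC the sum rate equals 1/2 · log2(1 + (P1 + P2)/N), the sum capacity of the channel, whereas treating all interference as noise falls strictly short of it.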


By applying this tool, Multiple-Input/Multiple-Output (MIMO) channels are created. MIMO is essential for current wireless communication standards such as Wi-Fi, 3G and 4G.

Deep-space communication

Massey's paper [71] gives an interesting insight into the achievements made possible by information theory. The paper focuses on two space missions of the 1960s, Pioneer 9 and Mariner '69, which used and proved channel codes in their applications. Massey outlines the different channel-coding approach taken by each of the missions. Both cases are well suited to channel coding and brought evidence for Shannon's theory. This is due to the fact that the deep-space channel is accurately described by the additive white Gaussian noise (AWGN) channel proposed by Shannon, but also that binary transmission can be used effectively in deep-space communications thanks to the available bandwidth, as presented here. When Binary Phase Shift Keying (BPSK) is used, as in both missions, the channel becomes a discrete-time AWGN channel. This results in the following channel capacity formula, elaborated by Fano [40]:

C = 1/2 · log2(1 + E/N),

where N is the average power of the white Gaussian noise and E is the average energy of the waveforms for a specific transmitted symbol. It is then shown that no loss is to be expected for energy-efficient operation (approximately 0 < E/N ≤ 1/2). In contrast, the error probability for an uncoded signal is prohibitively high, as found by Wozencraft and Jacobs [61].
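The contrast can be made concrete with a small sketch. Using textbook formulas (not code from [71]), and taking N as the noise power per sample as in the text above, the capacity is C = 1/2 · log2(1 + E/N) while uncoded BPSK has a bit-error probability Pe = Q(√(E/N)):

```python
import math

def awgn_capacity(e_over_n):
    """Capacity (bit/channel use) of the discrete-time AWGN channel,
    C = 1/2 * log2(1 + E/N), with N the noise power per sample."""
    return 0.5 * math.log2(1 + e_over_n)

def bpsk_uncoded_pe(e_over_n):
    """Bit-error probability of uncoded BPSK, Pe = Q(sqrt(E/N)),
    under the same convention (N = noise power per sample)."""
    return 0.5 * math.erfc(math.sqrt(e_over_n / 2))

for e_over_n in (0.1, 0.5, 1.0):
    print(f"E/N = {e_over_n:>4}: C = {awgn_capacity(e_over_n):.3f} bit/use, "
          f"uncoded Pe = {bpsk_uncoded_pe(e_over_n):.3f}")
```

In the energy-efficient regime E/N ≤ 1/2 the capacity stays well above zero while the uncoded error probability remains on the order of 10⁻¹, far too high for a space link; only channel coding can close that gap.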

Thanks to information theory, it was therefore well known at the time of development of deep-space probes that a reliable and energy-efficient binary modulation relies strongly on channel coding.

For this reason, NASA implemented channel coding on the two space probes Pioneer 9 and Mariner '69. In addition, a newly developed decoding system, sequential decoding with the Fano algorithm, was implemented on Mariner '69. These two missions proved that the combination of BPSK and sequential decoding is successful. Further developments followed with Viterbi decoding, as used in the famous Voyager missions. The field became so vast that an international standard for coding techniques was established by the Consultative Committee for Space Data Systems (CCSDS).

 

In a way, it is thanks to Shannon's work that probes can be sent over long distances into space while we remain able to communicate with them.

2. Information theory beyond communication technology

Information theory also found application in a more surprising field: neuroscience. Neuroscience is the study of the brain and the nervous system and their interactions with other physiological systems. As illustrated in [120], information theory is used nowadays to analyze the way information is stored, processed and transmitted within the nervous system.


What is the relation between Shannon and the brain?

Entropy is easily adapted to neuroscience. The nervous system can be seen as a system in which information is stored, processed and transmitted. The neuron is a basic biological information-processing element, with the axon being the transmission channel.

       The following simple example helps to understand the concept in this field. Consider the recording of a neural activity R in response to a stimulus S. H(R) is the entropy of the neural response and H(S) is the entropy of the stimulus. H(R) is calculated by estimating the probability distribution of some feature of R. Since the stimulus is generated by the experimenter, H(S) is known. The experimenter wants to know how much information about the stimulus is contained in the neural response, which involves the conditional entropy H(R|S).

        Shannon already provided the solution to this problem for a source X and a receiver Y (see here for preliminary explanations): the mutual information is expressed in terms of entropies by:

I(X;Y) = H(X) − H(X|Y) = H(Y) − H(Y|X).

For Shannon, I(X;Y) is interpreted as the rate of transmission across the channel. From a neuroscience point of view, the neuron is seen as a communication channel, the source corresponds to the stimuli and the receiver corresponds to the neural response. Therefore, I(R;S) is the amount of information about the stimulus transmitted through the neuron to the response.
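In practice, I(R;S) is estimated from empirical probabilities. The toy sketch below uses fabricated stimulus/response counts (not data from [120]) to estimate the mutual information between a binary stimulus and a binarized firing-rate response:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(S;R) in bits from (stimulus, response) samples."""
    n = len(pairs)
    c_joint = Counter(pairs)             # joint counts of (stimulus, response)
    c_s = Counter(s for s, _ in pairs)   # stimulus counts
    c_r = Counter(r for _, r in pairs)   # response counts
    return sum(
        (c / n) * math.log2(c * n / (c_s[s] * c_r[r]))
        for (s, r), c in c_joint.items()
    )

# Fabricated toy data: stimulus 'A' mostly evokes a high firing rate, 'B' a low one.
samples = ([("A", "high")] * 40 + [("A", "low")] * 10 +
           [("B", "high")] * 10 + [("B", "low")] * 40)

print(f"I(R;S) ≈ {mutual_information(samples):.3f} bits")
```

The estimate is strictly positive (about 0.28 bits here) because the response is statistically informative about which stimulus was presented; if the response were independent of the stimulus, the estimate would be zero.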

 

        J. Waxman presents different applications of Shannon's tools in neuroscience in [120]. Two levels are explored: the level of the neuron and the level of the brain.


At the level of the neuron - study of visual information encoding in a primate

The paper [19] explores such ideas in the study of visual information encoding in a primate. The researchers study the encoding rate of the neuron for time-varying stimuli and for constant-velocity stimuli. Two approaches are used in this experiment. The first approach is a direct measure of the mutual information between the stimulus and the response (shown in Figure 26). The second approach is the reconstruction of the stimulus waveform from the neuronal recording (shown in Figure 27).

Fig. 26 The direct method to compute the mutual information between the stimulus and the response. The first stimulus is time-varying while the second is at a constant velocity [19].

Fig. 27 The reconstruction method is used to obtain the mutual information between the stimulus and the stimulus estimate.

From these results, the researchers conclude that constant-velocity stimuli demand a lower encoding rate than time-varying stimuli.


At the level of the brain - information encoding in schizophrenic versus control patients

At a higher level, direct access to the individual information channel is lost. Instead, the synchronous activity of huge populations of neurons can be observed. The neural activity can only be recorded with noninvasive techniques: electroencephalography (EEG) and functional Magnetic Resonance Imaging (fMRI). In the experiment presented in Waxman's paper, EEG is used to show how information is transmitted differently in schizophrenic patients and in control patients. The cross-mutual information (CMI) is computed between 16 EEG signals from each participant. The results are given in Figures 28 and 29. A clear distinction is easily noticed between control patients and schizophrenic ones.


Fig. 28 The average CMI across 16 EEG signals for a control patient [78]

Fig. 29 The average CMI across 16 EEG signals for a schizophrenic patient [78]

 


Conclusion on the impact of information theory in neuroscience


Waxman has collected in his paper many other studies that explore the nervous system by applying information theory.

        It is remarkable that information theory finds use in a domain like medical science. However, Waxman warns scientists against using Shannon's tools in every context without considering whether the theory is adequate for the matter. This was also discussed by Shannon himself in his Bandwagon paper [105].


3. Limitations of Information Theory
The Bandwagon

Soon after the publication of Shannon's paper, information theory became extremely fashionable, even in non-engineering fields such as neurophysiology:

”... birds clearly have the problem of communicating in the presence of noise ... an examination of birdsong on the basis of information theory might ... suggest new types of field experiment and analysis ...”. [10]

 Researchers tried to apply Shannon's theory in their own field despite the weak relationship between their application and the communication field. ”Information Theory” and ”Cybernetics” became hype words to attract funding. Everything suddenly turned ”digital”.

 

        In 1956, the father of the Digital Age himself wrote a gentle caveat in a paper called The Bandwagon [105]. He expressed his displeasure about the publicity and the turn information theory had taken. He says it ”has perhaps been ballooned to an importance beyond its actual accomplishments”. He recalls that his theory is a tool for communication problems. Nevertheless, he believes that his concepts can be extended to other domains, but wants to inject a ”note of moderation” into the situation.

        Firstly, he emphasizes the origin and the assumptions of his breakthroughs. A deep understanding of the mathematics and their implications for communication, combined with experimental verification, is the right way to begin using the tools, rather than applying them in a random manner. Then, Shannon denounces the exaggerated publicity about information theory, which ”... has certainly been sold, if not oversold” [105]. He underlines that science is not a business and advises favouring research and a critical mind rather than publishing many poor, half-finished papers.

From the bandwagon until now, the successes of Shannon's theory have been remarkable in technical fields related to communication. For example, the deep-space communication presented here is achievable thanks to mathematical tools such as entropy and channel capacity, and especially thanks to the transition from analog to digital. The translation of continuous waves into 0's and 1's makes it possible to communicate over long distances in noisy conditions. Nevertheless, Shannon's theory is still used nowadays in 'non-engineering' fields.


The caveat of the bandwagon still applies

 

The previous part illustrates the application of information theory to neuroscience [120]. The two fields are combined to try to answer many different questions at different levels of abstraction. Shannon's formulas have been extended by considering the neuron as a basic biological information-processing element.


        At the level of the neuron, researchers study for example the rate of encoding by a visual neuron of a primate for constant-velocity stimuli compared to time-varying stimuli in [19]. The stimulus and the neural response are respectively seen as the source and the receiver in Shannon's theory. The entropy, which is the foundation of the study, relies on the computation of probabilities. The visual information is assumed in [19] to be encoded in neuron firing rates. But others have hypothesized that information is encoded by the precise timing of spikes (2).

        A pitfall of these studies is that the modelling assumptions about the ”information channel” are often questionable. So far, none of the studies in [120] gives a precise answer about how much of the encoded information is actually used by the nervous system, or in what manner. As a consequence, in the experiment on the primate, the interpretation drawn for the rate of encoding for different types of stimuli could be correct as well as completely wrong. It is often tempting to push conclusions and interpretations beyond what the information-theoretic measures actually express.

        At the level of the brain, access to the individual information channel is lost. Instead, the synchronous activity of a population of neurons is observed. As explained here, information-theoretic tools have helped to distinguish patients suffering from a disease from a control population. But the information metrics are used completely outside the context in which they were initially formulated.

 

        Information theory has clearly proved to be a powerful tool and framework for studying various problems. But as Shannon argues in The Bandwagon, researchers have to be extremely careful and must not draw unwarranted conclusions. Since neuroscience is a fashionable field at the moment, the number of publications increases continuously. In the rush to publish new articles, scientists may use the mathematical tools while forgetting the biological context; their papers then appear innovative but do not correspond to reality. Shannon's advice is still valid: a critical mind is essential for new breakthroughs.


Footnotes
  1. Single-sided (one-sided) bandwidth and power density refer to the convention of measuring a signal's spectrum over positive frequencies only.


  2. The comparison between firing rates and timing of spikes is discussed further here.
