Integrated Information Theory


Integrated Information Theory (IIT) is a theory that attempts to measure consciousness quantitatively. It was developed by psychiatrist and neuroscientist Giulio Tononi of the University of Wisconsin–Madison.[1]


Schematic diagram of how to decompose systems into overlapping complexes according to Tononi's Information Integration Theory

The theory is based on two key observations. The first is that every observable conscious state contains a massive amount of information. A common example is a frame from a movie. Upon seeing a single frame of a movie you have watched, you instantly associate it with a "specific conscious percept."[2] That is to say, you can discriminate a single frame of a film from any other single frame, including a blank, black screen. The mind can therefore discriminate among a massive number of possible visual states, which amounts to a tremendous quantity of represented information. Compare our visual awareness with a simple photodiode, which can only discriminate the presence of light from dark. Whether the light comes from a lightbulb, a scene from Ben Hur, or the bright noon sun of a summer day, the photodiode represents only minimal information. The hypothesis, then, is that the amount of consciousness an entity has is equal to the amount of information processing it contains. This brings us to the second key observation of the theory.

All of the information you have gleaned from conscious states is highly, and innately, integrated into your mind. It is impossible for you to see the world apart from all of the information that you are conscious of. When you are looking at an orange, for example, you cannot separate the color of the fruit (orange) from its shape (round). Consciousness is "integrated"; even though color processing and spatial processing are separately localized in the brain (a stroke victim can lose color perception yet maintain perfect spatial awareness, for example) conscious experiences cannot be atomized into distinct parts.

Tononi's initial ideas were further developed by Adam Barrett, who created similar measures of integrated information, such as "phi empirical".[3]

Definition of Consciousness

In this theory, consciousness arises as a property of a physical system: its 'integrated information'. Integrated information is a precisely defined quantity that can be measured as follows:


Given a system (including its current probability distribution) and a mechanism (which specifies the probability distribution over possible next states when the current state is perturbed with all possible inputs), one can determine the actual distribution: the possible system states at time t = −1. The system and mechanism thus constitute information about the system's previous state, in the classic sense of a 'reduction of uncertainty.'

Relative Entropy/Effective Information

Effective information is the relative entropy H (Kullback–Leibler divergence) between the actual and the potential repertoires:

$$ei(X(\mathrm{mech}, x_1)) = H\big[\,p(X_0(\mathrm{mech}, x_1)) \,\big\|\, p(X_0(\mathrm{maxH}))\,\big]$$

Effective information is implicitly specified by the mechanism and the state, so it is an 'intrinsic' property of the system. One can calculate the actual repertoire of past states by perturbing the system in all possible ways to obtain the forward repertoire of output states, and then applying Bayes' rule.


System of two Binary elements - Four possible states (00, 01, 10, 11)

The first binary element operates randomly; the second binary element takes on whatever value the first element had in the previous state.

- Initially: (0, 0); the potential (maximum-entropy) distribution is p = (1/4, 1/4, 1/4, 1/4).
- Given that the state at time t is 11, the previous state must have been 10 or 11: p = (0, 0, 1/2, 1/2).
- One bit of information has therefore been generated:

$$ei(X(\mathrm{mech}, x_1)) = H\big[\,p(X_0(\mathrm{mech}, x_1)) \,\big\|\, p(X_0(\mathrm{maxH}))\,\big] = 1 \text{ bit}$$

where X is our system, mech is that system's mechanism, $x_1$ is a state of the system, and $p(X_0(\mathrm{maxH}))$ is the uniform or potential distribution.
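The worked example above can be checked numerically. The following sketch (function names are my own) encodes the mechanism's transition probabilities, applies Bayes' rule starting from the uniform prior, and computes the relative entropy:

```python
import itertools
import math

# States of the two-element system: (00, 01, 10, 11).
STATES = list(itertools.product([0, 1], repeat=2))

def transition_prob(prev, cur):
    """P(cur | prev) under the example mechanism: the first element is
    set randomly, the second copies the first element's previous value."""
    a_prev, _ = prev
    a_cur, b_cur = cur
    if b_cur != a_prev:   # second element must equal previous first element
        return 0.0
    return 0.5            # first element is 0 or 1 with equal probability

def actual_repertoire(cur):
    """Posterior over previous states given the current state, starting
    from the maximum-entropy (uniform) prior -- Bayes' rule."""
    prior = 1.0 / len(STATES)
    unnorm = [transition_prob(prev, cur) * prior for prev in STATES]
    total = sum(unnorm)
    return [u / total for u in unnorm]

def kl_divergence(p, q):
    """Relative entropy H[p || q] in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

posterior = actual_repertoire((1, 1))   # current state is 11
uniform = [0.25] * 4                    # potential repertoire
ei = kl_divergence(posterior, uniform)  # effective information
print(posterior, ei)                    # [0.0, 0.0, 0.5, 0.5] 1.0
```

The posterior recovers p = (0, 0, 1/2, 1/2) and the effective information is exactly one bit, matching the example.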

Integration (φ)


$$\varphi(X(\mathrm{mech}, x_1)) = H\Big[\,p(X_0(\mathrm{mech}, x_1)) \,\Big\|\, \prod_k p\big(M_0^k(\mathrm{mech}, \mu_1^k)\big)\,\Big]$$

where X is our system, mech is that system's mechanism, $x_1$ is a state of the system, and $\prod_k p(M_0^k(\mathrm{mech}, \mu_1^k))$ is the product of the probability distributions of each part of the system under the minimal information partition.

It is clear, then, that φ will be high when a lot of information is generated among the parts of a system as opposed to within them.
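To make this concrete, here is a deliberately simplified sketch: it measures φ as the minimum, over bipartitions, of the KL divergence between the joint actual repertoire and the product of the parts' marginals. This is a multi-information-style illustration of "information generated among the parts", not the full IIT definition (which perturbs each part independently and normalizes by the minimal information partition):

```python
import itertools
import math

def kl_divergence(p, q):
    """Relative entropy H[p || q] in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def marginal(joint, part):
    """Marginal distribution of the elements in `part` (tuple of indices),
    given a joint distribution over tuples of binary element states."""
    out = {}
    for state, p in joint.items():
        key = tuple(state[i] for i in part)
        out[key] = out.get(key, 0.0) + p
    return out

def phi_simplified(joint, n):
    """Minimum over bipartitions of KL(joint || product of part marginals).
    NOTE: an illustrative simplification, not the full IIT phi."""
    elements = range(n)
    best = float("inf")
    for r in range(1, n):
        for part_a in itertools.combinations(elements, r):
            part_b = tuple(i for i in elements if i not in part_a)
            ma, mb = marginal(joint, part_a), marginal(joint, part_b)
            full = list(itertools.product([0, 1], repeat=n))
            p = [joint.get(s, 0.0) for s in full]
            q = [ma.get(tuple(s[i] for i in part_a), 0.0)
                 * mb.get(tuple(s[i] for i in part_b), 0.0) for s in full]
            best = min(best, kl_divergence(p, q))
    return best

# Two perfectly correlated binary elements: states 00 and 11, each p = 1/2.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
print(phi_simplified(correlated, 2))   # 1.0: the parts cannot account for the whole

# Two independent elements: the product of the parts explains everything.
independent = {s: 0.25 for s in itertools.product([0, 1], repeat=2)}
print(phi_simplified(independent, 2))  # 0.0
```

The correlated system generates one bit among its parts (high φ in this toy measure); the independent system generates none.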


A complex is a set of elements that generates integrated information and is not fully contained in a larger set of higher φ.

This leads naturally to the notion of a main complex: the complex in a system that generates the largest amount of φ. Note that a main complex can partially contain complexes of lower φ within it.
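The two definitions above translate directly into set operations. In this sketch the φ values for the candidate subsets of a four-element system {A, B, C, D} are made-up numbers chosen purely for illustration:

```python
# Hypothetical phi values for candidate subsets (numbers are invented).
phi = {
    frozenset("AB"): 0.5,
    frozenset("CD"): 0.8,
    frozenset("ABC"): 0.3,
    frozenset("ABCD"): 0.6,
}

def complexes(phi):
    """A complex is a subset whose integrated information is not fully
    contained in a proper superset of higher phi."""
    return {s for s in phi
            if not any(s < t and phi[t] > phi[s] for t in phi)}

def main_complex(phi):
    """The main complex is the complex with the largest phi."""
    return max(complexes(phi), key=lambda s: phi[s])

print(sorted(main_complex(phi)))  # ['C', 'D']
```

Here {A, B} is absorbed by {A, B, C, D} (φ = 0.6 > 0.5), while {C, D} survives as a complex and, with φ = 0.8, is the main complex.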

Interpretations of different aspects of consciousness

Quality of consciousness

We begin by defining a multi-dimensional space called qualia space, or Q-space. This space has an axis for every possible state of the system. A point in this space has a component for every state; if we restrict the components to numbers from 0 to 1, we can view each component as the probability that the system is in that state. A point in Q-space thus represents a probability distribution. Using relative entropy again, we can measure the amount of information generated by a single connection c within the system with the following equation:

$$ei(c) = H\big[\,p_X \,\big\|\, p_Y\,\big]$$

where Y is the system with that connection removed. Thus there are points X and Y in Q-space that correspond to the probability distributions $p_X$ and $p_Y$ of the system with and without the connection c, respectively. We can then draw a vector from Y to X that has length $ei(c)$. This vector is associated with the connection c and is called a q-arrow. A q-arrow is therefore a representation of the informational relationship specified by a connection.
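A q-arrow's length is again just a relative entropy between two points in Q-space. The repertoires below are hypothetical numbers chosen for illustration, not derived from any particular system:

```python
import math

def kl_divergence(p, q):
    """Relative entropy H[p || q] in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical repertoires: p_X for the intact system, p_Y with one
# connection c removed (removing c washes out the distribution here).
p_X = [0.0, 0.0, 0.5, 0.5]
p_Y = [0.25, 0.25, 0.25, 0.25]

q_arrow_length = kl_divergence(p_X, p_Y)  # information generated by c
print(q_arrow_length)                     # 1.0
```

In this toy case the connection c generates one bit: its q-arrow from Y to X has length 1.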

Properties of q-arrows

Context dependency




References

  1. {{#invoke:Citation/CS1|citation |CitationClass=journal }}
  2. Template:Cite web
  3. Barrett, A.B., & Seth, A.K. (2011). Practical measures of integrated information for time-series data. PLoS Comput. Biol., 7(1): e1001052
