In [[probability theory]] and in particular in [[information theory]], '''total correlation''' (Watanabe 1960) is one of several generalizations of the [[mutual information]].  It is also known as the ''multivariate constraint'' (Garner 1962) or ''multiinformation'' (Studený & Vejnarová 1999).  It quantifies the redundancy or dependency among a set of ''n'' random variables.
 
==Definition==
For a given set of ''n'' [[random variable]]s <math>\{X_1,X_2,\ldots,X_n\}</math>, the total correlation <math>C(X_1,X_2,\ldots,X_n)</math> is defined as the [[Kullback–Leibler divergence]] from the joint distribution <math>p(X_1, \ldots, X_n)</math> to the product of marginal distributions <math>p(X_1)p(X_2)\cdots p(X_n)</math>,
:<math>C(X_1, X_2, \ldots, X_n) \equiv \operatorname{D_{KL}}\left[ p(X_1, \ldots, X_n) \| p(X_1)p(X_2)\cdots p(X_n)\right] \; .</math>
 
This divergence reduces to the simpler difference of entropies,
:<math>C(X_1,X_2,\ldots,X_n) = \left[\sum_{i=1}^n H(X_i)\right] - H(X_1, X_2, \ldots, X_n)</math>
where <math>H(X_{i})</math> is the [[information entropy]] of variable <math>X_i \,</math>, and <math>H(X_1,X_2,\ldots,X_n)</math> is the [[joint entropy]] of the variable set <math>\{X_1,X_2,\ldots,X_n\}</math>. In terms of the discrete probability distributions on variables <math>\{X_1, X_2, \ldots, X_n\} </math>, the total correlation is given by
 
:<math>C(X_1,X_2,\ldots,X_n)= \sum_{x_1\in\mathcal{X}_1} \sum_{x_2\in\mathcal{X}_2} \ldots \sum_{x_n\in\mathcal{X}_n} p(x_1,x_2,\ldots,x_n)\log\frac{p(x_1,x_2,\ldots,x_n)} {p(x_1)p(x_2)\cdots p(x_n)}.
</math>
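
The following sketch (not part of the original treatment) evaluates this expression for a joint distribution supplied as an ''n''-dimensional NumPy array of probabilities; the function name and the array representation are purely illustrative assumptions.

<syntaxhighlight lang="python">
# Illustrative sketch: total correlation (in bits) of a discrete joint
# distribution given as an n-dimensional NumPy array summing to 1.
import numpy as np

def total_correlation(joint):
    """C(X_1, ..., X_n) in bits for the joint probability array `joint`."""
    joint = np.asarray(joint, dtype=float)
    n = joint.ndim
    # Marginal of each X_i, obtained by summing out all the other axes.
    marginals = [joint.sum(axis=tuple(j for j in range(n) if j != i))
                 for i in range(n)]
    # Product of marginals p(x_1)p(x_2)...p(x_n) on the full grid.
    product = np.ones_like(joint)
    for i, m in enumerate(marginals):
        shape = [1] * n
        shape[i] = m.size
        product = product * m.reshape(shape)
    mask = joint > 0   # zero-probability cells contribute nothing to the sum
    return float(np.sum(joint[mask] * np.log2(joint[mask] / product[mask])))

# Example: two perfectly correlated fair bits share exactly 1 bit.
p = np.array([[0.5, 0.0],
              [0.0, 0.5]])
print(total_correlation(p))   # 1.0
</syntaxhighlight>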
 
The total correlation is the amount of information ''shared'' among the variables in the set. The sum <math>\begin{matrix}\sum_{i=1}^n H(X_i)\end{matrix}</math> represents the amount of information in [[bit]]s (assuming base-2 logs) that the variables would possess if they were totally independent of one another (non-redundant), or, equivalently, the average code length to transmit the values of all variables if each variable were (optimally) coded independently. The term <math>H(X_{1},X_{2},\ldots ,X_{n})</math> is the ''actual'' amount of information that the variable set contains, or equivalently, the average code length to transmit the values of all variables if the set of variables were (optimally) coded together. The difference between
these terms therefore represents the absolute redundancy (in bits) present in the given
set of variables, and thus provides a general quantitative measure of the
''structure'' or ''organization'' embodied in the set of variables
(Rothstein 1952).  The total correlation is also the [[Kullback&ndash;Leibler divergence]] between the actual distribution <math>p(X_1,X_2,\ldots,X_n)</math> and its maximum entropy product approximation <math>p(X_1)p(X_2)\cdots p(X_n)</math>.
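
As a quick numerical check of the identity between the Kullback–Leibler form and the entropy-difference form, the following illustrative snippet computes both for two perfectly correlated fair bits (the helper names are assumptions of the sketch, not part of any cited work).

<syntaxhighlight lang="python">
# Illustrative check: sum of marginal entropies minus joint entropy equals
# the Kullback–Leibler divergence computed directly from the definition.
import numpy as np

def entropy_bits(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

p = np.array([[0.5, 0.0],
              [0.0, 0.5]])          # two perfectly correlated fair bits
marginal_x = p.sum(axis=1)          # p(x)
marginal_y = p.sum(axis=0)          # p(y)
sum_of_entropies = entropy_bits(marginal_x) + entropy_bits(marginal_y)   # 2 bits
joint_entropy = entropy_bits(p)                                          # 1 bit
print(sum_of_entropies - joint_entropy)   # 1.0, same as the direct KL computation
</syntaxhighlight>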
 
Total correlation quantifies the amount of dependence among a group of variables. A near-zero total correlation indicates that the variables in the group are essentially statistically independent; they are completely unrelated, in the sense that knowing the value of one variable does not provide any clue as to the values of the other variables. On the other hand, the maximum total correlation, given by
 
:<math>C_\max = \sum_{i=1}^n H(X_i) - \max\limits_{i} H(X_i)</math>
 
occurs when one of the variables is completely redundant with ''all'' of the other variables. The variables are then maximally related in the sense that knowing the value of one variable provides complete information about the values of all the other variables, and the variables can be figuratively regarded as ''cogs,'' in which the position of one cog determines the positions of all the others (Rothstein 1952).
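
For instance, the bound is attained when three variables are exact copies of a single fair bit; the following illustrative calculation (helper names are assumptions of the sketch) shows that the total correlation then equals <math>C_\max</math>.

<syntaxhighlight lang="python">
# Illustrative sketch: three copies of one fair bit attain the maximum
# total correlation C_max = sum_i H(X_i) - max_i H(X_i).
import numpy as np

def entropy_bits(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Joint distribution of (X1, X2, X3) with X2 = X3 = X1 and X1 a fair coin:
# only (0,0,0) and (1,1,1) occur, each with probability 1/2.
joint = np.zeros((2, 2, 2))
joint[0, 0, 0] = joint[1, 1, 1] = 0.5

marginals = [joint.sum(axis=(1, 2)),
             joint.sum(axis=(0, 2)),
             joint.sum(axis=(0, 1))]
C = sum(entropy_bits(m) for m in marginals) - entropy_bits(joint)
C_max = sum(entropy_bits(m) for m in marginals) - max(entropy_bits(m) for m in marginals)
print(C, C_max)   # 2.0 2.0 -- the bound is attained
</syntaxhighlight>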
 
The total correlation counts up ''all'' the redundancies among a set of variables, but these redundancies may be distributed throughout the variable set in a variety of complicated ways (Garner 1962). For example, some variables in the set may be totally inter-redundant while others in the set are completely independent. Perhaps more significantly, redundancy may be carried in interactions of various degrees: a group of variables may not possess any pairwise redundancies, but may possess higher-order ''interaction'' redundancies of the kind exemplified by the parity function. The decomposition of total correlation into its constituent redundancies is explored in a number of sources (McGill 1954, Watanabe 1960, Garner 1962, Studený & Vejnarová 1999, Jakulin & Bratko 2003a, Jakulin & Bratko 2003b, Nemenman 2004, Margolin et al. 2010, Han 1978, Han 1980).
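
The parity example can be made concrete: with <math>X_1</math> and <math>X_2</math> independent fair bits and <math>X_3 = X_1 \oplus X_2</math>, every pair of variables is independent, yet the triple carries one bit of total correlation. The following sketch (illustrative only; not taken from the cited sources) verifies this numerically.

<syntaxhighlight lang="python">
# Illustrative sketch of the parity (XOR) example: pairwise independent
# variables with one bit of purely third-order redundancy.
import numpy as np
from itertools import product

def entropy_bits(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# X1, X2 independent fair bits; X3 = X1 XOR X2.
joint = np.zeros((2, 2, 2))
for x1, x2 in product((0, 1), repeat=2):
    joint[x1, x2, x1 ^ x2] = 0.25

marginals = [joint.sum(axis=(1, 2)),
             joint.sum(axis=(0, 2)),
             joint.sum(axis=(0, 1))]
total_corr = sum(entropy_bits(m) for m in marginals) - entropy_bits(joint)
print(total_corr)          # 1.0 bit in total

# Every pairwise marginal is uniform on {0,1}^2, so each pairwise mutual
# information is zero: the redundancy is carried only by the triple.
print(joint.sum(axis=2))   # [[0.25 0.25], [0.25 0.25]]
</syntaxhighlight>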
 
==Conditional total correlation==
Conditional total correlation is defined analogously to the total correlation, but with a condition added to each term: it is the Kullback–Leibler divergence between the conditional joint distribution and the product of the conditional marginal distributions,
:<math>C(X_1, X_2, \ldots, X_n|Y=y) \equiv \operatorname{D_{KL}}\left[ p(X_1, \ldots, X_n|Y=y) \| p(X_1|Y=y)p(X_2|Y=y)\cdots p(X_n|Y=y)\right] \; .</math>
 
As in the unconditional case, the conditional total correlation reduces to a difference of conditional entropies,
 
:<math>C(X_1,X_2,\ldots,X_n|Y=y) = \sum_{i=1}^n H(X_i|Y=y) - H(X_1, X_2, \ldots, X_n|Y=y)</math>
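
A minimal sketch of this computation, assuming the joint distribution over <math>(X_1,\ldots,X_n,Y)</math> is available as a NumPy array with <math>Y</math> on the last axis (the representation and helper names are assumptions of the example): conditioning on <math>Y=y</math> amounts to normalising the corresponding slice and applying the unconditional formula to it.

<syntaxhighlight lang="python">
# Illustrative sketch: conditional total correlation C(X_1,...,X_n | Y=y)
# from a joint probability array whose last axis indexes Y.
import numpy as np

def entropy_bits(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def conditional_total_correlation(joint_with_y, y):
    """C(X_1, ..., X_n | Y=y) in bits; Y is the last axis of `joint_with_y`."""
    cond = np.asarray(joint_with_y, dtype=float)[..., y]
    cond = cond / cond.sum()                      # p(x_1,...,x_n | Y=y)
    n = cond.ndim
    marginals = [cond.sum(axis=tuple(j for j in range(n) if j != i))
                 for i in range(n)]
    return sum(entropy_bits(m) for m in marginals) - entropy_bits(cond)

# Example: X1, X2 independent fair bits and Y = X1 XOR X2.  Given Y = 0 the
# two bits are copies of each other, so the conditional total correlation is 1.
joint = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        joint[x1, x2, x1 ^ x2] = 0.25
print(conditional_total_correlation(joint, 0))   # 1.0
</syntaxhighlight>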
 
==Uses of total correlation==
 
[[Cluster analysis|Clustering]] and [[feature selection]] algorithms based on total correlation have been explored by Watanabe. Alfonso et al. (2010) applied the concept of total correlation to the optimisation of water monitoring networks.
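
Such applications require estimating total correlation from observed data. One simple possibility is the plug-in estimate sketched below, which replaces the joint and marginal distributions with empirical frequencies; it is an illustration only, not the specific method used in the works cited above.

<syntaxhighlight lang="python">
# Illustrative plug-in estimator of total correlation (in bits) from a list
# of discrete observations; not the method of any particular cited paper.
from collections import Counter
from math import log2

def empirical_total_correlation(samples):
    """`samples` is a non-empty list of tuples (x_1, ..., x_n)."""
    n_vars = len(samples[0])
    n_obs = len(samples)
    joint = Counter(samples)
    marginals = [Counter(s[i] for s in samples) for i in range(n_vars)]
    c = 0.0
    for values, count in joint.items():
        p_joint = count / n_obs
        p_prod = 1.0
        for i, v in enumerate(values):
            p_prod *= marginals[i][v] / n_obs
        c += p_joint * log2(p_joint / p_prod)
    return c

# Example: 1000 paired readings where the second variable copies the first.
data = [(i % 2, i % 2) for i in range(1000)]
print(empirical_total_correlation(data))   # 1.0
</syntaxhighlight>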
 
==See also==
*[[Mutual information]]
*[[Dual total correlation]]
*[[Interaction information]]
*[[Multivariate mutual information]]
 
==References==
 
* Alfonso L, Lobbrecht A & Price R (2010). Optimization of water level monitoring network in polder systems using information theory, ''Water Resources Research'' '''46''', W12553, {{doi|10.1029/2009WR008953}}.
* Garner W R (1962). ''Uncertainty and Structure as Psychological Concepts'', John Wiley & Sons, New York.
* Han T S (1978). Nonnegative entropy measures of multivariate symmetric correlations, ''Information and Control'' '''36''', 133&ndash;156.
* Han T S (1980). Multiple mutual information and multiple interactions in frequency data, ''Information and Control'' '''46''', 26&ndash;45.
* Jakulin A & Bratko I (2003a). Analyzing attribute dependencies, in N Lavrač, D Gamberger, L Todorovski & H Blockeel, eds, ''Proceedings of the 7th European Conference on Principles and Practice of Knowledge Discovery in Databases'', Springer, Cavtat-Dubrovnik, Croatia, pp.&nbsp;229&ndash;240.
* Jakulin A & Bratko I (2003b). Quantifying and visualizing attribute interactions [http://arxiv.org/abs/cs/0308002v1].
* Margolin A, Wang K, Califano A, & Nemenman I (2010). Multivariate dependence and genetic networks inference. ''IET Syst Biol'' '''4''', 428.
* McGill W J (1954). Multivariate information transmission, ''Psychometrika'' '''19''', 97&ndash;116.
* Nemenman I (2004). Information theory, multivariate dependence, and genetic network inference [http://arxiv.org/abs/q-bio.QM/0406015].
* Rothstein J (1952). Organization and entropy, ''Journal of Applied Physics'' '''23''', 1281&ndash;1282.
* Studený M & Vejnarová J (1999). The multiinformation function as a tool for measuring stochastic dependence, in M I Jordan, ed., ''Learning in Graphical Models'', MIT Press, Cambridge, MA, pp.&nbsp;261&ndash;296.
* Watanabe S (1960). Information theoretical analysis of multivariate correlation, ''IBM Journal of Research and Development'' '''4''', 66&ndash;82.
 
[[Category:Information theory]]
[[Category:Probability theory]]
[[Category:Statistical dependence]]
