{{unreferenced|date=December 2010}}
In [[quantum information theory]], '''quantum mutual information''', or '''von Neumann mutual information''', after [[John von Neumann]], is a measure of correlation between subsystems of a quantum state. It is the quantum mechanical analog of the Shannon [[mutual information]].
== Motivation ==

For simplicity, it will be assumed that all objects in the article are finite-dimensional.
The definition of quantum mutual information is motivated by the classical case. For a joint probability distribution ''p''(''x'', ''y'') of two variables, the two marginal distributions are
:<math>p(x) = \sum_{y} p(x,y), \qquad p(y) = \sum_{x} p(x,y).</math>

The classical mutual information ''I''(''X'', ''Y'') is defined by

:<math>I(X,Y) = S(p(x)) + S(p(y)) - S(p(x,y)),</math>

where ''S''(''q'') denotes the [[Shannon entropy]] of the probability distribution ''q''.
One can calculate directly

:<math>S(p(x)) + S(p(y))</math>

:<math>= -\left(\sum_x p(x) \log p(x) + \sum_y p(y) \log p(y)\right)</math>

:<math>= -\left(\sum_x \Big(\sum_{y'} p(x,y')\Big) \log \sum_{y'} p(x,y') + \sum_y \Big(\sum_{x'} p(x',y)\Big) \log \sum_{x'} p(x',y)\right)</math>

:<math>= -\sum_{x,y} p(x,y) \left(\log \sum_{y'} p(x,y') + \log \sum_{x'} p(x',y)\right)</math>

:<math>= -\sum_{x,y} p(x,y) \log p(x) p(y).</math>

So the mutual information is

:<math>I(X,Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x) p(y)}.</math>
But this is precisely the [[relative entropy]] between ''p''(''x'', ''y'') and ''p''(''x'')''p''(''y''). In other words, if we assume the two variables ''x'' and ''y'' to be uncorrelated, mutual information is the ''discrepancy in uncertainty'' resulting from this (possibly erroneous) assumption.

It follows from the non-negativity of relative entropy that ''I''(''X'', ''Y'') ≥ 0, with equality if and only if ''p''(''x'', ''y'') = ''p''(''x'')''p''(''y'').
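The identity above can be checked numerically. The following sketch (using NumPy; the 2×2 joint distribution is an arbitrary illustration, not taken from the text) computes the mutual information both as the entropy difference ''S''(''p''(''x'')) + ''S''(''p''(''y'')) − ''S''(''p''(''x'', ''y'')) and as the relative entropy between ''p''(''x'', ''y'') and ''p''(''x'')''p''(''y''), and confirms they agree:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits, with the convention 0 log 0 = 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# An arbitrary 2x2 joint distribution p(x, y), chosen for illustration.
p_xy = np.array([[0.4, 0.1],
                 [0.2, 0.3]])

p_x = p_xy.sum(axis=1)  # marginal over y
p_y = p_xy.sum(axis=0)  # marginal over x

# I(X,Y) = S(p(x)) + S(p(y)) - S(p(x,y))
mi_entropy_form = (shannon_entropy(p_x) + shannon_entropy(p_y)
                   - shannon_entropy(p_xy.flatten()))

# I(X,Y) as the relative entropy between p(x,y) and p(x)p(y)
mi_kl_form = np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y)))

assert np.isclose(mi_entropy_form, mi_kl_form)
assert mi_entropy_form > 0  # x and y are correlated here
```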
== Definition ==

The quantum mechanical counterparts of classical probability distributions are [[density matrix|density matrices]].
Consider a composite quantum system whose state space is the tensor product

:<math>H = H_A \otimes H_B.</math>

Let ''ρ''<sup>''AB''</sup> be a density matrix acting on ''H''. The [[von Neumann entropy]] of ''ρ''<sup>''AB''</sup>, which is the quantum mechanical analog of the Shannon entropy, is given by

:<math>S(\rho^{AB}) = - \operatorname{Tr} \rho^{AB} \log \rho^{AB}.</math>
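In practice the von Neumann entropy is evaluated from the eigenvalues of the density matrix, since −Tr ''ρ'' log ''ρ'' = −Σ<sub>''i''</sub> ''λ''<sub>''i''</sub> log ''λ''<sub>''i''</sub> for the spectrum {''λ''<sub>''i''</sub>}. A minimal numerical sketch (NumPy; the two example states are standard illustrations, not from the text):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from eigenvalues, in bits."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]  # 0 log 0 = 0 by convention
    return -np.sum(eigvals * np.log2(eigvals))

# A pure state has zero entropy ...
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
# ... while the maximally mixed qubit state I/2 has one bit of entropy.
mixed = np.eye(2) / 2

assert np.isclose(von_neumann_entropy(pure), 0.0)
assert np.isclose(von_neumann_entropy(mixed), 1.0)
```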
For a probability distribution ''p''(''x'', ''y''), the marginal distributions are obtained by summing out the variable ''x'' or ''y''. The corresponding operation for density matrices is the [[partial trace]]. So one can assign to ''ρ''<sup>''AB''</sup> a state on the subsystem ''A'' by

:<math>\rho^A = \operatorname{Tr}_B \; \rho^{AB},</math>

where Tr<sub>''B''</sub> is the partial trace with respect to system ''B''. This is the '''reduced state''' of ''ρ''<sup>''AB''</sup> on system ''A''. The '''reduced von Neumann entropy''' of ''ρ''<sup>''AB''</sup> with respect to system ''A'' is

:<math>S(\rho^A).</math>

''S''(''ρ''<sup>''B''</sup>) is defined in the same way.
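The partial trace amounts to summing over one tensor factor's indices. The sketch below (NumPy; the Bell state is a standard example, not from the text) traces out system ''B'' from a two-qubit state and recovers the well-known fact that the reduced state of a maximally entangled pure state is maximally mixed:

```python
import numpy as np

def partial_trace_B(rho_AB, dim_A, dim_B):
    """Trace out subsystem B from a density matrix on H_A tensor H_B."""
    rho = rho_AB.reshape(dim_A, dim_B, dim_A, dim_B)
    return np.einsum('ijkj->ik', rho)  # sum over the two B indices

# Bell state |Phi+> = (|00> + |11>) / sqrt(2)
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(phi, phi)

rho_A = partial_trace_B(rho_AB, 2, 2)
# The reduced state of a Bell state is the maximally mixed state I/2.
assert np.allclose(rho_A, np.eye(2) / 2)
```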
''Technical Note:'' In mathematical language, passing from the classical to the quantum setting can be described as follows. The ''algebra of observables'' of a physical system is a [[C*-algebra]], and states are positive unital linear functionals on the algebra. Classical systems are described by commutative C*-algebras, so classical states are [[probability measure]]s. Quantum mechanical systems have non-commutative observable algebras; in concrete considerations, quantum states are density operators. If the probability measure ''μ'' is a state on a classical composite system consisting of two subsystems ''A'' and ''B'', we project ''μ'' onto system ''A'' to obtain the reduced state. As stated above, the quantum analog of this is the partial trace operation, which can be viewed as projection onto a tensor component. ''End of note''
It can now be seen that the appropriate definition of quantum mutual information should be

:<math>I(\rho^{AB}) = S(\rho^A) + S(\rho^B) - S(\rho^{AB}).</math>

Quantum mutual information can be interpreted the same way as in the classical case: it can be shown that

:<math>I(\rho^{AB}) = S(\rho^{AB} \| \rho^A \otimes \rho^B),</math>

where <math>S(\cdot \| \cdot)</math> denotes the [[quantum relative entropy]].
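Putting the pieces together, the definition can be evaluated end to end. For a maximally entangled two-qubit state the reduced entropies are each 1 bit while the joint entropy vanishes, so the quantum mutual information is 2 bits, twice the classical maximum for a pair of bits. A self-contained numerical sketch (NumPy; the Bell state is a standard illustration, not from the text):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho) in bits, via the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]
    return -np.sum(eigvals * np.log2(eigvals))

def partial_trace(rho_AB, dims, keep):
    """Reduced state of a bipartite density matrix; keep = 0 for A, 1 for B."""
    dA, dB = dims
    rho = rho_AB.reshape(dA, dB, dA, dB)
    if keep == 0:
        return np.einsum('ijkj->ik', rho)  # trace out B
    return np.einsum('ijil->jl', rho)      # trace out A

# Bell state |Phi+> = (|00> + |11>) / sqrt(2)
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(phi, phi)

rho_A = partial_trace(rho_AB, (2, 2), keep=0)
rho_B = partial_trace(rho_AB, (2, 2), keep=1)

# I(rho_AB) = S(rho_A) + S(rho_B) - S(rho_AB) = 1 + 1 - 0 = 2 bits
mutual_info = (von_neumann_entropy(rho_A) + von_neumann_entropy(rho_B)
               - von_neumann_entropy(rho_AB))
assert np.isclose(mutual_info, 2.0)
```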
[[Category:Quantum mechanical entropy]]
[[Category:Quantum information theory]]