In [[quantum mechanics]], and especially [[quantum information|quantum information theory]], the '''linear entropy''' or '''impurity''' of a [[quantum state|state]] is a [[scalar (physics)|scalar]] defined as
:<math>S_L \, \dot= \, 1 - \mbox{Tr}(\rho^2) \,</math>

where ''ρ'' is the [[density matrix]] of the state.

The linear entropy can range between zero, corresponding to a completely pure state, and (1 − 1/''d''), corresponding to a completely mixed state. (Here, ''d'' is the [[dimension]] of the density matrix.)

The linear entropy is trivially related to the [[purity (quantum mechanics)|purity]] <math>\gamma \,</math> of a state by

:<math>S_L \, = \, 1 - \gamma \, .</math>
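For concreteness, the definition can be checked numerically. The following sketch (an illustration assuming NumPy, not part of the standard presentation) evaluates ''S<sub>L</sub>'' for a pure state and for the maximally mixed state of a qubit, recovering the extremes 0 and 1 − 1/''d'':

```python
import numpy as np

def linear_entropy(rho):
    """Linear entropy S_L = 1 - Tr(rho^2) of a density matrix rho."""
    return float(1.0 - np.trace(rho @ rho).real)

d = 2

# A pure state |0><0|: completely pure, so S_L = 0.
pure = np.zeros((d, d))
pure[0, 0] = 1.0

# The maximally mixed state I/d: S_L = 1 - 1/d.
mixed = np.eye(d) / d

print(linear_entropy(pure))   # 0.0
print(linear_entropy(mixed))  # 0.5  (= 1 - 1/2)
```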
==Motivation==

The linear entropy is a lower approximation to the (quantum) [[von Neumann entropy]] ''S'', which is defined as

:<math>S \, \dot= \, -\mbox{Tr}(\rho \ln \rho) = -\langle \ln \rho \rangle \, .</math>

The linear entropy is then obtained by expanding ln ''ρ'' = ln (1−(1−''ρ'')) around a pure state, ''ρ''<sup>2</sup> = ''ρ''; that is, expanding in terms of the non-negative matrix 1 − ''ρ'' in the formal [[Mercator series]] for the logarithm,

:<math> - \langle \ln \rho \rangle = \langle 1- \rho \rangle + \langle (1- \rho )^2 \rangle/2 + \langle (1- \rho)^3 \rangle /3 + \cdots ~,</math>

and retaining just the leading term.
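As a consistency check, the leading term by itself already reproduces the linear entropy, since Tr ''ρ'' = 1:

:<math>\langle 1 - \rho \rangle = \mbox{Tr}\big(\rho (1 - \rho)\big) = \mbox{Tr}\,\rho - \mbox{Tr}(\rho^2) = 1 - \mbox{Tr}(\rho^2) = S_L \, .</math>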
The linear entropy and von Neumann entropy are similar measures of the degree of mixing of a state, although the linear entropy is easier to calculate, as it does not require [[Diagonalizable matrix|diagonalization]] of the density matrix.
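The computational difference can be seen side by side: a numerical sketch (assuming NumPy; not part of the standard presentation) computes the von Neumann entropy via the eigenvalues of ''ρ'', while the linear entropy needs only a trace. The example state diag(0.9, 0.1) also illustrates that ''S<sub>L</sub>'' ≤ ''S'':

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # convention: 0 ln 0 = 0
    return float(-np.sum(evals * np.log(evals)))

def linear_entropy(rho):
    """S_L = 1 - Tr(rho^2); no diagonalization required."""
    return float(1.0 - np.trace(rho @ rho).real)

# A partially mixed qubit state diag(0.9, 0.1).
rho = np.diag([0.9, 0.1])
print(von_neumann_entropy(rho))  # ~0.3251
print(linear_entropy(rho))       # ~0.18
```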
==Alternate definition==

Some authors<ref>{{cite journal |author=Nicholas A. Peters, Tzu-Chieh Wei, Paul G. Kwiat |title=Mixed state sensitivity of several quantum information benchmarks |year=2004 |journal=Physical Review A |volume=70 |issue=5 |pages=052309 |doi=10.1103/PhysRevA.70.052309 |arxiv=quant-ph/0407172 |bibcode=2004PhRvA..70e2309P}}</ref> define the linear entropy with a different normalization,

:<math>S_L \, \dot= \, \tfrac{d}{d-1} (1 - \mbox{Tr}(\rho^2) ) \, ,</math>

which ensures that the quantity ranges from zero to unity.
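A minimal numerical sketch of this rescaling (assuming NumPy) confirms that the maximally mixed state now attains exactly unity, independent of the dimension ''d'':

```python
import numpy as np

def normalized_linear_entropy(rho):
    """Rescaled linear entropy d/(d-1) * (1 - Tr(rho^2)), ranging over [0, 1]."""
    d = rho.shape[0]
    return float(d * (1.0 - np.trace(rho @ rho).real) / (d - 1))

d = 4
mixed = np.eye(d) / d                    # maximally mixed state, Tr(rho^2) = 1/d
print(normalized_linear_entropy(mixed))  # 1.0
```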
==References==

<references/>

[[Category:Quantum mechanics]]
[[Category:Quantum mechanical entropy]]

{{quantum-stub}}