{{DISPLAYTITLE:''H''-theorem}}
In classical [[statistical mechanics]], the ''' ''H''-theorem''', introduced by [[Ludwig Boltzmann]] in 1872, describes the tendency of the quantity ''H'' (defined below) to decrease in a nearly-[[ideal gas]] of molecules.<ref>L. Boltzmann, "[https://sites.google.com/site/articleshistoriques/theorie-cinetique/Boltzmann1872.pdf Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen]." Sitzungsberichte Akademie der Wissenschaften 66 (1872): 275–370. English translation: {{cite doi|10.1142/9781848161337_0015}}</ref> As this quantity ''H'' was meant to represent the [[entropy]] of thermodynamics, the ''H''-theorem was an early demonstration of the power of [[statistical mechanics]], as it claimed to derive the [[second law of thermodynamics]]—a statement about fundamentally [[irreversible process]]es—from reversible microscopic mechanics.

The ''H''-theorem is a natural consequence of the kinetic equation derived by Boltzmann that has come to be known as [[Boltzmann's equation]]. The ''H''-theorem has led to considerable discussion about its actual implications, with major themes being:

* What is entropy? In what sense does Boltzmann's quantity ''H'' correspond to the thermodynamic entropy?
* Are the assumptions (such as the ''Stosszahlansatz'' described below) behind Boltzmann's equation too strong? When are these assumptions violated?

== Definition and meaning of Boltzmann's ''H'' ==

The ''H'' value is determined from the function ''f''(''E'', ''t'') ''dE'', which is the energy distribution function of molecules at time ''t''. The value ''f''(''E'', ''t'') ''dE'' is the number of molecules that have kinetic energy between ''E'' and ''E'' + ''dE''. ''H'' itself is defined as:

: <math> H(t) = \int_0^{\infty} f(E,t) \left[ \log\left(\frac{f(E,t)}{\sqrt{E}}\right) - 1 \right] \, dE </math>

For an isolated ideal gas (with fixed total energy and fixed total number of particles), the function ''H'' is at a minimum when the particles have a [[Maxwell–Boltzmann distribution]]; if the molecules of the ideal gas are distributed in some other way (say, all having the same kinetic energy), then the value of ''H'' will be higher. Boltzmann's ''H''-theorem, described in the next section, shows that when collisions between molecules are allowed, such distributions are unstable and tend irreversibly towards the minimum value of ''H'' (towards the Maxwell–Boltzmann distribution).
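
That the Maxwell–Boltzmann distribution gives the smallest ''H'' among distributions with the same particle number and mean energy can be checked numerically. The sketch below (written as an illustration only; the gamma-family trial distributions, the grid, and the units ''kT'' = 1 are arbitrary choices) evaluates the integral above for the Maxwell–Boltzmann energy distribution and for a narrower distribution with the same mean energy:

<syntaxhighlight lang="python">
import numpy as np

def integrate(y, x):
    """Plain trapezoid rule (kept explicit for NumPy-version independence)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def boltzmann_H(f, E):
    """H = integral of f(E) [ln(f(E)/sqrt(E)) - 1] dE."""
    return integrate(f * (np.log(f / np.sqrt(E)) - 1.0), E)

E = np.linspace(1e-6, 40.0, 400_000)    # energy grid, in units of kT

def gamma_family(E, k, mean_E):
    """Trial distribution ~ E^k exp(-E/theta), normalized to one particle
    and scaled so that the mean energy equals mean_E."""
    theta = mean_E / (k + 1.0)
    f = E**k * np.exp(-E / theta)
    return f / integrate(f, E)

f_mb    = gamma_family(E, k=0.5, mean_E=1.5)  # Maxwell-Boltzmann: f ~ sqrt(E) e^{-E/kT}
f_other = gamma_family(E, k=4.0, mean_E=1.5)  # narrower shape, same mean energy

print("H, Maxwell-Boltzmann :", boltzmann_H(f_mb, E))     # ~ -2.38, the minimum
print("H, narrower same <E> :", boltzmann_H(f_other, E))  # ~ -2.10, higher
</syntaxhighlight>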

(Note on notation: Boltzmann originally used the letter ''E'' for the quantity ''H''; most of the literature after Boltzmann uses the letter ''H'', as here. Boltzmann also used the symbol ''x'' to refer to the kinetic energy of a particle.)

== Boltzmann's ''H''-theorem ==

[[Image:Translational motion.gif|thumb|right|300px|In this mechanical model of a gas, the motion of the molecules appears very disorderly. Boltzmann showed that, assuming each collision configuration in a gas is truly random and independent, the gas converges to the [[Maxwell speed distribution]] even if it did not start out that way.]]

Boltzmann considered what happens during the collision between two particles. It is a basic fact of mechanics that in an elastic collision between two particles (such as hard spheres), the energy transferred between the particles varies depending on the initial conditions (angle of collision, etc.).

Boltzmann made a key assumption known as the ''Stosszahlansatz'' ([[molecular chaos]] assumption): that during any collision event in the gas, the two participating particles have (1) independently chosen kinetic energies from the distribution, (2) independent velocity directions, and (3) independent starting points. Under these assumptions, and given the mechanics of energy transfer, the energies of the particles after the collision will be randomized.

Considering repeated uncorrelated collisions between any and all of the molecules in the gas, Boltzmann constructed his kinetic equation ([[Boltzmann's equation]]). From this kinetic equation, a natural outcome is that the continual process of collision causes the quantity ''H'' to decrease until it has reached a minimum.
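
This relaxation can be imitated with a toy direct-simulation model (a sketch written for illustration, not Boltzmann's construction; the particle number, the pair-collision rule, and the histogram settings are choices made for the example). Random disjoint pairs of molecules collide elastically with a uniformly random scattering direction, which conserves momentum and energy while mimicking the ''Stosszahlansatz''; the estimated ''H'' then falls toward the Maxwell–Boltzmann minimum:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
N = 50_000
v = np.zeros((N, 3))
v[:, 0] = rng.choice([-1.0, 1.0], N)     # monoenergetic start: speed 1, along +/- x

def estimate_H(E, bins=150, E_max=4.0):
    """Estimate Boltzmann's H from a sample of kinetic energies."""
    counts, edges = np.histogram(E, bins=bins, range=(0.0, E_max))
    dE = np.diff(edges)
    mid = 0.5 * (edges[1:] + edges[:-1])
    f = counts / dE                      # molecules per unit energy
    ok = f > 0
    return float(np.sum(f[ok] * (np.log(f[ok] / np.sqrt(mid[ok])) - 1.0) * dE[ok]))

def random_unit_vectors(m):
    u = rng.normal(size=(m, 3))
    return u / np.linalg.norm(u, axis=1, keepdims=True)

for sweep in range(31):
    if sweep % 5 == 0:
        print(f"sweep {sweep:2d}: H = {estimate_H(0.5 * np.sum(v**2, axis=1)):9.1f}")
    # Collide random disjoint pairs: keep each pair's centre-of-mass velocity
    # and relative speed, but scatter the relative velocity in a random
    # direction (an elastic collision with random impact geometry).
    idx = rng.permutation(N)
    i, j = idx[:N // 2], idx[N // 2:]
    v_cm = 0.5 * (v[i] + v[j])
    g = v[i] - v[j]
    g_new = np.linalg.norm(g, axis=1, keepdims=True) * random_unit_vectors(N // 2)
    v[i] = v_cm + 0.5 * g_new
    v[j] = v_cm - 0.5 * g_new
</syntaxhighlight>

Apart from sampling noise, each printed value of ''H'' is no larger than the one before, and the energy histogram approaches the Maxwell–Boltzmann shape.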

== Impact ==

Although Boltzmann's ''H''-theorem turned out not to be the absolute proof of the second law of thermodynamics as originally claimed (see Criticisms below), the ''H''-theorem led Boltzmann in the last years of the 19th century to more and more probabilistic arguments about the nature of thermodynamics. The probabilistic view of thermodynamics culminated in 1902 with [[Josiah Willard Gibbs]]'s statistical mechanics for fully general systems (not just gases) and the introduction of generalized [[statistical ensemble (mathematical physics)|statistical ensemble]]s.

The kinetic equation, and in particular Boltzmann's molecular chaos assumption, inspired a whole family of [[Boltzmann equation]]s that are still used today to model the motions of particles, such as the electrons in a semiconductor. In many cases the molecular chaos assumption is highly accurate, and the ability to discard complex correlations between particles makes calculations much simpler.

== Criticism of the ''H''-theorem and exceptions ==

There are several notable reasons, described below, why the ''H''-theorem, at least in its original 1872 form, is not completely rigorous. As Boltzmann would eventually go on to admit, the arrow of time in the ''H''-theorem is not in fact purely mechanical, but really a consequence of assumptions about initial conditions.<ref>J. Uffink, "[http://philsci-archive.pitt.edu/2691/1/UffinkFinal.pdf Compendium of the foundations of classical statistical physics.]" (2006)</ref>

=== Loschmidt's paradox ===

{{main|Loschmidt's paradox}}

Soon after Boltzmann published his ''H''-theorem, [[Johann Josef Loschmidt]] objected that it should not be possible to deduce an irreversible process from time-symmetric dynamics and a time-symmetric formalism. If ''H'' decreases over time in one state, then there must be a matching reversed state where ''H'' increases over time ([[Loschmidt's paradox]]). The explanation is that Boltzmann's equation is based on the assumption of "[[molecular chaos]]", i.e., that it is acceptable for all the particles to be considered independent and uncorrelated. In some sense, this assumption breaks time reversal symmetry and therefore [[begs the question]]. Once the particles are allowed to collide, their velocity directions and positions in fact ''do'' become correlated (however, these correlations are encoded in an extremely complex manner).

Boltzmann's reply to Loschmidt was to concede the possibility of these states, but to note that these sorts of states were so rare and unusual as to be impossible in practice. Boltzmann would go on to sharpen this notion of the "rarity" of states, resulting in his famous entropy formula of 1877 (see [[Boltzmann's entropy formula]]).

=== Spin echo ===

As a demonstration of Loschmidt's paradox, a famous modern counterexample (not to Boltzmann's original gas-related ''H''-theorem, but to a closely related analogue) is the phenomenon of [[spin echo]].<ref>{{cite doi|10.1119/1.1934539}}</ref> In the spin echo effect, it is physically possible to induce time reversal in an interacting system of spins.

An analogue to Boltzmann's ''H'' for the spin system can be defined in terms of the distribution of spin states in the system. In the experiment, the spin system is initially perturbed into a non-equilibrium state (high ''H''), and, as predicted by the ''H''-theorem, the quantity ''H'' soon decreases to the equilibrium value. At some point, a carefully constructed electromagnetic pulse is applied that reverses the motions of all the spins. The spins then undo the time evolution from before the pulse, and after some time ''H'' actually ''increases'' away from equilibrium (once the evolution has completely unwound, ''H'' decreases once again to the minimum value). In some sense, the time-reversed states noted by Loschmidt turned out to be not completely impractical.
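
The rise and fall of ''H'' under such a reversal can be mimicked with a deliberately simplified model (an illustration only: it uses non-interacting spins dephasing at random frequencies, the simplest free-induction picture, rather than the interacting spins of the real experiment; the bin count, frequency spread, and pulse time are arbitrary). A discrete analogue of ''H'' is computed from a histogram of the spin phase angles:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)
M = 200_000
omega = rng.normal(0.0, 1.0, M)    # random precession frequency of each spin

def H_spin(phases, bins=60):
    """Discrete analogue of H: sum of p ln p over phase-angle bins."""
    counts, _ = np.histogram(phases % (2 * np.pi), bins=bins, range=(0.0, 2 * np.pi))
    p = counts / counts.sum()
    p = p[p > 0]
    return float(np.sum(p * np.log(p)))

t_pulse = 10.0                     # the reversing pulse is applied at this time
for t in np.arange(0.0, 25.1, 2.5):
    if t <= t_pulse:
        phase = omega * t                   # free dephasing: H falls toward -ln(bins)
    else:
        phase = omega * (t - 2 * t_pulse)   # after the pulse, the evolution unwinds
    print(f"t = {t:4.1f}   H = {H_spin(phase):+7.3f}")
</syntaxhighlight>

''H'' starts at its maximum, relaxes to the equilibrium value −ln 60 ≈ −4.09, climbs back to the maximum at the echo time ''t'' = 20, and then relaxes again.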

=== Poincaré recurrence ===

{{expand section|date=September 2013}}

In 1896, [[Ernst Zermelo]] noted a further problem with the ''H''-theorem, which was that if the system's ''H'' is at any time not a minimum, then by [[Poincaré recurrence]], the non-minimal ''H'' must recur (though after some extremely long time). Boltzmann admitted that these recurring rises in ''H'' technically would occur, but pointed out that, over long times, the system spends only a tiny fraction of its time in one of these recurring states.

=== Fluctuations of ''H'' in small systems ===

Since ''H'' is a mechanically defined variable that is not conserved, it will, like any other such variable (pressure, etc.), show [[thermal fluctuations]]. This means that ''H'' regularly shows spontaneous increases from the minimum value. Technically this is not an exception to the ''H''-theorem, since the ''H''-theorem was only intended to apply for a gas with a very large number of particles. These fluctuations are only perceptible when the system is small.

If ''H'' is interpreted as entropy, as Boltzmann intended, then this can be seen as a manifestation of the [[fluctuation theorem]].
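
These spontaneous excursions are visible in a small version of the collision model sketched earlier (again a toy written for illustration; the particle number and histogram settings are arbitrary). With only 100 particles, the estimated ''H'' keeps jittering above the floor set by equilibrium:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(4)
N = 100                           # a small system, so fluctuations are visible
v = rng.normal(size=(N, 3))       # start at (approximately) equilibrium

def estimate_H(E, bins=20, E_max=8.0):
    counts, edges = np.histogram(E, bins=bins, range=(0.0, E_max))
    dE, mid = np.diff(edges), 0.5 * (edges[1:] + edges[:-1])
    f = counts / dE
    ok = f > 0
    return float(np.sum(f[ok] * (np.log(f[ok] / np.sqrt(mid[ok])) - 1.0) * dE[ok]))

H_trace = []
for sweep in range(2000):
    # same random elastic pair collisions as in the earlier sketch
    idx = rng.permutation(N)
    i, j = idx[:N // 2], idx[N // 2:]
    v_cm, g = 0.5 * (v[i] + v[j]), v[i] - v[j]
    u = rng.normal(size=(N // 2, 3))
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    g_new = np.linalg.norm(g, axis=1, keepdims=True) * u
    v[i], v[j] = v_cm + 0.5 * g_new, v_cm - 0.5 * g_new
    H_trace.append(estimate_H(0.5 * np.sum(v**2, axis=1)))

H_trace = np.array(H_trace)
print("min  H:", H_trace.min())    # the equilibrium floor...
print("mean H:", H_trace.mean())
print("max  H:", H_trace.max())    # ...which H spontaneously rises above
</syntaxhighlight>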

== Connection to information theory ==

''H'' is a forerunner of Shannon's [[information entropy]]. [[Claude Shannon]] denoted his measure of [[Entropy (information theory)|information entropy]] ''H'' after the ''H''-theorem.<ref>Gleick 2011</ref> The article on Shannon's [[information entropy]] contains a [[Shannon entropy#Information entropy explained|good explanation]] of the discrete counterpart of the quantity ''H'', known as the information entropy or information uncertainty (with a minus sign). By [[Shannon entropy#Extending discrete entropy to the continuous case: differential entropy|extending the discrete information entropy to the continuous information entropy]], also called [[differential entropy]], one obtains the expression in Eq. (1), and thus a better feel for the meaning of ''H''.
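
The sign relationship is immediate in the discrete case (a short illustration; the distribution is arbitrary): the discrete counterpart of ''H'', Σ ''p''<sub>''i''</sub> ln ''p''<sub>''i''</sub> (used in Tolman's quantum form below), is exactly the negative of the Shannon entropy of the same distribution, measured in [[Nat (unit)|nats]]:

<syntaxhighlight lang="python">
import numpy as np

p = np.array([0.5, 0.25, 0.125, 0.125])   # an arbitrary probability distribution

H_boltzmann = np.sum(p * np.log(p))        # discrete counterpart of Boltzmann's H
H_shannon   = -np.sum(p * np.log(p))       # Shannon's information entropy, in nats

print(H_boltzmann, H_shannon)              # equal magnitudes, opposite signs
</syntaxhighlight>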

The ''H''-theorem's connection between information and entropy plays a central role in a recent controversy called the [[black hole information paradox]].

== Tolman's ''H''-theorem ==

[[Richard C. Tolman|Tolman]]'s 1938 book ''The Principles of Statistical Mechanics'' dedicates a whole chapter to the study of Boltzmann's ''H''-theorem and its extension in the generalized classical statistical mechanics of [[Josiah Willard Gibbs|Gibbs]]. A further chapter is devoted to the quantum mechanical version of the ''H''-theorem.

=== Classical mechanical ===

<!-- This section needs to be very explicit about the underpinning assumptions of the Boltzmann equation, e.g. Stosszahlansatz etc.
Technical math derivations are fine but they should go into a collapsible block, and technical terms like "µ-space" must be at least defined!
-->

We start with a function ''f'' that defines the number of molecules in a small region of µ-space, denoted by <math>\delta q_1 \cdots \delta p_r</math>:

:<math>\delta n = f(q_1 \ldots p_r, t)\,\delta q_1 \cdots \delta p_r.</math>

Tolman offers the following equation for the definition of the quantity ''H'' in Boltzmann's original ''H''-theorem:

: <math>H = \sum_i f_i \ln f_i \,\delta q_1 \cdots \delta p_r,</math><ref>Tolman 1938 pg. 135 formula 47.5</ref>

where the sum runs over the regions ''i'' into which µ-space is divided.

This relation can also be written in integral form:

: <math>H = \int \cdots \int f \ln f \, dq_1 \cdots dp_r.</math><ref>Tolman 1938 pg. 135 formula 47.6</ref>

''H'' can also be written in terms of the number of molecules present in each of the cells ''i'':

: <math>
\begin{align}
H & = \sum_i ( n_i \ln n_i - n_i \ln \delta v_\gamma ) \\
& = \sum_i n_i \ln n_i + \text{constant},
\end{align}
</math><ref name="Tolman">Tolman 1938 pg. 135 formula 47.7</ref>

where <math>\delta v_\gamma = \delta q_1 \cdots \delta p_r</math> is the common volume of a cell, so that <math>f_i = n_i / \delta v_\gamma</math>.
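
The equivalence of forms 47.5 and 47.7 amounts to the substitution ''f''<sub>''i''</sub> = ''n''<sub>''i''</sub>/δ''v''<sub>γ</sub>, and can be checked numerically. A minimal sketch (the cell occupation numbers and cell volume below are invented purely for the illustration):

<syntaxhighlight lang="python">
import numpy as np

n  = np.array([400.0, 250.0, 200.0, 100.0, 50.0])  # molecules n_i in each µ-space cell
dv = 0.01                                          # common cell volume δv_γ
f  = n / dv                                        # distribution function f_i = n_i/δv_γ

H_475 = np.sum(f * np.log(f)) * dv                 # H = Σ f_i ln f_i δq_1…δp_r   (47.5)
H_477 = np.sum(n * np.log(n) - n * np.log(dv))     # H = Σ (n_i ln n_i - n_i ln δv_γ) (47.7)

print(H_475, H_477)                                # agree to rounding error
</syntaxhighlight>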

An additional way to calculate the quantity ''H'' is:

: <math>H = -\ln P + \text{constant},</math><ref>Tolman 1938 pg. 135 formula 47.8</ref>

where ''P'' is the probability of finding a system chosen at random from the specified [[microcanonical ensemble]].

It can finally be written as:

: <math>H = -\ln G + \text{constant},</math><ref>Tolman 1938 pg. 136 formula 47.9</ref>

where ''G'' may be spoken of as the number of classical states.

The quantity ''H'' can also be defined as the integral over velocity space:{{Citation needed|date=March 2009}}

:{| style="width:100%" border="0"
|-
| style="width:95%" |
<math> \displaystyle
H \ \stackrel{\mathrm{def}}{=}\
\int { P ({\ln P}) \, d^3 v}
= \left\langle \ln P \right\rangle,
</math>
| style= | (1)
|}

where ''P''(''v'') is the probability distribution of molecular velocities.

Using the Boltzmann equation one can prove that ''H'' can only decrease.

For a system of ''N'' statistically independent particles, ''H'' is related to the thermodynamic entropy ''S'' through

:<math>S \ \stackrel{\mathrm{def}}{=}\ - N k H,</math>

so, according to the ''H''-theorem, ''S'' can only increase.
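
Because Eq. (1) identifies ''H'' with the average ⟨ln ''P''⟩, it can be estimated by direct sampling. The sketch below (an illustration written for this purpose; the choice of units ''m'' = ''kT'' = 1 and the sample size are arbitrary) evaluates ''H'' for the Maxwell–Boltzmann velocity distribution both from samples and from the closed form:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(3)

# Maxwell-Boltzmann velocity distribution with m = kT = 1:
# each Cartesian velocity component is an independent standard normal.
v = rng.normal(size=(1_000_000, 3))

def ln_P(v):
    """ln of the Maxwell-Boltzmann velocity density for m = kT = 1."""
    return -0.5 * np.sum(v**2, axis=1) - 1.5 * np.log(2.0 * np.pi)

H_sampled = float(np.mean(ln_P(v)))             # H = <ln P>, Monte Carlo
H_exact   = -1.5 * np.log(2.0 * np.pi * np.e)   # closed form for a 3-D Gaussian

print(H_sampled, H_exact)                       # both ~ -4.257
</syntaxhighlight>

Multiplying by −''Nk'' then gives the (velocity-space part of the) entropy ''S'' = −''NkH'' for ''N'' such particles.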

=== Quantum mechanical ===

In quantum statistical mechanics (which is the quantum version of classical statistical mechanics), the ''H''-function is the function:<ref>Tolman 1938 pg 460 formula 104.7</ref>

: <math>H = \sum_i p_i \ln p_i, \,</math>

where the summation runs over all possible distinct states of the system, and ''p''<sub>''i''</sub> is the probability that the system could be found in the ''i''-th state.

This is closely related to the [[Gibbs entropy|entropy formula of Gibbs]],

:<math>S = - k \sum_i p_i \ln p_i, \;</math>

and we shall (following, e.g., Waldram (1985), p. 39) proceed using ''S'' rather than ''H''.

First, differentiating with respect to time gives

:<math>\begin{align}
\frac{dS}{dt} & = - k \sum_i \left(\frac{dp_i}{dt} \ln p_i + \frac{dp_i}{dt}\right) \\
& = - k \sum_i \frac{dp_i}{dt} \ln p_i
\end{align}</math>

(using the fact that ∑ ''dp''<sub>''i''</sub>/''dt'' = 0, since ∑ ''p''<sub>''i''</sub> = 1).

Now [[Fermi's golden rule]] gives a [[master equation]] for the average rate of quantum jumps from state α to β, and from state β to α. (Of course, Fermi's golden rule itself makes certain approximations, and the introduction of this rule is what introduces irreversibility. It is essentially the quantum version of Boltzmann's ''Stosszahlansatz''.) For an isolated system the jumps will make contributions

:<math>\begin{align}
\frac{dp_\alpha}{dt} & = \sum_\beta \nu_{\alpha\beta}(p_\beta - p_\alpha) \\
\frac{dp_\beta}{dt} & = \sum_\alpha \nu_{\alpha\beta}(p_\alpha - p_\beta)
\end{align}</math>

where the reversibility of the dynamics ensures that the same transition constant ''ν''<sub>''αβ''</sub> appears in both expressions.

So

:<math>\frac{dS}{dt} = \frac{1}{2} k \sum_{\alpha,\beta} \nu_{\alpha\beta}(\ln p_{\beta}-\ln p_{\alpha})(p_{\beta}- p_{\alpha}).</math>

Since ln ''p''<sub>''β''</sub> − ln ''p''<sub>''α''</sub> and ''p''<sub>''β''</sub> − ''p''<sub>''α''</sub> always have the same sign, each contribution to ''dS''/''dt'' is non-negative. Therefore

:<math>\Delta S \geq 0 \, </math>

for an isolated system.
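
The monotone growth of ''S'' under such a master equation is easy to verify numerically. The following sketch (a toy model: the four-state system, its symmetric rate matrix ''ν'', and the time step are invented for the illustration) integrates the equations above with a simple Euler step:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(5)

m = 4                              # number of distinct states
nu = rng.random((m, m))
nu = 0.5 * (nu + nu.T)             # reversibility: nu_ab = nu_ba
np.fill_diagonal(nu, 0.0)

p = np.array([0.94, 0.03, 0.02, 0.01])   # far-from-equilibrium start
k, dt = 1.0, 0.001

def entropy(p):
    """Gibbs entropy S = -k sum p ln p."""
    return float(-k * np.sum(p * np.log(p)))

for step in range(5001):
    if step % 1000 == 0:
        print(f"t = {step * dt:3.1f}   S = {entropy(p):.4f}")
    # master equation: dp_a/dt = sum_b nu_ab (p_b - p_a)
    p = p + dt * (nu @ p - nu.sum(axis=1) * p)
</syntaxhighlight>

''S'' rises monotonically and saturates at ''k'' ln 4, the equal-probability equilibrium, mirroring the inequality just derived.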

The same mathematics is sometimes used to show that relative entropy is a [[Lyapunov function]] of a [[Markov process]] in [[detailed balance]], and in other contexts in chemistry.

== Gibbs' ''H''-theorem ==

[[File:Hamiltonian flow classical.gif|frame|Evolution of an ensemble of [[Hamiltonian mechanics|classical]] systems in [[phase space]] (top). Each system consists of one massive particle in a one-dimensional [[potential well]] (red curve, lower figure). The initially compact ensemble becomes swirled up over time.]]

[[Josiah Willard Gibbs]] described another way in which the entropy of a microscopic system would tend to increase over time.<ref name="gibbs12">Chapter XII, from {{cite book |last=Gibbs |first=Josiah Willard |authorlink=Josiah Willard Gibbs |title=[[Elementary Principles in Statistical Mechanics]] |year=1902 |publisher=[[Charles Scribner's Sons]] |location=New York}}</ref> Later writers have called this "Gibbs' ''H''-theorem", as its conclusion resembles that of Boltzmann's.<ref name="tolman">{{cite isbn|9780486638966}}</ref> Gibbs himself never called it an ''H''-theorem, and in fact his definition of entropy—and mechanism of increase—are very different from Boltzmann's. This section is included for historical completeness.

The setting of Gibbs' entropy production theorem is in [[statistical ensemble (mathematical physics)|ensemble]] statistical mechanics, and the entropy quantity is the [[Gibbs entropy]] (information entropy) defined in terms of the probability distribution for the entire state of the system. This is in contrast to Boltzmann's ''H'', which is defined in terms of the distribution of the states of individual molecules within a specific state of the system.

Gibbs considered the motion of an ensemble which initially starts out confined to a small region of phase space, meaning that the state of the system is known with fair precision though not quite exactly (low Gibbs entropy). The evolution of this ensemble over time proceeds according to [[Liouville's theorem (Hamiltonian)|Liouville's equation]]. For almost any kind of realistic system, the Liouville evolution tends to "stir" the ensemble over phase space, a process analogous to the mixing of a dye in an incompressible fluid.<ref name="gibbs12"/> After some time, the ensemble appears to be spread out over phase space, although it is actually a finely striped pattern, with the total volume of the ensemble (and its Gibbs entropy) conserved. Liouville's equation is guaranteed to conserve Gibbs entropy since there is no random process acting on the system; in principle, the original ensemble can be recovered at any time by reversing the motion.

The critical point of the theorem is thus: if the fine structure in the stirred-up ensemble is very slightly blurred, for any reason, then the Gibbs entropy increases, and the ensemble becomes an equilibrium ensemble. As to why this blurring should occur in reality, there are a variety of suggested mechanisms. For example, one suggested mechanism is that the phase space is coarse-grained for some reason (analogous to the pixelization in the simulation of phase space shown in the figure). For any required finite degree of fineness, the ensemble becomes "sensibly uniform" after a finite time. Or, if the system experiences a tiny uncontrolled interaction with its environment, the sharp coherence of the ensemble will be lost. [[Edwin Thompson Jaynes]] argued that the blurring is anthropomorphic in nature, simply corresponding to a loss of knowledge about the state of the system.<ref name="jaynes1965">E. T. Jaynes, "Gibbs vs Boltzmann Entropies", ''American Journal of Physics'' 33, 391 (1965)</ref> In any case, however it occurs, the Gibbs entropy increase is irreversible provided the blurring cannot be reversed.
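
The stirring-and-blurring mechanism can be demonstrated with a toy phase-space simulation. In the sketch below (illustrative only; the quartic potential, the grid resolution, and the ensemble size are arbitrary choices made for the example), the fine-grained ensemble volume is conserved by the dynamics, but the entropy computed after coarse-graining the swarm into phase-space cells rises and then saturates:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(6)

# Ensemble of systems, each one particle in the anharmonic 1-D potential
# V(q) = q^4/4. The period depends on amplitude, so the blob gets sheared
# into filaments ("stirred") as it evolves.
n = 100_000
q = rng.uniform(0.9, 1.1, n)        # initially compact blob in phase space
p = rng.uniform(-0.1, 0.1, n)

def coarse_grained_entropy(q, p, bins=40):
    """Gibbs entropy of the ensemble after binning into phase-space cells."""
    hist, _, _ = np.histogram2d(q, p, bins=bins, range=[[-2, 2], [-2, 2]])
    P = hist / hist.sum()
    P = P[P > 0]
    return float(-np.sum(P * np.log(P)))

dt = 0.01
for step in range(8001):
    if step % 1000 == 0:
        print(f"t = {step * dt:4.1f}   coarse-grained S = {coarse_grained_entropy(q, p):.3f}")
    # semi-implicit Euler (symplectic) update of Hamilton's equations
    p -= dt * q**3        # dp/dt = -dV/dq
    q += dt * p           # dq/dt = p
</syntaxhighlight>

Reversing all momenta at any moment would make the coarse-grained entropy fall back again, which is exactly the reversibility objection; the increase only becomes irreversible once the fine-grained filaments can no longer be tracked.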

Gibbs' entropy increase mechanism solves some of the technical difficulties found in Boltzmann's ''H''-theorem: the Gibbs entropy does not fluctuate, nor does it exhibit Poincaré recurrence, and so the increase in Gibbs entropy, when it occurs, is irreversible, as expected from thermodynamics. The Gibbs mechanism also applies equally well to systems with very few degrees of freedom, such as the single-particle system shown in the figure. To the extent that one accepts that the ensemble becomes blurred, then, Gibbs' approach is a cleaner proof of the [[second law of thermodynamics]].<ref name="jaynes1965"/>

== See also ==

* [[Loschmidt's paradox]]
* [[Arrow of time]]
* [[Second law of thermodynamics]]
* [[Fluctuation theorem]]

== Notes ==
{{Reflist}}

== References ==
{{refbegin}}
* {{cite book |last1=Lifshitz |first1=E. M. |last2=Pitaevskii |first2=L. P. |year=1981 |title=Physical Kinetics |edition=3rd |series=[[Course of Theoretical Physics]] |volume=10 |publisher=[[Pergamon Press|Pergamon]] |isbn=0-08-026480-8}}
* {{cite book |last=Waldram |first=J. R. |year=1985 |title=The Theory of Thermodynamics |publisher=[[Cambridge University Press]] |isbn=0-521-28796-0}}
* {{cite book |last=Tolman |first=R. C. |year=1938 |title=The Principles of Statistical Mechanics |publisher=[[Oxford University Press]]}}
* {{cite book |last=Gull |first=S. F. |year=1989 |chapter=Some Misconceptions about Entropy |chapterurl=http://www.ucl.ac.uk/~ucesjph/reality/entropy/text.html |editor1-last=Buck |editor1-first=B. |editor2-last=Macaulay |editor2-first=V. A. |title=Maximum Entropy in Action |publication-date=1991 |publisher=[[Oxford University Press]] |isbn=0-19-853963-0 |accessdate=2012-02-05}}
* {{cite book |last=Reif |first=F. |year=1965 |title=Fundamentals of Statistical and Thermal Physics |publisher=[[McGraw-Hill]] |isbn=978-0-07-051800-1}}
* {{cite book |last=Gleick |first=J. |year=2011 |title=[[The Information: A History, a Theory, a Flood]] |publisher=[[Random House|Random House Digital]] |isbn=978-0-375-42372-7}}
* {{cite journal |last1=Badino |first1=M. |year=2011 |title=Mechanistic Slumber vs. Statistical Insomnia: The early history of Boltzmann's H-theorem (1868–1877) |journal=[[European Physical Journal H]] |doi=10.1140/epjh/e2011-10048-5 |bibcode=2011EPJH...36..353B}}
{{refend}}

{{DEFAULTSORT:H-Theorem}}
[[Category:Non-equilibrium thermodynamics]]
[[Category:Thermodynamic entropy]]
[[Category:Philosophy of thermal and statistical physics]]
[[Category:Physics theorems]]
[[Category:Concepts in physics]]
[[Category:Statistical mechanics theorems]]