{{distinguish|Maxwell–Boltzmann distribution}}
{{Statistical mechanics|cTopic=[[Particle statistics|Particle Statistics]]}}

[[Image:Maxwell-Boltzmann distribution 1.png|thumb|300px|Maxwell–Boltzmann statistics can be used to derive the [[Maxwell–Boltzmann distribution]] of particle speeds in an [[ideal gas]]. Shown: distribution of particle speed for 10<sup>6</sup> oxygen particles at <br />−100, 20 and 600 degrees Celsius.]]

In [[statistical mechanics]], '''Maxwell–Boltzmann statistics''' describes the average distribution of non-interacting material particles over various energy states in [[thermal equilibrium]], and is applicable when the temperature is high enough or the particle density is low enough to render quantum effects negligible.

The expected [[number of particles]] with energy <math>\varepsilon_i</math> for Maxwell–Boltzmann statistics is <math>\langle N_i \rangle</math> where:

:<math>
\langle N_i \rangle = \frac {g_i} {e^{(\varepsilon_i-\mu)/kT}} = \frac{N}{Z}\,g_i e^{-\varepsilon_i/kT}
</math>

where:

*<math>\langle N_i \rangle</math> is the number of particles in state ''i''
*<math>\varepsilon_i</math> is the [[energy]] of the ''i''-th state
*<math>g_i</math> is the [[Degenerate energy level|degeneracy]] of energy level ''i'', that is, the number of states with energy <math>\varepsilon_i</math>
*μ is the [[chemical potential]]
*''k'' is [[Boltzmann's constant]]
*''T'' is absolute [[temperature]]
*''N'' is the total number of particles
::<math>N=\sum_i N_i\,</math>
*''Z'' is the [[Partition function (statistical mechanics)|partition function]]
::<math>Z=\sum_i g_i e^{-\varepsilon_i/kT}</math>
*e<sup>(...)</sup> is the [[exponential function]]

Equivalently, the particle number is sometimes expressed as

:<math>
\langle N_i \rangle = \frac {1} {e^{(\varepsilon_i-\mu)/kT}} = \frac{N}{Z}\,e^{-\varepsilon_i/kT}
</math>

where the index ''i'' now specifies a particular state rather than the set of all states with energy <math>\varepsilon_i</math>, and <math>Z=\sum_i e^{-\varepsilon_i/kT}</math>.
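
As a minimal numerical sketch of this formula (with an arbitrarily chosen three-level system; the energies, degeneracies, temperature and particle number below are illustrative assumptions, not data from any particular gas), the occupations <math>\langle N_i \rangle = \frac{N}{Z}\,g_i e^{-\varepsilon_i/kT}</math> can be evaluated directly:

<syntaxhighlight lang="python">
import math

k = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                 # temperature, K (assumed value)
N = 1.0e6                 # total number of particles (assumed value)

# Hypothetical level scheme: energies (J) and degeneracies.
energies = [0.0, 1.0e-21, 2.5e-21]
degeneracies = [1, 3, 5]

# Partition function Z = sum_i g_i exp(-eps_i / kT)
Z = sum(g * math.exp(-e / (k * T)) for g, e in zip(degeneracies, energies))

# Expected occupation of each level, <N_i> = (N/Z) g_i exp(-eps_i / kT)
occupations = [N * g * math.exp(-e / (k * T)) / Z
               for g, e in zip(degeneracies, energies)]

print(occupations)
print(sum(occupations))   # equals N by construction
</syntaxhighlight>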

==Applications==

Maxwell–Boltzmann statistics may be used to derive the [[Maxwell–Boltzmann distribution]] of particle speeds for an ideal gas of classical particles in a three-dimensional box, but they apply to other situations as well. Maxwell–Boltzmann statistics can be used to extend that distribution to particles with a different [[energy–momentum relation]], such as relativistic particles ([[Maxwell–Jüttner distribution]]), and to hypothetical situations such as particles in a box with a different number of dimensions (four-dimensional, two-dimensional, etc.).
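
As a rough sketch of the first of these applications (assuming NumPy is available; the gas and temperature below are illustrative choices), one can sample velocity components whose weights follow the Boltzmann factor <math>e^{-mv^2/2kT}</math> and compare the resulting speeds with the closed-form Maxwell–Boltzmann speed distribution:

<syntaxhighlight lang="python">
import numpy as np

k = 1.380649e-23      # Boltzmann constant, J/K
T = 293.15            # 20 degrees Celsius
m = 5.31e-26          # approximate mass of an O2 molecule, kg
n = 100_000           # number of sampled particles

rng = np.random.default_rng(0)

# Each Cartesian velocity component is Gaussian with variance kT/m,
# which is what the Boltzmann factor exp(-m v^2 / 2kT) implies.
sigma = np.sqrt(k * T / m)
v = rng.normal(0.0, sigma, size=(n, 3))
speeds = np.linalg.norm(v, axis=1)

# Closed-form Maxwell-Boltzmann speed distribution for comparison.
def mb_speed_pdf(s):
    return (4 * np.pi * s**2 * (m / (2 * np.pi * k * T))**1.5
            * np.exp(-m * s**2 / (2 * k * T)))

hist, edges = np.histogram(speeds, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - mb_speed_pdf(centers))))  # small for large n
</syntaxhighlight>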

==Limits of applicability==

Maxwell–Boltzmann statistics are often described as the statistics of "distinguishable" classical particles. In other words, the configuration of particle ''A'' in state 1 and particle ''B'' in state 2 is different from the case in which particle ''B'' is in state 1 and particle ''A'' is in state 2. This assumption leads to the proper (Boltzmann) statistics of particles in the energy states, but yields non-physical results for the entropy, as embodied in the [[Gibbs paradox]].

Technically speaking, however, no real particles have the characteristics required by Maxwell–Boltzmann statistics. Indeed, the Gibbs paradox is resolved if we treat all particles of a certain type (e.g., electrons, protons, etc.) as indistinguishable, and this assumption can be justified in the context of quantum mechanics. Once this assumption is made, however, the particle statistics change. Quantum particles are either bosons (following instead [[Bose–Einstein statistics]]) or fermions (subject to the [[Pauli exclusion principle]], following instead [[Fermi–Dirac statistics]]).

Both of these quantum statistics approach Maxwell–Boltzmann statistics in the limit of high temperature and low particle density, without the need for any ad hoc assumptions. The Fermi–Dirac and Bose–Einstein statistics give the energy level occupation as:

:<math>
\langle N_i \rangle = \frac{g_i}{e^{(\varepsilon_i-\mu)/kT}\mp 1}.
</math>

It can be seen that Maxwell–Boltzmann statistics are valid when

:<math>e^{(\varepsilon_{\rm min}-\mu)/kT} \gg 1, </math>

where <math>\varepsilon_{\rm min}</math> is the lowest (minimum) value of <math>\varepsilon_i</math>.
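
The approach to this classical limit can be illustrated numerically. In the sketch below, the energies and chemical potential are arbitrary values chosen so that <math>e^{(\varepsilon_i-\mu)/kT}\gg 1</math>; the Fermi–Dirac, Bose–Einstein and Maxwell–Boltzmann occupations then nearly coincide:

<syntaxhighlight lang="python">
import math

kT = 1.0                      # work in units where kT = 1
mu = -10.0                    # chemical potential chosen so exp((eps-mu)/kT) >> 1
energies = [0.0, 0.5, 1.0, 2.0]
g = 1                         # single state per level for simplicity

for eps in energies:
    x = math.exp((eps - mu) / kT)      # >> 1 in the dilute / high-temperature regime
    mb = g / x                         # Maxwell-Boltzmann
    fd = g / (x + 1)                   # Fermi-Dirac
    be = g / (x - 1)                   # Bose-Einstein
    print(eps, mb, fd, be)             # the three occupations nearly agree
</syntaxhighlight>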

Maxwell–Boltzmann statistics are particularly useful for studying [[gas]]es, whereas Fermi–Dirac statistics are most often used for the study of [[electron]]s in [[solid]]s, and Bose–Einstein statistics are important for [[blackbody radiation]]. Note, however, that none of these statistics are general: they all assume that the particles are non-interacting and occupy a static ladder of energy states.

== Derivations of Maxwell–Boltzmann statistics ==

Maxwell–Boltzmann statistics can be derived in various [[statistical mechanics|statistical mechanical]] thermodynamic ensembles:<ref name="tolman">{{cite isbn|9780486638966}}</ref>
* The [[grand canonical ensemble]], exactly.
* The [[canonical ensemble]], exactly.
* The [[microcanonical ensemble]], but only in the thermodynamic limit.
In each case it is necessary to assume that the particles are non-interacting, and that multiple particles can occupy the same state and do so independently.

=== Derivation from microcanonical ensemble ===
{{Technical|section|date=December 2013}}
Suppose we have a container with a huge number of very small particles, all with identical physical characteristics (mass, charge, etc.). Let us refer to this as the ''system''. Assume that, although the particles have identical properties, they are distinguishable. For example, we might identify each particle by continually observing its trajectory, or by placing a marking on each one, e.g., drawing a different number on each one as is done with lottery balls.

The particles are moving inside the container in all directions with great speed, and because they are moving they possess energy. The Maxwell–Boltzmann distribution is a mathematical function that describes how many particles in the container have a given energy.

In general, there may be many particles with the same amount of energy <math>\varepsilon</math>. Let the number of particles with energy <math>\varepsilon_1</math> be <math>N_1</math>, the number of particles possessing another energy <math>\varepsilon_2</math> be <math>N_2</math>, and so forth for all the possible energies {<math>\varepsilon_i</math> | i=1,2,3,...}. To describe this situation, we say that <math>N_i</math> is the ''occupation number'' of the ''energy level'' <math>i.</math> If we know all the occupation numbers {<math>N_i</math> | i=1,2,3,...}, then we know the total energy of the system. However, because we can distinguish ''which'' particles occupy each energy level, the set of occupation numbers does not completely describe the state of the system. To completely describe the state of the system, or the ''microstate'', we must specify exactly which particles are in each energy level. Thus, when we count the number of possible states of the system, we must count each and every microstate, and not just the possible sets of occupation numbers.
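
The distinction between microstates and sets of occupation numbers can be made concrete with a toy example. The sketch below enumerates every assignment of three distinguishable particles to two energy levels (a deliberately tiny system, chosen only for illustration) and groups the microstates by their occupation numbers:

<syntaxhighlight lang="python">
from itertools import product
from collections import Counter

particles = ["A", "B", "C"]        # three distinguishable particles
levels = [0, 1]                    # two energy levels

# A microstate assigns each particle to a level; there are 2**3 = 8 of them.
microstates = list(product(levels, repeat=len(particles)))

# Group microstates by occupation numbers (N_0, N_1).
groups = Counter((state.count(0), state.count(1)) for state in microstates)

for occupation, count in sorted(groups.items()):
    print(occupation, count)
# (0, 3) 1, (1, 2) 3, (2, 1) 3, (3, 0) 1 -- several microstates can share
# one set of occupation numbers, so occupation numbers alone do not fix
# the microstate.
</syntaxhighlight>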

To begin with, let us ignore the degeneracy problem: assume that there is only one way to put <math>N_i</math> particles into energy level <math>i</math>. What follows next is a piece of combinatorial reasoning which, by itself, has little to do with accurately describing the reservoir of particles; degeneracy is reintroduced below.

The number of different ways of performing an ordered selection of one single object from ''N'' objects is obviously ''N''. The number of different ways of selecting two objects from ''N'' objects, in a particular order, is thus ''N''(''N'' − 1), and that of selecting ''n'' objects in a particular order is ''N''!/(''N'' − ''n'')<nowiki>!</nowiki>. This is divided by the number of [[permutations]], ''n''!, if order does not matter. The [[binomial coefficient]], ''N''!/(''n''!(''N'' − ''n'')!), is thus the number of ways to pick ''n'' objects from ''N''. If we now have a set of boxes labelled ''a, b, c, d, e, ..., k'', then the number of ways of selecting ''N<sub>a</sub>'' objects from a total of ''N'' objects and placing them in box ''a'', then selecting ''N<sub>b</sub>'' objects from the remaining ''N'' − ''N<sub>a</sub>'' objects and placing them in box ''b'', then selecting ''N<sub>c</sub>'' objects from the remaining ''N'' − ''N<sub>a</sub>'' − ''N<sub>b</sub>'' objects and placing them in box ''c'', and continuing until no object is left outside, is

:<math>
\begin{align}
W & = \frac{N!}{N_a!(N-N_a)!} \times \frac{(N-N_a)!}{N_b!(N-N_a-N_b)!} \times \frac{(N-N_a-N_b)!}{N_c!(N-N_a-N_b-N_c)!} \times \cdots \times \frac{(N-\cdots-N_l)!}{N_k!(N-\cdots-N_l-N_k)!} \\[8pt]
& = \frac{N!}{N_a!\,N_b!\,N_c!\cdots N_k!\,(N-\cdots-N_l-N_k)!}
\end{align}
</math>

Because every object is placed into one of the boxes, the sum ''N<sub>a</sub>'' + ''N<sub>b</sub>'' + ''N<sub>c</sub>'' + ... + ''N<sub>k</sub>'' must equal ''N'', so the factor (''N'' − ... − ''N<sub>l</sub>'' − ''N<sub>k</sub>'')! in the relation above evaluates to 0! = 1, and the relation can be written as

:<math>
W = N!\prod_{i=a,b,c,\ldots}^k \frac{1}{N_i!}
</math>
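
This counting can be verified by brute force for a small case. The sketch below uses an arbitrary choice of six labelled objects and three box sizes, counts the assignments directly, and compares the result with <math>N!/\prod_i N_i!</math>:

<syntaxhighlight lang="python">
from itertools import product
from math import factorial

box_sizes = {"a": 3, "b": 2, "c": 1}       # N_a, N_b, N_c (assumed values)
N = sum(box_sizes.values())                # total number of objects

# Count assignments of the N labelled objects to boxes that realize
# exactly the prescribed occupation numbers.
boxes = list(box_sizes)
count = 0
for assignment in product(boxes, repeat=N):
    if all(assignment.count(b) == box_sizes[b] for b in boxes):
        count += 1

formula = factorial(N)
for n in box_sizes.values():
    formula //= factorial(n)

print(count, formula)      # both equal 60 for this choice
</syntaxhighlight>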

Returning now to the degeneracy problem that characterizes the reservoir of particles: if the ''i''-th box has a "degeneracy" of <math>g_i</math>, that is, it has <math>g_i</math> "sub-boxes", such that any way of filling the ''i''-th box in which the numbers in the sub-boxes are changed is a distinct way of filling the box, then the number of ways of filling the ''i''-th box must be multiplied by the number of ways of distributing the <math>N_i</math> objects among the <math>g_i</math> "sub-boxes". The number of ways of placing <math>N_i</math> distinguishable objects in <math>g_i</math> "sub-boxes" is <math>g_i^{N_i}</math> (the first object can go into any of the <math>g_i</math> sub-boxes, the second object can also go into any of the <math>g_i</math> sub-boxes, and so on). Thus the number of ways <math>W</math> in which a total of <math>N</math> particles can be classified into energy levels according to their energies, with each level <math>i</math> having <math>g_i</math> distinct states and the ''i''-th level accommodating <math>N_i</math> particles, is:

:<math>W=N!\prod_i \frac{g_i^{N_i}}{N_i!}</math>
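
The same brute-force check works for the degenerate counting formula. In the sketch below (again with arbitrarily small, illustrative numbers), each labelled particle is assigned to a specific sub-box, and the assignments compatible with a given set of level occupations are counted and compared with <math>N!\prod_i g_i^{N_i}/N_i!</math>:

<syntaxhighlight lang="python">
from itertools import product
from math import factorial

# Two energy levels: occupation numbers N_i and degeneracies g_i (assumed values).
occupations = [2, 1]       # N_1 = 2, N_2 = 1
degeneracies = [2, 3]      # g_1 = 2, g_2 = 3
N = sum(occupations)

# Every sub-box is labelled (level, sub-index); assign each labelled
# particle to one sub-box and keep assignments matching the occupations.
sub_boxes = [(lvl, s) for lvl, g in enumerate(degeneracies) for s in range(g)]
count = 0
for assignment in product(sub_boxes, repeat=N):
    per_level = [sum(1 for (lvl, _) in assignment if lvl == i)
                 for i in range(len(occupations))]
    if per_level == occupations:
        count += 1

formula = factorial(N)
for n_i, g_i in zip(occupations, degeneracies):
    formula = formula * g_i**n_i // factorial(n_i)

print(count, formula)      # both equal 3! * 2**2 * 3 / (2! * 1!) = 36
</syntaxhighlight>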

This is the form for ''W'' first derived by [[Ludwig Boltzmann|Boltzmann]]. Boltzmann's fundamental equation <math>S=k\,\ln W</math> relates the thermodynamic [[entropy]] ''S'' to the number of microstates ''W'', where ''k'' is the [[Boltzmann constant]]. It was pointed out by [[Josiah Willard Gibbs|Gibbs]], however, that the above expression for ''W'' does not yield an extensive entropy, and is therefore faulty. This problem is known as the [[Gibbs paradox]]. The problem is that the particles considered by the above equation are not [[Identical particles|indistinguishable]]. In other words, for two particles (''A'' and ''B'') in two energy sublevels the population represented by [A,B] is considered distinct from the population [B,A], while for indistinguishable particles it is not. If we carry out the argument for indistinguishable particles, we are led to the [[Bose–Einstein statistics|Bose–Einstein]] expression for ''W'':

:<math>W=\prod_i \frac{(N_i+g_i-1)!}{N_i!(g_i-1)!}</math>

Both the Maxwell–Boltzmann distribution and the Bose–Einstein distribution are only valid for temperatures well above absolute zero, implying that <math>g_i\gg 1</math>. The Maxwell–Boltzmann distribution also requires low density, implying that <math>g_i\gg N_i</math>. Under these conditions, we may use [[Stirling's approximation]] for the factorial:

:<math>
N! \approx N^N e^{-N},
</math>

to write:

:<math>W\approx\prod_i \frac{(N_i+g_i)^{N_i+g_i}}{N_i^{N_i}g_i^{g_i}}\approx\prod_i \frac{g_i^{N_i}(1+N_i/g_i)^{g_i}}{N_i^{N_i}}</math>

Using the fact that <math>(1+N_i/g_i)^{g_i}\approx e^{N_i}</math> for <math>g_i\gg N_i</math>, we can again use Stirling's approximation to write:

:<math>W\approx\prod_i \frac{g_i^{N_i}}{N_i!}</math>

This is essentially a division by ''N''! of Boltzmann's original expression for ''W'', and this correction is referred to as '''correct Boltzmann counting'''.
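
Numerically, the corrected Boltzmann count <math>\prod_i g_i^{N_i}/N_i!</math> is indeed close to the Bose–Einstein count once <math>g_i \gg N_i</math>. A rough check with arbitrarily chosen occupation numbers and degeneracies is:

<syntaxhighlight lang="python">
from math import lgamma, log

def log_be(n, g):
    """ln of the Bose-Einstein count (n+g-1)! / (n! (g-1)!) for one level."""
    return lgamma(n + g) - lgamma(n + 1) - lgamma(g)

def log_mb_corrected(n, g):
    """ln of the corrected Boltzmann count g**n / n! for one level."""
    return n * log(g) - lgamma(n + 1)

for n, g in [(5, 50), (20, 2000), (50, 100000)]:
    print(n, g, log_be(n, g), log_mb_corrected(n, g))
# The two logarithms agree increasingly closely as g becomes much larger than n.
</syntaxhighlight>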

We wish to find the <math>N_i</math> for which the function <math>W</math> is maximized, subject to the constraint that there is a fixed number of particles <math>\left(N=\textstyle\sum N_i\right)</math> and a fixed energy <math>\left(E=\textstyle\sum N_i \varepsilon_i\right)</math> in the container. The maxima of <math>W</math> and <math>\ln(W)</math> are achieved by the same values of <math>N_i</math> and, since it is easier to accomplish mathematically, we will maximize the latter function instead. We constrain our solution using [[Lagrange multipliers]], forming the function:

:<math>
f(N_1,N_2,\ldots,N_n)=\ln(W)+\alpha(N-\sum N_i)+\beta(E-\sum N_i \varepsilon_i)
</math>

Applying Stirling's approximation <math>\ln N_i! \approx N_i\ln N_i - N_i</math> to the factorials gives

:<math>
\ln W=\ln\left[\prod\limits_{i=1}^{n}\frac{g_i^{N_i}}{N_i!}\right] \approx \sum\limits_{i=1}^n\left(N_i\ln g_i-N_i\ln N_i + N_i\right)
</math>
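
The accuracy of this approximation for <math>\ln W</math> improves rapidly with the occupation numbers; a quick check with arbitrary occupations and degeneracies compares the exact value <math>\sum_i (N_i\ln g_i - \ln N_i!)</math> with the Stirling form:

<syntaxhighlight lang="python">
from math import log, lgamma

occupations = [50, 200, 1000]      # N_i (assumed values)
degeneracies = [1e4, 1e5, 1e6]     # g_i (assumed values)

exact = sum(n * log(g) - lgamma(n + 1)
            for n, g in zip(occupations, degeneracies))
stirling = sum(n * log(g) - n * log(n) + n
               for n, g in zip(occupations, degeneracies))

print(exact, stirling)   # relative difference is of order 0.1 percent here
</syntaxhighlight>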

Finally

:<math>
f(N_1,N_2,\ldots,N_n)=\alpha N +\beta E +
\sum\limits_{i=1}^n\left(N_i\ln g_i-N_i\ln N_i + N_i-(\alpha+\beta\varepsilon_i) N_i\right)
</math>

In order to maximize the expression above we apply [[Fermat's theorem (stationary points)]], according to which local extrema, if they exist, must occur at critical points, where the partial derivatives vanish:

:<math>
\frac{\partial f}{\partial N_i}=\ln g_i-\ln N_i -(\alpha+\beta\varepsilon_i) = 0
</math>

Solving the equations above (<math>i=1\ldots n</math>), we arrive at an expression for <math>N_i</math>:

:<math>
N_i = \frac{g_i}{e^{\alpha+\beta \varepsilon_i}}
</math>
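
That this stationary point is indeed the constrained maximum can be checked numerically. The sketch below (which assumes SciPy is available and uses an arbitrary three-level system with fixed <math>N</math> and <math>E</math>) maximizes the Stirling form of <math>\ln W</math> and compares the result with <math>N_i = g_i e^{-(\alpha+\beta\varepsilon_i)}</math>:

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize

energies = np.array([0.0, 1.0, 2.0])   # epsilon_i (assumed values)
g = np.array([1.0, 2.0, 3.0])          # degeneracies (assumed values)
N_total, E_total = 1000.0, 800.0       # fixed particle number and energy

def neg_log_w(n):
    # Stirling form of ln W, negated for minimization.
    return -np.sum(n * np.log(g) - n * np.log(n) + n)

constraints = [
    {"type": "eq", "fun": lambda n: np.sum(n) - N_total},
    {"type": "eq", "fun": lambda n: np.sum(n * energies) - E_total},
]
x0 = np.full(3, N_total / 3)
result = minimize(neg_log_w, x0, method="SLSQP",
                  constraints=constraints, bounds=[(1e-6, None)] * 3)
n_opt = result.x

# Extract beta from the ratio of two occupations and check that the same
# exponential form N_i = g_i * exp(-alpha - beta * eps_i) fits all levels.
beta = np.log(n_opt[0] * g[1] / (n_opt[1] * g[0])) / (energies[1] - energies[0])
alpha = -np.log(n_opt[0] / g[0]) - beta * energies[0]
print(n_opt)
print(g * np.exp(-alpha - beta * energies))   # agrees with n_opt
</syntaxhighlight>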

Substituting this expression for <math>N_i</math> into the equation for <math>\ln W</math> and assuming that <math>N\gg 1</math> yields:

:<math>\ln W = \alpha N+\beta E\,</math>

or, differentiating and rearranging:

:<math>dE=\frac{1}{\beta}d\ln W-\frac{\alpha}{\beta}dN</math>

Boltzmann realized that this is just an expression of the [[second law of thermodynamics]]. Identifying ''dE'' as the internal energy, the second law of thermodynamics states that for variation only in entropy (''S'') and particle number (''N''):

:<math>dE=T\,dS+\mu\,dN</math>

where ''T'' is the [[temperature]] and ''μ'' is the [[chemical potential]]. Boltzmann's famous equation <math>S=k\,\ln W</math> is the realization that the entropy is proportional to <math>\ln W</math>, with the constant of proportionality being [[Boltzmann's constant]]. Substituting <math>d\ln W = dS/k</math> into the differential relation above and comparing with the second law, it follows immediately that <math>\beta=1/kT</math> and <math>\alpha=-\mu/kT</math>, so that the populations may now be written:

:<math>
N_i = \frac{g_i}{e^{(\varepsilon_i-\mu)/kT}}
</math>

Note that the above formula is sometimes written:

:<math>
N_i = \frac{g_i}{e^{\varepsilon_i/kT}/z}
</math>

where <math>z=\exp(\mu/kT)</math> is the absolute [[activity (chemistry)|activity]].

Alternatively, we may use the fact that

:<math>\sum_i N_i=N\,</math>

to obtain the population numbers as

:<math>
N_i = N\frac{g_i e^{-\varepsilon_i/kT}}{Z}
</math>

where ''Z'' is the [[Partition function (statistical mechanics)|partition function]] defined by:

:<math>
Z = \sum_i g_i e^{-\varepsilon_i/kT}
</math>

=== Derivation from canonical ensemble ===
{{Technical|section|date=December 2013}}
In the above discussion, the Boltzmann distribution function was obtained by directly analysing the multiplicities of a system. Alternatively, one can make use of the [[canonical ensemble]]. In a canonical ensemble, a system is in thermal contact with a reservoir. While energy is free to flow between the system and the reservoir, the reservoir is thought to have infinitely large heat capacity, so that it maintains a constant temperature, ''T'', for the combined system.

In the present context, our system is assumed to have the energy levels <math>\varepsilon _i</math> with degeneracies <math>g_i</math>. As before, we would like to calculate the probability that our system has energy <math>\varepsilon_i</math>.

If our system is in state <math>\; s_1</math>, then there would be a corresponding number of microstates available to the reservoir. Call this number <math>\; \Omega _ R (s_1)</math>. By assumption, the combined system (of the system we are interested in and the reservoir) is isolated, so all microstates are equally probable. Therefore, for instance, if <math> \; \Omega _ R (s_1) = 2 \; \Omega _ R (s_2) </math>, we can conclude that our system is twice as likely to be in state <math>\; s_1</math> as in state <math>\; s_2</math>. In general, if <math>\; P(s_i)</math> is the probability that our system is in state <math>\; s_i</math>,

:<math>\frac{P(s_1)}{P(s_2)} = \frac{\Omega _ R (s_1)}{\Omega _ R (s_2)}.</math>

Since the entropy of the reservoir is <math>\; S_R = k \ln \Omega _R</math>, the above becomes

:<math>\frac{P(s_1)}{P(s_2)} = \frac{ e^{S_R(s_1)/k} }{ e^{S_R(s_2)/k} } = e^{(S_R (s_1) - S_R (s_2))/k}.</math>

Next we recall the thermodynamic identity (from the [[first law of thermodynamics]]):

:<math>d S_R = \frac{1}{T} (d U_R + P \, d V_R - \mu \, d N_R).</math>

In a canonical ensemble, there is no exchange of particles, so the <math>d N_R</math> term is zero. Similarly, <math>d V_R = 0.</math> This gives

:<math> S_R (s_1) - S_R (s_2) = \frac{1}{T} (U_R (s_1) - U_R (s_2)) = - \frac{1}{T} (E(s_1) - E(s_2)),</math>

where <math>\; U_R (s_i) </math> and <math>\; E(s_i) </math> denote the energies of the reservoir and of the system at <math>s_i</math>, respectively. For the second equality we have used the conservation of energy. Substituting into the first equation relating <math>P(s_1), \; P(s_2)</math>:

:<math>
\frac{P(s_1)}{P(s_2)} = \frac{ e^{ - E(s_1) / kT } }{ e^{ - E(s_2) / kT} },
</math>

which implies, for any state ''s'' of the system,

:<math>
P(s) = \frac{1}{Z} e^{- E(s) / kT},
</math>

where ''Z'' is an appropriately chosen "constant" to make the total probability 1. (''Z'' is constant provided that the temperature ''T'' is invariant.) It is obvious that

:<math>\; Z = \sum _s e^{- E(s) / kT}, </math>

where the index ''s'' runs through all microstates of the system. ''Z'' is sometimes called the Boltzmann '''sum over states''' (or "Zustandssumme" in the original German). If we index the summation via the energy eigenvalues instead of all possible states, degeneracy must be taken into account. The probability of our system having energy <math>\varepsilon _i</math> is simply the sum of the probabilities of all corresponding microstates:

:<math>P (\varepsilon _i) = \frac{1}{Z} g_i e^{- \varepsilon_i / kT}</math>

where, with obvious modification,

:<math>Z = \sum _j g_j e^{- \varepsilon _j / kT};</math>

this is the same result as before.
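
A small numerical illustration of this equivalence (with an arbitrarily chosen set of levels and degeneracies) sums the Boltzmann factors over individual states and over energy levels weighted by degeneracy, and confirms that the resulting probabilities agree and are normalized:

<syntaxhighlight lang="python">
import math

kT = 1.0                                 # work in units where kT = 1
energies = [0.0, 1.0, 2.0]               # level energies (assumed values)
degeneracies = [1, 3, 5]                 # g_i (assumed values)

# Sum over individual microstates: level i contributes g_i identical states.
states = [e for e, g in zip(energies, degeneracies) for _ in range(g)]
Z_states = sum(math.exp(-e / kT) for e in states)

# Sum over energy levels, weighting by degeneracy.
Z_levels = sum(g * math.exp(-e / kT) for e, g in zip(energies, degeneracies))

probabilities = [g * math.exp(-e / kT) / Z_levels
                 for e, g in zip(energies, degeneracies)]

print(Z_states, Z_levels)        # identical
print(sum(probabilities))        # 1.0
</syntaxhighlight>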

Comments on this derivation:
*Notice that in this formulation, the initial assumption "... ''suppose the system has total ''N'' particles''..." is dispensed with. Indeed, the number of particles possessed by the system plays no role in arriving at the distribution. Rather, how many particles would occupy states with energy <math>\varepsilon _i</math> follows as an easy consequence.
*What has been presented above is essentially a derivation of the canonical partition function. As one can see by comparing the definitions, the Boltzmann sum over states is equal to the canonical partition function.
*Exactly the same approach can be used to derive [[Fermi–Dirac statistics|Fermi–Dirac]] and [[Bose–Einstein statistics|Bose–Einstein]] statistics. However, there one would replace the canonical ensemble with the [[grand canonical ensemble]], since there is exchange of particles between the system and the reservoir. Also, the system one considers in those cases is a single particle ''state'', not a particle. (In the above discussion, we could have assumed our system to be a single atom.)

==See also==
*[[Bose–Einstein statistics]]
*[[Fermi–Dirac statistics]]
*[[Boltzmann factor]]

==References==
{{reflist}}

==Bibliography==
*Carter, Ashley H., ''Classical and Statistical Thermodynamics'', Prentice Hall, New Jersey, 2001.
*[[Raj Pathria]], ''Statistical Mechanics'', Butterworth–Heinemann, 1996.

{{Statistical mechanics topics}}

{{DEFAULTSORT:Maxwell-Boltzmann Statistics}}
[[Category:Concepts in physics]]
[[Category:Maxwell–Boltzmann statistics| ]]