{{Thermodynamics|cTopic=[[List of thermodynamic properties|System properties]]}}
{{Introductory article|Entropy}}

The idea of "[[irreversibility]]" is central to the understanding of '''[[entropy]]'''. Everyone has an intuitive understanding of irreversibility - if one watches a movie of everyday life running forward and one of it running in reverse, it is easy to distinguish between the two. The movie running in reverse shows impossible things happening - water jumping out of a glass into a pitcher above it, smoke going down a chimney, water "unmelting" to form ice in a warm room, crashed cars reassembling themselves, and so on. The intuitive meaning of expressions such as "don't cry over spilled milk" or "you can't take the cream out of the coffee" is that these are irreversible processes. There is a direction in time by which spilled milk does not go back into the glass.

In thermodynamics, one says that the "forward" processes - pouring water from a pitcher, smoke going up a chimney, and so on - are "irreversible": they cannot happen in reverse, even though, on a microscopic level, no [[laws of physics]] are being violated. All real physical processes involving systems in everyday life, with many atoms or molecules, are irreversible. For an irreversible process in an [[isolated system]], the thermodynamic state variable known as entropy always increases. The reason the movie in reverse is so easily recognized is that it shows processes for which entropy is decreasing, which is physically impossible (or, more precisely, statistically improbable){{Cn|date=January 2014}}. In everyday life, there may be processes in which the increase of entropy is practically unobservable, almost zero. In these cases, a movie of the process run in reverse will not seem unlikely. For example, in a one-second video of the collision of two billiard balls, it will be hard to distinguish the forward and the backward case, because the increase of entropy during that time is relatively small. In thermodynamics, one says that this process is practically "reversible", with an entropy increase that is practically zero. The statement that entropy never decreases is found in the [[second law of thermodynamics]].

In a physical system, '''entropy''' provides a measure of the amount of thermal energy that ''cannot'' be used to do [[Work (thermodynamics)|work]]. In some other definitions of entropy, it is a measure of how evenly energy (or some analogous property) is distributed in a system. ''Work'' and ''[[heat]]'' are determined by a process that a system undergoes, and occur only at the boundary of a system. ''Entropy'' is a function of the state of a system, and has a value determined by the state variables of the system.

The concept of entropy is central to the second law of thermodynamics. The second law determines which physical processes can occur. For example, it predicts that the flow of heat from a region of high temperature to a region of low temperature is a [[spontaneous process]] - it can proceed by itself without needing any extra external energy. When this process occurs, the hot region becomes cooler and the cold region becomes warmer. Heat is distributed more evenly throughout the system, and the system's ability to do work has decreased because the temperature difference between the hot region and the cold region has decreased. Referring back to our definition of entropy, we see that the entropy of this system has increased. Thus, the second law of thermodynamics can be stated as follows: the entropy of an isolated system always increases, and processes which increase entropy can occur spontaneously. Since entropy increases as uniformity increases, the second law says, qualitatively, that uniformity increases.
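This can be made quantitative in the simplest case. Because ''T''<sub>hot</sub> > ''T''<sub>cold</sub>, the hot region loses less entropy than the cold region gains when a small amount of heat ''δQ'' flows between them, so the total entropy change is positive:

:<math>\Delta S = \frac{\delta Q}{T_{cold}} - \frac{\delta Q}{T_{hot}} > 0.</math>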
The term ''entropy'' was coined in 1865 by the German physicist [[Rudolf Clausius]], from the Greek words ''en-'', "in", and ''trope'', "a turning", in analogy with ''[[energy]]''.<ref>{{Cite web|title=etymonline.com:entropy|url=http://www.etymonline.com/index.php?search=entropy&searchmode=none|accessdate=2009-06-15}}</ref>

==Explanation==
The concept of thermodynamic entropy arises from the [[second law of thermodynamics]]. The second law uses entropy to quantify the capacity of a system for change - namely, that heat flows from a region of higher temperature to one of lower temperature - and to determine whether a thermodynamic process may occur.

Entropy is defined by two descriptions: first, as a [[macroscopic]] relationship between [[heat flow]] into a system and the system's change in temperature; and second, on a microscopic level, as proportional to the [[natural logarithm]] of the number of [[Microstate (statistical mechanics)|microstates]] of a system.
Following the formalism of Clausius, the first definition can be mathematically stated as:<ref>I. Klotz, R. Rosenberg, ''Chemical Thermodynamics - Basic Concepts and Methods'', 7th ed., Wiley (2008), p. 125</ref>

:<math>{\rm d}S = \frac{\delta q}{T},</math>

where d''S'' is the change in entropy, δ''q'' is the heat added to the system, and ''T'' is the absolute temperature; the relation holds only during a ''reversible'' process{{Why|date=February 2013}}. If the temperature is allowed to vary, the equation must be [[integral|integrated]] over the temperature path. This definition of entropy does not allow the determination of an absolute value, only of differences. In this context, the second law of thermodynamics may be stated as follows: for heat transferred over any valid process for any system, whether isolated or not,
:<math>{\rm d}S \ge \frac{\delta q}{T}.</math>
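When the temperature varies during a reversible process, the defining relation above is integrated along the path, giving the entropy difference between two states explicitly:

:<math>\Delta S = S_{final} - S_{initial} = \int_{initial}^{final} \frac{\delta q_{rev}}{T}.</math>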
The second definition of entropy comes from [[statistical mechanics]]. The entropy of a particular [[Microstate (statistical mechanics)|macrostate]] is defined to be [[Boltzmann's constant]] times the [[natural logarithm]] of the number of microstates corresponding to that macrostate, or mathematically

:<math>S = k_{B} \ln \Omega,</math>

where ''S'' is the entropy, ''k''<sub>B</sub> is Boltzmann's constant, and Ω is the number of microstates.
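Because the dependence is logarithmic, entropy grows only slowly with the number of microstates. For example, doubling the number of accessible microstates adds only

:<math>\Delta S = k_{B} \ln (2\Omega) - k_{B} \ln \Omega = k_{B} \ln 2 \approx 9.57 \times 10^{-24}\ \mathrm{J/K}.</math>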
The macrostate of a system is what we know about the system, for example the [[temperature]], [[pressure]], and [[volume (thermodynamics)|volume]] of a gas in a box. For each set of values of temperature, pressure, and volume there are many arrangements of molecules which result in those values. The number of arrangements of molecules which could result in the same values for temperature, pressure, and volume is the number of microstates.
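A deliberately simplified counting model illustrates this. Suppose each of ''N'' gas molecules may be found, with equal likelihood, in either the left or the right half of a box, and ignore everything else about the molecules' positions and velocities. Each assignment of molecules to halves is one microstate, so this toy model has Ω = 2<sup>''N''</sup> microstates and an entropy of

:<math>S = k_{B} \ln 2^{N} = N k_{B} \ln 2.</math>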
The concept of [[energy]] is related to the [[first law of thermodynamics]], which deals with the [[conservation of energy]] and under which a loss of heat results in a decrease in the [[internal energy]] of the [[thermodynamic system]]. Thermodynamic entropy provides a comparative measure of the amount of this decrease in internal energy of the system and the corresponding increase in internal energy of the surroundings at a given temperature. A simple and more concrete visualization of the second law is that energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. Entropy change is the quantitative measure of that kind of spontaneous process: how much energy has flowed, or how widely it has become spread out, at a specific temperature.

Entropy has been developed to describe any of several phenomena, depending on the field and the context in which it is being used. [[Information entropy]] takes the mathematical concepts of [[statistical thermodynamics]] into areas of [[probability theory]] unconnected with heat and energy.

[[Image:Ice water.jpg|thumb|Ice melting provides an example of entropy ''increasing'']]
==Example of increasing entropy==
{{Main|Disgregation}}
Ice melting provides an example in which entropy increases in a small system: a thermodynamic system consisting of the surroundings (the warm room) and the entity of glass container, ice, and water, which has been allowed to reach [[thermodynamic equilibrium]] at the melting temperature of ice. In this system, some [[heat]] (''δQ'') from the warmer surroundings at 298 K (77 °F, 25 °C) transfers to the cooler system of ice and water at its constant temperature (''T'') of 273 K (32 °F, 0 °C), the melting temperature of ice. The entropy of the system, which is ''δQ/T'', increases by ''δQ''/273 K. The heat ''δQ'' for this process is the energy required to change water from the solid state to the liquid state, and is called the [[enthalpy of fusion]], i.e. Δ''H'' for ice fusion.
It is important to realize that the entropy of the surrounding room decreases less than the entropy of the ice and water increases: the room temperature of 298 K is larger than 273 K, and therefore the entropy change ''δQ''/298 K for the surroundings is smaller than the entropy change ''δQ''/273 K for the ice and water system. This is always true in spontaneous events in a thermodynamic system, and it shows the predictive importance of entropy: the final net entropy after such an event is always greater than was the initial entropy.
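Using the enthalpy of fusion of ice, about 6008 joules per mole (the figure used in the calculation later in this article), the net change for melting one mole of ice in a 298 K room can be made concrete:

:<math>\Delta S_{total} = \frac{6008\ \mathrm{J}}{273\ \mathrm{K}} - \frac{6008\ \mathrm{J}}{298\ \mathrm{K}} \approx 22.0\ \mathrm{J/K} - 20.2\ \mathrm{J/K} = 1.8\ \mathrm{J/K} > 0.</math>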
As the temperature of the cool water rises to that of the room and the room cools further, imperceptibly, the sum of ''δQ/T'' over the continuous range of temperatures, taken at many increments, from the initially cool to the finally warm water can be found by calculus. The entire miniature 'universe', i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that 'universe' than when the glass of ice and water was introduced and became a 'system' within it.
==Origins and uses==
Originally, entropy was named to describe the "waste heat", or more accurately, [[energy loss]]es, from heat engines and other mechanical devices which could never run with 100% efficiency in converting energy into work. Later, the term came to acquire several additional descriptions, as more was understood about the behavior of molecules on the microscopic level. In the late 19th century, the word "disorder" was used by [[Ludwig Boltzmann]] in developing [[Entropy (statistical views)|statistical views of entropy]] using [[probability theory]] to describe the increased molecular movement on the microscopic level. That was before quantum behavior came to be better understood by [[Werner Heisenberg]] and those who followed. Descriptions of thermodynamic (heat) entropy on the microscopic level are found in statistical thermodynamics and [[statistical mechanics]].

For most of the 20th century, textbooks tended to describe entropy as "disorder", following Boltzmann's early conceptualisation of the [[kinetic energy|"motional" (i.e. kinetic) energy]] of molecules. More recently, there has been a trend in chemistry and physics textbooks to describe [[Entropy (energy dispersal)|entropy as energy dispersal]].<ref>[http://entropysite.oxy.edu/ welcome to entropysite.]</ref> Entropy can also involve the dispersal of particles, which are themselves energetic. Thus there are instances where both particles and energy disperse at different rates when substances are mixed together.
The mathematics developed in statistical thermodynamics was found to be applicable in other disciplines. In particular, the information sciences developed the concept of [[information entropy]], in which a constant replaces the temperature that is inherent in thermodynamic entropy.
==Heat and entropy==
At a microscopic level, [[kinetic energy]] of molecules is responsible for the [[temperature]] of a substance or a system. "Heat" is the kinetic energy of molecules being transferred: when motional energy is transferred from hotter surroundings to a cooler system, faster-moving molecules in the surroundings collide with the walls of the system, which [[energy transfer|transfers]] some of their energy to the molecules of the system and makes them move faster.
* Molecules in a [[gas]] like [[nitrogen]] at room temperature at any instant are moving at an average speed of roughly 500 metres per second (over 1,000 miles per hour), repeatedly colliding and therefore exchanging energy so that their individual speeds are always changing. Assuming an [[ideal gas|ideal-gas]] model, average kinetic energy increases linearly with temperature, so the average speed increases as the square root of temperature (see the worked example after this list).
** Thus motional molecular energy ("heat energy") from hotter surroundings, like faster-moving molecules in a [[flame]] or violently vibrating iron atoms in a hot plate, will melt or boil a substance (the system) at the temperature of its melting or boiling point. That amount of motional energy from the surroundings that is required for melting or boiling is called the phase-change energy, specifically the enthalpy of fusion or of vaporization, respectively. This phase-change energy breaks bonds between the molecules in the system (not chemical bonds inside the molecules that hold the atoms together) rather than contributing to the motional energy and making the molecules move any faster - so it does not raise the temperature, but instead enables the molecules to break free to move as a liquid or as a vapor.

** In terms of energy, when a solid becomes a liquid or a liquid a vapor, motional energy coming from the surroundings is changed to 'potential energy' in the substance ([[phase transition|phase change]] energy, which is released back to the surroundings when the surroundings become cooler than the substance's boiling or melting temperature, respectively). Phase-change energy increases the entropy of a substance or system because it is energy that must be spread out in the system from the surroundings so that the substance can exist as a liquid or vapor at a temperature above its melting or boiling point. When this process occurs in a 'universe' that consists of the surroundings plus the system, the total energy of the 'universe' becomes more dispersed or spread out as part of the greater energy that was only in the hotter surroundings transfers so that some is in the cooler system. This energy dispersal increases the entropy of the 'universe'.
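The square-root dependence of molecular speed on temperature noted above can be made explicit with the standard ideal-gas expression for the root-mean-square speed. For a nitrogen molecule (mass about 4.65 × 10<sup>−26</sup> kg) at 298 K:

:<math>v_{rms} = \sqrt{\frac{3 k_{B} T}{m}} = \sqrt{\frac{3 (1.38 \times 10^{-23}\ \mathrm{J/K})(298\ \mathrm{K})}{4.65 \times 10^{-26}\ \mathrm{kg}}} \approx 515\ \mathrm{m/s},</math>

and doubling the absolute temperature raises this speed by a factor of only <math>\sqrt{2} \approx 1.41</math>.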
The important overall principle is that ''energy of all types changes from being localized to becoming dispersed or spread out, if not hindered from doing so. Entropy (or better, entropy change) is the quantitative measure of that kind of spontaneous process: how much energy has been transferred/T or how widely it has become spread out at a specific temperature.''
===Classical calculation of entropy===
When entropy was first defined and used in 1865, the very existence of atoms was still controversial and there was no concept that temperature was due to the motional energy of molecules, or that "heat" was actually the transferring of that motional molecular energy from one place to another. Entropy change, <math>\Delta S</math>, was described in macroscopic terms that could be directly measured, such as volume, temperature, or pressure. However, today the classical equation of entropy, <math>\Delta S = \frac{q_{rev}}{T}</math>, can be explained, part by part, in modern terms describing how molecules are responsible for what is happening:
* <math>\Delta S</math> is the change in entropy of a system (some physical substance of interest) after some motional energy ("heat") has been transferred to it by fast-moving molecules. So, <math>\Delta S = S_{final} - S_{initial}</math>.
* Then, <math> \Delta S = S_{final} - S_{initial} = \frac{q_{rev}}{T}</math>, the quotient of the motional energy ("heat") ''q'' that is transferred "reversibly" (rev) to the system from the surroundings (or from another system in contact with the first system) divided by ''T'', the absolute temperature at which the transfer occurs.

** "Reversible" or "reversibly" (rev) simply means that ''T'', the temperature of the system, has to stay (almost) exactly the same while any energy is being transferred to or from it. That is easy in the case of phase changes, where the system absolutely must stay in the solid or liquid form until enough energy is given to it to break bonds between the molecules before it can change to a liquid or a gas. For example, in the melting of ice at 273.15 K, no matter what temperature the surroundings are - from 273.20 K to 500 K or even higher - the temperature of the ice will stay at 273.15 K until the last molecules in the ice are changed to liquid water, i.e., until all the hydrogen bonds between the water molecules in ice are broken and new, less-exactly fixed hydrogen bonds between liquid water molecules are formed. The amount of energy necessary for ice melting per mole has been found to be 6008 joules at 273 K. Therefore, the entropy change per mole is <math>\frac{q_{rev}}{T} = \frac{6008\ \mathrm{J}}{273\ \mathrm{K}}</math>, or 22 J/K.
** When the temperature is not at the melting or boiling point of a substance, no intermolecular bond-breaking is possible, and so any motional molecular energy ("heat") from the surroundings transferred to a system raises its temperature, making its molecules move faster and faster. As the temperature is constantly rising, there is no longer a particular value of ''T'' at which energy is transferred. However, a "reversible" energy transfer can be measured at a very small temperature increase, and a cumulative total can be found by adding each of many small temperature intervals or increments. For example, to find the entropy change <math>\frac{q_{rev}}{T}</math> from 300 K to 310 K, measure the amount of energy transferred at dozens or hundreds of temperature increments, say from 300.00 K to 300.01 K and then 300.01 K to 300.02 K and so on, dividing the ''q'' by each ''T'', and finally adding them all.
** Calculus can be used to make this calculation easier if the effect of energy input to the system is linearly dependent on the temperature change, as in simple heating of a system at moderate to relatively high temperatures. Thus, the energy being transferred "per incremental change in temperature" (the heat capacity, <math>C_p</math>), multiplied by the [[integral]] of <math>\frac{dT}{T}</math> from <math>T_{initial}</math> to <math>T_{final}</math>, is directly given by <math>\Delta S = C_p \ln\frac{T_{final}}{T_{initial}}</math> (see the worked example after this list).
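As a worked example of the formula just given, consider heating one mole of liquid water from 300 K to 310 K. Taking the molar heat capacity of liquid water to be about 75.3 J/(mol·K), and assuming it stays constant over this small range:

:<math>\Delta S = C_p \ln\frac{T_{final}}{T_{initial}} \approx 75.3\ \mathrm{J/(mol \cdot K)} \times \ln\frac{310\ \mathrm{K}}{300\ \mathrm{K}} \approx 2.5\ \mathrm{J/(mol \cdot K)}.</math>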
==Introductory descriptions of entropy==
Traditionally, 20th-century textbooks have introduced [[Entropy (order and disorder)|entropy as order and disorder]], so that it provides "a measurement of the disorder or randomness of a system". It has been argued that ambiguities in the terms used (such as "disorder" and "chaos") contribute to widespread confusion and can hinder comprehension of entropy for most students. A more recent formulation associated with [[Frank L. Lambert]] describes [[Entropy (energy dispersal)|entropy as energy dispersal]].<ref>[http://entropysite.oxy.edu/ welcome to entropysite.]</ref>
==See also==
* [[Entropy (energy dispersal)]]
* [[Second law of thermodynamics]]
* [[Statistical mechanics]]
* [[Thermodynamics]]
==References==
{{Reflist}}
==Further reading==
*{{cite book|author1=Goldstein, Martin|author2=Goldstein, Inge F.|year=1993|title=The Refrigerator and the Universe: Understanding the Laws of Energy|publisher=Harvard Univ. Press|url=http://books.google.de/books/about/The_refrigerator_and_the_universe.html?id=PDnG4dtaixYC&redir_esc=y|isbn=9780674753259}} Chapters 4-12 touch on entropy.
{{DEFAULTSORT:Introduction To Entropy}}
[[Category:Thermodynamic entropy]]
[[Category:Introduction articles|Entropy]]