
Entropy

Entropy Is Not Conserved

All that is needed to increase the entropy of an isolated system is to increase the number of microstates its particles can occupy; say, by allowing the system to occupy more space. It is beyond the scope of this discussion to prove that the entropy of a closed system can never decrease, but the claim can be made plausible by considering the first law of thermodynamics, which states that energy can be neither created nor destroyed. As long as a system retains the same number of atoms and the same number of quanta of energy to share among them, it is plausible that the system possesses a minimum number of possible microstates, and therefore a minimum entropy.
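
To make the counting concrete, the following sketch (an illustration added here, not part of the original article) tallies microstates with the standard "stars and bars" formula for distributing indistinguishable quanta of energy among distinguishable atoms; the three atoms echo the example above, while the particular numbers of quanta are assumed for illustration.

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def microstates(atoms: int, quanta: int) -> int:
    """Ways to distribute indistinguishable quanta among
    distinguishable atoms: C(quanta + atoms - 1, quanta)."""
    return comb(quanta + atoms - 1, quanta)

def entropy(n_microstates: int) -> float:
    """Statistical entropy S = k ln(W), in joules per kelvin."""
    return k_B * log(n_microstates)

# Three atoms sharing three quanta of energy (illustrative values):
W1 = microstates(3, 3)   # 10 microstates
# The same three atoms given access to six quanta:
W2 = microstates(3, 6)   # 28 microstates

print(W1, entropy(W1))   # 10  ~3.18e-23 J/K
print(W2, entropy(W2))   # 28  ~4.60e-23 J/K
# More accessible microstates means higher entropy; with the number of
# atoms and quanta held fixed, the count (and the entropy) cannot drop.
```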

It is sometimes claimed that "entropy always increases," and that the second law requires disorder to increase whenever nature is left to its own devices. This is incorrect. Note that the above example stipulates a system of three independent atoms; yet atoms rarely behave independently when in proximity to one another at low temperatures. They tend to form bonds, spontaneously organizing themselves into orderly structures (molecules and crystals). Order from disorder is therefore just as natural a process as disorder from order. At low temperatures, self-ordering predominates; at high temperatures, entropy effects dominate and order is broken down. Furthermore, any system that is not isolated can experience decreasing entropy (increasing order) at the expense of increasing entropy elsewhere. Earth is one such non-isolated system: it shares the solar system with the Sun, whose entropy is increasing rapidly. It is therefore an error to claim, as some writers do, that biological evolution, which involves spontaneously increasing order in natural molecular systems, contradicts thermodynamics. Entropy does not forbid molecular self-organization, because entropy is only one property of matter; entropy discourages self-organization, but other properties of matter encourage it, and in some circumstances (especially at relatively low temperatures, such as those at the surface of Earth) they will prevail.
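
The bookkeeping behind "order here, paid for by disorder elsewhere" can be sketched numerically. In the toy calculation below (the temperatures and the parcel of heat are assumed round figures, not values from this article), heat leaving the hot Sun and arriving at the cooler Earth always increases the total entropy, which is what leaves room for local order to grow:

```python
# Entropy change when a parcel of heat flows from the Sun to Earth.
T_sun = 5800.0     # K, approximate solar surface temperature (assumed)
T_earth = 288.0    # K, approximate mean surface temperature of Earth
Q = 1.0e6          # J, an arbitrary parcel of radiated heat (assumed)

dS_sun = -Q / T_sun      # the Sun loses entropy as heat leaves it
dS_earth = Q / T_earth   # Earth gains entropy as the heat arrives

print(f"Sun:   {dS_sun:+8.1f} J/K")    #  -172.4 J/K
print(f"Earth: {dS_earth:+8.1f} J/K")  # +3472.2 J/K
print(f"Total: {dS_sun + dS_earth:+8.1f} J/K")  # +3299.8 J/K, positive
# Because T_earth < T_sun, the total always rises, so pockets of
# decreasing entropy on Earth (crystals, organisms) cost nothing overall.
```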

An alternative derivation of the entropy concept, based on the properties of heat engines (devices that partially convert heat flows into mechanical work), is often presented in textbooks. This approach produces a definition of entropy that seems to differ from S = k ln(Pmax), namely dS = dQrev/T, where dS is an infinitesimal (very small) change in a system's entropy when an infinitesimal quantity of heat, dQrev, flows reversibly into or out of the system at the fixed temperature T. However, it can be shown that the two definitions are exactly equivalent; indeed, the entropy concept was originally developed from the analysis of heat engines, and the statistical interpretation given above was not invented until later.
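
The equivalence can be checked for a simple concrete case. The sketch below is an illustration under assumed conditions (one mole of ideal gas doubling its volume reversibly at constant temperature; this example is not from the original text): it computes the entropy change once from the heat-engine definition dS = dQrev/T, and once by counting states in the spirit of S = k ln(Pmax), using the fact that each molecule's accessible positions scale with the volume.

```python
from math import log

k_B = 1.380649e-23    # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro's number, 1/mol
R = k_B * N_A         # gas constant, J/(mol*K)

n = 1.0               # moles of ideal gas (assumed)
V_ratio = 2.0         # final volume / initial volume (assumed)

# Clausius route: a reversible isothermal expansion absorbs
# Qrev = n R T ln(V2/V1), so integrating dS = dQrev/T cancels the T:
dS_heat_engine = n * R * log(V_ratio)

# Statistical route: each of the N molecules has accessible positions
# proportional to V, so the microstate count scales as (V2/V1)**N:
N = n * N_A
dS_statistical = N * k_B * log(V_ratio)

print(dS_heat_engine, dS_statistical)  # both ~5.76 J/K, identical
```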

The entropy concept is fundamental to all science that deals with heat, efficiency, the energy of systems, chemical reactions, very low temperatures, and related topics. Its physical meaning is, in essence, that the amount of work the universe can perform is always declining as its "orderliness" declines, and must eventually approach zero. In other words, things are running down, and there is no way to stop them.


K. Lee Lerner
Larry Gilman
