Entropy

Entropy measures disorder



Entropy is a physical quantity that is primarily a measure of the thermodynamic disorder of a physical system. Entropy has the unique property that its global value must always increase or stay the same; this property is expressed by the second law of thermodynamics. The fact that entropy must always increase in natural processes introduces the concept of irreversibility and defines a unique direction for the flow of time.



Entropy is a property of all physical systems, and its behavior is described by the second law of thermodynamics (thermodynamics being the study of heat). The first law of thermodynamics states that the total energy of an isolated system is constant; the second law states that the entropy of an isolated system must stay the same or increase. Note that entropy, unlike energy, is not conserved but can increase. A system's entropy can also decrease, but only if it is part of a larger system whose overall entropy does increase.

[Figure: The 10 possible microstates for a system of three atoms sharing three units of energy. Illustration by Argosy. The Gale Group.]

Entropy, first articulated in 1850 by the German physicist Rudolf Clausius (1822–1888), does not correspond to any property of matter that we can sense, such as temperature, and so it is not easy to conceptualize. It can be roughly equated with the amount of energy in a system that is not available for work or, alternatively, with the disorder of a system, but it is not precisely given by either of these concepts. A basic intuitive grasp of what entropy means can be given by statistical mechanics, as described below.

On a fundamental level, entropy is related to the number of possible physical states of a system by the relation S = k ln(Γ), where S is the entropy, k is Boltzmann's constant, and Γ is the number of possible states of the system.
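
In display form, and as one standard reason for the logarithm (a textbook step, not spelled out in this article): when two independent systems are combined, their state counts multiply, so the logarithm makes their entropies add.

```latex
S = k \ln \Gamma , \qquad
\Gamma_{12} = \Gamma_1 \Gamma_2
\;\Longrightarrow\;
S_{12} = k \ln (\Gamma_1 \Gamma_2) = k \ln \Gamma_1 + k \ln \Gamma_2 = S_1 + S_2
```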

Consider a system of three independent atoms, each capable of storing energy in quanta, or units, of fixed size ε. If there happen to be only three units of energy in this system, how many possible microstates (distinct ways of distributing the energy among the atoms) are there? This question is most easily answered, for this example, by listing all the possibilities. There are 10 possible configurations.
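
This count can be checked by brute force. The following is a minimal Python sketch, not part of the original article (the names QUANTA and ATOMS are ours): it enumerates every assignment of whole quanta to the three atoms and keeps those whose energies total three units.

```python
from itertools import product

QUANTA = 3  # total units of energy shared by the system
ATOMS = 3   # number of independent atoms

# A microstate assigns each atom a whole number of quanta (0 through 3);
# only assignments whose quanta sum to the total are physically allowed.
microstates = [state for state in product(range(QUANTA + 1), repeat=ATOMS)
               if sum(state) == QUANTA]

for label, state in enumerate(microstates, start=1):
    print(label, state)

print(len(microstates))  # prints 10
```

The ten tuples printed correspond, up to ordering, to the ten configurations in the figure.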

Let n0 stand for the number of atoms in the system with 0ε energy, n1 for the number with 1ε, n2 for the number with 2ε, and n3 for the number with 3ε. For example, in the microstates labeled 1, 2, and 3 in the figure that accompanies this article, (n0, n1, n2, n3) = (2, 0, 0, 1); that is, two atoms have 0ε energy, no atoms have 1ε or 2ε, and one atom has 3ε.

Each class or group of microstates corresponds to a distinct (n0, n1, n2, n3) distribution. For a system of three atoms sharing three units of energy there are three possible distributions, and if P stands for the number of microstates corresponding to a distribution, the three distributions have P = 3, P = 6, and P = 1. These values can be verified by counting the microstates in the figure that realize each distribution: P1 = 3 microstates realize (2, 0, 0, 1), P2 = 6 realize (1, 1, 1, 0), and P3 = 1 realizes (0, 3, 0, 0).
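
To make the grouping concrete, the same enumeration can be binned by distribution. This sketch makes the same assumptions as the one above; the helper name distribution is ours.

```python
from collections import Counter
from itertools import product

QUANTA, ATOMS = 3, 3
microstates = [s for s in product(range(QUANTA + 1), repeat=ATOMS)
               if sum(s) == QUANTA]

def distribution(state):
    # (n0, n1, n2, n3): how many atoms hold 0, 1, 2, or 3 quanta.
    return tuple(state.count(level) for level in range(QUANTA + 1))

for dist, count in Counter(distribution(s) for s in microstates).items():
    print(dist, "->", count, "microstates")
# (2, 0, 0, 1) -> 3 microstates   (distribution P1)
# (1, 1, 1, 0) -> 6 microstates   (distribution P2)
# (0, 3, 0, 0) -> 1 microstates   (distribution P3)
```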

The distribution P2—representing the distribution (n0, n1, n2, n3) = (1, 1, 1, 0)—has the most possible microstates (six). If we assume that this system is constantly, randomly shifting from one microstate to another, that any microstate is equally likely to follow any other, and that we inspect the system at some randomly-chosen instant, we are, therefore, most likely to observe one of the microstates corresponding to distribution P2. Specifically, the probability of observing a microstate corresponding to distribution P2 is 0.6 (6 chances out of 10). The probability of observing distribution P1 is 0.3 (3 chances out of 10) and the probability of observing distribution P3 is 0.1 (1 chance out of 10).
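
The same numbers follow in closed form: because the three atoms are distinguishable, the number of microstates realizing a distribution is the multinomial coefficient 3!/(n0! n1! n2! n3!). A quick check (our own, not from the article):

```python
import math

def ways(dist):
    # Microstates realizing (n0, n1, n2, n3): 3! / (n0! n1! n2! n3!)
    atoms = sum(dist)
    denom = math.prod(math.factorial(n) for n in dist)
    return math.factorial(atoms) // denom

for dist in [(2, 0, 0, 1), (1, 1, 1, 0), (0, 3, 0, 0)]:
    count = ways(dist)
    print(dist, count, "->", count / 10)
# (2, 0, 0, 1) 3 -> 0.3
# (1, 1, 1, 0) 6 -> 0.6
# (0, 3, 0, 0) 1 -> 0.1
```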

The entropy of this or any other system, S, is defined as S = k ln(Pmax), where Pmax is the number of microstates corresponding to the most probable distribution of the system (Pmax = 6 in this example), k is the Boltzmann constant (1.3807 × 10⁻¹⁶ ergs per kelvin), and ln is the natural logarithm. Further inspection of this equation and the three-atom example given above will clarify some of the basic properties of entropy.
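
Plugging in the numbers from the example (a sketch; the constant is the CGS value of the Boltzmann constant):

```python
import math

K_B = 1.3807e-16  # Boltzmann constant, ergs per kelvin (CGS)
P_MAX = 6         # microstates in the most probable distribution, P2

S = K_B * math.log(P_MAX)  # math.log is the natural logarithm
print(S)  # about 2.47e-16 ergs per kelvin
```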


(1) Microstates 1, 2, and 3 of the three-atom system described above, in which the energy available to the system is concentrated entirely on one atom, are in some sense clearly the most orderly or organized. Yet these three microstates (distribution P1) are also unlikely; their total probability of occurrence at any moment is only half that of distribution P2. Order is less likely than disorder.

