### Learning Outcomes

- Define entropy
- Explain the relationship between entropy and the number of microstates
- Predict the sign of the entropy change for chemical and physical processes

In 1824, at the age of 28, Nicolas Léonard Sadi Carnot (Figure 1) published the results of an extensive study regarding the efficiency of steam heat engines. In a later review of Carnot’s findings, Rudolf Clausius introduced a new thermodynamic property that relates the spontaneous heat flow accompanying a process to the temperature at which the process takes place. This new property was expressed as the ratio of the *reversible* heat (*q*_{rev}) to the kelvin temperature (*T*). The term **reversible process** refers to a process that takes place at such a slow rate that it is always at equilibrium and its direction can be changed (it can be “reversed”) by an infinitesimally small change in some condition. Note that the idea of a reversible process is a formalism required to support the development of various thermodynamic concepts; no real processes are truly reversible, and so all real processes are classified as *irreversible*.

Similar to other thermodynamic properties, this new quantity is a state function, and so its change depends only upon the initial and final states of a system. In 1865, Clausius named this property **entropy (*S*)** and defined its change for any process as follows:

[latex]\Delta S=\dfrac{{q}_{\text{rev}}}{T}[/latex]

The entropy change for a real, irreversible process is then equal to that for the theoretical reversible process that involves the same initial and final states.

## Entropy and Microstates

Following the work of Carnot and Clausius, Ludwig Boltzmann developed a molecular-scale statistical model that related the entropy of a system to the number of *microstates* possible for the system. A **microstate (*W*)** is a specific configuration of the locations and energies of the atoms or molecules that comprise a system. The entropy of a system is related to the number of possible microstates as follows:

[latex]S=k\text{ln}W[/latex]

Here *k* is the Boltzmann constant and has a value of 1.38 [latex]\times [/latex] 10^{−23} J/K.

As for other state functions, the change in entropy for a process is the difference between its final (*S*_{f}) and initial (*S*_{i}) values:

[latex]\Delta S={S}_{\text{f}}-{S}_{\text{i}}=k\text{ln}{W}_{\text{f}}-k\text{ln}{W}_{\text{i}}=k\text{ln}\dfrac{{W}_{\text{f}}}{{W}_{\text{i}}}[/latex]

For processes involving an increase in the number of microstates, *W*_{f} > *W*_{i}, the entropy of the system increases, Δ*S* > 0. Conversely, processes that reduce the number of microstates, *W*_{f} < *W*_{i}, yield a decrease in system entropy, Δ*S* < 0. This molecular-scale interpretation of entropy provides a link to the probability that a process will occur as illustrated in the next paragraphs.
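
To get a feel for the magnitudes involved, the equation Δ*S* = *k* ln(*W*_{f}/*W*_{i}) can be evaluated numerically. The sketch below (an illustration, not part of the original text) assumes a simple positional model in which doubling the volume available to each particle doubles its number of possible locations, so that *W*_{f}/*W*_{i} = 2^{*N*} for *N* particles:

```python
import math

k = 1.380649e-23      # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro's number, 1/mol

# If each of N particles has twice as many accessible locations after the
# process, then W_f / W_i = 2**N, and
#   Delta S = k * ln(W_f / W_i) = N * k * ln(2)
N = N_A  # one mole of particles
delta_S = N * k * math.log(2)
print(f"Delta S for one mole, volume doubled: {delta_S:.2f} J/K")  # about 5.76 J/K
```

Note that *N*_A *k* = *R*, so this result is simply *R* ln 2, a convenient check on the arithmetic.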

Consider the general case of a system comprised of *N* particles distributed among *n* boxes. The number of microstates possible for such a system is *n ^{N}*. For example, distributing four particles among two boxes will result in 2^{4} = 16 different microstates as illustrated in Figure 2. Microstates with equivalent particle arrangements (not considering individual particle identities) are grouped together and are called *distributions*. The probability that a system will exist with its components in a given distribution is proportional to the number of microstates within the distribution. Since entropy increases logarithmically with the number of microstates, *the most probable distribution is therefore the one of greatest entropy*.

For this system, the most probable configuration is one of the six microstates associated with distribution (c) where the particles are evenly distributed between the boxes, that is, a configuration of two particles in each box. The probability of finding the system in this configuration is [latex]\frac{6}{16}[/latex] or [latex]\frac{3}{8}[/latex]. The least probable configuration of the system is one in which all four particles are in one box, corresponding to distributions (a) and (e), each with a probability of [latex]\frac{1}{16}[/latex]. The probability of finding all particles in only one box (either the left box or right box) is then [latex]\left(\frac{1}{16}+\frac{1}{16}\right)=\frac{2}{16}[/latex] or [latex]\frac{1}{8}[/latex].
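
This counting can be verified by brute-force enumeration. The short Python sketch below (illustrative only) generates all 16 microstates for four distinguishable particles in two boxes and recovers the probabilities quoted above:

```python
from itertools import product
from collections import Counter

# Each of four distinguishable particles goes into box L or box R,
# giving 2**4 = 16 equally likely microstates.
microstates = list(product("LR", repeat=4))
assert len(microstates) == 16

# Group microstates into distributions by the number of particles in box L.
counts = Counter(state.count("L") for state in microstates)
print(dict(sorted(counts.items())))  # {0: 1, 1: 4, 2: 6, 3: 4, 4: 1}

# Probability of the even 2-2 split, and of all four particles in one box.
print(counts[2] / 16)                # 0.375  (6/16)
print((counts[0] + counts[4]) / 16)  # 0.125  (2/16)
```

The distribution sizes 1, 4, 6, 4, 1 are the binomial coefficients for *N* = 4, which is why the even split dominates.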

As you add more particles to the system, the number of possible microstates increases exponentially (2^{N}). A macroscopic (laboratory-sized) system would typically consist of moles of particles (*N* ~ 10^{23}), and the corresponding number of microstates would be staggeringly huge. Regardless of the number of particles in the system, however, the distributions in which roughly equal numbers of particles are found in each box are always the most probable configurations.
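
How quickly the all-in-one-box configurations become negligible can be seen numerically. In the two-box model, their combined probability is 2/2^{N}; this loop (illustrative only) prints it for increasing *N*:

```python
# Probability that all N particles occupy a single box (either of the two)
# in the two-box model: 2 / 2**N.
for N in (4, 10, 20, 100):
    print(f"N = {N:3d}: P(all in one box) = {2 / 2**N:.3g}")
```

Already at *N* = 100 the probability is on the order of 10^{−30}; for a mole of particles it is unimaginably small.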

This matter dispersal model of entropy is often described qualitatively in terms of the *disorder* of the system. By this description, microstates in which all the particles are in a single box are the most ordered, thus possessing the least entropy. Microstates in which the particles are more evenly distributed among the boxes are more disordered, possessing greater entropy.

The previous description of an ideal gas expanding into a vacuum (Figure 3) is a macroscopic example of this particle-in-a-box model. For this system, the most probable distribution is confirmed to be the one in which the matter is most uniformly dispersed or distributed between the two flasks. The spontaneous process whereby the gas contained initially in one flask expands to fill both flasks equally therefore yields an increase in entropy for the system.

A similar approach may be used to describe the spontaneous flow of heat. Consider a system consisting of two objects, each containing two particles, and two units of energy (represented as “*”) in Figure 4. The hot object is comprised of particles **A** and **B** and initially contains both energy units. The cold object is comprised of particles **C** and **D** and initially has no energy units. Distribution (a) shows the three microstates possible for the initial state of the system, with both units of energy contained within the hot object. If one of the two energy units is transferred, the result is distribution (b) consisting of four microstates. If both energy units are transferred, the result is distribution (c) consisting of three microstates. And so, we may describe this system by a total of ten microstates. The probability that the heat does not flow when the two objects are brought into contact, that is, that the system remains in distribution (a), is [latex]\frac{3}{10}[/latex]. More likely is the flow of heat to yield one of the other two distributions, the combined probability being [latex]\frac{7}{10}[/latex]. The most likely result is the flow of heat to yield the uniform dispersal of energy represented by distribution (b), the probability of this configuration being [latex]\frac{4}{10}[/latex]. As for the previous example of matter dispersal, extrapolating this treatment to macroscopic collections of particles dramatically increases the probability of the uniform distribution relative to the other distributions. This supports the common observation that placing hot and cold objects in contact results in spontaneous heat flow that ultimately equalizes the objects’ temperatures. And, again, this spontaneous process is also characterized by an increase in system entropy.
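
The ten microstates can likewise be enumerated directly. The sketch below (illustrative only) assumes the two energy units are indistinguishable and that a single particle may hold both, which reproduces the 3 + 4 + 3 grouping described above:

```python
from itertools import combinations_with_replacement
from collections import Counter

# Two indistinguishable energy units distributed over four particles:
# A and B on the hot object, C and D on the cold object.
microstates = list(combinations_with_replacement("ABCD", 2))
assert len(microstates) == 10

# Classify each microstate by how many energy units sit on the hot object.
hot = {"A", "B"}
dist = Counter(sum(p in hot for p in state) for state in microstates)
# dist maps 2 -> 3, 1 -> 4, 0 -> 3, i.e. distributions (a), (b), (c)

print(dist[2] / 10)              # 0.3  P(no heat flows)
print((dist[1] + dist[0]) / 10)  # 0.7  P(heat flows)
print(dist[1] / 10)              # 0.4  P(uniform energy dispersal)
```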

### Example 1: Determination of Δ*S*

Consider the system shown here. What is the change in entropy for a process that converts the system from distribution (a) to (c)?

**Check Your Learning**

Consider the system shown in Figure 4. What is the change in entropy for the process where *all* the energy is transferred from the hot object (**AB**) to the cold object (**CD**)?

## Predicting the Sign of Δ*S*

The relationships between entropy, microstates, and matter/energy dispersal described previously allow us to make generalizations regarding the relative entropies of substances and to predict the sign of entropy changes for chemical and physical processes. Consider the phase changes illustrated in Figure 5. In the solid phase, the atoms or molecules are restricted to nearly fixed positions with respect to each other and are capable of only modest oscillations about these positions. With essentially fixed locations for the system’s component particles, the number of microstates is relatively small. In the liquid phase, the atoms or molecules are free to move over and around each other, though they remain in relatively close proximity to one another. This increased freedom of motion results in a greater variation in possible particle locations, so the number of microstates is correspondingly greater than for the solid. As a result, *S*_{liquid} > *S*_{solid} and the process of converting a substance from solid to liquid (melting) is characterized by an increase in entropy, Δ*S* > 0. By the same logic, the reciprocal process (freezing) exhibits a decrease in entropy, Δ*S* < 0.

Now consider the gaseous phase, in which a given number of atoms or molecules occupy a *much* greater volume than in the liquid phase. Each atom or molecule can be found in many more locations than in the liquid (or solid) phase, corresponding to a much greater number of microstates. Consequently, for any substance, *S*_{gas} > *S*_{liquid} > *S*_{solid}, and the processes of vaporization and sublimation likewise involve increases in entropy, Δ*S* > 0. Likewise, the reciprocal phase transitions, condensation and deposition, involve decreases in entropy, Δ*S* < 0.

According to kinetic-molecular theory, the temperature of a substance is proportional to the average kinetic energy of its particles. Raising the temperature of a substance will result in more extensive vibrations of the particles in solids and more rapid translations of the particles in liquids and gases. At higher temperatures, the distribution of kinetic energies among the atoms or molecules of the substance is also broader (more dispersed) than at lower temperatures. Thus, the entropy for any substance increases with temperature (Figure 6).

The entropy of a substance is influenced by the structure of the particles (atoms or molecules) that comprise the substance. With regard to atomic substances, heavier atoms possess greater entropy at a given temperature than lighter atoms, which is a consequence of the relation between a particle’s mass and the spacing of quantized translational energy levels (a topic beyond the scope of our treatment). For molecules, greater numbers of atoms (regardless of their masses) increase the ways in which the molecules can vibrate and thus the number of possible microstates and the system entropy.

Finally, variations in the types of particles affect the entropy of a system. Compared to a pure substance, in which all particles are identical, the entropy of a mixture of two or more different particle types is greater. This is because of the additional orientations and interactions that are possible in a system comprised of nonidentical components. For example, when a solid dissolves in a liquid, the particles of the solid experience both a greater freedom of motion and additional interactions with the solvent particles. This corresponds to a more uniform dispersal of matter and energy and a greater number of microstates. The process of dissolution therefore involves an increase in entropy, Δ*S* > 0.

Considering the various factors that affect entropy allows us to make informed predictions of the sign of Δ*S* for various chemical and physical processes as illustrated in Example 2.

### Example 2: Predicting the Sign of ∆*S*

Predict the sign of the entropy change for the following processes. Indicate the reason for each of your predictions.

- One mole liquid water at room temperature [latex]\longrightarrow [/latex] one mole liquid water at 50 °C
- [latex]{\text{Ag}}^{\text{+}}\left(aq\right)+{\text{Cl}}^{-}\left(aq\right)\longrightarrow \text{AgCl}\left(s\right)[/latex]
- [latex]{\text{C}}_{6}{\text{H}}_{6}\left(l\right)+\frac{15}{2}{\text{O}}_{2}\left(g\right)\longrightarrow 6{\text{CO}}_{2}\left(g\right)+3{\text{H}}_{2}\text{O}\left(l\right)[/latex]
- [latex]{\text{NH}}_{3}\left(s\right)\longrightarrow {\text{NH}}_{\text{3}}\left(l\right)[/latex]

**Check Your Learning**

Predict the sign of the entropy change for the following processes. Give a reason for your prediction.

- [latex]{\text{NaNO}}_{3}\left(s\right)\longrightarrow {\text{Na}}^{\text{+}}\left(aq\right)+{\text{NO}}_{3}{}^{-}\left(aq\right)[/latex]
- the freezing of liquid water
- [latex]{\text{CO}}_{2}\left(s\right)\longrightarrow {\text{CO}}_{2}\left(g\right)[/latex]
- [latex]{\text{CaCO}}_{3}\left(s\right)\longrightarrow \text{CaO}\left(s\right)+{\text{CO}}_{2}\left(g\right)[/latex]

### Key Concepts and Summary

Entropy (*S*) is a state function that can be related to the number of microstates for a system (the number of ways the system can be arranged) and to the ratio of reversible heat to kelvin temperature. It may be interpreted as a measure of the dispersal or distribution of matter and/or energy in a system, and it is often described as representing the “disorder” of the system.

For a given substance, *S*_{solid} < *S*_{liquid} < *S*_{gas}. For a given physical state at a given temperature, entropy is typically greater for heavier atoms or more complex molecules. Entropy increases when a system is heated and when solutions form. Using these guidelines, the sign of entropy changes for some chemical reactions may be reliably predicted.

#### Key Equations

- [latex]\Delta S=\dfrac{{q}_{\text{rev}}}{T}[/latex]
- [latex]S=k\text{ln}W[/latex]
- [latex]\Delta S=k\text{ln}\dfrac{{W}_{\text{f}}}{{W}_{\text{i}}}[/latex]

### Try It

- In Figure 2 all possible distributions and microstates are shown for four different particles shared between two boxes. Determine the entropy change, Δ*S*, if the particles are initially evenly distributed between the two boxes, but upon redistribution all end up in Box (b).
- In Figure 2 all of the possible distributions and microstates are shown for four different particles shared between two boxes. Determine the entropy change, Δ*S*, for the system when it is converted from distribution (b) to distribution (d).
- How does the process described in the previous item relate to the system shown in Figure 3?
- Consider a system similar to the one in Figure 2, except that it contains six particles instead of four. What is the probability of having all the particles in only one of the two boxes in this case? Compare this with the similar probability for the system of four particles that we have derived to be equal to [latex]\frac{1}{8}[/latex]. What does this comparison tell us about even larger systems?
- Consider the system shown in Figure 4. What is the change in entropy for the process where the energy is initially associated only with particle A, but in the final state the energy is distributed between two different particles?
- Consider the system shown in Figure 4. What is the change in entropy for the process where the energy is initially associated with particles A and B, and the energy is distributed between two particles in different boxes (one in A-B, the other in C-D)?
- Arrange the following sets of systems in order of increasing entropy. Assume one mole of each substance and the same temperature for each member of a set.
  - H_{2}(*g*), HBrO_{4}(*g*), HBr(*g*)
  - H_{2}O(*l*), H_{2}O(*g*), H_{2}O(*s*)
  - He(*g*), Cl_{2}(*g*), P_{4}(*g*)
- At room temperature, the entropy of the halogens increases from I_{2} to Br_{2} to Cl_{2}. Explain.
- Consider two processes: sublimation of I_{2}(*s*) and melting of I_{2}(*s*) (Note: the latter process can occur at the same temperature but somewhat higher pressure).

  [latex]{\text{I}}_{2}\left(s\right)\longrightarrow {\text{I}}_{2}\left(g\right)[/latex]

  [latex]{\text{I}}_{2}\left(s\right)\longrightarrow {\text{I}}_{2}\left(l\right)[/latex]

  Is Δ*S* positive or negative in these processes? In which of the processes will the magnitude of the entropy change be greater?
- Indicate which substance in the given pairs has the higher entropy value. Explain your choices.
  - C_{2}H_{5}OH(*l*) or C_{3}H_{7}OH(*l*)
  - C_{2}H_{5}OH(*l*) or C_{2}H_{5}OH(*g*)
  - 2H(*g*) or H(*g*)
- Predict the sign of the entropy change for the following processes.
- An ice cube is warmed to near its melting point.
- Exhaled breath forms fog on a cold morning.
- Snow melts.

- Predict the sign of the entropy change for the following processes. Give a reason for your prediction.
- [latex]{\text{Pb}}^{2+}\left(aq\right)+{\text{S}}^{2-}\left(aq\right)\longrightarrow \text{PbS}\left(s\right)[/latex]
- [latex]2\text{Fe}\left(s\right)+3{\text{O}}_{2}\left(g\right)\longrightarrow {\text{Fe}}_{2}{\text{O}}_{3}\left(s\right)[/latex]
- [latex]2{\text{C}}_{6}{\text{H}}_{14}\left(l\right)+19{\text{O}}_{2}\left(g\right)\longrightarrow 14{\text{H}}_{2}\text{O}\left(g\right)+12{\text{CO}}_{2}\left(g\right)[/latex]

- Write the balanced chemical equation for the combustion of methane, CH_{4}(*g*), to give carbon dioxide and water vapor. Explain why it is difficult to predict whether Δ*S* is positive or negative for this chemical reaction.
- Write the balanced chemical equation for the combustion of benzene, C_{6}H_{6}(*l*), to give carbon dioxide and water vapor. Would you expect Δ*S* to be positive or negative in this process?

## Glossary

**entropy (*S*):** state function that is a measure of the matter and/or energy dispersal within a system, determined by the number of system microstates; often described as a measure of the disorder of the system

**microstate (*W*):** possible configuration or arrangement of matter and energy within a system

**reversible process:** process that takes place so slowly as to be capable of reversing direction in response to an infinitesimally small change in conditions; hypothetical construct that can only be approximated by real processes