#### Learning Objective

- Describe the relationship between entropy and microstates.

#### Key Points

- The entropy of an isolated system always increases or remains constant.
- The more such states available to the system with appreciable probability, the greater the entropy.
- Fundamentally, the number of microstates is a measure of the potential disorder of the system.

#### Terms

- **microstate**: The specific detailed microscopic configuration of a system.
- **entropy**: A thermodynamic property that is the measure of a system’s thermal energy per unit temperature that is unavailable for doing useful work.

## Second Law of Thermodynamics

In classical thermodynamics, the second law of thermodynamics states that the entropy of an isolated system always increases or remains constant. Entropy is therefore also a measure of the tendency of a process, such as a chemical reaction, to proceed spontaneously in a particular direction. One consequence of the second law is that thermal energy always flows spontaneously, in the form of heat, from regions of higher temperature to regions of lower temperature.

These processes reduce the state of order of the initial systems. As a result, entropy (denoted by *S*) is an expression of disorder or randomness. Thermodynamic entropy has the dimension of energy divided by temperature, which has a unit of joules per kelvin (J/K) in the International System of Units.

Entropy can be interpreted as a measure of the uncertainty that remains about a system after its observable macroscopic properties, such as temperature, pressure, and volume, have been taken into account. For a given set of macroscopic variables, the entropy measures the degree to which the probability of the system is spread out over different possible microstates. In contrast to a macrostate, which characterizes plainly observable average quantities (temperature, for example), a microstate specifies all molecular details about the system, including the position and velocity of every molecule. The more microstates available to the system, the greater its entropy. This is the basis of an alternative (and more fundamental) definition of entropy:
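The macrostate/microstate distinction can be made concrete with a toy model (not from the text above): a set of coins, where a macrostate is the total number of heads and a microstate is the exact heads/tails pattern of every coin.

```python
from math import comb

# Toy model (illustrative assumption, not a physical system): N coins,
# each either heads or tails. A macrostate is the total number of heads;
# a microstate is the exact pattern of every individual coin.
N = 10

for heads in range(N + 1):
    omega = comb(N, heads)  # number of microstates realizing this macrostate
    print(f"{heads:2d} heads: {omega:4d} microstates")
```

The "even split" macrostate (5 heads out of 10) is realized by the most microstates, 252 of the 1024 total, so it is the most probable and the highest-entropy macrostate, mirroring the claim that more available microstates means greater entropy.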

[latex]S = k \ln \Omega[/latex]

in which *k* is the Boltzmann constant (the gas constant per molecule, [latex]1.38 \times 10^{-23}\ \frac{J}{K}[/latex]) and Ω (omega) is the number of microstates that correspond to a given macrostate of the system. The more such microstates there are, the greater the probability of the system being in the corresponding macrostate. For any physically realizable macrostate, Ω is an unimaginably large number.
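The Boltzmann relation above is easy to evaluate directly. A minimal sketch (the specific Ω values are arbitrary placeholders chosen for illustration):

```python
from math import log

k_B = 1.38e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega: float) -> float:
    """S = k * ln(Omega), in J/K, for a macrostate with Omega microstates."""
    return k_B * log(omega)

# Because S depends on the *logarithm* of Omega, doubling the number of
# microstates adds the same amount, k * ln(2), no matter how large
# Omega already is.
s1 = boltzmann_entropy(1e6)
s2 = boltzmann_entropy(2e6)
print(s2 - s1)  # equals k_B * ln(2)
```

The logarithm is also what keeps the formula usable in practice: even though Ω for a real macrostate is unimaginably large, ln Ω is a manageable number.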

Even though such numbers verge on the incomprehensible, the thermal energy contained in actual physical systems explores these possibilities with no difficulty at all, quickly settling into the most probable macrostate for a given set of conditions.

## Phase changes and entropy

In terms of energy, when a solid becomes a liquid or a liquid becomes a vapor, kinetic energy from the surroundings is converted to ‘potential energy’ in the substance (phase-change energy). This energy is released back to the surroundings when the surroundings become cooler than the substance’s melting or boiling temperature, respectively. Phase-change energy increases the entropy of a substance or system because it is energy that must be spread out into the substance from the surroundings so that the substance can exist as a liquid or vapor at its melting or boiling point. Therefore, the entropy of a solid is less than the entropy of a liquid, which is much less than the entropy of a gas:

[latex]S_{solid} < S_{liquid} \ll S_{gas}[/latex]
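This ordering can be checked numerically. The sketch below uses the standard relation ΔS = ΔH/T for a reversible phase change at constant temperature (this formula is not derived in the passage; the enthalpy values are standard textbook figures for water):

```python
# Entropy change of a phase transition: dS = dH / T at the transition
# temperature. Values below are standard figures for water, used here
# only to illustrate the ordering S_solid < S_liquid << S_gas.
dH_fus = 6010.0    # J/mol, enthalpy of fusion (solid -> liquid)
T_fus = 273.15     # K, melting point
dH_vap = 40700.0   # J/mol, enthalpy of vaporization (liquid -> gas)
T_vap = 373.15     # K, boiling point

dS_fus = dH_fus / T_fus  # entropy gained on melting, J/(mol*K)
dS_vap = dH_vap / T_vap  # entropy gained on boiling, J/(mol*K)

print(f"dS_fus = {dS_fus:.1f} J/(mol*K)")
print(f"dS_vap = {dS_vap:.1f} J/(mol*K)")
```

The entropy jump on boiling (about 109 J/(mol·K) for water) is roughly five times the jump on melting (about 22 J/(mol·K)), consistent with the gas phase having far more entropy than the liquid, which in turn has more than the solid.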