Physics 222 -- Notes on Chapter 23

Entropy is a measure of the number of quantum mechanical states available to a body. More specifically, it is Boltzmann's constant times the logarithm of the number of states within a certain small energy range.
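
In symbols: writing Ω(E) for the number of states with energy between E and E + δE, the entropy is

    S = k ln Ω(E),

where k is Boltzmann's constant.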

The entropy of two bodies taken together is the sum of their individual entropies.
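
This follows directly from the counting definition: every state of the first body can be paired with every state of the second, so the number of joint states is the product Ω = Ω1 Ω2, and

    S = k ln(Ω1 Ω2) = k ln Ω1 + k ln Ω2 = S1 + S2.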

The importance of entropy is that it is extremely unlikely to decrease in an isolated system, as long as the system has a large number of degrees of freedom. This statement is the second law of thermodynamics, and it is what makes most processes in the everyday world irreversible. For instance, entropy increases when an ice cube melts in a glass of water, when cream is mixed into your coffee, when you apply the brakes in your car, etc.

The main technical difficulties in computing entropy are, first, finding a system's possible quantum mechanical states, and second, counting those states with energy less than a specified value E. Once this is done, taking the logarithm of the derivative of the number of states with respect to the energy and multiplying by Boltzmann's constant gives us the entropy.
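
In symbols: if Φ(E) is the number of states with energy less than E, then the number of states within a small range δE around E is Ω(E) = (dΦ/dE) δE, so

    S(E) = k ln[(dΦ/dE) δE].

For a large system the logarithm is so dominated by dΦ/dE that the precise choice of δE hardly matters.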

The temperature is one over the derivative of entropy with respect to energy, holding other variables constant. This thermodynamic definition of temperature is in agreement with our more empirical measures.
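
In symbols:

    1/T = dS/dE   (volume and any other external parameters held fixed).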

From the above definition of temperature, we infer that if heat dQ is added to a system at temperature T, the amount of entropy added is dS = dQ/T: the heat raises the system's energy by dE = dQ, and by the definition above dS = dE/T.
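
As a rough numerical illustration (the numbers here are standard values, not from the chapter): an ice cube melts at a fixed T = 273 K, so the total entropy added is simply ΔS = Q/T. Taking the latent heat of fusion of ice as roughly 334 J/g, melting a 10 g cube requires Q ≈ 3340 J, and the ice gains

    ΔS = Q/T ≈ 3340 J / 273 K ≈ 12 J/K.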

The specific example used in this chapter is a brick, which is idealized as a bunch of harmonic oscillators. However, you should be familiar enough with the process of computing entropy to apply the same reasoning to other systems, i.e., to find the number of states as a function of the energy, and from it the entropy, the temperature, etc. A sketch of this procedure for the oscillator model is given below.
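
As a concrete illustration (a minimal sketch, not from the chapter itself), the following Python program carries out all three steps for the idealized brick, assuming the standard Einstein-solid counting: q quanta of energy distributed among N oscillators can be arranged in C(q + N - 1, q) ways. The function names and the parameters (300 oscillators, 0.01 eV quanta) are illustrative choices.

    from math import comb, log

    K_B = 1.380649e-23  # Boltzmann's constant, J/K

    def num_states(q, n_osc):
        # Number of ways to distribute q energy quanta among n_osc
        # oscillators: the binomial coefficient C(q + n_osc - 1, q).
        return comb(q + n_osc - 1, q)

    def entropy(q, n_osc):
        # S = k ln(Omega); Python's big integers make the exact count
        # feasible even when Omega is astronomically large.
        return K_B * log(num_states(q, n_osc))

    def temperature(q, n_osc, quantum):
        # 1/T = dS/dE, estimated by a finite difference: adding one
        # quantum raises the energy by 'quantum' joules.
        dS = entropy(q + 1, n_osc) - entropy(q, n_osc)
        return quantum / dS

    quantum = 0.01 * 1.602e-19  # a 0.01 eV oscillator quantum, in joules
    for q in (100, 300, 1000):
        print(q, entropy(q, 300), temperature(q, 300, quantum))

The same three steps (count the states, take k ln of the count, differentiate with respect to the energy) carry over to any other system; only num_states changes.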