Entropy
Entropy (symbol S) is a fundamental thermodynamic property that measures the degree of disorder or randomness in a system. It quantifies the number of microscopic configurations that correspond to a thermodynamic system's macroscopic state.
Definition
Entropy is related to the number of possible microstates (Ω) by the Boltzmann equation:

S = k_B ln Ω
where:
- S = entropy
- k_B = Boltzmann constant (1.380649 × 10⁻²³ J/K)
- Ω = number of accessible microstates of the system
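The Boltzmann equation translates directly into a one-line function. A minimal sketch (the function name and example microstate count are illustrative, not from the source):

```python
import math

# Boltzmann constant (exact value under the 2019 SI redefinition), in J/K
K_B = 1.380649e-23

def boltzmann_entropy(omega: float) -> float:
    """Entropy S = k_B * ln(Omega) for a system with Omega accessible microstates."""
    return K_B * math.log(omega)

# Illustrative: a system with 10^23 accessible microstates
s = boltzmann_entropy(1e23)  # in J/K
```

Note that a system with exactly one microstate (Ω = 1) has zero entropy, consistent with the Third Law.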
Thermodynamic Definition
In classical thermodynamics, the change in entropy for a reversible process is defined as:

dS = δQ_rev / T
where:
- dS = infinitesimal change in entropy
- δQ_rev = infinitesimal heat absorbed reversibly
- T = absolute temperature
For a finite reversible process between states 1 and 2:

ΔS = S₂ − S₁ = ∫₁² δQ_rev / T
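For heating a substance with constant heat capacity C, the integral has the closed form ΔS = C ln(T₂/T₁), which makes a convenient check on a numerical evaluation. A sketch, assuming 1 mol of liquid water with C_p ≈ 75.3 J/(mol·K) (a standard textbook value):

```python
import math

def entropy_change_heating(c: float, t1: float, t2: float, steps: int = 10000) -> float:
    """Numerically integrate dS = delta_Q_rev / T = C dT / T
    for a constant heat capacity C, using the midpoint rule."""
    ds = 0.0
    dt = (t2 - t1) / steps
    for i in range(steps):
        t = t1 + (i + 0.5) * dt
        ds += c * dt / t
    return ds

# Heating 1 mol of water from 298 K to 373 K
delta_s = entropy_change_heating(75.3, 298.0, 373.0)
# Closed-form result for comparison: C * ln(T2/T1)
analytic = 75.3 * math.log(373.0 / 298.0)
```

The two values agree to within the discretization error of the midpoint rule.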
Second Law of Thermodynamics
The Second Law states that the total entropy of an isolated system never decreases:

ΔS_total ≥ 0
where equality holds for reversible processes and inequality for irreversible processes.
This law implies that natural processes tend to move towards increased entropy or disorder.
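A classic illustration is irreversible heat exchange: when a hot and a cold body equilibrate, the hot body loses entropy but the cold body gains more, so the total change is positive. A sketch for two identical bodies of heat capacity C (the specific numbers are illustrative):

```python
import math

def total_entropy_change(c: float, t_hot: float, t_cold: float) -> float:
    """Total entropy change when two identical bodies of heat capacity c
    exchange heat irreversibly until equilibrium at T_f = (T_hot + T_cold) / 2.
    Each body's entropy change is c * ln(T_f / T_initial)."""
    t_f = (t_hot + t_cold) / 2
    ds_hot = c * math.log(t_f / t_hot)    # negative: the hot body cools
    ds_cold = c * math.log(t_f / t_cold)  # positive: the cold body warms
    return ds_hot + ds_cold

# Two 100 J/K bodies at 400 K and 300 K
ds_total = total_entropy_change(100.0, 400.0, 300.0)  # positive, as the Second Law requires
```

For equal starting temperatures no heat flows and the total entropy change is zero, the reversible limit.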
Entropy Change in Chemical Reactions
The entropy change of a system during a chemical reaction is:

ΔS°_rxn = Σ S°(products) − Σ S°(reactants)
This change helps determine the spontaneity of reactions when combined with enthalpy changes in the Gibbs free energy:

ΔG = ΔH − TΔS

A process at constant temperature and pressure is spontaneous when ΔG < 0.
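The spontaneity criterion ΔG = ΔH − TΔS < 0 is a one-line calculation. A sketch using textbook values for the melting of ice (ΔH ≈ +6010 J/mol, ΔS ≈ +22.0 J/(mol·K)); the function name is illustrative:

```python
def gibbs_free_energy(delta_h: float, t: float, delta_s: float) -> float:
    """Gibbs free energy change: dG = dH - T * dS.
    The process is spontaneous at constant T and P when the result is negative."""
    return delta_h - t * delta_s

# Melting of ice at 298 K: endothermic (dH > 0) but entropy-driven (dS > 0)
dg_warm = gibbs_free_energy(6010.0, 298.0, 22.0)  # negative: melting is spontaneous
dg_cold = gibbs_free_energy(6010.0, 250.0, 22.0)  # positive: ice is stable
```

Because both ΔH and ΔS are positive here, the sign of ΔG flips with temperature: the TΔS term dominates above the melting point.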
Statistical Interpretation
Entropy can also be viewed as a measure of uncertainty or information content in the system's microscopic state, connecting thermodynamics with information theory.
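This connection is made precise by the Gibbs entropy formula S = −k_B Σ pᵢ ln pᵢ over microstate probabilities, which matches the Shannon entropy of information theory up to the constant k_B. A sketch showing that a uniform distribution over Ω microstates recovers the Boltzmann form S = k_B ln Ω:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs) -> float:
    """Gibbs entropy S = -k_B * sum(p_i * ln(p_i)) over microstate
    probabilities; terms with p_i = 0 contribute nothing."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Uniform distribution over 1000 microstates reduces to k_B * ln(1000)
omega = 1000
s_uniform = gibbs_entropy([1 / omega] * omega)
```

Any non-uniform distribution over the same microstates yields a lower entropy, reflecting reduced uncertainty about the system's microscopic state.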
Applications
- Predicting spontaneity and direction of chemical reactions
- Explaining phase transitions and mixing phenomena
- Understanding biological processes and energy transfer
- Engineering systems such as engines and refrigerators