Entropy was introduced by Rudolf Clausius (1822–1888) in 1865 to formalize the second law of thermodynamics. It measures the degree of disorder in a system. For a reversible process at constant temperature, the entropy change is \(\Delta S = \frac{Q_{rev}}{T}\), where \(Q_{rev}\) is the reversible heat exchanged and \(T\) is the absolute temperature.
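As a concrete illustration (a minimal sketch using standard approximate values for water, not figures taken from this text), the formula can be applied to melting 1 kg of ice at 0 °C, a reversible process at constant temperature:

```python
# Entropy change for melting 1 kg of ice at 0 °C (approximate textbook values).
# Delta_S = Q_rev / T applies directly because the temperature stays constant.

m = 1.0            # mass of ice, kg
L_fusion = 334e3   # latent heat of fusion of water, J/kg (approximate)
T = 273.15         # melting point, K

Q_rev = m * L_fusion   # reversible heat absorbed, J
delta_S = Q_rev / T    # entropy change, J/K

print(f"Q_rev = {Q_rev:.0f} J, delta_S = {delta_S:.1f} J/K")  # ~1222.8 J/K
```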
Ludwig Boltzmann (1844–1906) provided a probabilistic interpretation: \( S = k_B \ln \Omega \), where \(k_B\) is Boltzmann’s constant and \(\Omega\) is the number of accessible microstates. The more microstates a system has, the higher its entropy. This approach directly links entropy to statistical information.
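A minimal numerical sketch, assuming a toy system of \(N\) independent two-state particles so that \(\Omega = 2^N\), shows how \(S = k_B \ln \Omega\) grows with the number of microstates:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

# Toy system: N independent two-state particles, so Omega = 2**N microstates.
for N in (10, 100, 1000):
    ln_Omega = N * math.log(2)   # ln(2**N), computed without building a huge number
    S = k_B * ln_Omega           # Boltzmann entropy, J/K
    print(f"N = {N:4d}   Omega = 2^{N}   S = {S:.3e} J/K")
```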
In 1948, Claude Shannon (1916–2001) adapted the concept of entropy to information theory. Shannon entropy \( H = -\sum p_i \log_2(p_i) \) measures the uncertainty associated with a message source. The more uniform the symbol distribution, the greater the uncertainty.
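The formula is straightforward to evaluate. A short Python sketch with illustrative distributions (the probabilities are assumptions chosen for the example, not from a specific source):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2 p) in bits; zero-probability terms are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # four equally likely symbols
biased  = [0.90, 0.05, 0.03, 0.02]   # one symbol dominates

print(shannon_entropy(uniform))  # 2.0 bits: the maximum possible for 4 symbols
print(shannon_entropy(biased))   # ~0.62 bits: the source is much easier to predict
```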
"A deck where all cards are in order is highly predictable: you know exactly which card will come next. Conversely, a well-shuffled deck makes each draw unpredictable, as all cards have equal probability."
In summary: order ↔ more predictable (macro), disorder ↔ more unpredictable (macro), and entropy ↔ measure of statistical unpredictability of microstates.
System | Description | Predictability | Entropy | Comment |
---|---|---|---|---|
Random draw of symbols A, B, C, D | Each symbol has exactly the same probability of appearing in each draw | Impossible to predict | High | Abstract model illustrating Shannon’s maximum entropy |
Biased draw of symbols A, B, C, D | Symbol A appears 90% of the time; B, C, D rarely appear | Easy to predict | Low | Abstract model of low entropy |
Fair die roll | Each face (1–6) has equal probability in each roll | Impossible to predict | High | Simple example of maximum randomness |
Loaded die roll | The die lands on 6 80% of the time; other faces rarely appear | Easy to predict | Low | Classic example of low uncertainty |
Well-shuffled deck of cards | Each card has an equal chance of being drawn randomly | Impossible to predict | High | Shows that initial order is lost after shuffling |
Partially sorted cards | Most drawn cards are red (75%) | Relatively easy to predict | Low | Pedagogical example of reduced entropy |
Random bits | Each bit (0 or 1) has exactly the same probability in a randomly generated sequence | Impossible to predict | High | Numerical example of maximum uncertainty |
Biased bits | Bits 0 appear 90% of the time; bits 1 appear 10% | Easy to predict | Low | Numerical example of low entropy |
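The High/Low labels in the table can be made quantitative with Shannon's formula. A minimal sketch over the table's distributions (where a row only says a symbol appears "rarely", the exact split of the remaining probability is assumed):

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

examples = {
    "random symbols A-D":     [0.25] * 4,
    "biased symbols (A 90%)": [0.90, 0.04, 0.03, 0.03],  # split of the remaining 10% is assumed
    "fair die":               [1 / 6] * 6,
    "loaded die (6 at 80%)":  [0.80] + [0.04] * 5,       # split of the remaining 20% is assumed
    "random bits":            [0.5, 0.5],
    "biased bits (90/10)":    [0.9, 0.1],
}

for name, p in examples.items():
    print(f"{name:24s} H = {H(p):.3f} bits")
```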
The second law of thermodynamics states that in an isolated system, entropy tends to increase over time: \(\Delta S = S_{\text{final}} - S_{\text{initial}} \ge 0\).
In an isolated system, entropy never decreases; this one-way tendency reflects the fundamental irreversibility of natural phenomena and explains why they evolve toward more disordered states.
These disordered states are statistically much more accessible because there is an enormous number of microstates (possible configurations of particle positions and velocities) corresponding to the same macroscopic state. This multitude of configurations makes these states much more probable, resulting in higher entropy.
Thus, increasing entropy reflects the spontaneous transition of systems from ordered to disordered configurations, where energy and matter can be arranged in many more ways.
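A toy combinatorial model (an illustrative assumption, not taken from the text) makes this counting concrete: \(N\) gas molecules, each of which may sit in the left or right half of a box. The number of microstates compatible with the macrostate "k molecules on the left" is the binomial coefficient \(\binom{N}{k}\), and near-even splits dwarf the fully ordered configuration:

```python
from math import comb

N = 100  # toy model: 100 molecules, each in the left or right half of a box

for k in (0, 25, 50):          # k = number of molecules in the left half
    omega = comb(N, k)         # microstates compatible with this macrostate
    print(f"{k:3d} left / {N - k:3d} right : {omega:.3e} microstates")

# The fully "ordered" macrostate (0 left / 100 right) has a single microstate,
# while the even 50/50 split has about 1e29, so the dispersed macrostate is
# overwhelmingly the most probable one.
```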
Entropy measures the uncertainty about which microscopic configuration a system is in, explaining why some natural processes never occur spontaneously in reverse, such as heat spreading from hot to cold and never the other way around.
Heat always flows from a hot body to a cold body due to the temperature gradient. A hot body has molecules with higher average kinetic energy than those in a cold body. When the bodies are in contact, molecular collisions cause a net transfer of energy from the hot side to the cold side, gradually reducing the temperature difference (gradient).
This process is not absolute at the level of individual collisions; some collisions may transfer energy in the opposite direction. But on a macroscopic scale, the net flow follows the temperature gradient, which is the most probable direction for the system’s evolution.
Once the energy is shared, there are many more ways to distribute it across the total system (hot + cold) than when it is all concentrated in the hot body. In terms of entropy, this energy transfer increases the number of accessible microstates for the total system. Thus, the thermal gradient acts as a natural driver of entropy increase.
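A small combinatorial sketch makes this concrete. Assuming, as a toy model, that the two bodies are Einstein solids with \(N\) oscillators each and \(q\) indistinguishable energy quanta in total, the number of ways to place \(q\) quanta among \(N\) oscillators is \(\binom{q+N-1}{q}\); sharing the energy opens up vastly more microstates than keeping it all in the hot block:

```python
from math import comb

def multiplicity(N, q):
    """Ways to place q indistinguishable energy quanta among N oscillators."""
    return comb(q + N - 1, q)

N = 50        # oscillators per block (toy model)
q_total = 50  # total number of energy quanta

# All energy concentrated in the hot block A:
concentrated = multiplicity(N, q_total) * multiplicity(N, 0)

# Energy shared evenly between blocks A and B:
shared = multiplicity(N, q_total // 2) * multiplicity(N, q_total // 2)

print(f"all quanta in A : {concentrated:.3e} microstates")
print(f"energy shared   : {shared:.3e} microstates")  # roughly ten orders of magnitude more
```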
System | Entropy Evolution | Comment |
---|---|---|
Ideal gas | Molecules confined to a small space → Molecules dispersed throughout available space | When molecules can occupy more possible positions, uncertainty about their arrangement increases, raising entropy |
Deck of cards | Perfectly sorted by color and value → Randomly shuffled cards | The initial order is practically impossible to restore after shuffling, illustrating increased uncertainty and entropy |
Symbol distribution | Some symbols dominate (e.g., A appears 20% of the time) → Each symbol has equal probability (e.g., ≈3.7% each for a 27-symbol alphabet) | When symbols are more evenly distributed, predicting the next symbol becomes difficult, and entropy increases |
Bits in a computer sequence | Mostly 0 bits (75%) → 0 and 1 bits equiprobable (50%) | As bits become more balanced, sequence uncertainty increases, leading to higher entropy |
Sounds in a simple melody | A dominant note repeated → Notes chosen randomly with equal probabilities | Note variety increases uncertainty and illustrates rising entropy |
The Universe | Very homogeneous and dense state (Big Bang) → Increasingly dispersed and structured universe with stars, galaxies, black holes | Expansion and structure formation increase uncertainty about particle positions and energy, reflecting growing cosmic entropy |
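The ideal-gas row above can be quantified with the standard result for an isothermal expansion of an ideal gas, \(\Delta S = nR \ln(V_2/V_1)\). A minimal sketch with illustrative values (one mole of gas whose volume doubles):

```python
import math

R = 8.314          # ideal gas constant, J/(mol*K)
n = 1.0            # amount of gas, mol (illustrative)
V1, V2 = 1.0, 2.0  # initial and final volumes; only the ratio matters

delta_S = n * R * math.log(V2 / V1)    # entropy change, J/K
print(f"delta_S = {delta_S:.2f} J/K")  # ~5.76 J/K: more positions available, higher entropy
```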
At first glance, living organisms seem to create order: organized cells, structured DNA, complex tissues. This might appear to contradict the second law of thermodynamics, which states that entropy must increase.
However, Earth is not an isolated system: it receives energy from the Sun and radiates heat back into space. Organisms use this flow of energy to build ordered structures but, in return, release heat and waste that increase disorder in their surroundings.
Thus, even if local entropy (within the organism) decreases, the total entropy of the global system (organism + environment) increases. Life redistributes energy and matter, increasing the number of accessible microstates in the environment.
In summary: Life creates local order but generates greater disorder around it, complying with the second law of thermodynamics.