Updated 02 August 2023

Boltzmann's equation (1877)

Image: The tomb of Ludwig Eduard Boltzmann (1844-1906) at the Zentralfriedhof, Vienna, with his bust and entropy formula.

S = k ln(W)

The entropy equation was developed by Austrian physicist Ludwig Boltzmann. Entropy (S) is a powerful concept that measures the disorder or uncertainty of a physical system.
Boltzmann's entropy equation is essential for understanding the relationship between entropy and the thermodynamic properties of a system, such as temperature, pressure, and energy. It plays a crucial role in the study of equilibrium processes, in the evolution of physical systems, and in the statistical interpretation of the laws of thermodynamics.

Boltzmann's entropy equation is generally expressed as follows:
S = k ln(W)
- S is the entropy of the system.
- k is Boltzmann's constant, a fundamental constant in physics that relates thermal energy to temperature (k ≈ 1.38 x 10^-23 J/K).
- ln is the natural logarithm function.
- W is the number of accessible microstates of the system at a certain energy level.
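
As a minimal illustration, the formula can be evaluated directly; the following Python sketch and the values of W in it are assumptions chosen only to show how slowly S grows with the number of microstates, not data from any real system.

import math

k = 1.380649e-23  # Boltzmann's constant in J/K (exact value in the 2019 SI)

def boltzmann_entropy(W):
    # Entropy S = k ln(W) for a system with W accessible microstates
    return k * math.log(W)

# Arbitrary, illustrative values of W: entropy grows only logarithmically with W
for W in (1, 10, 10**6, 10**23):
    print(f"W = {W:e}   S = {boltzmann_entropy(W):.3e} J/K")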

This equation shows that the entropy is proportional to the logarithm of the number of possible microstates of the system. A system has high entropy if it can be found in a very large number of microstates (a disordered, weakly constrained state), while a system with few accessible microstates has lower entropy (more order).

Entropy is a fundamental concept used in various fields of science, including thermodynamics, statistical mechanics, information theory, computer science, complexity science, and other fields. The definition of entropy may vary slightly depending on the context, but it shares common ideas in these areas.

In thermodynamics, entropy is a measure of the disorder or degree of molecular agitation in a physical system. It is related to the distribution of energy in the system and its ability to do work. Entropy is an extensive quantity, which means that it depends on the amount of matter in the system. According to the second law of thermodynamics, the entropy of an isolated system cannot decrease over time in a natural process.
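
A short worked example makes this extensive character explicit. If two independent subsystems have W1 and W2 accessible microstates, every microstate of the first can be combined with every microstate of the second, so the combined system has W1 × W2 microstates and
S = k ln(W1 × W2) = k ln(W1) + k ln(W2) = S1 + S2.
For two identical, independent subsystems, doubling the amount of matter therefore doubles the entropy, which is exactly what the logarithm in Boltzmann's formula guarantees.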

In statistical mechanics, entropy is associated with the microscopic probability of the accessible states of a system. It is related to the distribution of particles, molecules or configurations in the phase space of the system. The Boltzmann entropy is defined as Boltzmann's constant multiplied by the natural logarithm of the number of accessible microstates of the system at a given energy.
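
To make the counting of microstates concrete, here is a small Python sketch for a hypothetical toy system of N two-state particles (spins that can be "up" or "down", introduced here purely for illustration): the number of microstates with exactly n spins up is the binomial coefficient C(N, n), and the corresponding Boltzmann entropy is k ln(W).

import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def microstates(N, n):
    # Number of ways to choose which n of the N spins point up
    return math.comb(N, n)

def boltzmann_entropy(W):
    # S = k ln(W)
    return k * math.log(W)

N = 100
for n in (0, 10, 50):
    W = microstates(N, n)
    print(f"n = {n:3d}   W = {W:.3e}   S = {boltzmann_entropy(W):.3e} J/K")

# The macrostate with n = N/2 has by far the most microstates,
# and therefore the highest entropy: it is the most "disordered" one.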

In information theory, entropy is used to quantify the uncertainty or unpredictability of a random variable or a source of information. The more unpredictable a source, the higher its entropy. Shannon's entropy is the most commonly used measure in information theory.
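
For comparison, here is a minimal Python sketch of Shannon's entropy, H = -Σ p log2(p), for a discrete source; the two example distributions are arbitrary and only illustrate that the most unpredictable source has the highest entropy.

import math

def shannon_entropy(probabilities):
    # Shannon entropy in bits: H = -sum(p * log2(p)), ignoring zero-probability outcomes
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit, maximal for two outcomes
print(shannon_entropy([0.9, 0.1]))   # biased coin: about 0.47 bit, more predictable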

In all these contexts, entropy is a measure of the amount of information, disorder, diversity, or uncertainty in a system.

Yet in the universe we see increasingly ordered structures forming from initially less organized processes. This seems to go against the intuitive idea that entropy, as a measure of disorder, must always increase, as the second law of thermodynamics suggests.
However, an increase of local order (such as the formation of galaxies and stars) does not imply a violation of the second law of thermodynamics. That law applies to the system as a whole: the total entropy of an isolated system, such as the Universe taken in its entirety, cannot decrease over time. When we observe the formation of galaxies and stars, we must therefore consider the whole system, including the energy exchanged and the large-scale processes involved: the local drop in entropy is more than compensated by the entropy carried away, for example by the radiation emitted during gravitational collapse.

