## Definition of entropy

## What is entropy?

Automatic translation. Updated June 01, 2013.

Entropy is related to the notions of microscopic order and disorder, and specifically to the transition from an ordered state to a more disordered one. A state is more disordered when it can be realized by many different microscopic states. Through these transformations, energy degrades; in other words, its entropy increases. A cup that breaks never reassembles itself, and a body that dies will never live again. The total entropy of an isolated system must always increase, its disorder must always grow: this is the second law of thermodynamics. Originally, entropy described the phenomenon of thermal exchange that equalizes temperatures, or the dissipation of energy into heat, but it is a more general law, and one that is not so easy to interpret. It is even difficult to grasp the concept of energy, that quantity which, for an isolated system, has the property of being conserved for all time.

Image: The files on a hard disk are also a demonstration of entropy; they never return to their initial order.

## Order or disorder

Statistical thermodynamics then gave a new orientation to this abstract physical quantity: it measures the degree of disorder of a system at the microscopic level. The higher the entropy of a system, the less its elements are ordered and bound to one another, the less capable they are of producing mechanical effects, and the larger the share of its energy that is unused, or used incoherently. In this example, the number of microscopic configurations (and thus the entropy) is indeed a measure of disorder. While the notion of disorder is often subjective, the number Ω of configurations is objective, because it is a number.

Image: The probability of finding all the molecules on the same side of the container, leaving half the volume empty, is tiny compared with the immensely greater number of configurations in which the molecules are distributed uniformly throughout the volume. Thermodynamic equilibrium is therefore reached when the system attains its maximum entropy. Entropy is defined by the Boltzmann formula, S = k log W.
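The container thought experiment above can be sketched numerically. This is a minimal illustration, not part of the original article: it assumes each of n molecules sits independently in the left or right half of the container, counts configurations, and applies Boltzmann's formula S = k ln Ω (natural logarithm).

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def microstates_total(n):
    """Each of n molecules can be in the left or right half: 2**n configurations."""
    return 2 ** n

def microstates_all_left(n):
    """Only one configuration puts every molecule in the left half."""
    return 1

def boltzmann_entropy(omega):
    """Boltzmann's formula S = k ln(omega) for omega microstates."""
    return k_B * math.log(omega)

n = 100  # a toy system; a real gas holds on the order of 10**23 molecules
p_all_left = microstates_all_left(n) / microstates_total(n)
# Even for only 100 molecules, the chance of spontaneously finding them
# all on one side is 2**-100, vanishingly small: disorder dominates.
```

A single microstate (Ω = 1) gives S = 0, the perfectly ordered case; spreading the molecules over both halves multiplies Ω and so raises S.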

## Some examples of entropy

The two expressions of entropy simply result from two different points of view, depending on whether we consider the thermodynamic system at the macroscopic or at the microscopic level. In a liquid, the mutual distances are smaller and the molecules are less free. In a solid, every molecule is elastically bound to its neighbors and vibrates around a fixed average position. The energy of each individual particle is unpredictable.

Image: Entropy, uncertainty, disorder and complexity thus appear as facets of the same concept. In one form or another, entropy is associated with the notion of probability.
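The link between entropy and probability mentioned above can be made concrete with the statistical (Gibbs) form of entropy, S = -k Σ p ln p. This sketch is an illustration added here, not from the original article; it shows that a spread-out (uncertain) probability distribution carries more entropy than a concentrated (certain) one.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def gibbs_entropy(probs):
    """Statistical entropy S = -k * sum(p * ln p), skipping zero-probability states."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

uniform = [0.25] * 4            # maximum uncertainty: four equally likely states
certain = [1.0, 0.0, 0.0, 0.0]  # no uncertainty: one state is guaranteed

# The uniform distribution has strictly greater entropy than the certain one.
assert gibbs_entropy(uniform) > gibbs_entropy(certain)
```

When one state has probability 1, the sum vanishes and S = 0; the uniform distribution maximizes S, in line with the idea that equilibrium corresponds to maximum entropy.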

1997 © Astronoo.com − Astronomy, Astrophysics, Evolution and Ecology.

"The data available on this site may be used provided that the source is duly acknowledged."