Definition of entropy
What is entropy?
Updated June 01, 2013
Entropy is related to the notions of microscopic order and disorder, and specifically to the transition from an ordered state to a more disordered one. A state is more disordered when it can be realized by many different microscopic states. The word entropy, from the Greek for "transformation", names a thermodynamic function. In thermodynamics, entropy is a state function (of pressure, temperature, volume, and quantity of matter) introduced in the middle of the 19th century by Clausius within the framework of the second law, following the work of Carnot. Clausius introduced this quantity to characterize mathematically the irreversibility of physical processes, such as the transformation of work into heat. He showed that the ratio Q/T (where Q is the quantity of heat exchanged by a system at temperature T) corresponds, in classical thermodynamics, to the variation of a state function which he called entropy S, whose unit is the joule per kelvin (J/K). One joule per kelvin corresponds to the entropy gained by a system that receives one joule of heat at a temperature of one kelvin. Thrown into a turbine, the water of a dam transforms its gravitational energy into electrical energy, which can later produce motion in an electric motor or heat in a radiator.
Throughout these transformations, the energy degrades; in other words, its entropy increases. A cup that breaks never reassembles itself; a body that dies will not live again. The total entropy of an isolated system must always increase, its disorder must always grow: this is the second law of thermodynamics. Originally, entropy described the thermal exchange that equalizes temperatures, or the degradation of energy into heat, but it is a more general law and not so easy to interpret. It is difficult enough to grasp even the concept of energy, this quantity which, for an isolated system, has the property of being conserved for all time.
Energy (mechanical, thermal, chemical, electromagnetic) is conserved; it can be neither created nor destroyed, only transformed. As Lavoisier put it, "nothing is lost, nothing is created, everything is transformed." The concept of entropy is just as surprising. For the same isolated system, the entropy does not remain constant; it can only keep growing. Entropy, uncertainty, disorder, and complexity are facets of the same concept.
Image: The files on a hard disk are also a demonstration of entropy; once fragmented, they never spontaneously return to their initial order.
Order or disorder
Statistical thermodynamics then gave a new orientation to this abstract physical quantity. It measures the degree of disorder of a system at the microscopic level. The higher the entropy of a system, the less its elements are ordered, bound to one another, and capable of producing mechanical effects, and the larger the fraction of its energy that is unused or used incoherently. Ludwig Eduard Boltzmann (1844–1906) formulated a mathematical expression for the statistical entropy as a function of the number Ω of microscopic states that realize the macroscopic equilibrium state of a given system.
Boltzmann's formula is S = k ln Ω.
In reality the formula S = k log W is due to Max Planck (1858–1947), but it was Boltzmann who understood its meaning.
It is common to say that entropy is a measure of disorder. Consider, for example, a deck of 52 cards, all laid the same side up: say they are in perfect order. This macroscopic state can be realized in only one way, so Ω = 1. Now turn one card over; this is the beginning of disorder, but there are Ω = 52 ways to realize the macroscopic state "a single turned card". Disorder is maximal when 26 cards face one way and 26 the other; the number of microscopic configurations of this maximally disordered state is then Ω = 4.96×10¹⁴.
In this example the number of microscopic configurations (and hence the entropy) is indeed a measure of disorder. While the notion of disorder is often subjective, the number Ω of configurations is objective, because it is a number.
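The microstate counts in the card example can be checked directly. A minimal Python sketch (the value of the Boltzmann constant k is the only input not taken from the text above):

```python
import math

k = 1.380649e-23  # Boltzmann constant, in J/K

# Number of microstates for each macroscopic state of the 52-card deck
omega_ordered = 1                       # all cards the same side up: one single way
omega_one_turned = 52                   # one turned card: 52 choices of which card
omega_max_disorder = math.comb(52, 26)  # 26 cards on each side

print(omega_max_disorder)  # 495918532948104, i.e. about 4.96e14

# Statistical entropy S = k ln(omega) for each macroscopic state
for omega in (omega_ordered, omega_one_turned, omega_max_disorder):
    print(f"omega = {omega:>18}  S = {k * math.log(omega):.3e} J/K")
```

Note that the perfectly ordered state, with Ω = 1, has S = k ln 1 = 0: zero entropy.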
Let us return to the deck of 52 cards and suppose that we throw the cards in the air so that each one lands on one side or the other with equal probability. If we repeat the operation a large number of times, the numerical values above show that maximal disorder will appear far more often than any other situation. Now consider a mole of gas confined at normal conditions of temperature and pressure.
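The card-throwing experiment is easy to simulate. The sketch below (a hypothetical illustration using Python's standard library) throws the 52 cards many times and tallies the resulting splits; the near-26/26 splits dominate, while the perfectly ordered outcome essentially never occurs:

```python
import random
from collections import Counter

random.seed(0)        # fixed seed so the run is repeatable
trials = 100_000
counts = Counter()

for _ in range(trials):
    # Each of the 52 cards lands on one side or the other with probability 1/2
    turned = sum(random.getrandbits(1) for _ in range(52))
    counts[turned] += 1

# The most frequent outcome is the maximum-disorder split, near 26/26
most_common_split, _ = counts.most_common(1)[0]
print(most_common_split)

# Perfect order (0 or 52 cards turned) has probability 2 * 2**-52 per throw,
# so it is essentially never observed in 100,000 trials
print(counts[0] + counts[52])
```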
The number of particles is enormous: N_A = 6.022×10²³. The possibility of gathering all the molecules on the same side of the container, leaving half of the volume empty, is negligible compared with the immensely greater number of possibilities in which the molecules are uniformly distributed throughout the volume. The uniform distribution is therefore realized immensely more often than any other situation, to the point that it appears as a stable equilibrium. Thus the equilibrium of a thermodynamic system occurs when its entropy has the maximum value compatible with the constraints to which it is subjected (here, the volume).
Image: The possibility of finding all the molecules on the same side of the container, leaving half of the volume empty, is negligible compared with the immensely greater number of possibilities in which the molecules are uniformly distributed throughout the volume. Thus thermodynamic equilibrium occurs when the system reaches its maximum entropy value. The entropy is defined by the Boltzmann formula, S = k log W.
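Just how negligible is that possibility? If each of the N_A molecules sits in the left half with probability 1/2 independently, the probability that all of them do is (1/2)^N_A. The number itself underflows any floating-point type, but its logarithm is easy to compute, as in this sketch:

```python
import math

N_A = 6.022e23  # Avogadro's number: molecules in one mole of gas

# Probability that all molecules occupy the same half of the container:
# p = (1/2) ** N_A.  Far too small for floating point, so work with
# its base-10 logarithm instead.
log10_p = -N_A * math.log10(2)
print(f"p = 10^({log10_p:.3e})")  # roughly 10^(-1.8e23)
```

A probability of about one in 10^(1.8×10²³) is why the uniform distribution looks like a permanent equilibrium: the ordered configuration is not forbidden, merely unimaginably rare.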
Some examples of entropy
The two expressions of entropy simply result from two different points of view, depending on whether we consider the thermodynamic system at the macroscopic level or at the microscopic level.
The difficulty of giving an intuitive definition of the entropy of a system comes from the fact that it is not conserved: it can increase spontaneously during an irreversible transformation. Indeed, according to the second law of thermodynamics, the entropy of an isolated system cannot decrease; it increases, or it remains constant if the transformation is reversible.
Matter is made of particles (molecules, atoms, electrons) in perpetual motion (thermal agitation), exerting on one another an interaction whose intensity decreases as their mutual distance increases. In a gas this distance is relatively large, so the interactions are weak; the particles are free to move throughout the volume offered to them, but undergo numerous collisions during which their energy varies.
In a liquid the mutual distances are smaller and the molecules are less free. In a solid, each molecule is bound elastically to its neighbors and vibrates around a fixed average position. The energy of an individual particle is unpredictable.
- Since friction is the main cause of irreversibility, and thus of entropy production, we understand why we try to minimize it; this is the purpose of lubricating parts that are in contact and in motion in a mechanical assembly.
- With the same quantity of gasoline, the mechanical work recovered depends on the speed of the car: the faster the car goes, the shorter the distance covered. Speed is here a factor of irreversibility.
- An electric cell supplies more electrical work as its operation approaches reversibility, that is, when it operates at low voltage and low current. If, on the other hand, we short-circuit the electrodes, we recover practically nothing but heat.
Image: Entropy, uncertainty, disorder, and complexity thus appear as facets of the same concept. In one form or another, entropy is associated with the notion of probability.
It characterizes not an object in itself, but the knowledge we have of it and our ability to make predictions. It thus has a character at once objective and subjective. (Roger Balian, Université de tous les savoirs.)