How to define entropy

Entropy is a somewhat mysterious physical quantity: different scientists have given it several definitions at different times. The concept appears in a wide range of problems in physics and related disciplines, so it is important to know what entropy is and how it is defined.

Instructions

1. The concept of entropy was first introduced by Rudolf Clausius in 1865, who defined it as a measure of the dissipation of heat in thermodynamic processes. The formula for this thermodynamic entropy is ΔS = ΔQ/T, where ΔS is the increment of entropy in the process under consideration, ΔQ is the amount of heat transferred to the system (or removed from it), and T is the absolute temperature of the system, measured in kelvins. The first two laws of thermodynamics say nothing more about entropy: they deal only with its increments, never with its absolute value. The third law states that as the temperature approaches absolute zero, the entropy tends to zero as well, which provides a reference point for measuring it. In most real experiments, however, what interests scientists is the change of entropy in a particular process, not its exact values at the beginning and end.
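A minimal numerical sketch of the Clausius formula in Python; the example (melting 1 kg of ice at its normal melting point, with a latent heat of roughly 334 kJ) uses assumed textbook values, not figures from the article:

    # Clausius relation for an isothermal process: dS = dQ/T.
    def entropy_change(heat_joules, temperature_kelvin):
        """Entropy increment dS = dQ/T for heat transferred at constant T."""
        return heat_joules / temperature_kelvin

    # Melting 1 kg of ice at 273.15 K absorbs about 334,000 J of latent
    # heat (assumed value). The ice-water system gains entropy:
    delta_S = entropy_change(334_000.0, 273.15)
    print(f"Entropy gained: {delta_S:.1f} J/K")  # about 1222.8 J/K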

2. Ludwig Boltzmann and Max Planck gave a different definition of the same entropy. Applying a statistical approach, they concluded that entropy is a measure of how close a system is to its most probable state; the most probable state, in turn, is the one that can be realized by the greatest number of arrangements. In the classic thought experiment of a pool table with balls moving chaotically across it, the least probable state of this "ball-dynamics" system is the one in which all the balls are in one half of the table: up to the exact placement of the balls, it can be realized in only one way. The state in which the balls are spread evenly over the whole surface is the most probable. Accordingly, the entropy of the system is minimal in the first state and maximal in the second, and the system spends most of its time in the state of maximum entropy. The statistical formula for entropy is S = k*ln(Ω), where k is Boltzmann's constant (1.38*10^(-23) J/K) and Ω is the statistical weight (number of microstates) of the system's state.
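The ball-counting picture can be made concrete with a short Python sketch of S = k*ln(Ω); it assumes, purely for illustration, N = 20 distinguishable balls, each equally likely to sit in either half of the table, so the statistical weight of the macrostate "n balls on the left" is the binomial coefficient C(N, n):

    import math

    k_B = 1.38e-23  # Boltzmann's constant, J/K

    def boltzmann_entropy(omega):
        """Statistical entropy S = k*ln(Omega)."""
        return k_B * math.log(omega)

    N = 20  # assumed number of balls
    for n_left in (0, 10):  # all balls in one half vs. an even split
        omega = math.comb(N, n_left)  # ways to realize this macrostate
        print(f"{n_left} of {N} balls on the left: Omega = {omega}, "
              f"S = {boltzmann_entropy(omega):.3e} J/K")

The one-sided state has Ω = 1 and therefore S = 0, while the even split has Ω = 184,756 and the largest entropy, in line with the reasoning above.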

3. The second law of thermodynamics states that in any process the entropy of a closed system does not decrease. The statistical approach, however, says that even highly improbable states can still be realized, so fluctuations in which the entropy of a system decreases are possible. The second law remains valid, but only when the picture is considered over a sufficiently long interval of time.
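Why such fluctuations are never seen in large systems can be checked with one line of arithmetic: if N particles move independently, the chance of finding all of them in one half of their container is (1/2)^N. A small Python sketch with assumed values of N:

    # Probability that all N independent particles occupy one half.
    for N in (10, 100, 1000):
        print(f"N = {N:5d}: P(all in one half) = {0.5 ** N:.3e}")

For N = 10 the fluctuation occurs about once per thousand snapshots, but already for N = 1000 it is so improbable that it would never be observed, which is why the second law holds over any realistic span of time.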

4. On the basis of the second law of thermodynamics, Rudolf Clausius put forward the hypothesis of the heat death of the Universe: eventually all forms of energy would turn into heat, the heat would be distributed evenly throughout space, and life would become impossible. This hypothesis was later refuted: Clausius did not take the influence of gravity into account in his calculations, and because of it the picture he painted is by no means the most probable state of the Universe.

5. Entropy is sometimes called a measure of disorder, because the most probable state is, as a rule, less structured than the others. This understanding is not always correct, however. For example, an ice crystal is more ordered than liquid water, yet below the freezing point it is ice that forms: the heat released on freezing raises the entropy of the surroundings by more than the ordering of the water lowers its own, so the state with the ice crystal corresponds to the larger total entropy.
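A simplified bookkeeping sketch in Python makes this concrete; the round numbers (molar latent heat of fusion of water about 6010 J/mol, melting point 273 K, surroundings at 263 K) are assumed values, and heat-capacity corrections are ignored:

    L_fus = 6010.0   # latent heat of fusion of water, J/mol (assumed)
    T_melt = 273.0   # melting point of ice, K
    T_surr = 263.0   # temperature of the colder surroundings, K (assumed)

    dS_water = -L_fus / T_melt        # water loses entropy as it orders
    dS_surroundings = L_fus / T_surr  # surroundings absorb the latent heat

    print(f"Water:        {dS_water:+.1f} J/(mol*K)")         # about -22.0
    print(f"Surroundings: {dS_surroundings:+.1f} J/(mol*K)")  # about +22.9
    print(f"Total:        {dS_water + dS_surroundings:+.1f} J/(mol*K)")

The total comes out positive (about +0.8 J/(mol*K)), so the more ordered ice corresponds to the larger entropy of the water and its surroundings taken together.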

Author: «MirrorInfo» Dream Team

