From Wikipedia, the free encyclopedia
The four laws of thermodynamics define fundamental physical quantities (temperature, energy, and entropy) that characterize thermodynamic systems. The laws describe how these quantities behave under various circumstances, and forbid certain phenomena (such as perpetual motion).
Classical thermodynamics describes the exchange of work and heat between closed systems. It has a special interest in systems that are individually in states of thermodynamic equilibrium. Thermodynamic equilibrium is a condition of systems which are adequately described by only macroscopic variables. Every physical system, however, when microscopically examined, shows apparently random microscopic statistical fluctuations in its thermodynamic variables of state (entropy, temperature, pressure, etc.). These microscopic fluctuations are negligible for systems which are nearly in thermodynamic equilibrium and which are only macroscopically examined. They become important, however, for systems which are nearly in thermodynamic equilibrium when they are microscopically examined, and, exceptionally, for macroscopically examined systems that are in critical states, and for macroscopically examined systems that are far from thermodynamic equilibrium.
The zeroth law of thermodynamics may be stated in the following form:
If two systems are both in thermal equilibrium with a third then they are in thermal equilibrium with each other.
The law is intended to allow the existence of an empirical parameter, the temperature, as a property of a system such that systems in thermal equilibrium with each other have the same temperature. The law as stated here is compatible with the use of a particular physical body, for example a mass of gas, to match temperatures of other bodies, but does not justify regarding temperature as a quantity that can be measured on a scale of real numbers.
Though this version of the law is one of the more commonly stated, it is only one of a diversity of statements that are labeled as "the zeroth law" by competent writers. Some statements go further, so as to supply the important physical fact that temperature is one-dimensional, that one can conceptually arrange bodies in a real-number sequence from colder to hotter. Perhaps there exists no unique "best possible statement" of the "zeroth law", because there is in the literature a range of formulations of the principles of thermodynamics, each of which calls for its respectively appropriate version of the law.
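The transitivity asserted by the zeroth law is what makes "being in thermal equilibrium with" behave like an equivalence relation, so that a single empirical temperature can label each class of mutually equilibrated systems. A minimal sketch of this bookkeeping (the system names and equilibrium observations below are hypothetical illustrations, not from the text):

```python
# Sketch: thermal equilibrium as an equivalence relation (zeroth law).
# Systems linked by observed mutual equilibrium end up in one class,
# which is what lets one temperature value label the whole class.

def make_classes(systems, equilibrium_pairs):
    """Group systems into equivalence classes via union-find."""
    parent = {s: s for s in systems}

    def find(s):
        while parent[s] != s:
            parent[s] = parent[parent[s]]  # path compression
            s = parent[s]
        return s

    for a, b in equilibrium_pairs:
        parent[find(a)] = find(b)

    classes = {}
    for s in systems:
        classes.setdefault(find(s), set()).add(s)
    return list(classes.values())

# A and B are each in equilibrium with C; the zeroth law then puts
# A, B, and C in one class, while D remains separate.
groups = make_classes(["A", "B", "C", "D"], [("A", "C"), ("B", "C")])
print(groups)
```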
Although these concepts of temperature and of thermal equilibrium are fundamental to thermodynamics and were clearly stated in the nineteenth century, the desire to explicitly number the above law was not widely felt until Fowler and Guggenheim did so in the 1930s, long after the first, second, and third laws were already widely understood and recognized. Hence it was numbered the zeroth law. The importance of the law as a foundation to the earlier laws is that it allows the definition of temperature in a non-circular way, without reference to entropy, its conjugate variable. Such a temperature definition is said to be 'empirical'.
The first law of thermodynamics may be stated in several ways. A common statement is that, for a closed system, the increase in internal energy is equal to the heat supplied to the system minus the work done by the system: ΔU = Q − W.
More specifically, the First Law encompasses several principles: energy is conserved; the internal energy is a function of the state of the system alone; and heat and work are the two ways in which energy can be transferred to or from a closed system.
Combining these principles leads to one traditional statement of the first law of thermodynamics: it is not possible to construct a perpetual motion machine which will continuously do work without consuming energy.
The second law of thermodynamics asserts the irreversibility of natural processes, and the tendency of natural processes to lead towards spatial homogeneity of matter and energy, and especially of temperature. It can be formulated in a variety of interesting and important ways.
It implies the existence of a quantity called the entropy of a thermodynamic system. In terms of this quantity it implies that
When two initially isolated systems in separate but nearby regions of space, each in thermodynamic equilibrium in itself but not necessarily with each other, are then allowed to interact, they will eventually reach a mutual thermodynamic equilibrium. The sum of the entropies of the initially isolated systems is less than or equal to the total entropy of the final combination. Equality occurs just when the two original systems have all their respective intensive variables equal; then the final system also has the same values.
This statement of the law recognizes that in classical thermodynamics, the entropy of a system is defined only when it has reached its own internal thermodynamic equilibrium.
The second law refers to a wide variety of processes, reversible and irreversible. All natural processes are irreversible. Reversible processes are a convenient theoretical fiction and do not occur in nature.
A prime example of irreversibility is in the transfer of heat by conduction or radiation. It was known long before the discovery of the notion of entropy that when two bodies initially of different temperatures come into thermal connection, then heat always flows from the hotter body to the colder one.
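This irreversibility can be made quantitative with the entropy mentioned above. A minimal numerical sketch, assuming two bodies of equal, temperature-independent heat capacity (an idealization) that equilibrate by conduction:

```python
import math

# Two bodies of equal heat capacity C equilibrate by conduction.
# The final temperature is the mean, and the total entropy change is
#   dS = C*ln(Tf/T1) + C*ln(Tf/T2) = C*ln(Tf**2 / (T1*T2)),
# which is >= 0 because the arithmetic mean of T1 and T2 is at least
# their geometric mean. The values of C, T1, T2 are illustrative only.

def entropy_of_equilibration(C, T1, T2):
    """Total entropy change (J/K) when two equal bodies equilibrate."""
    Tf = (T1 + T2) / 2.0
    return C * math.log(Tf * Tf / (T1 * T2))

dS = entropy_of_equilibration(C=1000.0, T1=400.0, T2=300.0)
print(dS)  # positive: conduction between unequal temperatures is irreversible
```

Equal initial temperatures give exactly zero entropy change, the reversible limiting case the text calls a theoretical fiction.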
The second law tells also about kinds of irreversibility other than heat transfer, for example those of friction and viscosity, and those of chemical reactions. The notion of entropy is needed to provide that wider scope of the law.
According to the second law of thermodynamics, in a theoretical and fictional reversible heat transfer, an element of heat transferred, δQ, is the product of the temperature (T), both of the system and of the source or destination of the heat, with the increment (dS) of the system's conjugate variable, its entropy (S):
δQ = T dS
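In symbols, the relation described above, together with the entropy change it implies for a finite reversible transfer at constant temperature, reads:

```latex
\delta Q = T\,\mathrm{d}S
\qquad\Longleftrightarrow\qquad
\mathrm{d}S = \frac{\delta Q}{T},
\qquad\text{so that}\qquad
\Delta S = \int \frac{\delta Q}{T} = \frac{Q}{T}
\quad (T\ \text{constant, reversible}).
```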
Entropy may also be viewed as a physical measure of the lack of physical information about the microscopic details of the motion and configuration of a system, when only the macroscopic states are known. The law asserts that for two given macroscopically specified states of a system, there is a quantity called the difference of information entropy between them. This information entropy difference defines how much additional microscopic physical information is needed to specify one of the macroscopically specified states, given the macroscopic specification of the other - often a conveniently chosen reference state which may be presupposed to exist rather than explicitly stated. A final condition of a natural process always contains microscopically specifiable effects which are not fully and exactly predictable from the macroscopic specification of the initial condition of the process. This is why entropy increases in natural processes - the increase tells how much extra microscopic information is needed to distinguish the final macroscopically specified state from the initial macroscopically specified state.
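The information view above can be illustrated numerically with the Gibbs/Shannon form of entropy; a minimal sketch (the probability values are illustrative, and natural logarithms give the result in nats):

```python
import math

# Entropy as missing information about microstates:
#   H = -sum(p_i * ln(p_i)) over microstate probabilities p_i.
# Multiplied by Boltzmann's constant this is the Gibbs entropy, and the
# uniform case p_i = 1/Omega reduces to S = k_B * ln(Omega).

def missing_information(probs):
    """Entropy in nats of a discrete distribution (0*ln 0 treated as 0)."""
    return sum(-p * math.log(p) for p in probs if p > 0.0)

# Fully known microstate: no missing information.
print(missing_information([1.0]))       # 0.0
# Four equally likely microstates: ln(4) nats of missing information.
print(missing_information([0.25] * 4))  # ~1.386
```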
The third law of thermodynamics is sometimes stated as follows:
The entropy of a perfect crystal of any pure substance approaches zero as the temperature approaches absolute zero.
At zero temperature the system must be in a state with the minimum thermal energy. This statement holds true if the perfect crystal has only one state with minimum energy. Entropy is related to the number of possible microstates according to S = kB ln(Ω), where S is the entropy of the system, kB is Boltzmann's constant, and Ω is the number of microstates (e.g. possible configurations of atoms). At absolute zero there is only one possible microstate (Ω = 1), and ln(1) = 0, so the entropy is zero.
A more general form of the third law applies to systems such as a glass that may have more than one minimum microscopically distinct energy state, or that may have a microscopically distinct state that is "frozen in" though not strictly a minimum energy state and not, strictly speaking, a state of thermodynamic equilibrium, at absolute zero temperature:
The entropy of a system approaches a constant value as its temperature approaches absolute zero.
The constant value (not necessarily zero) is called the residual entropy of the system.
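The counting formula S = kB ln(Ω) quoted above can be evaluated directly; a minimal sketch with illustrative microstate counts (Ω = 1 for a unique ground state, Ω > 1 for a case with residual entropy):

```python
import math

# Boltzmann entropy S = k_B * ln(Omega) for a count of microstates.
# A perfect crystal with a unique ground state has Omega = 1 at T = 0,
# so S = 0; a glass frozen into several distinct configurations keeps
# a nonzero residual entropy. The Omega values are illustrative.

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact by SI definition)

def boltzmann_entropy(omega):
    """Entropy in J/K for omega equally accessible microstates."""
    return K_B * math.log(omega)

print(boltzmann_entropy(1))  # 0.0 (perfect crystal at absolute zero)
print(boltzmann_entropy(2))  # small positive residual entropy
```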
Count Rumford (born Benjamin Thompson) showed, about 1797, that endless mechanical action can generate indefinitely large amounts of heat from a fixed amount of working substance, thereby challenging the caloric theory, which held that there would be a finite amount of caloric in a fixed amount of working substance. The historically first established thermodynamic principle, which eventually became the second law of thermodynamics, was formulated by Sadi Carnot in 1824. By 1860, as formalized in the works of those such as Rudolf Clausius and William Thomson, two established principles of thermodynamics had evolved, the first principle and the second principle, later restated as thermodynamic laws. By 1873, for example, thermodynamicist Josiah Willard Gibbs, in his memoir Graphical Methods in the Thermodynamics of Fluids, clearly stated the first two absolute laws of thermodynamics. Some textbooks throughout the 20th century numbered the laws differently. In some fields removed from chemistry, the second law was considered to deal with the efficiency of heat engines only, whereas what was called the third law dealt with entropy increases. Directly defining zero points for entropy calculations was not considered to be a law. Gradually, this separation was combined into the second law, and the modern third law was widely adopted.
Chemist and novelist C. P. Snow once remarked that not knowing the second law of thermodynamics was "like having never read a work by Shakespeare." The following simple expression of the four laws has been attributed to Snow: