Second law of Thermodynamics
The second law of thermodynamics asserts the existence of a quantity called the entropy of a system and further states that
When two initially isolated systems in separate but nearby regions of space, each in thermodynamic equilibrium in itself but not necessarily with each other, are then allowed to interact, they will eventually reach a mutual thermodynamic equilibrium. The sum of the entropies of the initially isolated systems is less than or equal to the total entropy of the final combination.
This statement of the law recognizes that in classical thermodynamics, the entropy of a system is defined only when it has reached its own internal thermodynamic equilibrium. There are other statements of the law.
The second law applies to a wide variety of processes, both reversible and irreversible; its main import concerns irreversibility. In an irreversible process of transfer of matter and energy between two systems, the sum of the entropies of the two systems is greater finally than initially. In the trivial case in which the two systems have equal intensive variables, the sum of the entropies does not change. Apart from this trivial case, all natural processes are irreversible, while reversible processes remain a convenient theoretical fiction.
The prime example of irreversibility is in the transfer of heat by conduction or radiation. It was known long before the discovery of the notion of entropy that when two bodies initially of different temperatures come into thermal connection, then heat always flows from the hotter body to the colder one.
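As a concrete check on this, the entropy bookkeeping for a heat transfer between two large reservoirs can be worked out directly from the relation ΔS = Q/T for a reservoir at constant temperature. The following minimal sketch is illustrative only; the reservoir temperatures and the transferred heat Q are arbitrary values chosen for the example.

```python
# Entropy change when heat Q flows between two large reservoirs.
# A reservoir at constant temperature T that receives heat Q changes
# its entropy by Q / T (and by -Q / T when it gives the heat up).

def total_entropy_change(q, t_hot, t_cold):
    """Total entropy change (J/K) when heat q (J) flows from the
    reservoir at t_hot (K) to the reservoir at t_cold (K)."""
    ds_hot = -q / t_hot    # hot reservoir loses heat
    ds_cold = q / t_cold   # cold reservoir gains heat
    return ds_hot + ds_cold

# Irreversible case: heat flows from hot (400 K) to cold (300 K).
print(total_entropy_change(1000.0, 400.0, 300.0))   # ~ +0.833 J/K > 0

# Trivial limit: equal temperatures, no net entropy change.
print(total_entropy_change(1000.0, 350.0, 350.0))   # 0.0 J/K
```

The sum is positive whenever the heat flows from the hotter body to the colder one, and zero only in the trivial equal-temperature case, matching the statement above.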
The second law also covers kinds of irreversibility other than heat transfer, for example that of chemical reactions. The notion of entropy is needed to provide that wider scope of the law.
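One standard example of such wider irreversibility is the mixing of distinct ideal gases, whose entropy increase follows the well-known formula ΔS = −nR Σ xᵢ ln xᵢ. The sketch below is a hedged illustration assuming ideal-gas behavior; the mole numbers are arbitrary.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def entropy_of_mixing(moles):
    """Entropy increase (J/K) when distinct ideal gases with the given
    mole numbers mix at constant temperature and pressure:
    dS = -n_total * R * sum(x_i * ln(x_i))."""
    n_total = sum(moles)
    return -n_total * R * sum(
        (n / n_total) * math.log(n / n_total) for n in moles if n > 0
    )

# Mixing 1 mol each of two different ideal gases: dS = 2 R ln 2 > 0.
print(entropy_of_mixing([1.0, 1.0]))  # ~11.53 J/K
```

The result is strictly positive for any mixture of distinct gases, so the process can never run backwards spontaneously.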
According to the second law of thermodynamics, in a theoretical and fictional reversible heat transfer, an element of heat transferred, δQ, is the product of the temperature (T), both of the system and of the source or destination of the heat, and the increment (dS) of the system's conjugate variable, its entropy (S):

δQ = T dS
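For a reversible process this relation can be integrated along the path. A common textbook case, sketched below under the assumption of a constant heat capacity C, is slow heating of a body from T1 to T2, which gives ΔS = C ln(T2/T1); the numerical values here are arbitrary.

```python
import math

def entropy_change_reversible_heating(c, t1, t2):
    """Integrate dS = dQ / T for reversible heating with constant heat
    capacity c (J/K): dQ = c dT, so dS = c dT / T, giving c * ln(t2/t1)."""
    return c * math.log(t2 / t1)

# Example: 1 kg of water (c ~ 4184 J/K) heated reversibly from 300 K to 350 K.
print(entropy_change_reversible_heating(4184.0, 300.0, 350.0))  # ~645 J/K
```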
Entropy may also be viewed as a measure of the lack of physical information about the microscopic details of the motion and configuration of a system. The law asserts that for two given macroscopically specified states of a system, there is a quantity called the difference of entropy between them. This entropy difference defines how much additional microscopic physical information is needed to specify one of the macroscopically specified states, given the macroscopic specification of the other (often a conveniently chosen reference state which may be presupposed to exist rather than explicitly stated). A final condition of a natural process always contains microscopically specifiable effects which are not fully and exactly predictable from the macroscopic specification of the initial condition of the process. This is why entropy increases in natural processes: the increase tells how much extra microscopic information is needed to distinguish the final macroscopically specified state from the initial macroscopically specified state.
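In the statistical-mechanical reading of this paragraph, entropy can be computed from a probability distribution over microstates via the Gibbs formula S = −k_B Σ pᵢ ln pᵢ. The sketch below is illustrative, not part of the original text: a sharply known microstate carries zero entropy, while maximal uncertainty over W equally likely microstates recovers Boltzmann's value k_B ln W.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """Gibbs entropy S = -k_B * sum(p_i * ln(p_i)) of a microstate
    distribution; larger S means more missing microscopic information."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Exactly known microstate: no missing information, S = 0.
print(gibbs_entropy([1.0]))              # 0.0

# W = 4 equally likely microstates: S = k_B ln 4 (Boltzmann's formula).
W = 4
print(gibbs_entropy([1.0 / W] * W))      # ~1.914e-23 J/K
print(K_B * math.log(W))                 # same value
```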