6.1 Discussion Points:
In thermal processes the entropy measures the extent to which the thermal energy of a system is unavailable for conversion into work. If a system undergoing an infinitesimal reversible change takes in a quantity of heat dQ at absolute temperature T, its entropy is increased by
dS = dQ/T
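As a simple numerical illustration (the values below are made up, chosen only to show the arithmetic): a system that absorbs Q = 300 J of heat reversibly while held at T = 300 K gains entropy ΔS = Q/T = 1 J/K.

    # Hypothetical worked example of dS = dQ/T for a reversible isothermal heat transfer.
    Q = 300.0          # heat absorbed, in joules (illustrative value)
    T = 300.0          # absolute temperature, in kelvin (illustrative value)
    delta_S = Q / T    # entropy change, in J/K
    print(delta_S)     # 1.0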
For a reversible adiabatic process there is no heat transfer, and the entropy therefore remains constant throughout the process.
On the microscopic level, the equilibrium state of a thermodynamic system is associated with the distribution of molecules that has the greatest probability of occurring, i.e. the state with the greatest degree of disorder. Statistical mechanics interprets the increase of entropy to a maximum as a closed system approaches equilibrium as the consequence of the trend from less probable to more probable states. The Second Law of Thermodynamics states that the entropy of an isolated system is non-decreasing.
In statistical thermodynamics, entropy is often defined as a function of the number of possible microscopic states (N) that a system of atoms or molecules could occupy, consistent with its macroscopic state (temperature, pressure, etc.).
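The standard such function is the Boltzmann relation S = k ln N, where k is Boltzmann's constant. A minimal Python sketch, using an arbitrary illustrative value of N:

    import math

    k_B = 1.380649e-23        # Boltzmann's constant, in J/K
    N = 1e20                  # hypothetical number of accessible microstates
    S = k_B * math.log(N)     # Boltzmann relation S = k ln N (natural logarithm)
    print(S)                  # entropy in J/K, about 6.4e-22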
In Information Theory, Shannon made rigorous the idea that the entropy of a process is a measure of the information (or the uncertainty) contained in the process; when all the states are equally likely, it is simply the logarithm of the number of states. Entropy is really a notion of self-information - the information provided by a random process about itself. Entropy is sufficient to study the reproduction of a single process through a noiseless environment, but more often one has two or more distinct random processes. For example, one random process may represent an information source and another the output of a communication medium in which the coded source has been corrupted by a further random process called noise. In such cases observations are made on one process in order to make decisions about another. Shannon introduced the notion of mutual information between two processes as the sum of the two entropies minus the entropy of the pair.
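A minimal Python sketch of these quantities, using a small made-up joint distribution for two binary random variables X and Y: it computes the entropies H(X) and H(Y), the entropy of the pair H(X, Y), and the mutual information I(X; Y) = H(X) + H(Y) - H(X, Y).

    import math

    def entropy(p):
        # Shannon entropy in bits of a discrete distribution given as a list of probabilities.
        return -sum(x * math.log2(x) for x in p if x > 0)

    # Hypothetical joint distribution p(x, y) over two binary random variables.
    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

    # Marginal distributions of X and Y.
    p_x = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
    p_y = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

    H_x, H_y = entropy(p_x), entropy(p_y)
    H_xy = entropy(list(joint.values()))

    # Mutual information: the sum of the two entropies minus the entropy of the pair.
    I_xy = H_x + H_y - H_xy
    print(H_x, H_y, H_xy, I_xy)   # 1.0, 1.0, about 1.72, about 0.28 bits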
Another form of information measure is commonly referred to as relative entropy, and it is better interpreted as a measure of the similarity between probability distributions than as a measure of information between random variables. Many results for mutual information and entropy can be viewed as special cases of results for relative entropy.
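A minimal Python sketch of relative entropy between two made-up discrete distributions p and q, D(p || q) = Σ p(x) log [p(x)/q(x)]: it is zero exactly when the two distributions coincide, grows as they become less alike, and is not symmetric in its arguments.

    import math

    def relative_entropy(p, q):
        # Relative entropy D(p || q) in bits between two discrete distributions.
        # Assumes q(x) > 0 wherever p(x) > 0; otherwise D(p || q) is infinite.
        return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

    # Hypothetical distributions on the same three-letter alphabet.
    p = [0.7, 0.2, 0.1]
    q = [1/3, 1/3, 1/3]

    print(relative_entropy(p, q))   # about 0.43 bits
    print(relative_entropy(q, p))   # about 0.47 bits: not symmetric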
An isolated system is modelled as a Markov chain obeying the physical laws governing the system. This assumption implies (1) the existence of an overall state of the system and (2) the fact that, knowing the present state, the future of the system is independent of the past. In such a system the relative entropy between any two distributions on the states decreases monotonically (it never increases) with time. In general, the fact that the relative entropy decreases does not imply that the entropy increases. Cover and Thomas show that for a Markov chain (see the numerical sketch after the list):
1. The relative entropy D(µn || µ′n) decreases with n, where µn and µ′n are two probability distributions on the state space of a Markov chain at time n. That means that the distance between the two probability mass functions is decreasing with time n for any Markov chain.
2. The relative entropy D(µn || µ) between a distribution µn on the states at time n and a stationary distribution µ decreases with n.
3. Entropy increases if the stationary distribution is uniform. In that case the relative entropy D(µn || µ) decreases monotonically, and since D(µn || µ) = log N - H(µn) when µ is uniform on N states, this implies a monotonic increase in the entropy H(µn). This case is the closest to statistical thermodynamics, where all the microstates are equally likely.
4. The conditional entropy H(Xn | X1) increases with n. The conditional uncertainty of the future increases.
5. Shuffles increase entropy. If a shuffle (permutation) T is chosen independently of the arrangement X of a deck of cards, then H(TX) ≥ H(X).
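The following Python sketch illustrates points 1-3 on a hypothetical three-state chain (all numbers are illustrative). The transition matrix is doubly stochastic, so its stationary distribution µ is uniform; evolving two different initial distributions µn and µ′n shows D(µn || µ′n) and D(µn || µ) decreasing with n while the entropy H(µn) increases towards log2 3.

    import math

    def step(mu, P):
        # One step of the chain: mu_next[j] = sum_i mu[i] * P[i][j].
        return [sum(mu[i] * P[i][j] for i in range(len(mu))) for j in range(len(P))]

    def relative_entropy(p, q):
        # D(p || q) in bits; assumes q > 0 wherever p > 0.
        return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

    def entropy(p):
        # Shannon entropy in bits.
        return -sum(px * math.log2(px) for px in p if px > 0)

    # Hypothetical doubly stochastic transition matrix (rows and columns each sum to 1),
    # so the stationary distribution is uniform over the three states.
    P = [[0.5, 0.3, 0.2],
         [0.3, 0.4, 0.3],
         [0.2, 0.3, 0.5]]
    uniform = [1/3, 1/3, 1/3]

    mu = [0.9, 0.05, 0.05]    # first initial distribution (illustrative)
    nu = [0.1, 0.1, 0.8]      # second initial distribution (illustrative)

    for n in range(6):
        print(n,
              round(relative_entropy(mu, nu), 4),       # point 1: decreases with n
              round(relative_entropy(mu, uniform), 4),  # point 2: decreases with n
              round(entropy(mu), 4))                    # point 3: increases towards log2(3)
        mu, nu = step(mu, P), step(nu, P)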