**6.1 Discussion Points:**

In thermal processes the entropy measures the extent to which the energy of the system is available for conversion to work. If a system undergoing an infinitesimal reversible change takes in a quantity of heat *dQ* at absolute temperature *T*, its entropy is increased by

*dS = dQ/T*

For a reversible adiabatic process there is no heat transfer, so the entropy remains constant throughout the process.
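As a numerical sketch (the quantities below are hypothetical, not from the text), the relation *dS = dQ/T* can be applied directly for a reversible isothermal transfer, or integrated over temperature for heating at a constant heat capacity:

```python
import math

def entropy_change_isothermal(Q, T):
    """dS = dQ/T at constant temperature T, so DeltaS = Q / T (J/K)."""
    if T <= 0:
        raise ValueError("absolute temperature must be positive")
    return Q / T

def entropy_change_heating(C, T1, T2):
    """Integrating dS = C dT / T for constant heat capacity C:
    DeltaS = C * ln(T2 / T1)."""
    return C * math.log(T2 / T1)

# Hypothetical examples: 300 J absorbed reversibly at 300 K,
# and heating from 300 K to 600 K with C = 1 J/K.
print(entropy_change_isothermal(300.0, 300.0))   # 1.0 J/K
print(entropy_change_heating(1.0, 300.0, 600.0)) # ln 2 J/K
```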

On the microscopic level, the equilibrium of a thermodynamic system is associated with the distribution of molecules that has the greatest probability of occurring, i.e. the state with the greatest degree of disorder. Statistical mechanics interprets the increase in entropy to a maximum in a closed system at equilibrium as the consequence of the trend from a less probable to a more probable state. The Second Law of Thermodynamics states that the entropy of an isolated system is non-decreasing.

In statistical thermodynamics, entropy is often defined as a function of the number of possible microscopic states (*N*) that a system of atoms or molecules could occupy, consistent with its macroscopic state (temperature, pressure, etc.).
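Written out explicitly (a standard form not stated in the text, with *k* denoting Boltzmann's constant), this definition is Boltzmann's formula for the entropy of a macrostate with *N* equally likely microstates:

```latex
S = k \ln N
```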

In Information Theory, Shannon made rigorous the idea that the entropy of a process is a measure of the information (or lack of it) contained in the process, if all the states are equally likely. Entropy is really a notion of self-information: the information provided by a random process about itself. Entropy is sufficient to study the reproduction of a single process through a noiseless environment, but more often one has two or more distinct random processes. For example, one random process represents an information source and another represents the output of a communication medium wherein the coded source has been corrupted by another random process called noise. In such cases observations are made on one process in order to make decisions about another. Shannon introduced the notion of *mutual information* between two processes as the sum of the two entropies minus the entropy of the pair.
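As an illustrative sketch (the joint distribution below is made up for the example), the "sum of the two entropies minus the entropy of the pair" can be computed directly from a joint probability table:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution (list of probabilities)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# A hypothetical joint distribution p(x, y) over two binary variables.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px  = [sum(row) for row in joint]          # marginal distribution of X
py  = [sum(col) for col in zip(*joint)]    # marginal distribution of Y
pxy = [p for row in joint for p in row]    # flattened joint distribution

# Mutual information: I(X; Y) = H(X) + H(Y) - H(X, Y).
I = entropy(px) + entropy(py) - entropy(pxy)
print(I)
```

Because the two variables here agree 80% of the time, the mutual information comes out strictly positive; for an independent pair it would be zero.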

Another form of information measure is commonly referred to as *relative entropy*, and it is better interpreted as a measure of the similarity between probability distributions than as a measure of information between random variables. Many results for mutual information and entropy can be viewed as special cases of results for relative entropy.
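A minimal sketch of relative entropy between two discrete distributions *p* and *q* (the distributions below are hypothetical). Note that it is not symmetric in its arguments, which is one reason it reads as a similarity measure rather than a true distance:

```python
import math

def relative_entropy(p, q):
    """D(p || q) in bits; assumes q(x) > 0 wherever p(x) > 0."""
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(relative_entropy(p, q))  # D(p||q)
print(relative_entropy(q, p))  # D(q||p): a different value, D is not symmetric
```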

An isolated system is modelled as a Markov chain obeying the physical laws governing the system. This assumption implies (1) the existence of an overall state of the system and (2) the fact that, knowing the present state, the future of the system is independent of the past. In such a system the relative entropy between two distributions on the states never increases with time. In general, the fact that the relative entropy decreases does not imply that the entropy increases. Cover and Thomas show that for a Markov chain:

1. *Relative entropy D(µn||µ′n) decreases with n*, where *µn* and *µ′n* are two probability distributions on the state space of a Markov chain at time *n*. That means that the distance between the probability mass functions is decreasing with time *n* for any Markov chain.

2. *Relative entropy D(µn||µ) between a distribution µn on the states at time n and a stationary distribution µ decreases with n.*

3. *Entropy increases if the stationary distribution is uniform.* If the stationary distribution is a uniform distribution, the relative entropy decreases monotonically and that implies a monotonic increase in entropy. This case is the closest to statistical thermodynamics, where all the microstates are equally likely.

4. *The conditional entropy H(Xn | X1) increases with n.* The conditional uncertainty of the future increases.

5. *Shuffles increase entropy.* If the shuffle applied to a deck is chosen independently of the (random) order of the cards, the entropy of the resulting order cannot be smaller than the entropy of the original order.
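Points 2 and 3 above can be checked numerically. The sketch below (a made-up two-state chain, not from the text) iterates a doubly stochastic transition matrix, whose stationary distribution is uniform, and shows D(µn||µ) shrinking while the entropy of µn grows:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def relative_entropy(p, q):
    """D(p || q) in bits; assumes q(x) > 0 wherever p(x) > 0."""
    return sum(px * math.log2(px / qx) for px, qx in zip(p, q) if px > 0)

def step(mu, P):
    """One step of the chain: mu_{n+1}(j) = sum_i mu_n(i) * P[i][j]."""
    n = len(mu)
    return [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]

# A doubly stochastic transition matrix (rows and columns each sum to 1),
# so the stationary distribution is uniform.
P = [[0.7, 0.3],
     [0.3, 0.7]]
uniform = [0.5, 0.5]

mu = [0.95, 0.05]  # an arbitrary, far-from-uniform starting distribution
for n in range(5):
    d = relative_entropy(mu, uniform)
    h = entropy(mu)
    print(f"n={n}  D(mu_n || uniform)={d:.4f}  H(mu_n)={h:.4f}")
    mu = step(mu, P)
```

Each iteration drives µn toward the uniform stationary distribution, so the printed D column decreases monotonically and the H column increases monotonically, as items 2 and 3 predict.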