Solution for Approaching Quantum Computing 1st Edition Chapter 6, Problem 1

by Dan C. Marinescu Gabriela M. Marinescu
ISBN: 9780131452244


6.1 Read Chapter 2 of the book Elements of Information Theory by T. M. Cover and J. A. Thomas [8] and then discuss the relationship between the Second Law of Thermodynamics and the entropy functions defined in information theory.

Step-By-Step Solution

6.1 Discussion Points:

In thermal processes, entropy measures the extent to which the energy of the system is available for conversion to work. If a system undergoing an infinitesimal reversible change takes in a quantity of heat dQ at absolute temperature T, its entropy increases by

dS = dQ/T

For a reversible adiabatic process there is no heat transfer (dQ = 0), so the entropy remains constant throughout the process.
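The original text gives no code; as an illustration only, the relation dS = dQ/T can be integrated for a reversible isothermal process (T constant), giving ΔS = Q/T. A minimal Python sketch (the function name is my own choice, not from the text):

```python
def entropy_change_isothermal(q_joules, t_kelvin):
    """Entropy change (J/K) for heat q absorbed reversibly at constant temperature t.

    For constant T, integrating dS = dQ/T gives delta_S = Q / T.
    """
    return q_joules / t_kelvin

# 1000 J absorbed reversibly at 300 K increases the entropy by ~3.33 J/K.
print(entropy_change_isothermal(1000.0, 300.0))
```

For a non-isothermal reversible path, dQ/T would have to be integrated along the actual path instead.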

On the microscopic level, the equilibrium of a thermodynamic system is associated with the distribution of molecules that has the greatest probability of occurring, i.e., the state with the greatest degree of disorder. Statistical mechanics interprets the increase of entropy to a maximum in a closed system at equilibrium as the consequence of the trend from a less probable to a more probable state. The Second Law of Thermodynamics states that the entropy of an isolated system is non-decreasing.

In statistical thermodynamics, entropy is often defined as a function of the number of possible microscopic states (N) that a system of atoms or molecules could occupy, consistent with its macroscopic state (temperature, pressure, etc.): S = k_B ln N, where k_B is the Boltzmann constant.
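The Boltzmann form S = k_B ln N can be sketched directly; this small example (not from the text) uses the exact SI value of the Boltzmann constant:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(n_microstates):
    """S = k_B * ln(N): entropy of a macrostate with N equally likely microstates."""
    return K_B * math.log(n_microstates)

# A single accessible microstate means zero entropy; doubling the number of
# accessible microstates adds k_B * ln 2 (~9.57e-24 J/K) to the entropy.
print(boltzmann_entropy(1))
print(boltzmann_entropy(2) - boltzmann_entropy(1))
```

The additivity under doubling is exactly the property that links this thermodynamic entropy to the bit-counting entropy of information theory.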

In information theory, Shannon made rigorous the idea that the entropy of a process is a measure of the information (or the lack of it) contained in the process when all the states are equally likely. Entropy is really a notion of self-information: the information a random process provides about itself. Entropy is sufficient to study the reproduction of a single process through a noiseless environment, but more often one has two or more distinct random processes. For example, one random process represents an information source and another represents the output of a communication medium wherein the coded source has been corrupted by another random process called noise. In such cases, observations are made on one process in order to make decisions about another. Shannon introduced the notion of mutual information between two processes as the sum of the two entropies minus the entropy of the pair.
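The definition above, I(X;Y) = H(X) + H(Y) - H(X,Y), can be sketched in a few lines of Python (this example is mine, not from the text; entropies are in bits):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability mass function (a list of probabilities)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    pxy = [x for row in joint for x in row]     # flattened joint distribution
    return entropy(px) + entropy(py) - entropy(pxy)

# Two independent fair bits share no information ...
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
# ... while two perfectly correlated fair bits share one full bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```

The two extreme cases bracket the general behavior: mutual information is zero exactly when the processes are independent and reaches the entropy of either process when one determines the other.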

Another form of information measure is commonly referred to as relative entropy, and it is better interpreted as a measure of the dissimilarity between probability distributions than as a measure of information between random variables. Many results for mutual information and entropy can be viewed as special cases of results for relative entropy.
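Relative entropy (the Kullback-Leibler divergence) D(p||q) can be sketched as follows; the example is my own and uses base-2 logarithms, so the result is in bits:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p||q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.9, 0.1]
q = [0.5, 0.5]
print(kl_divergence(p, p))  # 0.0: a distribution is at zero "distance" from itself
print(kl_divergence(p, q))  # positive whenever p != q
print(kl_divergence(q, p))  # note D(p||q) != D(q||p) in general: not a true metric
```

Because it is asymmetric and violates the triangle inequality, relative entropy is a "distance" only in an informal sense, which is why the text calls it a measure of dissimilarity rather than a metric.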

An isolated system is modelled as a Markov chain obeying the physical laws governing the system. This assumption implies (1) the existence of an overall state of the system and (2) that, given the present state, the future of the system is independent of the past. In such a system the relative entropy never increases. In general, the fact that the relative entropy decreases does not imply that the entropy increases. Cover and Thomas show that for a Markov chain:

1. The relative entropy D(µn||µ'n) between two probability distributions µn and µ'n on the state space of a Markov chain at time n decreases with n. That means the distance between the two probability mass functions decreases with time n for any Markov chain.

2. The relative entropy D(µn||µ) between a distribution µn on the states at time n and a stationary distribution µ decreases with n.

3. Entropy increases if the stationary distribution is uniform. In that case the relative entropy D(µn||µ) decreases monotonically, which implies a monotonic increase in entropy. This case is the closest to statistical thermodynamics, where all the microstates are equally likely.

4. The conditional entropy H(Xn|X1) increases with n: the conditional uncertainty of the future increases.

5. Shuffles increase entropy.
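Points 2 and 3 can be observed numerically. The sketch below (my own example, reusing nothing from the text) iterates a two-state chain with a doubly stochastic transition matrix, whose stationary distribution is therefore uniform; the relative entropy to the stationary distribution falls while the entropy rises, step by step:

```python
import math

def entropy(p):
    """Shannon entropy in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl(p, q):
    """Relative entropy D(p||q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Doubly stochastic transition matrix => the uniform distribution is stationary.
P = [[0.9, 0.1],
     [0.1, 0.9]]
stationary = [0.5, 0.5]

def step(mu, P):
    """One step of the chain: mu_next[j] = sum_i mu[i] * P[i][j]."""
    return [sum(mu[i] * P[i][j] for i in range(len(mu))) for j in range(len(P[0]))]

mu = [1.0, 0.0]  # start concentrated on a single state (zero entropy)
for n in range(5):
    # D(mu_n || stationary) shrinks while H(mu_n) grows at every step.
    print(n, round(kl(mu, stationary), 4), round(entropy(mu), 4))
    mu = step(mu, P)
```

With a non-uniform stationary distribution the relative entropy would still decrease, but the entropy itself need not increase, which is exactly the caveat stated above.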

