Book Price Comparison: Noranbook
 
[Western Books] Elements of Information Theory
9780471241959
69,000 KRW
Operating System Concepts (Paperback, 10)
9781119439257
74,340 KRW
Elements of Information Theory 2nd Edition (Wiley Series in Telecommunications and Signal Processing)

Cover, Thomas M. / Thomas, Joy A.
Wiley-Interscience
Published July 17, 2006. List price 46,000 KRW. 748 pages.

Preface to the Second Edition.

Preface to the First Edition.

Acknowledgments for the Second Edition.

Acknowledgments for the First Edition.

1. Introduction and Preview.

1.1 Preview of the Book.

2. Entropy, Relative Entropy, and Mutual Information.

2.1 Entropy.

2.2 Joint Entropy and Conditional Entropy.

2.3 Relative Entropy and Mutual Information.

2.4 Relationship Between Entropy and Mutual Information.

2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information.

2.6 Jensen’s Inequality and Its Consequences.

2.7 Log Sum Inequality and Its Applications.

2.8 Data-Processing Inequality.

2.9 Sufficient Statistics.

2.10 Fano’s Inequality.

Summary.

Problems.

Historical Notes.

3. Asymptotic Equipartition Property.

3.1 Asymptotic Equipartition Property Theorem.

3.2 Consequences of the AEP: Data Compression.

3.3 High-Probability Sets and the Typical Set.

Summary.

Problems.

Historical Notes.

4. Entropy Rates of a Stochastic Process.

4.1 Markov Chains.

4.2 Entropy Rate.

4.3 Example: Entropy Rate of a Random Walk on a Weighted Graph.

4.4 Second Law of Thermodynamics.

4.5 Functions of Markov Chains.

Summary.

Problems.

Historical Notes.

5. Data Compression.

5.1 Examples of Codes.

5.2 Kraft Inequality.

5.3 Optimal Codes.

5.4 Bounds on the Optimal Code Length.

5.5 Kraft Inequality for Uniquely Decodable Codes.

5.6 Huffman Codes.

5.7 Some Comments on Huffman Codes.

5.8 Optimality of Huffman Codes.

5.9 Shannon–Fano–Elias Coding.

5.10 Competitive Optimality of the Shannon Code.

5.11 Generation of Discrete Distributions from Fair Coins.

Summary.

Problems.

Historical Notes.

6. Gambling and Data Compression.

6.1 The Horse Race.

6.2 Gambling and Side Information.

6.3 Dependent Horse Races and Entropy Rate.

6.4 The Entropy of English.

6.5 Data Compression and Gambling.

6.6 Gambling Estimate of the Entropy of English.

Summary.

Problems.

Historical Notes.

7. Channel Capacity.

7.1 Examples of Channel Capacity.

7.2 Symmetric Channels.

7.3 Properties of Channel Capacity.

7.4 Preview of the Channel Coding Theorem.

7.5 Definitions.

7.6 Jointly Typical Sequences.

7.7 Channel Coding Theorem.

7.8 Zero-Error Codes.

7.9 Fano’s Inequality and the Converse to the Coding Theorem.

7.10 Equality in the Converse to the Channel Coding Theorem.

7.11 Hamming Codes.

7.12 Feedback Capacity.

7.13 Source–Channel Separation Theorem.

Summary.

Problems.

Historical Notes.

8. Differential Entropy.

8.1 Definitions.

8.2 AEP for Continuous Random Variables.

8.3 Relation of Differential Entropy to Discrete Entropy.

8.4 Joint and Conditional Differential Entropy.

8.5 Relative Entropy and Mutual Information.

8.6 Properties of Differential Entropy, Relative Entropy, and Mutual Information.

Summary.

Problems.

Historical Notes.

9. Gaussian Channel.

9.1 Gaussian Channel: Definitions.

Source: Aladin
The latest edition of this classic is updated with new problem sets and material. The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.

The Second Edition features:

* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references

Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications. An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.

Elements of Information Theory, Second Edition, covers the standard topics of information theory, such as entropy, data compression, channel capacity, rate distortion, multi-user theory, and hypothesis testing. It presents applications to communications, statistics, complexity theory, and investment. Chapters 1-9 cover the asymptotic equipartition property, data compression, and channel capacity, culminating in the capacity of the Gaussian channel. Chapters 10-17 include rate distortion, the method of types, Kolmogorov complexity, network information theory, universal source coding, and portfolio theory.

The first edition of this book is the most successful book on Information Theory on the market today. Adoptions have remained strong since the book's publication in 1991.
Source: Aladin
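
As a quick illustration of the book's central quantity (a minimal sketch, not an excerpt from the book; the function name and the example distribution are mine), the following Python snippet computes the Shannon entropy H(X) = -sum over x of p(x) log2 p(x) of a small discrete source:

    import math

    def shannon_entropy(probs):
        # Shannon entropy in bits: H(X) = -sum(p * log2(p)),
        # skipping zero-probability outcomes by the convention 0 log 0 = 0.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Arbitrary example: a four-symbol source with dyadic probabilities.
    p = [0.5, 0.25, 0.125, 0.125]
    print(shannon_entropy(p))  # 1.75 (bits per symbol)

For this dyadic distribution a Huffman code (Chapter 5) meets the entropy bound exactly: codeword lengths 1, 2, 3, and 3 average 1.75 bits per symbol.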