Computer Science
  • format: djvu, pdf
  • size: 8.79 MB
  • added: June 28, 2011
Cover, Thomas. Elements of Information Theory. 2nd ed., 2006
The book is provided in two formats: .pdf and .djvu.
Reader reviews from amazon.com

Review
"As expected, the quality of exposition continues to be a high point of the book. Clear explanations, nice graphical illustrations, and illuminating mathematical derivations make the book particularly useful as a textbook on information theory." (Journal of the American Statistical Association, March 2008)

"This book is recommended reading, both as a textbook and as a reference." (Computing Reviews.com, December 28, 2006)

Product Description
The latest edition of this classic is updated with new problem sets and material

The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory.

All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.

The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references

Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.

An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.

From the Publisher
Following a brief introduction and overview, early chapters cover the basic algebraic relationships of entropy, relative entropy, and mutual information; the AEP; entropy rates of stochastic processes and data compression; and the duality of data compression and the growth rate of wealth. Later chapters explore Kolmogorov complexity, channel capacity, differential entropy, the capacity of the fundamental Gaussian channel, the relationship between information theory and statistics, rate distortion, and network information theory. The final two chapters examine the stock market and inequalities in information theory. In many cases the authors describe the properties of the solutions before presenting the problems. -This text refers to an out of print or unavailable edition of this title.

From the Inside Flap
Elements of Information Theory is an up-to-date introduction to the field of information theory and its applications to communication theory, statistics, computer science, probability theory, and the theory of investment. Covering all the essential topics in information theory, this comprehensive work provides an accessible introduction to the field that blends theory and applications. In step-by-step detail, the authors introduce the basic quantities of entropy, relative entropy, and mutual information and show how they arise as natural answers to questions of data compression, channel capacity, rate distortion, hypothesis testing, information flow in networks, and gambling. In addition, Elements of Information Theory investigates a number of ideas never before covered at this level in a textbook, including:
* The relationship of the second law of thermodynamics to Markov chains
* Competitive optimality of Huffman codes
* The duality of data compression and gambling
* Lempel-Ziv coding
* Kolmogorov complexity
* Portfolio theory
* Inequalities in information theory and their consequences in mathematics
Complete with numerous illustrations, original problems and historical notes, Elements of Information Theory is a practical reference for communications professionals and statisticians specializing in information theory. It also serves as an excellent introductory text for senior and graduate students taking courses in telecommunications, electrical engineering, statistics, computer science, and economics.
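Not from the book itself: as a minimal illustration of the basic quantities the flap copy lists (entropy and mutual information), here is a short Python sketch computing both for small discrete distributions. The example distributions are made up for illustration.

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum p(x) log2 p(x), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit; a biased coin carries less.
print(entropy([0.5, 0.5]))   # → 1.0
print(entropy([0.9, 0.1]))   # ≈ 0.469

def mutual_information(joint):
    """I(X;Y) = sum p(x,y) log2( p(x,y) / (p(x) p(y)) ), for a joint
    pmf given as a dict {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# X = Y for a fair coin: observing Y determines X, so I(X;Y) = H(X) = 1 bit.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # → 1.0
```

These identities (I(X;Y) = H(X) when X = Y, and H maximized by the uniform distribution) are among the first results developed in the book's opening chapters.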

About the Author
THOMAS M. COVER, PhD, is Professor in the departments of electrical engineering and statistics, Stanford University. A recipient of the 1991 IEEE Claude E. Shannon Award, Dr. Cover is a past president of the IEEE Information Theory Society, a Fellow of the IEEE and the Institute of Mathematical Statistics, and a member of the National Academy of Engineering and the American Academy of Arts and Sciences. He has authored more than 100 technical papers and is coeditor of Open Problems in Communication and Computation.

JOY A. THOMAS, PhD, is the Chief Scientist at Stratify, Inc., a Silicon Valley start-up specializing in organizing unstructured information. After receiving his PhD at Stanford, Dr. Thomas spent more than nine years at the IBM T. J. Watson Research Center in Yorktown Heights, New York. Dr. Thomas is a recipient of the IEEE Charles LeGeyt Fortescue Fellowship.
See also

Chaitin G.J. The Limits of Mathematics

  • format: djvu
  • size: 847.99 KB
  • added: March 22, 2011
Springer, 1997. - 147 pages. This book is the final version of a course on algorithmic information theory and the epistemology of mathematics and physics. It discusses Einstein and Goedel's views on the nature of mathematics in the light of information theory, and sustains the thesis that mathematics is quasi-empirical. There is a foreword by Cris Calude of the University of Auckland, and supplementary material is available at the author's web s...

Cover T.M., Thomas J.A. Elements of Information Theory

  • format: pdf
  • size: 34.35 MB
  • added: January 6, 2012
Wiley series in telecommunications. John Wiley & Sons, Inc., 1991. - 563 pages. This is intended to be a simple and accessible book on information theory. As Einstein said, "Everything should be made as simple as possible, but no simpler." This point of view drives our development throughout the book. We were drawn to the field of information theory from backgrounds in communication theory, probability theory and statistics, because of the...

Desurvire E. Classical and Quantum Information Theory: An Introduction for the Telecom Scientist

  • format: pdf
  • size: 4.74 MB
  • added: November 1, 2011
Cambridge University Press, 2009. - 691 pages. Information theory lies at the heart of modern technology, underpinning all communications, networking, and data storage systems. This book sets out, for the first time, a complete overview of both classical and quantum information theory. Throughout, the reader is introduced to key results without becoming lost in mathematical details. Opening chapters present the basic concepts and various applicati...

Gray R.M. Entropy and Information Theory

  • format: pdf
  • size: 2.58 MB
  • added: April 7, 2011
2nd Edition. Springer, 2011. - 409 pages. ISBN 1441979697. This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to the Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties. New in this edition: Expanded treatment of stationary or sliding-block codes and their relations to...

Hamming R.W. Coding and Information Theory

  • format: djvu
  • size: 1.71 MB
  • added: April 17, 2011
Prentice Hall, 1986. - 272 pages. Preface to the First Edition: This book combines the fields of coding and information theory in a natural way. They are both theories about the representation of abstract symbols. The two fields are now each so vast that only the elements can be presented in a short book. Information theory is usually thought of as "sending information from here to there" (transmission of information), but this is exactly the...

Hirsch M.J., Pardalos P.M., Murphey R. Dynamics of Information Systems: Theory and Applications

  • format: pdf
  • size: 9.26 MB
  • added: August 29, 2011
Springer, 2010. - 371 pages. "Dynamics of Information Systems" presents state-of-the-art research explaining the importance of information in the evolution of a distributed or networked system. This book presents techniques for measuring the value or significance of information within the context of a system. Each chapter reveals a unique topic or perspective from experts in this exciting area of research. These newly developed techniques have numer...

Kullback S. Information Theory and Statistics

  • format: pdf
  • size: 15.7 MB
  • added: January 16, 2011
Peter Smith, 1978. - 399 pages. Highly useful text studies the logarithmic measures of information and their application to testing statistical hypotheses. Topics include introduction and definition of measures of information, their relationship to Fisher’s information measure and sufficiency, fundamental inequalities of information theory, much more. Numerous worked examples and problems.
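Not part of the book description: the "logarithmic measures of information" Kullback studies are essentially relative entropy, i.e. the Kullback-Leibler divergence, which quantifies the expected log-likelihood ratio for discriminating one hypothesis from another. A minimal sketch, with made-up example distributions:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) = sum p(x) log2( p(x)/q(x) ),
    in bits: the expected per-sample log-likelihood ratio in favor of p
    when p is the true distribution."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Testing "biased coin" (p) against "fair coin" (q): each flip yields,
# on average, about 0.278 bits of evidence toward the biased hypothesis.
print(kl_divergence([0.8, 0.2], [0.5, 0.5]))  # ≈ 0.278
```

By Stein's lemma (one of the results the blurb's "testing statistical hypotheses" refers to), this divergence governs the exponential decay rate of the error probability of the best test.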

Neubauer A., Freudenberger J., Kuhn V. Coding theory: algorithms, architectures and applications

  • format: pdf
  • size: 1.83 MB
  • added: September 7, 2011
The present book provides a concise overview of channel coding theory and practice as well as the accompanying algorithms, architectures and applications. The selection of the topics presented in this book is oriented towards those subjects that are relevant for information and communication systems in use today or in the near future. The focus is on those aspects of coding theory that are important for the understanding of these systems. This bo...

Pierce J.R. An Introduction to Information Theory

  • format: djvu
  • size: 2.29 MB
  • added: March 15, 2011
Dover Publications, 1980. - 320 pages. Covers encoding and binary digits, entropy, language and meaning, efficient encoding and the noisy channel, and explores ways in which information theory relates to physics, cybernetics, psychology and art.

Thomas M. Cover, Joy A. Thomas. Elements of Information Theory

  • format: pdf
  • size: 10.09 MB
  • added: December 1, 2009
2nd ed. John Wiley & Sons, Inc. This book has arisen from over ten years of lectures in a two-quarter sequence of a senior and first-year graduate-level course in information theory, and is intended as an introduction to information theory for students of communication theory, computer science, and statistics. The first quarter might cover Chapters 1 to 9, which includes the asymptotic equipartition property, data compression, and channel...