Computer Science and Computing Technology
  • format djvu
  • size 3.32 MB
  • added 12 October 2011
Ash R. Information Theory
Dover Publications, 1965. - 345 pp.

Statistical communication theory is generally regarded as having been founded by Shannon (1948) and Wiener (1949), who conceived of the communication situation as one in which a signal chosen from a specified class is to be transmitted through a channel, but the output of the channel is not determined by the input. Instead, the channel is described statistically by giving a probability distribution over the set of all possible outputs for each permissible input. At the output of the channel, a received signal is observed, and then a decision is made, the objective of the decision being to identify as closely as possible some property of the input signal.
The Shannon formulation differs from the Wiener approach in the nature of the transmitted signal and in the type of decision made at the receiver. In the Shannon model, a randomly generated message produced by a source of information is "encoded," that is, each possible message that the source can produce is associated with a signal belonging to a specified set. It is the encoded message which is actually transmitted. When the output is received, a "decoding" operation is performed, that is, a decision is made as to the identity of the particular signal transmitted. The objectives are to increase the size of the vocabulary, that is, to make the class of inputs as large as possible, and at the same time to make the probability of correctly identifying the input signal as large as possible. How well one can do these things depends essentially on the properties of the channel, and a fundamental concern is the analysis of different channel models. Another basic problem is the selection of a particular input vocabulary that can be used with a low probability of error.
In the Wiener model, on the other hand, a random signal is to be communicated directly through the channel; the encoding step is absent. Furthermore, the channel model is essentially fixed. The channel is generally taken to be a device that adds to the input signal a randomly generated "noise." The "decoder" in this case operates on the received signal to produce an estimate of some property of the input. For example, in the prediction problem the decoder estimates the value of the input at some future time. In general, the basic objective is to design a decoder, subject to a constraint of physical realizability, which makes the best estimate, where the closeness of the estimate is measured by an appropriate criterion. The problem of realizing and implementing an optimum decoder is central to the Wiener theory.
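The statistical channel description above — a probability distribution over outputs for each permissible input — has a classic concrete instance in the binary symmetric channel, whose capacity Shannon's theory determines. A minimal sketch in Python (our own illustration, not code from the book; the function names are ours):

```python
import math

def h2(p):
    """Binary entropy in bits: H(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity (bits per channel use) of a binary symmetric channel
    that flips each transmitted bit with probability p."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # 1.0 — noiseless channel
print(bsc_capacity(0.11))  # ≈ 0.5 — half a bit survives per use
print(bsc_capacity(0.5))   # 0.0 — output independent of input
```

The last case shows the statistical viewpoint at its starkest: with crossover probability 1/2, the output distribution is the same for both inputs, so no decoding rule can recover anything about the transmitted signal.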

A Measure of Information
Noiseless Coding
The Discrete Memoryless Channel
Error Correcting Codes
Further Theory of Error Correcting Codes
Information Sources
Channels with Memory
Continuous Channels
See also

Chaitin G.J. The Limits of Mathematics

  • format djvu
  • size 847.99 KB
  • added 22 March 2011
Springer, 1997. - 147 pages. This book is the final version of a course on algorithmic information theory and the epistemology of mathematics and physics. It discusses Einstein and Goedel's views on the nature of mathematics in the light of information theory, and sustains the thesis that mathematics is quasi-empirical. There is a foreword by Cris Calude of the University of Auckland, and supplementary material is available at the author's web s...

Cover T.M., Thomas J.A. Elements of Information Theory

  • format pdf
  • size 34.35 MB
  • added 06 January 2012
Wiley series in telecommunications. John Wiley & Sons, Inc., 1991. – 563 pages. This is intended to be a simple and accessible book on information theory. As Einstein said, "Everything should be made as simple as possible, but no simpler." This point of view drives our development throughout the book. We were drawn to the field of information theory from backgrounds in communication theory, probability theory and statistics, because of the...

Desurvire E. Classical and Quantum Information Theory: An Introduction for the Telecom Scientist

  • format pdf
  • size 4.74 MB
  • added 01 November 2011
Cambridge University Press, 2009, 691 pages. Information theory lies at the heart of modern technology, underpinning all communications, networking, and data storage systems. This book sets out, for the first time, a complete overview of both classical and quantum information theory. Throughout, the reader is introduced to key results without becoming lost in mathematical details. Opening chapters present the basic concepts and various applicati...

Gray R.M. Entropy and Information Theory

  • format pdf
  • size 2.58 MB
  • added 07 April 2011
2nd Edition. Springer, 2011. - 409 p. ISBN 1441979697. This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to the Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, as well as information and distortion measures and their properties. New in this edition: expanded treatment of stationary or sliding-block codes and their relations to...

Hamming R.W. Coding and Information Theory

  • format djvu
  • size 1.71 MB
  • added 17 April 2011
Prentice Hall, 1986. - 272 pages. Preface to the First Edition: This book combines the fields of coding and information theory in a natural way. They are both theories about the representation of abstract symbols. The two fields are now each so vast that only the elements can be presented in a short book. Information theory is usually thought of as "sending information from here to there" (transmission of information), but this is exactly the...
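As a taste of the subject matter — representing symbols with enough redundancy to survive transmission errors — here is a small sketch (our own Python, not code from the book) of Hamming's (7,4) code, which corrects any single flipped bit in a seven-bit codeword:

```python
def encode(d):
    """Encode 4 data bits as a 7-bit codeword [d1..d4, p1, p2, p3]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [d1, d2, d3, d4, p1, p2, p3]

def decode(r):
    """Recompute the parity checks, correct at most one flipped bit,
    and return the 4 data bits."""
    r = r[:]
    d1, d2, d3, d4, p1, p2, p3 = r
    s1 = d1 ^ d2 ^ d4 ^ p1
    s2 = d1 ^ d3 ^ d4 ^ p2
    s3 = d2 ^ d3 ^ d4 ^ p3
    # Each nonzero syndrome pattern implicates exactly one bit position.
    pos = {(1, 1, 0): 0, (1, 0, 1): 1, (0, 1, 1): 2, (1, 1, 1): 3,
           (1, 0, 0): 4, (0, 1, 0): 5, (0, 0, 1): 6}
    if (s1, s2, s3) != (0, 0, 0):
        r[pos[(s1, s2, s3)]] ^= 1
    return r[:4]
```

Flipping any one of the seven bits of a codeword leaves the decoded data unchanged — exactly the single-error-correcting property the book develops.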

Hirsch M.J., Pardalos P.M., Murphey R. Dynamics of Information Systems: Theory and Applications

  • format pdf
  • size 9.26 MB
  • added 29 August 2011
Springer, 2010. - 371 p. "Dynamics of Information Systems" presents state-of-the-art research explaining the importance of information in the evolution of a distributed or networked system. This book presents techniques for measuring the value or significance of information within the context of a system. Each chapter reveals a unique topic or perspective from experts in this exciting area of research. These newly developed techniques have numer...

Kullback S. Information Theory and Statistics

  • format pdf
  • size 15.7 MB
  • added 16 January 2011
Peter Smith, 1978. - 399 pages. Highly useful text studies the logarithmic measures of information and their application to testing statistical hypotheses. Topics include introduction and definition of measures of information, their relationship to Fisher’s information measure and sufficiency, fundamental inequalities of information theory, much more. Numerous worked examples and problems.
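The "logarithmic measures of information" at the heart of Kullback's approach center on what is now called the Kullback–Leibler divergence. A minimal sketch (our own Python, not the book's notation):

```python
import math

def kl_divergence(p, q):
    """D(p || q) = sum_i p_i * log2(p_i / q_i), in bits.
    Measures how distinguishable distribution p is from q; zero iff p == q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

fair = [0.5, 0.5]
biased = [0.9, 0.1]
print(kl_divergence(fair, fair))    # 0.0
print(kl_divergence(fair, biased))  # ≈ 0.737 bits
```

Note that the divergence is not symmetric — D(p || q) and D(q || p) generally differ — which is one of the fundamental properties examined in the book's inequalities of information theory.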

Neubauer A., Freudenberger J., Kuhn V. Coding theory: algorithms, architectures and applications

  • format pdf
  • size 1.83 MB
  • added 07 September 2011
The present book provides a concise overview of channel coding theory and practice as well as the accompanying algorithms, architectures and applications. The selection of the topics presented in this book is oriented towards those subjects that are relevant for information and communication systems in use today or in the near future. The focus is on those aspects of coding theory that are important for the understanding of these systems. This bo...

Pierce J.R. An Introduction to Information Theory

  • format djvu
  • size 2.29 MB
  • added 15 March 2011
Dover Publications, 1980. - 320 pages. Covers encoding and binary digits, entropy, language and meaning, efficient encoding and the noisy channel, and explores ways in which information theory relates to physics, cybernetics, psychology and art.

Thomas M. Cover, Joy A. Thomas. Elements of Information Theory

  • format pdf
  • size 10.09 MB
  • added 01 December 2009
2nd ed. John Wiley & Sons, Inc. This book has arisen from over ten years of lectures in a two-quarter sequence of a senior and first-year graduate-level course in information theory, and is intended as an introduction to information theory for students of communication theory, computer science, and statistics. The first quarter might cover Chapters 1 to 9, which includes the asymptotic equipartition property, data compression, and channel...