Computer Science and Computer Engineering
  • format pdf
  • size 949.53 KB
  • added August 26, 2011
Chaitin G.J. Algorithmic information theory
Publisher: Cambridge University Press
ISBN: 0521343062
Pages: 236
Publication Date: April 2, 2003

Book Excerpts:

The aim of this book is to present the strongest possible version of Gödel's incompleteness theorem, using an information-theoretic approach based on the size of computer programs.
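
Roughly stated (a paraphrase of the information-theoretic form of the result, not the book's exact wording), a formal theory T whose axioms have program-size complexity H(T) can prove lower bounds on complexity only up to an additive constant:

    T \vdash H(s) > n \;\Longrightarrow\; n < H(T) + c,

where c depends on the theory T but not on the string s.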

One half of the book is concerned with studying Omega, the halting probability of a universal computer if its program is chosen by tossing a coin. The other half of the book is concerned with encoding Omega as an algebraic equation in integers, a so-called exponential diophantine equation.
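
Concretely, with programs required to be self-delimiting so that the sum converges, the halting probability is

    \Omega \;=\; \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|},

where U is the universal computer and |p| is the length of program p in bits (standard notation, not necessarily the book's).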

Gödel's original proof of his incompleteness theorem is essentially the assertion that one cannot always prove that a program will fail to halt. This is equivalent to asking whether it ever produces any output. He then converts this into an arithmetical assertion. Over the years this has been improved; it follows from the work on Hilbert's 10th problem that Gödel's theorem is equivalent to the assertion that one cannot always prove that a diophantine equation has no solutions if this is the case.

This book approaches incompleteness by asking whether or not a program produces an infinite amount of output rather than asking whether it produces any; this is equivalent to asking whether or not a diophantine equation has infinitely many solutions instead of asking whether or not it is solvable.

Although the ideas in this book are not easy, the material is presented in as concrete and direct a fashion as possible, with many examples and computer programs for key algorithms. In particular, the theory of program size in LISP presented in Chapter 5 and Appendix B, which has not appeared elsewhere, is intended as an illustration of the more abstract ideas in the following chapters.
See also

Barg Alexander. Complexity Issues in Coding Theory

  • format pdf
  • size 20.42 KB
  • added August 26, 2011
Pages: 115. Publication Date: Oct 1997. Document Summary: This paper deals with complexity issues in the theory of linear error-correcting codes. The algorithmic problems studied here are constructing good codes and encoding and decoding them. According to their complexity, problems are divided into easy, i.e., polynomial in the length n of the code, and difficult, i.e., exponential ones. The first part deals with easy problems. This paper...

Chaitin G.J. Information Randomness and Incompleteness. Papers on Algorithmic Information Theory

  • format pdf
  • size 1.76 MB
  • added October 18, 2011
Publisher: World Scientific, 1997. 534 pp. God not only plays dice in quantum mechanics, but even with the whole numbers! The discovery of randomness in arithmetic is presented in my book Algorithmic Information Theory published by Cambridge University Press. There I show that to decide if an algebraic equation in integers has finitely or infinitely many solutions is in some cases absolutely intractable. I exhibit an infinite series of such a...

Chaitin G.J. The Limits of Mathematics

  • format djvu
  • size 847.99 KB
  • added March 22, 2011
Springer, 1997. - 147 pages. This book is the final version of a course on algorithmic information theory and the epistemology of mathematics and physics. It discusses Einstein and Goedel's views on the nature of mathematics in the light of information theory, and sustains the thesis that mathematics is quasi-empirical. There is a foreword by Cris Calude of the University of Auckland, and supplementary material is available at the author's web s...

Cover T.M., Thomas J.A. Elements of Information Theory

  • format pdf
  • size 34.35 MB
  • added January 6, 2012
Wiley series in telecommunications. John Wiley & Sons, Inc., 1991. – 563 pages. This is intended to be a simple and accessible book on information theory. As Einstein said, "Everything should be made as simple as possible, but no simpler." This point of view drives our development throughout the book. We were drawn to the field of information theory from backgrounds in communication theory, probability theory and statistics, because of the...

Desurvire E. Classical and Quantum Information Theory: An Introduction for the Telecom Scientist

  • format pdf
  • size 4.74 MB
  • added November 1, 2011
Cambridge University Press, 2009, 691 pages. Information theory lies at the heart of modern technology, underpinning all communications, networking, and data storage systems. This book sets out, for the first time, a complete overview of both classical and quantum information theory. Throughout, the reader is introduced to key results without becoming lost in mathematical details. Opening chapters present the basic concepts and various applicati...

Gray R.M. Entropy and Information Theory

  • format pdf
  • size 2.58 MB
  • added April 7, 2011
2nd Edition. Springer, 2011. 409 p. ISBN: 1441979697. This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, as well as information and distortion measures and their properties. New in this edition: Expanded treatment of stationary or sliding-block codes and their relations to...

Hamming R.W. Coding and Information Theory

  • format djvu
  • size 1.71 MB
  • added April 17, 2011
Prentice Hall, 1986. - 272 pages. Preface to the First Edition: This book combines the fields of coding and information theory in a natural way. They are both theories about the representation of abstract symbols. The two fields are now each so vast that only the elements can be presented in a short book. Information theory is usually thought of as "sending information from here to there" (transmission of information), but this is exactly the...

Kullback S. Information Theory and Statistics

  • format pdf
  • size 15.7 MB
  • added January 16, 2011
Peter Smith, 1978. - 399 pages. A highly useful text that studies the logarithmic measures of information and their application to testing statistical hypotheses. Topics include the introduction and definition of measures of information, their relationship to Fisher's information measure and sufficiency, the fundamental inequalities of information theory, and much more. Numerous worked examples and problems.
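
The logarithmic measure at the centre of the book is what is now usually called the Kullback–Leibler divergence (relative entropy) between two distributions p and q:

    D(p \,\|\, q) \;=\; \sum_{x} p(x) \log \frac{p(x)}{q(x)}.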

Sommaruga G (ed.) Formal Theories of Information. From Shannon to Semantic Information Theory and General Concepts of Information

  • format pdf
  • size 2.63 MB
  • added December 6, 2011
Publisher: Springer, 2009. 274 pp. It is commonly assumed that computers process information. But what is information? In a technical, important, but nevertheless rather narrow sense, Shannon's information theory gives a first answer to this question. This theory focuses on measuring the information content of a message. Essentially this measure is the reduction of the uncertainty obtained by receiving a message. The uncertainty of a situatio...
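
In Shannon's formulation, the uncertainty of a situation with outcomes x of probability p(x) is the entropy, and the information a received message Y conveys about a source X is the resulting reduction of uncertainty, the mutual information:

    H(X) = -\sum_{x} p(x) \log p(x), \qquad I(X;Y) = H(X) - H(X \mid Y).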

Thomas M. Cover, Joy A. Thomas. Elements of Information Theory

  • format pdf
  • size 10.09 MB
  • added December 1, 2009
2nd ed. John Wiley & Sons, Inc. This book has arisen from over ten years of lectures in a two-quarter sequence of a senior and first-year graduate-level course in information theory, and is intended as an introduction to information theory for students of communication theory, computer science, and statistics. The first quarter might cover Chapters 1 to 9, which includes the asymptotic equipartition property, data compression, and channel...