Computer Science and Computer Engineering
  • format: pdf
  • size: 2.63 MB
  • added: 6 December 2011
Sommaruga G (ed.) Formal Theories of Information. From Shannon to Semantic Information Theory and General Concepts of Information
Springer, 2009. 274 pp.

It is commonly assumed that computers process information. But what is information? In a technical, important, but nevertheless rather narrow sense, Shannon’s information theory gives a first answer to this question. This theory focuses on measuring the information content of a message: essentially, this measure is the reduction of uncertainty obtained by receiving the message, where the uncertainty of a state of ignorance is in turn measured by entropy. This theory has had an immense impact on the technology of information storage, data compression, information transmission and coding, and it remains a very active domain of research.
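As a minimal illustration of Shannon's measure (not taken from the volume itself), the entropy of a discrete distribution, and hence the uncertainty removed by learning an outcome, can be computed directly from the probabilities:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(p) = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries maximal uncertainty over two outcomes: exactly 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each outcome conveys less information.
print(entropy([0.9, 0.1]))   # ~0.469 bits
```

Receiving a message then "pays off" exactly this many bits of removed uncertainty on average, which is the sense in which Shannon's measure quantifies information content.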
Shannon’s theory has also attracted much interest from a more philosophical perspective on information, although it was readily remarked that it is only a syntactic theory of information and neglects semantic issues. Several attempts have been made in philosophy to give information theory a semantic flavor, though these are still mostly based on, or at least linked to, Shannon’s theory. Approaches to semantic information theory also very often make use of formal logic. Thereby, information is linked to reasoning, deduction and inference, as well as to decision making. Further, entropy and related measures were soon found to have important connotations with regard to statistical inference. Indeed, statistical data and observations represent information about unknown, hidden parameters. Thus a whole branch of statistics developed around concepts of Shannon’s information theory or concepts derived from them. In addition, measures proper to statistics, such as Fisher’s information, were proposed.
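The bridge from entropy to statistical inference mentioned above runs through relative entropy (Kullback–Leibler divergence), the expected extra description length paid when data from a true distribution p is modeled by q. A small sketch, not drawn from the volume:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits: the average penalty, in code
    length, for encoding data from p with a code optimized for q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]   # true distribution
q = [0.9, 0.1]   # mismatched model
print(kl_divergence(p, q))   # > 0: the model mismatch costs extra bits
print(kl_divergence(p, p))   # 0.0: no penalty for the true distribution
```

Note that D(p || q) is asymmetric and is not a metric; its role in hypothesis testing and estimation is precisely the subject of Kullback's book listed below.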
Algorithmic information theory, introduced by Kolmogorov, Solomonoff and Chaitin, provides a new look at the concept of information. It is again basically a theory of measuring information content. Here the information content of an information object, for instance a binary string, is measured by the length of the shortest program which computes that object; the approach is based on Turing machines. A main result of this theory is the clarification of the concepts of randomness and probability. It is therefore not too surprising that algorithmic information theory reproduces Shannon’s results, although in a rather different context.
Not too long ago it was noted that information is related to questions: information represents answers to such questions. It was also remarked that pieces of information shed light on a given context, and that this information might be transported through channels to other contexts. Questions, and the information related to questions, were considered by Groenendijk and Stokhof. Further, the flow of information between different contexts was studied by Barwise and Seligman. From quite a different point of view, similar issues were captured by the Fribourg school, which introduced the concept of information algebras. Pieces of information come from different sources, concern different questions, and can be combined or aggregated and focused on the questions of interest. These algebraic structures also provide a rigorous foundation for a theory of uncertain information based on probability theory. Furthermore, they offer sufficient conditions for efficient generic methods of inference covering diverse domains such as relational databases, probability networks, logic systems, constraint programming, discrete transforms and many more.
Under the title Information and Knowledge, research groups of the Computer Science departments of the universities of Berne, Fribourg and Neuchâtel collaborated over several years on issues of logic, probability, inference and deduction. Given the different approaches to the concept of information and its basic nature, one of the traditional Muenchenwiler seminars, in May 2006, was devoted to an exchange of views between experts from the different schools mentioned above. The goal was to examine whether there is some common ground between these different formal theories. The contributions of the invited participants (with the exception of Robert van Rooij, who was invited to contribute afterwards) are collected in this volume. The volume editor, Giovanni Sommaruga, discusses the question of whether there are one or several concepts of information, as a first attempt to summarize the results of the seminar. It is up to the reader to continue in the direction of a possible unification of the different theories.
As the organizer of the May 2006 Muenchenwiler seminar, I would like to thank the authors for their participation in the seminar, their contributions to this volume and the patience they had to exercise during the editing process. My sincere thanks go to the editor of this volume, Giovanni Sommaruga, for all the work this implied and especially for his effort to compare the different approaches to information in search of a common thread. Thanks to Cris Calude for establishing the contacts with Springer for the publication of the volume. I am grateful to Cesar Schneuwly for the final typesetting preparations. Finally, I thank the Swiss National Science Foundation for supporting several research projects on the subjects of logic and probability and of information and knowledge, as well as the Swiss Confederation, which supported the collaboration project between the universities of Berne, Fribourg and Neuchâtel under the title of deduction and inference. The Muenchenwiler seminar of May 2006, like many others, and the present volume are fruits of this encouragement.

Philosophical Conceptions of Information
Information Theory, Relative Entropy and Statistics
Information: The Algorithmic Paradigm
Information Algebra
Uncertain Information
Comparing Questions and Answers: A Bit of Logic, a Bit of Language, and Some Bits of Information
Channels: From Logic to Probability
Modeling Real Reasoning
One or Many Concepts of Information?
See also

Ash R. Information Theory

  • format: djvu
  • size: 3.32 MB
  • added: 12 October 2011
Dover Publications, 1965. 345 pp. Statistical communication theory is generally regarded as having been founded by Shannon (1948) and Wiener (1949), who conceived of the communication situation as one in which a signal chosen from a specified class is to be transmitted through a channel, but the output of the channel is not determined by the input. Instead, the channel is described statistically by giving a probability dist...

Chaitin G.J. The Limits of Mathematics

  • format: djvu
  • size: 847.99 KB
  • added: 22 March 2011
Springer, 1997. 147 pages. This book is the final version of a course on algorithmic information theory and the epistemology of mathematics and physics. It discusses Einstein's and Gödel's views on the nature of mathematics in the light of information theory, and sustains the thesis that mathematics is quasi-empirical. There is a foreword by Cris Calude of the University of Auckland, and supplementary material is available at the author's web s...

Cover T.M., Thomas J.A. Elements of Information Theory

  • format: pdf
  • size: 34.35 MB
  • added: 6 January 2012
Wiley Series in Telecommunications. John Wiley & Sons, Inc., 1991. 563 pages. This is intended to be a simple and accessible book on information theory. As Einstein said, "Everything should be made as simple as possible, but no simpler." This point of view drives our development throughout the book. We were drawn to the field of information theory from backgrounds in communication theory, probability theory and statistics, because of the...

Desurvire E. Classical and Quantum Information Theory: An Introduction for the Telecom Scientist

  • format: pdf
  • size: 4.74 MB
  • added: 1 November 2011
Cambridge University Press, 2009. 691 pages. Information theory lies at the heart of modern technology, underpinning all communications, networking, and data storage systems. This book sets out, for the first time, a complete overview of both classical and quantum information theory. Throughout, the reader is introduced to key results without becoming lost in mathematical details. Opening chapters present the basic concepts and various applicati...

Gray R.M. Entropy and Information Theory

  • format: pdf
  • size: 2.58 MB
  • added: 7 April 2011
2nd Edition. Springer, 2011. 409 pp. ISBN: 1441979697. This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, as well as information and distortion measures and their properties. New in this edition: expanded treatment of stationary or sliding-block codes and their relations to...

Hamming R.W. Coding and Information Theory

  • format: djvu
  • size: 1.71 MB
  • added: 17 April 2011
Prentice Hall, 1986. - 272 pages. Preface to the First Edition: This book combines the fields of coding and information theory in a natural way. They are both theories about the representation of abstract symbols. The two fields are now each so vast that only the elements can be presented in a short book. Information theory is usually thought of as "sending information from here to there" (transmission of information), but this is exactly the...

Hirsch M.J., Pardalos P.M., Murphey R. Dynamics of Information Systems: Theory and Applications

  • format: pdf
  • size: 9.26 MB
  • added: 29 August 2011
Springer, 2010. 371 p. "Dynamics of Information Systems" presents state-of-the-art research explaining the importance of information in the evolution of a distributed or networked system. This book presents techniques for measuring the value or significance of information within the context of a system. Each chapter reveals a unique topic or perspective from experts in this exciting area of research. These newly developed techniques have numer...

Klir G.J. Uncertainty and Information. Foundations of Generalized Information Theory

  • format: pdf
  • size: 5.1 MB
  • added: 18 October 2011
John Wiley, 2006. 518 pp. The concepts of uncertainty and information studied in this book are tightly interconnected. Uncertainty is viewed as a manifestation of some information deficiency, while information is viewed as the capacity to reduce uncertainty. Whenever these restricted notions of uncertainty and information may be confused with their other connotations, it is useful to refer to them as information-based uncertainty...

Kullback S. Information Theory and Statistics

  • format: pdf
  • size: 15.7 MB
  • added: 16 January 2011
Peter Smith, 1978. 399 pages. This highly useful text studies the logarithmic measures of information and their application to testing statistical hypotheses. Topics include an introduction to and definition of measures of information, their relationship to Fisher's information measure and sufficiency, the fundamental inequalities of information theory, and much more. Numerous worked examples and problems.

Thomas M. Cover, Joy A. Thomas. Elements of information theory

  • format: pdf
  • size: 10.09 MB
  • added: 1 December 2009
2nd ed. John Wiley & Sons, Inc. This book has arisen from over ten years of lectures in a two-quarter sequence of senior and first-year graduate-level courses in information theory, and is intended as an introduction to information theory for students of communication theory, computer science, and statistics. The first quarter might cover Chapters 1 to 9, which include the asymptotic equipartition property, data compression, and channel...