Computer Science and Computing Technology
  • format: pdf
  • size: 5.1 MB
  • added: October 18, 2011
Klir G.J. Uncertainty and Information. Foundations of Generalized Information Theory
John Wiley, 2006, 518 pp.

The concepts of uncertainty and information studied in this book are tightly interconnected. Uncertainty is viewed as a manifestation of some information deficiency, while information is viewed as the capacity to reduce uncertainty. Whenever these restricted notions of uncertainty and information may be confused with their other connotations, it is useful to refer to them as information-based uncertainty and uncertainty-based information, respectively. The restricted notion of uncertainty-based information does not cover the full scope of the concept of information. For example, it does not fully capture our common-sense conception of information in human communication and cognition or the algorithmic conception of information. However, it does play an important role in dealing with the various problems associated with systems, as I already recognized in the late 1970s. It is this role of uncertainty-based information that motivated me to study it.
One of the insights emerging from systems science is the recognition that scientific knowledge is organized, by and large, in terms of systems of various types. In general, systems are viewed as relations among states of some variables. In each system, the relation is utilized, in a given purposeful way, for determining unknown states of some variables on the basis of known states of other variables. Systems may be constructed for various purposes, such as prediction, retrodiction, diagnosis, prescription, planning, and control. Unless the predictions, retrodictions, diagnoses, and so forth made by the system are unique, which is a rather rare case, we need to deal with predictive uncertainty, retrodictive uncertainty, diagnostic uncertainty, and the like. This respective uncertainty must be properly incorporated into the mathematical formalization of the system.
In the early 1990s, I introduced a research program under the name generalized information theory (GIT), whose objective is to study information-based uncertainty and uncertainty-based information in all their manifestations. This research program, motivated primarily by some fundamental issues emerging from the study of complex systems, was intended to expand classical information theory based on probability. As is well known, the latter emerged in 1948, when Claude Shannon established his measure of probabilistic uncertainty and information.
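Shannon's 1948 measure of probabilistic uncertainty, mentioned above, is easy to state concretely. As a minimal illustration (a sketch added for this annotation, not text from the book): the entropy of a discrete probability distribution, in bits, is H(p) = -Σ pᵢ log₂ pᵢ, with the convention 0 · log₂ 0 = 0.

```python
import math

def shannon_entropy(probs):
    """Shannon's measure of probabilistic uncertainty, in bits:
    H(p) = -sum(p_i * log2(p_i)), with 0*log2(0) taken as 0."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 4 outcomes carries 2 bits of uncertainty.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
# A skewed distribution over the same outcomes carries less.
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
# A certain outcome carries none.
print(shannon_entropy([1.0]))                      # 0.0
```

Information is then quantified as the reduction of this uncertainty, e.g., the difference in entropy before and after an observation.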
GIT expands classical information theory in two dimensions. In one dimension, additive probability measures, which are inherent in classical information theory, are expanded to various types of nonadditive measures. In the other dimension, the formalized language of classical set theory, within which probability measures are formalized, is expanded to more expressive formalized languages that are based on fuzzy sets of various types. As in classical information theory, uncertainty is the primary concept in GIT, and information is defined in terms of uncertainty reduction.
Each uncertainty theory that is recognizable within the expanded framework is characterized by: (a) a particular formalized language (classical or fuzzy); and (b) a generalized measure of some particular type (additive or nonadditive). The number of possible uncertainty theories that are subsumed under the research program of GIT is thus equal to the product of the number of recognized formalized languages and the number of recognized types of generalized measures. This number has been growing quite rapidly. The full development of any of these uncertainty theories requires that issues at each of the following four levels be adequately addressed: (1) the theory must be formalized in terms of appropriate axioms; (2) a calculus of the theory must be developed by which this type of uncertainty can be properly manipulated; (3) a justifiable way of measuring the amount of uncertainty (predictive, diagnostic, etc.) in any situation formalizable in the theory must be found; and (4) various methodological aspects of the theory must be developed.
GIT, as an ongoing research program, offers us a steadily growing inventory of distinct uncertainty theories, some of which are covered in this book. Two complementary features of these theories are significant. One is their great and steadily growing diversity. The other is their unity, which is manifested by properties that are invariant across the whole spectrum of uncertainty theories or, at least, within some broad classes of these theories. The growing diversity of uncertainty theories makes it increasingly realistic to find a theory whose assumptions are in harmony with each given application. Their unity allows us to work with all available theories as a whole, and to move from one theory to another as needed.
The principal aim of this book is to provide the reader with a comprehensive and in-depth overview of the two-dimensional framework by which the research in GIT has been guided, and to present the main results that have been obtained by this research. Also covered are the main features of two classical information theories. One of them, covered in Chapter 3, is based on the concept of probability. This classical theory is well known and is extensively covered in the literature. The other one, covered in Chapter 2, is based on the dual concepts of possibility and necessity. This classical theory is older and more fundamental, but it is considerably less visible and has often been incorrectly dismissed in the literature as a special case of the probability-based information theory. These two classical information theories, which are formally incomparable, are the roots from which distinct generalizations are obtained.
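To make the contrast between the two classical theories concrete (a sketch under the standard definitions, not text from the book): possibility-based uncertainty of a crisp set of alternatives is measured by the Hartley measure, which depends only on the number of possible alternatives, whereas probabilistic uncertainty (Shannon entropy) depends on the full distribution. The two coincide only for the uniform distribution, which is one way to see that neither theory subsumes the other.

```python
import math

def hartley_measure(n_alternatives):
    """Hartley measure H = log2(n): the uncertainty of knowing only
    that the true state lies in a set of n possible alternatives."""
    return math.log2(n_alternatives)

def shannon_entropy(probs):
    """Shannon entropy in bits, for comparison."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Knowing only that one of 8 alternatives is possible: 3 bits.
print(hartley_measure(8))                    # 3.0
# Shannon entropy matches Hartley only when the distribution is uniform.
print(shannon_entropy([1 / 8] * 8))          # 3.0
# A skewed distribution over the same 8 alternatives: far less than 3 bits,
# even though the set of possibilities is unchanged.
print(shannon_entropy([0.9] + [0.1 / 7] * 7))
```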

Classical Possibility-Based Uncertainty Theory
Classical Probability-Based Uncertainty Theory
Generalized Measures and Imprecise Probabilities
Special Theories of Imprecise Probabilities
Measures of Uncertainty and Information
Fuzzy Set Theory
Fuzzification of Uncertainty Theories
Methodological Issues
A Uniqueness of the U-Uncertainty
B Uniqueness of Generalized Hartley Measure in the Dempster–Shafer Theory
C Correctness of Algorithm 6.1
D Proper Range of Generalized Shannon Entropy
E Maximum of GSa in Section 6.9
F Glossary of Key Concepts
G Glossary of Symbols
See also

Chaitin G.J. The Limits of Mathematics

  • format: djvu
  • size: 847.99 KB
  • added: March 22, 2011
Springer, 1997. - 147 pages. This book is the final version of a course on algorithmic information theory and the epistemology of mathematics and physics. It discusses Einstein and Goedel's views on the nature of mathematics in the light of information theory, and sustains the thesis that mathematics is quasi-empirical. There is a foreword by Cris Calude of the University of Auckland, and supplementary material is available at the author's web s...

Cover T.M., Thomas J.A. Elements of Information Theory

  • format: pdf
  • size: 34.35 MB
  • added: January 6, 2012
Wiley series in telecommunications. John Wiley & Sons, Inc., 1991. – 563 pages. This is intended to be a simple and accessible book on information theory. As Einstein said, "Everything should be made as simple as possible, but no simpler." This point of view drives our development throughout the book. We were drawn to the field of information theory from backgrounds in communication theory, probability theory and statistics, because of the...

Desurvire E. Classical and Quantum Information Theory: An Introduction for the Telecom Scientist

  • format: pdf
  • size: 4.74 MB
  • added: November 1, 2011
Cambridge University Press, 2009, 691 pages. Information theory lies at the heart of modern technology, underpinning all communications, networking, and data storage systems. This book sets out, for the first time, a complete overview of both classical and quantum information theory. Throughout, the reader is introduced to key results without becoming lost in mathematical details. Opening chapters present the basic concepts and various applicati...

Gray R.M. Entropy and Information Theory

  • format: pdf
  • size: 2.58 MB
  • added: April 7, 2011
2nd Edition. Springer, 2011. 409 p. ISBN: 1441979697. This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, as well as information and distortion measures and their properties. New in this edition: Expanded treatment of stationary or sliding-block codes and their relations to...

Hamming R.W. Coding and Information Theory

  • format: djvu
  • size: 1.71 MB
  • added: April 17, 2011
Prentice Hall, 1986. - 272 pages. Preface to the First Edition: This book combines the fields of coding and information theory in a natural way. They are both theories about the representation of abstract symbols. The two fields are now each so vast that only the elements can be presented in a short book. Information theory is usually thought of as "sending information from here to there" (transmission of information), but this is exactly the...

Hirsch M.J., Pardalos P.M., Murphey R. Dynamics of Information Systems: Theory and Applications

  • format: pdf
  • size: 9.26 MB
  • added: August 29, 2011
Springer, 2010. - 371 p. "Dynamics of Information Systems" presents state-of-the-art research explaining the importance of information in the evolution of a distributed or networked system. This book presents techniques for measuring the value or significance of information within the context of a system. Each chapter reveals a unique topic or perspective from experts in this exciting area of research. These newly developed techniques have numer...

Kullback S. Information Theory and Statistics

  • format: pdf
  • size: 15.7 MB
  • added: January 16, 2011
Peter Smith, 1978. - 399 pages. Highly useful text studies the logarithmic measures of information and their application to testing statistical hypotheses. Topics include introduction and definition of measures of information, their relationship to Fisher’s information measure and sufficiency, fundamental inequalities of information theory, much more. Numerous worked examples and problems.

Neubauer A., Freudenberger J., Kuhn V. Coding theory: algorithms, architectures and applications

  • format: pdf
  • size: 1.83 MB
  • added: September 7, 2011
The present book provides a concise overview of channel coding theory and practice as well as the accompanying algorithms, architectures and applications. The selection of the topics presented in this book is oriented towards those subjects that are relevant for information and communication systems in use today or in the near future. The focus is on those aspects of coding theory that are important for the understanding of these systems. This bo...

Thomas M. Cover, Joy A. Thomas. Elements of Information Theory

  • format: pdf
  • size: 10.09 MB
  • added: December 1, 2009
2nd ed. John Wiley & Sons, Inc. This book has arisen from over ten years of lectures in a two-quarter sequence of a senior and first-year graduate-level course in information theory, and is intended as an introduction to information theory for students of communication theory, computer science, and statistics. The first quarter might cover Chapters 1 to 9, which includes the asymptotic equipartition property, data compression, and channel...

Wiegand T., Schwarz H. Source Coding: Part I of Fundamentals of Source and Video Coding

  • format: pdf
  • size: 2.9 MB
  • added: August 9, 2011
From the Foundations and Trends in Signal Processing series, NOWPress, 2010, 224 pp. A presentation of modern approaches to information coding. Digital media technologies have become an integral part of the way we create, communicate, and consume information. At the core of these technologies are source coding methods that are described in this monograph. Based on the fundamentals of information and rate distortion theory, the most relevant...