INFORMATION
Kinds of information
Counting-information is mathematical information
as defined by American mathematician and engi-
neer Claude Shannon (1916–2001) in a paper on
communication theory written in 1948. It has noth-
ing directly to do with meaning; rather it relates
solely to an arbitrary measure based upon the the-
ory of probability.
Meaning-information is information in the col-
loquial sense of knowledge. It is completely differ-
ent from Shannon’s concept of information; it is in-
terpretation-, language-, and culture-dependent.
Shaping-information denotes information as a
noun describing the action of giving form to
something. It is the oldest sense of the word, orig-
inating in the Latin verb informare, further re-
flected in current usage in the German in-
formieren and the French informer. In this sense,
one can speak of the “information” of a system
when one imposes constraints upon its degrees of
freedom, for example by giving content and struc-
ture to a spreadsheet.
Construed in these three ways, information
crosses boundaries between physics, culture, and
mind. In its modern, counting-information sense,
especially in the realm of information technology,
it seems to have taken on a life of its own, as if the
process of rendering things digitally had some in-
trinsic value apart from its use in conveying mean-
ing and enabling people to shape the world. As
with any new technology—the telephone, the tel-
evision, the motor car, the mobile phone—there is
a period during which fascination with the tech-
nology itself supplants the wisdom that governs its
use, but eventually the more important purposes
resume their ascendancy, and the technology once
again comes to be seen as no more than a tool.
The religious significance of the science of in-
formation is best understood in terms of the artic-
ulation of meaning and the establishment of a bal-
anced view of the place of information in human
life. That process is in full swing as digitization, the
Internet, global communication, and the dissolu-
tion of historical boundaries reshape how people
conceive of themselves and how they decide to
live their lives.
If technology is to serve rather than dictate
human needs, it is essential that people retain their
capacity to think creatively, that is, to generate
the ideas that give shape to the technology by in-
vesting it with significant meanings. Otherwise
human needs will increasingly be at the mercy of
the agendas of those individuals, corporations, and
nation-states that control the technology, and peo-
ple will be powerless to resist their influence by
giving expression to their own objectives. Articula-
tion of worthy religious goals is one contribution
that theology can make to the restoration of the
balance between creative thought and technologi-
cal power.
Counting-information
The mathematical concept of counting-information
is based upon binary arithmetic, on the ability to
distinguish between two states, typically repre-
sented as 0 and 1, in an electronic device. Each
such two-state digit is called a binary digit, or bit.
Combinations of these states allow data to be
encoded in strings, such as 01110101010, that can
be stored in two-state devices and transmitted
down communication channels. Electronic circuits
that can distinguish between only two states are
relatively easy to devise, although higher-state de-
vices are possible. The process of encoding facts
about the world in such binary strings is called
digitization, although any particular encoding is
arbitrary.
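The following sketch, in Python, illustrates one such
arbitrary encoding: each character of a short text is
mapped to an eight-bit pattern (here its ASCII code)
and the patterns are joined into a single string of 0s
and 1s. The example text and the helper name
bit_string are illustrative assumptions, not part of
any particular standard.

    def bit_string(text):
        # Write each character code as eight binary digits
        # and join the results into one bit string.
        return "".join(format(ord(ch), "08b") for ch in text)

    print(bit_string("Hi"))   # 0100100001101001

Any other mapping of characters to bit patterns would
serve equally well; the arbitrariness of the choice is
precisely the point made above.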
A string of n bits can exist in 2ⁿ different states
and so can represent 2ⁿ different symbols. For ex-
ample, when n = 3, the string can be 000, 001, 010,
011, 100, 101, 110, or 111. If a particular encoding
treats these strings as binary numbers, they repre-
sent 0, 1, 2, . . . , 7; another encoding might treat
them as a, b, ... , h. In the early years of comput-
ing it was thought that 256 different strings would
be sufficient to encode most common letters, num-
bers, and control codes. The number of bits re-
quired to store a given amount of data is therefore
usually measured in eight-bit units called bytes be-
cause of the number of different states of a single
byte (2⁸ = 256). Quantities of data are likewise
counted in powers of 2, so a kilobyte is 2¹⁰ = 1024
bytes; a
megabyte is 1024 kilobytes (1024K); and a gigabyte
is 1024 megabytes. Typical hard disks can now
store between 20 and 100 gigabytes.
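The arithmetic of this paragraph can be restated
directly in a short Python sketch; the variable names
below are illustrative assumptions, and only the
powers of two come from the text.

    n = 3
    states = 2 ** n                   # 2**3 = 8 possible strings
    print([format(i, "03b") for i in range(states)])
    # ['000', '001', '010', '011', '100', '101', '110', '111']

    byte = 2 ** 8                     # 256 states of a single byte
    kilobyte = 2 ** 10                # 1024 bytes
    megabyte = 1024 * kilobyte        # 1,048,576 bytes
    gigabyte = 1024 * megabyte        # 1,073,741,824 bytes
    print(byte, kilobyte, megabyte, gigabyte)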
The states of a binary system are typically called
0 and 1, True and False, or Yes and No. The system
itself is oblivious to these interpretations of the two
possible states of a bit, and it is helpful to distin-
guish between system states and interpretations of