$$\require{cancel}$$

# 10.24 Digital Information

We take in information about the world all the time. Reading a book, looking at a picture, listening to a conversation or a piece of music — all of these activities convey information. We have seen over the past few decades a revolution in technology that allows information to be transmitted around the world. Radios and telephones and televisions are ubiquitous appliances that send information from one place to another. This requires the conversion of light and sound into measurable amounts of information. In science, information that can be measured and counted is called digital information.

The most basic way to define information is in a two-way or binary sense: yes or no, on or off, 1 or 0, white or black, up or down. The fundamental unit of information is called a binary digit, or a bit. With many pieces of information, or bits, we can describe very complex aspects of the natural world.

Suppose you ask a friend to think of a number from 1 to 100. How many "yes or no" questions would you have to ask to determine the number? The smart way to play the game is to divide the number range in two each time. "Is it more than 50?" No. "Is it more than 25?" Yes. "Is it more than 37?" Yes. "Is it more than 43?" No. "Is it more than 40?" Yes. "Is it more than 41?" No. "Is it 41?" Yes! If you play this game by just guessing a number each time, it might take you close to 100 guesses (or you might get lucky). But you can always determine any number up to a hundred with seven guesses. Try it! If you think of it in terms of information, the number your friend knows — any number from 1 to 100 — can be defined by 7 yes-no questions, or 7 bits of information.
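The halving strategy described above is what computer scientists call a binary search. A minimal sketch in Python (the function name and range are illustrative, not from the text) counts how many yes/no questions it takes for any secret number:

```python
# A sketch of the guessing game: binary search over 1..100.
# Each yes/no question halves the remaining range, so at most
# 7 questions are ever needed (2**7 = 128 >= 100).

def questions_needed(secret, low=1, high=100):
    """Count the yes/no questions a halving strategy asks."""
    count = 0
    while low < high:
        mid = (low + high) // 2
        count += 1
        if secret > mid:        # "Is it more than mid?" -> yes
            low = mid + 1
        else:                   # "Is it more than mid?" -> no
            high = mid
    return count

print(max(questions_needed(n) for n in range(1, 101)))  # 7
```

No number from 1 to 100 ever requires more than 7 questions, which is exactly the 7 bits of information claimed above.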

In mathematical terms, the information content is the number of binary choices multiplied together. A single bit specifies 2 things, two bits specify 2 × 2 = 4 things, three bits specify 2 × 2 × 2 = 8 things, and so on. In general, with N bits of information,

$$\text{Information content} = 2^N$$

In the example of coding the numbers from 1 to 100, N = 7 bits gives an information content of $2^7 = 128$, so 7 bits allow us to easily specify the numbers from 1 to 100. This way of describing information uses a counting system with a base of two rather than the familiar decimal system with a base of ten. But it is easy to convert any decimal number into a binary number. Here's how the counting goes:
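The relationship between a range of numbers and the bits needed to encode it can be checked directly. This sketch finds the smallest N with $2^N \geq 100$:

```python
# How many bits to distinguish 100 possibilities?
# We need the smallest N with 2**N >= 100.
n = 100
bits = (n - 1).bit_length()   # = 7, since 2**7 = 128 >= 100 > 64 = 2**6
print(bits, 2**bits)
```

Python's built-in `int.bit_length` gives the number of binary digits needed to write a number, which is exactly this quantity.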

| Decimal | Binary |
|---------|--------|
| 0       | 0      |
| 1       | 1      |
| 2       | 10     |
| 3       | 11     |
| 4       | 100    |
| 5       | 101    |
| 6       | 110    |
| 7       | 111    |
| 8       | 1000   |
| 9       | 1001   |

and so on.
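The counting pattern above comes from repeatedly dividing by two and keeping the remainders. A minimal sketch (the function name is illustrative):

```python
# Decimal-to-binary conversion by repeated division by 2,
# mirroring the counting table above.

def to_binary(n):
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))   # remainder is the next binary digit
        n //= 2
    return "".join(reversed(digits))

for d in range(10):
    print(d, to_binary(d))
print(to_binary(41))  # 101001, the number from the guessing game
```

Python's built-in `bin(41)` gives the same answer, prefixed with `0b`.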

Binary counting is just like decimal counting with only two digits instead of ten. Each column is a higher power of the base; instead of ones ($10^0$), tens ($10^1$), hundreds ($10^2$), and thousands ($10^3$), we have ones ($2^0$), twos ($2^1$), fours ($2^2$), eights ($2^3$), and so on. You can see that a bit in the binary system is like a significant digit in the decimal system. The number from our guessing game can be written in the binary system as 101001 (32 + 8 + 1 = 41).
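The place-value reading of 101001 can be spelled out as a sum over powers of two:

```python
# Reading 101001 by place values: each column is a power of two,
# counted from the right starting at 2**0.
value = sum(int(bit) * 2**i for i, bit in enumerate(reversed("101001")))
print(value)  # 41, i.e. 32 + 8 + 1

# Python's base-2 string parsing agrees:
assert value == int("101001", 2)
```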

Why is the binary system the basis of the Information Age? In 1947, three researchers at Bell Labs discovered how to switch an electrical current on and off using a tiny silicon sandwich called a transistor. A computer has no fingers, so it doesn't count in a base of ten like we do. A computer contains many thousands of transistors, each of which can switch a current on or off millions of times a second. The on and off of the current is the 1 and 0 counting system of a computer. By doing binary operations at blinding speed, computers do most of the calculations of the modern world. Equally important is the fact that they can manipulate and transmit any information that can be converted into a binary form.

What is the information content of written language? There are 26 letters in the alphabet. With capitals and punctuation there are about 60 different items, so we can describe writing with 6 bits of information per character, since $2^6 = 64$. The average word has close to 5 letters, or 6 × 5 = 30 bits of information. A typical textbook or a longish novel has about 350,000 words in it, so the total information content is 30 × 350,000, or about 10 million bits. For historical reasons, computer switches are grouped in eights, so that a computer "word" is 8 bits, called a byte. Such a book contains about 10,500,000 / 8 ≈ 1,300,000 bytes, or 1.3 megabytes. It could easily fit on a flash storage device.
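The book estimate above is a chain of multiplications, which can be sketched directly (the numbers are the text's estimates, not measurements):

```python
# Rough information content of a book, following the estimates above.
bits_per_char = 6            # 2**6 = 64 covers letters, capitals, punctuation
chars_per_word = 5
words_per_book = 350_000

bits = bits_per_char * chars_per_word * words_per_book   # ~10.5 million bits
megabytes = bits / 8 / 1_000_000                         # 8 bits per byte
print(bits, round(megabytes, 2))
```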

Here is an example with astronomical relevance. What is the information content of an image? If each picture element, or pixel, can be black or white (on or off), then it contains one bit of information. As we increase the size of the array of pixels, the information content increases. With a 2 × 2 array (4 bits), not much information can be conveyed. However, with a 10 × 10 array (100 bits), we could make a visual display of a letter or several numbers. And with a 15 × 15 array (225 bits), we could transmit slightly more complex information. An image with only one bit per pixel (black or white) is very crude. There are no gradations of intensity, and color cannot be represented. With a range of 256 colors or 256 shades of grey, each pixel represents 8 bits of information ($2^8 = 256$). An array of 29 × 21 such pixels gives a total information content of 29 × 21 × 8 = 4872 bits. You can see how the ability to convey complex information increases with the number of bits.
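The image arithmetic above is simply pixels times bits per pixel; a small sketch (the function name is illustrative):

```python
# Information content of an image = number of pixels x bits per pixel.
def image_bits(width, height, bits_per_pixel=1):
    return width * height * bits_per_pixel

print(image_bits(10, 10))        # 100 bits: a black-and-white 10 x 10 array
print(image_bits(15, 15))        # 225 bits
print(image_bits(29, 21, 8))     # 4872 bits: 256 grey levels per pixel
```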

The CCD imaging devices used in astronomy cannot measure colors directly. The intensity is measured in filtered light of a specific color, and the colors are combined afterward. A CCD can measure very fine gradations of intensity by counting the electrons created when photons hit each pixel. A typical CCD can count up to 250,000 electrons. From the equation above, we can see that this requires 18 bits of information, because $2^{18} = 262{,}144$. So each pixel contains 18 bits of information. (Not all of these grades of intensity or bits are conveyed by the printing process of a book, but they were present in the original electronic images.) A typical CCD with an array of 8000 × 8000 pixels therefore creates an image with 8000 × 8000 × 18 ≈ 1 billion bits of information! Recall that a word has only 30 bits of information — a picture is worth a lot more than a thousand words.
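The CCD numbers can be checked with the same $2^N$ rule: the bits per pixel are the smallest N whose power of two reaches the electron count.

```python
import math

# Bits needed to record a pixel that can count up to 250,000 electrons:
# the smallest N with 2**N >= 250,000.
bits_per_pixel = math.ceil(math.log2(250_000))   # 18, since 2**18 = 262,144
total_bits = 8000 * 8000 * bits_per_pixel        # roughly a billion bits
print(bits_per_pixel, total_bits)
```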