
Countable

Are All Infinite Sets Countable?



The answer to this question was given by the German mathematician Georg Cantor, who first showed in 1874 that the set of numbers between 0 and 1 represented by infinite decimals is uncountable; the diagonal form of the proof described below dates from 1891. (To include the finite decimals, he converted them to infinite decimals, using the fact that a number such as 0.3 can be represented by the infinite decimal 0.2999..., where the 9s repeat forever.) He used a reductio ad absurdum proof, showing that the assumption that the set is countable leads to a contradiction. The assumption therefore has to be abandoned.
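The fact that 0.2999... and 0.3 name the same number follows from summing a geometric series; written out for this example (a standard check, not part of Cantor's argument as given here):

$$
0.2999\ldots = \frac{2}{10} + \sum_{k=2}^{\infty}\frac{9}{10^{k}} = \frac{2}{10} + \frac{9}{100}\cdot\frac{1}{1-\frac{1}{10}} = \frac{2}{10} + \frac{1}{10} = 0.3
$$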



He began by assuming that the set was countable, meaning that there was a way of listing its elements so that, reading down the list, one could reach any number in it. This can be illustrated with the example in Table 5.

From this list he constructs a new decimal. For its first digit he uses a 1, which is different from the first digit of the first number in the list. For its second digit he uses a 3, which is different from the second digit of the second number. For its third digit he uses a 1 again, since it differs from the third digit of the third number.

He continues in this fashion, making the n-th digit of his new number different from the n-th digit of the n-th number in the list. In this example he uses 1s and 3s, but any other digits would serve, as long as each differs from the n-th digit of the n-th number in the list. (He avoids 9s because the finite decimal 0.42 is the same number as the infinite decimal 0.4199999..., with the 9s repeating.) The number constructed in this way is 0.131131... Because it differs from each number in the list in at least one decimal place, it differs from every number in the supposedly complete list. If one chooses a number from a genuinely countable set and looks for it in a listing arranged to demonstrate that set's countability, one will find it after a finite number of steps. In this supposed listing, the constructed number will never turn up: check four million entries and it differs from them all in at least one of the first four million decimal places.
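The construction is mechanical enough to sketch in code. The following Python fragment is illustrative only: the six truncated decimals stand in for the entries of Table 5 (they are hypothetical, not Cantor's), and the digit rule is the one just described, 1 unless the diagonal digit is already 1, in which case 3.

```python
# A sketch of Cantor's diagonal construction. The list below is a
# hypothetical stand-in for Table 5 -- any enumeration of infinite
# decimals between 0 and 1 would do; each entry is truncated to
# ten digits for display.
decimals = [
    "3141592653",  # 1st digit is 3, so the new number's 1st digit is 1
    "7182818284",  # 2nd digit is 1, so the new number's 2nd digit is 3
    "5772156649",  # 3rd digit is 7, so the new number's 3rd digit is 1
    "4142135623",
    "8862155979",
    "1415926535",
]

def diagonal(rows):
    """Build a decimal that differs from the n-th row in its n-th digit.

    Uses 1 unless the diagonal digit is already 1, in which case 3;
    both choices avoid 9, sidestepping the 0.4199999... = 0.42
    ambiguity noted above.
    """
    digits = []
    for n, row in enumerate(rows):
        digits.append("3" if row[n] == "1" else "1")
    return "0." + "".join(digits)

print(diagonal(decimals))  # prints 0.131131
```

For a genuinely infinite list the loop would never finish, but that does not matter to the argument: any particular digit of the new number is fixed after finitely many steps, and that is enough to guarantee the mismatch with each listed number.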

Thus the assumption that the set of infinite decimals between 0 and 1 is countable is false, and the set must be uncountable. The infinitude of such a set is of a different kind from the infinitude of the natural numbers.
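For contrast, here is what a successful listing looks like for a set that is infinite but countable. The positive rational numbers can be enumerated so that every fraction appears at some finite position, which is exactly the property Cantor's diagonal number denies to the infinite decimals. The sketch below uses a standard zig-zag by numerator-plus-denominator; the function name and details are ours, not the article's.

```python
from fractions import Fraction
from itertools import islice

def rationals():
    """Enumerate all positive rationals, each at a finite position.

    Walks diagonals of constant numerator + denominator and skips
    duplicates such as 2/4 == 1/2, so every positive rational is
    reached after finitely many steps -- the definition of countable.
    """
    seen = set()
    total = 2  # numerator + denominator along the current diagonal
    while True:
        for num in range(1, total):
            q = Fraction(num, total - num)
            if q not in seen:
                seen.add(q)
                yield q
        total += 1

print([str(q) for q in islice(rationals(), 10)])
# ['1', '1/2', '2', '1/3', '3', '1/4', '2/3', '3/2', '4', '1/5']
```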




J. Paul Moulton

KEY TERMS


Denumerable—A synonym for "countable."

Uncountable—A term describing the number of elements in an infinite set whose elements cannot be put into one-to-one correspondence with the natural numbers.
