This is, as they say, the Information Age. A few days ago I posted a link to a graphic illustrating the exponential growth in data storage capacity, and before that I referenced the figures frequently cited to demonstrate the explosion of data and its trajectory. But it may be fair to ask what we mean by “information” or “data” anyway. James Gleick’s new book, The Information: A History, a Theory, a Flood, offers a historical frame of reference that helps us situate our present ideas about what exactly information is and what can be done with it.
Two recent reviews provide a helpful summary of the main themes. Nicholas Carr’s review, “Drowning in Beeps,” is at The Daily Beast, and Freeman Dyson’s lengthier review, “How We Know,” is in The New York Review of Books.
At the NYRB blog Gleick also wrote a short, engaging post, “The Information Palace,” examining the history of the word “information” via its entry in the Oxford English Dictionary. Here’s an excerpt:
It’s in the nineteenth century that we start to glimpse the modern sense of the word as a big category, a general thing encompassing news and facts, messages and announcements. The real turning point comes in 1948 when the Bell Labs mathematician and engineer Claude Shannon, in his landmark paper (later a book) “A Mathematical Theory of Communication,” made information, as the OED explains, “a mathematically defined quantity divorced from any concept of news or meaning …” We measure it in bits. We recognize it in words, sounds, images; we store it on the printed page and on polycarbonate discs engraved by lasers and in our genes. We are aware, more or less, that it defines our world.