When Jon McGregor was a guest on A Good Read, a BBC radio program on which two guests and the host discuss their book choices, he chose a book of short essays on plants, stars and animals. It was as if he wanted to make the point that he really does see poetry in everything.
Reservoir 13 is his fourth novel – and also the fourth one I have read. The book is about an English village where a thirteen-year-old girl, on holiday in the village, suddenly goes missing one day. The book describes the thirteen years following this incident.
Except that it is not really about the girl, or her disappearance. Though she is referred to regularly throughout the book, the fact that she remains 'the thirteen-year-old girl' while everyone else is getting older is a clever way to show the passing of time, which is the book's real theme.
In short snapshots – thirteen for each of the thirteen years the book spans; that is poetry too – one sees children grow up, couples grow closer together or drift apart, and annual traditions continue yet change a little every year. None of the characters receives special focus and none of the storylines is particularly interesting in itself; it is what they make together that is so beautiful.
I have never lived in a village myself, nor is it one of the life experiences I am sad to have missed, yet I do appreciate that there is something uniquely beautiful about small communities: a certain kind of poetry that is less noticeable in towns and cities.
Long before Samuel Morse invented the electric telegraph, people in West Africa could send messages over long distances using 'talking drums'. But while the telegraph requires human language to be encoded into 'dots' and 'dashes' (and spaces – Morse code isn't binary), the drums actually do talk: the high and low pitches of the drums correspond to high and low tones in the language. Whatever is lost in the lack of consonants and vowels is made up for by making the words longer, not dissimilar to how we sometimes say 'Alpha' or 'Bravo' instead of 'A' and 'B', or how digital storage often contains extra bits that allow errors to be detected and corrected.
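That last idea, adding redundant bits to catch errors, is easy to demonstrate. As a toy illustration (my own, not Gleick's), here is the simplest such scheme in Python: a single parity bit, which can detect – though not correct – any single flipped bit.

```python
def add_parity(bits):
    # Append one redundant bit so that the total number of 1s is even.
    return bits + [sum(bits) % 2]

def looks_corrupted(bits_with_parity):
    # An odd number of 1s means at least one bit flipped in transit.
    return sum(bits_with_parity) % 2 == 1

message = [1, 0, 1, 1]
sent = add_parity(message)     # [1, 0, 1, 1, 1]
received = sent.copy()
received[2] ^= 1               # simulate a single-bit transmission error
print(looks_corrupted(sent))      # False
print(looks_corrupted(received))  # True
```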
Few would consider talking drums part of IT, yet they are a technology for transmitting information. I think it is thus fitting that James Gleick opens his truly fascinating book on information ("The Information: A History, a Theory, a Flood") with a chapter on these drums.
After talking drums, Gleick goes on to discuss early alphabetically ordered dictionaries (a concept unique to languages that actually use alphabets), Charles Babbage's mechanical computing engines, the mechanical and electrical telegraphs (which allowed information to travel faster than any human messenger, even for people who didn't have talking drums) and the telephone. But throughout the book, the technology is made subordinate to information itself, which is implicitly treated as a philosophical concept.
The true hero of the book, 'father of information theory' Claude Shannon (1916-2001), makes his first appearance about halfway through. Shannon did important work on early computers and on code breaking (he cooperated with his British contemporary Alan Turing during the Second World War), but most importantly, he invented the 'bit' as the basic unit of information.
Shannon defined the bit not as the result of counting the number of zeros and ones used to store a particular message, but as the amount of information actually contained in that message. Anyone who has ever used a program like WinZip to compress a text file, or who has ever had to rephrase a message to fit into 140 (now 280) characters, knows that a message often contains less information than the space used to store it. Though the exact amount of information is in most cases impossible to measure accurately, Shannon estimated that average English text contains about 50% redundancy.
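Some of that redundancy can even be glimpsed with a few lines of code. As a rough sketch (my own, not from the book), the snippet below estimates the first-order Shannon entropy of a text from single-character frequencies alone; Shannon's 50% figure also accounts for longer-range structure, such as which letters tend to follow which, so this captures only part of the redundancy.

```python
import math
from collections import Counter

def entropy_per_char(text):
    # First-order Shannon entropy in bits per character:
    # H = -sum(p * log2(p)) over the observed character frequencies p.
    counts = Counter(text)
    total = len(text)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

sample = "anyone who has ever zipped a text file knows it shrinks considerably"
h = entropy_per_char(sample)
print(f"{h:.2f} bits per character")  # noticeably below the maximum
print(f"{math.log2(27):.2f} bits if all 26 letters and the space were equally likely")
```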
With the appearance of Shannon, things really took off for the information age. Computers came along, and at around the same time DNA was discovered: the hard drive contained in every cell of every living organism. A full chapter of the book is dedicated to memes, a concept from evolutionary biology, and to specific pieces of information 'going viral'.
Again, Gleick's book focuses less on "how does it work?" and more on "what does it mean?", hence logicians like Bertrand Russell and Kurt Gödel feature prominently; I was intrigued to learn how the paradoxes studied by the former (for example the Berry paradox) are very relevant to information theory. Important but surprisingly hard-to-grasp notions like randomness and entropy, often passed over briefly in more practically focused books, are also discussed extensively.
The book ends with a chapter on information overload, and it was something of a relief to learn that concerns about more information not necessarily being better do in fact have a very long history.
Before that, Gleick casts a glimpse into the future of computing and the possible arrival of quantum computers. As a mathematician, I always feel a bit uncomfortable about the possibility that future computers may be based on physics rather than mathematics, and I hold on to the belief of some experts that practical quantum computers will never see the light of day. Conceptually, however, such computers are truly fascinating. And that, of course, is where Gleick's focus lies.