Calling James Gleick a “science writer” feels undernourished. His works are always written with style and a sense of wonder without burying the reader in scientific detail. The Information is a tour de force introduction to something we now take for granted: you guessed it, information.
From how African drum patterns encode information to Claude Shannon to Wikipedia, with many stops in between, Gleick’s book is a great survey. It introduces more issues than it resolves but hasn’t lost any relevance since its publication in 2011. The sections on Charles Babbage and Ada Lovelace were tremendously engaging, and even better were the ruminations on the Oxford English Dictionary. The contest over mathematical formalism between Gödel and Russell would have made a riveting stand-alone article. I could say the same about many of the chapters, whose clear expositions stand out for such a difficult subject. (A rule of thumb for the mid-century mathematical giants of information theory: Claude Shannon is the code maker, Alan Turing the code breaker.)
The ninth chapter is on entropy, which, counter-intuitively, turns out to be the defining feature of information theory. This is the philosophic highlight, the one that has me rummaging through footnotes to take the next steps of understanding. (A good next step? Gödel, Escher, Bach.)
What is the relationship between data and wisdom? How are we to deal with the “total noise” (in David Foster Wallace’s words) overwhelming us? Search and filtering are what the final chapters offer, but there is another book in those questions, one I hope Gleick, an entrepreneur as well as a writer, will take up in a future work.
My rule of thumb for a four-star rating is that I want to immediately re-read the book and hit its sources as well. Gleick clears that hurdle with room to spare.