Monday, August 16th, 2010 | Author: Konrad Voelkel
I just finished Decoding the Universe - How the new science of information is explaining everything in the cosmos, from our brains to black holes, written in 2006 by the former mathematics student and now associate professor of journalism Charles Seife, apparently well known for his other books Zero and Alpha & Omega (which I didn't read).
The book is on Google Books, and I wrote a much shorter review in German on Amazon.de.
Overall, this is an edutainment book I would recommend to anyone who is remotely interested in either relativity theory, black holes, quantum mechanics, theories of everything or the nature of life.
Depending on your previous knowledge of physics, you'll read it either very fast or at your usual literature speed. It doesn't contain any mathematics beyond talking about binary digits (0 and 1). In contrast to many other books, the passages about concepts I knew very well weren't boring but were written in a good expository way that will enable me to explain the concepts better to others in the future. Each chapter contains some historical remarks and anecdotes (and not only the most commonly known stories). The passages explaining concepts that were new to me did so very well, and I didn't have the feeling of missing anything (a problem I had with some parts of Penrose's Road to Reality before I learnt the math elsewhere).
Perhaps I should stress that the ideas promoted in the book are fairly standard by now, and there is not much debate about them in the scientific community. It does not discuss string theory vs. loop quantum gravity or any other crackpot-magnet debate. You can expect to get a solid education from the text (at least most of it; see my discussion of chapter 9 below).
Now let's see what the individual chapters are about:
Introduces codes and code-breaking (cryptography) in the context of the second world war, then uses this to illustrate redundancy in human languages. Contains some anecdotes about Turing.
First discusses Lavoisier's caloric theory in chemistry, then its problems, the industrial revolution and Carnot's theory of perfect (reversible) heat engines. The second law of thermodynamics (entropy increases) is presented, although entropy isn't mentioned explicitly. Some stories about Boltzmann and Maxwell are told. The Bell curve (of a Gaussian distribution) is explained (with pictures and a throwing-marbles-in-a-box analogy). Then entropy is defined and its connection to time (reversibility) is stressed. Maxwell's daemon is introduced.
Carnot engines are explained particularly well; you can't get them wrong from this exposition.
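Since the chapter revolves around Carnot's result, here is a minimal sketch of the key formula (my own illustration, not from the book): a reversible engine between a hot and a cold reservoir can convert at most a fixed fraction of the absorbed heat into work.

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum efficiency of any heat engine between two reservoirs.

    Temperatures in kelvin; Carnot's bound is eta = 1 - T_cold / T_hot.
    """
    if not (0 < t_cold < t_hot):
        raise ValueError("need 0 < t_cold < t_hot (kelvin)")
    return 1.0 - t_cold / t_hot

# An engine running between 500 K steam and a 300 K environment can
# convert at most 40% of the absorbed heat into work, however cleverly
# it is built; a larger temperature gap always raises the bound.
print(carnot_efficiency(500.0, 300.0))
```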
The history of information theory around Shannon. Definition of a bit (binary digit). Illustration of the importance of information with a story about Paul Revere and the American Revolutionary War. Then the relation between entropy and information is explained in depth. Brillouin's connection of thermodynamic entropy to Shannon's information-theoretic entropy and Landauer's work on the limits of computation (the cost of erasure) are presented, which ultimately leads to an explanation of why Maxwell's daemon is impossible.
The historical remarks about the importance of communication during the Revolutionary War and the discussion of the impossibility of Maxwell's daemon were enlightening.
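To make Shannon's entropy-information connection concrete, here is a small sketch (mine, not the book's) of the entropy of a message, i.e. the average number of bits per symbol:

```python
from collections import Counter
from math import log2

def shannon_entropy(text):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits per symbol."""
    counts = Counter(text)
    n = len(text)
    return sum(-(c / n) * log2(c / n) for c in counts.values())

# A fair-coin sequence carries one full bit per symbol; a constant
# message carries none -- it is perfectly predictable (redundant).
print(shannon_entropy("01100101"))  # 1.0
print(shannon_entropy("aaaaaaaa"))  # 0.0
```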
This chapter starts with a discussion of Schrödinger's lecture "What is life?" and continues with a discussion of the genetic code, DNA computers and the relation between evolution and information. The story of the Lemba Jews in Zimbabwe is told.
This chapter really takes its time to convince you that DNA is a storage device for information and can be used in a Turing machine device, so every cell is a computer of some kind.
5. Faster than light
Einstein and some history, special relativity and the double-slit experiment (wave-particle duality). The Michelson-Morley experiment. The spear-in-a-barn paradox. Quantum tunnelling.
This chapter might be a little boring if you already know some special relativity.
More on wave-particle duality, self-interference and superposition. Schrödinger vs. Heisenberg and Schrödinger's cat. Entanglement and the Einstein-Podolsky-Rosen (EPR) experiment. Gisin's experiments in Geneva.
While the historical remarks and the explanation of Schrödinger's cat are pretty similar to other books on the topic, I very much enjoyed the discussion of Gisin's experiments and "spooky action at a distance".
7. Quantum information
Qubits, more on Schrödinger's cat, quantum computing, Shor's and Grover's algorithm. The quantum Zeno effect. Measurements undertaken by Nature and the Casimir effect (vacuum fluctuations). Decoherence. A new axiom: "Information can be neither created nor destroyed".
The notion of qubits is explained with the example of Schrödinger's cat and then the cat is debunked via decoherence occurring due to vacuum fluctuations. Nevertheless, decoherence doesn't get enough room for a solid understanding.
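For readers who want the qubit picture slightly more concrete than the book makes it, here is a minimal sketch (my own, not the book's): a qubit state is a pair of amplitudes, and measurement probabilities are their squared magnitudes (the Born rule).

```python
import math

def measurement_probabilities(alpha, beta):
    """Probabilities of measuring 0 or 1 for the state alpha|0> + beta|1>."""
    norm = abs(alpha) ** 2 + abs(beta) ** 2
    if abs(norm - 1.0) > 1e-9:
        raise ValueError("amplitudes must be normalised")
    return abs(alpha) ** 2, abs(beta) ** 2

# The equal superposition -- Schroedinger's cat in miniature: before
# measurement both outcomes coexist; measurement yields each with p = 1/2.
cat = (1 / math.sqrt(2), 1 / math.sqrt(2))
p_alive, p_dead = measurement_probabilities(*cat)
print(p_alive, p_dead)  # about 0.5 each
```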
More on spooky action at a distance and Gisin's experiments. Quantum teleportation and causality. Black holes, the no-hair theorem and Hawking radiation. Black holes as computers.
The author clearly favours the axiom that information cannot be destroyed and lays out what you need to understand to follow the next chapter's ideas on information preservation in black holes. The passages on black holes as computers are interesting but slightly misleading, because it's more a metaphor than a precise concept.
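To put a rough number on the "black holes as computers" metaphor: the Bekenstein-Hawking formula makes a black hole's entropy proportional to its horizon area. A back-of-the-envelope sketch of my own, using approximate values for the physical constants:

```python
import math

# Bekenstein-Hawking entropy: S/k = A / (4 * Planck area), so the
# information capacity in bits is that number divided by ln(2).
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J s

def black_hole_bits(mass_kg):
    """Information capacity (in bits) of a Schwarzschild black hole."""
    r_s = 2 * G * mass_kg / c**2       # Schwarzschild radius
    area = 4 * math.pi * r_s**2        # horizon area
    planck_area = G * hbar / c**3
    return area / (4 * planck_area) / math.log(2)

solar_mass = 1.989e30  # kg
print(f"{black_hole_bits(solar_mass):.2e}")  # roughly 1e77 bits
```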
More on black holes and their entropy. The holographic principle. Discussion about the infinite universe and some version of the infinite monkey theorem, leading to a many-worlds theory. The Copenhagen interpretation and a many-worlds interpretation of quantum physics. The end of all life due to the second law of thermodynamics.
The holographic principle deserves more room than it is given in this chapter. The argument for many worlds from an infinite universe is not very clear to me (I think it's just wrong). The many-worlds interpretation of quantum physics is explained very nicely, although there is not a single word on the philosophy of science (which, in my opinion, dictates abandoning unfalsifiable theories...).
Appendices and bibliography
The first appendix explains logarithms and why the choice of base is not that important. The second explains entropy in Shannon's sense in more detail. The bibliography contains a lot of helpful references (including some arXiv papers) but sadly, they aren't referenced explicitly throughout the text.
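The appendix's point about logarithm bases is easy to verify yourself: by the change-of-base identity, switching bases only multiplies every entropy by a constant, so measuring information in bits or in natural units changes nothing essential. A quick sketch (mine, not the appendix's):

```python
import math

def log_base(x, base):
    """log_b(x) via the change-of-base identity log_b(x) = ln(x) / ln(b)."""
    return math.log(x) / math.log(base)

x = 1024.0
bits = log_base(x, 2)        # information measured in bits
nats = log_base(x, math.e)   # the same information in natural units
# The two measures differ only by the constant factor ln(2):
print(bits, nats, nats / bits)
```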
Conclusion and recommendation
It is a pleasant-to-read survey of applications and instances of information theory, and I guess for most people who are slightly interested in any of the topics mentioned, it would at least help them choose their next book. It's not even expensive!