Questions in Information Theory III: Statistical Physics, Quantum Physics and Thermodynamics

Thursday, October 28th, 2010 | Author:

See also: Questions part I - Information and Entropy
Questions part II - Complexity and Algorithmic Complexity

Questions part III - Statistical Physics, Quantum Physics and Thermodynamics [GM94] [Pen05]

  1. Can there be some kind of Maxwell’s demon, and why (not)?
  2. Can all analog information be transformed to digital information losslessly?
    Is physical information always digital (digital philosophy)?
  3. Does the second law of thermodynamics depend on a human observer?
  4. Can quantum mechanical entanglement be reduced to mutual information? [NC00] [DH00] [PV06] [BP07]
  5. What is the mathematical content of Heisenberg’s uncertainty relation, and what is its physical content?

    “I think I can safely say that nobody understands quantum mechanics.” – Richard Phillips Feynman, 1965

  6. Is there a method to denote and calculate, for an open physical system, how the information content changes in time when the system’s dynamics are known?
  7. Is there an important difference between information loss into small scales and information loss into microscopic degrees of freedom?
  8. Where do (thermal, quantum, vacuum, ?) fluctuations come from? [Lam97] [Lam98]
    What do they change on the macroscopic scale?
    Are gravitational fluctuations possible?
  9. According to the fluctuation-dissipation theorem, going back to Einstein, do thermal fluctuations exactly compensate for the information lost through dissipation? [CW51]
  10. Is probability theory powerful enough to capture any micro-physical phenomena?
    Is mathematical probability theory the correct language for modern physics? [Mil04]

    “In fact the smallest units of matter are not physical objects in the ordinary sense; they are forms, ideas which can be expressed unambiguously only in mathematical language.” – Werner Heisenberg, 1958

  11. How is Zeilinger’s concept of elementary systems generalizable to general state spaces?
  12. Given the fact that every system is continually measured, is the concept of a closed quantum system (with unitary time evolution) relevant for real physics? [And97]
    Does decoherence by interference with the background universe render the concept of closed quantum systems obsolete? [BTV09]
  13. How does randomness in quantum measurement emerge from unitary evolution?
    Is quantum physics truly information-preserving?
  14. How relevant is the classical concept of degree of freedom for quantum mechanics?
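
As a companion to question 5: the mathematical content of the uncertainty relation is easy to state. The general (Robertson) inequality holds for any two observables, and the familiar position-momentum relation is its special case; what the question leaves open is the physical interpretation. A compact statement:

```latex
% Robertson's inequality for any two observables A and B:
\sigma_A \,\sigma_B \;\geq\; \frac{1}{2}\,\bigl|\langle [A,B] \rangle\bigr|

% With the canonical commutator [x,p] = i\hbar, this specializes
% to Heisenberg's relation for position and momentum:
\sigma_x \,\sigma_p \;\geq\; \frac{\hbar}{2}
```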


  • [And97] Philip W. Anderson, Is measurement itself an emergent property?, Complexity 3 (1997), no. 1, 14–16.
  • [BP07] S.L. Braunstein and A.K. Pati, Quantum information cannot be completely hidden in correlations: Implications for the black-hole information paradox, Physical Review Letters 98 (2007).
  • [BTV09] Buchleitner, Tiersch, and Viviescas, Entanglement and decoherence, Lecture Notes in Physics, Springer, 2009.
  • [CW51] Herbert B. Callen and Theodore A. Welton, Irreversibility and generalized noise, Phys. Rev. 83 (1951), no. 1, 34–40.
  • [DH00] D. Deutsch and P. Hayden, Information flow in entangled quantum systems, Proc. R. Soc. Lond. A 456 (2000), no. 1999, 1759–1774.
  • [GM94] M. Gell-Mann, The quark and the jaguar, W. H. Freeman, New York, 1994.
  • [Lam97] S. K. Lamoreaux, Demonstration of the Casimir force in the 0.6 to 6 µm range, Phys. Rev. Lett. 78 (1997), no. 1, 5–8.
  • [Lam98] S. K. Lamoreaux, Erratum: Demonstration of the Casimir force in the 0.6 to 6 µm range [Phys. Rev. Lett. 78, 5 (1997)], Phys. Rev. Lett. 81 (1998), no. 24, 5475–5476.
  • [Mil04] David Miller, Probability generalizes logic, 2004.
  • [NC00] M. A. Nielsen and I. L. Chuang, Quantum computation and quantum information, Cambridge University Press, 2000.
  • [Pen05] Roger Penrose, The road to reality: A complete guide to the laws of the universe, 3 ed., Knopf, 2005.
  • [PV06] Martin B. Plenio and S. Virmani, An introduction to entanglement measures, arXiv, Jun 2006.

Category: English, Questions in Information Theory | Comments off

Questions in Information Theory II: Complexity and Algorithmic Complexity

Wednesday, October 13th, 2010 | Author:

See also: Questions part I - Information and Entropy

Questions part II - Complexity and Algorithmic Complexity [LV93]

  1. Can we capture the concepts of simplicity, Occam’s razor and complexity by the notion of algorithmic complexity? [LV93] [She09]

    “What is simplicity? Simplicity is the shortest path to a solution.” – Ward Cunningham, 2004

    “To repeat, we consider a computer program to be a theory for its output, that is the essential idea, and both theory and output are finite strings of bits whose size can be compared. And the best theory is the smallest program that produces that data, that precise output. That’s our version of what some people call Occam’s razor.” – Gregory Chaitin, 2008

  2. What is the relation between algorithmic complexity and entropy? [LV93] [BS10]
  3. How is semantics (e.g. of the number pi) related to algorithmic complexity?
  4. How does complexity arise from syntactic information? [MGG01]
  5. Do different degrees of complexity correspond to qualitative differences in cognitive/reactive behaviour?
  6. Is the Church-Turing hypothesis true and if so, does it matter?
    Is neural computation fundamentally distinct from electronic computation?
    Is hypercomputation possible? [TS02] [BA04] [Dav06]

    “A man provided with paper, pencil, and rubber, and subject to strict discipline, is in effect a universal machine.” – Alan Turing, 1948

  7. How can one define computability without referring to Turing machines?
  8. P vs. NP: is P=NP or not? [For09]
    Would P=NP itself, or a proof either way, have practical benefits?
  9. Is the concept of a non-deterministic Turing machine implementable by natural computing just like deterministic Turing machines are implementable by home computers?
  10. Would it be a good idea to adapt the concept of non-deterministic Turing machines to real non-deterministic systems?
  11. Can we tell for a general physical system whether it is a Turing machine? [Zus70]

    “All processes, whether they are produced by human effort or occur spontaneously in nature, can be viewed as computations.” – Stephen Wolfram, 2002

  12. Can a Turing machine simulate itself?
  13. Does classical computer science “apply” to quantum computation? [Pre01] [Svo05] [BBC98]
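
Question 2 asks how algorithmic complexity relates to entropy. One concrete handle on this: the algorithmic (Kolmogorov) complexity of a string is uncomputable, but the length of any compressed encoding is a computable upper bound on it, up to an additive constant. As a small illustration (zlib here is just a stand-in for a universal description method, not part of the theory):

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    # The length of a compressed encoding is a computable upper bound
    # on the (uncomputable) Kolmogorov complexity, up to a constant.
    return len(zlib.compress(data, level=9))

regular = b"ab" * 5000    # 10,000 bytes with a very short description
noisy = os.urandom(10000) # 10,000 bytes, incompressible with high probability

print(compressed_size(regular))  # a few dozen bytes
print(compressed_size(noisy))    # close to (or slightly above) 10,000 bytes
```

A highly regular string compresses to almost nothing, while a random one does not; this mirrors the connection between low entropy sources and short descriptions.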


Category: English, Questions in Information Theory | Comments off

Questions in Information Theory I: Information and Entropy

Saturday, September 18th, 2010 | Author:

During a workshop, we developed some rather broad questions surrounding the concepts of information and complexity in the sciences, especially looking at quantum physics, digital philosophy and philosophy of science. This is spiced up with some more metaphysical questions and some rants by well-known scientists, to provoke the reader’s imagination. References to the literature are given as a first starting point to develop answers to the questions.
Comments, answers and even more questions are very welcome.

Questions part I - Information and Entropy [CT91]

  1. Is it possible to define a unifying universal notion of information, applicable to all sciences? [SW63]
    Can we convert different notions of information and entropy? [Khi57]
    Is the mathematical definition of the Kullback-Leibler divergence the key to understanding different kinds of information? [KL51]

    “In fact, what we mean by information – the elementary unit of information – is a difference which makes a difference, and it is able to make a difference because the neural pathways along which it travels and is continually transformed are themselves provided with energy.” – Gregory Bateson, 1972

  2. Is it possible to define an absolute (non-relative, total) notion of information? [GML96]
    Can we talk about the total information content of the universe? [Llo02]
  3. Where does information go when it is erased?
    Is erasure of information possible at all? [Hit01]
  4. Does entropy emerge from irreversibility (in general)?
  5. Does the concept of asymmetry (group theory) significantly clarify the concept of information?
  6. What happens to information during computation?
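
The Kullback-Leibler divergence mentioned in question 1 can be made concrete in a few lines. A minimal sketch for discrete distributions (the asymmetry of the result is exactly why calling it a “distance” is a slight misnomer):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits, for two discrete
    distributions given as lists of probabilities over the same outcomes."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin versus a biased one:
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # ≈ 0.737 bits
print(kl_divergence(q, p))  # ≈ 0.531 bits: D is not symmetric
```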


UPDATE 2010-10-13: added links to the references

Category: English, Questions in Information Theory | Comments off

2nd Workshop on Personal Knowledge Management

Sunday, September 12th, 2010 | Author:

Today I'm attending the second Workshop on Personal Knowledge Management (PKM2010) at the Human-Computer Interaction (HCI) conference "Mensch und Computer" in Duisburg (Germany).

I have absolutely no idea what to expect, so I expect to be surprised.

UPDATE: Now that the workshop is almost over (coffee break right now), maybe the most important for me:

It was fun!

Continue reading «2nd Workshop on Personal Knowledge Management»

Category: English | One Comment


Tuesday, August 24th, 2010 | Author:

I am currently in Rot an der Rot at a seminar on the role of information theory in the natural sciences. My talk took place today; it was the fourth of twenty and covered conditional entropy, together with the necessary prerequisites from discrete probability theory, the concept of entropy (information) itself, and numerous applications.

I presented with slides (on a projector); the talk took about 90 minutes and went a bit too fast towards the end. Before distilling the actual talk, I had also prepared about 90 backup slides with considerably more information. The slides were made with the LaTeX beamer class.

The talk slides (about 35, optimized for presenting) are available here, and
the much more extensive version (about 90, crammed with text) is available here.

Both come without any warranty, of course, although (to my knowledge) I have corrected all (2) errors that the audience found.

My sources were the papers by Shannon and Weaver, Brillouin's book (Science and Information Theory), and my old stochastics notes from the Vordiplom (together with the introductory book by Hans-Otto Georgii, which I still like to use for reference). Some of the graphics come from Wikipedia; others I drew myself (and contributed to Wikipedia as public domain). I made my drawings with Inkscape.

Viel Spaß damit!

Category: German, Mathematics | Comments off

Review – Seife: Decoding the Universe

Monday, August 16th, 2010 | Author:

I just finished Decoding the Universe - How the new science of information is explaining everything in the cosmos, from our brains to black holes, written in 2006 by the former mathematics student and now associate professor of journalism Charles Seife, apparently well known for his other books Zero and Alpha&Omega (which I haven't read).
The book in Google Books and a much shorter review I wrote in German on

Continue reading «Review – Seife: Decoding the Universe»

Category: English | One Comment