9 Questions on Information Theory

Thursday, February 13th, 2014

Back in 2010 I ran a series of posts on questions in information theory that arose from a two-week seminar with students from various scientific disciplines (a wonderful event!). Here are the ones I still find particularly compelling:

  1. Is the mathematical definition of the Kullback-Leibler distance the key to understanding different kinds of information?
  2. Can we talk about the total information content of the universe?
  3. Is hypercomputation possible?
  4. Can we tell, for a given physical system, whether it is a Turing machine?
  5. Given that every system is continually being measured, is the concept of a closed quantum system (with unitary time evolution) relevant to real physics?
  6. Can we create or measure truly random numbers in nature, and how would we recognize that?
  7. Would it make sense to adapt the notion of real numbers to a limited (but not fixed) amount of memory?
  8. Can causality be defined without reference to time?
  9. Should we re-define “life”, using information-theoretic terms?

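On question 1, it is worth noting that the Kullback-Leibler "distance" is, strictly speaking, a divergence: it is non-negative and zero only for identical distributions, but it is not symmetric. A minimal sketch for discrete distributions (the probability values below are arbitrary illustrations):

```python
import math

def kl_divergence(p, q):
    """D_KL(p || q) in bits for two discrete distributions,
    given as sequences of probabilities over the same outcomes.
    Terms with p_i = 0 contribute nothing by convention."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin versus a biased coin:
fair = [0.5, 0.5]
biased = [0.9, 0.1]

print(kl_divergence(fair, biased))  # positive
print(kl_divergence(biased, fair))  # positive, but a different value
```

The two print statements give different values, which is exactly why "divergence" rather than "distance" is the standard term: the quantity fails the symmetry axiom of a metric.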
What do you think?


Category: English, Questions in Information Theory
