Questions in Information Theory I: Information and Entropy

Saturday, September 18th, 2010

During a workshop, we developed some rather broad questions surrounding the concepts of information and complexity in the sciences, with a focus on quantum physics, digital philosophy and the philosophy of science. These are spiced up with some more metaphysical questions and some rants by well-known scientists, to provoke the reader's imagination. References to the literature are given as starting points for developing answers to the questions.
Comments, answers and even more questions are very welcome.

Questions Part I: Information and Entropy [CT91]

  1. Is it possible to define a unifying universal notion of information, applicable to all sciences? [SW63]
    Can we convert different notions of information and entropy into one another? [Khi57]
    Is the mathematical definition of the Kullback-Leibler divergence (traditionally called a distance, though it is not one) the key to understanding different kinds of information? [KL51] (See the formula after this list.)

    “In fact, what we mean by information – the elementary unit of information – is a difference which makes a difference, and it is able to make a difference because the neural pathways along which it travels and is continually transformed are themselves provided with energy.” – Gregory Bateson, 1972

  2. Is it possible to define an absolute (non-relative, total) notion of information? [GML96]
    Can we talk about the total information content of the universe? [Llo02] (A rough estimate follows after this list.)
  3. Where does information go when it is erased?
    Is erasure of information possible at all? [Hit01] (See the Landauer sketch after this list.)
  4. Does entropy emerge from irreversibility (in general)?
  5. Does the concept of asymmetry (group theory) significantly clarify the concept of information?
  6. What happens to information during computation? (See the reversible-gate example after this list.)
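
To make the last sub-question of item 1 concrete: the Kullback-Leibler divergence [KL51] between two probability distributions P and Q over the same finite alphabet is

```latex
D(P \,\|\, Q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}
```

Despite the traditional name "distance", it is neither symmetric nor does it satisfy the triangle inequality. It does, however, tie several notions of information together: D(P||Q) is the expected number of extra bits needed to encode samples from P with a code optimized for Q, and it decomposes as cross-entropy minus Shannon entropy, D(P||Q) = H(P, Q) - H(P).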
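
For item 2, [Llo02] arrives at an order-of-magnitude answer. Here is a minimal sketch of that style of estimate, assuming the Margolus-Levitin bound (a system with average energy E performs at most 2E/(pi*hbar) elementary operations per second) and deliberately rough figures for the energy and age of the observable universe; the inputs are assumptions for illustration, not measured values.

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s

# Rough, assumed inputs (orders of magnitude only):
E = 1e70    # energy of matter within the horizon, J (~10^53 kg times c^2)
t = 4.3e17  # age of the universe, s (~13.7 billion years)

# Margolus-Levitin bound: at most 2E/(pi*hbar) operations per second.
ops_per_second = 2 * E / (math.pi * HBAR)
total_ops = ops_per_second * t

print(f"~10^{math.log10(total_ops):.0f} elementary operations")
# Roughly 10^121, in the same ballpark as the ~10^120 figure of [Llo02].
```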
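
For item 3, the standard physical handle is Landauer's principle: erasing one bit dissipates at least k_B T ln 2 of heat, so the information is not so much destroyed as pushed into thermal degrees of freedom of the environment. A quick back-of-envelope computation of that bound at room temperature (a sketch added here for orientation, not taken from the references below):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer bound: minimum heat dissipated when one bit is erased.
energy_per_bit = K_B * T * math.log(2)
print(f"{energy_per_bit:.2e} J per erased bit")  # about 2.87e-21 J
```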
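
For item 6, one concrete observation: an irreversible gate such as AND discards information (distinct inputs collapse onto the same output), while a reversible gate such as the Toffoli gate permutes the input states and therefore preserves all of it. A small sketch that checks both claims by brute force:

```python
from itertools import product

# Irreversible: AND maps 4 input pairs onto only 2 distinct outputs.
and_map = {(a, b): a & b for a, b in product((0, 1), repeat=2)}
print(len(and_map), "inputs ->", len(set(and_map.values())), "outputs")

# Reversible: Toffoli flips the target bit c iff both control bits are 1.
def toffoli(a, b, c):
    return (a, b, c ^ (a & b))

images = {toffoli(*bits) for bits in product((0, 1), repeat=3)}
print("Toffoli is a bijection on 8 states:", len(images) == 8)
```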

References

[KL51] S. Kullback and R. A. Leibler, "On Information and Sufficiency", Annals of Mathematical Statistics 22(1), pp. 79-86, 1951.
[Khi57] A. I. Khinchin, Mathematical Foundations of Information Theory, Dover, 1957.
[SW63] C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, 1963.
[CT91] T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, 1991.
[GML96] M. Gell-Mann and S. Lloyd, "Information Measures, Effective Complexity, and Total Information", Complexity 2(1), pp. 44-52, 1996.
[Llo02] S. Lloyd, "Computational Capacity of the Universe", Physical Review Letters 88, 237901, 2002.

UPDATE 2010-10-13: added links to the references

