Thursday, February 13th, 2014 | Author: Konrad Voelkel
Back in 2010 I ran a series of posts about questions in information theory that arose from a two-week seminar with students from various scientific disciplines (a wonderful event!). Here are the ones I still find particularly compelling:
- Is the mathematical definition of Kullback-Leibler distance the key to understanding different kinds of information?
- Can we talk about the total information content of the universe?
- Is hypercomputation possible?
- Can we tell for a physical system whether it is a Turing machine?
- Given the fact that every system is continually measured, is the concept of a closed quantum system (with unitary time evolution) relevant for real physics?
- Can we create or measure truly random numbers in nature, and how would we recognize that?
- Would it make sense to adapt the notion of real numbers to a limited (but not fixed) amount of memory?
- Can causality be defined without reference to time?
- Should we re-define “life”, using information-theoretic terms?
What do you think?
Saturday, November 27th, 2010 | Author: Konrad Voelkel
See also: Questions part I - Information and Entropy
Questions part II - Complexity and Algorithmic Complexity
Questions part III - Statistical Physics, Quantum Physics and Thermodynamics
Questions part IV - Philosophy of Science
Questions part V - Life and Metaphysics [Sch68]
- Is nature deterministic?
- Can causality be defined without reference to time? [BLMS87] [Sua01]
- How is it possible that semantic information emerges from purely syntactic information? [BLHL+01]
- Is there an inherent tendency in evolution to accumulate relevant information on the real world?
Is there an inherent tendency in evolution to increase the complexity of organisms and the biosphere as a whole?
“Humanity is now experiencing history’s most difficult evolutionary transformation.” – Buckminster Fuller, 1983
- Why are robustness and simplicity good and applicable criteria to describe nature (with causal networks)? [Jen03]
- Should we re-define “life”, using information-theoretic terms?
- What do Gödel’s theorems imply for information and complexity theory? [Cha82]
Is there an analogy between emergence and true but unprovable statements? [Bin08]
- Are there limits of self-prediction in individuals and societies?
“The human brain is incapable of creating anything which is really complex.” – Andrey Nikolaevich Kolmogorov, 1990
- What is the answer to life, the universe and everything?
“There is a theory which states that if ever anyone discovers exactly what the Universe is for and why it is here, it will instantly disappear and be replaced by something even more bizarre and inexplicable. There is another which states that this has already happened.” – Douglas Adams, 1980
Friday, November 12th, 2010 | Author: Konrad Voelkel
See also: Questions part I - Information and Entropy
Questions part II - Complexity and Algorithmic Complexity
Questions part III - Statistical Physics, Quantum Physics and Thermodynamics
Questions part IV - Philosophy of Science [Pop34] [Kuh62] [Fey75] [Mil09]
- Does the point of view of information theory provide anything new in the sciences? [GM94]
Does information theory provide a new paradigm in the sciences? [Sei07]
- Is quantum information the key to unify general relativity and quantum theory?
Is information theory a guiding principle for a “theory of everything”?
“I think there is a need for something completely new. Something that is too different, too unexpected, to be accepted as yet.” – Anton Zeilinger, 2004
- (Why) are real discoveries possible in mathematics and other structural/formal sciences? [Bor07]
- Can we create or measure truly random numbers in nature?
How would we recognize random numbers? (a sketch of one statistical test follows this list)
What is a random number (or a random string of digits)?
“Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin. For, as has been pointed out several times, there is no such thing as a random number — there are only methods to produce random numbers, and a strict arithmetic procedure of course is not such a method.” – John von Neumann, 1951
- What is semantic information, what is meaning in science?
What do we expect from an “explanation”?
“The Tao that can be told is not the eternal Tao.” – Lǎozǐ, 4th century B.C.
- How do the concepts “truth” and “laws of nature” fit together? [Dav01] [Car94]
- Does it make sense to use linguistic terminology in the natural sciences? [Gad75]
- Should physicists try to interpret quantum physics at all? [Dir42]
- Would it make sense to adapt the notion of real numbers to a limited amount of memory?
Can we build a theory of physics upon intuitionistic logic?
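To make “how would we recognize random numbers?” concrete: no finite battery of statistical tests can certify randomness; a failed test can only refute it. Below is a minimal sketch of the monobit (frequency) test in the spirit of the NIST test suite; the significance level and the toy inputs are my own illustrative choices, not part of the original questions.

```python
import math

def monobit_test(bits: str, alpha: float = 0.01) -> bool:
    """NIST-style frequency (monobit) test.

    Passing (True) only means the 0/1 balance is consistent with
    randomness; it never proves randomness. Failing refutes it at
    significance level alpha.
    """
    n = len(bits)
    s = sum(1 if b == "1" else -1 for b in bits)    # endpoint of a +1/-1 walk
    p_value = math.erfc(abs(s) / math.sqrt(2 * n))  # two-sided tail probability
    return p_value >= alpha

print(monobit_test("10" * 500))  # perfectly balanced: True
print(monobit_test("1" * 1000))  # all ones: False
```

The von Neumann quote above is the flip side: a pseudo-random generator can pass every such test while being, by construction, not random at all.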
Thursday, October 28th, 2010 | Author: Konrad Voelkel
See also: Questions part I - Information and Entropy
Questions part II - Complexity and Algorithmic Complexity
Questions part III - Statistical Physics, Quantum Physics and Thermodynamics [GM94] [Pen05]
- Can there be some kind of Maxwell’s demon, and why (not)?
- Can all analog information be transformed to digital information losslessly?
Is physical information always digital (digital philosophy)?
- Does the second law of thermodynamics depend on a human observer?
- Can quantum mechanical entanglement be reduced to mutual information (see the formulas after this list)? [NC00] [DH00] [PV06] [BP07]
- What is the mathematical, what is the physical content of Heisenberg’s uncertainty relation?
“I think I can safely say that nobody understands quantum mechanics.” – Richard Phillips Feynman, 1965
- Is there a method to denote and calculate, for an open physical system, how the information content changes in time when the system’s dynamics are known?
- Is there an important difference between information loss into small scales and information loss into microscopic degrees of freedom?
- Where do (thermal, quantum, vacuum, ?) fluctuations come from? [Lam97] [Lam98]
What do they change on the macroscopic scale?
Are gravitational fluctuations possible?
- According to the fluctuation-dissipation theorem, do thermal fluctuations compensate exactly for information loss through dissipation? [CW51]
- Is probability theory powerful enough to capture any micro-physical phenomena?
Is mathematical probability theory the correct language for modern physics? [Mil04]
“In fact the smallest units of matter are not physical objects in the ordinary sense; they are forms, ideas which can be expressed unambiguously only in mathematical language.” – Werner Heisenberg, 1992
- How is Zeilinger’s concept of elementary systems generalizable to general state spaces?
- Given the fact that every system is continually measured, is the concept of a closed quantum system (with unitary time evolution) relevant for real physics? [And97]
Does decoherence by interference with the background universe render the concept of closed quantum systems obsolete? [BTV09]
- How does randomness in quantum measurement emerge from unitary evolution?
Is quantum physics truly information-preserving?
- How relevant is the classical concept of degree of freedom for quantum mechanics?
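Two textbook formulas behind the entanglement and uncertainty questions above (cf. [NC00]): writing $S(\rho) = -\operatorname{Tr}(\rho \log \rho)$ for the von Neumann entropy, the quantum mutual information of a bipartite state $\rho_{AB}$ is

$$I(A{:}B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}),$$

and Robertson’s general form of the uncertainty relation for observables $A$, $B$ in a state $\psi$ is

$$\sigma_A \, \sigma_B \;\geq\; \frac{1}{2} \bigl| \langle \psi | [A,B] | \psi \rangle \bigr|,$$

which specializes to Heisenberg’s $\Delta x \, \Delta p \geq \hbar/2$ for position and momentum. Note that $I(A{:}B)$ measures classical and quantum correlations together, which is exactly why reducing entanglement to mutual information is delicate [PV06].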
References
- [And97] Philip W. Anderson, Is measurement itself an emergent property?, Complex. 3 (1997), no. 1, 14–16.
- [BP07] S.L. Braunstein and A.K. Pati, Quantum information cannot be completely hidden in correlations: Implications for the black-hole information paradox, Physical Review Letters 98 (2007).
- [BTV09] Buchleitner, Tiersch, and Viviescas, Entanglement and decoherence, Lecture Notes in Physics, Springer, 2009.
- [CW51] Herbert B. Callen and Theodore A. Welton, Irreversibility and generalized noise, Phys. Rev. 83 (1951), no. 1, 34–40.
- [DH00] D. Deutsch and P. Hayden, Information flow in entangled quantum systems, Proceedings: Mathematics, Physical and Engineering Sciences 456 (2000), no. 1999, 1759–1774.
- [GM94] M. Gell-Mann, The quark and the jaguar, Freeman, New York, 1994.
- [Lam97] S. K. Lamoreaux, Demonstration of the Casimir force in the 0.6 to 6 µm range, Physical Review Letters 78 (1997), no. 1, 5–8.
- [Lam98] S. K. Lamoreaux, Erratum: Demonstration of the Casimir force in the 0.6 to 6 µm range [Phys. Rev. Lett. 78, 5 (1997)], Phys. Rev. Lett. 81 (1998), no. 24, 5475–5476.
- [Mil04] David Miller, Probability generalizes logic, 2004.
- [NC00] Michael A. Nielsen and Isaac L. Chuang, Quantum computation and quantum information, Cambridge University Press, 2000.
- [Pen05] Roger Penrose, The road to reality: A complete guide to the laws of the universe, 3rd ed., Knopf, 2005.
- [PV06] Martin B. Plenio and S. Virmani, An introduction to entanglement measures, arXiv, June 2006.
Wednesday, October 13th, 2010 | Author: Konrad Voelkel
See also: Questions part I - Information and Entropy
Questions part II - Complexity and Algorithmic Complexity [LV93]
- Can we capture the concepts of simplicity, Occam’s razor and complexity by the notion of algorithmic complexity (see the sketch after this list)? [LV93] [She09]
“What is simplicity? Simplicity is the shortest path to a solution.” – Ward Cunningham, 2004
“To repeat, we consider a computer program to be a theory for its output, that is the essential idea, and both theory and output are finite strings of bits whose size can be compared. And the best theory is the smallest program that produces that data, that precise output. That’s our version of what some people call Occam’s razor.” – Gregory Chaitin, 2008
- What is the relation between algorithmic complexity and entropy? [LV93] [BS10]
- How is semantics (e.g. of the number pi) related to algorithmic complexity?
- How does complexity arise from syntactic information? [MGG01]
- Do different degrees of complexity correspond to qualitative differences in cognitive/reactive behaviour?
- Is the Church-Turing hypothesis true and if so, does it matter?
Is neural computation fundamentally distinct from electronic computation?
Is hypercomputation possible? [TS02] [BA04] [Dav06]
“A man provided with paper, pencil, and rubber, and subject to strict discipline, is in effect a universal machine.” – Alan Turing, 1948
- How can one define computability without referring to Turing machines?
- P vs. NP: is P=NP or not? [For09]
Would P=NP itself, or an actual proof of it, have practical benefits?
- Is the concept of a non-deterministic Turing machine implementable by natural computing just like deterministic Turing machines are implementable by home computers?
- Would it be a good idea to adapt the concept of non-deterministic Turing machines to real non-deterministic systems?
- Can we tell for a general physical system whether it is a Turing machine? [Zus70]
“All processes, whether they are produced by human effort or occur spontaneously in nature, can be viewed as computations.” – Stephen Wolfram, 2002
- Can a Turing machine simulate itself?
- Does classical computer science “apply” to quantum computation? [Pre01] [Svo05] [BBC98]
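As a concrete handle on the first question above: the Kolmogorov complexity $K(x)$ of a string $x$, i.e. the length of a shortest program that outputs $x$ on a fixed universal machine [LV93], is uncomputable, but any compressor gives an upper bound, which already lets one experiment with “simplicity” numerically. A minimal sketch; the choice of zlib and the two example strings are mine, purely illustrative:

```python
import random
import zlib

def compressed_size(s: bytes) -> int:
    """Length of the zlib-compressed input: a crude, compressor-dependent
    upper bound on Kolmogorov complexity (up to an additive constant)."""
    return len(zlib.compress(s, 9))

simple = b"01" * 5000  # highly regular: admits a very short description
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(10000))  # pseudo-random bytes

print(compressed_size(simple))  # small: zlib finds the regularity
print(compressed_size(noisy))   # close to 10000: almost incompressible to zlib
```

The regular string compresses to a tiny fraction of the pseudo-random one, matching the intuition that “simple” means “short description”. Of course the pseudo-random string really has low Kolmogorov complexity (seed plus generator suffice to reproduce it), which is precisely the subtlety behind the randomness questions in part IV.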
References
- [BA04] Selmer Bringsjord and Konstantine Arkoudas, The modal argument for hypercomputing minds, Theoretical Computer Science 317 (2004), no. 1–3, 167–190 (special issue on Super-Recursive Algorithms and Hypercomputation).
- [BBC98] G. Brassard, S. L. Braunstein, and R. Cleve, Teleportation as a quantum computation, Physica D 120 (1998), 43–47.
- [BS10] John Baez and Mike Stay, Algorithmic thermodynamics, preprint, 2010.
- [Dav06] Martin Davis, Why there is no such discipline as hypercomputation, Applied Mathematics and Computation 178 (2006), no. 1, 4–7.
- [For09] Lance Fortnow, The status of the P versus NP problem, Communications of the ACM 52 (2009), no. 9, 78–86.
- [LV93] Ming Li and Paul Vitányi, An introduction to Kolmogorov complexity and its applications, Springer, 1993.
- [MGG01] Andrés Moreira, Anahí Gajardo, and Eric Goles, Dynamical behavior and complexity of Langton’s ant, Complex. 6 (2001), no. 4, 46–51.
- [Pre01] John Preskill, Quantum information theory lecture notes, 2001.
- [She09] Alexander Shen, Algorithmic information theory and foundations of probability.
- [Svo05] Karl Svozil, Quantum logic. A brief outline.
- [TS02] Christof Teuscher and Moshe Sipper, Hypercomputation: hype or computation?, Commun. ACM 45 (2002), no. 8, 23–24.
- [Zus70] K. Zuse, Calculating space, MIT Technical Translation AZT-70-164-GEMIT (Project MAC), Cambridge, 1970.
Saturday, September 18th, 2010 | Author: Konrad Voelkel
During a workshop, we developed some rather broad questions surrounding the concepts of information and complexity in the sciences, especially looking at quantum physics, digital philosophy and philosophy of science. This is spiced up with some more metaphysical questions and some rants by well-known scientists, to provoke the reader’s imagination. References to the literature are given as a starting point for developing answers to the questions.
Comments, answers and even more questions are very welcome.
Questions part I - Information and Entropy [CT91]
- Is it possible to define a unifying universal notion of information, applicable to all sciences? [SW63]
Can we convert different notions of information and entropy? [Khi57]
Is the mathematical definition of Kullback-Leibler distance (recalled after this list) the key to understanding different kinds of information? [KL51]
“In fact, what we mean by information – the elementary unit of information – is a difference which makes a difference, and it is able to make a difference because the neural pathways along which it travels and is continually transformed are themselves provided with energy.” – Gregory Bateson, 1972
- Is it possible to define an absolute (non-relative, total) notion of information? [GML96]
Can we talk about the total information content of the universe? [Llo02]
- Where does information go when it is erased?
Is erasure of information possible at all? [Hit01]
- Does entropy emerge from irreversibility (in general)?
- Does the concept of asymmetry (group theory) significantly clarify the concept of information?
- What happens to information during computation?
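For reference, the two definitions behind these questions (standard material, cf. [CT91], [KL51]): the Shannon entropy of a discrete random variable $X$ with distribution $p$, and the Kullback-Leibler distance between two distributions $p$ and $q$ on the same alphabet, are

$$H(X) = -\sum_x p(x) \log p(x), \qquad D_{\mathrm{KL}}(p \,\|\, q) = \sum_x p(x) \log \frac{p(x)}{q(x)}.$$

$D_{\mathrm{KL}}$ is nonnegative and vanishes exactly when $p = q$, but it is not symmetric, so it is a divergence rather than a metric; mutual information is the special case $I(X;Y) = D_{\mathrm{KL}}(p(x,y) \,\|\, p(x)p(y))$, one reason to hope it can bridge different notions of information.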
References
- [CT91] T. M. Cover and J. A. Thomas, Elements of information theory, Wiley, New York, 1991.
- [GML96] Murray Gell-Mann and Seth Lloyd, Information measures, effective complexity, and total information, Complex. 2 (1996), no. 1, 44–52.
- [Hit01] Hitchcock, Is there a conservation of information law for the universe?, 2001.
- [Khi57] A. I. Khinchin, Mathematical foundations of information theory, Dover Publications, 1957.
- [KL51] S. Kullback and R.A. Leibler, On information and sufficiency, The Annals of Mathematical Statistics 22 (1951), no. 1, 79–86.
- [Llo02] Seth Lloyd, Computational capacity of the universe, Physical Review Letters 88 (2002).
- [SW63] C. E. Shannon and W. Weaver, The mathematical theory of communication, University of Illinois Press, 1963.
UPDATE 2010-10-13: added links to the references