Continuing Integer Sequences

Monday, October 27th, 2014

A classic type of "math problem" is the continuation of an integer sequence from a finite sample (usually taken from its beginning). For example:

Continue 2, 4, 6, 8, ...

To which the intended solution is usually 10, 12, 14, and so on.

The problem, as any mathematician knows, is that there is no single solution. In fact, given any finite set of numbers, one can simply speak of the sequence that starts like that and continues with zeroes only; that is a perfectly valid sequence. Of course, one is really expected to figure out the rule behind the finitely many numbers, but there are always many possible choices. The game is not to find any sensible rule, but to find the rule the designer had in mind, which makes it more a test of cultural knowledge than an honest test of mathematical ability (or anything else).
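
To make the arbitrariness concrete, here is a small Python sketch (mine, not from the original post; the function name is invented for illustration): Lagrange interpolation produces a polynomial rule that reproduces 2, 4, 6, 8 exactly and then continues with whatever value we choose, zero included.

```python
from fractions import Fraction

def lagrange(points):
    """Return the unique polynomial of minimal degree through the given (x, y) points."""
    def p(x):
        total = Fraction(0)
        for i, (xi, yi) in enumerate(points):
            term = Fraction(yi)
            for j, (xj, _) in enumerate(points):
                if j != i:
                    term *= Fraction(x - xj, xi - xj)
            total += term
        return total
    return p

# A perfectly valid rule that starts 2, 4, 6, 8 and then yields 0 instead of 10:
p = lagrange([(1, 2), (2, 4), (3, 6), (4, 8), (5, 0)])
print([int(p(n)) for n in range(1, 7)])  # [2, 4, 6, 8, 0, -38]
```

Add one more target value and interpolation hands you a rule that fits it: the finite sample constrains nothing beyond itself.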

But there is a way to fix this.


Category: English, Mathematics

9 Questions on Information Theory

Thursday, February 13th, 2014

Back in 2010 I ran a series of posts about questions in information theory that arose from a two-week seminar with students from various scientific disciplines (a wonderful event!). Here I have picked those that I still find particularly compelling:

  1. Is the mathematical definition of the Kullback-Leibler distance the key to understanding different kinds of information?
  2. Can we talk about the total information content of the universe?
  3. Is hypercomputation possible?
  4. Can we tell for a physical system whether it is a Turing machine?
  5. Given the fact that every system is continually measured, is the concept of a closed quantum system (with unitary time evolution) relevant for real physics?
  6. Can we create or measure truly random numbers in nature, and how would we recognize that?
  7. Would it make sense to adapt the notion of real numbers to a limited (but not fixed) amount of memory?
  8. Can causality be defined without reference to time?
  9. Should we re-define “life”, using information-theoretic terms?

What do you think?

Category: English, Questions in Information Theory

Questions in Information Theory V: Life and Metaphysics

Saturday, November 27th, 2010

See also: Questions part I - Information and Entropy
Questions part II - Complexity and Algorithmic Complexity
Questions part III - Statistical Physics, Quantum Physics and Thermodynamics
Questions part IV - Philosophy of Science

Questions part V - Life and Metaphysics [Sch68]

  1. Is nature deterministic?
  2. Can causality be defined without reference to time? [BLMS87] [Sua01]
  3. How is it possible that semantic information emerges from purely syntactic information? [BLHL+01]
  4. Is there an inherent tendency in evolution to accumulate relevant information on the real world?
    Is there an inherent tendency in evolution to increase the complexity of organisms and the biosphere as a whole?

    “Humanity is now experiencing history’s most difficult evolutionary transformation.” – Buckminster Fuller, 1983

  5. Why are robustness and simplicity good and applicable criteria for describing nature (with causal networks)? [Jen03]
  6. Should we re-define “life”, using information-theoretic terms?
  7. What do Gödel’s theorems imply for information and complexity theory? [Cha82]
    Is there an analogy between emergence and true but unprovable statements? [Bin08]
  8. Are there limits of self-prediction in individuals and societies?

    “The human brain is incapable of creating anything which is really complex.” – Andrey Nikolaevich Kolmogorov, 1990

  9. What is the answer to life, the universe and everything?

    “There is a theory which states that if ever anyone discovers exactly what the Universe is for and why it is here, it will instantly disappear and be replaced by something even more bizarre and inexplicable. There is another which states that this has already happened.” – Douglas Adams, 1980


Category: English, Questions in Information Theory

Questions in Information Theory III: Statistical Physics, Quantum Physics and Thermodynamics

Thursday, October 28th, 2010

See also: Questions part I - Information and Entropy
Questions part II - Complexity and Algorithmic Complexity

Questions part III - Statistical Physics, Quantum Physics and Thermodynamics [GM94] [Pen05]

  1. Can there be some kind of Maxwell’s demon, and why (not)?
  2. Can all analog information be transformed to digital information losslessly?
    Is physical information always digital (digital philosophy)?
  3. Does the second law of thermodynamics depend on a human observer?
  4. Can quantum mechanical entanglement be reduced to mutual information? [NC00] [DH00] [PV06] [BP07] (see the sketch after this list)
  5. What is the mathematical, what is the physical content of Heisenberg’s uncertainty relation?

    “I think I can safely say that nobody understands quantum mechanics.” – Richard Phillips Feynman, 1965

  6. Is there a method to denote and calculate, for an open physical system, how the information content changes in time when the system’s dynamics are known?
  7. Is there an important difference between information loss into small scales and information loss into microscopic freedoms?
  8. Where do (thermal, quantum, vacuum, ?) fluctuations come from? [Lam97] [Lam98]
    What do they change on the macroscopic scale?
    Are gravitational fluctuations possible?
  9. According to the fluctuation-dissipation theorem, do thermal fluctuations compensate exactly for information loss through dissipation? [CW51]
  10. Is probability theory powerful enough to capture all micro-physical phenomena?
    Is mathematical probability theory the correct language for modern physics? [Mil04]

    “In fact the smallest units of matter are not physical objects in the ordinary sense; they are forms, ideas which can be expressed unambiguously only in mathematical language.” – Werner Heisenberg, 1992

  11. How is Zeilinger’s concept of elementary systems generalizable to general state spaces?
  12. Given the fact that every system is continually measured, is the concept of a closed quantum system (with unitary time evolution) relevant for real physics? [And97]
    Does decoherence by interference with the background universe render the concept of closed quantum systems obsolete? [BTV09]
  13. How does randomness in quantum measurement emerge from unitary evolution?
    Is quantum physics truly information-preserving?
  14. How relevant is the classical concept of degree of freedom for quantum mechanics?
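
As a concrete handle on question 4, here is a minimal numerical sketch (my own illustration, not part of the seminar material): for a Bell state, each qubit on its own looks maximally mixed, so the quantum mutual information S(A) + S(B) − S(AB) comes out to 2 bits, twice what two classical bits could share.

```python
import numpy as np

def entropy_bits(rho):
    """Von Neumann entropy S(rho) in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

# Bell state |Phi+> = (|00> + |11>) / sqrt(2)
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(psi, psi)                                 # pure two-qubit state
rho_a = rho_ab.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)  # trace out qubit B

s_a, s_ab = entropy_bits(rho_a), entropy_bits(rho_ab)
print(s_a, s_ab, 2 * s_a - s_ab)  # 1.0  0.0  -> I(A:B) = 2.0 bits
```

Whether this formal reduction of entanglement to (quantum) mutual information settles the conceptual question is exactly what the cited references debate.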

References

  • [And97] Philip W. Anderson, Is measurement itself an emergent property?, Complexity 3 (1997), no. 1, 14–16.
  • [BP07] S. L. Braunstein and A. K. Pati, Quantum information cannot be completely hidden in correlations: Implications for the black-hole information paradox, Physical Review Letters 98 (2007).
  • [BTV09] Buchleitner, Tiersch, and Viviescas, Entanglement and decoherence, Lecture Notes in Physics, Springer, 2009.
  • [CW51] Herbert B. Callen and Theodore A. Welton, Irreversibility and generalized noise, Physical Review 83 (1951), no. 1, 34–40.
  • [DH00] D. Deutsch and P. Hayden, Information flow in entangled quantum systems, Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 456 (2000), no. 1999, 1759–1774.
  • [GM94] M. Gell-Mann, The quark and the jaguar, W. H. Freeman, New York, 1994.
  • [Lam97] S. K. Lamoreaux, Demonstration of the Casimir force in the 0.6 to 6 µm range, Physical Review Letters 78 (1997), no. 1, 5–8.
  • [Lam98] S. K. Lamoreaux, Erratum: Demonstration of the Casimir force in the 0.6 to 6 µm range [Physical Review Letters 78, 5 (1997)], Physical Review Letters 81 (1998), no. 24, 5475–5476.
  • [Mil04] David Miller, Probability generalizes logic, 2004.
  • [NC00] M. A. Nielsen and I. L. Chuang, Quantum computation and quantum information, Cambridge University Press, 2000.
  • [Pen05] Roger Penrose, The road to reality: A complete guide to the laws of the universe, 3rd ed., Knopf, 2005.
  • [PV06] Martin B. Plenio and S. Virmani, An introduction to entanglement measures, arXiv, June 2006.

Category: English, Questions in Information Theory

Questions in Information Theory II: Complexity and Algorithmic Complexity

Wednesday, October 13th, 2010

See also: Questions part I - Information and Entropy

Questions part II - Complexity and Algorithmic Complexity [LV93]

  1. Can we capture the concepts of simplicity, Occam’s razor and complexity by the notion of algorithmic complexity? [LV93] [She09]

    “What is simplicity? Simplicity is the shortest path to a solution.” – Ward Cunningham, 2004

    “To repeat, we consider a computer program to be a theory for its output, that is the essential idea, and both theory and output are finite strings of bits whose size can be compared. And the best theory is the smallest program that produces that data, that precise output. That’s our version of what some people call Occam’s razor.” – Gregory Chaitin, 2008

  2. What is the relation between algorithmic complexity and entropy? [LV93] [BS10] (see the sketch after this list)
  3. How is semantics (e.g. of the number pi) related to algorithmic complexity?
  4. How does complexity arise from syntactic information? [MGG01]
  5. Do different degrees of complexity correspond to qualitative differences in cognitive/reactive behaviour?
  6. Is the Church-Turing hypothesis true, and if so, does it matter?
    Is neural computation fundamentally distinct from electronic computation?
    Is hypercomputation possible? [TS02] [BA04] [Dav06]

    “A man provided with paper, pencil, and rubber, and subject to strict discipline, is in effect a universal machine.” – Alan Turing, 1948

  7. How can one define computability without referring to Turing machines?
  8. P vs. NP: is P=NP or not? [For09]
    Would there be a practical benefit if P=NP, or from a proof either way?
  9. Can non-deterministic Turing machines be implemented by natural computing, just as deterministic Turing machines are implemented by home computers?
  10. Would it be a good idea to adapt the concept of non-deterministic Turing machines to real non-deterministic systems?
  11. Can we tell for a general physical system whether it is a Turing machine? [Zus70]

    “All processes, whether they are produced by human effort or occur spontaneously in nature, can be viewed as computations.” – Stephen Wolfram, 2002

  12. Can a Turing machine simulate itself?
  13. Does classical computer science “apply” to quantum computation? [Pre01] [Svo05] [BBC98]
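
Question 2 invites a crude experiment (a sketch of mine, not part of the original list): algorithmic complexity itself is uncomputable, but the length of a compressed encoding is a computable upper bound, and it separates regular from random-looking data much as entropy would.

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Computable upper bound on algorithmic complexity: compressed length in bytes."""
    return len(zlib.compress(data, 9))

random.seed(0)
regular = b"01" * 500                                      # generated by a short rule
noisy = bytes(random.getrandbits(8) for _ in range(1000))  # no apparent short description

print(compressed_size(regular))  # on the order of 20 bytes
print(compressed_size(noisy))    # close to 1000 bytes: hardly compressible at all
```

Of course the bound is one-sided: a string that compresses badly might still have a short description the compressor cannot see, which is the uncomputability of question 1 in miniature.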


Category: English, Questions in Information Theory

Questions in Information Theory I: Information and Entropy

Saturday, September 18th, 2010

During a workshop, we developed some rather broad questions surrounding the concepts of information and complexity in the sciences, looking especially at quantum physics, digital philosophy and the philosophy of science. This is spiced up with some more metaphysical questions and some rants by well-known scientists, to provoke the reader’s imagination. References to the literature are given as a starting point for developing answers to the questions.
Comments, answers and even more questions are very welcome.

Questions part I - Information and Entropy [CT91]

  1. Is it possible to define a unifying universal notion of information, applicable to all sciences? [SW63]
    Can we convert different notions of information and entropy? [Khi57]
    Is the mathematical definition of the Kullback-Leibler distance the key to understanding different kinds of information? [KL51] (see the sketch after this list)

    “In fact, what we mean by information – the elementary unit of information – is a difference which makes a difference, and it is able to make a difference because the neural pathways along which it travels and is continually transformed are themselves provided with energy.” – Gregory Bateson, 1972

  2. Is it possible to define an absolute (non-relative, total) notion of information? [GML96]
    Can we talk about the total information content of the universe? [Llo02]
  3. Where does information go when it is erased?
    Is erasure of information possible at all? [Hit01]
  4. Does entropy emerge from irreversibility (in general)?
  5. Does the concept of asymmetry (group theory) significantly clarify the concept of information?
  6. What happens to information during computation?
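
Since the Kullback-Leibler distance does so much work in question 1, here is its discrete definition in a few lines of Python (the coin distributions are invented for illustration): D(P‖Q) = Σ_x P(x) log2(P(x)/Q(x)) measures the extra bits per symbol paid when data drawn from P is encoded with a code optimal for Q.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler distance D(P || Q) in bits, for discrete distributions."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

fair = [0.5, 0.5]    # fair coin
biased = [0.9, 0.1]  # heavily biased coin

print(kl_divergence(fair, biased))  # ~0.74 bits
print(kl_divergence(biased, fair))  # ~0.53 bits -- note the asymmetry
```

The asymmetry is why it is a divergence rather than a metric, and part of why its status as a universal notion of information is an open question.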


UPDATE 2010-10-13: added links to the references

Category: English, Questions in Information Theory