Questions in Information Theory V: Life and Metaphysics

Saturday, November 27th, 2010

See also: Questions part I - Information and Entropy
Questions part II - Complexity and Algorithmic Complexity
Questions part III - Statistical Physics, Quantum Physics and Thermodynamics
Questions part IV - Philosophy of Science

Questions part V - Life and Metaphysics [Sch68]

  1. Is nature deterministic?
  2. Can causality be defined without reference to time? [BLMS87] [Sua01]
  3. How is it possible that semantic information emerges from purely syntactic information? [BLHL+01]
  4. Is there an inherent tendency in evolution to accumulate relevant information on the real world?
    Is there an inherent tendency in evolution to increase the complexity of organisms and the biosphere as a whole?

    “Humanity is now experiencing history’s most difficult evolutionary transformation.” – Buckminster Fuller, 1983

  5. Why are robustness and simplicity good and applicable criteria to describe nature (with causal networks)? [Jen03]
  6. Should we re-define “life”, using information-theoretic terms?
  7. What do Gödel’s theorems imply for information and complexity theory? [Cha82]
    Is there an analogy between emergence and true but unprovable statements? [Bin08]
  8. Are there limits of self-prediction in individuals and societies?

    “The human brain is incapable of creating anything which is really complex.” – Andrey Nikolaevich Kolmogorov, 1990

  9. What is the answer to life, the universe and everything?

    “There is a theory which states that if ever anyone discovers exactly what the Universe is for and why it is here, it will instantly disappear and be replaced by something even more bizarre and inexplicable. There is another which states that this has already happened.” – Douglas Adams, 1980


Category: English, Questions in Information Theory

Questions in Information Theory IV: Philosophy of Science

Friday, November 12th, 2010

See also: Questions part I - Information and Entropy
Questions part II - Complexity and Algorithmic Complexity
Questions part III - Statistical Physics, Quantum Physics and Thermodynamics

Questions part IV - Philosophy of Science [Pop34] [Kuh62] [Fey75] [Mil09]

  1. Does the point of view of information theory provide anything new in the sciences? [GM94]
    Does information theory provide a new paradigm in the sciences? [Sei07]
  2. Is quantum information the key to unify general relativity and quantum theory?
    Is information theory a guiding principle for a “theory of everything”?

    “I think there is a need for something completely new. Something that is too different, too unexpected, to be accepted as yet.” – Anton Zeilinger, 2004

  3. (Why) are real discoveries possible in mathematics and other structural/formal sciences? [Bor07]
  4. Can we create or measure truly random numbers in nature?
    How would we recognize random numbers?
    What is a random number (or a random string of digits)?

    “Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin. For, as has been pointed out several times, there is no such thing as a random number — there are only methods to produce random numbers, and a strict arithmetic procedure of course is not such a method.” – John von Neumann, 1951

  5. What is semantic information, what is meaning in science?
    What do we expect from an “explanation”?

    “The Tao that can be told is not the eternal Tao.” – Lǎozǐ, 4th century B.C.

  6. How do the concepts “truth” and “laws of nature” fit together? [Dav01] [Car94]
  7. Does it make sense to use linguistic terminology in natural sciences? [Gad75]
  8. Should physicists try to interpret quantum physics at all? [Dir42]
  9. Would it make sense to adapt the notion of real numbers to a limited amount of memory?
    Can we build a theory of physics upon intuitionistic logic?
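
Von Neumann's point in question 4 can be made concrete with his own “middle-square” method, an arithmetic procedure that merely looks random: it is fully deterministic and eventually falls into a short cycle. A minimal sketch in Python (the seed 1234 and the four-digit window are illustrative choices, not anything prescribed by the original discussion):

```python
def middle_square(seed, n):
    """Von Neumann's middle-square method: square the current value
    and keep the middle four digits as the next 'random' number."""
    values, x = [], seed
    for _ in range(n):
        x = (x * x) // 100 % 10000  # middle four digits of an (up to) 8-digit square
        values.append(x)
    return values

# Deterministic: the same seed always yields the same sequence --
# which is exactly the "state of sin" von Neumann warns about.
print(middle_square(1234, 5))
```

Because the state space is finite (at most 10,000 values), every seed eventually repeats, and many seeds degenerate to a fixed point such as 0.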


Category: English, Questions in Information Theory

Questions in Information Theory II: Complexity and Algorithmic Complexity

Wednesday, October 13th, 2010

See also: Questions part I - Information and Entropy

Questions part II - Complexity and Algorithmic Complexity [LV93]

  1. Can we capture the concepts of simplicity, Occam’s razor and complexity by the notion of algorithmic complexity? [LV93] [She09]

    “What is simplicity? Simplicity is the shortest path to a solution.” – Ward Cunningham, 2004

    “To repeat, we consider a computer program to be a theory for its output, that is the essential idea, and both theory and output are finite strings of bits whose size can be compared. And the best theory is the smallest program that produces that data, that precise output. That’s our version of what some people call Occam’s razor.” – Gregory Chaitin, 2008

  2. What is the relation between algorithmic complexity and entropy? [LV93] [BS10]
  3. How is semantics (e.g. of the number pi) related to algorithmic complexity?
  4. How does complexity arise from syntactic information? [MGG01]
  5. Do different degrees of complexity correspond to qualitative differences in cognitive/reactive behaviour?
  6. Is the Church-Turing hypothesis true and if so, does it matter?
    Is neural computation fundamentally distinct from electronic computation?
    Is hypercomputation possible? [TS02] [BA04] [Dav06]

    “A man provided with paper, pencil, and rubber, and subject to strict discipline, is in effect a universal machine.” – Alan Turing, 1948

  7. How can one define computability without referring to Turing machines?
  8. P vs. NP: is P=NP or not? [For09]
    Is there a practical benefit of P=NP or of a proof of this?
  9. Is the concept of a non-deterministic Turing machine implementable by natural computing just like deterministic Turing machines are implementable by home computers?
  10. Would it be a good idea to adapt the concept of non-deterministic Turing machines to real non-deterministic systems?
  11. Can we tell for a general physical system whether it is a Turing machine? [Zus70]

    “All processes, whether they are produced by human effort or occur spontaneously in nature, can be viewed as computations.” – Stephen Wolfram, 2002

  12. Can a Turing machine simulate itself?
  13. Does classical computer science “apply” to quantum computation? [Pre01] [Svo05] [BBC98]
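
One way to make questions 1 and 2 tangible: algorithmic (Kolmogorov) complexity is uncomputable, but the length of a compressed encoding gives a computable upper bound, in the spirit of Chaitin's “smallest program” above. A rough sketch using Python's standard zlib module (the choice of compressor and the two test strings are illustrative assumptions, not a definition of algorithmic complexity):

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length of a zlib-compressed encoding of `data`: a crude,
    computable upper bound on its algorithmic complexity
    (ignoring the fixed size of the decompressor itself)."""
    return len(zlib.compress(data, level=9))

regular = b"ab" * 500          # a short program ("print 'ab' * 500") describes it
random_ish = os.urandom(1000)  # almost certainly incompressible

# The regular string compresses far below its raw length of 1000 bytes;
# the random one does not compress at all.
print(compressed_size(regular), compressed_size(random_ish))
```

The same contrast underlies the relation to entropy asked about in question 2: low-entropy, highly regular sources yield short descriptions, while typical outputs of a high-entropy source do not.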

References

Category: English, Questions in Information Theory

Questions in Information Theory I: Information and Entropy

Saturday, September 18th, 2010

During a workshop, we developed some rather broad questions surrounding the concepts of information and complexity in the sciences, especially looking at quantum physics, digital philosophy and philosophy of science. This is spiced up with some more metaphysical questions and some rants by well-known scientists, to provoke the reader’s imagination. References to the literature are given as a first starting point to develop answers to the questions.
Comments, answers and even more questions are very welcome.

Questions part I - Information and Entropy [CT91]

  1. Is it possible to define a unifying universal notion of information, applicable to all sciences? [SW63]
    Can we convert different notions of information and entropy? [Khi57]
    Is the mathematical definition of the Kullback-Leibler distance the key to understanding different kinds of information? [KL51]

    “In fact, what we mean by information – the elementary unit of information – is a difference which makes a difference, and it is able to make a difference because the neural pathways along which it travels and is continually transformed are themselves provided with energy.” – Gregory Bateson, 1972

  2. Is it possible to define an absolute (non-relative, total) notion of information? [GML96]
    Can we talk about the total information content of the universe? [Llo02]
  3. Where does information go when it is erased?
    Is erasure of information possible at all? [Hit01]
  4. Does entropy emerge from irreversibility (in general)?
  5. Does the concept of asymmetry (group theory) significantly clarify the concept of information?
  6. What happens to information during computation?
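
The Kullback-Leibler distance of question 1 can be stated in a few lines. A sketch in Python, assuming distributions given as probability lists over the same finite alphabet (note that it is not a metric: it is asymmetric, which is one reason “distance” is a loose term for it):

```python
from math import log2

def kl_divergence(p, q):
    """D(P || Q) in bits: the expected extra code length paid when data
    drawn from P is encoded with a code that is optimal for Q.
    Assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

fair, biased = [0.5, 0.5], [0.9, 0.1]
print(kl_divergence(fair, biased))  # differs from kl_divergence(biased, fair)
```

It vanishes exactly when the two distributions coincide and is strictly positive otherwise, which is what makes it a candidate common currency for the different notions of information asked about above.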

References

UPDATE 2010-10-13: added links to the references

Category: English, Questions in Information Theory

On Theories and Stories

Saturday, December 05th, 2009

Many, many times I have talked about this topic in personal communication. I have repeated myself, and my presentation of the arguments has developed over the years. That development will continue, but here is the current status. I hope you enjoy reading it and leave some comments!

Category: English, Not Mathematics