Measuring Time

Wednesday, March 27th, 2013

The units scientists use in their daily work are the SI units, and sometimes the equivalent cgs units. In these unit systems, everything is based upon the physical quantities length, mass and time (and a few more). These "quantities" deserve the adjective "physical" only if they refer to something measurable. So, what does it mean to measure length, mass and time?

Category: English, Not Mathematics | Comments off

Feynman Graphs and Motives

Wednesday, March 20th, 2013

Attending a school on Feynman graphs and motives, I just learned how the two are related. It's a cute story! Actually, you don't need any physics to appreciate it, though physics might let you appreciate it even more.

A Feynman graph is just an undirected graph with a finite number of vertices and a finite number of edges. Physicists are interested in computing certain integrals defined in terms of Feynman graphs, which they call amplitudes.
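One way to see the combinatorics behind these amplitudes without any physics: in the Feynman-parametric form of the integral, the integrand is built from graph polynomials, notably the first Symanzik (Kirchhoff) polynomial, a sum over spanning trees of the graph. As a hedged illustration (all function names here are my own, not from the post), a minimal Python sketch:

```python
from itertools import combinations

def spanning_trees(vertices, edges):
    """Yield the edge-index sets of all spanning trees of an
    undirected multigraph. edges is a list of (u, v) pairs; a
    spanning tree uses |V|-1 edges and contains no cycle."""
    n = len(vertices)
    for idx in combinations(range(len(edges)), n - 1):
        # union-find to check that the chosen edges are acyclic
        parent = {v: v for v in vertices}
        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v
        acyclic = True
        for i in idx:
            a, b = find(edges[i][0]), find(edges[i][1])
            if a == b:
                acyclic = False
                break
            parent[a] = b
        if acyclic:
            yield set(idx)

def first_symanzik(vertices, edges):
    """First Symanzik polynomial: the sum over spanning trees T of
    the product of variables x_e for the edges e NOT in T.
    Returned as a sorted list of monomials (lists of variable names)."""
    all_idx = set(range(len(edges)))
    monomials = [all_idx - t for t in spanning_trees(vertices, edges)]
    return sorted(sorted(f"x{i}" for i in m) for m in monomials)

# one-loop "bubble": two vertices joined by two parallel edges
print(first_symanzik([0, 1], [(0, 1), (0, 1)]))  # [['x0'], ['x1']]
```

For the bubble graph this gives U = x0 + x1, the polynomial that appears in the textbook one-loop amplitude.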

Category: English, Mathematics | Comments off

Some thoughts on AQFT (algebraic or axiomatic quantum field theory)

Monday, August 08th, 2011

In this post, I want to explain briefly the idea behind AQFT in the Haag-Kastler style. To motivate this, let me first sketch what QFT (= quantum field theory) is about, at least in my mathematically distorted perception.

Classical quantum theory models purely quantum effects, i.e. without considering gravity, or at least without considering relativistic effects. There, a separable Hilbert space is the appropriate state space. The bounded linear operators on the state space form a certain kind of normed algebra with a compatible involution (taking the adjoint), called a C*-algebra, and measurements correspond to self-adjoint operators.
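The statement that measurements correspond to self-adjoint operators can be made concrete in finite dimensions, where the C*-algebra is just the matrix algebra. A hedged toy example (not from the post) using the Pauli-x matrix:

```python
import numpy as np

# A toy observable on a 2-dimensional state space: the Pauli-x matrix.
# It is self-adjoint (A equals its conjugate transpose), so its
# eigenvalues -- the possible measurement outcomes -- are real.
A = np.array([[0, 1], [1, 0]], dtype=complex)
assert np.allclose(A, A.conj().T)  # self-adjointness: A = A*

eigvals = np.linalg.eigvalsh(A)    # eigvalsh: for Hermitian matrices
print(eigvals)                     # [-1.  1.]
```

The spectral theorem guarantees real eigenvalues exactly because of the involution (adjoint) structure the paragraph describes.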

Quantum field theory tries to incorporate quantum mechanics into electromagnetic field theory (or vice versa) and, ultimately, gravitational field theory. So far, no such theory of everything has been developed with falsifiable predictions, although there are some promising candidates.

Category: English | Comments off

Questions in Information Theory IV: Philosophy of Science

Friday, November 12th, 2010

Questions part IV - Philosophy of Science [Pop34] [Kuh62] [Fey75] [Mil09]

1. Does the point of view of information theory provide anything new in the sciences? [GM94]
Does information theory provide a new paradigm in the sciences? [Sei07]
2. Is quantum information the key to unify general relativity and quantum theory?
Is information theory a guiding principle for a “theory of everything”?

“I think there is a need for something completely new. Something that is too different, too unexpected, to be accepted as yet.” – Anton Zeilinger, 2004

3. (Why) are real discoveries possible in mathematics and other structural/formal sciences? [Bor07]
4. Can we create or measure truly random numbers in nature?
How would we recognize random numbers?
What is a random number (or a random string of digits)?

“Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin. For, as has been pointed out several times, there is no such thing as a random number — there are only methods to produce random numbers, and a strict arithmetic procedure of course is not such a method.” – John von Neumann, 1951

5. What is semantic information, what is meaning in science?
What do we expect from an “explanation”?

“The Tao that can be told is not the eternal Tao.” – Lǎozǐ, 4th century B.C.

6. How do the concepts “truth” and “laws of nature” fit together? [Dav01] [Car94]
7. Does it make sense to use linguistic terminology in natural sciences? [Gad75]
8. Should physicists try to interpret quantum physics at all? [Dir42]
9. Would it make sense to adapt the notion of real numbers to a limited amount of memory?
Can we build a theory of physics upon intuitionistic logic?

Category: English, Questions in Information Theory | Comments off

Questions in Information Theory III: Statistical Physics, Quantum Physics and Thermodynamics

Thursday, October 28th, 2010

Questions part III - Statistical Physics, Quantum Physics and Thermodynamics [GM94] [Pen05]

1. Can there be some kind of Maxwell’s demon, and why (not)?
2. Can all analog information be transformed to digital information losslessly?
Is physical information always digital (digital philosophy)?
3. Does the second law of thermodynamics depend on a human observer?
4. Can quantum mechanical entanglement be reduced to mutual information? [NC00] [DH00] [PV06] [BP07]
5. What is the mathematical, what is the physical content of Heisenberg’s uncertainty relation?

“I think I can safely say that nobody understands quantum mechanics.” – Richard Phillips Feynman, 1965

6. Is there a method to denote and calculate, for an open physical system, how the information content changes in time when the system’s dynamics are known?
7. Is there an important difference between information loss into small scales and information loss into microscopic degrees of freedom?
8. Where do (thermal, quantum, vacuum, ?) fluctuations come from? [Lam97] [Lam98]
What do they change on the macroscopic scale?
Are gravitational fluctuations possible?
9. According to Einstein’s fluctuation-dissipation theorem, do thermal fluctuations compensate exactly for information loss through dissipation? [CW51]
10. Is probability theory powerful enough to capture any micro-physical phenomena?
Is mathematical probability theory the correct language for modern physics? [Mil04]

“In fact the smallest units of matter are not physical objects in the ordinary sense; they are forms, ideas which can be expressed unambiguously only in mathematical language.” – Werner Heisenberg, 1958

11. How is Zeilinger’s concept of elementary systems generalizable to general state spaces?
12. Given the fact that every system is continually measured, is the concept of a closed quantum system (with unitary time evolution) relevant for real physics? [And97]
Does decoherence by interference with the background universe render the concept of closed quantum systems obsolete? [BTV09]
13. How does randomness in quantum measurement emerge from unitary evolution?
Is quantum physics truly information-preserving?
14. How relevant is the classical concept of degree of freedom for quantum mechanics?
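Several of these questions (entanglement vs. mutual information, whether quantum physics is information-preserving) involve a single computable quantity: the von Neumann entropy. As a hedged illustration (my own sketch, not from the post), the entropy of a Bell state and of its reduced one-qubit state:

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho), in bits."""
    eig = np.linalg.eigvalsh(rho)
    eig = eig[eig > 1e-12]  # drop zero eigenvalues (0 log 0 = 0)
    return float(-np.sum(eig * np.log2(eig)))

# Bell state |phi+> = (|00> + |11>) / sqrt(2) on two qubits
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())              # pure state: entropy 0
# Reduced state of one qubit: partial trace over the second subsystem
rho_a = np.einsum('ijkj->ik', rho.reshape(2, 2, 2, 2))

print(round(von_neumann_entropy(rho), 6))    # 0.0
print(round(von_neumann_entropy(rho_a), 6))  # 1.0 (one bit of entanglement)
```

The global state is pure (zero entropy) while each subsystem looks maximally mixed (one bit): the information sits entirely in the correlations, which is the tension behind question 4.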

References

• [And97] Philip W. Anderson, Is measurement itself an emergent property?, Complex. 3 (1997), no. 1, 14–16.
• [BP07] S.L. Braunstein and A.K. Pati, Quantum information cannot be completely hidden in correlations: Implications for the black-hole information paradox, Physical Review Letters 98 (2007).
• [BTV09] Buchleitner, Tiersch, and Viviescas, Entanglement and decoherence, Lecture Notes in Physics, Springer, 2009.
• [CW51] Herbert B. Callen and Theodore A. Welton, Irreversibility and generalized noise, Phys. Rev. 83 (1951), no. 1, 34–40.
• [DH00] D. Deutsch and P. Hayden, Information flow in entangled quantum systems, Proceedings: Mathematics, Physical and Engineering Sciences 456 (2000), no. 1999, 1759–1774.
• [GM94] M. Gell-Mann, The quark and the jaguar, Freeman New York, 1994.
• [Lam97] S. K. Lamoreaux, Demonstration of the Casimir force in the 0.6 to 6 µm range, Physical Review Letters 78 (1997), no. 1, 5–8.
• [Lam98] S. K. Lamoreaux, Erratum: Demonstration of the Casimir force in the 0.6 to 6 µm range [Phys. Rev. Lett. 78, 5 (1997)], Phys. Rev. Lett. 81 (1998), no. 24, 5475–5476.
• [Mil04] David Miller, Probability generalizes logic, 2004.
• [NC00] M. A. Nielsen and I. L. Chuang, Quantum computation and quantum information, Cambridge University Press, 2000.
• [Pen05] Roger Penrose, The road to reality: A complete guide to the laws of the universe, 3 ed., Knopf, 2005.
• [PV06] Martin B. Plenio and S. Virmani, An introduction to entanglement measures, arXiv, Jun 2006.

Category: English, Questions in Information Theory | Comments off

Questions in Information Theory II: Complexity and Algorithmic Complexity

Wednesday, October 13th, 2010

Questions part II - Complexity and Algorithmic Complexity [LV93]

1. Can we capture the concepts of simplicity, Occam’s razor and complexity by the notion of algorithmic complexity? [LV93] [She09]

“What is simplicity? Simplicity is the shortest path to a solution.” – Ward Cunningham, 2004

“To repeat, we consider a computer program to be a theory for its output, that is the essential idea, and both theory and output are finite strings of bits whose size can be compared. And the best theory is the smallest program that produces that data, that precise output. That’s our version of what some people call Occam’s razor.” – Gregory Chaitin, 2008

2. What is the relation between algorithmic complexity and entropy? [LV93] [BS10]
3. How is semantics (e.g. of the number pi) related to algorithmic complexity?
4. How does complexity arise from syntactic information? [MGG01]
5. Do different degrees of complexity correspond to qualitative differences in cognitive/reactive behaviour?
6. Is the Church-Turing hypothesis true and if so, does it matter?
Is neural computation fundamentally distinct from electronic computation?
Is hypercomputation possible? [TS02] [BA04] [Dav06]

“A man provided with paper, pencil, and rubber, and subject to strict discipline, is in effect a universal machine.” – Alan Turing, 1948

7. How can one define computability without referring to Turing machines?
8. P vs. NP: is P=NP or not? [For09]
Is there a practical benefit of P=NP or of a proof of this?
9. Is the concept of a non-deterministic Turing machine implementable by natural computing just like deterministic Turing machines are implementable by home computers?
10. Would it be a good idea to adapt the concept of non-deterministic Turing machines to real non-deterministic systems?
11. Can we tell for a general physical system whether it is a Turing machine? [Zus70]

“All processes, whether they are produced by human effort or occur spontaneously in nature, can be viewed as computations.” – Stephen Wolfram, 2002

12. Can a Turing machine simulate itself?
13. Does classical computer science “apply” to quantum computation? [Pre01] [Svo05] [BBC98]
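Question 2, on the relation between algorithmic complexity and entropy, has a practical face: Kolmogorov complexity is uncomputable, but any compressor gives a computable upper bound on it. A hedged sketch (my own, not from the post) using zlib as a crude complexity estimator:

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a computable upper bound
    (up to an additive constant) on the Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

random.seed(0)
structured = b"ab" * 5000  # highly regular: short description exists
noisy = bytes(random.getrandbits(8) for _ in range(10000))  # pseudo-random

# The regular string compresses far better than the noisy one,
# mirroring the entropy gap between the two sources.
print(compressed_size(structured) < compressed_size(noisy))  # True
```

Of course zlib only detects a narrow class of regularities, so this is an upper bound, not the complexity itself: the "random" bytes above are pseudo-random and in fact have a tiny program (the generator plus its seed), which is exactly the tension in von Neumann's quip quoted in part IV.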

Category: English, Questions in Information Theory | Comments off