Metric
02-27-2006, 11:34 AM
I recently came across some interesting work that was done about 10 years ago, concerning the "measurement problem" of quantum mechanics. Basically, the measurement problem refers to the strange non-deterministic "quantum jumps" and "randomness" that occur when a measurement is made.
This work, however, sets all of quantum theory in a new and interesting light. In this formalism, there are no "jumps" -- pure quantum mechanical states evolve to pure quantum mechanical states, completely deterministically. Probabilities arise only when "ignoring" part of a system in a well-defined way. However, the author shows that this process of "ignoring" part of a system is common to all measurements, and thus the probabilities that we have always dealt with by introducing a separate (and rather unsatisfying) postulate of QM emerge naturally from an unavoidable "ignorance of the state" that is inherent in every measurement.
Interestingly, this work has not become widely known and appreciated. I contacted someone I know who works in this field and inquired why this was the case, and his impression was that the formalism (which makes extensive use of information-theoretic concepts) was somewhat dense, and the work preceded the huge interest in quantum information theory which has arisen in the intervening years -- i.e. he was ahead of his time. The author has since gone on to other things, and nobody else has made much of an effort to advance this stuff.
For those of you who are fluent in quantum mechanics and have an interest in the philosophical implications of "randomness" I recommend this paper:
http://xxx.lanl.gov/abs/quant-ph/9605002