#1
12-30-2006, 05:38 AM
Metric
Senior Member

Join Date: Oct 2005
Posts: 1,178

origin of order in the universe

One problem with cosmology is the fact that the universe appears so ordered. One of the main postulates of statistical mechanics is that there are no a priori "preferred" or "special" states. This leads to the prediction that a closed system without any external constraints should almost always be found in the highest-entropy state (thermodynamic equilibrium). But the universe is dramatically NOT in such a state -- we are alive because the universe apparently started from a supremely "special" state at the big bang, blatantly contradicting the standard assumption of stat mech. (Invoking anthropic arguments here does not help -- there are a HUGE number of "vastly more probable" ways the universe could support life than the way it was apparently done.)

So, what's the explanation? The fact is, "because God did it that way" is nearly as good as anything we have so far. (I'm exaggerating a bit, but not very much)

However, I think there may be another way to look at this. There is a concept computer scientists will recognize, called "Kolmogorov complexity" (also known as algorithmic complexity). Basically, the Kolmogorov complexity of a bit string is the length of the shortest program that produces that bit string. So, for example:

111111111111111111111111111111

has a very low complexity, since it can be described as "30 1s". On the other hand:

0010011010110010101110101110110

is much more complex -- the shortest description of this string is much longer. An "algorithmically random" bit string is one that no program shorter than the string itself can produce -- i.e., its complexity is maximal.
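
As a quick sanity check, here's a toy Python sketch. True Kolmogorov complexity is uncomputable, so this just uses the length of a zlib-compressed encoding as a crude upper bound, and it scales the two example strings up so the compressor's fixed overhead doesn't swamp the comparison:

[code]
import random
import zlib

def complexity_proxy(bits: str) -> int:
    """Compressed length in bits -- a crude upper bound on the
    (uncomputable) Kolmogorov complexity of the string."""
    return 8 * len(zlib.compress(bits.encode("ascii"), 9))

# Scaled-up versions of the two examples above (a 30-character string is
# too short for the compressor's overhead to be negligible).
ordered = "1" * 30_000                                          # "30000 1s"
patternless = "".join(random.choice("01") for _ in range(30_000))

print("ordered:    ", complexity_proxy(ordered), "bits")
print("patternless:", complexity_proxy(patternless), "bits")
# The ordered string compresses to a small fraction of the size of the
# patternless one, mirroring the gap in complexity.
[/code]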

Now, 20 years ago there was an interesting paper published. Basically, the upshot was that if you want to simulate a physical system on a computer, the amount of information needed to specify the state of the system basically scaled like the entropy. So high-entropy states are much more difficult to specify than low-entropy states. This makes some sense -- try specifying the exact state of every atom of gas in a large chamber, vs. specifying the exact state of those same atoms in a perfect crystal lattice at absolute zero. The crystal is very simple to describe -- the gas, mind-bogglingly complex.
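
Here's a toy version of that comparison -- purely illustrative, with compressed size again standing in for "information needed to specify the state," and a made-up one-dimensional crystal and gas:

[code]
import random
import zlib

N = 100_000  # number of atoms in the toy one-dimensional system

def description_bits(state):
    """Compressed size (in bits) of a plain-text listing of every atom's
    position and velocity -- a stand-in for the information needed to
    specify the microstate exactly."""
    text = ";".join(f"{x:.6f},{v:.6f}" for x, v in state)
    return 8 * len(zlib.compress(text.encode("ascii"), 9))

# Perfect crystal at absolute zero: atoms on a regular lattice, all at rest.
crystal = [(float(i), 0.0) for i in range(N)]

# Hot gas: every position and velocity is effectively a random number.
gas = [(random.uniform(0, N), random.gauss(0, 1)) for _ in range(N)]

print("crystal:", description_bits(crystal), "bits")  # highly regular listing
print("gas:    ", description_bits(gas), "bits")      # far larger
[/code]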

Now, one more thing: there is a result called the "universal probability distribution," which basically says that if you feed a universal Turing machine RANDOM input, the probability that it outputs bit string "x" is (to a good approximation) proportional to 2^-C (where "C" is the complexity of "x"). Thus, "algorithmically simple" outputs are vastly more probable.
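
You can see the "random input" claim in action with a toy Monte Carlo sketch (my own made-up mini-machine, not a real universal Turing machine): feed fair coin flips to a tiny interpreter whose opcodes are "repeat a bit n times," "emit one literal bit," and "halt," and tally how often particular 8-bit outputs appear:

[code]
import random
from collections import Counter

def run_random_program(rng, max_output=16, max_steps=100):
    """Feed fair random bits to a toy machine.  Opcodes (read from the
    random stream): '1' + 4-bit count + bit -> emit bit repeated count
    times; '01' + bit -> emit one literal bit; '00' -> halt.  Runs that
    never halt or overflow the output cap are discarded (return None)."""
    out = ""
    for _ in range(max_steps):
        if rng.randint(0, 1):                 # opcode '1...': repeat
            count = rng.getrandbits(4)        # 4-bit repeat count (0-15)
            out += str(rng.randint(0, 1)) * count
        elif rng.randint(0, 1):               # opcode '01': one literal bit
            out += str(rng.randint(0, 1))
        else:                                 # opcode '00': halt
            return out
        if len(out) > max_output:             # runaway output: discard
            return None
    return None                               # never halted: discard

rng = random.Random(0)
counts = Counter()
TRIALS = 1_000_000
for _ in range(TRIALS):
    x = run_random_program(rng)
    if x is not None:
        counts[x] += 1

print(counts["1" * 8], "runs produced 11111111")     # simple: thousands of hits
print(counts["00100110"], "runs produced 00100110")  # patternless: almost none
[/code]

The exact numbers depend on the made-up opcode set, but the qualitative point is the machine-independent one: outputs with short programs soak up almost all of the probability.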

So, if you envision a kind of "cosmic Turing machine" outputting an ensemble of universes with "random rules of physics and initial conditions" -- then the universal probability distribution predicts that we should expect to find ourselves in a universe evolving from a low-entropy (and thus algorithmically simple) initial state.

So we get to keep the Copernican principle, and at the same time live with the fact that we find ourselves in a "thermodynamically special" state of the universe -- since universes with algorithmically simple, low-entropy initial conditions are by far the most common!

(This argument has already been used as an explanation of the fact that the rules of physics tend to be "unreasonably simple" -- but I've not seen the connection to the entropy problem of the universe drawn before.)