Two Plus Two Newer Archives (http://archives1.twoplustwo.com/index.php)
-   Science, Math, and Philosophy (http://archives1.twoplustwo.com/forumdisplay.php?f=49)
-   -   Hypothetical CS type question... (http://archives1.twoplustwo.com/showthread.php?t=557605)

madnak 11-30-2007 12:56 AM

Re: Hypothetical CS type question...
 
Oh yeah, my last post made me think of the Asimov story boris posted: The Last Question.

I think the growth is probably stable if you use the right scale. For instance, if you work in orders of magnitude it goes 1 hour, 10 hours, 100 hours, 1,000 hours, 10,000 hours, so it's really only five "steps." Whereas you have many "steps" under 1 hour (down to tiny fractions of a second) and over 1 hour (up to trillions of years). So there's an illusion of "clustering" if your scale is linear (each interval is n instead of 10^n) rather than logarithmic. Though I think the growth is still very slow after the 1-year point. The difference between a 1,000-year program and a 1,000,000-year program isn't trivial, right?
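(A quick sketch of what I mean, with made-up runtimes purely for illustration: bucket each runtime by its order of magnitude and the apparent clustering around human time scales goes away. Python, nothing fancy:)

[ CODE ]
import math

# Made-up runtimes in seconds, from a microsecond up to ~a trillion years
runtimes = [1e-6, 1e-3, 1.0, 3.6e3, 3.6e4, 3.6e5, 3.2e7, 3.2e13, 3.2e19]

for t in runtimes:
    # The "step" is the order of magnitude: floor(log10(t)).
    # On a linear scale these look clustered; on this scale they're spread out.
    print(f"{t:10.1e} s  ->  step {math.floor(math.log10(t))}")
[/ CODE ]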

Another problem is that supercomputers have already solved many of the mid-range problems. Stuff like "find the first n primes" has also gotten out of hand, because distributed computing projects have already found the answer for very large n.
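(For concreteness, a minimal sketch of the kind of program I mean; plain trial division, nothing like what the distributed projects actually run:)

[ CODE ]
def first_n_primes(n):
    """Find the first n primes by trial division against the primes found
    so far; fine for small n, hopeless at the scales already reached."""
    primes = []
    candidate = 2
    while len(primes) < n:
        if all(candidate % p != 0 for p in primes):
            primes.append(candidate)
        candidate += 1
    return primes

print(first_n_primes(10))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
[/ CODE ]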

I do think rendering typically falls in the middle range, and it seems to fit the criteria.

CallMeIshmael 11-30-2007 01:31 AM

Re: Hypothetical CS type question...
 
mad,

v nice posting in this thread


though, I think I might take issue w/:


[ QUOTE ]
So there's an illusion of "clustering" if your scale is linear (each interval is n instead of 10^n) rather than logarithmic. Though I think the growth is still very slow after the 1-year point.

[/ QUOTE ]


Things I'm interested in tend to grow factorially, so even on a log scale we will still see some serious growth for those problems. This could be a function of my interests, though, and the whole set of programs may not tend to show that growth pattern.
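(To put numbers on that: even after taking log10, a factorial runtime keeps climbing steeply. A quick sketch using lgamma, since log10(n!) = lgamma(n+1)/ln(10):)

[ CODE ]
import math

# log10(n!) grows roughly like n*log10(n), so factorial problems
# show serious growth even on a logarithmic scale
for n in [10, 20, 50, 100, 200]:
    log10_factorial = math.lgamma(n + 1) / math.log(10)
    print(f"n = {n:3d}:  log10(n!) = {log10_factorial:7.1f}")
[/ CODE ]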


Also, at least for right now, the difference between something like 10k and 10 million years to solve is, for all intents and purposes, trivial, inasmuch as neither can produce the data we want, and thus both fail at their sole purpose. (Obviously, to a certain extent, I'm taking a selfish viewpoint w/ that statement, since it's reasonable to believe that, at some point, those programs will be runnable in normal time, and, of course, at some point the difference will no longer be trivial.)

thylacine 11-30-2007 01:38 AM

Re: Hypothetical CS type question...
 
[ QUOTE ]
[ QUOTE ]

*For the sake of the post, we can give 'meaningful' the definition of 'anything you can reasonably see someone wishing to program.' So, obviously, a very wide range of possible programs here. Basically, just don't make up a program for the sole purpose of discussing it in this thread.

[/ QUOTE ]

Tons of scientific computations can take a verrrry long time to compute. I could imagine many different types of quantum simulations, for example, that would blow up exponentially in time with the addition of each new particle. In that sense, nearly all such programs will fall into the longest category. (EDIT: To be more explicit, let's say the integer N characterizing the size of the problem has some threshold N0 above which the problem ends up in category 3. ~100% of integers are bigger than N0, so to first order, all programs take longer.)

The one caveat is that the computer has finite space, so I can't just make these programs consider bigger and bigger problems without bound; otherwise 100% (in the measure-theoretic sense) of the programs fall into the last category. But in the real world that space limit is a concern, so I'd guess something like 95% of "interesting" programs, as a lower bound, fall into the last category.

[/ QUOTE ]

You are right and this remains true with quantum computers.
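(To put a number on the exponential blowup being quoted: just storing the state vector of N two-level particles in a classical simulation takes 2^N complex amplitudes. A rough sketch, assuming 16 bytes per amplitude:)

[ CODE ]
# Memory to store the full quantum state of N two-level particles:
# 2^N complex amplitudes at an assumed 16 bytes each (classical simulation)
for n in [10, 20, 30, 40, 50]:
    bytes_needed = (2 ** n) * 16
    print(f"N = {n:2d}:  {bytes_needed:.2e} bytes")
[/ CODE ]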

madnak 11-30-2007 01:57 AM

Re: Hypothetical CS type question...
 
[ QUOTE ]
Things I'm interested in tend to grow factorially, so even on a log scale we will still see some serious growth for those problems. This could be a function of my interests, though, and the whole set of programs may not tend to show that growth pattern.

[/ QUOTE ]

I think you're right here. It's hard to categorize these things. I mean, solving checkers, building a rainbow table, rendering a video: these all fall within the middle range, and I'm sure there are many more examples. And if computers continue to get more powerful, the whole "window" will change. But maybe there is a gap. I don't know what that would imply. Coincidence? Or are we "missing something"?

[ QUOTE ]
Also, at least for right now, the difference between something like 10k and 10 million years to solve is, for all intents and purposes, trivial, inasmuch as neither can produce the data we want, and thus both fail at their sole purpose. (Obviously, to a certain extent, I'm taking a selfish viewpoint w/ that statement, since it's reasonable to believe that, at some point, those programs will be runnable in normal time, and, of course, at some point the difference will no longer be trivial.)

[/ QUOTE ]

Right, but the difference between 1 microsecond and 1 millisecond is also trivial to us, because either one produces the data effectively instantaneously. Time scales of hours and days are more significant to us because those are the time scales we deal with in our lives. Exceptions abound on the small end of things, and would abound on the large end too if we were immortal.

