#1
Re: Hypothetical CS type question...
[ QUOTE ]
[ QUOTE ] mad/gump, how does your answer change when the problem is changed to "fit the set of programs that people wished to / did program over the past 5 years (still only computation programs) into those 3 categories"? I guess you could say I'm curious about the practical answer to what is a very theoretical question. The problem w/ wording it this second way is that there is considerable skew away from groups 2/3 with actually written programs, and probably some even with wished-for programs. [/ QUOTE ]

Oh, what do you mean by "wished for," exactly? I still think it would be 1, but... would a program that successfully outputs the correct answer to the question "how can we achieve cold fusion" count as one of the "wished for" programs? [/ QUOTE ]

hahaha well, I think it's fair to say you would need to be able to, at least in theory, devise the algorithm for the program to count here. So (and I could be wrong about this, since physics is my weakest science, and I don't know much beyond the basics about CF), I think this doesn't count. I think a good example of a wished-for program that has never been coded is a program that solves chess.
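A quick back-of-envelope sketch of why "solving chess" sits so far into category 3. The figures are assumptions, not settled numbers: the count of legal chess positions is often estimated at roughly 10^44, and 10^9 positions per second is a generous guess for a single machine.

```python
# Rough sketch: time to examine every legal chess position.
# Assumes ~10^44 legal positions (a commonly cited rough estimate)
# and a hypothetical machine checking 10^9 positions per second.
positions = 10**44
per_second = 10**9

seconds = positions / per_second
years = seconds / (60 * 60 * 24 * 365)   # on the order of 10^27 years
print(f"{years:.1e} years")
```

Even with these optimistic assumptions, the result is a runtime wildly beyond any human (or cosmic) timescale, which is what puts it in the "wished for but never coded" bucket.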
#2
Re: Hypothetical CS type question...
FWIW, I tend to do a fair amount of programming in my spare time, and this question sort of stems from my observation that, in general, tasks I want to solve almost always fall into either group 1 or group 3. Most of the time, a computation takes either fewer than 20 minutes or (by best estimates) so long that I would need to live thousands or millions of years to see the result.
It appears that, at least for the tasks I'm interested in, the growth in the graph of programs vs. time is very, VERY fast once you get past 1 hour.
#3
Re: Hypothetical CS type question...
Oh yeah, my last post made me think of the Asimov story boris posted - The Last Question.
I think the growth is probably stable if you use the right scale. For instance, if you use orders of magnitude then it goes 1 hour, 10 hours, 100 hours, 1,000 hours, 10,000 hours, so it's only five "steps" really. Whereas you have many "steps" under 1 hour (down to tiny fractions of a second) and over 1 hour (up to trillions of years). So there's an illusion of "clustering" if your scale is ordinary (I don't know the terminology, but if each interval is n instead of 10^n). Though I think the growth is still very slow after the 1-year point. The difference between a 1,000-year program and a 1,000,000-year program isn't trivial, right?

Another problem is that supercomputers have already solved many of the mid-range problems. Stuff like "find the first n primes" has also gotten out of hand, because distributed computing projects have already found the answer for a large n. I do think rendering typically falls in the middle range, and it seems to fit the criteria.
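The "steps on a log scale" idea above can be sketched in a few lines: a runtime of t hours lands in bucket floor(log10(t)), so the whole 1-hour-to-10,000-hour range spans only five buckets, while sub-second runtimes would spread across many more.

```python
import math

# Each runtime (in hours) falls into an order-of-magnitude bucket.
def bucket(hours):
    return math.floor(math.log10(hours))

runtimes = [1, 10, 100, 1_000, 10_000]          # hours
steps = sorted({bucket(t) for t in runtimes})
print(steps)        # -> [0, 1, 2, 3, 4]: only five "steps"
```

On a linear scale those same runtimes would look wildly spread out, which is the "illusion of clustering" being described.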
#4
Re: Hypothetical CS type question...
mad,
v nice posting in this thread, though I think I might take issue w/:

[ QUOTE ] So there's an illusion of "clustering" if your scale is ordinary (I don't know the terminology, but if each interval is n instead of 10^n). Though I think the growth is still very slow after the 1-year point. [/ QUOTE ]

Things I'm interested in tend to grow factorially. So even on a log scale we will still see some serious growth for those problems. This could be a function of my interests, though, and the whole set of programs may not tend to show that growth pattern.

Also, at least for right now, the difference between something like 10k and 10 million years to solve is, for all intents and purposes, trivial, inasmuch as neither can produce the data we want, and thus each fails at its sole purpose. (Obviously, to a certain extent, I'm taking a selfish viewpoint w/ that statement, since it's reasonable to believe that, at some point, those programs will be runnable in normal time, and, of course, at some point the difference will no longer be trivial.)
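The factorial-growth point can be illustrated concretely. The sketch below assumes a hypothetical brute-force task over all n! orderings of n items, at an assumed 10^9 checks per second; a handful of extra items pushes the runtime from effectively instant to far past a human lifetime.

```python
import math

# Illustrative only: years to brute-force all n! orderings of n items,
# assuming a machine doing 10^9 checks per second.
def brute_force_years(n, checks_per_second=10**9):
    seconds = math.factorial(n) / checks_per_second
    return seconds / (60 * 60 * 24 * 365)

for n in (12, 16, 20, 24):
    print(n, f"{brute_force_years(n):.2e} years")
```

n = 12 finishes in well under a second, while n = 24 already lands in the millions-of-years range, so on a log scale the curve still climbs steeply rather than flattening out.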
#5
Re: Hypothetical CS type question...
[ QUOTE ]
Things I'm interested in tend to grow factorially. So even on a log scale we will still see some serious growth for those problems. This could be a function of my interests, though, and the whole set of programs may not tend to show that growth pattern. [/ QUOTE ]

I think you're right here. It's hard to categorize these things. I mean, solving checkers, building a rainbow table, rendering a video: these all fall within the middle range, and I'm sure there are many more examples. And if computers continue to get more powerful, the whole "window" will change. But maybe there is a gap. I don't know what that would imply... coincidence? Or are we "missing something"?

[ QUOTE ] Also, at least for right now, the difference between something like 10k and 10 million years to solve is, for all intents and purposes, trivial, inasmuch as neither can produce the data we want, and thus each fails at its sole purpose. (Obviously, to a certain extent, I'm taking a selfish viewpoint w/ that statement, since it's reasonable to believe that, at some point, those programs will be runnable in normal time, and, of course, at some point the difference will no longer be trivial.) [/ QUOTE ]

Right, but the difference between 1 microsecond and 1 millisecond is also trivial to us, because it produces the data effectively instantaneously. Time scales of hours and days are more significant to us because those are the time scales we deal with in our lives. Exceptions abound on the small end of things, and would abound on the large end if we were immortal.