#1
post-singularity evolution
So let's assume a "Ray Kurzweil"-style exponential growth of technology over the next couple hundred years. At some point, human intelligence merges with machine intelligence, and eventually the physical, biological human brain becomes just excess weight. At that point, humanity as we know it may be on the road to extinction, outclassed in every way by a new species of unimaginably superior intelligence.
The question then is: what guides the evolution of this new species, and of its own descendants? Obviously the concepts of mutation, competition, reproduction, selection, speciation, etc. become radically altered for a species that is free to alter its own programming at will. Still, I somehow suspect there to be a logically simple principle, or set of principles (analogous to what we call "evolution"), that guides such a life form and "selects" certain dominant character traits. The question is: can we prove that this is the case under very reasonable assumptions, discover what the principles will be, and even discover what the dominant character traits are likely to be, centuries or millennia in advance?
#2
Re: post-singularity evolution
Man, that's really hard to say. Human civilization (which moves much faster than evolution) is basically driven by seemingly boundless wants and a desire to improve. I'd like to say the same of machines, but we can't even estimate what AI praxeology would be like.
Very wicked cool concept to think about, though. :)
#3
Re: post-singularity evolution
I agree that it's a bit like starting from scratch. Still, the ability to say ANYTHING about the principles guiding such a scenario (which could actually occur in the not-too-distant future, and give rise to an entirely new branch of science) would seem profound and worthwhile.
#4
Re: post-singularity evolution
I think the reason Kurzweil uses the term "singularity" is that it's impossible to "see beyond" it. If I accept his assumptions about the singularity, I also have to accept his conclusion that the results would be something we can't even imagine.
#5
Re: post-singularity evolution
[ QUOTE ]
I think the reason Kurzweil uses the term "singularity" is because it's impossible to "see beyond" it. If I accept his assumptions about the singularity, I also accept his conclusions that the results would be something we can't even imagine.
[/ QUOTE ]
(Completely off-topic) Do you?
#6
Re: post-singularity evolution
[ QUOTE ]
[ QUOTE ]
I think the reason Kurzweil uses the term "singularity" is because it's impossible to "see beyond" it. If I accept his assumptions about the singularity, I also accept his conclusions that the results would be something we can't even imagine.
[/ QUOTE ]
(Completely off-topic) Do you?
[/ QUOTE ]
Not remotely.
#7
Re: post-singularity evolution
But isn't there some kind of immorality deal in there? Which really screws up evolution.
Man, I can't stand Kurzweil's crap. (Sorry Metric, I think it's a good thread, I just have to say that in any Kurzweil discussion where I'm not already criticizing him)
#8
Re: post-singularity evolution
[ QUOTE ]
But isn't there some kind of immorality deal in there? Which really screws up evolution.

Man, I can't stand Kurzweil's crap. (Sorry Metric, I think it's a good thread, I just have to say that in any Kurzweil discussion where I'm not already criticizing him)
[/ QUOTE ]
Did you mean 'immortality'? I don't see why immortality (relative immortality, not infinite) necessarily causes any problems for evolution. It would certainly change the pressures.
#9
Re: post-singularity evolution
[ QUOTE ]
[ QUOTE ]
But isn't there some kind of immorality deal in there? Which really screws up evolution.

Man, I can't stand Kurzweil's crap. (Sorry Metric, I think it's a good thread, I just have to say that in any Kurzweil discussion where I'm not already criticizing him)
[/ QUOTE ]
Did you mean 'immortality'? I don't see why immortality (relative immortality, not infinite) necessarily causes any problems for evolution. It would certainly change the pressures.
[/ QUOTE ]
I guess if everyone is immortal I see a problem with certain aspects of selection (but not all of them).
#10
Re: post-singularity evolution
I should clarify that I'm not taking all of the Kurzweil stuff at face value by any means. However, I don't think the interface of machine intelligence with the human brain is too far out; in many ways it's the obvious endpoint of computer technology. But once that happens, the "new species" is pretty much guaranteed to arrive sooner or later, hence the question that interests me.