#1
post-singularity evolution
So let's assume a sort of "Ray Kurzweil" exponential growth of technology over the next couple hundred years. At some point, human intelligence merges with machine intelligence, and eventually the physical, biological human brain will become just excess weight. At this point, humanity as we know it may be on the road to extinction, outclassed in every way by a new species of unimaginably superior intelligence.
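As an aside on the premise, "exponential growth" has a concrete arithmetic consequence that's easy to check. A back-of-the-envelope sketch (the two-year doubling period here is my own illustrative assumption, not a figure from Kurzweil):

```python
def growth_factor(years, doubling_period):
    """Total growth after `years` if capability doubles every `doubling_period` years."""
    return 2.0 ** (years / doubling_period)

# A couple hundred years at an assumed two-year doubling:
# 2^100, roughly 1.3e30 -- about thirty orders of magnitude.
print(f"{growth_factor(200, 2):.2e}")
```

Whatever the real doubling period turns out to be, the point is that "a couple hundred years" of compounding buys a factor so large that extrapolating current biology across it is hopeless.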
The question then is, what guides the evolution of this new species, and its own descendants? Obviously the concepts of mutation, competition, reproduction, selection, speciation, etc. become radically altered for the new species, which is free to alter its own programming at will. Still, I somehow suspect there to be a logically simple principle or set of principles (analogous to what we call "evolution") that guides such a life form and "selects" certain dominant character traits -- the question is, can we prove that this is the case under very reasonable assumptions, discover what the principles will be, and even discover what the dominant character traits are likely to be, centuries or millennia in advance?
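Just to pin down the vocabulary for the classical case: the whole mutation/competition/reproduction/selection loop fits in a few lines. Everything below (bit-string genomes, "count the 1s" fitness, truncation selection, the mutation rate) is a toy assumption for illustration, not a claim about what post-singularity "fitness" would look like:

```python
import random

def evolve(pop_size=20, genome_len=16, generations=50, mut_rate=0.05, seed=0):
    rng = random.Random(seed)
    fitness = sum  # toy fitness: number of 1-bits in the genome
    # random initial population of bit-string genomes
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        # selection / competition: only the fitter half survives
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        # reproduction with mutation: each survivor copies itself imperfectly
        children = [[1 - g if rng.random() < mut_rate else g for g in parent]
                    for parent in survivors]
        pop = survivors + children
    return max(fitness(g) for g in pop)
```

The OP's point, restated in these terms: a self-modifying species gets to rewrite `mut_rate`, `fitness`, and the selection rule themselves, and the open question is whether some simple invariant still constrains that outer loop.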
#2
Re: post-singularity evolution
Man that's really hard to say. Human civilization (which moves much faster than evolution) is basically driven by seemingly boundless wants and a desire to improve. I'd like to say the same of machines, but we can't even estimate what AI praxeology is like.
Very wicked cool concept to think about though :)
#3
Re: post-singularity evolution
I agree that it's a bit like starting from scratch -- still, the ability to say ANYTHING about the principles guiding such a scenario (which could actually occur in the not-too-distant future, and give rise to an entirely new branch of science) would seem profound and worthwhile to know.
#4
Re: post-singularity evolution
I think the reason Kurzweil uses the term "singularity" is because it's impossible to "see beyond" it. If I accept his assumptions about the singularity, I also accept his conclusions that the results would be something we can't even imagine.
#5
Re: post-singularity evolution
[ QUOTE ]
I think the reason Kurzweil uses the term "singularity" is because it's impossible to "see beyond" it. If I accept his assumptions about the singularity, I also accept his conclusions that the results would be something we can't even imagine. [/ QUOTE ]
(Completely off-topic) Do you?
#6
Re: post-singularity evolution
But isn't there some kind of immortality deal in there? Which really screws up evolution.
Man, I can't stand Kurzweil's crap. (Sorry Metric, I think it's a good thread, I just have to say that in any Kurzweil discussion where I'm not already criticizing him)
#7
Re: post-singularity evolution
As the human race goes extinct and gives way to the new species, we will lose our instinct for emotion as we increase our instinct for reason (eventually, the world will be owned by beings who take no joy or satisfaction in what they're doing, but who also have no shortcomings in meeting their ends -- hence, no heartache or frustration either). The only goal of this species will be to survive, and its evolution will be guided in a way that sees reasoning capacity as the optimal way to survive. The machines will see animal life as a potential threat to their existence, and will seek to rid existence of all animal life and all potential for animal life. Beyond that, they will see weather patterns and physical bodies like asteroids as the next most rational threats, and figure out how to control those things. Their evolution will work in a way that sees fellow machines with relatively weak mental capacity as useless, and such machines will be destroyed (these beings will have little and eventually no emotion or empathy, so there will be no rational reason not to destroy weak members).
I actually wrote a little about this in my blog (in my profile) if anyone cares.
#8
Re: post-singularity evolution
Very little will change, I assure you. I should know. ;)
Back in the 21st, oh, yes, we are, back in the 21st... Such a glorious, explosive, violent century. It ended, though.

Anyway, on a more serious note, Kurzweil seems to have a cult following with idealistic, misguided antisocial geeks. And this is far from being an insult to some of you, I hope you understand what I mean here. The singularity isn't a certain point in time. It already happened. As for man-machine mergers and post-biological solutions, don't count on that. Computers and the web will become, for lack of a better definition, the ultimate tool of psychiatry. The software and hardware will merely mesh into the fabric of reality, much as phones and lights already have. The structures are being built anyway, and that's kinda NDAish.

Anyway, it'll allow for a benevolent democratic Terra with a quiet, unobtrusive superclass that works in concert with dataprocessors. Think of it as a softer, gentler 1984. Done and done. Anyway. Libertiatians can go [censored] themselves if they don't like it. There are simply too many billions that deserve the compassion and the lifting out of poverty. America, the unclear nuclear. Do you really think it would be otherwise?

The wages of mortality are fleeting, and the reckoning comes in our time. No pretty colors today, gentlemen and ladies. :P Just the bare truth. It was necessary. The equations were creeping out of control. And above all, it is built on love and has absorbed millennia of history and pain. So it flipped the [censored] thing on humanity. Have a great week. K.
#9
Re: post-singularity evolution
[ QUOTE ]
[ QUOTE ] I think the reason Kurzweil uses the term "singularity" is because it's impossible to "see beyond" it. If I accept his assumptions about the singularity, I also accept his conclusions that the results would be something we can't even imagine. [/ QUOTE ] (Completely off-topic) Do you? [/ QUOTE ]
Not remotely.
#10
Re: post-singularity evolution
[ QUOTE ]
Anyway. Libertiatians can go [censored] themselves if they don't like it. There are simply too many billions that deserve the compassion and the lifting out of poverty. [/ QUOTE ]
Those crazy Libertiatiatiashuns! People are DYING!! ON OUR STREETS!!!!