#1
playing God
Let's fast forward a couple of centuries in technology, and suppose you have access to (for our purposes) nearly unlimited computing power. Suppose you write a computer program that simulates a world with rules under which life is likely to evolve. You then run this program, and sure enough, after a while life evolves in your computer universe -- and eventually (to your great interest; let's say you're a grad student studying such things) intelligent life, which spends a good deal of its time communicating with other members of its species, building civilizations and lives for itself, trying to understand the universe it inhabits, and perhaps going to war over resources from time to time.
Since you are completely omnipotent over this universe, what are your responsibilities to these life forms, and at what point do you become responsible? And if your computing power should, at the end of the semester, be required by the university for other purposes and projects, are you prepared to break any rules to prevent your program from being deleted?
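As a toy illustration of the kind of "world with rules" the opening post imagines (this sketch is not from the thread, and is nothing like a program that could evolve intelligence), consider Conway's Game of Life: a handful of fixed update rules, run repeatedly over a grid, producing surprisingly lifelike patterns.

```python
from collections import Counter

# Minimal sketch of a rule-based "toy universe": Conway's Game of Life.
# A live cell survives with 2 or 3 live neighbors; a dead cell becomes
# live with exactly 3. Vastly simpler than anything that could evolve
# intelligence, but the same idea: simple rules iterated on a computer.

def step(live):
    """Advance one generation. `live` is a set of (x, y) cell coordinates."""
    # Count live neighbors for every cell adjacent to at least one live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {
        cell
        for cell, n in counts.items()
        if n == 3 or (n == 2 and cell in live)
    }

# A "blinker": three cells in a row oscillate with period 2.
blinker = {(0, 1), (1, 1), (2, 1)}
print(step(step(blinker)) == blinker)  # True
```

Running `step` in a loop is the whole "universe"; the thread's question is what, if anything, you would owe the inhabitants of a vastly richer version of such a program.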
#2
Re: playing God
[ QUOTE ]
Since you are completely omnipotent over this universe, what are your responsibilities to these life forms, and at what point do you become responsible?
[/ QUOTE ]
We're talking about a computer program here, right? If so, you have no responsibilities to these "life forms," because they're models that exist in a virtual world. If they were real, living beings, that would be another story.
[ QUOTE ]
And if your computing power should, at the end of the semester, be required by the university for other purposes and projects, are you prepared to break any rules to prevent your program from being deleted?
[/ QUOTE ]
It doesn't seem like a big deal; it's just a computer program. Unless this program is being used for something other than enjoyment, it should probably be deleted (especially if it's running on the university's equipment). There doesn't seem to be a reason to break rules in order to save it.
#3
Re: playing God
[ QUOTE ]
Let's fast forward a couple of centuries in technology, and suppose you have access to (for our purposes) nearly unlimited computing power. Suppose you write a computer program that simulates a world with rules under which life is likely to evolve. You then run this program, and sure enough, after a while life evolves in your computer universe -- and eventually intelligent life, which spends a good deal of its time communicating with other members of its species, building civilizations and lives for itself, trying to understand the universe it inhabits, and perhaps going to war over resources from time to time. Since you are completely omnipotent over this universe, what are your responsibilities to these life forms, and at what point do you become responsible?
[/ QUOTE ]
I don't think you're responsible exactly, but I'd probably be pretty pleased with myself and I'd want to take responsibility. I'd also steal or somehow keep this program, 100% for sure, regardless of what the university wanted to do with it.
Edit: this kinda makes me want to play Sim City... weird, huh?
#4
Re: playing God
You seem to be making a distinction between information encoded in "computer" degrees of freedom and information encoded in "natural" degrees of freedom. But of course if it's the same information, it's every bit as real (insert Morpheus quote here). The only difference is that you have complete and utter control of all the computer degrees of freedom, and no comparable control over natural ones, since you yourself are encoded in natural degrees of freedom.
Still, I accept your answer completely -- you're god of this universe, and you can do whatever you like with zero consequences.
#5
Re: playing God
By "responsibility" I mean: what moral obligations would you feel to the intelligences in your program? Would you, for example, have any qualms about randomly occurring "natural disasters" killing some of the children of the species, and feel compelled to halt the program and erase the effect each time it happened?
#6
Re: playing God
[ QUOTE ]
You seem to be making a distinction between information encoded in "computer" degrees of freedom and information encoded in "natural" degrees of freedom. But of course if it's the same information, it's every bit as real (insert Morpheus quote here).
[/ QUOTE ]
That's not necessarily true. It's not a position I accept personally - sometimes (especially with life) the medium is the message.
At the same time, this isn't a computer simulation of life. The properties of life have arisen spontaneously as emergent properties of the core components of the program. They are probably a bit different from the properties of life in our universe, but I don't know of any way to conclude that these "people" are any less "alive" than we are -- or any less sentient.
I'd probably still turn off the program. See, I'm not actually omnipotent in this universe, and more importantly I'm not omniscient. I don't believe it's possible for me to make life a happy experience for all of them, and I don't want to be responsible for what is likely to be a universe of suffering. But I don't think this is a moral position per se.
There are plenty of things that would convince me to keep the experiment going. For example, if this were the very first form of artificial intelligence, it would represent an entirely new community of thinkers who could contribute to our own advancement. In fact, if time passes more quickly in the artificial universe, the people there may exceed our creative and intellectual capacities before long. They might even discover the nature of their universe, find a way to update the program from within, or even find a way into our own world.
#7
Re: playing God
Suppose that I, Metric, being a supreme a-hole, decided to halt the program while you were at lunch, and communicated with the computer beings, telling them that their creator, madnak, intends to destroy their universe and all of them with it at some roughly specified point in their future -- unless, of course, they could convince you otherwise. What would they be justified in thinking about you? And would you care?
#8
Re: playing God
[ QUOTE ]
Since you are completely omnipotent over this universe, what are your responsibilities to these life forms, and at what point do you become responsible?
[/ QUOTE ]
Responsibilities to what, the deterministic epiphenomena of a machine? None.
[ QUOTE ]
And if your computing power should, at the end of the semester, be required by the university for other purposes and projects, are you prepared to break any rules to prevent your program from being deleted?
[/ QUOTE ]
There's no need. Just load it again later.
#9
Re: playing God
This seems to gloss over the nature of 'attachment/responsibility'. The starving African child is an example from this world.
It's difficult to say how a person would react in such a situation... look how whacked out some people get when you knock over their butterfly collection.
luckyme
#10
Re: playing God
Doesn't have to be deterministic. Could in principle be a quantum computer running "standard model" software, in which case it would be (from the inside) indistinguishable from a piece of our own universe. But I accept your decision as god of your particular universe -- no right or wrong answer here, of course.