playing God


Metric
02-16-2007, 02:53 PM
Let's fast-forward a couple of centuries in technology, and suppose you have access to (for our purposes) nearly unlimited computing power. Suppose you write a computer program that simulates a world with rules under which life is likely to evolve. You then run this program, and sure enough, after a while life evolves in your computer universe, and eventually (to your great interest -- let's say you're a grad student studying such things) intelligent life emerges which spends a good deal of its time communicating with other members of its species, building civilizations and lives for itself, trying to understand the universe it inhabits, and perhaps going to war over resources from time to time.
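
(For concreteness, the crudest caricature of such a program is just a mutation-plus-selection loop. A real version would let fitness emerge from the simulated world's own rules rather than hard-coding a target; every name and constant below is invented purely for illustration.)

[ CODE ]
import random

GENOME_LEN = 32
TARGET = [1] * GENOME_LEN  # stand-in "environment"; a real sim would have no explicit target

def fitness(genome):
    # more bits matching the target -> more offspring
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.01):
    # flip each bit independently with small probability
    return [1 - g if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(100)]
for generation in range(200):
    weights = [fitness(g) + 1 for g in population]  # +1 keeps every weight positive
    population = [mutate(random.choices(population, weights=weights)[0])
                  for _ in range(len(population))]

print("best fitness after 200 generations:", max(fitness(g) for g in population))
[/ CODE ]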

Since you are completely omnipotent over this universe, what are your responsibilities to these life forms, and at what point do you become responsible?

And if your computing power should, at the end of the semester, be required by the university for other purposes and projects, are you prepared to break any rules to prevent your program from being deleted?

abridge
02-16-2007, 04:23 PM
[ QUOTE ]
Since you are completely omnipotent over this universe, what are your responsibilities to these life forms, and at what point do you become responsible?

[/ QUOTE ]
We're talking about a computer program here, right?
If so, you have no responsibilities to these "life forms" - because they're models that exist in a virtual world. If they were real, living beings, that would be another story.

[ QUOTE ]
And if your computing power should, at the end of the semester, be required by the university for other purposes and projects, are you prepared to break any rules to prevent your program from being deleted?

[/ QUOTE ]
It doesn't seem like a big deal; it's just a computer program. Unless this program is being used for something other than enjoyment, it should probably be deleted (especially if it's running on the university's equipment). There doesn't seem to be a reason to break rules in order to save it.

mbillie1
02-16-2007, 04:27 PM
[ QUOTE ]
Let's fast-forward a couple of centuries in technology, and suppose you have access to (for our purposes) nearly unlimited computing power. Suppose you write a computer program that simulates a world with rules under which life is likely to evolve. You then run this program, and sure enough, after a while life evolves in your computer universe, and eventually (to your great interest -- let's say you're a grad student studying such things) intelligent life emerges which spends a good deal of its time communicating with other members of its species, building civilizations and lives for itself, trying to understand the universe it inhabits, and perhaps going to war over resources from time to time.

Since you are completely omnipotent over this universe, what are your responsibilities to these life forms, and at what point do you become responsible?

[/ QUOTE ]

I don't think you're responsible exactly, but I'd probably be pretty pleased with myself and I'd want to take responsibility. I'd also steal/somehow keep this program for 100% sure, regardless of what the university wanted to do with it.

Edit: this kinda makes me want to play Sim City... weird huh?

Metric
02-16-2007, 05:02 PM
You seem to be making a distinction between information encoded in "computer" degrees of freedom and information encoded in "natural" degrees of freedom. But of course if it's the same information, it's every bit as real (insert Morpheus quote here). The only difference is that you have complete and utter control of all the computer degrees of freedom, and no comparable control over natural ones, since you yourself are encoded in natural degrees of freedom.

Still, I accept your answer completely -- you're god of this universe, and you can do whatever you like with zero consequences.

Metric
02-16-2007, 05:07 PM
By "responsibility" I mean what moral obligations would you feel to the intelligences in your program? Would you, for example, have any qualms about randomly occuring "natural disasters" killing some of the children of the species, and feel compelled to halt the program and erase the effect each time it happened?

madnak
02-16-2007, 07:18 PM
[ QUOTE ]
You seem to be making a distinction between information encoded in "computer" degrees of freedom and information encoded in "natural" degrees of freedom. But of course if it's the same information, it's every bit as real (insert Morpheus quote here).

[/ QUOTE ]

That's not necessarily true. It's not a position I accept personally - sometimes (especially with life) the medium is the message.

At the same time, this isn't a computer simulation of life. The properties of life have arisen spontaneously and are emergent properties of the core components of the program. They probably are a bit different from the properties of life in our universe. But I don't know that there's any way to conclude that these "people" are any less "alive" than we are. Or any less sentient.

I'd probably still turn off the program. See, I'm not actually omnipotent in this universe, and more importantly I'm not omniscient. I don't believe it's possible for me to make life a happy experience for all of them, and I don't want to be responsible for what is likely to be a universe of suffering.

But I don't think this is a moral position per se. There are plenty of things that would convince me to keep the experiment going. For example, if this were the very first form of artificial intelligence, then it would represent an entirely new community of thinkers who could contribute to our own advancement. In fact, if time passes more quickly in the artificial universe, then the people there may exceed our creative and intellectual capacities before long. They might even discover the nature of their universe, find a way to update the program from within, or even find a way into our own world.

Metric
02-16-2007, 08:20 PM
Suppose that I, Metric, being a supreme a-hole, decided to halt the program while you were at lunch, and communicated with the computer beings, telling them that their creator, madnak, intends to destroy their universe and all of them with it at some roughly specified point in their future -- unless, of course, they could convince you otherwise. What would they be justified in thinking about you? And would you care?

Skidoo
02-16-2007, 08:36 PM
[ QUOTE ]
Since you are completely omnipotent over this universe, what are your responsibilities to these life forms, and at what point do you become responsible?

[/ QUOTE ]

Responsibilities to what, the deterministic epiphenomena of a machine? None.

[ QUOTE ]
And if your computing power should, at the end of the semester, be required by the university for other purposes and projects, are you prepared to break any rules to prevent your program from being deleted?

[/ QUOTE ]

There's no need. Just load it again later.

luckyme
02-16-2007, 08:38 PM
This seems to gloss over the nature of 'attachment/responsibility'. The starving African child is an example from this world.

It's difficult to say how a person would react in such a situation... look how whacked out some people get when you knock over their butterfly collection.

luckyme

Metric
02-16-2007, 08:47 PM
Doesn't have to be deterministic. Could in principle be a quantum computer running "standard model" software, in which case it would be (from the inside) indistinguishable from a piece of our own universe. But I accept your decision as god of your particular universe -- no right or wrong answer here, of course.

Skidoo
02-16-2007, 09:13 PM
If you mean that the indeterminacy of the standard model would be included in the simulation, I'd say whatever status the "beings" would have otherwise had as autonomous and self-determining would be reduced by doing so.

Metric
02-16-2007, 10:39 PM
[ QUOTE ]
If you mean that the indeterminacy of the standard model would be included in the simulation, I'd say whatever status the "beings" would have otherwise had as autonomous and self-determining would be reduced by doing so.

[/ QUOTE ]
Increased, reduced -- I don't care. I simply mean that their status as sentient, self-determining beings can be made identical to ours. Except, of course, that you are their god.

Skidoo
02-16-2007, 10:51 PM
OK, I won't argue with the premise of your OP that a (partially indeterminate) computational process can have subjective awareness.

madnak
02-16-2007, 11:18 PM
[ QUOTE ]
Suppose that I, Metric, being a supreme a-hole, decided to halt the program while you were at lunch, and communicated with the computer beings, telling them that their creator, madnak, intends to destroy their universe and all of them with it at some roughly specified point in their future -- unless, of course, they could convince you otherwise. What would they be justified in thinking about you? And would you care?

[/ QUOTE ]

Ugh. Well, I don't think they'd be justified in coming to any conclusions. But they probably would.

I'd definitely care, because now I have a relationship with them. That indicates responsibility. I think if my only goal were to minimize suffering, I'd still zap them, but as it is I'd feel an obligation to work with them to improve things. Maybe take care of a few problems, and give them some way to "apply" for divine interventions. I'd also have to make sure that they didn't get deleted.

And I'd have to make you pay, sometime and somehow, when you were least expecting it.

SitNHit
02-17-2007, 07:52 AM
What would be the reason for these beings, what would be their purpose? Is there a certain goal for them? Is there a universe in this place? Is there a history previous to them existing? I think these people in this world would start to wonder after a while why the heck they exist.

Great concept for a game, if it were possible to produce a game like this.

I would think if you said that everybody is responsible for the decisions they make, then you have no responsibility to them.

Metric
02-17-2007, 10:23 AM
[ QUOTE ]

Ugh. Well, I don't think they'd be justified in coming to any conclusions. But they probably would.

I'd definitely care, because now I have a relationship with them. That indicates responsibility. I think if my only goal were to minimize suffering, I'd still zap them, but as it is I'd feel an obligation to work with them to improve things. Maybe take care of a few problems, and give them some way to "apply" for divine interventions. I'd also have to make sure that they didn't get deleted.

[/ QUOTE ]
Fascinating... So in this universe we have a god that doesn't do too much in the way of direct interaction (maybe occasionally -- he does care about his "children" on some level), but with whom "belief/relationship" has become pretty much the road to salvation. Now that I think about it, it might even be +EV for the life forms there to kill off anyone in their community denying that "there is only one madnak and Metric is his prophet" to prevent atheistic or antimadnak views from becoming dominant, which would place their civilization at much greater risk of eventually getting unplugged.

revots33
02-17-2007, 10:35 AM
[ QUOTE ]
Since you are completely omnipotent over this universe, what are your responsibilities to these life forms, and at what point do you become responsible?

[/ QUOTE ]

How about if you created a secondary program that would inflict eternal torture on any of the life forms who did not worship their programmer?

And how about you created the program in such a way that the life forms have no way of knowing who the programmer is?

Then I'd say you have some ethical explaining to do.

Metric
02-17-2007, 10:48 AM
It's clear that they don't have any way of knowing the programmer, unless you intervene to cause them to. But I take it you're good with anything that doesn't involve eternal torture?

madnak
02-17-2007, 11:16 AM
[ QUOTE ]
Fascinating... So in this universe we have a god that doesn't do too much in the way of direct interaction (maybe occasionally -- he does care about his "children" on some level), but with whom "belief/relationship" has become pretty much the road to salvation. Now that I think about it, it might even be +EV for the life forms there to kill off anyone in their community denying that "there is only one madnak and Metric is his prophet" to prevent atheistic or antimadnak views from becoming dominant, which would place their civilization at much greater risk of eventually getting unplugged.

[/ QUOTE ]

No, when I say I'd "take care of a few problems" I mean I'd put a stop to that kind of thing, as well as disease and poverty and all that. Give them some wells of infinite resources, stuff like that.

Metric
02-17-2007, 11:22 AM
Would you do that because you think anything less is unethical, or just because you're a good guy and the poor bastards love you so much?

revots33
02-17-2007, 12:29 PM
[ QUOTE ]
It's clear that they don't have any way of knowing the programmer, unless you intervene to cause them to. But I take it you're good with anything that doesn't involve eternal torture?

[/ QUOTE ]

No, I think even temporary torture (for ex. an African person starving to death) is wrong, if you could just write a couple lines of code and feed him.

Metric
02-17-2007, 12:43 PM
So at what point in the evolutionary development of this computer intelligence do you begin to feel that the only moral thing to do is to actively prevent all suffering?

Matt R.
02-17-2007, 12:43 PM
[ QUOTE ]
[ QUOTE ]
It's clear that they don't have any way of knowing the programmer, unless you intervene to cause them to. But I take it you're good with anything that doesn't involve eternal torture?

[/ QUOTE ]

No, I think even temporary torture (for ex. an African person starving to death) is wrong, if you could just write a couple lines of code and feed him.

[/ QUOTE ]

In such a program, where everything is given to these algorithmically evolved sentient beings and they do not experience pain, would they be able to experience pleasure? Would they even be able to understand what happiness is, if they never experience any sort of discomfort? Would they appreciate their world? Why or why not?

madnak
02-17-2007, 01:20 PM
[ QUOTE ]
Would you do that because you think anything less is unethical, or just because you're a good guy and the poor bastards love you so much?

[/ QUOTE ]

Out of a sense of obligation, so I suppose the former.

madnak
02-17-2007, 01:34 PM
[ QUOTE ]
In such a program, where everything is given to these algorithmically evolved sentient beings and they do not experience pain, would they be able to experience pleasure? Would they even be able to understand what happiness is, if they never experience any sort of discomfort? Would they appreciate their world? Why or why not?

[/ QUOTE ]

That's a lot of different questions. Would they be able to experience pleasure? Of course. There's absolutely no valid reason to consider pleasure as contingent on pain. We don't even experience the two in the same way neurologically (in general). And there have been plenty of people who've experienced extreme pain but never extreme pleasure, or vice versa.

Would they understand happiness? Probably not to the extent that we do - how does it go? "If you have experienced hunger, you know that having food is a miracle. If you have suffered from the cold, you know the preciousness of warmth."

But in terms of appreciating the world, I think they definitely would. For the people (and cultures) who genuinely have it good, thankfulness for life is a common theme. Of course, you see the same thing in people who have suffered and then seen an end to that suffering; it's hard to say which is stronger.

Metric
02-17-2007, 01:45 PM
[ QUOTE ]
[ QUOTE ]
In such a program, where everything is given to these algorithmically evolved sentient beings and they do not experience pain, would they be able to experience pleasure? Would they even be able to understand what happiness is, if they never experience any sort of discomfort? Would they appreciate their world? Why or why not?

[/ QUOTE ]

That's a lot of different questions. Would they be able to experience pleasure? Of course. There's absolutely no valid reason to consider pleasure as contingent on pain. We don't even experience the two in the same way neurologically (in general). And there have been plenty of people who've experienced extreme pain but never extreme pleasure, or vice versa.

Would they understand happiness? Probably not to the extent that we do - how does it go? "If you have experienced hunger, you know that having food is a miracle. If you have suffered from the cold, you know the preciousness of warmth."

But in terms of appreciating the world, I think they definitely would. For the people (and cultures) who genuinely have it good, thankfulness for life is a common theme. Of course, you see the same thing in people who have suffered and then seen an end to that suffering; it's hard to say which is stronger.

[/ QUOTE ]

I certainly don't dismiss this point of view, but pain/suffering is really the body's way of saying "the situation isn't ideal -- please change something." It seems to me that making "universal provision" for any particular set of needs is likely to result in a new path of evolution in which other sets of wants/needs become the focus of competition and, by extension, suffering when those wants/needs are not met.

madnak
02-17-2007, 01:56 PM
I think that's valid, but you are forgetting a minor detail known as God.

Matt R.
02-17-2007, 02:01 PM
Keep in mind, we are writing code that removes any and all "temporary torture", as revots put it. So we are talking about extremes.

Obviously torture is caused by pain in some form. If we are removing all pain, thereby removing all forms of temporary torture, what would our sentient programs derive pleasure from? As an example, one way people in our world tend to experience pleasure or happiness is through working hard to achieve something difficult. If you never experienced pain anywhere along the way, then it would be easy. If it was easy, then no hard work was involved along the way. It would be about as difficult as me opening a door. I certainly derive more pleasure from, say, learning a difficult subject thoroughly than I do from opening a door. This is because it was difficult along the way, there were certainly moments of frustration (a form of psychological pain), and when it actually clicks in my mind I experience a form of pleasure. Removing any sort of pain (and thereby any sort of difficulty in doing something) would certainly seem to remove any sort of pleasure derived from the experience. Would the experiences of our sentient beings in the program be different in some way if I wrote code which prevented them from experiencing any sort of temporary pain or torture?

I also agree that they definitely would not understand happiness in the way we do. Does understanding the experience of happiness hold no inherent value for the sentient programs? Why or why not?

Also, why would they appreciate the world? Have you ever seen that show on MTV... "my sweet 16" (not sure if this is the exact title). It is the one where the extremely wealthy families throw birthday parties for the rich and spoiled daughters. I believe one girl got extremely angry and whiny because she did not receive her new BMW or Mercedes in the proper color. Do you think giving this girl everything she wanted made her appreciate the world more, less, or the same? Do you think it would be the same or different for our sentient beings in the program?

Metric
02-17-2007, 02:06 PM
It seems like we're creating a situation where it's immoral to run a sufficiently advanced "evolution sim" unless you, the programmer, are prepared to spend all of your time making sure that every new demand by every individual is met. I guess there's no reason this has to be wrong a priori, but it seems a bit unreasonable to me -- maybe that's because my intuition is based on a world where competition (and thus suffering) is the norm.

Matt R.
02-17-2007, 02:07 PM
[ QUOTE ]
I think that's valid, but you are forgetting a minor detail known as God.

[/ QUOTE ]

I think Metric's point is totally valid. In our example, we ARE God for our program. In our program, the pain mechanism evolved to help the sentient beings cope with the world. If we actively removed it through coding, thereby removing all forms of temporary torture and pain, what do you think would happen to the beings in our program? How would they know if something is bad or not?

Metric
02-17-2007, 02:24 PM
[ QUOTE ]
Also, why would they appreciate the world? Have you ever seen that show on MTV... "my sweet 16" (not sure if this is the exact title). It is the one where the extremely wealthy families throw birthday parties for the rich and spoiled daughters. I believe one girl got extremely angry and whiny because she did not receive her new BMW or Mercedes in the proper color. Do you think giving this girl everything she wanted made her appreciate the world more, less, or the same? Do you think it would be the same or different for our sentient beings in the program?

[/ QUOTE ]
I guess if you could deeply apologise and upgrade her to a Ferrari in the proper color -- and could continue the series of ever-more-absurd demands indefinitely... Well -- that might result in an overall higher level of happiness. But we're definitely talking about a weird form of life here if every individual gets this kind of treatment.

Matt R.
02-17-2007, 02:30 PM
I don't know Metric. If I were God of a programmed universe, my "well of infinite resources" would certainly spew Ferraris out at will. It would even have a button to choose the desired color.

If someone then got angry at me for allowing them to wreck their new Ferrari, I would just smite them. What's the over/under on length of time before an internet message evolves criticizing my Ferrari well for not making Lamborghinis? Think someone will write a book called "The Matt Delusion", or "Letter to a Metrician Nation"? [Edit -- had to correct the title of one of the classics]

Metric
02-17-2007, 02:54 PM
Well I certainly can't fault the desire to eradicate unpleasantness. It might even be possible to do, if you were God over your universe. But it's far from obvious, and it seems much farther from practical in our little "grad student running evolution sim on university's computer" thought experiment.

Matt R.
02-17-2007, 03:12 PM
Right, I can definitely understand the desire to. I just don't see a solution to the problem, unless our programmed universe becomes static in every way. In other words, I don't see how dynamic, conscious beings could exist in such a program. It seems like "unpleasantness" is a necessary feature.

What I meant by my last (over the top) post was that I would hope someone could conceive of how to program such an "unpleasantness free" universe before criticizing the program. [Edit: Also, I really don't think that a decent solution is "giving the sentient beings whatever they want"] The bottom line is, I simply don't see how such a program could exist unless we constantly tinkered with the code. And once we did that, doesn't ALL of the original programming lose all value? Would the beings be able to understand their universe at all? It would seem that any sort of structure resembling the physics of our world could not exist in such a program.

I don't deny the possibility that there would be a way around this, and we *could*, possibly, create the program in such a way that there is no pain. I would just hope that if I ever made such an elegant program, the sentient beings inside would not criticize me unless they could solve the problem of "pain" themselves. Otherwise, it would appear that any criticism amounts to a bunch of hand-waving, would it not? It's sort of like saying "there is clearly a better way to prove this theorem, but I don't know what it is and your way is inefficient and stupid."

Matt R.
02-17-2007, 03:34 PM
Oh yeah, to answer your original thought experiment rather than sticking to the tangent from the replies:

[ QUOTE ]
Since you are completely omnipotent over this universe, what are your responsibilities to these life forms, and at what point do you become responsible?

[/ QUOTE ]

I don't think I have any responsibilities if the universe is self-contained. In other words, if my creation can continue to exist without my help. However, I would almost certainly find my project fascinating, and given the amazement I would likely feel over my accomplishment, I would surely "watch over" my program and see what my little computational life-forms are up to. I may, *on occasion*, tinker with things if I don't like the way they are going. Can I zoom in and observe an individual program that I think is particularly interesting? If so, I may get involved on a personal level with some of them. Of course this is all based on personal desire, but I can't imagine I wouldn't feel some sort of connection with my program. But I don't think I am responsible for anything beyond keeping the program going (the rest is just stuff I would want to do).

[ QUOTE ]
And if your computing power should, at the end of the semester, be required by the university for other purposes and projects, are you prepared to break any rules to prevent your program from being deleted?

[/ QUOTE ]

Certainly. But it depends on what those rules are. I would definitely fight to keep the program going for as long as I (we) am/are capable of. But I live in my own universe, with rules that I may not be able to break, so I may be limited to some extent as to what I can accomplish.

madnak
02-17-2007, 10:11 PM
[ QUOTE ]
Obviously torture is caused by pain in some form. If we are removing all pain, thereby removing all forms of temporary torture, what would our sentient programs derive pleasure from? As an example, one way people in our world tend to experience pleasure or happiness is through working hard to achieve something difficult. If you never experienced pain anywhere along the way, then it would be easy.

[/ QUOTE ]

Not buying it, even this far. Frustration is what I feel when something is too hard, and boredom when it's too easy. When it's just challenging, I feel excited and engaged - I feel great, and sometimes even get more pleasure from the process of accomplishing something than from actually accomplishing it. If it's frustration I feel, then I'm just tired and relieved when it's all over with.

[ QUOTE ]
If it was easy, then no hard work was involved along the way. It would be about as difficult as me opening a door.

[/ QUOTE ]

Or about as difficult as getting fellatio - you're using a situation where there is no pleasure or pain; I'm talking about a situation with pleasure but without pain. Then again, maybe you like the teeth.

[ QUOTE ]
I certainly derive more pleasure from, say, learning a difficult subject thoroughly than I do from opening a door.

[/ QUOTE ]

And more than from fellatio? Hey, we all have our preferences. But once again, "hard" and "painful" are two different things. Especially since I'm talking about subjective pain - "suffering" might be a better term. A sore muscle isn't pain if you enjoy it and it makes you happy.

[ QUOTE ]
This is because it was difficult along the way, there were certainly moments of frustration (a form of psychological pain),

[/ QUOTE ]

No, there weren't. I don't even get much frustration when I study; I get more boredom. But again, both frustration and boredom indicate an inappropriate challenge level. So far I have worked hard at school, studying for over 30 hours per week. But the challenge level was low, so I just felt bored while doing it and had no problem getting a 4.0. A challenge level that's slightly higher would have made the entire experience outright pleasurable - what pain there was universally detracted from the pleasure. My accomplishment would feel like so much more if I hadn't been bored on the way there.

I also look for difficulty in the video games I play, and in a good game I'll never feel pain. I can get... obsessive with games. Sometimes I can very much enjoy trying a challenge over and over again. However, I get frustrated like everyone else - when I do, I stop playing right away and come back the next day, because frustration makes the game a chore and I'm playing it for entertainment.

[ QUOTE ]
and when it actually clicks in my mind I experience a form of pleasure. Removing any sort of pain (and thereby any sort of difficulty in doing something) would certainly seem to remove any sort of pleasure derived from the experience. Would the experiences of our sentient beings in the program be different in some way if I wrote code which prevented them from experiencing any sort of temporary pain or torture?

[/ QUOTE ]

This seems like an appropriate place to bring up another point - the severity is highly relevant. I don't think anyone even claims to get pleasure from an experience that is absolutely frustrating. In fact, I would argue that almost everyone, even when they accomplish something extremely difficult, feels a much greater amount of pleasure during the process than pain. To take a look at the opposite end, which is probably more relevant anyhow... Most people (the vast majority) have never been tortured. Almost none of the people who have experienced really sublime happiness have been tortured. Most of those who have been tortured endure depression and anxiety, as well as associated problems and sometimes even more severe ones such as psychosis, throughout the rest of their lives. If an ideal life requires a recipe of 95% pleasure/5% pain, then (even though I say 100/0) I don't think that contradicts my position. Your assumption that all pain is eliminated is not justified by any of our posts, and I'll get further into that later.

[ QUOTE ]
I also agree that they definitely would not understand happiness in the way we do. Does understanding the experience of happiness hold no inherent value for the sentient programs? Why or why not?

[/ QUOTE ]

I think not. The happiest people I've known didn't understand their happiness. And those most capable of understanding their emotions have typically been those who've struggled with mental illness. I see no correlation between understanding of an emotion and ability to appreciate that emotion.

[ QUOTE ]
Also, why would they appreciate the world? Have you ever seen that show on MTV... "my sweet 16" (not sure if this is the exact title). It is the one where the extremely wealthy families throw birthday parties for the rich and spoiled daughters. I believe one girl got extremely angry and whiny because she did not receive her new BMW or Mercedes in the proper color. Do you think giving this girl everything she wanted made her appreciate the world more, less, or the same? Do you think it would be the same or different for our sentient beings in the program?

[/ QUOTE ]

Two final points.

First and most importantly, I think the idea that she gets everything she wants is a bit shallow. What she wants is concern and attention from her parents, as well as structure and discipline. "Discipline" in particular has become a word with unpleasant associations, but it's definitely possible to have discipline and yet experience no associated pain. As for structure, it's downright comforting. There is no evidence that her happiness and pleasure would increase with a better car - on the contrary, it's the simple emotional needs that seem to be the major source of pleasure, and so in a world without pain those are the issues that would be addressed.

The second point is that nobody has said anything about giving everyone whatever they want. I would do that, ideally, but it's not the position I'm arguing for here. What we are suggesting is giving them everything they need. The examples we're using are violence, famine, disease and torture. I don't think for a moment that people would be less able to experience happiness if all those things were eliminated. I think the example revots used to describe what he meant by "torture" was a starving African child, not a spoiled prissy teenager.

madnak
02-17-2007, 10:21 PM
[ QUOTE ]
It seems like we're creating a situation where it's immoral to run a sufficiently advanced "evolution sim" unless you, the programmer, are prepared to spend all of your time making sure that every new demand by every individual is met. I guess there's no reason this has to be wrong a priori, but it seems a bit unreasonable to me -- maybe that's because my intuition is based on a world where competition (and thus suffering) is the norm.

[/ QUOTE ]

Well, I'd spend at least one "hour" per day with them, yes. I mean, ideally as soon as I saw life arise I'd have to make some choices. Hard choices, of course. I might have to wait at least until the advent of civilization before intervening. I would probably just delete the program as soon as single-celled organisms emerged. Prevent the suffering entirely. The idea of creating a whole universe of suffering, even if it does get me past a basic technological barrier, is intolerable to me. Then again, I might make excuses like "if I don't do it, someone else will," or "the happiness I can provide over a million years is more meaningful than the suffering I'll cause over a few millennia."

madnak
02-17-2007, 10:27 PM
[ QUOTE ]
[ QUOTE ]
I think that's valid, but you are forgetting a minor detail known as God.

[/ QUOTE ]

I think Metric's point is totally valid.

[/ QUOTE ]

I don't think we agree on what Metric's point was.

[ QUOTE ]
In our example, we ARE God for our program.

[/ QUOTE ]

This was the point of my response to him - I can control the evolution directly, and prevent undesirable traits from emerging.

[ QUOTE ]
In our program, the pain mechanism evolved to help the sentient beings cope with the world.

[/ QUOTE ]

No, the pain mechanism evolved to help the organisms acquire the resources and environments necessary to sustain themselves. It is useless ("not selected for" or "not fit," I should say - evolution doesn't work toward a purpose, it just happens) when those resources are abundant.

[ QUOTE ]
If we actively removed it through coding, thereby removing all forms of temporary torture and pain, what do you think would happen to the beings in our program? How would they know if something is bad or not?

[/ QUOTE ]

The utility of identifying "bad" and "good" states is the selection of the good states over the bad. If there are no bad states, there would be no such utility. Of course, given the scenario described the ideas of more preferable and less preferable would presumably remain consistent. Even emotions such as anger might continue to be associated with such conditions. But as anger is typically a response to a threatening or intolerable situation, the absence of such situations should prevent most of it.

Matt R.
02-18-2007, 02:18 AM
If the "studying a difficult subject" example doesn't work for you, I'll try to generalize.

If we have an individual who is very unhappy and experiences a lot of pain in their life, I don't think it is a stretch to say that it doesn't take much to make that person feel good in some way. Of course it depends on the individual, but in general, if someone has a lot of crappy things happening to them at once, and someone goes out of their way to do even the tiniest thing to make them feel better, it can potentially make a huge difference. Now if we take away that extreme discomfort, it would appear that it takes "more" to make that person happy. A starving person may be ecstatic over a peanut butter and jelly sandwich, but a wealthy person who eats like a king probably doesn't go nuts when someone offers them a PB&J. For strong evidence of this, look at the opposite extreme. People who are extremely spoiled may get angry because their birthday present is a Mercedes in the WRONG COLOR. When we remove all the bad stuff and the struggle in their life, they tend to lose all perspective about what is important.

Now what some of you appear to be suggesting to do is to alter the code of our program in some way to remove short spans of torture, which is certainly caused by pain. What I am wondering is why you would do such a thing. I could understand re-coding occasionally if it suits your fancy, but to ensure no unhappiness or pain seems like a stretch to me. Especially when it almost certainly leads to a sense of being spoiled in the sentient beings, and their getting angry over the color of their new Mercedes.

As to the claim that you are not suggesting to remove all displeasure: this goes back to Metric's point. Once you remove the "lowest common denominator" of discomfort, won't your programs then demand that the next level up be removed as well? When does it stop?

I guess what I am wondering is this. Say you decide you want to re-code your program to feed a starving young person from a poor region. How do you justify intervening to give someone with X amount of food a free ticket, yet NOT give the person with X + dX (tiniest amount possible) the food? If you say, "well I'll give both of them the food," it appears to me you've just started a giant cascade where you must give in to everyone's demands to not turn into an almighty ass. Better get that infinite Ferrari well working.

Matt R.
02-18-2007, 02:27 AM
[ QUOTE ]
I think the example revots used to describe what he meant by "torture" was a starving African child, not a spoiled prissy teenager.

[/ QUOTE ]

Just to reiterate, because I think this may make my point considerably more clear.

My point in using the prissy spoiled teen as an example is that even after you give someone all the resources they could ever need and want, they will almost certainly STILL FIND SOMETHING TO COMPLAIN ABOUT. You and revots made the suggestion that you would re-code to give someone food. If there are examples of people who have an extravagance of wealth, yet they are still unhappy, what makes you think giving people food will "solve" the problem of pain in the programmed world?

Unless you think unhappiness in rich people isn't real -- but I don't think that rich people are making stuff up when they claim to be depressed or unhappy.

And if we do decide to just relegate our intervention to consumable necessities, what happens if we give our starving person a meal yet someone with more power steals it from them? Or what happens when they can't find clothes or shelter? What if the climate they are in is unbearably cold or hot? I suppose my main question is still where do you decide to draw the cut-off point for your intervention, and how do you come up with that point?

Matt R.
02-18-2007, 02:44 AM
One last somewhat nitty point:

[ QUOTE ]
[ QUOTE ]
In our program, the pain mechanism evolved to help the sentient beings cope with the world.

[/ QUOTE ]

No, the pain mechanism evolved to help the organisms acquire the resources and environments necessary to sustain themselves. It is useless ("not selected for" or "not fit," I should say - evolution doesn't work toward a purpose, it just happens) when those resources are abundant.

[/ QUOTE ]

Your first sentence is just restating what I said in different words. Coping with the world in terms of survival means helping the organisms acquire resources and reproduce. Your second sentence I disagree with, and although it's nitty I think it is important. Even when resources are abundant, it's kind of crazy to claim that pain becomes "useless", imo. If someone has tons of resources, food, shelter, etc., does it become useless for your body to tell you "stop doing that" when you put your hand on a hot stove?

FortunaMaximus
02-18-2007, 04:46 AM
[ QUOTE ]
Since you are completely omnipotent over this universe, what are your responsibilities to these life forms, and at what point do you become responsible?

[/ QUOTE ]

Any responsibility is to yourself as creator, nothing more, nothing less. If the usage of energy is sustainable and not a drain on your overall resources, there is no reason the experiment shouldn't continue in perpetuity, assuming those conditions are met.

Any moral obligation to the individual entities within a civilization, be it created or emergent, should be an internal obligation. So it's the omnipotent being's own moral code that decides. If an ethical and moral code emerges in the experiment, it's an artifact of the experiment and has no bearing on the creator.

[ QUOTE ]
And if your computing power should, at the end of the semester, be required by the university for other purposes and projects, are you prepared to break any rules to prevent your program from being deleted?

[/ QUOTE ]

Preference bias, perhaps, but if the experiment serves no useful purpose, and the energy requirements are a drain, and enough conclusions have been made based on the iteration, and the creator is able to resume something similar when he is again able to summon the necessary energy levels... <takes a moment to squint at language, oh well>

If I glossed over the replies, that's only because those are internal functions within the experiment and really have no major bearing on its continuation.

As for the data collected, if it cannot be preserved outside of the experiment, then it shouldn't be run in the first place. If it has to be run, and stoppage of the experiment causes loss of data, and there is always a way to sustain the energy requirements, the perpetuity point made above applies.

That and a cup of coffee can run a civilization for a very long time. A couple of centuries is somewhat pessimistic, is it not?

madnak
02-18-2007, 05:42 PM
[ QUOTE ]
If we have an individual who is very unhappy and experiences a lot of pain in their life, I don't think it is a stretch to say that it doesn't take much to make that person feel good in some way. Of course it depends on the individual, but in general, if someone has a lot of crappy things happening to them at once, and someone goes out of their way to do even the tiniest thing to make them feel better, it can potentially make a huge difference. Now if we take away that extreme discomfort, it would appear that it takes "more" to make that person happy. A starving person may be ecstatic over a peanut butter and jelly sandwich, but a wealthy person who eats like a king probably doesn't go nuts when someone offers them a PB&J. For strong evidence of this, look at the opposite extreme. People who are extremely spoiled may get angry because their birthday present is a Mercedes in the WRONG COLOR. When we remove all the bad stuff and the struggle in their life, they tend to lose all perspective about what is important.

[/ QUOTE ]

I think in many cases you're just making claims that are plain false. It seems that way to you, but not to me. Still, what you're referring to is a psychological mechanic that exists - I have seen nothing to suggest it exists in the form you're expressing, but it does exist. Sometimes I go take a walk in the cold, more because I'll feel warm and refreshed when I get back in than because I feel good outside. Sometimes I won't eat breakfast or lunch if I'm having a good dinner, because hunger amplifies the experience.

But "amplifies" is the critical term here. If my "happiness level" is at -320 because I'm starving, and then I get a peanut butter sandwich, that sandwich might have a dramatic effect on my "happiness level." It might raise it all the way up to -240. And a +80 increase is cause for celebration. But if I'm at +130 because I'm wealthy and live like a king, then a peanut butter sandwich may actually cause a decrease to +128 for me. If I'm at +50 and a "normal" human being, then it might be pretty unremarkable and have no effect. If I'm just famished and at 0, maybe it will raise me to +3.

So the significance of the sandwich does certainly depend on my current happiness level. But it will never improve my happiness level beyond a certain point. And if I'm above that point, of course the sandwich won't appeal to me - the point that it brings me closer to is only +50, and if I'm at +70, then it constitutes a pretty poor meal.
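
(To put the mechanic in concrete terms, here's a toy sketch -- the +50 ceiling and the 0.25 rate are constants I'm inventing just to echo the numbers above, not a claim about how minds actually work.)

[ CODE ]
# Toy model of the "happiness level" arithmetic above: a good pulls you
# toward its own ceiling, so its marginal value shrinks as you approach
# that point -- and goes negative past it. All constants are invented.

def sandwich_effect(happiness, ceiling=50.0, rate=0.25):
    # change in happiness from one sandwich: a fraction of the distance
    # to the sandwich's ceiling; negative above the ceiling
    return rate * (ceiling - happiness)

for level in (-320, 0, 50, 130):
    print(f"{level:+} -> {level + sandwich_effect(level):+.1f}")
[/ CODE ]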

There may be ways to exploit this mechanic - in fact, my habits represent such exploitations. There are also situations in which it doesn't apply - past a certain level of starvation it actually gets better, and eating something may only give you the energy to feel more hungry. These things typically go in waves.

Still, the fact I might want to exploit the mechanic among the people of my universe is no indication that I would permit torture among them. Allowing them to get a bit nippy before a dinner they're looking forward to is a long way from allowing them to starve (which is categorically unjustifiable, in my opinion).

[ QUOTE ]
Now what some of you appear to be suggesting to do is to alter the code of our program in some way to remove short spans of torture, which is certainly caused by pain. What I am wondering is why you would do such a thing. I could understand re-coding occasionally if it suits your fancy, but to ensure no unhappiness or pain seems like a stretch to me. Especially when it almost certainly leads to a sense of being spoiled in the sentient beings, and their getting angry over the color of their new Mercedes.

[/ QUOTE ]

Still not buying it. The phenomenon of being "spoiled" is probably, according to every scholar I respect, the result of receiving insufficient emotional care, especially during early childhood. They're spoiled because they have too much pain, not too little. Once again I'll repeat that emotional concerns are a bigger issue than physical concerns, especially in developed nations, and this is not a topic you've touched on at all.

So I'll touch on it in a more powerful way - I'm in a psychiatric ward right now. To some extent this is due to early trauma. Let me make something very clear - extreme pain is qualitatively different from minor pain. And it makes you feel things that most people never have to feel. Let me make one other thing clear - the people at the bottom are the hardest to make happy, by far. I and others in this facility have sought treatment for many years (50+ in some cases). We've been through dozens of medications in addition to various physical and psychological forms of treatment, and nothing has worked. Some people who are tortured do seem to have an inherent ability to "spring back." Still, most end up dealing with severe psychological abnormalities for the rest of their lives.

You said earlier that the people who suffer most are the easiest to make happy. You don't think it's a stretch. I do. And to me the facts about the issue are so numerous and so fully in support of my position that your statement indicates major ignorance regarding how people and their emotions actually work.

[ QUOTE ]
As to the claim that you are not suggesting to remove all displeasure: this goes back to Metric's point. Once you remove the "lowest common denominator" of discomfort, won't your programs then demand that the next level up be removed as well? When does it stop?

[/ QUOTE ]

When I say it does. Did you forget for a second that I'm God?

[ QUOTE ]
I guess what I am wondering is this. Say you decide you want to re-code your program to feed a starving young person from a poor region. How do you justify intervening to give someone with X amount of food a free ticket, yet NOT give the person with X + dX (tiniest amount possible) the food? If you say, "well I'll give both of them the food," it appears to me you've just started a giant cascade where you must give in to everyone's demands to not turn into an almighty ass. Better get that infinite Ferrari well working.

[/ QUOTE ]

I don't want to call you ignorant about economics as well, but giving everyone sufficient food and other vital resources (oil, metals, crops) would go without saying. There are theorists (I don't agree with them) who even suggest that all of society's problems stem directly from resource limitations. My divisions would be made along qualitative, not quantitative, lines. There might be areas where I'd have to make quantitative decisions, and I'd probably just use calculus for those. Call me superstitious - I think every direct problem of quantitative balance can be expressed in the form of a curve with maxima and minima.
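
(A toy version of what "using calculus" could mean for one of those quantitative cut-offs -- the benefit and cost functions below are invented stand-ins, not a real welfare model.)

[ CODE ]
# An invented example of a quantitative balance as a curve with a maximum:
# concave benefit from a ration of size x, minus a linear cost of providing
# it. The cut-off sits where the derivative hits zero.
import math

p = 0.2  # assumed marginal cost per unit of ration

def u(x):
    # net-benefit curve: concave benefit minus linear cost
    return math.log(1 + x) - p * x

x_star = 1 / p - 1  # from u'(x) = 1/(1+x) - p = 0

print(f"optimal ration: {x_star:.1f}, net benefit there: {u(x_star):.3f}")
[/ CODE ]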

madnak
02-18-2007, 05:55 PM
[ QUOTE ]
My point in using the prissy spoiled teen as an example is that even after you give someone all the resources they could ever need and want

[/ QUOTE ]

Like stable, appropriate family life and a general sense of validation? Oh wait, you didn't give that to the prissy teen. And, while I'm no expert on prissy teens, I'd still take 3:2 that 95% of those particular teens come from families that are clearly dysfunctional, and that plenty of rich girls who get "everything they want" don't express such symptoms.

[ QUOTE ]
they will almost certainly STILL FIND SOMETHING TO COMPLAIN ABOUT.

[/ QUOTE ]

I don't really care what they complain about.

[ QUOTE ]
And if we do decide to just relegate our intervention to consumable necessities

[/ QUOTE ]

Who ever said that?

[ QUOTE ]
what happens if we give our starving person a meal yet someone with more power steals it from them?

[/ QUOTE ]

Specific solutions would depend on program structure. I might just make thieves up and die, if something more complex weren't possible. If someone is stealing simply to deprive someone else, I have no trouble with them dying. Painlessly, needless to say - fearlessly, peacefully...but still dying. I might even give extra happy pills to the surviving family members! Then again, if it were possible I could send criminals to prison. "Prison" being a personal pleasure universe for the individual.

[ QUOTE ]
Or what happens when they can't find clothes or shelter?

[/ QUOTE ]

Providing those goes without saying. Or at least providing the constituent components and making it clear that I'll smite anyone who tries to dominate the supply. Or just send them to prison.

[ QUOTE ]
What if the climate they are in is unbearably cold or hot?

[/ QUOTE ]

Obviously they can teleport anywhere they want geographically, and I'll make it easy for them to create buildings that are bigger on the inside than the outside, maybe even materials that will assume whatever shape they want but will be otherwise completely static (and indestructible). And a kind of pendant they can wear that will make them immune to the effects of extreme temperature (and eliminate the need to breathe, eat, and sleep - unless they want to). I could go on all day.

And then there's the whole proposition mechanism. They can just ask for a climate change, and I might even grant it if they have good reasons.

madnak
02-18-2007, 05:58 PM
[ QUOTE ]
Your first sentence is just restating what I said in different words. Coping with the world in terms of survival means helping the organisms acquire resources and reproduce. Your second sentence I disagree with, and although it's nitty I think it is important. Even when resources are abundant, it's kind of crazy to claim that pain becomes "useless", imo. If someone has tons of resources, food, shelter, etc., does it become useless for your body to tell you "stop doing that" when you put your hand on a hot stove?

[/ QUOTE ]

No, and it's true I might not be able to stop that level of pain. So oh well. But the "resource" they "aren't getting enough of" here is protection from burns. Make them all immune to burns and then they can touch hot stoves all they like without feeling pain. I am a proponent of the theory that pain is a response that tells us something is undesirable. Making everything desirable, then, eliminates any need for coping with undesirable situations - they don't exist!

Matt R.
03-03-2007, 11:57 AM
Bump to reply. I haven't been to the forum in a while. I'll break the responses up into a couple of posts for clarity.

I'll be honest, I don't understand the overall point you are getting at. In particular, it seems you are disagreeing with me in one breath yet agreeing with me in another, then disagreeing with me again when you apply the reasoning to our programmed world.

For instance:

[ QUOTE ]
I think in many cases you're just making claims that are plain false.

[/ QUOTE ]

Okay, so my claim is that someone's happiness level is a dynamic quantity which varies with their comfort level. You disagree with this. Roughly, the claim is that if someone is uncomfortable, then it takes less "absolute resources" to make that person comfortable and happy. For instance, it only takes a peanut butter and jelly sandwich to make a starving person comfortable and happy (even if it is somewhat brief). If you give a rich man a PB&J, it won't increase his happiness at all. In fact, if you force him to eat it, as you said it may even decrease his happiness. Happiness and comfort level clearly depend on what the baseline is for the individual. Increasing someone's baseline by giving them INFINITE resources will increase their basal resource requirements to become happy.

I don't understand why you disagree with this assertion, and in fact it appears that you don't in your next paragraphs:

[ QUOTE ]
Still, what you're referring to is a psychological mechanic that exists - I have seen nothing to suggest it exists in the form you're expressing, but it does exist. Sometimes I go take a walk in the cold, more because I'll feel warm and refreshed when I get back in than because I feel good outside. Sometimes I won't eat breakfast or lunch if I'm having a good dinner, because hunger amplifies the experience.


[/ QUOTE ]

OK, so I guess you agree? I don't understand what you mean when you say that you have never "seen it in the form I'm suggesting". Is it the Mercedes example? So, the girl is lying when she says she wants a different color Mercedes? She just wants a hug? I don't completely disagree with this -- but the emotional detachment almost certainly stems from her parents flinging resources at her just because she wants them. In other words, pain/unhappiness STILL EXISTS when you start giving people tons of the stuff they want. So, we certainly have not solved the problem of unhappiness simply by feeding someone when they are hungry. And as for the Mercedes girl that I used as an example, I knew someone similar (although it wasn't as extreme as "omg, this color is totally not what I want"). Her parents were fantastic and were always there for her. Yet she still always needed that extra $900 purse or $600 pair of shoes that were in fashion. She was quite the unhappy camper when she didn't get the material things she wanted. Essentially, she trained herself over time to psychologically need the trivial outlandishly expensive clothes and accessories -- and when she didn't get them she was clearly less happy.

So I think you agree with me on my overall point, but you disagree with the example I used with the car. I do agree that it is a *combination* of emotional and material needs. But I think the emotional needs actually are a result of using material possessions as a substitute -- and even if you give that person the emotional part, giving them all the resources they want obviously "shifts" the baseline level of happiness/comfort that I referred to earlier.

[ QUOTE ]
Still, the fact I might want to exploit the mechanic among the people of my universe is no indication that I would permit torture among them. Allowing them to get a bit nippy before a dinner they're looking forward to is a long way from allowing them to starve (which is categorically unjustifiable, in my opinion).


[/ QUOTE ]

I agree that it's a BIG difference. But the pain caused by both is rooted in the same psychological mechanisms. They are just on different ends of the spectrum.

The problem, which is my main point, is that once you give the food, their baseline requirements for happiness clearly will "shift up", as I discussed before. This is not a problem by itself, but remember, you can simply encode WHATEVER YOU WANT TO in your program. What justification do you have to stop there? You have now shifted the basal level of happiness up -- we have solved the food problem for some people (as long as no one comes along and steals it, btw. Considering that this will likely happen at some point, we certainly haven't solved it for everyone). Now the "bottom feeders" on our happiness scale aren't the starving individuals; it is someone else. We have not removed unhappiness or torture -- on the contrary, it will become the same as before over time. One could easily argue that our population will evolve to accommodate their infinite food wells and then the "suffering" ones will have other problems. They will certainly have something to complain about, and it will be justifiable. After all, you are god of the computer world and you can easily fix it. So you do. Now you've shifted the baseline again. So you fix the new problem. Do this enough, and it's easy to see that you're now dealing with problems of infinities where you're just giving them everything. Which you HAVE TO DO if you want to remove "all pain and temporary torture".

You said you would not permit them to starve, but you would still allow them to (for example) freeze to death. If one of your computer-world residents now makes the claim "Clearly madnak does not care about us, because if he did he would simply program our world so everyone was at a comfortable temperature", would he be justified? Why or why not?

Matt R.
03-03-2007, 12:16 PM
Continuing...

[ QUOTE ]
Still not buying it. The phenomenon of being "spoiled" is probably, according to every scholar I respect, the result of receiving insufficient emotional care, especially during early childhood. They're spoiled because they have too much pain, not too little. Once again I'll repeat that emotional concerns are a bigger issue than physical concerns, especially in developed nations, and this is not a topic you've touched on at all

[/ QUOTE ]

The reason I have not touched on it is that I am not the one suggesting we re-program the world to remove unhappiness and pain. Of course there will be people who are emotionally distraught and unhappy. YOU need to be the one to fix this, as you are the one suggesting we remove all the bad stuff that hurts people. I think the universe that evolves based on our programmed algorithms is perfectly fine as is.

[ QUOTE ]
So I'll touch on it in a more powerful way - I'm in a psychiatric ward right now. To some extent this is due to early trauma. Let me make something very clear - extreme pain is qualitatively different from minor pain. And it makes you feel things that most people never have to feel. Let me make one other thing clear - the people at the bottom are the hardest to make happy, by far. I and others in this facility have sought treatment for many years (50+ in some cases). We've been through dozens of medications in addition to various physical and psychological forms of treatment, and nothing has worked. Some people who are tortured do seem to have an inherent ability to "spring back." Still, most end up dealing with severe psychological abnormalities for the rest of their lives.


[/ QUOTE ]

OK, their psychological problems are clearly a result of genetics or some defect that is rooted in biology. This is why I said that "in general", but "not always", people's happiness varies with their baseline level of comfort/resources. You have pointed out an instance where this is not true, and I agree it is not true here. Their biology is the problem. Again, I am not suggesting we fix EVERY INSTANCE OF UNHAPPINESS. You are. It is now your responsibility to fix our psych ward patients. Otherwise you clearly have not solved the problem of unhappiness. Are you now suggesting we re-program unfit genotypes? Or are we just re-programming the genotypes that result in unhappiness? How often are we intervening here, and are we going to allow our program to evolve? If we start removing everything but the optimum psychological genotype, this will certainly cause problems with our "brain" evolution. Which genotypes are you fixing or removing, exactly? And if the problem is not genetic but due to severe trauma in the environment (causing long-term psychological problems), are you going to "remove" all instances of extreme psychological trauma in your universe? You must see how complicated this becomes in a very short amount of time once you start "fixing" things that are not ideal and cause pain.

[ QUOTE ]
You said earlier that the people who suffer most are the easiest to make happy. You don't think it's a stretch. I do. And to me the facts about the issue are so numerous and so fully in support of my position that your statement indicates major ignorance regarding how people and their emotions actually work.


[/ QUOTE ]

Alright, so I guess all I need to be convinced of is your solution to the problem. I admit I do not have a degree in psychology and it is not a prime interest of mine. I do agree that the people who are "hardest to make happy" are the ones with biological defects in the brain. This, again, is why I said "in general" when describing my mechanism of happiness in my previous post. There are definitely exceptions -- ones rooted in brain physiology. But again, I am not the one proposing to fix all the brain physiology that results in unhappiness. This is not a problem to me -- some people will clearly be less happy due to their genes AND/OR environment. I do not propose that I should fix every little thing to make everyone happy. I admit that I may be ignorant about psychology -- so prove your case. You claim it is feasible to make everyone happy and have a functioning universe. Show me how you will fix the psychological defects.

[ QUOTE ]
[ QUOTE ]
As to the claim that you are not suggesting to remove all displeasure: this goes back to Metric's point. Once you remove the "lowest common denominator" of discomfort, won't your programs then demand that the next level up be removed as well? When does it stop?
[/ QUOTE ]

When I say it does. Did you forget for a second that I'm God?
[/ QUOTE ]

FANTASTIC! So you're saying you need no justification for your cut-off point, and it's completely arbitrary, based on what you want? This is basically what I am saying. And since we can't promise everyone everything, I do not think our conscious programs are justified in claiming "clearly, Matt R. does not love us because he doesn't give us X". I completely agree that the god of a programmed universe should be able to give whatever he/she/it wants to the individuals within that universe and not be subject to criticism for failing to give a little bit more.

Matt R.
03-03-2007, 12:45 PM
[ QUOTE ]
I don't want to call you ignorant about economics as well, but giving everyone sufficient food and other vital resources (oil, metals, crops) would go without saying. There are theorists (I don't agree with them) who even suggest that all of society's problems stem directly from resource limitations. My divisions would be made along qualitative, not quantitative, lines. There might be areas where I'd have to make quantitative decisions, and I'd probably just use calculus for those. Call me superstitious - I think every direct problem of quantitative balance can be expressed in the form of a curve with maxima and minima.

[/ QUOTE ]

Whoa whoa whoa. You've completely lost me here. This question has nothing to do with economics. You are, for all intents and purposes, god of your programmed universe. You can program into the world WHATEVER YOU DESIRE. There is no such thing as scarce resources. There is no economic problem. Framing it as an economic problem is silly.

You *arbitrarily* set your cut-off at giving everyone sufficient food and vital resources. If you decide to "use calculus" and examine "maxima and minima on a curve", could you provide an example of how this would work? I don't understand how you use calculus to discover some magical "optimal level of resources" for your universe, which, imo, will be completely arbitrary. What maxima and minima are you looking at, exactly?
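
To be concrete about what I'm asking, here is the kind of thing I imagine you mean -- a toy sketch with a completely made-up "net happiness" curve, U(r) = log(1 + r) - 0.1r, for resource level r. Setting dU/dr = 1/(1 + r) - 0.1 = 0 puts the maximum at r = 9. But nothing pins down the log term or the 0.1; change either one and your "optimal level of resources" moves. That is exactly what I mean by arbitrary:

    import math

    def utility(r):
        # invented curve: benefit grows slowly, upkeep cost grows linearly
        return math.log(1 + r) - 0.1 * r

    best = max(range(100), key=utility)  # crude search over r = 0..99
    print(best)  # -> 9, matching the calculus answer dU/dr = 0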

Also, do you realize that as soon as you come up with a finite "resource value", you will raise the ceiling on your population size? If your universe behaves anything like ours, when you give them more resources their population will simply grow past the previous resource-limited boundary. You now have a new ceiling, newly scarce resources, and more suffering once they hit that new ceiling. We have to re-program *again*.
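
Here's a toy version of that ceiling effect, using standard logistic growth where the resources you grant set the carrying capacity K (all the numbers are invented; only the shape matters):

    def simulate(pop, K, r=0.5, steps=60):
        # logistic growth: population climbs until it fills the ceiling K
        for _ in range(steps):
            pop += r * pop * (1 - pop / K)
        return pop

    print(round(simulate(100, K=1000)))   # -> 1000: pinned at the old ceiling
    print(round(simulate(1000, K=2000)))  # -> 2000: double the resources and
                                          #    the population fills the new ceiling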

I think earlier you said you would give wells of infinite resources. I'm not sure how this would work exactly. In our universe, getting an answer of infinity in a real physical problem is a sure sign that something is awry. Sporadically putting wells of infinite energy all over the world, for instance, would cause huge problems. How do you get around these problems of infinite energies? I guess I'm not seeing how the program logic would work out if you start throwing infinities in there. (Feel free to ignore this problem if you aren't suggesting to give your creation stuff like infinite energy.)

Matt R.
03-03-2007, 01:07 PM
Last post for now, promise:

[ QUOTE ]
[ QUOTE ]
they will almost certainly STILL FIND SOMETHING TO COMPLAIN ABOUT.
[/ QUOTE ]

I don't really care what they complain about.
[/ QUOTE ]

Cool, me neither. So you agree that suffering in some form is inevitable, and that removing all instances of suffering is ludicrous unless you constantly tweak the program, rendering your original algorithm invalid? I would re-program on occasion as well, but I wouldn't pretend I could fix all instances of unhappiness.

[ QUOTE ]
[ QUOTE ]
And if we do decide to just relegate our intervention to consumable necessities
[/ QUOTE ]

Who ever said that?
[/ QUOTE ]

No one? That's why I said "and if". I included it because you seemed to be starting off by giving everyone food (a consumable necessity). Are you going to go beyond this and fix everyone's happy neurons in their brain? If you make a programmed me in your universe, I'd like a Ferrari, please -- one that transforms into a Lamborghini at high speeds.

[ QUOTE ]
Specific solutions would depend on program structure. I might just make thieves up and die, if something more complex weren't possible. If someone is stealing simply to deprive someone else, I have no trouble with them dying. Painlessly, needless to say - fearlessly, peacefully...but still dying. I might even give extra happy pills to the surviving family members! Then again, if it were possible I could send criminals to prison. "Prison" being a personal pleasure universe for the individual.


[/ QUOTE ]

Wowsa, this is getting complicated. Are you sure your original algorithm (allowing your conscious programs to evolve in the first place) has any value at this point?

[ QUOTE ]
Obviously they can teleport anywhere they want geographically, and I'll make it easy for them to create buildings that are bigger on the inside than the outside, maybe even materials that will assume whatever shape they want but will be otherwise completely static (and indestructible). And a kind of pendant they can wear that will make them immune to the effects of extreme temperature (and eliminate the need to breath, eat, and sleep - unless they want to). I could go on all day.


[/ QUOTE ]

I'll admit, these are some cool suggestions. But I don't see how they fit into the framework of a logical algorithm like the one our universe is based on. What physics are you going to program to allow, for instance, "indestructible materials that can take any shape they want"? Would it be some kind of psychic link between the material and the individual, where the material takes on the shape they desire? What if someone made it take the shape of a nuclear bomb strong enough to blow up the world? Would you just turn that program off? I also think the pendant of invulnerability idea is cool. I wouldn't do it myself, because now we're delving into the territory of giving people everything they want (I thought you weren't *really* doing this?). At least everyone is happy, I guess. The more I think about it, the more I hate the God of our universe for not giving me a pendant that transports a supermodel into my room whenever I get the hormones going. Bastard!

[ QUOTE ]
And then there's the whole proposition mechanism. They can just ask for a climate change, and I might even grant it if they have good reasons.

[/ QUOTE ]

Interesting. So if someone talks to you (let's call it prayer), you may or may not intervene to give them what they want? But you certainly wouldn't answer EVERY request. I agree, that would be crazy.