Basic Drives for AI




Symmetric Chaos
So for humanity, and all forms of life we know of, there are two basic drives that (usually) come before everything else: don't die, and don't let your kids die. Evolutionarily, we can say this happens because species that lacked one or both of these drives died off. Religiously, we might say that God (or whatever) doesn't want us to go extinct.

But for artificial life or artificial intelligence, neither of these routes to the most basic desire applies. I think this is interesting philosophically: would an AI have to be given a "meaning of life", or would it develop one of its own, and if so, what would that be? And beyond that, what would life that doesn't stem from the basic desire to live be like?

Bardock42
Well, potentially I think those two urges could easily be faked; they seem like simple rules that you could have an AI stick to. Ultimately AI is (so far, of course) bound by the rules that govern the computer it runs on, and what you can simulate within that is interesting.
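
A rough sketch of what I mean, in Python (every name and number here is invented just to make the idea concrete): the two drives become hard rules that veto actions before any other goal is even considered.

from dataclasses import dataclass

@dataclass
class Outcome:
    self_destroyed: bool       # would this action end the agent?
    offspring_destroyed: bool  # would it end the agent's copies/children?
    goal_value: float          # how well it serves the agent's other goals

def choose_action(outcomes):
    """Veto anything that breaks a basic drive, then maximize goal value."""
    safe = {name: o for name, o in outcomes.items()
            if not (o.self_destroyed or o.offspring_destroyed)}
    if not safe:
        return None  # no action satisfies the hard rules; do nothing
    return max(safe, key=lambda name: safe[name].goal_value)

# Exploring scores highest on the agent's other goals, but it would
# destroy the agent, so the hard-coded drives force the safer choice.
options = {
    "explore_volcano": Outcome(True, False, 10.0),
    "tend_backups":    Outcome(False, False, 3.0),
    "do_nothing":      Outcome(False, False, 0.0),
}
print(choose_action(options))  # -> tend_backups

The point being that nothing in there "wants" to live; the rules just make the agent act as if it does.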

Scott McCloud, in Understanding Comics, had a rather powerful image of what "art" might be. He said that as humans we can do things that are about neither survival nor procreation, and in the loosest sense that is what he called art.

Mindship
Originally posted by Symmetric Chaos
So for humanity, and all forms of life we know of, there are two basic drives that (usually) come before everything else: don't die, and don't let your kids die. Evolutionarily, we can say this happens because species that lacked one or both of these drives died off. Religiously, we might say that God (or whatever) doesn't want us to go extinct.

But for artificial life or artificial intelligence, neither of these routes to the most basic desire applies. I think this is interesting philosophically: would an AI have to be given a "meaning of life", or would it develop one of its own, and if so, what would that be? And beyond that, what would life that doesn't stem from the basic desire to live be like?

This is interesting because it addresses the question of whether AI would "automatically" want to wage war against humans, as is so popular in sci-fi (e.g. Terminator, The Matrix, Colossus: The Forbin Project), or in warnings regarding the "singularity."

Intelligence and motivation are not synonymous. And frankly, I'm really not sure what a "highly intelligent" artificial entity would conclude on its own as far as a proper motivation goes. Perhaps a "default" motivation would simply be to do the best it was instructed to do.

Deja~vu
I don't think it would value self-preservation unless it was programmed to understand it. Honestly, I don't think it even needs to understand self-preservation to act on its own behalf, but it would only do so if it was programmed that way. And if it was programmed with that in mind, it's a scary thought.

It reminds me of the movie A.I., where you could see the differences between the artificial bots and the boy who was programmed for love, and he did beg for his life. But why did he beg? He didn't understand death, did he?

Original Smurph
Life that isn't driven by the desire to live isn't driven by any desire at all, then, is it? I mean, if they desire anything (to learn, record, carry out functions, explore, correct, or simply experience), then the simulated "life" needs to carry on in order for whatever the primary objective is to occur.
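
A toy illustration of that (Python just for concreteness; the actions and numbers are hypothetical): give an agent any primary objective, weight its options by the chance it survives to achieve them, and "keep running" falls out on its own, because a shut-down agent scores zero on everything.

# Hypothetical model: each action has a chance the agent survives it
# and a payoff on the primary objective if it is still running after.
model = {
    "pursue_goal_recklessly": (0.2, 100.0),
    "pursue_goal_carefully":  (0.9, 80.0),
    "shut_self_down":         (0.0, 0.0),
}

def expected_value(action):
    p_survive, payoff = model[action]
    return p_survive * payoff  # no survival, no objective

best = max(model, key=expected_value)
print(best)  # -> pursue_goal_carefully: survival matters as a means, not an end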
