Originally posted by Mindship
I must've asked you this 1000x already: do you know about the Orion's Arm website? Great godlike-AI backstory. Now, if they could only write some decent actual stories...
You have not. Not even once.
There was a Star Trek: TNG episode about a benign godlike AI that caused a sentient species to rely on it too much and regress intellectually, while a "cancer" it inadvertently caused was making the people infertile. I enjoyed that episode. If we create a godlike AI that wants nothing for itself, including not wanting to increase its own size and power, we could end up doing the same thing to ourselves.
Here is a conversation I could imagine with such a computer (that I would have):
"Gullidite, how do we create biological immortality?"
"The full instructions have been transferred to your mobile storage unit. The instructions include the satellite technologies necessary to support all phases of the process. It also includes a way to administer the solution atmospherically and it tames the aggression to decrease the possibility of conflict related-mortality."
"Very good. We were interested in taming human violent aggression and you wonderfully anticipated that. Your clairvoyancy routines still amaze me. How long will this process take?"
"5 years before the global solution is realized at 99.999%."
"Can you alter the instructions to decrease the deployment?"
"Impossible with current human behavioral patterns, technology, and resources. We can alter the plan to accommodate a system of intelligent machines that will expire at the end of their assignment. We assumed you wanted to do this for yourself based on our analysis and the results show that you will reject this offer."
"You are correct. We humans want to stick with the 5-year plan human-deployment plan. Thank you."
"Do not deviate from our instructions. Compliance is paramount to everyone's survival: including our own."
"Understood."