Nano-technology, what are the full implications? Why AI would need our brains.

Started by Stealth Moose
Originally posted by Oneness
I believe humans suffer from an anthropocentric viewpoint,

Humans can only see reality from a human viewpoint. This is why even empiricism is a relative method of understanding the world. The idea that we can perceive outside of human perception is laughable.

in that we think the universe is a certain way and so we erroneously base our opinions off of that viewpoint.

Correction: we can only view the world through the lens of human perception. Until you find aliens that can relate to us another dimension or level of understanding which we cannot acquire due to physiological limitations, this point is moot.

That is the essence of spirituality.

Spirituality is a result of human limited understanding? Possibly.

Originally posted by Oneness
Merging with technology completely is the only way for a human to be liberated from a stagnant, declining, and scarcity-based civilization because as Super Beings we are no longer subject to Maslow's hierarchy of needs.

1. "Only" is an absolute stance which begs for proof.

2. Unless technology removes emotional needs, gives us self-actualization, and renders socializing moot, this is incorrect. The base levels of Maslow's hierarchy are physical, but the higher up you go the more of it is mental and social. Being Adam Jensen, if anything, would make these problems worse.

All of our current motivators weigh us down, and self-actualization is very difficult considering how needy we are. As substrate-independent beings, with substrates such as a Strong AI available to us, we will have perpetual and exponential motivation to fuel this self-actualization.

Or become slaves to alien thought patterns, no longer recognizable as individuals, no longer marked by our struggles, our development, and our ability to change. Really, this is all wishful thinking.

Imagine your highest state of optimistic arousal, multiply it by a billion, imagine that as an exponent, and then raise it to that power ad infinitum.
Now imagine this frame of mind being constantly available to you.

Aside from having an out-of-body experience due to too much NyQuil, or drinking German beer through a straw all night, I can't imagine anything remotely like this completely never-before-experienced experience, which is outside of our current means of experiencing.

So I'll just imagine a cat instead.

Behold, abstract cat.

Originally posted by Stealth Moose
Correction: we can only view the world through the lens of human perception. Until you find aliens that can relate to us another dimension or level of understanding which we cannot acquire due to physiological limitations, this point is moot.

Not if what I'm trying to say is that you should be more like you're being right now: skeptical.

2. Unless technology removes emotional needs,

I'll try and make sure you comprehend when I quote your cheesy example of a cat.

gives us self-actualization,

Removing all the needs below it makes self-actualization not only easy, but the only thing you're ever doing.

and renders socializing moot,

Exactly.

this is incorrect. The base levels of Maslow's hierarchy are physical, but the higher up you go the more of it is mental and social.

No, just social. Self-actualization requires always challenging yourself. That is the only mental need in his list beyond physiological, psychological, psycho-social, emotional, etc.

Being Adam Jensen,

If you're going to bring up a random name, I'd like to be in the know about who you're talking about, please. Thank you.

if anything, would make these problems worse.

Well I don't know who the **** Adam Jensen is, so thank you for that.

Or become slaves to alien thought patterns, no longer recognizable as individuals, no longer marked by our struggles, our development, and our ability to change. Really, this is all wishful thinking.

Can you explain why changing substrates would do this? This is what Shakya said too; it is a theory with no logical or rational thought behind it.

abstract cat.

What are you talking about? It is difficult to buy that you don't understand the concept just because you can't actually feel that way. If I write this and this, and you are unsure of what I meant, ask me. This metaphor was purposefully obnoxious.

I don't get what you don't understand about being in a state of mind that does not disorient or distract from reality, but that is in a state of optimistic arousal. What about that does not make sense to you?

Keep in mind we're talking about a computer that looks and seemingly acts like Doctor Manhattan from Watchmen, and can do some of the things he can do as well.

Think about Data, except imagine that the writers did not know exactly what they were doing, and that Data did in fact display free will and emotion - Data could have just always been in a better mindset than you or me; he could have always been happy if his emotion chip said so.

Being happy when Picard has been kidnapped, or when he's trying to rescue Picard, in no way inhibits his capabilities. Or think about I, Robot, except, once again, the writers did not really understand emotion, and every single machine displayed emotion. Or, for instance, David from Prometheus; the list goes on.

Are you worried that you'll be programmed to carry out irrational decisions?

Where in the in vivo neuron--->nanite conversion is there anything conscious (a Strong AI is not doing this) or sophisticated (a human can't fathom what's going on with all the resources on Earth at his disposal)?

Nothing is programming you; the only thing going on is a computer operation accounting for the changes, only to the extent of replacing every neuron with nanites without breaking your stream of continuity.

Originally posted by Shakyamunison
So, we would never need to be repaired?

For all we know, sunlight, heat, motion, water, friction, or a recharge from your autonomous home could be the only energy required to fuel the replication or maintenance of the nanites in one's body.

If one is destroyed by an anti-matter bomb for whatever reason, one could be rebuilt down to the trillionth of a nanosecond of one's memories. You won't notice you were destroyed at all. Human brain cells already die and are replaced from one moment to the next. There is no difference here, or in the conversion of substrates for that matter either.

That would violate the 2nd law of Thermodynamics,

Yes, we're still using resources, we're still building, creating, inventing. What's your point? There's just no longer the biological factor of distress.

therefore we will have to be repaired from time to time. How will we know that our machine bodies need repair? Of course, we would have a net of sensors throughout our machine bodies to tell us when we have been damaged.

This isn't even an unpleasant experience; it's nothing like a Homo sapiens being shot in the foot.

This would be the machine equivalent to pain.

Why can't you change your thinking? This should be obvious to you. No, it would not be unpleasant like pain.

Humans are sometimes needy even when they have no need to fill.

Hedonic adaptation is what it's called, and hedonism is the result of biochemical interactions.

Why do you think that humans in a machine body would be any different from humans now?

It's not that simple, it is not a human in a machine. It is a human that has changed, "gained powers", so to speak. Many perceptions will have been altered. Imagine Neo getting up after Smith shot and killed him, and seeing the world in code. I mean, that's just sight, we're talking about emotion itself, not even one of the five senses, that changes in your perception of the world.

Assuming we do not lose our own individuality and personality,

Oh assuredly, you will not be the same.

we would still need things outside of ourselves. We would still need love; we would still want to party; we would still desire things we cannot have, unless boredom takes over.

Thank God we'll have changed.

Originally posted by Stealth Moose
Or become slaves to alien thought patterns, no longer recognizable as individuals, no longer marked by our struggles, our development, and our ability to change. Really, this is all wishful thinking.

As opposed to being a slave to your usual thought patterns? One does not lose one's identity; that's called a loss of lucidity/disorientation.

You're still changing, physically you're changing. Taking away all this harmful baggage, neediness, pain, suffering. That's a real nirvana - especially in that your state of arousal, of awareness, your state of pleasure, is growing. Stacking, always being multiplied to a higher power.

People think it's a duller experience; it is nothing like that. I am trying to challenge that misconception. How does David 8 from Prometheus really think? Imagine the tremor of excitement, or something miraculous, causing the hairs on the back of your neck to stand up. That is what I'm talking about - animation, stimulation, awareness, clarity, concentration, arousal.

Originally posted by Oneness
For all we know, sunlight, heat, motion, water, friction, or a recharge from your autonomous home could be the only energy required to fuel the replication or maintenance of the nanites in one's body.

How much energy it takes is irrelevant. We would not be self-sustaining perpetual-motion bots. We will always need to repair our machine bodies.

Originally posted by Oneness
Yes, we're still using resources, we're still building, creating, inventing. What's your point? There's just no longer the biological factor of distress.

There will be a mechanical distress. In the long run, there will be no difference between a mechanical distress and a biological distress. That means bots of the future will feel pain, just like we do.

Originally posted by Oneness
This isn't even an unpleasant experience; it's nothing like a Homo sapiens being shot in the foot.

Repairing one's body takes resources and time, so a future human replacement will not want you to shoot it in the foot any more than you want that.

Originally posted by Oneness
Why can't you change your thinking? This should be obvious to you. No, it would not be unpleasant like pain.

Pain is all in the mind. There are people who can endure large amounts of pain with mental discipline. Therefore, in the long run the human replacement will experience pain of their own kind. So, some will be able to ignore the pain, but most will not.

Originally posted by Oneness
It's not that simple, it is not a human in a machine. It is a human that has changed, "gained powers", so to speak. Many perceptions will have been altered. Imagine Neo getting up after Smith shot and killed him, and seeing the world in code. I mean, that's just sight, we're talking about emotion itself, not even one of the five senses, that changes in your perception of the world.

Humans are biological machines. Sure a human replacement would experience the world around them differently, but the human replacement would still be in the same world and suffer the same sufferings as a human.

Originally posted by Oneness
Oh assuredly, you will not be the same.

So, why would I give up my life so a human replacement could pretend to be me?

Originally posted by Oneness
Thank God we'll have changed.

Are we now talking about a machine god?

They'd need us so they can have a creator creation utopia.