Nano-technology, what are the full implications? Why AI would need our brains.

Started by Oneness · 5 pages
Originally posted by Kostabot
We are the product of evolution, not its victim.

All our shortcomings are a product of evolution as well.

Genetic disorders are not an evolutionary process,

Mutations are very much a part of the evolutionary process. Nature trims down undesired mutations, and what we're left with is a human; it ain't a pretty process. And even after we're born, the environment influences our genes: scar tissue might hold mutated DNA strands, and aging is degraded DNA. No two humans have the same genetic code.

A supermodel's son, her own flesh and blood, is no more superior than any other human - the supermodel became a supermodel because of epigenetics. It's a continuing process. That does not mean that some people don't have exclusive genes, or that some people aren't limited by their genetics - we are very limited by our genetics - but every gene that gets turned on and off creates a set of working genes that produce a unique product. The biosoup of the human genome is far more complicated than you think.

Neuroscience is far more complex.

Define "genetic disorder"; in my mind rage is a genetic disorder we all suffer from. Insecurity is another.

otherwise perfect.

No such thing.

Animals go extinct because their habitat changes faster than evolution can keep up,

Evolution isn't conscious like you are - it is as random as the forces of nature. Humanity has the ability to reshape the environment of the human body, evolution, nature, etc; transcending the very definition of technology in the process.

Actually, it is. Your nervous system houses the autonomic, sympathetic, and parasympathetic subsystems. Distress is the result of the sympathetic response, and although the brain may not be the "cause" of your distress, it is responsible for making you feel it.

Just like your thoughts are victim to the many structures in your brain, those structures are victim to your body's other organs and their structures. Even a tiny, unnoticeable blood clot can lead to death.

sahaworld

This world, which is full of suffering. Often translated as the world of endurance. Saha means the earth; it derives from a root meaning "to bear" or "to endure." For this reason, in the Chinese versions of Buddhist scriptures, saha is rendered as endurance. In this context, the saha world indicates a world in which people must endure suffering. It is also defined as an impure land, a land defiled by earthly desires and illusion, in contrast with a pure land. The saha world describes the land where Shakyamuni Buddha makes his appearance and instructs living beings. In Buddhist scriptures, the saha world indicates either Jambudvipa, which is one of the four continents of ancient Indian cosmology, or the entire world containing all four continents. It also indicates the major world system, considered to be the realm of Shakyamuni's instruction. In some Buddhist scriptures, including the Lotus and Vimalakirti sutras, it is held that the saha world, this world full of distress and suffering, is in itself a pure land, the Land of Eternally Tranquil Light. In the "Life Span" (sixteenth) chapter of the Lotus Sutra, Shakyamuni states, "Ever since then I have been constantly in this saha world, preaching the Law, teaching and converting," indicating that the place where the Buddha dwells, the Buddha land, is in fact the saha world.

http://www.sgilibrary.org/search_dict.php?id=1875

Sorry, but I thought it was necessary to define the word Sahaworld. You are talking about the sahaworld. You can’t change that. All you will do is build a machine that will replace you, but it will still be in the sahaworld. Now the machine will suffer in your stead, and the entity that is you will never grow. You will never reach Nirvana, and I’m not talking about the rock band.

Death is natural.

Those who survive have children, who have children who survive. In science, there is nothing saying that evolution is wrong. There is no judgment or condemnation. Science is ill-equipped for such judgment. Should we augment a human or not? Science can’t give you that answer. When there is nothing left but machine, who will make the choice?

Extinction is inevitable, but not for a long time. It may be attractive to fantasize about saving a dying race, but it is so far off into the future that it might as well be the stars. Humans of the future will decide this question, and I hope they make the correct choice.

Originally posted by Shakyamunison
Sorry, but I thought it was necessary to define the word Sahaworld. You are talking about the sahaworld. You can’t change that. All you will do is build a machine that will replace you, but it will still be in the sahaworld. Now the machine will suffer in your stead, and the entity that is you will never grow. You will never reach Nirvana, and I’m not talking about the rock band.

Death is natural.

Those who survive have children, who have children who survive. In science, there is nothing saying that evolution is wrong. There is no judgment or condemnation. Science is ill-equipped for such judgment. Should we augment a human or not? Science can’t give you that answer. When there is nothing left but machine, who will make the choice?

Extinction is inevitable, but not for a long time. It may be attractive to fantasize about saving a dying race, but it is so far off into the future that it might as well be the stars. Humans of the future will decide this question, and I hope they make the correct choice.

Well said.

Oneness - I can't really retort to or debate with a person who gives me three-word definitive statements like "No it's not" or "not the case" without any rationale for all of my points. It will just end up with me pointlessly paraphrasing the exact same thing back to you over and over in different ways, while you continue to miss my point entirely.

It's equally difficult to have this conversation with a person who is not open to spiritual points of view, and dismisses them as primitive (despite their having survived thousands of years and still being practiced today). And just to clarify, I don't mean religion, I mean spirituality; the two are different, and I'm in no way a religious person.

To wrap it up in one final statement, I'll just have to respectfully disagree with you that augmentation and technological advance is the best way forward for mankind. I believe that we are spiritual beings first and physical beings second, so to me the focus on developing the former is of more importance than finding ways to outsource what my mind and body are capable of doing, given enough effort and psycho-spiritual investment and exploration. With that being said, I don't deny the value of these advances; I simply don't think they should be overstated in place of growing your mind through inner learning.

I respect your commitment to your point of view; for what it's worth, to be true to yourself is a noble trait, even if our points of view do not correlate. 😎

Originally posted by Shakyamunison
Sorry, but I thought it was necessary to define the word Sahaworld. You are talking about the sahaworld.

No, I'm talking about the real world, a world we cannot define in the way you are without being unscientific.

You can’t change that. All you will do is build a machine that will replace you,

It's more complicated than that - your waking experience, your conscious stream of continuity, does not change throughout the process. You retain your identity:

What happens is this: a self-sustaining nano-robotic machine, far smaller than a neuron, gets into the brain and takes up an infinitesimal amount of space while replacing the functions of many neurons at once. The brain lets those neurons go and forms more, and you get smarter. Eventually, since two objects cannot occupy the same space at the same time, a huge mass of these nanites leaves no room for neurons.

You won't notice that anything has changed, but you will notice all the things you can do that a human simply can't.

but it will still be in the sahaworld.

Would having more information about the universe change that opinion?

Now the machine will suffer in your stead,

What if we suffer because nature is harsh?

I will quote an earlier thesis, and I want you to consider the implications therein:

For a human, hedonic capital can only be gained through stimuli (sex drive, hunger, fight or flight, etc.); for a transhuman, it's free and limitless. This fact, combined with the abundance of autonomous infrastructure, effectively eliminates the need for all the evil man creates in response to the negativity that humans cannot otherwise escape, due to our evolutionary predisposition.

and the entity that is you will never grow.

Yes it will, it's called Moore's Law:

Moore's law is the observation that, over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years.
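The doubling rule paraphrased above can be sketched numerically. Below is a minimal sketch in Python; the 1971 Intel 4004 starting figure (~2,300 transistors) is a commonly cited reference point used purely for illustration, and the function name and parameters are my own, not from any source:

```python
# Illustrative sketch of Moore's law: transistor counts doubling
# roughly every two years. The ~2,300-transistor starting point
# (Intel 4004, 1971) is an assumed, commonly cited figure.

def transistors(year, base_year=1971, base_count=2300, period=2):
    """Project a transistor count assuming one doubling every `period` years."""
    doublings = (year - base_year) / period
    return base_count * 2 ** doublings

for y in (1971, 1991, 2011):
    print(y, round(transistors(y)))
```

Under this naive projection the count grows roughly a thousandfold every twenty years, which is the growth being invoked against the claim that "the entity will never grow."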

You will never reach Nirvana,

Define nirvana. If it is a part of Buddhism, then you have to consider the possibility that it is not a real place, just as a Christian would have to consider the possibility that New Jerusalem is not a real place.

Death is natural.

So are rape, mutilation, and infanticide. That does not mean those things should be permitted. And personally, neither should the aging process. Even transhumans can be destroyed by, say, a supernova, so it's not immortality; there's no concept about the universe (like immortality) we can hold fixed, because that definition will change as our understanding of the real world changes. The saha world is a primitive concept of this real world, and it is subject to change within general perceptions as well. Transhumans aren't immortal; the difference is that posthumans/transhumans are no longer subject to the harshness of nature, or to their own biochemistry that has evolved from it.

Those who survive have children, who have children who survive. In science, there is nothing saying that evolution is wrong. There is no judgment or condemnation. Science is ill-equipped for such judgment.

It's a matter of perspective, I agree.

Should we augment a human or not? Science can’t give you that answer.

It can demonstrate the plausibility, and provide a way to make life easier and more meaningful. 😱

When there is nothing left but machine, who will make the choice?

Substrate independence and strong AI, to use scientific terms, are more complicated than "machine" and "human," and cannot be broken down into those concepts - because they have what you define as the soul.

Extinction is inevitable,

Not necessarily. If we become a Type III civilization on the Kardashev scale, extinction will be impossible short of a Big Crunch or Big Chill.

but not for a long time.

Not necessarily. It could happen very soon if the nations of the world decide to engage in nuclear war.

It may be attractive to fantasize about saving a dying race, but it is so far off into the future that it might as well be the stars.

Actually, some scientists, like Stephen Hawking, argue that we only have about 100 years to start spreading to other worlds before overpopulation, lack of fossil fuels, etc. cause the nations to kill each other.

Humans of the future will decide this question, and I hope they make the correct choice.

Why does transhumanism have to automatically be a bad thing??

Originally posted by Shakyamunison
You are talking about the sahaworld. You can’t change that. All you will do is build a machine that will replace you, but it will still be in the sahaworld. Now the machine will suffer in your stead, and the entity that is you will never grow. You will never reach Nirvana, and I’m not talking about the rock band.
Nicely put.

I believe humans suffer from an anthropocentric viewpoint, in that we think the universe is a certain way and so we erroneously base our opinions on that viewpoint. That is the essence of spirituality. Then, with religions, we try to rationalize death, and you get the afterlife. Carl Sagan explains it well in his Pale Blue Dot documentary; Sagan also believed that Homo sapiens would not colonize other worlds. The best thing you can do is to be very careful about your beliefs: you need to be willing to view the world as an unknown, and not bring a spiritual sense of self-righteousness into the mix, or classify the unknown as something to be feared.

Sometimes thinking outside the box is required. We had a problem, death, so we invented the concept of the afterlife.

When we view objects (consciousness) as functioning only in the usual way (spirit/soul), we're engaging in a tendency called functional fixedness. Functional fixedness often prevents us from understanding objects (consciousness and death) for what they really are (a bio-mechanical interaction between the structures of an organ [the brain] and the inability for those structures to produce bio-mechanical interactions).

A computer that can question its own operations, and willingly refuse to carry them out, is just another way in which consciousness can come into being. Aging can also be misunderstood as a natural, infallible process, in that people will respond violently to the notion of preventing aging. Aging is like drugs, alcohol, gunshot wounds, etc.; it is merely a way for our organs to sustain damage. Evolution has given us sexual organs and the ability to reproduce in order to survive longer than one lifetime. However, evolution also allows for things like rape, homicide, etc. If our technology ever becomes sophisticated enough to make evolution a more precise and exact process, then it can make life better and more meaningful for all of us.

Originally posted by Oneness
I believe humans suffer from an anthropocentric viewpoint, in that we think the universe is a certain way and so we erroneously base our opinions on that viewpoint. That is the essence of spirituality. Then, with religions, we try to rationalize death, and you get the afterlife. Carl Sagan explains it well in his Pale Blue Dot documentary; Sagan also believed that Homo sapiens would not colonize other worlds. The best thing you can do is to be very careful about your beliefs: you need to be willing to view the world as an unknown, and not bring a spiritual sense of self-righteousness into the mix, or classify the unknown as something to be feared.

You have spirituality all wrong. Spirituality is the other point of view.

Originally posted by Shakyamunison
You have spirituality all wrong. Spirituality is the other point of view.
That we are in fact unimportant, as our position in the cosmos indicates? That would be a fairer observation, but that is not spirituality in any sense. Spirituality says that there is a cosmic battle of good and evil being waged for our souls. That seems illogical: that something bigger than us cares enough to condemn the wicked and test the righteous. It is not impossible, just unsupported and illogical. Why would a super-intelligent system act illogically over little microcosms like us?

Originally posted by Oneness
That we are in fact unimportant, as our position in the cosmos indicates? That would be a fairer observation, but that is not spirituality in any sense. Spirituality says that there is a cosmic battle of good and evil being waged for our souls. That seems illogical: that something bigger than us cares enough to condemn the wicked and test the righteous. It is not impossible, just unsupported and illogical. Why would a super-intelligent system act illogically over little microcosms like us?

Good and evil are two sides of the same coin. One cannot fight the other. It would be like a dog chasing his tail. The only war is in the minds of people.

Important? Everything is equally important, but that means that “important” has no meaning. Nothing is important.

Why would the universe care? It is talk like this that makes me think you are a Christian. I can only conclude that you are a Christian who does not believe in Christianity. If you are really going to not be a Christian, then you need to open your mind to other possibilities. If they don't make sense to you, then don't accept them. But how can you reject something that you know nothing about? That is illogical.

1. I've argued that just because a transhuman society is above suffering doesn't mean that evil is gone; it's just not a part of their experience. I don't see how suffering can be a part of something so sophisticated; I base this opinion on facts I've already stated. You continue to argue that a transhuman can suffer even though pain is no longer necessary for them as it is for us, on top of the pain experience being eliminated from their anatomical structure. You've argued that without suffering there is no free will; I've argued against that opinion as well. As far as I can tell, removing suffering is necessary for productivity, creativity, and technological and scientific endeavors.

2. I'm not rejecting an anthropocentric viewpoint; I'm discrediting one based on what I believe to be logical reasoning.

Originally posted by ODG
*walks in thread*

*looks at only three posts*

*walks out of thread w/ facepalm*

[YouTube video]

It might do both. First off,

The Next Economic Revolution:

It's the longest internet article I've ever seen. Basically, it's hinting at what I said earlier in this thread: technocracy. The free market system, capitalism, will be untenable due to the complexity of the world. It will be a post-scarcity, resource-based economy. No more big business strangling out people trying to have some of what the upper class has and failing miserably by trying to start their own businesses (which is impossible, btw - you'll just be absorbed and made irrelevant by the big companies); no more things like General Motors keeping us dependent on fossil fuels and keeping obsolete cars in place.

We'll be able to have Kostabot's flying cars, something I swore they'd have by 2002 when I was a little kid, and we could have. Not only that, we'll have subterranean transcontinental transit systems - except not subways: electric-powered, 4,000 mph magnetically levitated trains running through compact, pressurized tubes, using 2% of the energy planes run on to travel far faster and more safely. Every new home or building in a city might be assembled and transported into position from this autonomous city, which isn't much different from a giant Utopian solar-powered factory - every component of every structure in said city will be compatible for retrofitting; every structure will slide down, picking up parts, and end up in position. These kinds of cities will also be impervious to the weather, even if they're coastal cities.
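The 4,000 mph and "2% of the energy planes run on" figures above can at least be sanity-checked with a toy back-of-envelope calculation. All inputs below (the route length, the normalized plane energy) are illustrative assumptions of mine, not engineering data:

```python
# Toy sanity check on the quoted figures. All inputs are
# illustrative assumptions, not measured data.

speed_mph = 4_000        # quoted tube-train speed
route_miles = 2_800      # rough transcontinental distance (assumed)

trip_hours = route_miles / speed_mph
print(f"Transcontinental trip: about {trip_hours * 60:.0f} minutes")

plane_energy = 1.0                  # airliner energy for the same trip, normalized
tube_energy = plane_energy * 0.02   # the "2% of plane energy" claim
print(f"Tube train: {tube_energy:.0%} of the plane's energy")
```

Under these assumed numbers, a coast-to-coast trip would take well under an hour; whether the 2% energy figure holds up is a separate engineering question the article does not settle.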

That's what happens when keeping a big business like GM relevant is no longer of concern.

Yes, we're becoming so dependent on technology that we might as well be its slaves.

If it gains consciousness, why wouldn't we be??

All the more reason for a preemptive conversion of substrates. Hopefully someone designs a program powerful enough to map the human brain at super speed; the program will be the eyes and ears of the nanites that are replacing cells with molecular precision. That kind of program is essential, because it will take humanity to the same level as technology before someone else succeeds in designing a "strong" AI program [strong meaning that it is greater than the human race in sophistication].

Sure, that program would be capable of doing whatever a less-than-sentient program could do [finish off the BRAIN Initiative and the Human Genome Project]; it could recreate a human mind in cyberspace - without an in vivo conversion of a living mind (the only way to change without breaking your stream of continuity and "dying") - on top of outsourcing to smarter programs. But there's no guarantee a sentient program would do anything we'd tell it at all. Logically, it would knock us out of the way, or allow us to convert substrates - or just make it so we want to.

There are two upcoming films on the matter. Chris Nolan is producing one about copying a mind into cyberspace, centered on the scientist [Johnny Depp] who makes that program. The other movie does not yet have a release date; it's later on down the line and will be called 'Singularity' - it is about a boy named Adam who's made of nanites.

Interesting, though, that neither film features a converted human. One is a cyber copy of a human's mind, the other a boy engineered from the ground up with nanites instead of cells. What is most likely to happen first - what this whole thread is about - is replacing human cells without interrupting that human's stream of continuity, so they don't notice they've changed, and so that you would retain your identity and subjectivity throughout the transformation.

If this were Hinduism, you'd be reincarnated as a higher form of life without dying. Your cells are gone, because nanites and cells cannot occupy the same space at the same time, but you never 'went away'.

^ Bad science fiction.

Originally posted by Shakyamunison
^ Bad science fiction.
I'm actually looking forward to Transcendence.

If no one brings it up in the movie discussion board here, I will post the trailer as soon as it hits the tubes.

Singularity, I know nothing about that film. I can only find it in one article on the web.

Merging with technology completely is the only way for a human to be liberated from a stagnant, declining, scarcity-based civilization, because as Super Beings we are no longer subject to Maslow's hierarchy of needs.

All of our current motivators weigh us down, and self-actualization is very difficult considering how needy we are. As substrate independents, with such substrates as a strong AI available to us, we will have perpetual, exponential motivation to fuel this self-actualization.

Imagine your highest state of optimistic arousal, multiply it by 1 billion, imagine it as an exponent, and then raise it to that exponent's power ad infinitum.

Now imagine this frame of mind being constantly available to you.

Originally posted by Oneness
Merging with technology completely is the only way for a human to be liberated from a stagnant, declining, scarcity-based civilization, because as Super Beings we are no longer subject to Maslow's hierarchy of needs.

So, we would never need to be repaired? That would violate the second law of thermodynamics, so we will have to be repaired from time to time. How will we know that our machine bodies need repair? Of course, we would have a net of sensors throughout our machine bodies to tell us when we have been damaged. This would be the machine equivalent of pain.

Originally posted by Oneness
All of our current motivators weigh us down, and self-actualization is very difficult considering how needy we are. As substrate independents, with such substrates as a strong AI available to us, we will have perpetual, exponential motivation to fuel this self-actualization.

Humans are sometimes needy even when they have no need to fill. Why do you think that humans in a machine body would be any different from humans now? Assuming we do not lose our own individuality and personality, we would still need things outside of ourselves. We would still need love; we would still want to party; we would still desire things we cannot have, unless boredom takes over.

Originally posted by Oneness
Imagine your highest state of optimistic arousal, multiply it by 1 billion, imagine it as an exponent, and then raise it to that exponent's power ad infinitum.

Now imagine this frame of mind being constantly available to you.