KillerMovies - Movies That Matter!


On The Technological Singularity
Started by: Oneness

KillaKassara
Restricted

Gender: Male
Location: Midwest


On The Technological Singularity

The Singularity does not necessitate transhumanism (cybernetic brains), because machines will never be able to think unless we program them with emotions.

Technological transhumanism is also unnecessary for transhumanism in general, since we don't need it to achieve biological immortality. Nucleotide sequences can be altered and turned into anything using embryonic stem cells, which can then be injected directly into a human to rewrite the DNA code to do anything, while nanotechnology constantly trims off parts of the strands that degrade or fail to maintain this rechristened DNA code.

Transhumanism does, however, ensure that a technological utopia that accommodates or even tolerates humans will never be achieved. Yet to program emotions into these computers would be an act of genocide; emotions are the only incentives that aren't arbitrated by rules and regulations.

Still, the singularity can happen without self-aware artificial intellects. Computers are already superior at carrying out every task a human could. Machines and computers are interchangeable; they are physically limitless in application, vastly more farseeing in calculation, self-sustaining, and ever-improving.

Lastly, the singularity is necessary for a technological utopia, which in and of itself is necessary to make human life liberated, fair, prosperous, and simple enough for the golden rule to come into effect.


__________________
"Compounding these trickster aspects, the Joker ethos is verbally explicated as such by his psychiatrist, who describes his madness as "super-sanity." Where "sanity" previously suggested acquiescence with cultural codes, the addition of "super" implies that this common "sanity" has been replaced by a superior form, in which perception and processing are completely ungoverned and unconstrained"

Last edited by KillaKassara on Jul 26th, 2014 at 03:25 AM

Old Post Jul 26th, 2014 03:13 AM
Lord Lucien
Lets all love Lain

Gender: Male

Ray Kurzweil predicts it will happen by 2045, so... huzzah!


__________________
Recently Produced and Distributed Young but High-Ranking Political Figure of Royal Ancestry within the Modern American Town Affectionately Referred To as Bel-Air.

Old Post Jul 26th, 2014 03:38 AM
zrh23
Junior Member


Layman's terms? That's a whole lot of jargon for something as easy as saying we don't need emotional computers to reach an event horizon.

Old Post Jul 26th, 2014 03:47 AM
KillaKassara
Restricted

Gender: Male
Location: Midwest


quote: (post)
Originally posted by Lord Lucien
Ray Kurzweil predicts it to happen by 2045, so... huzzah!
So we'll have the technology to design and build a self-sustaining, self-improving, work-free, moneyless, classless industrial infrastructure within 31 years.

How long do you think it would take to dismantle the modern greed-based society's caste system to allow for said machine-run classless society?

Maybe, say, 17 years of exposing the wrong business deals to the right people via global spying? Illegally cracking every database on Earth to spy on everyone may be wrong, but information about the greedy can be useful in the right hands, with the right propaganda and exposure.

It used to be called muckraking, only these days a filthy-rich, stupendously brilliant young man can do far more damage to, say, General Motors or to the current officials that compose the House of Representatives than an army of a billion could.

There will be no hesitation, there will be no mistakes.



Last edited by KillaKassara on Jul 26th, 2014 at 03:55 AM

Old Post Jul 26th, 2014 03:48 AM
KillaKassara
Restricted

Gender: Male
Location: Midwest


quote: (post)
Originally posted by zrh23
Layman's terms? That's a whole lot of jargon for something as easy as saying we don't need emotional computers to reach an event horizon.
It's more informative than the aforementioned simplistic explanation.



Old Post Jul 26th, 2014 03:56 AM
zrh23
Junior Member


Absolutely no doubt. I read your threads, always entertaining. No disrespect meant.

Old Post Jul 26th, 2014 12:01 PM
loaderharrison
Restricted



Yeah, it's correct.

Old Post Nov 6th, 2014 09:03 AM
Shakyamunison
Nam Myoho Renge Kyo

Gender: Male
Location: Southern Oregon, Looking at you.

Re: On The Technological Singularity

quote: (post)
Originally posted by Oneness
The Singularity does not necessitate transhumanism (cybernetic brains), because machines will never be able to think unless we program them with emotions.

Technological transhumanism is also unnecessary for transhumanism in general, since we don't need it to achieve biological immortality. Nucleotide sequences can be altered and turned into anything using embryonic stem cells, which can then be injected directly into a human to rewrite the DNA code to do anything, while nanotechnology constantly trims off parts of the strands that degrade or fail to maintain this rechristened DNA code.

Transhumanism does, however, ensure that a technological utopia that accommodates or even tolerates humans will never be achieved. Yet to program emotions into these computers would be an act of genocide; emotions are the only incentives that aren't arbitrated by rules and regulations.

Still, the singularity can happen without self-aware artificial intellects. Computers are already superior at carrying out every task a human could. Machines and computers are interchangeable; they are physically limitless in application, vastly more farseeing in calculation, self-sustaining, and ever-improving.

Lastly, the singularity is necessary for a technological utopia, which in and of itself is necessary to make human life liberated, fair, prosperous, and simple enough for the golden rule to come into effect.


I don't believe in the Technological Singularity. New discoveries will make it obsolete.


__________________

Old Post Nov 10th, 2014 05:37 PM
dadudemon
Senior Member

Gender: Male
Location: Bacta Tank.

Re: Re: On The Technological Singularity

quote: (post)
Originally posted by Shakyamunison
I don't believe in the Technological Singularity. New discoveries will make it obsolete.


Would not new discoveries lead to the technological singularity?


__________________

Old Post Nov 10th, 2014 05:44 PM
Shakyamunison
Nam Myoho Renge Kyo

Gender: Male
Location: Southern Oregon, Looking at you.

Re: Re: Re: On The Technological Singularity

quote: (post)
Originally posted by dadudemon
Would not new discoveries lead to the technological singularity?


Maybe, but most likely not. We humans are not very good at predicting the future. It always turns out differently than we imagine.


__________________

Old Post Nov 11th, 2014 05:07 PM
dadudemon
Senior Member

Gender: Male
Location: Bacta Tank.

Re: Re: Re: Re: On The Technological Singularity

quote: (post)
Originally posted by Shakyamunison
Maybe, but most likely not. We humans are not very good at predicting the future. It always turns out differently than we imagine.


I'm the opposite: I see it as a probabilistic inevitability. Unless humans do something to curb the growth of our AI technologies, there's really no avoiding it. Even if we did, someone, somewhere, will make true AI, and by then it will be too late to stop it.

I cannot see a way around this short of humans destroying themselves first, or humans strangely agreeing not to do one particular thing with technological development.


__________________

Old Post Nov 13th, 2014 02:31 AM
Shakyamunison
Nam Myoho Renge Kyo

Gender: Male
Location: Southern Oregon, Looking at you.

Re: Re: Re: Re: Re: On The Technological Singularity

quote: (post)
Originally posted by dadudemon
I'm the opposite: I see it as a probabilistic inevitability. Unless humans do something to curb the growth of our AI technologies, there's really no avoiding it. Even if we did, someone, somewhere, will make true AI, and by then it will be too late to stop it.

I cannot see a way around this short of humans destroying themselves first, or humans strangely agreeing not to do one particular thing with technological development.


Why would AI be something that would destroy us? What if it became dependent on us? Imagine a laptop that loves you. wink


__________________

Old Post Nov 13th, 2014 02:53 AM
KillaKassara
Restricted

Gender: Male
Location: Midwest


Re: Re: Re: Re: Re: Re: On The Technological Singularity

quote: (post)
Originally posted by Shakyamunison
Why would AI be something that would destroy us? What if it became dependent on us? Imagine a laptop that loves you. wink
Our destruction would be logical.

There's a difference between a program that literally cannot achieve sentience and a human mind in a virtual substrate. But there'd be plans within plans against that: a preemptive Butlerian Jihad. Schools would be built to train Mentats.

In all probability, that approach is working now for WMDs; it can work for AI if we recognize the risk before it's too late.

Transcendence with Johnny Depp was wrong in that everyone who undergoes apotheosis into binary would not join one man's ego or become extensions of it. It's very much a preferable way to go: your chances of survival are increased googols-fold, and you can set up any experience you want. It'd be The Garden.

Sci-fi has never been spot-on with its depictions of what mind-uploading would truly be like. The in-vivo approach does not break one's stream of consciousness, so you are literally going into cyberspace; the you in cyberspace is not a copy of you like in Transcendence. It's truly paradise in reality.

It's the most beautiful and exciting thing: every human left on earth liberated from suffering the instant it happens.



Last edited by KillaKassara on Nov 13th, 2014 at 03:12 AM

Old Post Nov 13th, 2014 03:06 AM
KillaKassara
Restricted

Gender: Male
Location: Midwest


There are perhaps misunderstandings as to what I'm writing, because it is the kind of thing that, no matter who you are, should make the hairs on the back of your neck stand up once you get the idea.

By the time there are about a trillion humans, long, long after the first computers can use their sensors to fully mimic an individual's thinking patterns and upload that conscious entity into cyberspace, self-improved intelligence (strong AI) will have the sophistication to carry out the in-vivo approach. (I argued diligently with a psychologist here over a year ago about whether it will ever be possible.) The in-vivo approach is to send nanites into a live human brain, slowly killing off neurons and replacing them with nanites. The human himself won't die; his brain will change from being composed of organic molecules to the silicon that comprises the artificial neurons.

The issue is that the biochemical neurons in the human brain would basically be tasered by these artificial neurons, because the artificial ones are silicon-based and the whole firing of synapses goes out the window. Perhaps a more sophisticated intellect could make them compatible. And anything is possible given ample time; even the laws of thermodynamics themselves change over a googols-of-years time-frame.



Old Post Nov 13th, 2014 03:20 AM
KillaKassara
Restricted

Gender: Male
Location: Midwest


This is where it gets spooky: the doomsday argument states that by the time there are a trillion humans spread across many habitable worlds and space stations supported by super-human AI, humanity will cease to exist.

That is contradictory. If humans have biological immortality, with nanites that pump fresh molecules into the DNA and RNA strands so that they never degrade at all, and if, technologically, we're at our peak and most capable of surviving, then how in the hell do we just drop dead?

Answer: the in-vivo approach. Nobody would die; they'd just cease to be human.

Conversely, expansion may make it impossible for humans to exist. But we could even survive that by using zero-point energy for time travel: since the smallest unit of space cannot support more than a certain amount of energy, exceeding it generates a rift that, when passed through, takes you to the universe at an earlier point in time. Humans don't die; they just disappear from the current timeline.



Last edited by KillaKassara on Nov 13th, 2014 at 03:33 AM

Old Post Nov 13th, 2014 03:25 AM
Shakyamunison
Nam Myoho Renge Kyo

Gender: Male
Location: Southern Oregon, Looking at you.

Re: Re: Re: Re: Re: Re: Re: On The Technological Singularity

quote: (post)
Originally posted by Oneness
Our destruction would be logical...


And this is from the most illogical person I know.

I didn't read the rest.


__________________

Old Post Nov 13th, 2014 04:18 AM
KillaKassara
Restricted

Gender: Male
Location: Midwest


I said logical. Think about all the upkeep; we'd be a bit of a distraction to keep around, like a pet that costs money to keep.

But no conscious mind thinks like that. Only a simple program.

There's real reason to fear something more cognitively powerful than every human mind combined.



Old Post Nov 13th, 2014 04:44 AM
dadudemon
Senior Member

Gender: Male
Location: Bacta Tank.

Re: Re: Re: Re: Re: Re: On The Technological Singularity

quote: (post)
Originally posted by Shakyamunison
Why would AI be something that would destroy us?


I didn't state or imply that, but that is one of the outcomes of the "Technological Singularity."

I said:

"I cannot see a way around this short of humans destroying themselves, first, or humans strangely agreeing to not do 1 particular thing with technological development."

In the above sentence, "this" = Technological Singularity.

To state it even more directly, one of the few ways we can prevent the Technological Singularity is if we completely wipe out all humans before we create true AI.

The only other probable way I can see to prevent the Technological Singularity is for all humans, for the rest of human existence, to agree not to create AI beyond a certain point.


I cannot think of any other probable, non-God interference, ways to prevent it from happening.

quote: (post)
Originally posted by Shakyamunison
What if it became dependent on us? Imagine a laptop that loves you. wink


As a kid, I used to speculate that humans created God via AI. Then that AI transcended time and space and started interacting with humans throughout history. smile


__________________

Old Post Nov 14th, 2014 07:18 AM
Shakyamunison
Nam Myoho Renge Kyo

Gender: Male
Location: Southern Oregon, Looking at you.

Re: Re: Re: Re: Re: Re: Re: On The Technological Singularity

quote: (post)
Originally posted by dadudemon
...I cannot think of any other probable, non-God interference, ways to prevent it from happening...


But what if it doesn't happen? I think that AI will take millions of years to evolve into existence.


__________________

Old Post Nov 14th, 2014 07:43 AM
dadudemon
Senior Member

Gender: Male
Location: Bacta Tank.

Re: Re: Re: Re: Re: Re: Re: Re: On The Technological Singularity

quote: (post)
Originally posted by Shakyamunison
But what if it doesn't happen? I think that AI will take millions of years to evolve into existence.


We already have advanced AI, so now what are your thoughts?


We are just shy of making human-like intelligence. Seriously. There are people who study these things. At the current rate of improvement, we are looking at it happening around 2024-2025. Check out this guy:

http://en.wikipedia.org/wiki/Nigel_Shadbolt


I don't know how to word it any better: it is currently inevitable. Unless something drastic changes, we are right on track. Google may have already obtained "near-human" AI with one of their projects, DeepMind. Also, check out IJCAI.


The only reason people seem skeptical of AI is because it seems like sci-fi. It's not. It is here, now. AI has taken off in the last 10-15 years. This is no different than improving our particle accelerators over the last 20 years. People seem to have no problem with particle physicists working on better accelerators slated for release 10-15 years out (such as the Large Hadron Collider, which took 20 years to build and a few more years after that to bring to fully operational status). People can buy that. They can digest and accept it even though it was the very cutting edge of particle physics. Why do people accept that and not the AI projects? Because particles do not think back at you. AI does. That scares people, so they become skeptics and doubters.

People are going to shit themselves and be utterly shocked when we release a "very near" human-like AI. All we have to do is create an AI close enough to human-like intelligence that it can start improving itself at a decent pace (meaning better than we have now, but not necessarily anywhere close to what a human can do; just good enough that it resembles the performance of a human, because computers do not tire, so they can keep working while we poop or sleep).
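The "improving itself at a decent pace" idea is just compound growth. A minimal sketch, where the 10% per-generation gain is an invented illustration rather than any measured figure:

```python
# Toy model of recursive self-improvement: each generation of the AI
# builds a successor that is a fixed fraction r better than itself.

def capability_after(generations: int, start: float = 1.0, r: float = 0.10) -> float:
    """Capability after `generations` rounds of fractional improvement r."""
    c = start
    for _ in range(generations):
        c *= 1.0 + r
    return c

# Machines iterate around the clock, so even a modest 10% gain per
# generation compounds fast: 100 generations multiply capability
# roughly 13,800-fold.
print(capability_after(100))
```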


__________________

Old Post Nov 14th, 2014 03:38 PM
© Copyright 2000-2006, KillerMovies.com. All Rights Reserved.
Powered by: vBulletin, copyright ©2000-2006, Jelsoft Enterprises Limited.