Did You Know?

Started by Mindship

Did You Know?

I had to share this. The music alone makes it worthwhile.

http://www.youtube.com/watch?v=ljbI-363A2Q

Re: Did You Know?

Originally posted by Mindship
I had to share this. The music alone makes it worthwhile.

http://www.youtube.com/watch?v=ljbI-363A2Q

Awesome vid.

I think Japan already has 10 terabit backbones in place.

Cool stuff. Actually, "cool stuff" doesn't cover it. Sh*t like that is absolutely exciting for our planet, regardless of what good and bad things come of it.

Wow... that's pretty amazing stuff. I like the bit about how more information was created in the last year than in the 5,000 years before it.

I remember reading a short story by Isaac Asimov like this (I believe it was "The Feeling of Power"). Humans forgot how to do math because technology became so advanced. There was a war going on, however, and one scientist rediscovered mathematics: addition, subtraction, multiplication; he was working on division. Because the technology was so advanced, the entire war was at a stalemate; the computers were too evenly matched. Thus, a solution was born: humans should pilot the projectile weaponry (missiles, rockets, etc.) because a computer couldn't anticipate the human mind.

Not that this has much to do with anything, but I think it's far from exciting that humans will one day be completely inferior to our technology. I think it's horrifying.

Originally posted by Zeal Ex Nihilo
Not that this has much to do with anything, but I think it's far from exciting that humans will one day be completely inferior to our technology. I think it's horrifying.

Humans will eventually be more than human anyway, if we allow evolution to continue to do its work over unfathomably long periods of time. This is just speeding up the process.

I see it as a testament to human ingenuity, intelligence, as well as the power and majesty of science. Both are awe-inspiring.

...

But let's look at it another way: say an alien species came along that was vastly superior to us both mentally and physically. Like humans to dogs. But they embraced us and shepherded us to new frontiers of the mind and body. Is this bad?

Hopefully not (for most, at least). And this is the same thing. Technology won't become our master (presumably), and we are the creators behind their mechanisms, so we can guide them to ever higher heights....which will bring humans up several levels in the process.

So quit being such a species-ist. I for one look forward to possibly having a conversation with a non-human sentience in my lifetime.

Originally posted by DigiMark007
So quit being such a species-ist.

That is an unbelievable level of retardation. Don't say stupid shit like that.

😂

I see no need to fear a technological singularity... although it's relatively off topic...

Originally posted by Zeal Ex Nihilo
That is an unbelievable level of retardation. Don't say stupid shit like that.

Why? It was kind of in jest, but I fail to see what's so stupid about it... especially since the quote is contextualized in a post that explains my position.

Anyway, you did nothing to address my points. Instead, you insulted me.

Originally posted by Zeal Ex Nihilo

Not that this has much to do with anything, but I think it's far from exciting that humans will one day be completely inferior to our technology. I think it's horrifying.

For once, I agree with you.

Originally posted by DigiMark007
Why? It was kind of in jest, but I fail to see what's so stupid about it... especially since the quote is contextualized in a post that explains my position.

Anyway, you did nothing to address my points. Instead, you insulted me.


Because you implied that I implied that the human species is superior to another species (aside from animals).

I'm hoping that when we do have AIs more intelligent than people, this will help to answer some questions, such as...

1. Exactly what is intelligence? I.e., what will make these machines "smarter" than us? Will they just be better math-based problem-solvers? What about nonverbal/nondigital problem-solving? Creativity and insight? Do these play a role in human intelligence? How will we end up defining intelligence?

2. How does intelligence relate to consciousness? To motivation? E.g., will being smarter automatically instill a self-preservation "instinct"? What will AIs be "inherently driven" to figure out for themselves? (Personally, I don't think they will automatically deem us inferior and/or a threat to their existence.) And will machine intelligence highlight any paths to take in exploring/defining "consciousness"?

BTW, I don't recall anyone ever mentioning this site, but this...
www.orionsarm.com
...is a scifi site which takes the concept of AIs to very cool extremes.

Originally posted by Mindship
I'm hoping that when we do have AIs more intelligent than people, this will help to answer some questions, such as...

1. Exactly what is intelligence? I.e., what will make these machines "smarter" than us? Will they just be better math-based problem-solvers? What about nonverbal/nondigital problem-solving? Creativity and insight? Do these play a role in human intelligence? How will we end up defining intelligence?

I figure if they start taking over, they're smarter in the ways that count.

Originally posted by Mindship
2. How does intelligence relate to consciousness? To motivation? E.g., will being smarter automatically instill a self-preservation "instinct"? What will AIs be "inherently driven" to figure out for themselves? (Personally, I don't think they will automatically deem us inferior and/or a threat to their existence.) And will machine intelligence highlight any paths to take in exploring/defining "consciousness"?

Depends on the form of the AI. A nonvolitional AI would have no such drive, of course, and a normal one could be programmed not to think about that (I assume). If they did think about superiority I don't see why they wouldn't deem themselves superior.

Originally posted by Mindship
BTW, I don't recall anyone ever mentioning this site, but this...
www.orionsarm.com
...is a scifi site which takes the concept of AIs to very cool extremes.

Have you ever read Hyperion? It has a great (if pessimistic) take on AIs that are so far above us as to exist as what are essentially Planck-tech beings.

Originally posted by Symmetric Chaos
I figure if they start taking over, they're smarter in the ways that count.

This is why we keep you around 😂

Originally posted by Symmetric Chaos
I figure if they start taking over, they're smarter in the ways that count.
*thinking back to the 2000 Prez election...*

Originally posted by Symmetric Chaos
Have you ever read Hyperion? It has a great (if pessimistic) take on AIs that are so far above us as to exist as what are essentially Planck-tech beings.

I read the first book a long time ago. Great story, though IIRC the first book focused more on the pilgrims to the Time Tombs than on AI. But yeah, I understand the series has terrific AI stuff.

Originally posted by Zeal Ex Nihilo
Because you implied that I implied that the human species is superior to another species (aside from animals).

No, I didn't. You said you were terrified of us being inferior to machine intelligence (in many ways we already are, btw). It didn't imply a species superiority but an inherent fear of another species (in this case, a synthetic AI being) trumping our capabilities. So yeah, it was slightly species-ist, just not in the way you thought I meant it.

I didn't mean it as an overt insult, however, just a jibe at your horror that I find to be a bit misplaced. Sorry you took it so harshly.

...

As for Mindship's musings about intelligence, most CPUs would generally wax the floor with us on standard IQ tests. Ironically enough, the exactitude of computers is something that we occasionally see as a fault. As if our penchant for making errors somehow makes us more intelligent (though it does add a certain seeming randomness to behavior that makes the human machine impossible to predict accurately).

But when we talk about obstacles in AI, we generally mean consciousness, and machine AI's (seeming) lack of it. But the fact remains that at some point along our evolutionary lineage we weren't consciously aware, and at another point we were. It obviously wasn't an on/off sort of thing, but gradual steps of consciousness...like how a dog is probably conscious but not at the level of humans.

Therefore, it's only a matter of complexity. Unfortunately, computer AI lacks the processor-on-processor complexity of the billions of neurons that humans possess, so it's many orders of magnitude away from achieving a human level of consciousness. But it isn't outlandish to imagine that we will be able to construct something with at least rudimentary awareness of itself within a lifetime or two.

We could argue until we die about what consciousness is: separate from or the same as the physical processes that give rise to it. But that isn't my point. Whichever it is, the operative idea is that our physical nature gives rise to consciousness (regardless of whether it is itself physical or not), and so it is possible (though difficult) to create such beings artificially rather than having them grown over hundreds of millennia via evolution.

Originally posted by DigiMark007
the operative idea is that our physical nature gives rise to consciousness (regardless of whether it is itself physical or not), and so it is possible (though difficult) to create such beings artificially rather than having them grown over hundreds of millennia via evolution.
Well, this is part of what I am wondering. Certainly, if we live in a fundamentally material universe (matter gives rise to consciousness), then it is just a matter of time before the complexity of our machines exceeds the complexity of the human brain and we will have AI superconsciousness.

However, if the mystical/transcendent paradigm is correct (Consciousness precedes and emerges through material complexity, not from it), then there may very well be an element to Consciousness that no machine, no matter how complex it is, will be able to replicate/possess.

I don't intend to open up a What Is Reality discussion here (Lord knows, we have enough threads on that topic). I was just elucidating where I was coming from in my musings.

Originally posted by Mindship
Well, this is part of what I am wondering. Certainly, if we live in a fundamentally material universe (matter gives rise to consciousness), then it is just a matter of time before the complexity of our machines exceeds the complexity of the human brain and we will have AI superconsciousness.

However, if the mystical/transcendent paradigm is correct (Consciousness precedes and emerges through material complexity, not from it), then there may very well be an element to Consciousness that no machine, no matter how complex it is, will be able to replicate/possess.

I don't intend to open up a What Is Reality discussion here (Lord knows, we have enough threads on that topic). I was just elucidating where I was coming from in my musings.

I think the former is a much more realistic way of looking at things, as the latter theory requires a certain amount of belief/faith in the presence of mystical forces to accept.

In any case, I have no problem accepting the possibility that consciousness is separate from physical forces. But I see it as rather clearly a bottom-up construction rather than top-down.

Of course, the creation of computer AI would vindicate this position, but until then it's just educated hypotheses on both sides.

Originally posted by DigiMark007
No, I didn't. You said you were terrified of us being inferior to machine intelligence (in many ways we already are, btw). It didn't imply a species superiority but an inherent fear of another species (in this case, a synthetic AI being) trumping our capabilities. So yeah, it was slightly species-ist, just not in the way you thought I meant it.

I didn't mean it as an overt insult, however, just a jibe at your horror that I find to be a bit misplaced. Sorry you took it so harshly.

...

As for Mindship's musings about intelligence, most CPUs would generally wax the floor with us on standard IQ tests. Ironically enough, the exactitude of computers is something that we occasionally see as a fault. As if our penchant for making errors somehow makes us more intelligent (though it does add a certain seeming randomness to behavior that makes the human machine impossible to predict accurately).

But when we talk about obstacles in AI, we generally mean consciousness, and machine AI's (seeming) lack of it. But the fact remains that at some point along our evolutionary lineage we weren't consciously aware, and at another point we were. It obviously wasn't an on/off sort of thing, but gradual steps of consciousness...like how a dog is probably conscious but not at the level of humans.

Therefore, it's only a matter of complexity. Unfortunately, computer AI lacks the processor-on-processor complexity of the billions of neurons that humans possess, so it's many orders of magnitude away from achieving a human level of consciousness. But it isn't outlandish to imagine that we will be able to construct something with at least rudimentary awareness of itself within a lifetime or two.

We could argue until we die about what consciousness is: separate from or the same as the physical processes that give rise to it. But that isn't my point. Whichever it is, the operative idea is that our physical nature gives rise to consciousness (regardless of whether it is itself physical or not), and so it is possible (though difficult) to create such beings artificially rather than having them grown over hundreds of millennia via evolution.

I would like to point out that CPUs are stupid, not because we can make mistakes, but because we can be inexact and still accomplish our goals. Computers cannot begin to cope with anything that is not programmed into them.