KillerMovies - Movies That Matter!


Human extinction
Started by: Emil Blonsky

Harvey Two-Face
Restricted

Gender: Male
Location: Arkham

Account Restricted

Human extinction

Okay, I've been thinking: let's hypothetically say humans find a way to live on other planets where there's no other life. Who's to say we still won't become extinct? I mean, after all, we can't keep moving around the universe forever, right?

What are everyone's thoughts on human extinction?

Old Post Jul 8th, 2008 01:17 PM
filmchicno9
Restricted

Gender: Unspecified
Location: United States

Account Restricted

I wonder what it is that makes people reproduce if they believe life begins in other parts of the universe.

I would be happy about it, because it would erase the stress caused by mistakes like racism.

Old Post Jul 8th, 2008 02:16 PM
Digi
Forum Leader

Gender: Unspecified
Location:

Well, there's only so many more millions (possibly billions) of years any life can exist, given entropy, and the fact that it appears as though the universe will not contract at some point, but continue indefinitely to grow. It's just a scientific fact, and the vague futurist musings on converting our consciousnesses to energy fields or some such would likely only prolong the inevitable, not avoid it.

Still, Earth will go before many other planets, and our sun will go supernova before various others. We can certainly prolong the species considerably by learning to inhabit other planets, solar systems, galaxies, etc.


__________________

Old Post Jul 8th, 2008 04:35 PM
dadudemon
Senior Member

Gender: Male
Location: Bacta Tank.

quote: (post)
Originally posted by DigiMark007
Well, there's only so many more millions (possibly billions) of years any life can exist, given entropy,


If you're referring to "heat death," or the inevitable maximized entropy potential and its repercussions on the universe, that is absurdly far in the future...we're talking >10^100 years and beyond. Really, really, really far into the future.
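Just to show where a number like that comes from (a rough, back-of-the-envelope figure, nothing exact): the last things still releasing energy in a heat death scenario are evaporating black holes, and the Hawking evaporation time goes roughly as the cube of the mass:

t_{\mathrm{ev}} \approx \frac{5120\,\pi\,G^{2}M^{3}}{\hbar c^{4}} \approx 2\times10^{67}\left(\frac{M}{M_{\odot}}\right)^{3}\ \text{years}

Plug in a supermassive black hole of around 10^11 solar masses and you land on the order of 10^100 years, which is the kind of timescale heat death talk is dealing with.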


quote: (post)
Originally posted by DigiMark007
and the fact that it appears as though the universe will not contract at some point, but continue indefinitely to grow. It's just a scientific fact, and the vague futurist musings on converting our consciousnesses to energy fields or some such would likely only prolong the inevitable, not avoid it.


There's other theories which involve AI. It is estimated by some that around 2050, for all intents and purposes, god-like AI will be created. Not god-like as in all-powerful, but more like omniscient: truly able to calculate and grasp all realities, all potential realities, and all probable realities. (Potential and probable realities are slightly different, imo: potential realities don't necessarily exist, or may never exist, but they can be used to put better numbers on the probable; the probable is used to put numbers on the real...and so forth.)

Some also believe that AI will actually envelop the entirety of the universe and have utter and complete control over things such as universal expansion and proton decay. This implies that heat death would actually become impossible.

Is it really hard to see humanity creating ways around heat death or other phenomena? And this assumes that other species throughout our universe haven't already created ways around heat death or created god-like AI. Probabilities point to other "intelligent" species existing in this universe.

BTW, heat death is not the only proposed theory on the distant future of our universe. At this moment, universal expansion seems the most probable, though.

quote: (post)
Originally posted by DigiMark007
Still, Earth will go before many other planets,


I don't understand what you mean. It is possible that we will actually immortalize our planet. We may actually end up turning our planet into something that looks like a hybrid between Coruscant and a Borg Cube. Seriously. Our planet should end up as one giant organic computer.


quote: (post)
Originally posted by DigiMark007
and our sun will go supernova before various others.


It won't. Our sun is not massive enough; a star needs roughly eight solar masses or more to end in a core-collapse supernova. It will turn into a red giant and slowly die the death of most stars. We will end up with a cloud of stellar matter that was slowly ejected from our expanding red giant, and only the tiny, very dense core (a white dwarf) will remain.

quote: (post)
Originally posted by DigiMark007
We can certainly prolong the species considerably by learning to inhabit other planets, solar systems, galaxies, etc.


I don't believe that that is a probability. We will more than likely "technology out" of existence and merge into a very advanced form of AI consciousness. This assumes that there aren't various species of AI that conflict at some level for dominance of consciousness. Regardless, we will evolve and become one with the AI we give birth to.

I say it won't be a probability because it won't really be "us" that terraforms those planets into giant computers for the edification of the universal AI.


__________________

Last edited by dadudemon on Jul 8th, 2008 at 05:41 PM

Old Post Jul 8th, 2008 05:38 PM
Digi
Forum Leader

Gender: Unspecified
Location:

Like I said, futurist techno-babble promises a lot of the stuff that dudemon mentioned. It all remains speculation right now, at best. Lulz at the omniscient AI by 2050, however. If a computer with a multiversal intellect is created during my lifetime, I'll fly to your house and shake your hand.

Not that I think some of what you mentioned isn't possible. It very well might be. But talk about it like it's an inevitability that, say, Earth will one day closely resemble a Borg Cube or AI will one day control proton decay, and you lose a bit of credibility. It's a remote possibility, not an imminent future.

Although I did misspeak when I said our star would go supernova. It will have the same effect, however, since its slow expansion will eventually make Earth uninhabitable. Again, though, we're talking in terms of billions of years. Though I didn't pass off heat death as irrefutable fact, so there was no need to "correct" me. It's the most likely outcome based on our current knowledge.


__________________

Old Post Jul 8th, 2008 07:15 PM
Harvey Two-Face
Restricted

Gender: Male
Location: Arkham

Account Restricted

Our sun isn't large enough to go supernova. It'll certainly die, as will Earth, but won't go supernova.

As for humans...I dunno, maybe if we smarten up and spread our race across the universe we'll survive for a long time. But by then, if we evolve, we may become a different species entirely.


__________________

You guys need to start taking things a little more seriously.

Old Post Jul 9th, 2008 03:09 AM
Digi
Forum Leader

Gender: Unspecified
Location:

quote: (post)
Originally posted by Emil Blonsky
But by then, if we evolve, we may become a different species entirely.


That's actually fairly likely. We can already do simple forms of gene splicing for unborn children, and species can diverge in as little as a few hundred thousand years (though it usually takes longer).


__________________

Old Post Jul 9th, 2008 03:18 AM
Harvey Two-Face
Restricted

Gender: Male
Location: Arkham

Account Restricted

If we do change into another species, so to speak, what are we talking about here? What'll be different (I'm really curious)?


__________________

You guys need to start taking things a little more seriously.

Old Post Jul 9th, 2008 04:21 AM
dadudemon
Senior Member

Gender: Male
Location: Bacta Tank.

quote: (post)
Originally posted by DigiMark007
Like I said, futurist techno-babble promises a lot of the stuff that dudemon mentioned. It all remains speculation right now, at best. Lulz at the omniscient AI by 2050, however. If a computer with a multiversal intellect is created during my lifetime, I'll fly to your house and shake your hand.


Why my house? Why not the physicists who predicted it?

I qualified my statement like so:


quote: (post)
Originally posted by dadudemon
There's other theories which involve AI. It is estimated by some that around 2050, for all intents and purposes, god-like AI will be created.


Unfortunately (or fortunately) I cannot take credit for those theories. They are far from my own original ideas.

quote: (post)
Originally posted by dadudemon
Not that I think some of what you mentioned isn't possible. It very well might be. But talking about it like it's an inevitability that, say, earth will one day closely resemble a Borg Cube or AI will one-day control proton decay, and you lose a bit of credibility.


So, when I qualify my statements like so:

quote: (post)
Originally posted by dadudemon
Some also believe that AI..


You interpret that as "inevitable speak"?

Also, I qualified my statements with things such as "It is possible.." and "We may actually end up turning our planet into something that looks like a hybrid between Coruscant and a Borg Cube."

Some dude did a projection on current growth and came up with a year by which Earth would be one large megalopolis, just like Coruscant. Unless something is done to curb that, it is inevitable that the vast majority of the planet will be one city.

As far as the Earth being similar to a Borg Cube...well, if we have our way with technology, much of the planet will be computers.

Also... I was referring to the ideas of Raymond Kurzweil. He has this very canny (canny, because he isn't just guessing) ability to predict technological milestones.

Here is a nice summary of his "planet stuff" from wiki.

" * The physical bottom limit to how small computer transistors (or other equivalent, albeit more effective components, such as memristors integrated into Crossbar latches) can be shrunk is reached. From this moment onwards, computers can only be made more powerful if they are made larger in size.
* Because of this, A.I.s convert more and more of the Earth's matter into engineered, computational substrate capable of supporting more A.I.s. until the whole Earth is one, gigantic computer.
* At this point, the only possible way to increase the intelligence of the machines any farther is to begin converting all of the matter in the universe into similar massive computers. A.I.s radiate out into space in all directions from the Earth, breaking down whole planets, moons and meteoroids and reassembling them into giant computers. This, in effect, "wakes up" the universe as all the inanimate "dumb" matter (rocks, dust, gases, etc.) is converted into structured matter capable of supporting life (albeit synthetic life).
* Kurzweil predicts that machines might have the ability to make planet-sized computers by 2099, which underscores how enormously technology will advance after the Singularity.
* The process of "waking up" the universe could be complete as early as 2199, or might take billions of years depending on whether or not machines could figure out a way to circumvent the speed of light for the purposes of space travel.
* With the entire universe made into a giant, highly efficient supercomputer, A.I./human hybrids (so integrated that, in truth it is a new category of "life") would have both supreme intelligence and physical control over the universe. Kurzweil suggests that this would open up all sorts of new possibilities, including abrogation of the laws of Physics, interdimensional travel, and a possible infinite extension of existence (true immortality)"

http://en.wikipedia.org/wiki/Ray_Kurzweil


However, I did say that we will more than likely "technology out of existence" because we most likely will. In one form or another, we will not be human as we recognize it or would like to currently define it...that assumes we won't destroy ourselves first. Unlocking all the secrets of the human brain and genome opens the door to many possibilities. Possibilities that lead to different forms of humanity and beyond.

quote: (post)
Originally posted by dadudemon
It's a remote possibility, not an imminent future.


No, it is the most probable future based on current projections. I'm referring to a Coruscant planet, of course, and AI's applications in things like terraforming.



As far as other forms of AI go, there are computing technologies in the works, such as holographic memory storage, four-dimensional memory storage (the storage capacity on this shit is absurd), quantum computing, etc., that will make our current computers look rather simple. That does not bridge the gap for AI, though. However, I suspect that your personal qualifications for AI are anthropomorphic in nature. I agree that human-like AI is far less certain and not as probable as other advanced forms of AI. It is inevitable that we will create very advanced forms of AI, though. They just may not be anthropomorphic. Creating AI that can terraform a planet? That is not that far out of bounds of current AI. They already have AI that can put together a bookshelf...even when presented with the "data" out of order.

I am skeptical of strong AI myself. However, I see it as a strong (pardon the pun) possibility for the future. Human-like AI? I'm not too sure.

quote: (post)
Originally posted by dadudemon
Although I did misspeak when I said our star would go supernova.


I get the feeling that your whole post was a tad defensive. I'll get to this in a sec.

quote: (post)
Originally posted by dadudemon
Though I didn't pass off heat death as irrefutable fact, so there was no need to "correct" me.



That wasn't what I was correcting at all. I never disputed that. As of right now, that is an inevitable future (barring quantum phenomena at the macro level, the vacuum of space collapsing in on itself and forming an entirely different set of physics, etc., ad nauseam with the theories...).

"Well, there's only so many more millions (possibly billions) of years any life can exist, given entropy, and the fact that it appears as though the universe will not contract at some point, but continue indefinitely to grow. It's just a scientific fact,..."

I was correcting you when I gave a more accurate timescale for heat death, or, as you implied, maximized entropy potential.

If I said there were about 100 or possibly 1,000 feet in a mile, I would fully expect you to steer me in the right direction with 5,280 feet. In this particular instance, you were off by an absurd margin, so I figured I'd shed some light on it. I apologize if it came off as "know-it-all" or condescending. My intentions were more like "I smell what you're cooking, but your numbers are off." If you don't like to be corrected, I won't do it again. I welcome being corrected when I'm wrong...I probably shouldn't expect (subconsciously) others to welcome a correction on numbers. That was my bad.



On the whole, you seem very skeptical AND dubious about strong AI. I see it as a high possibility. I also think that we as a species will probably, in one form or another, figure out a way to prevent heat death. If we immortalize ourselves with "ghost in the shell" type of stuff, we'll think of ways to stay immortal.


__________________

Old Post Jul 9th, 2008 07:15 AM
michelle444
Restricted

Gender: Unspecified
Location:

Account Restricted

^_^

hmmm...

Old Post Jul 9th, 2008 09:54 AM
Digi
Forum Leader

Gender: Unspecified
Location:

quote: (post)
Originally posted by Emil Blonsky
If we do change into another species, so to speak, what are we talking about here? What'll be different (I'm really curious)?


We can't predict how random genetic variation and natural selection will coincide to produce change. So your guess is as good as mine.

...

As for dudemon's statements: lulz, first of all, at accidentally putting my words in quotes attributed to you. But it's an understandable mistake for such a long post.

In any case, someone as well-read on Kurzweil as you are should be aware that there is another side to any issue. For every over-zealous transhumanist who believes the singularity will occur within our lifetimes, there are equally credible scientists who doubt that the singularity could even exist. From my own readings and thoughts (I'm largely unfamiliar with Kurzweil's work, though not with many of the ideas he proposes, since he's not the only one), I side with the latter group, as their reasoning seems far more sound to me. I actually have a thread on transhumanism in this forum, and I detail some of that reasoning there if you're interested. It should be easy to find with the search.

But the main point is that you presented only those ideas, not their refutations or competing theories. In doing so, yes, it came across as sounding inevitable in your opinion. Being acquainted with only one side of an issue, or at least presenting only that side, is the definition of bias. You say that you remain skeptical of some of what you wrote, but your own admission of belief at the end there, as well as your presentation of the material, speaks to a lack of skepticism for much of it.

Also, you seem to be confused as to the difference between data processing and application of it. For example: computers could indeed "terraform the planet" in a mathematical or physical model. Yet the logistics of completing such a transformation remain far outside our, or any computer's, ability to manufacture and complete. Not to mention the political, cultural, and religious roadblocks that any such movement would encounter...which, collectively, would be enough to slow it to a crawl, not keep it at an exponential rate of increase.

Also, the main points you outline for Kurzweil are remarkably similar to another physicist, Frank Tipler (Author of, most famously, The Physics of Immortality). As with Tipler, within the main points the logic seems sound enough, but it is between points where large logical gaps remain.
- If and when we reach the smallest that CPUs can become (and there is no reason to believe it will happen soon), he assumes that the next step will be to convert all of the Earth into such material. This presumes consciousness in machines, as well as independent thought, an entirely different phenomenon from the "as small as possible" that he mentions, and decidedly harder.
- Earth as a giant computer: physical limitations mentioned earlier.
- Colonization of space: assumes widespread space travel will be possible, as well as sustainability of humans (or machines by this point, I suppose, in his theory). Life being synthetic doesn't ensure self-sustainability, so massive obstacles would need to be overcome just to maintain this on a global level...unless he's theorizing an end to entropy itself, or frictionless operation of machines, which would be yet another large jump.
- Colonization of the universe: to its credit, it mentions the problems inherent in FTL travel; however, it doesn't attempt to reconcile them and blithely assumes that a solution will be found.
- Things like true immortality, overriding the laws of physics, interdimensional travel, at this point can hopefully be recognized for what they are: speculative nonsense that comes at the end of an overly optimistic list of assumptions.
- The last, and biggest, hurdle is that the Singularity may not be possible. History gives us the illusion of increased rates of change, when in fact progress is and has always been incremental. There are more "small" things to build off of as time progresses, and they group into bigger things (think of the hundreds of small inventions that help make a car, for example)...so incremental, gradual change takes on the appearance of faster rates of change simply because there is more to build off of than in the past. But that doesn't mean the rate of change itself is increasing, which is the very basis of the Singularity.

A more trained eye would likely find more flaws. For myself, much of that comes from my memory of scientific articles criticizing and debunking Tipler, whose conclusions are similar, and equally unfounded.

I don't mean to sound negative, btw. Your interest is encouraging, and it's certainly an interesting field of study. I just think your enthusiasm is misplaced.


__________________

Old Post Jul 9th, 2008 11:42 PM
dadudemon
Senior Member

Gender: Male
Location: Bacta Tank.

quote: (post)
Originally posted by DigiMark007
*snip*


I call my perspective skeptical. Maybe you're not aware of previous convos I've had about AI. Oh well. I've stated before that I am not sure true AI is even possible. Oh, I'm sure that strong AI will exist, that is inevitable...but sentient AI is another story. Machines that are programmed to terraform a planet? Easy. That is almost within our capabilities now...it would take a hell of a lot of computers and programmers, though. (Which, technically, makes it actually out of reach...but still plausible.)

An AI future is inevitable. A sentient AI future is what I'm skeptical of. You can call that overly optimistic if you want; I just see it as unavoidable based on current efforts. Cars that can drive themselves over 142 miles of terrain that most humans would mess shit up on? Yeah, that was impossible ten years ago.

Also, I eat up any and all skepticism or negative speak when it comes to futurists and their theories. You won't have to act like I'm a bible thumping Christian who won't listen to reason....hahahahaha laughing


As far as humans going extinct? Eh, maybe...but probably not. If god doesn't exist, humans will create it/him/her eventually. Sounds weird, I know...but it's possible.


__________________

Old Post Jul 10th, 2008 02:21 AM
Digi
Forum Leader

Gender: Unspecified
Location:

Fair enough, good talk. Just know that guys like Tipler (and I would imagine Kurzweil just as much) receive boatloads of legit scientific criticism for their ideas. I like the line of thinking, and am very interested in transhumanism...they're just overly optimistic about it, and let their ideas get ahead of themselves without properly justifying each step.


__________________

Old Post Jul 10th, 2008 03:08 AM
dadudemon
Senior Member

Gender: Male
Location: Bacta Tank.

quote: (post)
Originally posted by DigiMark007
Fair enough, good talk. Just know that guys like Tipler (and I would imagine Kurzweil just as much) receive boatloads of legit scientific criticism for their ideas. I like the line of thinking, and am very interested in transhumanism...they're just overly optimistic about it, and let their ideas get ahead of themselves without properly justifying each step.



I probably hold to Kurzweil's ideas more than others' only because he has been absurdly prophetic with his more recent predictions. You'd think the evangelicals would be all over that... shifty


__________________

Old Post Jul 10th, 2008 06:05 AM
P23
Restricted

Gender: Male
Location: WESTSIDE ~haermm~

Account Restricted

I believe 50 years from now man will live on Mars. If you look at Mars carefully, it is livable. As for human extinction, anything is possible. I do believe history repeats itself. If we became extinct, I don't wanna be in some museum.


__________________

i like to sit on teh ****ing curb drawing lines of dead people on teh ****ing ground
u kno who u ar and im comin to ****ing get u

Old Post Jul 11th, 2008 04:01 PM
Digi
Forum Leader

Gender: Unspecified
Location:

quote: (post)
Originally posted by P23
I do believe history repeats itself.


A classic line, and not without historical precedent, though it must be prefaced with a few qualifying statements.

The laws of nature bind human action to a limited sphere. So too does our evolutionary programming, to an extent. So too do basic social forces that have existed throughout history (economic fluctuation and the need for stability, survival instinct, procreative desires, religion, etc.). Given these factors, which are present for us all and will be for the foreseeable future, there are only so many permutations of events that can take place.

Imagine categorizing historical events into genres. No one act will repeat itself in the details and historical implications, for the individual possibilities for action are endless. But overarching themes, struggles, causes, and effects would seem to form loose patterns. This is due to the restricting forces I mentioned earlier. The Holocaust is a unique event, for example. Yet human genocide based on arbitrary distinctions is an event that has repeated itself hundreds of times (most in much smaller forms) throughout history. The underlying ideas of intolerance, fear, and in-group desire for dominance (Aryan in the Nazi case, though usually at the local or tribal level) are all present.

So is history cyclical? Yes, but only conditionally. There aren't prescribed cycle-lengths for various events to repeat, nor a guarantee that an event will loosely repeat itself ever, nor will the circumstances ever be identical in the details, only in their overarching qualities and causes.


__________________

Old Post Jul 11th, 2008 05:30 PM
dadudemon
Senior Member

Gender: Male
Location: Bacta Tank.

quote: (post)
Originally posted by DigiMark007
So is history cyclical? Yes, but only conditionally.


You may already mean this...but it is cyclic when a pattern or definition is applied or looked for. We tend to look for order in things. That's one of the things that makes us human, isn't it?


I'd like to think of it like the uncertainty principle (which, to me, is really what philosophy is about...applying definitions and perspectives to "stuff", but by doing so, something else is lost, or another perspective can exist that is not necessarily incorrect). The more accurately you pin down a particle's momentum, the harder it is to pin down its position, and vice versa. Not to say we lose an otherwise useful perspective when another is applied, but it is just another interpretation of historical "cycles".
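For reference, the relation itself is just the position/momentum trade-off (a loose analogy on my part; I'm borrowing the trade-off, not the actual physics):

\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}

The tighter you pin down one quantity, the looser the other gets, and that trade-off between perspectives is the part I'm applying to historical "cycles".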

If you want to nitpick, there are no patterns; if you want to get liberal, just about everything is a very common pattern, right down to the individual human.

Born, survive x amount of time, die.

I'm not sure if I am mirroring what you mean.


__________________

Old Post Jul 11th, 2008 08:24 PM
Digi
Forum Leader

Gender: Unspecified
Location:

The uncertainty principle is quantum mechanics. You're trying to apply it to sociology? Kind of a big leap there, imo.

Like I said (which you re-worded at one point): all events are unique, since the combinations of possibilities in the world are infinite. Yet, as applied to regular human understanding, we can see similar causal forces at work in a variety of situations, since our actions are limited by our nature, which does indeed have a limited number of driving forces. We're not viewing it from a physical perspective, but from a social one, which is simply a different perspective, and no less valid. Because of course there are no cycles if we break events down into a meaningless reductionist mess....but that's not what I'm talking about.

So like I said, cyclical, yes, but conditionally so, because we must recognize the uniqueness of every event before we perceive the connections, and so learn from them.


__________________

Old Post Jul 11th, 2008 10:23 PM
Symmetric Chaos
Fractal King

Gender: Male
Location: Ko-ro-ba

quote: (post)
Originally posted by Emil Blonsky
Our sun isn't large enough to go supernova. It'll certainly die, as will Earth, but won't go supernova.


Stars like the Sun get larger as they die. The sun will still be directly responsible for killing the Earth.

quote: (post)
Originally posted by Emil Blonsky
As for humans...I dunno, maybe if we smarten up and spread our race across the universe we'll survive for a long time. But by then, if we evolve, we may become a different species entirely.


I prefer the take that rather than becoming a new species, we'll become a variety of new species. Humanity has never gone in only one direction, and I doubt we'll all go in the same direction if and when we attempt to colonize other planets.

The idea is that the best way to live on Earth =/= the best way to live in space =/= the best way to live on Omicron Persei 8. Genetic modifications to make people capable of living comfortably on another world aren't terribly unrealistic, and the course of evolution would be dramatically different on different worlds.


__________________



Graffiti outside Latin class.
Sed quis custodiet ipsos custodes?
A juvenal prank.

Old Post Jul 11th, 2008 10:49 PM
dadudemon
Senior Member

Gender: Male
Location: Bacta Tank.

quote: (post)
Originally posted by DigiMark007
The uncertainty principle is quantum mechanics. You're trying to apply it to sociology? Kind of a big leap there, imo.


Sorry, I'm a human, so naturally I'm trying to analyze something by finding a semblance of it in something else I'm familiar with. I was more or less saying that as you (ambiguous "you") focus on one area or concept, you become less focused on another, or it becomes harder to define a perspective in terms (or in semblance) of the currently held perspective.

Who knows what types of patterns could be derived from human behavior. What if a sentient extraterrestrial, with an entirely different set of "anthropomorphisms"*, were to observe humanity's history in its entirety? What would it think? Would it come to the same general assessments as humans, or come to the same cyclic conclusions? Wouldn't it "lose" some sort of essence of our humanity because of its own set of expectations? (Back to that focus thing in the uncertainty principle.)

quote: (post)
Originally posted by DigiMark007
Like I said (which you re-worded at one point): all events are unique, since the combinations of possibilities in the world are infinite.


I follow you here. That makes sense.

quote: (post)
Originally posted by DigiMark007
Yet, as applied to regular human understanding, we can see similar causal forces at work in a variety of situations, since our actions are limited by our nature, which does indeed have a limited number of driving forces.


Okay. I follow you here.

quote: (post)
Originally posted by DigiMark007
We're not viewing it from a physical perspective, but from a social one, which is simply a different perspective, and no less valid. Because of course there are no cycles if we break events down into a meaningless reductionist mess....but that's not what I'm talking about.


Reductionist mess...

I think this is where you're losing me. I thought that approaching subjects from a reductionist's perspective means simplifying things to the most basic element(s). It sounded like you were using reductionism to indicate that any one of those events, in and of itself, is just a random occurrence governed, if even slightly, by the laws of nature by which we are all in some way fundamentally driven. Then, looking at those individual events (via a reductionist's perspective), one could derive a meaning or pattern to humanity. (Kind of like putting together a one-dimensional puzzle, and when it is complete, it forms a two-dimensional experience...like holism.)

Do you feel that even if we define all human events in their entirety, one can still not define what it is to be human?





Damn, I think I'm rambling now.

quote: (post)
Originally posted by DigiMark007
So like I said, cyclical, yes, but conditionally so, because we must recognize the uniqueness of every event before we perceive the connections, and so learn from them.


Hmm...

So you are not really talking about holism, then?


More on topic, though.

After reviewing as many of these human events as possible from an anthropological and historical perspective, could one derive the outcome of humanity, thereby answering the question this thread is asking?

Well, I think I've reviewed tons of human history, the good, the bad, and everything in between or unrelated, and I feel that I have a grasp on humanity. It may be more nebulous than strong, but I think I have a general grasp.

I don't think humans, as I'd like to define them, will ever become extinct. (Barring the events from our earlier heat death conversation.) Even IF we eventually become so embedded with technology that we wouldn't be recognized as humans by today's anthropologists, some of our essence would still be there. Maybe I think too much of humanity.





* They most certainly wouldn't be called "anthropomorphisms", but you know what I mean.


__________________

Old Post Jul 11th, 2008 11:46 PM