USH'S MATRIX GAME 2006 FOURTH ASSIGNMENT (PHILOSOPHY)- 'The Door'

Started by Newjak

"I think we all have our higher being. Mine is God; for others it can be as simple as the greater good. But I do believe that we all have a planned destiny that we either choose to follow or don't."

Lo Qi nods.

"Thank you."

"Um... You're welcome?"

I'll give this one more day for responses.

"If you're religious, your ultimate purpose can be defined by a higher being, as Berserker believes. God, Buddha, Zeus, whoever you chose to worship usually has some goal to set for a human life.

"However, I'm not one of these people. I'm not sure there is an ultimate purpose, other than survival."

'Not sure' doesn't quite cut it- yay or nay, after consideration? Or at least a considered reason behind being unsure.

I will say nay. I have no reason to believe there is an ultimate purpose that isn't set by religion.

"I agree with Melkor. We have no purpose - only that which we choose for ourselves."

"Humans fill only the purpose they choose for themselves."

"A shame," says Lo Qi. "Such a petty way to view the world, where the only justification can be your own selfishness. I had hoped more Humans would look for something much higher."

And indeed, philosophically speaking this is virtually nihilism, which is incredibly uninteresting.

I also rather feel some of you have parroted a purpose answer based entirely around yourselves, hence the comment of selfishness. "Humans have no purpose other than that they choose for themselves." is not really an adequate answer to Lo Qi's question, which was whether the universe, existence itself, had any point. Survival for what reason? Why bother at all? Life, death, love, war, what the heck is the point of any of it? Of living at all? As Lo Qi says, your only justification for any action then is "Well, *I* want to do it. No other reason." Great.

I know a position of extreme relativism/no ultimate point is very popular on KMC in general, but from a philosophical point of view it's almost a non-starter. As we began with, the first question philosophy asks is- Why are we here? Philosophy is the branch of human experience that tries to answer that. It seems few of you really want to even try and do that, which calls into question the point of being on this path at all.

It's a shame, of course, if so many people hold such... flat views. All of you that hold such views challenge the idea of being heroes- you really do just look like selfish sods, doing stuff just because it pleases you. It gives you no platform from which you can reasonably criticise any action taken by any bad guy in this game, and I feel this is going to make you look pretty bad when it comes to considering your final response to all this. And being unable to justify a reason for action... you probably won't score.

Melkor- in kinda contradicting himself, possibly due to time pressure- has more room for manoeuvre than others. Anyone can try and change their minds, of course- or perhaps giving more thought to the area is a better way of putting it. But time is short.

But, we have no time to linger.

"I agree with but one of you," says Lo Qi. "Would any of your views be impossible for Machines to hold?"

Well, I want Melkor to have a liberal and humanistic view, not a nihilistic one. He believes in personal development, and that the more enlightened a person is, the less likely he is to abuse his freedom in regard to others- he would agree with some of Kant's ideas here- that in an ideal society everybody would be allowed almost unlimited freedom, because almost everyone would follow the "moral law" and something like Kant's categorical imperative- and they would do so of their own free will, because it is the most reasonable course for sentient beings.

Well, the issue with your reply was that you said there was no purpose, and then changed your mind. You appear to be saying that this process of self-discovery, or self-actualisation, is the purpose for a Human; it's harder to frame that as a grand, universal purpose, though, or one for society.

But pretty much everything you say points towards you thinking that there IS a point to it.

(I also feel it should be pointed out that nihilism absolutely buggered poor Fire last time around, who didn't believe it himself but adopted it as an RP measure. He could answer every question really easily, and then had absolutely nothing he could do with the final conundrum. A lesson to learn from.)

Well, personal development is a very broad term- people may want to develop in different ways- though Melkor believes that ultimately intellectual and moral development are connected with each other.

An advanced society should emphasize such things as education, a free network of information, etc.- it should not force anything on people, but give them chances to develop, though Melkor has doubts, because people often choose stagnation and easy answers.

Anyway, sentient life is unique and has great value in itself.

But why? Why do any of that?

Because learning and understanding is a goal in itself- we don't have ultimate answers, but we can get closer, and we have more options.

Okidokey then.

Originally posted by Ushgarak

"I agree with but one of you," says Lo Qi. "Would any of your views be impossible for Machines to hold?"
"I don't think so. From what I've seen, Machines can hope and dream that there is something better, something waiting for us, and more importantly that we must choose between our destined path or going our own way."

"I think the ultimate point of life is to live," Heph says. "We're handed this fraction in time that's gone in a flash, so why not put it to good use? Improve society and the standard of living for everyone else, make someone's day better, take out your landlady's trash, pass on what you learned to others so they won't make your mistakes. Personally, I want to help others, and I think that that is generally held by... well, lots of people. It's human instinct. We help the sick, the poor, the elderly, even though it might be costly.

"As for whether or not Machines could hold any of our views...well, they're sentient, aren't they? The reason Machine society is not advancing is because they don't want to just slaughter tons of Machines that need to be replaced. If nobody held my view in 01, then that moral dilemma would not be present at all."

"If I could blend the religious answer and the altruistic answer, I would," Sirin replies. "I don't know if religion applies to AI, or if they follow any sort of religious practice. I agree that the issue of morality in killing Machines that need replacing would not be around if being altruistic and helping build up others and society were not a very, very high goal, if not the ultimate goal."

"As for religious practice, generally speaking Machines do not, no."