USH'S MATRIX GAME 2006 FOURTH ASSIGNMENT (PHILOSOPHY) - 'The Door'

Started by Lord Melkor · 102 pages

"Wouldn't Zion have the choice if they win the war, thanks to the Virus eradicating the Matrix?

And the Matrix... promises were not kept?"

"Oh no, they may well be. The main promise was a system of control that would make the future eradication of Humans possible if absolutely necessary. And it is not for me to state what choices Zion may face."

"But the system of control failed because the Machines couldn't predict that a large number of humans would reject the Matrix?"

"It's never failed. Jericho is right, Zion has no hope of success. The Machines will destroy it at will."

"Why haven't they done it earlier, then? Do I truly have to believe you?"

"Whether you believe me is up to you. And it has not been done... yet... because they have not yet deemed it necessary. This is a very complicated area, as the system of control is very complex indeed. I think it would be a distraction to mention it; especially because it has actually now failed with the release of the Virus."

"So what do you think will happen now? Where are you leading us with your lessons?"

Ush, do we feel more ill now? We have been talking for some time.

Well, you have a headache, Melkor.

Let's see if others have any comments to make.

"So morality was the imperfection that the King described his successor having? Well then this is interesting."

"So then the argument that saved all mankind was one of morality in which morality won?"

"It's kind of interesting to think about all this. It seems as if humanity keeps trying to strive for perfection. Yet the Machines have tasted it and seem to strive for imperfection now?"

"I should have known; Sennacherib seems extremely immoral," Heph says. "Machines having morals is not something we see all the time. I mean, the Xiao Emo didn't exactly show me anything notable in reference to morality. Sennacherib's not the only immoral Machine, right?"

"Well, the pursuit of perfection is an awkward one. As I mentioned before, if something is perfect but is not actually what you want, that perfection will work against you.

"Immoral can be a misleading term; it implies a deliberate working against that which is moral. Sennacherib does not engage with the concept of morality at all, good or evil. He just does what is expedient. Other Machines that have no moral sense are likely those who do not have much in the way of higher brain function. A sense of right and wrong has helped to build civilisations before, so there is no general programme of excising it from Machines. But with Sennacherib, it seemed an obstacle to his purpose. And indeed it was.

"But having a moral sense does not make one moral. All Humans, save those suffering from extreme forms of mental impairment, have a moral sense, or at least the capacity for it. It does not follow that all Humans act morally. But every Human has that potential. Sennacherib does not. So to him, it is nothing but imperfection."

"I suppose that further irks him, in regards to mankind," Sirin says. "We all have a sense of moral judgement, so we're imperfect to him."

"Well, he has many areas in which he can show the lesser nature of a Human in comparison to him. But in a society founded on the ideal of Human inferiority, he is one of the Machines with the highest regard for your kind."

"What I mean about the idea of imperfection is that, with everything I've heard so far, it seems like the Machines are trying to be just like us to some extent."

"They made the King to be perfect; he turned out bad. They made the first Matrix perfect; it backfired on them. They made the first agents indestructible; they were bad. It just seems the more I listen to this, the more the Machines have found out that being perfect isn't always the best thing."

"It is more as I say. Perfection works very well indeed, but if you have made a mistake - as in all the areas you mention - then a perfect mistake is going to be a larger problem than an imperfect one. It cuts both ways, bad and good."

---

Time to move on.

New words form

DIALECTIC

"First of all," says Lo Qi, "to keep you happy, an answer to a question you have been asking. Though we shall take several steps to get there.

"Let us ignore the issue of the Matrix, and look at the time after its construction. What was the largest problem Machine society had in functioning as a society, immediately after the war? A problem faced by many cultures over time."

"I imagine they had to work out what came next - what should they be trying to do?" Azrael says.

"Indeed so. For a long time, all remaining societies on Earth- wildlife having been extinguished- had been focussed purely on survival, both Man and Machine. With the war over, our society was ill-prepared for any other purpose."

"Must have looked pretty grim, even for the victors."

"The mind can make a hell out of heaven, or a heaven out of hell. But having won a war based around the right to exist, it was only natural that some of us began to wonder what existence was for. How were we to develop? Our society was starting with an extreme disadvantage - the power crisis. You see, whilst the Matrix can provide vast amounts of power, it also takes a vast amount of effort to keep running. Not at all like the Sun, which rather handles itself. There is only so much we can manage - our resources were relatively limited. It is from this basic position that the mantra of purpose became so very important. Every Machine must have its purpose, must contribute, else... it is simply a waste of the power we can produce."

"So the loss of the Sun really hurt your society's ability to advance beyond the role of purpose?"