USH'S MATRIX GAME 2006 FOURTH ASSIGNMENT (PHILOSOPHY)- 'The Door'

Started by Lord Melkor, 102 pages

"Does this issue have anything to do with their right to kill all humans? Almost all Machines would have had no problem eradicating humanity if the sky wasn't scorched, right?"

"That is indeed how it would have proceeded, just as the Humans would have done in reverse- as indeed they attempted to do when they destroyed the sky. That was an act of attempted genocide.

"But you are very close."

"So it can be a moral issue with regards to humans?"

"Just shorten that sentence."

"A moral issue? Of course, whether it is right to sacrifice the lives of most Machines for certain victory. The new General was programmed to reach a different conclusion than Sennacherib."

"It's more fundamental than that. His replacement was given the capability to consider the issue at all."

"Of course, because Sennacherib considered war a strategic game, like chess- it is alright to sacrifice most of your pieces as long as the opponent is destroyed at the end. But it isn't as simple when we speak of sentient beings, not chess pieces."

"It is for him. Sennacherib has absolutely no moral sense at all. What use is such a sense to the perfect General? Morality could only ever act as an impediment to his victories, never an aid. Exactly as he was designed, and exactly what made him so efficient. So efficient, in fact, that he would not blink before annihilating most of his own kind, if it meant victory. His replacement is imperfect as a General, because he will consider not only victory, but whether victory is right or wrong. A lesson for the System- the last thing any society actually wants is a perfect General...

"And so. Back to my original question. Why did the System not accept their General's plan?"

"Because it would have led to the imminent destruction of their own society."

"No, their society would have survived, on certain acceptable levels that still exist as a possible fallback."

"You just answered your own question- it was considered wrong from a moral point of view. There was no guarantee that humans would ever again become a serious threat. I am sure that many Machines felt confident that what remained of humankind could be effectively controlled via the Matrix. They didn't see the potential, uncertain threat Sennacherib warned them about as good enough reason for almost all of them to die."

"There might not have been what you would call a guarantee of Human threat, but Machines are very good with probabilities, and the escalating probability of serious threat from Humans was so high as to be effectively inevitable. One Machine in particular was very good at... foreseeing this? As I said before, no-one actually questioned Sennacherib's logic. This is important as it puts their decision into a very different context."

"Not logic, not simple efficiency, but right and wrong... a logical decision can be immoral."

"Absolutely. For you Humans, the lesson of Sennacherib is this- Machines are capable of moral judgments. I know it is very easy for you to think of Machines as toasters and heaps of silicon. But we are all living beings. To have gone with Sennacherib's plan, no matter how useful the result, would have been to deliberately cause a mass slaughter of our own kind. Millions of beings, more, terminated in the name of practicality- beings that had just won a war fought entirely to guarantee their survival. They refused. And they saw that a Machine like Sennacherib must never be built again."

"And you consider yourself a moral being, right? So why didn't you oppose the opening of the Door, which will cause the slaughter of millions of beings as well- both Humans and Machines?" Melkor inquires, confused and suspicious, yet interested.

"Some purposes are more important than others. Do you think the Human who pressed the button to destroy the sky was a moral being? He was attempting genocide, which I am not."

"Well, it is certainly complicated. The Machine you mentioned as the best at predicting the threat from humans- is that the Oracle? But of course she was against Sennacherib's plan?"

"Very much so."

"What was her main argument? Do you think she would oppose the eradication of humans even if there were no need to utilize them as energy? How many Machines would agree that eradicating an entire species simply because it is a threat is immoral? I think that at the time of that choice Machine society was young, so their technological advancement was likely much greater than their moral one."

"Her motives are very complicated, and I dare not assume some of them. But I think she felt the Matrix was the best solution at the time, that is all.

"As to how many... a good question, and the debate was enormous. Some felt like Sennacherib. Some, on the other side, even thought that eradicating the Humans was immoral, as you say. The Matrix was the final result, along with certain promises."