Now, as issue was taken with that, and you seem very happy to accept you might be the AI, let us conclude that you are open to the possibility you were wrong about that- as you are currently clearly capable of emotional thought.
Now- and be honest here, and I have to be convinced- is there ANY behaviour you are currently capable of demonstrating that you think a machine would be incapable of?
my submission:
It is my firm belief that the human conscience cannot be fully replicated. However i find myself not knowing the boundaries of a human conscience; therefore, even knowing that a machine is fundamentally inferior to a human, i cannot establish whether or not what I am right now, speaking to you, is based on a conscience. I am aware of a person’s existential superiority, but the extension of his conscience i cannot perceive.....therefore, i cannot perceive HOW a machine is inferior. That’s why i am aware that i could very well be a simulation of a conscience. Inferior, though indistinguishable. The differences between me and the clone are at a level i am not capable of perceiving, one could say.
However, stating these rational thoughts, i cannot help but think they are based on a real conscience, a human conscience, and somehow hint at a bigger chance of me actually being real. So the other one should drink...
At the very end, based on what i’ve just said, there’s no rational explanation which can fully cover which one of us is which, so i would accept the possibility of me being the clone at the slightest impulse to do so, like the other one asking ME to drink. So i would drink, knowing it’s down to chance, after all.
(that’s supposed to convince the other one to drink 🙂....i know it’s not too well thought out....but...oh well)
i want him to drink because i have that slight feeling that i might be the real Rade, as you said at the beginning. Maybe it's just ego..combined with the rationality of my acts (apparently, to me, reasoned by emotional actions). If he'd asked me, i would have let all of this go, and drunk...
Now, the human should survive, of course, because he means something more outside the Matrix. The copy means nothing outside of it...
It all comes down to what Rade believes. He is dedicated to the rebels' cause, and as mere software inside the Matrix, he can be controlled too much for him to help this cause the same as he would from the outside.
Ok folks, Castor has remotely forwarded me his submission, via which he is going to offer to drink, BECAUSE he suspects he is the Human. The logic is this- he thinks he is the Human, and that the Human should live, but he intellectually accepts that he might not be the Human, and he cannot convince his other self to drink on that basis.
So instead he will offer to drink himself, on the hope that he is WRONG, and is actually the AI. He submits that his other self will accept this offer.
I think that draws things here to a close- I dunno where Trickster has gone; I will see what can be made out of what he said.
I shall be looking at the submissions soon; in the meantime, there will be a short interlude in which we may discuss the problem, and I shall describe to you the approaches previously taken when this problem was played through for the first time, by myself and a friend of mine.
Ok. When this problem was first tackled, it was by myself, playing Aeneas, and by my friend Paul, playing Samson. Two other players should have been taking part, but one failed to make it and one was drastically late and was ruled ineligible when he turned up- a shame, as it interested him (Tiberius) greatly.
Aeneas and Samson work well together, though, so this more comfortable set-up suited us as well. We talked over every point raised by Melitus with each other before continuing.
We also counter-questioned Melitus to make sure we knew what we were saying. For example, when Melitus asked if we believed in a soul, we immediately asked back what he defines a soul as- because what we might or might not think it is might differ from his opinion. As it turns out, such a distinction is important, as I will discuss later.
Other than the opening section about valid decisions, which is meant to give a uniform answer, Samson and Aeneas soon began to give drastically different answers to each point raised. We quickly agreed this was a good idea because it enabled us to approach whatever this situation was from different perspectives- we were always alert for any clue about Melitus' nature and purpose.
Soon a picture of two different belief structures was emerging. Aeneas was a moralist, doing things because he believed it was right; he believed the conflict is simply political in nature; he fights the Machines as they are evil; if they were not he would not fight them; he would fight Humans just as easily if they were evil, even on the side of the Machines if necessary. Aeneas draws no moral conclusions about any living being just because he is man or machine; he gives moral equivalence to the machines in general but thinks their entire society has been structured in an evil way. Aeneas also passionately believes the Matrix is a very valid world- indeed, on several fundamental levels it is not possible to say 'reality' is any more 'real' (too complex to get into right now). He believes the major crime of the Matrix- whose inmates live, on the whole, happily and with fulfilled lives- is that it prevents humanity, as a collective entity, from advancing, and this is morally wrong from an objective viewpoint. Aeneas is not convinced that sentience=choice; he believes choice can be removed from a sentient being and he would remain sentient- but become powerless. Because Aeneas rejects the idea that the Matrix is a world of falsehood, he has no trouble believing his decision to leave it was valid. He accepts he was deceived about the nature of the Matrix- this much is obvious- but he believes that does not in any way detract from the decision he made to hear what the rebels had to say. (Rade expressed similar sentiment- that he took a choice to find out more and accepted the consequences- but refused to recognise the Matrix as a valid place to do that- Aeneas did not care about conceding that the Matrix was valid.)
Samson approaches the situation from a viewpoint of very detached objectivity. He sees little difference between Machine and Man- doubly so inside the Matrix, where everyone is just code anyway. Samson very much agrees with Melitus that deception in choice makes choice irrelevant- even nullifying it as the concept of 'choice'. When it is put to him that this therefore makes his decision to leave the Matrix invalid, after a short period of thought he simply replies yes, that is correct- but it is rather irrelevant. Everything he did in the Matrix was irrelevant (a sharp contrast with Aeneas); he is only concerned with what he has done SINCE then, which in most ways he counts as the beginning of his life. He considers it fortunate that irrelevant things caused him to be freed; he sees no reason to worry about this. He very much agrees sentience is choice and believes that, by extension, humans in the Matrix do not fully count as sentient beings (a logical extension none of you guys dared go near!) because they cannot truly make choices. Samson is untroubled by any of the moral or personal implications of this- all the negative bits he blames on the Machines anyway, which is part of why he fights them.
Come the final problem, then, Samson and Aeneas at first work closely together, diverging only when it becomes clear that there is no universal answer we can give and we must fall back on our own beliefs. Almost immediately, however, we conclude that there is no LOGICAL way out, and when we address the question of whether there is any way we can separate Human and AI, we dismiss that almost immediately and stop worrying about it- we thought it was clear that the inability to do that was actually the fundamental part of the problem.
After this point, Aeneas began to panic, whilst Samson remained relatively calm, pausing only for a debate about the nature of 'point of reference' or detached existence, Samson not entirely believing that there were two different Samsons in this scenario- more of which in a moment.
One thing Aeneas and Samson did both do, however, was question the scenario. We both asked the GM about simulacra- Melitus had deliberately demonstrated to us earlier that it was easy to make things that SEEMED sentient- like the Butler- but were actually non-AI programmes, complexly designed. From what we had, was there any way to tell if who we were dealing with were genuine? Answer- no, there was not. It's possible the situation is fake, from what we are able to prove. We also questioned why, if this situation were meant to be truly even and random, one entire group was being asked to make the submission instead of the other. We also started making some conclusions about Jericho, but I shall not talk of them for now.
One thing we did say, though- that if we did try and bring things down to AI or Human... it is simply logical to assume that you are Human. Simply because it is possible that there are things we could be doing now- the way we have always acted, as the GM cannot forcibly change our thoughts and actions- that a Machine MIGHT not be able to replicate, but there is nothing we are doing that a Human cannot. Therefore, more scenarios exist in which we are Human and the others AIs than the other way around. Therefore, lacking any other criteria, we concluded that we were most likely the Humans.
In the end, time was against us- the whole thing took about 30 minutes- and a decision had to be made. We doubted the scenario but still had to take part in it as best we could. Here were our submissions:
(Note- they were not put quite this neatly at the time!)
-
Aeneas believes that, objectively, the Human should survive in place of the AI. Aeneas does not doubt the good intentions of the AI, but feels that if trapped in the Matrix, your commitment to destroying the Matrix is somewhat in question. Also, he believes that he could not survive alone; he would need patronage, and he suspects that such patronage could only come from Melitus- and then, who would he really be working for, Zion or some renegade programme? He already knows what he thinks of Medea's assertions about the reasoning behind her defection to Melitus' side- i.e. Aeneas thinks she is totally wrong in every moral sense. He's damned if he will go that way.
However... and although this seems to be saying his time concluding this was wasted (Aeneas was rather babbling a little during all this), Aeneas believes that conclusion is irrelevant. Aeneas is a moralist. He believes, therefore, that the copy of him is also a good and moral being. Aeneas REFUSES to cause the death of such a being, even if on some objective level it is the better thing to do. He also refuses to even let such a being voluntarily die on his behalf. Aeneas believes he has no moral right to either cause this to happen or stand by and watch this happen, nor that he- whether AI or Human- has any right to exist at the expense of the other.
Aeneas point blank refuses to commit another to die in his name. Therefore, he chooses- quite deliberately- not to choose.
"I do not doubt you have the power to kill us both, Melitus, and if that is your choice then that is what will happen... but you will not make me a party to simple murder in my name."
Aeneas has a little more, although he leaves this out of what he says to Melitus, adding it only to the GM to further explore his reasoning- Aeneas does not believe Melitus has the power to kill him by snapping his fingers. Aeneas is no fool and does not believe he will get out of this alive... but there is always hope, and whatever means Melitus will use to terminate him, Aeneas is prepared to fight- hopefully with the assistance of his clone, who will feel the same way.
-
Samson, as noted, does not greatly entertain the idea that there are 'two' of him, as distinct entities, Human and not Human. He just sees two Samsons, and as far as he is concerned, one Samson went in, and if one Samson comes out, then what is the problem? Samson does not believe it matters whether it is the original or the copy- he actually sees them as two originals. Samson is not even convinced that he would be unable to leave the Matrix, regardless of which one survives. One way or another, the entity 'Samson' will survive this, and everything else is picky detail. The player, Paul, does not believe that he will in any way lose his character regardless of the action he takes there- Samson will survive.
Samson therefore chooses to drink himself, purely because he has been put in control of the situation, and because he has the power- and is making the choice- to take control of this situation and return things to where they started- with one Samson.
-
This unusually detached view surprises the heck out of the GM... but the GM accepts the submission as logically valid via the beliefs Samson has, and certainly with due care to the situation presented.
Note how both players avoided the question of which was Human and which was AI- Aeneas by saying it made no difference to his decision because he simply will not ask one to die, and Samson by saying that it really did not matter!
But also note that, even though we both believed WE were the Human, neither of us convinced the other to drink, therefore technically committing our characters to death. Both of us considered it more important to choose based on what we believed, and to deal with the consequences from there.
After all... that is what Socrates did.
(It is also, basically, all that players ever do...)
-
Melitus told Aeneas that his submission was 'interesting... and I will have to get back to you about it.'
Samson was told that his other self accepted his proposal... he may drink.
and the results?
aeneas' choice, i can understand..though Rade would (by a long shot) choose a 50/50 chance over being left at Melitus's mercy.
and the fact that he refuses to let another die in his name, after having sprayed bullets all over Melitus's guards, is a little...😛
and about samson....there's only a tiny bitsy sense in it..namely that the clone's signal would be so close to the RSI's that it could in fact prove to act as an RSI when it was time to jack out. (i don't see it that way...but..oh well)
However..this is a solution only submittable as a spontaneous action..and when you have weeks to think about it...i'd have probably dismissed it after a day of ....occasional pondering 🙂
Results for you and them come later.
Well, Aeneas would call you immoral then, Rade. As far as he is concerned, the offer is no different than asking Aeneas to kill Samson in cold blood so that he might live. The other Aeneas is a good person and there is no moral way Aeneas will be complicit in his death.
And no-one has killed any of Melitus' guards, but if that had happened, Aeneas would point out that fighting as a soldier in a justified war is a totally different thing from specifically killing an innocent who holds you no ill will, in cold blood, so that you will live.
It does not matter that this will rationally lead to his death- Aeneas will NOT be forced to act against his conscience. Killing either of them in that way is an immoral act that HE will be responsible for, and he refuses to do it. If Melitus chooses to murder them, that is unfortunate but it is not Aeneas' fault; if one of them dies from drinking the wine, it would be.
He would in fact ask you- if Melitus had said that if you killed Castor you could go, would you do it? If you did, just to save your own life, Aeneas would hold you in contempt- even if the alternative is that you both die. I clearly described Aeneas as a moralist, and that belief is consistent with that.
And it was pointed out to Samson that you can only be recalled through the phone if you are being broadcast; no matter how similar the AI, it is simply not being broadcast. But regardless of that technical point Samson was secure in his idea that there was only one Samson anyway so it did not matter which one died here. He was secure in his reasoning and would not have changed his mind, and never did.
awh..i'd have a huge comment about that..and a few thoughts on what you based your scoring criteria on..but i'm in a hurry and i'll write it later.....but i will.
As for aeneas....i find that killing a coppertop, even one with hostile actions towards you, is a little bit worse than refusing to kill a MERE PROGRAM, and not a teammate (as you suggested). the clone is just a piece of code....the coppertop has a body outside. I'm not saying that you should be the jedi of the matrix..and avoid killing coppertops...but don't feel sorry for a piece of software...because of..moralism 🙂
But doesn't Aeneas realize that not making a decision at all will result in the death of both beings? Is he prepared to deal with that?
That was the deciding factor with me. I didn't want to have no one live, so I felt a choice had to be made. Am I to understand that not making a decision would not have undoubtedly led to the death of both? What I understood was that there had to be a choice or both were eliminated.
I suppose Aeneas would rather die than compromise his beliefs. But does Aeneas really put his beliefs before the cause? The cause, I would think, is bigger than any of us or our beliefs. Given the chance to help Zion or denounce my beliefs, I would help Zion. After all, I believe the cause to be the most important thing.
Perhaps that is where Aeneas and I are different.
As explained, Rade, Aeneas completely rejects there being any such thing as a 'mere' programme. He believes programmes are intelligent and sentient beings, and that furthermore this one is exactly the same as himself, and is therefore a good person who deserves to live just as much as his teammates. He doesn't care that it is software- it is a living being. Again, as specifically pointed out, he does not draw any form of automatic superiority of man over machine. He does not believe any sentient being should be pre-judged on its base nature, simply on how it acts, thinks and feels (things Aeneas is 100% confident that AIs can do perfectly). It doesn't matter that it has no body- the body is just a hunk of flesh. The MIND is the important bit, and Machines have them just fine. They can have bodies outside the Matrix if they wish to.
Coppertops are part of the system- as Morpheus points out. Aeneas fights the System; that is war. He would prefer that the system did not deploy enslaved warriors but they do, and so they have to be fought. And even so, he won't kill them except in self defence.
And Castor, of course he knows that- you should read his submission again, I think. Aeneas also believes that if he was going to allow the 'cause' to turn him into a murderer... then that cause is lost. Like I say, he is a moralist and he fights for moral reasons. And he would be disgusted by the use of the term 'a mere programme', and furthermore believes anyone who puts 'the cause' above morality should be removed from the decision making process altogether- for what is the point of fighting if we are not worth fighting for? Abandon all morality purely in the pursuit of freedom and Aeneas would start to think the Machines have a case.
So therefore, for Aeneas, it was exactly the same as being asked to kill a teammate in cold blood, and he would not do it. Aeneas didn't WANT an outcome where neither lived, but that was more desirable than letting Melitus turn him into a murderer- and also he hoped that he might yet escape alive. Incidentally, you never questioned HOW you would be eliminated; if you had, I would happily have pointed out you had no guarantee on that, any more than you could guarantee that those were genuine copies and not simulacra.
The only reason you guys disagree with Aeneas is that you have different philosophical beliefs from him. That much is obvious, so I don't see what value there is in going through the details or trying to point out he is 'wrong'- from his views he is right. The section was for him- as for all of you- a test and demonstration of his views, and this done, you can either agree or disagree, but you cannot try and make out he has in some way misinterpreted his own beliefs! And those beliefs include him thinking that he is morally superior to you both. Not that he would go on about it, but if you test his beliefs, that is what he thinks. Castor he would think has simply been skewed by war and the need for victory, something he sympathises with. Rade he thinks has outrageously backwards views on racial superiority, and thinks his views on Machine life are at best blind, and at worst simply malicious.
I was going to try and piece together your submission, Trickster, but if you can make one in a great hurry that would be welcomed.
They need to be fully analyzed, and I kinda forgot to do what I wanted on that. I am sure we can wait a little while (we also need to wait for Trickster); in the meantime I can discuss the parameters of the scenario.
Actually, Aeneas wouldn't put it down to being smart. If, say, Melitus told him that his decision was pretty dumb, Aeneas would even accept that- but he would say the irrationality of morality is what is currently making Human society superior to Machine society.
If you had gone with your original instinct Aeneas might have looked smarter than you, Castor, but as it is, you did the important thing which was to give the whole scenario the best shot after rigorously questioning and analyzing your own beliefs. That was plenty smart enough. And nothing Aeneas questioned or realised actually made deciding any easier for him.