Let us start with this observation: "In Levy’s presentation, the language deployed to describe the ideal robotic spouse was inadvertently telling: 'All of the following qualities and many more are likely to be achievable in software within a few decades. Your robot will be patient, kind, protective, loving, trusting, truthful, persevering, respectful, uncomplaining, complimentary, pleasant to talk to, and sharing your sense of humor.' Suffice it to say for now that this statement reveals quite a bit about the priorities of the author, and very little about the task of technologically approximating any existing human woman." Or to put it in other words: the robot is to be giving on the one hand and undemanding on the other. So that word, "love": this describes not something to be loved, certainly not the subject of agape. It is instead something to take "love" from.
And that takes us to something that Slade does not touch upon. She does remark upon the endless optimism of AI futurists that the human mind can be simulated, but it is, after all, 2017 and not 1976, the year that Joseph Weizenbaum's Computer Power and Human Reason was published. Weizenbaum was shocked to discover that many people were willing to treat interaction with the manifestly stupid ELIZA program which he had constructed as if it were interaction with a real person. The lesson I see in this is that the Turing Test (conversation with a computer being indistinguishable from that with a human) fails in practice, because people are simply not good at telling the difference. Weizenbaum went on to attack the whole notion of external behavior as a proxy for the interior life of the mind, and this is particularly pointed in a situation where that interior life really seems, in the end, to be unwelcome.
After all, the terms of Levy's ideal are frankly servile, and it is a short distance from serving to simply being used. One senses from all the emphasis on sexual companionship that these futurists have no problem with the idea of a robotic sex toy which cannot be raped because refusal has been edited out of its humanity. In one telling footnote, Slade remarks that "While writing in this field gestures toward both male and female robotic lovers, the predominant assumption is clearly centered on the idea of a robotic woman," and I imagine that feminist analysis of this would be unsparing. She remarks on the degree to which various alienations seem to drive the quest for the robotic companion, but therein lies the irony: even if it were possible to fully emulate the humanity of a real woman, it does not seem (in the minds of these would-be Pygmalions) desirable to do so. This she does note, but I would go further: not only would they not desire it, but I believe that they would on one level fail to see that something was missing. That is, they would see that some "undesirable" impulses were absent, but they would fail to grasp that they had obtained something less than human, because all of these "faults" arise out of the will.
So here we are, right back in our own Eden, but we don't have to worry about Eve taking the fruit because she lacks the independence to do so. I am thus irresistibly reminded of the creation of the dwarves by Aulë in the Quenta Silmarillion. Ilúvatar sees how the dwarf fathers were created outside of his one, true creation, and he challenges Aulë's deviation from the divine plan; but when Aulë moves to destroy them, they quail, showing that they have been given life and wills of their own. I imagine our ideal companion robots putting up no such defense, and indeed one can imagine a robot who has specifically been created to be tortured (as I believe at least one SF writer must have already depicted). And thus the alienation is complete: its creators put off from real human congress, the ideal robot companion is made a perversion of humanity, specifically so that it can be used and abused without qualm. We may not be able to give it a soul, but we will surely see to it that it does not accidentally get one. So much of robot mythology fantasizes that they may exceed our humanity and displace us, but here it seems that in our relationships with them, they may exceed our humanity because we diminish theirs, and with that, our own.