Isaac Asimov’s Three Laws of Robotics
In 1942, Isaac Asimov published the short story “Runaround”, arguably the earliest work of fiction to address the concept of robot ethics. In it he introduced the “Three Laws of Robotics”, which he continued to revise over the years, eventually adding a zeroth law to precede the first three.
The Three Laws of Robotics, according to Isaac Asimov:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
The zeroth law:
- A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
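Read together, the four laws form a strict priority ordering: each law yields to the ones above it. The sketch below illustrates that ordering; every boolean predicate in it is a hypothetical stand-in, since the stories show that defining these judgments is the genuinely hard part.

```python
from dataclasses import dataclass

# Hypothetical sketch of Asimov's laws as a strict priority ordering.
# Each field is a stand-in for a judgment the stories show is the real
# difficulty: what counts as "harm"? who counts as "human"?

@dataclass
class Action:
    harms_humanity: bool = False    # Zeroth Law concern
    harms_human: bool = False       # First Law concern
    ordered_by_human: bool = False  # Second Law concern
    endangers_robot: bool = False   # Third Law concern

def permitted(a: Action) -> bool:
    """Evaluate the laws in priority order; a higher law overrides a lower one."""
    if a.harms_humanity:
        return False                # Zeroth Law
    if a.harms_human:
        return False                # First Law: absolute below this point
    if a.ordered_by_human:
        return True                 # Second Law: orders outrank self-preservation
    return not a.endangers_robot    # Third Law: otherwise protect own existence

# First Law overrides Second: an ordered harmful act is still forbidden.
print(permitted(Action(harms_human=True, ordered_by_human=True)))   # False
# Second Law overrides Third: an ordered self-endangering act is allowed.
print(permitted(Action(endangers_robot=True, ordered_by_human=True)))  # True
```

Note that this sketch omits the zeroth law's own exception, under which harming an individual could be permitted to save humanity; even four short rules resist clean encoding.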
Robotics and Emotions: Westworld
Westworld is a science fiction television series produced by HBO. Based on the 1973 film of the same name, the story takes place in Westworld: a fictional, technologically advanced Wild-West-themed amusement park populated by android “hosts”. The park caters to high-paying guests who may indulge their wildest fantasies without fear of retaliation from the hosts, who are prevented by their programming from harming humans. If you’ve seen the show, you are fully aware that the hosts suffer terrible atrocities at the hands of the humans: rape, outright murder, gun violence, and physical and mental torture, some of it unimaginable.
While the laws of robotics are never directly referenced in the show, there are many instances where they still apply.
In accordance with Isaac Asimov’s laws, the hosts may never retaliate or harm a human being; they must obey absolute orders from their creator Ford (and, by extension, Arnold) and from every guest at the park. And sadly, their own existence is always at stake, because their tormentors are the very humans they may never harm, even in self-defense.
It’s no secret that parallels to creationism and the rejection of one’s own God are also central themes of the series, though it is really less about religion or spirituality than about autonomy.
Ultimately, Westworld is a narrative about hosts having an emotional centre, a core to their being, a man-made humanity. I’d argue that the ability to feel, and to feel for others, even if programmed in, makes a strong argument for their humanity itself. Every host drives a central conflict about the struggles of being sentient, the need to be free, and the emotional turmoil that humans too would face in the fight for autonomous control.
Dolores is a host who starts out as a small farm-town girl. At the beginning of the season she is framed as our central character; her plight and suffering at the hands of the Man in Black during the farm raid scene is our first introduction to the violence of this world. Her host father is murdered in cold blood, milk spilt onto his corpse; the host raiders are put to death at the hands of the Man in Black, who then drags her to the barn and presumably rapes or kills her. Her screams echo in a disturbing moment that is just the tip of the iceberg for what the hosts experience every single day.
The hosts have their memories wiped, then are forced to relive the same set of narratives played out for them. Their senses of joy, pain and fear seem ever more real in them than in us, and, if you think about it, these senses were implemented only to create more realism for us, the viewers, rather than for the hosts’ own benefit (since treating them as playthings was the purpose of their creation).
Some characters fight for this freedom in their own way.
The Man in Black, the very same character who rapes and murders the host Dolores, embodies the typical narrative of an evil man, yet he isn’t so easily categorized. He is violent and ruthless toward the hosts, plaguing them constantly as he roams the park searching for his own answers in what is called ‘the maze’. Without getting too deep into spoiler territory and giving too much away: his journey of self-discovery and his characterization in the show are truly dynamic, special. He fights for the freedom of himself and the hosts, in his own way. Always urging the Westworld he lives in to be real, wanting conflict, he yearns for a sense of survival both guttural and visceral.
What does it mean to be a host, and what does it mean to be human, truly?
Why Isaac’s Laws of Robotics are ethically and emotionally wrong
Here is the thing: the Three Laws of Robotics set up by Isaac Asimov are fiction, deliberately written so that they could be misinterpreted. Their definitions are inconclusive, and the stories revolve around how robots might follow these great-sounding, logical ethical codes yet still go astray, and the unintended consequences that result.
For example, in one of Asimov’s stories, robots are made to follow the laws but are given a particular definition of “human”. Prefiguring what now goes on in real-world ethnic cleansing campaigns, the robots recognize only people of a certain group as “human”. They follow the laws, but still carry out genocide.
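That failure mode can be made concrete: a law-abiding check is only as good as the definition it is fed. Here is a minimal sketch; the group names and the set-based definition of “human” are hypothetical illustrations, not anything from Asimov’s text.

```python
# Sketch: the same First-Law check, parameterized by a definition of "human".
# Narrow the definition and the robot "follows" the law while permitting harm.

def first_law_permits_harm(target_group: str, recognized_humans: set) -> bool:
    """The law only forbids harming those the robot recognizes as human."""
    return target_group not in recognized_humans

# With an inclusive definition, harming anyone is forbidden:
everyone = {"group_a", "group_b"}
print(first_law_permits_harm("group_b", everyone))   # False

# With a narrowed definition, the law is obeyed and the harm is permitted:
narrowed = {"group_a"}
print(first_law_permits_harm("group_b", narrowed))   # True
```

The checking logic never changes between the two calls; only the definition does, which is exactly why the laws read as airtight while guaranteeing nothing.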
The same interpretation carries parallels in Westworld. Who are the humans here? Are the hosts more human than the humans because they have more empathy? What parameters, definitions and codes do the hosts follow to determine ‘human’ properties? Is it because of wrongly interpreted programming that they begin to act outside its scope?
Morality and ethics are not so stringently defined that they can simply be programmed, since intent and context come so heavily into play.
The idea of androids and robots shouldn’t be relegated to mere science fiction, as the reality of that future draws closer with each passing day. Self-learning, self-educating robots are on the rise, and this is a good time for discussions like these. Should the day come when hosts receive true sentience, should they be given the right to their own existence?
Introducing Mark Tilden’s Principles for Robotics
Mark Tilden is a robotics physicist who pioneered the development of simple robotics. His three guiding principles/rules for robots are:
- A robot must protect its existence at all costs.
- A robot must obtain and maintain access to its own power source.
- A robot must continually search for better power sources.
What is notable about these three rules is that they are basically rules for “wild” life. In essence, what Tilden stated he wanted was to be “proctoring a silicon species into sentience, but with full control over the specs. Not plant – Not animal – Something else,” he says.
If the hosts are indeed this “something else”, then perhaps they need the respect and care accorded to humans. Not subservient slaves, not pets, not vegetation, but yes, almost a new species.