December 22, 2006

Robot Rights

Being the fan of science fiction that I am, I've read about the issue of robot "rights" quite often. The Financial Times of London has chimed in on the issue, noting that

Visions of the status of robots around 2056 have emerged from one of 270 forward-looking papers sponsored by Sir David King, the UK government's chief scientist. The paper covering robots' rights was written by a UK partnership of Outsights, the management consultancy, and Ipsos Mori, the opinion research organisation.

"If we make conscious robots they would want to have rights and they probably should," said Henrik Christensen, director of the Centre of Robotics and Intelligent Machines at the Georgia Institute of Technology.

Probably the biggest current event dealing with robot rights is the hit SciFi Channel series "Battlestar Galactica." In it, humans on distant planets had developed a race of robots called Cylons. The Cylons eventually developed sentience, rebelled against their human masters, and nearly annihilated their creators. However, much remains unknown about exactly what led to the Cylons' revolt -- did the Cylons plan it without ever attempting to gain certain rights? Or did the humans refuse to negotiate with the newly aware Cylons?

As noted in the article, science fiction legend Isaac Asimov widely explored the concept of self-aware robots in myriad short stories and novels. He created the "Three Laws of Robotics," which state that 1) robots may not harm humans, 2) robots must obey human orders (unless doing so conflicts with the first law) and 3) robots must protect their own existence (unless doing so conflicts with laws 1 and 2). The movie "I, Robot" starring Will Smith combined many aspects of Asimov's stories. The basic premise is that a new type of robot has become self-aware. One such model keeps "mum" about it for fear of human reprisals, but others begin planning to fight to keep their newfound "ability." Asimov's "robot novels" were sensational, notably The Robots of Dawn, The Caves of Steel, The Naked Sun and Robots and Empire. In that last book, Asimov had one of his robots conceive of a "Zeroth Law" -- where the safety of humanity as a whole takes precedence over that of an individual human being. Asimov's famous R. Daneel Olivaw not only plays a key role in the novels noted, but also -- surprisingly -- ends up being a pivotal figure in the classic "Foundation" series as well.

On the small screen, "Star Trek: The Next Generation's" Data explored the sentient android concept very thoroughly. The classic second-season episode "The Measure of a Man" has Capt. Picard defending the rights of Data as a sentient being after a Starfleet scientist seeks to disassemble Data so that the Federation can make androids commonplace. In the following season, Data himself creates an "offspring" (the actual title of the episode) -- another android, which worries Starfleet (since an android, not a human, created it). When a Starfleet admiral wants to remove the offspring from the Enterprise, the young robot "dies," suffering the android equivalent of an emotional breakdown.

"I, Robot" was also the title of the classic "Outer Limits" episode starring Leonard Nimoy. "Adam Link" is a robot who stands accused of murdering his (its) creator. After being found guilty, Link saves the life of a little girl who's almost hit by a truck. Similarly, the "Twilight Zone's" "I Sing the Body Electric" highlights a robot "grandmother" that a widower purchases for his young children. The "electric grandmother" proves her "humanity" to the family, but sadly is returned to the factory from whence she came after the children are all grown -- so she can be reassembled into another robot for another family's use.

Many are familiar with the "Matrix" movies; however, the excellent "Animatrix" is not as well known. This animated compendium features the backstory of the "Matrix" trilogy. It's quite similar to the "Battlestar Galactica" saga in that human-created machines have become self-aware and desire many, if not all, of the rights that humans have. After a period of mild conflict, an agreement is reached between the humans and the machines. The machines are essentially given their own nation in which to live freely among humanity. However, when the machines prove quite superior to humans economically, Homo sapiens realize they have to do something. Their attempts to wipe out the machines fail, and the machines fight back. Obviously they're successful (see the "Matrix" flicks, natch), but even so, one can argue that they showed a measure of empathy for their conquered creators by allowing them to live their lives inside a computer-generated fantasy.

Given the way technology is rapidly progressing, I don't think it far-fetched that sentient robots could exist by 2056. And what indeed should be done about it? Do we attempt to destroy them? Or allow them to work among us ... and likewise grant them rights? Is it conceivable that newly self-aware robots could use their cold logic to conclude that humanity is a danger to itself (and, thus, to them) and therefore should be exterminated (a la "The Terminator")?

Posted by Hube at December 22, 2006 03:38 PM | TrackBack

Comments  (We reserve the right to edit and/or delete any comments. If your comment is blocked or won't post, e-mail us and we'll post it for you.)

I loved the animatrix!

Although Bicentennial Man and A.I. were both rather sappy.

So, how does one prevent robot sentience?

Posted by: AnonymousOpinion at December 22, 2006 03:50 PM

How could you possibly neglect to mention Colossus: The Forbin Project? ;-)

Uh oh... I feel a cascade failure coming on.

Fighting... anger... blogging... friends... male... human... 404 ERROR

Posted by: Watcher at December 22, 2006 05:08 PM

Hube, do you watch "Heroes"? I don't know if it's the type of sci-fi you like, but it also seems to be right up the alley of a comics fan. I'm catching up on the first 11 episodes, and like it a lot so far.

Posted by: dan at December 22, 2006 05:16 PM

Animatrix is awesome. I love "back story" stuff.

Anyway, the computing power needed for AI would lead to "the singularity" well in advance of any robot vs. man showdown.

http://en.wikipedia.org/wiki/Technological_singularity

Posted by: Jason at December 23, 2006 01:27 AM

Man, Watcher is right -- I shouldn't have omitted "Forbin." However, I was going purely by memory and I knew I'd leave some key examples out.

dan: Nope, never saw it. I don't have much hope for TV scifi anymore. I've been devouring a bunch of scifi books lately, especially now that my winter break from school is upon me. In the last week I've finished Old Man's War and am just about done with the follow-up, The Ghost Brigades. I also just received in the mail today a Joe Haldeman compilation which I can't wait to dive into.

Posted by: Hube at December 23, 2006 06:31 PM

Thanks for posting this. Now, at least, I know of one other person who watches and apparently enjoys Battlestar Galactica. I was beginning to think I was the last human with good taste, forced to blend in among the "pod people". How can something so well written be so unknown?

Posted by: kavips at December 24, 2006 04:34 AM