Monday, May 17, 2010

Machina

I watched Stanley Kubrick’s 2001 again the other day after reading Arthur C. Clarke’s novel of the same name, which was written during film production. One of the themes I got from the film and book was the nature of intelligence, sentience, and its control. Near the end of the second act, astronaut David Bowman unplugs HAL, the on-board computer that controls everything, after it kills Bowman’s colleague Frank Poole. Up to this point, the human crew members have treated HAL as a fellow with intelligence and feelings, entrusting the machine with their lives. When that trust is betrayed, and HAL understands the cost of the deed, it begs for mercy.

In both the book and the film, this monologue is really quite touching. You almost feel sorry for the computer that made the mistakes. Had this machine been human, we’d understand its error as part of our collective condition. To err is human, to forgive is divine.

But if one of our agents, a computer, exposes one of our human errors, we must reboot or unplug it. We are forgivable, but the machine of our making is not. It doesn’t enjoy the same rights and privileges we do. It is not a person. (By the way, women were not considered “persons” in Canada until 1929.)

In his recent Globe and Mail essay, “One Robot, One Vote?”, Neil Reynolds addresses the issue of robot rights. For a good chunk of it, he assumes that cyborgs will have genders and discusses sex, marriage, and divorce. Sadly, he doesn’t entertain the notion of gender-neutral robots or same-sex human-robot relations.

He does, however, bemoan the fact that “so far most of the heavy thinking about their rights, responsibilities, and morality has come from comic books.” Hmm. Yet he cites Clarke and Isaac Asimov’s “Three Laws of Robotics,” and neither of those authors wrote comic books. (He also cites the Bible, which is now a graphic novel.) We could also look at Gene Roddenberry’s Star Trek: The Next Generation (“Oh, Data, you are a gem!”). Science fiction and comic books are the playgrounds of ideas, particularly the uncomfortable ones that make lesser men and women squirm. Why not do our heavy thinking there? Where else will it be done: government?

Will robots eventually have rights? I expect so. We’ll create them in our own image. I just hope that by the time we have to put this heavy thinking into action and words, we ourselves will have become more humane.
