So I saw
I, Robot for the second time yesterday, and I liked it just as much as the first time. If you haven't seen it, please do (and I rarely encourage people to spend their hard-earned dollars on blockbuster Hollywood flicks--well, ok, maybe you should see
Spider-man 2 as well; not only is it a great summer movie but I have an inexplicable, giggly-sighing-girl-like celebrity crush on Tobey Maguire. And speaking of giggles and sighing, Will Smith is still buffed out enough from
Ali to look pretty hot in a few gratuitous, “Look-how-fine-I-am” nude and semi-nude shots in I, Robot).
Disclaimer: I apologize in advance for possibly spoiling any surprises in the film. But the movie is good enough that it won’t matter much.
The premise of I, Robot is that in a future world, robots play a central role in modern human life, serving people in a myriad of ways, from personal assistant to cook, baby-sitter to factory worker. In order to keep the robots completely safe for human use (read: exploitation without the emotional and moral complications), they are all programmed with three basic laws (or rules):
1st law: A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
2nd law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3rd law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
(Note: I actually got this version of the laws from an
Isaac Asimov web page about robotics, and not the film itself.)
Hardwired with these three “perfect” laws, the robots become more common than personal computers are today, with 1 robot for every 5 humans on the planet (yikes! That’s a LOT of electricity). They operate on pure logic, rescuing drowning people based on their probability of surviving, for example, and have no emotional capacity whatsoever.
Pretty neat, right? None of that pesky human emotion to confuse and complicate things, none of those moral or ethical dilemmas about right and wrong. Three simple laws: Don’t harm humans. Obey orders. Protect yourself. As long as you don’t harm humans. Cool.
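Just for fun, here's how that pure-logic rescue decision might look as code (a playful sketch of my own; the names and numbers are invented, not from the film or Asimov): faced with several drowning humans, the robot saves whoever has the highest estimated survival probability--no sentiment, just the 1st law plus math.

```python
def choose_rescue(victims):
    """Pick the victim with the best estimated chance of surviving the rescue."""
    # Pure logic: maximize expected lives saved, ignore everything else.
    return max(victims, key=lambda v: v["survival_probability"])

# Hypothetical scenario: two people drowning, one rescue possible.
victims = [
    {"name": "detective", "survival_probability": 0.45},
    {"name": "child", "survival_probability": 0.11},
]
print(choose_rescue(victims)["name"])  # the cold logic picks "detective"
```

No emotional capacity whatsoever, indeed.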
Of course, things don’t stay that simple.
Enter the newest generation of robots, the NS-5s--a name with such a Macintosh, OS X ring to it that all us Mac-lovers gotta be proud. NS-5s are created with space for a second processor, located where a
human heart would be. The robots start to evolve. They start to make their own choices. They start to listen to other robots instead of humans.
They break the rules--or at least the rules the way humans understand them. Things get ugly. If you’ve seen the previews, you know what I’m talkin’ about.
Which brings me back to the idea of rule-breaking. When humans break the rules as part of their creative efforts, the results can be astonishingly beautiful, even sublime. Consider Bruce Lee’s street-smart
Jeet Kune Do philosophy, which strips more traditional kung fu down to some bare essentials, and ends up as a seriously fierce and effective approach to ass-kicking--I mean, living as a peaceful warrior.
Or think about jazz music, which has roots in musical styles as disparate as Italian operettas, Black southern blues, American military marches, and old Negro spirituals. Who knew that all that stuff could be cooked together like gumbo to create a completely new musical genre that in turn helped give birth to a dizzying array of other sounds, from bebop to the
Motown sound to trip hop and acid jazz?
Kali takes a similar approach--Bruce Lee did study it with his homie, world-renowned Kali and Jeet Kune Do master
Dan Inosanto--taking the old wisdom passed down through generations of teachers and deepening it through new understandings of not only other martial arts forms, but of everyday life.
Kamatuuran, the name of my Kali school, is a Visayan word for “truth,” and that is our ultimate quest--for truth itself.
So what’s the difference between a bunch of robots breaking those three simple laws and humans breaking and re-writing the rules of artistic expression?
That heart-like space in those robots? It wasn’t filled with the capacity to feel empathy, compassion, anger, sadness, grief or joy. One robot character in the film, Sonny, who steals the show in a few instances,
does learn how to feel, and--even more important--how to reconcile those pesky, nuanced, knotty emotions with the immaculate logic of his robot-brain. I’ll let you guess what happens next.
As for us humans, our hearts are not only in the center of our bodies, they are (or at least should be) in the center of our creative pursuits--from the wailing grief of a bluesy jazz trumpeter to the joyful exhilaration of a dancer’s gravity-defying leap into the ether.
Or on a darker note--from the steel-cold slicing of a killer-enemy’s flesh to the merciful warning of a glancing blow--our hearts, if we listen to them, can also tell us where the fine lines lie between life and death, between going too far and not going far enough.
And during these times of war and fear-mongering and danger, we need to listen to our hearts even more deeply, and let them move our whole beings into truthful action.