Bad Dog!

A few years back, it was, when I saw my first video of a robo-dog. I don’t mean the cute ones that you might fool yourself into thinking, on an off day, might be a real mammal. I mean the bare-bones, mean-looking robot kind. It was, of course, being developed by military contractors. Then just days ago I saw something truly frightening. In a video from China, one of these robo-dogs, with an assault rifle and a ton of sensors mounted on it, was air-dropped by a drone and began policing the area. Knowing that fleets (I’m not sure that’s the right word) of thousands of drones have been coordinated for entertainment purposes, and aware of how much money and tech militaries have, well, let’s just say nightmares aren’t just for sleeping any more.

Image credit: DARPA, public domain via Wikimedia Commons

Dogs were killers by nature until that instinct was mostly bred out of them. The wolf has to be a predator to survive in the wild. As much as we like bipedalism, it has to be admitted that four legs (or more—imagine the robo-centipede, if you dare!) benefit locomotion quite a bit. You may get a lower angle of view, but you gain a boost in speed. And if you see a robo-dog, especially one with a machine gun for a nose, running is where you’d want to excel. But we’ve taken our companion—our “best friend”—and made it into yet another engine of fear. As someone who grew up with an inordinate number of phobias, I really don’t need one more. Of course, it’s a truism that if a technology comes from the military it will be cause for alarm.

I’m capable of dreaming. I can dream of peace and cooperation and what we could build if we didn’t have to worry about the aggressive, the greedy, and the narcissistic. Those who never learned to play well with others but who make money easily and spend it to bend the world to their bleak, bleak vision that lacks a happy ending for all but themselves. I can envision meeting people who are different without the first thought being exploitation—what can I get out of them?—or fear that they wish to harm me. Humans are endlessly inventive, especially when it comes to ways to harm one another. If our creativity could be set toward working for the benefit of all, dogs would be for petting and drones would be for seeking out new ways to solve the problems that beset us all. Instead we make them into new nightmares.


Mars Bars

It brings tears to my eyes. A little guy millions of miles from home. The only spark of acknowledged intelligence on the entire planet. It’s his birthday and he’s singing “Happy Birthday” to himself. It’s downright depressing. The guy, however, is the Mars rover Curiosity. It is a machine. Yet the headline jerks an emotional response from all but the coldest of individuals: “Lonely Curiosity rover sings ‘Happy Birthday’ to itself on Mars.” It’s that word “lonely.” It gets me every time. Then I stop to think about machine consciousness again. Empirical orthodoxy tells us that consciousness—which is probably just an illusion anyway—is restricted to people. Animals, we’re told, are “machines” acting out their “programming” and not really feeling anything. So robots we build and send to empty planets have no emotions, don’t feel lonely, and are not programmed for sadness. Even your dog can’t be sad.

Amazing how short-sighted such advanced minds can be.

We don’t understand consciousness. We’re pretty wowed by our own technology, however, so much so that building robots can be brought down to the level of middle-school children. We build them, but we don’t understand them. And we may be losing part of ourselves in the process. An undergraduate I know, who works at a summer camp to earn some money, tells me a couple of disturbing things. Her middle-school-aged charges are having trouble with fine motor skills. They have trouble building basic balsa-wood airplanes. Some of them can’t figure out how paperclips work. One said she couldn’t write unless she had access to a computer. This camp worker’s supervisor suggested that this is typical of the “touchscreen generation.” They’re being raised without the fine motor skills that we’ve come to take for granted. Paperclips, it seems to me, are pretty intuitive.

Some 34 million miles away, Curiosity sits on Mars. An exile from Earth or an explorer like Henry Hudson? Or just a machine?


Machines don’t always do what you tell them to. I attended enough high school robotics sessions to know that. Yet at the local 4-H fair the robots have a tent next to the goats, the dogs, and the chickens. We’ve come to love our devices. We give them names. They seem to have personalities. Some would claim that this blog is the mere result of programming (“consciousness”) just as surely as Curiosity’s programmed singing to itself out in the void. I’m not for turning back the clock, but it does seem to me that having more time to think about what we do might benefit us all. This constant rush to move ahead is exhausting and confusing. And now I’m sitting here wondering how to get this belated birthday card delivered all the way to Mars.


Circus of the Absurd

As long as I’m thinking about ethics, my thoughts turn to the fair. Every August our county 4-H Fair becomes an event in our lives. Since my family has been involved with 4-H for many years, we always try to spend as much time there as we can afford. Jobs and daily life tend to get in the way, of course. While there we get to see the animals that are missing from our lives, and reconnect with art and culture. Robotics is now part of our local fair, and this is the first year that I’ve ever seen pigs there. And there were the political booths. Just around the corner from where the sheriff’s office was giving out free gun locks to prevent kids from shooting someone accidentally was the booth supporting Trump. I’ve never been so strongly tempted in my life to walk up to a total stranger and say, “You are kidding, right?” But no: like the Donald himself, the large, flashy sign displayed their ignorance for all to see. We live in the era of the delightfully uninformed.


I’m no political pundit. I tend not to trust any politicians much. I distrust businessmen even more. The fact is, the only thing you need to be a viable candidate for President is money. Over the past several weeks Trump has shown himself to be anything but qualified for political office. Major newspapers run articles that seriously question his sanity. And yet here are good people who don’t have the sense to maybe put up a small, embarrassed sign saying “Sorry folks, we’ll try again in 2020.” We find it hard to admit our mistakes. Especially when the stakes are so terribly high.

I go to the fair to support 4-H and to enjoy an evening out with my family. Although I spend most of every day in a different state, working in an isolated cubicle, I can always count on seeing people I know at the fair. I enjoy the arts tent, where young folks are making their first steps into lives filled with creativity and imagination. The more technical tents can be intimidating: kids a quarter my age are launching model rockets and those under half my age are building robots. In the herpetology tent I see a snake amid a bed of shredded newspaper. He’s hiding under the photo of a prominent non-politician who has a large booth displaying his name just across the grounds. And I remind myself this is the first year they’ve had a swine tent. I wonder if anything will be the same next year.


Double Blind

When I read Isaac Asimov’s three laws of robotics as a child, I assumed that I’d not live to worry about them in real life. What we don’t know can indeed hurt us. Time magazine frightens me sometimes. This week’s offerings include a small blurb about drones. When I was a kid, a drone was a bee—dangerous in its own right—or it was a verb used to describe an uninspired teacher or preacher’s monotone wisdom. Now drones are robotic planes that can fly themselves. Time reveals that technology has been developed that would allow drones to kill without human input. Asimov’s laws have become truly science fiction. Proponents argue that “collateral damage” might be minimized if we allow robots to kill with precision, while others have argued that the research should be prohibited. The fact that it has been developed, however, means the line in the sand has been crossed. If it has been done once, it will be done again.

MQ-9 Reaper in flight, 2007

Even as a daily user of technology, a deep ambivalence besets me. Maybe if it weren’t for the fact that every once in a while my computer (most often at work) freezes up and issues a message I can’t understand, I might feel a little more secure. Instead I file a ticket with IT, and when they call on me sometimes even a specialist can’t figure out what went wrong. Once the bullets are flying it’s a little too late to reboot. Maybe I’m just not yet ready to crawl into bed with a technology that might kill me, without feeling.

Just five pages earlier Time notes that 1 in 5 is the “Ratio of people who would have sex with a robot, according to a U.K. study.” All things are fair, it seems, in love and war. The part of the equation that we haven’t accounted for in our artificial intelligence is that thought requires emotion—which we don’t understand—as much as it requires reason, of which we have only a toddler’s grasp. And yet we continue to build more and more powerful devices that might kill us with ease. Isaac Asimov was a prescient writer and a forward thinker. He came from a generation that aspired to have ethics in place before technology was implemented. At least as an ideal. We’ve reversed the order in our world, where ethics is continually playing catch-up to the new technologies we’ve invented. Now it’s time to decide whether to make love to it or to say our final prayers.


Paging Dr. Asimov

Who remembers Rock ’em Sock ’em Robots? Plastic “robots” in the boxing ring trying to knock each other’s blocks off was a form of entertainment for kids of the ’60s, before such things as humanoid robots actually existed. So when Boston University’s alumni magazine had an article about dancing robots, I had to see what was up. As regular readers will know, I’ve been exploring some of the problems with reductionism lately. This idea, that humans and animals are just fleshy machines, breaks down when we try to design robots that can do some of the most basic of human activities. Sometimes we dance and we don’t know why. Apart from WALL-E’s dance with EVE, robots have trouble getting the concept. Boston University professor John Baillieul notes that this isn’t about “some high school guy who had trouble getting a date, so you get a robot. The ultimate goal is to understand human reaction to gestures and how machines may react to gestures.” Having actually been a high school guy who never even got to the prom, I’m wondering how depressed our robots get when the fem-bots all look the other way.

Rock ’em Sock ’em Robots

The reductionistic outlook suggests that we can eventually program robots to respond as humans would, reacting fluidly to situations and allowing them to override their “instinct,” which, the article implies, equals programming. We have no idea what instinct is. It is something all biological creatures have, from the heliotrope following the sun to the human dancing her heart out. Do we want machines to replicate our most intimate emotions? Even our most reliable chip-driven devices sometimes freeze up or rebel. My car has recently gotten the idea into its mechanistic brain that the right-hand rearview mirror should be rotated as far to the right as possible. We bicker about this all the time when I get in to drive. Well, machines know best. They, after all, are the shape of the future.

So programming robots to react in real time to non-verbal cues, as all sentient beings do, is a desideratum of our mechanistic Weltanschauung. Notes Rich Barlow, the article’s author, “bats, for example, camouflage their motions so that they can sneak up on insect prey, a fake-out familiar to anyone who’s tried to swat a pesky fly.” My question is: who is the pesky fly in this robot-human scenario? Who acts irrationally and unpredictably? Isn’t our instinct to smash the fly a result of our annoyance at it landing, yet again, on our sandwich with its dirty feet? And what is that stupid dance that it does when it’s all over our food? Reductionism must, by definition, reduce instinct to the level of a kind of genetic programming. Even this aging blogger, however, knows what it is to dance without knowing why. He also knows what it feels like when your date goes home with somebody else, and he’s not convinced we want robots calculating an “instinctual” response to that.


RoboStop

Although few objects are as soulless and mechanistic as robots, I still feel strangely emotional about them. Had my daughter not been interested in them, I would never have become involved with FIRST Robotics, even serving for a year as an officer in the Team 102 foundation to help raise the thousands of dollars needed to run such a club. Like most people with a background deep in the humanities, I wouldn’t have pondered too deeply how much of ourselves we put into our machines. Right now I’m reevaluating that hypothesis.

No one doubts that an artist or musician puts her- or himself into her or his work. Those who do it best are most highly valued (usually after we’ve let the artist die off, having lived a difficult life). We admire those who are able to catch the human spirit in such forms of expression, while many scientists inform us that there really is no spirit at all—it is just chemical reactions and electrical circuits in the soft tissue of our brains. When we see the Mona Lisa, or hear Beethoven’s Seventh, however, we know they must be wrong. What we make becomes part of us. And I’m thinking that may apply to robots too.

I’ve just attended my last FIRST Robotics competition. It is difficult to convey, if you’ve never been, what such an event is like. Hundreds of screaming high school students excited about engineering and the thrill of competition. A playing field is constructed to exacting specifications, and six teams in two alliances face off their creations, which emulate human—sometimes superhuman—behavior. All the while the thumping beat of loud rock music and the play of colored lights give the event the emotional charge of a football game and homecoming dance rolled into one. Only you really don’t have to move very much at all.

I’m not a robot designer or builder. It is difficult to imagine anything further from my training (except perhaps accounting). Still, I’m a little let down after my last FIRST competition. Four years ago it seemed so novel, and there have been some difficult moments along the way. I’ve seen kids build robots that play soccer, hang inflatable tubes, play basketball, shoot frisbees, climb towers, climb poles, and do many other seemingly impossible tasks. I’ve been up before dawn to ride a chilly school bus across the state to compete, coming home in time to fall in bed to get up early for work the next morning. But most of all, I’ve seen kids putting themselves into more than machines. I’ve seen them putting themselves into a team. Although we didn’t win this year, and next year I probably will be consumed with other concerns, I am proud to have been, in my small way, a part of Team 102. Way to go, Gearheads! Maybe robots do have souls after all.

Team 102 final alliance


Next to Godliness

Catholics, secularists, and even a Pharaoh or two. Loud, pounding music. Dancing teenagers. It must be FIRST Robotics season again. Although I’m ambivalent about the implications of a world filled with robots, I can’t help but be impressed by what high school students can do when they are mentored so closely by adults eager to share the tricks of the trade. If you’re not familiar with FIRST Robotics, here are the basics: each January a new game objective is released. Participating high schools throughout the world have six weeks to plan, design, construct, and program a robot to perform the tasks spelled out. Since this is a busy time of year for many schools, dedicating extra hours to building a robot leads to complaints and loss of sleep—maybe a skipped supper or two. When they come together on the playing field, however, all that is forgotten and the wonder is that kids, who are often disparaged in our society, have managed to construct a working, complex machine capable of tasks impossible for many adults (for example, doing chin-ups).

Every year I can’t help but think how like a religious service these events are. The robots are like deities to be served and the technology flits about like mechanized angels. There is an increasingly complex hierarchy of officials telling you what you can’t do (now this is beginning to feel like work!). At the end of the day, however, the kids get to be the stars in a competition that puts brains over brawn. And the robots are treated with extreme deference, because we know that we wouldn’t stand a chance if they had a will anything like the deities of yore.

The religious imagery, however, is never absent. Technology represents humans doing things without divine intervention. These are empirically devised devices, performing according to the laws of physics. And yet teams from Catholic high schools, bearing mythologically laden names, join in a world where no gods need apply. Robots, as originally named by Karel Čapek, were human servants, the ultimate in godliness—making images in our likeness to do our bidding. And yet we can’t escape the language of religion when thinking about our own creations. The fascination applies to non-parochial schools as well, with some teams claiming names echoing themes from holy writ. Creating autonomous beings is next to godliness. We make our own future, and, god-like, we hope that nothing goes wrong.

Humans and machines


A Tale of Two Bees

We’re nearing the competition season for FIRST Robotics. The animated, mechanical creatures built from scratch since early January are now set to compete in a kind of ultimate, ultimate frisbee. Only you can’t call it “frisbee,” for trademark reasons. Ironically, drone bombers have been in the headlines this past week. Drones are robotic planes that fly their missions with human pilots sitting safely hundreds, or even thousands, of miles away from the action. People are beginning to wonder—is this ethical? I pull out the Scientific American I purchased at Bush Intercontinental in Houston last week. There’s an article about robo-bees. I’ve been reading about the disappearance of bees, a scare that could have come straight from The X-Files. There are people seriously worried about this. It does seem that we failed to learn the lesson of Rachel Carson, and a land of milk and honey just doesn’t appeal without the honey.

The robo-bees are the size and roughly the shape of biological bees. They can be programmed to behave like bees and pollinate plants that our missing bees have been, well, missing. There may be hope for the flowers after all. But I wonder about the honey. No doubt, technology will come to the rescue. Those labs that gave us sucralose, aspartame, and stevia can surely invent a golden, viscous liquid sweetener that drips from a pipette. No cause for worry here. We can recreate the natural world in the laboratory. Honey has been reputed to have medicinal effects, but we can synthesize medicines in the lab as well. You might not want to dribble those on your biscuits, however.

Honey is made from nectar, the mythological food of the gods. Hinduism, Buddhism, and Judaism all recognize the religious significance of honey. Those of us who’ve been stung realize that a price has to be paid for such divine sweetness. The gods are like that. Roses have thorns for a reason. Not that I’m not impressed with the technology behind robo-bees. I am astounded that tiny robots can be built to fly and perform as we understand nature dictates for the genus Apis. They don’t, however, have the minds of bees. Mind is not the same as brain, as we’re beginning to learn. And minds are not limited to Homo sapiens. I recall when, in our arrogance, we thought we could improve the productivity of bees (capitalist bees) by breeding them with their African cousins, biologically separated by an ocean. The resulting killer bees haunted my nightmares. Yes, I had been stung as a child. Just by regular, garden-variety bees. From those painful events I learned a valuable lesson. We tinker with nature at our own cost. I, for one, am willing to deal with real-life stingers to taste the very food of the gods.

True bee or not true bee?


The Computer of Dr. Caligari

To be human is to be ethical. Not always in the best way, unfortunately. Nevertheless, our moral sensors are pretty much constantly running as we try our best to make the right moral decisions. This thought occurred to me while reading Jonathan Cohn’s article, “The Robot Will See You Now,” in this month’s The Atlantic. Having been a sideline watcher of FIRST Robotics for about four years now, I have heard countless stories of how robots perform some surgeries more efficiently than clumsy humans can. Cohn’s article starts off with the impressive potential of IBM’s Watson to sort through millions and millions of bits of data—far beyond any human capacity—and make more informed recommendations about medical treatments. After all, Watson won on Jeopardy!, so we know “he” is smart. But he isn’t really a he at all. Still, in our reductionist world where humans are just “soft machines,” computers and robots should be quite capable of helping us heal. To survive longer.

I am a veteran of Saturday afternoon science fiction movies and weekday episodes of Star Trek and Battlestar Galactica (original series, both). The present is starting to feel like that impossible future I watched as a starry-eyed child. But what of Dr. McCoy? I remember literally cheering (something I haven’t done much in recent decades) when DeForest Kelley’s name appeared in the opening credits of Star Trek when season three began. Bones was always one of my favorite characters—the doctor who didn’t trust the machines upon which he relied so heavily. He was a down-to-earth country doctor who seemed to feel cut off from the human (and occasional alien) element when machines were interposed between them. Medicine is, after all, a very personal thing. Our bodies are our souls. I know; scientists tell us we have no souls. Embodiment studies, however, suggest otherwise. That robot coming at me with needles and scalpels may know how to heal me, but does it have my best interests at heart? Where is its heart? Its soul?

Better health care is certainly much to be desired. But in a country where our lawmakers continually debate whether the poorest should have access to Watson and his ilk, I wonder where ethics has gone. Robot doctors, I’m sure, will not accept patients with no insurance. Does not compute. Having gone without health insurance myself for several years, despite holding advanced degrees, I know that if I’d had a health crisis I’d have been rightly ranked down there with the blue collar folk that I consider kin. You see, to be human is to be ethical. That doesn’t mean we’ll always make the right decisions. It’s a safe bet that Watson can play the odds mighty finely. And the soulless machine may be making the decisions about who lives and who does not. Now that I have insurance again, when I’m on that cold slab I may have a shot at seeing a robot doctor. If that ever happens, I’m going to hope that Dr. McCoy is at least standing in the corner, and that those waiting outside the comfortable walls of affluence will somehow enter Watson’s scientific calculations with me.


Robotics FIRST

Wired

I knew it! It was right there on the cover of Wired magazine: “The Robots Take Over.” And it arrived on the very day of the FIRST Robotics kickoff, the day when Dean Kamen and his team announce to thousands of high school kids, teachers, engineers, and interested parents what the 2013 FIRST Robotics competition will be, spurring us into six frenzied weeks of designing, planning, and building a robot to take to competitions. First Atlanta, then the world! It must’ve been their plan all along.

The article in Wired, by Kevin Kelly, does have hints of cheekiness throughout, but for the most part is on target. How many of us already use computers or some kind of robotic devices to complete our jobs? Kelly points to the inevitable: robots can do it better. The upside is that when robots take away jobs they create new ones, like Charlie Bucket’s dad getting a job repairing the robot arm that took his job away at the toothpaste factory. If you don’t want a tech job, too bad. That’s what the new definition of work is becoming, since labor is already being taken over by robots. Those who can look far enough ahead can see robots doing, as Kelly puts it, any job. What makes this sound apocalyptic to me is the fact that we, as a society, undervalue education. What will the undereducated do? Their jobs are the first to go. I feel the tremors of a revolution that hasn’t even started yet. People need something to do.

It is apparently without irony that Kelly suggests that any job people do, including in the service industry, can be done by robots. I am an editor. A robot may be able to find grammatical errors (Word and Pages already do this), but it can’t capture the soul of a writer. We write for the enjoyment of other people who experience being people in the same way that we do. There is an inherent arrogance in the Artificial Intelligence movement that believes (yes, it is a belief) that intelligence and mind are the same thing. There is no room for a soul in this machine. Many biologists would agree: we’ve looked, no soul. But even biologists know that they’ve got an identity, aspirations, contradictions, and emotions. It is the unique blend of these things that makes what we can, for convenience, call the soul. There are entire industries built around the care for that soul.

Many scientists are still betting on the end of religion, the ultimate repository of those who believe they have souls. Religion, however, is not going away. When we see robot psychiatrists, robot social workers, robot clergy, robot writers and artists, and robot Popes, we’ll know the apocalypse has truly transpired.


Battle Bots

Our local high school robotics meetings start up again this week. Actually, they’ve been going on all summer, since robots do not require the rest and mental downtime that we mere creatures of flesh do. Glancing through the headlines of the Chronicle of Higher Education, I saw a leading article on a topic I’ve been reading about: the military use of robots. On a college campus visit last semester I came across a robotics display and, since I’ve picked up some of the lingo, I engaged an engineering student sitting nearby. He told me that most of the funding for robotics at the collegiate level (there, anyway) came from the Department of Defense. Earlier this year I had read Wired for War, a book as stunning as it is frightening. In fact, P. W. Singer is cited in the article. What made this interesting, however, was the role of Ronald Arkin, a Georgia Tech professor of robotics and ethics. Dr. Arkin believes robots to be morally superior to humans at making battlefield decisions. He’s not alone in that assessment.

The more I pondered this, the more troubled I became. Morality is not a scientific purview. Ethics that have been quantified always fail to satisfy because life is just a little too messy for that. Who is more morally culpable: the policeman who shoots a thief dead, or the man who was only stealing bread because his family was starving? Hands down the most challenging undergraduate class I took was bio-medical ethics. It was thornier than falling into a greenhouse full of roses. Sick minds and reality cooperated to draw scenario after scenario of morally ambiguous situations. I left class with two more things than I’d taken in: a headache and a conviction that there are no easy answers. Having a robot vacuum your floor or assemble your car is one thing; having one decide whom to kill is entirely another.

The article cites the rules of war. The main rule seems to be that no matter what, some people will always kill others. We try to sanitize it by making the inevitable death-dealing follow ethical conventions. While religion often takes a bad rap these days, one of the things it is capable of doing pretty well is providing an ethical foundation. People may not always live up to the standards, but religions only in very rare situations give people an excuse to hurt others. Nearly all religions discourage it. The rules of a science-based morality would likely follow a logical algorithm. Unfortunately, there’s more gray than black or white in this equation. Algorithms, in my experience, are not so forgiving. So as I get ready for my first robotics meeting of the year I need to remind myself that robots are capable of great good as well as great evil. As with humans, it all depends on who does the programming.


Robo-Stop

I have just read the most disturbing book yet. And for me, that is saying something. The facets of fear that P. W. Singer’s Wired for War manages to cut are sharp and dangerous. That he was able to write the book with a good dose of quirky humor only slightly ameliorated the trouble. The subtitle of the book is The Robotics Revolution and Conflict in the 21st Century. I was drawn into robotics by the FIRST Robotics competitions in which my daughter’s school competes. Not an engineer or programmer, I merely sit on the sidelines and cheer along those who understand mechanics, hydraulics, and electronics. Quite often I get the sense that since science works so well, there is little room left for serious consideration of the humanities. Particularly religion. By the end of Singer’s book, however, my choices in life were reaffirmed. I would rather spend the limited days left to the human race celebrating our humanity. For, it seems, our days may be numbered indeed.

Considering that Wired for War was published three years ago, the technology must surely be even more advanced now than it was when the book went to press. That such technology as Singer describes exists is not in itself too much cause for worry, but the fact that it rests in military hands is decidedly disturbing. One of the few institutions able to tap into the tremendous budget of the United States with impunity, the military services have been able to commission robots that are even now deployed in our various conflicts. A strong ethical question runs through Singer’s account: we are racing ahead with lethal technology and artificial intelligence—and no one is really driving this machine. Shouldn’t someone be?

One of the more sobering aspects of Singer’s account is how humans are increasingly left “out of the loop” when it comes to lethal decisions being made by robots. Their logic is flawless, as is their aim. Their understanding, however, is purely mythical. As I read this gripping account, several issues spiraled out to be considered on their own. I arrived home disheartened and concerned for a future that seems to be inevitably in the hands of those I fear most: those with excess capital. Military robots do not possess empathy or compassion, just physics and trigonometry. And they already exist. When those powerful enough to wage war discuss the rules, their decisions are tellingly called “the doctrine of war.” Doctrine, whether military or religious, is always a sure sign of danger to come. And the robots aren’t coming. They’re already here.


Vampire Jesus

It was a dark and stormy night. Well, so far that could describe most any night in April or May of this year. Anyway, I had just read about vampire-bots for the first time. Robots, like all machines, require a power source. Those I’ve witnessed up close require rechargeable battery packs that are surprisingly heavy. I’d read that some robots were being designed to consume their own energy sources—mechanical and chemical eating, if you will. One dreamer figured that blood could work as a source of energy. A robot could be designed to draw energy from blood, and thus arises the concept of the vampire-bot. I don’t think such an insidious machine was ever really built, but it is theoretically possible. It is also a reflection of a biblical idea—the life is in the blood. Ancient people tended to associate life with breathing. With no CPR, an unbreathing body was a dead body. Blood obviously played into the picture too, but precisely how was uncertain. Clearly a person or an animal couldn’t live without it. To say nothing of robots.

One of those dark and stormy nights I watched Shadow of the Vampire. Surprisingly for a monster movie, Shadow had been nominated for two Academy Awards. Not really your standard horror flick, it is a movie about making a movie—specifically Murnau’s Nosferatu, the classic silent vampire movie that really initiated the genre. The actor cast as Count Orlok, however, is really a vampire. The premise might sound chintzy, but the acting is very good, with Willem Dafoe making a believable Max Schreck (vampirized). Stylistic rather than gory, the story plays out to its foreordained conclusion and the vampire disappears in the cold light of dawn.

When I was an impressionable child I was told what is likely an apocryphal story about Leonardo da Vinci. The story goes that the man who posed for Jesus in the Last Supper was also, after living a life of dissolution, the model for Judas. Willem Dafoe, of course, famously played Jesus in Martin Scorsese’s The Last Temptation of Christ. From Jesus to vampire. Both characters are bound by the element of blood. Christianity still celebrates the shedding of divine blood symbolically, while the vampire takes blood (also symbolically). Although the vampire cannot endure the sight of the cross, the same man effectively played both sides of the mythic line, almost as if the apocryphal story came true. There are implications to consider here, and not all of them insinuate Hollywood. On these dark and stormy nights, we have something to ponder.


Robot Crossing

With my new job I haven’t been able to be as active on our high school’s robotics team this year. Not that I ever contributed much beyond moral support, but there is a very profound satisfaction in seeing teenagers concentrating on such technological marvels and building self-esteem. Yesterday was spent at a regional competition. Noisy, colorful, chaotic—it was like being a teenager again myself. Over the course of the day I overheard engineers talking about the great technological marvels of the future made possible by robots. There is no apocalypse hidden in their endless optimism. We’ve got robots on the ocean floor and rolling around on Mars, snaking into our bodies even down to the cellular level. No end of times here, only forward motion. I know that computers now define my life. If I miss a day on this blog I grow dejected; one of my biggest worries about going to Britain later this week is how I will continue posting from overseas. But I sometimes feel as if our love of technology will be our undoing.

Experts—of which, I hasten to add, I am not one—tell us that within a lifetime artificial intelligence will be indistinguishable from real intelligence. As I watched the robots playing basketball (this year’s FIRST Robotics challenge), I began to wonder about the motivation of our robot slaves. Humans are driven by biological and emotional needs. Robots, as far as we can tell, do not want anything. It is a vacuous life. Yet as the robots played basketball all day, I noticed they didn’t suffer the obesity problems so evident among humans, nor the weariness that accompanies having to wake before dawn to catch a school bus to the competition. They are built for a purpose and they stick to it. Even as I watched hours of competition, I began to miss my laptop—driven by my own emotional needs as I am. I began to wonder who is really the slave here.

Last night my family participated in Earth Hour. We try to do it every year with a kind of religious fervor. Turning off all electronics, including lights, we sit in the dark and talk by candlelight. There is a profound peace to it. As my daughter commented on how spooky the shadow play could be, I imagined our ancestors who had no choice but to rely on pre-electric light in drafty houses where real wild animals still prowled the dark nights outside. How quickly that would become a trial for us. The same thought occurred to me as I watched M. Night Shyamalan’s The Village again last weekend. We are helplessly tied to our technological advancement. We might like to get away from it all for a few days or weeks, or even months. But we want the comfort of knowing that the robots are waiting for us when we turn back to reality again. Perhaps no apocalypse is needed after all.

Robot crossing


Send in the Robots

The FIRST Robotics kickoff is an event that is difficult to describe for those who’ve never attended. First, it must be noted that FIRST Robotics is sometimes described as “the varsity sport for the brain.” While engineering students with a penchant for athletics are not unheard of, the majority of robotics team students are not cut from the same cloth as the athlete. The FIRST kickoff, the first Saturday in January, is the opportunity for these kids to be told it is cool to be smart and that application of brain power is not the liability that many of the electorate seem to think it is. At this event the competition for the year is unveiled, and the kids (with some adult help) have six weeks to design and build and program a robot to do some very complex tasks. It is a season of sleep deprivation, programmed Saturdays, and the celebration of learning. Before NASA shows the game animation—the competition for the year—celebrities and other people in the public eye endorse the program. It is a time for praising the benefits of science.

Yesterday’s kickoff, however, was marred by the appearance of one of the guest celebrities. When George W. Bush was announced as a supporter of the program, a sense of disbelief fell over the room. This man who advocated for creationism in the classroom, who fought to stop research in cutting-edge disease control, who began a war as a personal vendetta, was showing his dully beneficent face on the big screen, telling the kids what a great program it was. A chance, as he said, to use your “God-given talents.” He ended his brief—and obviously scripted—sound bite with his characteristic “God bless you.” I could not stomach the hypocrisy. I’ve blogged about religion and the science of robotics before, but to have a president who did nothing to strengthen the cause of higher education and fought science with eight idle years on his hands was just too much. If I were Dean Kamen, I would have insisted that that clip be left on the cutting room floor.

The former W represented religion in its guise as the enemy of science. It should be clear to my readers that I do not believe science has all the answers, but I also believe it is wrong for religion to stand in the way of knowledge. Science is something to which we shouldn’t merely give lip service without backing it up with programs and funding. That one minute of disingenuous, religion-riddled speech trumped all the other endorsements, including the sensible one by Bill Clinton, who emphasized the need to work together even with those who are your opponents. This was a point W obviously missed. There comes a time when some public figures, like overused cattle, should be put out to pasture. There are some cowboys who should just stay on the ranch. I understand that presidential endorsements are important to FIRST, but in this case integrity should not be compromised. Especially when most of the teenagers watching the kickoff possess far greater potential than a mere politician elected on religious sentiment and dubious counting.

Does this face inspire science?