Empire State

Hegemony is a funny word. In studies of antiquity it is commonly found, since it denotes “leadership or dominance, especially by one country or social group over others,” according to the Oxford English Dictionary. Today it has a vaguely imperialist taint, although it doesn’t necessarily require that one nation actually pillage another’s wealth or resources. The idea that people are, and should be, free is pretty much assumed in developed nations. Or so at least our rhetoric dictates. The word hegemony came to mind, however, as I watched an interview with a corporate leader. He described how his company had budgeted for technology development on an increasing scale, to catch up with current developments, and then planned to level the tech expenses off so that the business could move on to its prime objectives. The reality was vastly different. Each year’s budget saw technology costs rise, and the trend shows no signs of slowing down. Every industry, it seems, will have to keep devoting larger and larger shares of its budget to technology. Hegemony.

It’s not that any one company is solely responsible for our obeisance to technology, so this hegemony has no head. It is the idea of progress gone wild. Last year as I set out for the American Academy of Religion and Society of Biblical Literature annual meeting, a notice popped up on my laptop that a software upgrade was available. Since a file that I required was no longer accessible unless I updated, I clicked through all the agreements and provisos that I can’t understand and began the upgrade. Download and installation time measured in hours rather than minutes, and I soon had to interrupt the process to get to the conference. This had consequences that nearly led to my becoming utterly lost in a part of Baltimore I’d been warned to avoid. The gods of technology demand their due. Now, less than a year later, I can’t access certain files unless I upgrade again.

Don’t get me wrong—I’m not a complete Luddite. I enjoy the instant gratification of finding information in seconds through a web search, but I’m not always sure that I can believe what I read. Technology means photos can be manipulated, sounds can be fabricated, facts can be created, all with no basis in reality. I used to have students ask me whether such-and-such a fact they’d read online was true. Facts, it appears, are now negotiable. Nobody’s really in charge, it seems. Instead we are led by the vague idea of progress, a new god with technology as its prophet. Even now I know people who think they never use computers, but they drive without realizing their cars are full of them, and turn on the television without realizing that the technology is no longer chip-free. Meanwhile those in the technology industry seem to have plenty of extra cash around, while those of us in the humanities ponder whether the ancient hegemonies have really changed at all. Let me look that up on the internet, once this upgrade is through.



Know It All

Perhaps it’s the fact that I had a career malfunction in the middle of my chosen vocation, or perhaps it’s a natural consequence of earning an advanced degree. Whatever the cause, I am convinced that I know less than I used to know. That’s not the same as not having learned—indeed, it is a consequence of precisely that. You see, my education has led me to believe that things I thought I knew were not, in fact, what I knew them to be. We all live with false assumptions—that the sun rises and sets, that the earth holds still, that we aren’t made up of particles so tiny as to be invisible and that are mostly empty space. There was a time when I believed that science gave “the truth,” but we now think of science as ever provisional—the best theory to account for the facts at this time. It is open to change. And in fact, we know very little.

A deep irony lurks in the fact that many people treat their religion as the locus of certain knowledge. It is treated as known fact: Jesus was resurrected, Muhammad was a prophet, Moroni spoke to Joseph Smith. When confronted with contrary data, such thinking withdraws into itself, claiming all the more loudly that it knows the truth already. Learning should, I think, make one more humble. More circumspect. Of course I think I’m right about things. If I thought I was wrong, I would change to the correct way of thinking. What I know, however, is a different matter. As I set out to learn a new career, I find I know less than I thought I did. I know little and I know less all the time.

As an academic I can’t help but spend my life trying to gain knowledge. I read voraciously, I try to engage in intellectual exchange with others. If I’m lucky, I learn something. And know that much less. That which I learn teaches me that I know less than I did before. The world is vast. The universe infinite, according to our best understanding at the moment. We travel through it all, picking up information and treating it humbly as we go along. I’m moving toward knowing nothing at all. Perhaps that is the true goal of all of this—to get to the point of knowing nothing. Then we shall be truly educated. Except, of course, for the true believers who already know everything there is to know. Of course we are all mostly empty space. I think.



Monsters of Science

Maybe it’s the Ebola in the air, or perhaps the gas from all the midterm election verbiage, but I’ve been on a monster run this October. I just finished Matt Kaplan’s The Science of Monsters: The Origins of the Creatures We Love to Fear. It is a charmingly written book, at parts approaching the finesse of Mary Roach. Beginning with the ancient Greeks (and sometimes stepping back into the world of the Bible or the Mesopotamians), Kaplan examines the major categories of monsters and tries to offer scientific explanations for why people came up with them. It is a keen conceit and it is deftly handled. Noting that animals sometimes got jumbled in the fossilization process, he offers explanations for creatures such as the Chimera, Griffon, and perhaps even the Sphinx. Some of the unlikely episodes are quite fun to visualize as well, as when a snake slithers over a tar pit where a goat got stuck and was eaten by a lion that also got stuck. Beast after hideous beast he brings down to analytical size, sometimes convincing even this old monster lover.

One of the problems, however, is that science often doesn’t comprehend the symbolic nature of mythical thought. Quite apart from sheer creativity—and it does exist!—some of the material in Kaplan’s analysis would have benefited from a mythographer’s look. For example, demons do not suddenly appear as monsters in the Middle Ages. Kaplan knows this, but that’s where he starts. The ancient Mesopotamians knew them well—perhaps too well. And Lilith isn’t even mentioned when discussing succubi. Still, there’s a great deal of interesting conjecture here, and some scientifically, if not mythographically, viable suggestions on whence vampires and werewolves. As expected, modern sightings of cryptids are simply swept off the table, but I almost shouted aloud when I read that he gave credence to Wade Davis’s work on Haitian zombies.

The larger question here is one of approach. Do monsters lend themselves to scientific explanations at all? The case that elephant/mammoth skulls might suggest a cyclops seems reasonable enough, and the occasional dinosaur bone that represented a giant in ancient times is entirely possible. (Who can tell one femur from another anyway?) But the monster is primarily a creature inhabiting the shadowy realms of religion and psychology. Our fears are seldom directed toward science, although, now that I’ve read his chapter on “The Created” I’m not so sure. Constructing backward toward the unknown is always a dicey proposition, as those of us who’ve studied history of religions know. We may be able to find the genesis of modern monsters, but, admittedly, the fun for most of our scary friends is that they are mysterious. Impervious, as it were, even to science.


Literally Biology

In a New York Times opinion piece on a recent Sunday (ironically, always on a Sunday), college biology professor David P. Barash published an article entitled “God, Darwin and My College Biology Class.” Barash notes that increasingly students come to his class thinking evolution is more or less optional. I found the same thing teaching religion classes. When student presentations at state universities addressed Genesis, it was fairly common for a large number of undergraduates to suggest that evolution is “just a theory” and “intelligent” design a viable option. I tried to show them in class that the Bible does not support the shenanigans that creationists impose upon it, but the message rolled off like rain from an evolved waterfowl. Still, I do have to take exception to some of Barash’s broad strokes. He feels that religion and science cannot coexist. I wonder, however, what he means by religion.

Religion is an ill-defined word. One of the most pragmatic usages I’ve heard is that religion is what people use to give meaning to their lives. Religions may be theistic or atheistic. Religions may be anti-science or pro. Religion, per se, is no threat to science. Fundamentalism is not religion. Fundamentalists use religion to further their ends, which are often political. Since many religions grew up around sacred writings, the urge was there from the beginning to take these holy words literally. They gave meaning in a pre-scientific era. Newton, Galileo, Darwin—and even before them Plato and Aristotle—simply shifted the angle of illumination. The problem is that many religious believers feel they have the answers already. New facts only confuse the issue. Left to their own devices such beliefs quietly go extinct.


It is only when a conscious decision is made to champion archaic writ against empirical evidence that science and religion join combat. Most religious people in scientifically advanced societies have no problems with evolution or particle physics. They simply show the way the world is. The vastness of the universe should give us all pause, and it does make you wonder which way to point your telescope to spy the almighty. I sympathize with Barash. It is not easy to find that many of your students, in either science or religion classes, come with their minds already made up. Still, it might help to realize that religion is not the culprit here. Literalism is a kind of mental problem. Until it is rightfully separated from religion we will all be left wasting valuable class time trying to convince students of the facts of life.


See Serpent

Seeing, it is said, is believing. I have a feeling that this truism may have become effaced somewhat in this age of deft photo manipulation and apps that are marketed to insert ghosts and UFOs into any picture. Nevertheless, anyone who has seen anything genuinely puzzling knows that it creates a lasting impression. A world without mystery, although a capitalist’s dream, is a nightmare for everyone else. So it was, now that October is here, that I settled down with J. P. O’Neill’s The Great New England Sea Serpent. I found O’Neill’s book in a used bookstore a few weekends ago (appropriately water-damaged), and since I have a fascination with the ocean and monsters, this seemed like it would appeal to both of my avocations. It did indeed. O’Neill isn’t a sensationalist writer, but rather a normal person with a normal job who has an interest in strange animals. From 1751 until three-quarters of the way through the twentieth century, people spotted a classical sea serpent along the New England coast, and occasionally on ocean voyages across the Atlantic. Of course, we’re told, sea serpents don’t exist.

The Great New England Sea Serpent is a compendium of sightings from many reliable witnesses over the centuries. Of course, to many it is impossible. To me this appears to be the same kind of arrogance we apply to the universe—if we haven’t catalogued it by now, it doesn’t exist—to suggest there are no monsters of the deep. As any oceanographer will tell you, we know more about the surface of the moon than we do about our own oceans. If you turn your globe (or app) just right, there are views of our planet where virtually no land is visible. We are a watery planet. Even with current technology, the deep ocean is difficult, and very expensive, to explore. Who knows what might be lurking there right off the bow? O’Neill’s account is full of old salts and snarky journalists, but at the core of it all is a humility in the face of the largeness of the sea. What do we really know?

Of course, there is a fear of literalism. The Bible (and other ancient texts) take sea monsters for granted. Leviathan is a dangerous beast and, no matter what the pundits say, is no crocodile. And yet, for the past several decades the reports of the New England beast have dried up. Where has our beloved sea serpent gone? I have to wonder, with both our polluting of the oceans and our increasingly efficient (and massive) ships, whether we haven’t simply forgotten that ancient maps used to leave space for dragons. Our great ships, guided by GPS, and our oceans running a temperature, are sure signs that greed has surpassed wonder. Have we, in our self-centeredness, slain the last of our dragons? O’Neill, please understand, does not call them dragons. Hers is a sober and straightforward account. But when October comes I just can’t help but hope there are still some monsters out there, deep under the waves.


Growing Up

I am in two minds about Sam Harris’s Waking Up. Literally. I haven’t read Harris since The End of Faith, and I have to admit that I found Waking Up to be a very engaging book. I can’t agree with everything Harris writes—that’s an occupational hazard of acquiring advanced degrees—but to have a scientist, an atheist no less, praise spirituality felt incredibly genuine. Spiritual experiences happen. I’ve had a few doozies over the years. I’ve also read a number of scientists who tell me they’re all an illusion. Harris admits that consciousness is a mystery. His use of “mind” instead of “brain” won me over from the beginning. I discovered that the atheist can also be a seeker. Dogmatism, of whatever stripe, is the enemy.

Harris has considerable experience meditating. This is no activity for posers or wimps. It is, despite minimal physical demands, hard work. Throughout the book we get the sense that Buddhism is among the least objectionable religions, when divested of its myths. I do wonder, however, if demythologized Christianity was ever given a fair chance. From my own experience, some of the selflessness advocated by Harris can be found in taking aspects of Christianity seriously. I understand, I think, Harris’s objections to religion. It can, and does, lead to horrors both obvious and subtle. Yet, every once in a perhaps great while, it does offer redemption. Meditation, for example, has its roots in religious practice. It is this that Harris calls spirituality. And it is good.

A Guide to Spirituality without Religion is an apt subtitle for this brutally honest and open book. Harris’s knowledge as a neuroscientist endows his ideas with great authority. He opines, and he is not alone, that meditation demonstrates that “I” is only an illusion. This loss of self will haunt me for some time. For decades “I” is all I seem to have had. Still, I am pleased to find an open-minded scientist on this same path I tread. Being raised both spiritual and religious set the trajectory of my otherwise logic-driven life. You can’t go back and change all that, but you can grow up. To read of Harris’s spiritual experiences in the geography of great spiritual masters as well as in the laboratory instills in this reader a profound hope. Whether or not this reader is merely an illusion. There may be a morning after this long night, after all.


Cuneiform Lover

I’m busy. Too busy most of the time. You see, I used to be able to keep my mental files neatly in order. Recall was swift and efficient. I suppose that was back when I was doing the job for which I’d been preparing my entire life. Then a midlife, unexpected career change shifted things a bit. That mental file that you always kept here has now been shunted over to there. I suppose I always knew this was coming, and that’s why I started writing things down. Of course, this led to stacks of papers and a whole series of notebooks that follow varying forms of logic. “Commonplace books,” as they used to be called. Then computers. I never used a computer until after my master’s degree. My wife showed me how. And then writing ideas down became pretty easy—who could ever afford more than one personal computer? And since it was as heavy as a small television (of the cathode-ray-tube variety, of course), you always knew where to find it. Then laptops. iPads. iPhones. Something called “the Cloud.” A computer on my person at all times and I still can’t find that ruddy file, and has anybody seen my phone?

I wrote an important (for me) paper back in 2012. Just two years ago. I remembered vividly typing it on my laptop, working on it for weeks. Recently I wondered where I put it. I searched my laptop. Not there. I must have backed it up. Checked my backup files, on CD. Not there. Where did I put the thing? Although a Luddite at heart, I don’t delete old files. Please, tell me I didn’t do something like back it up on a floppy disk! I can barely remember when we used those. No, no, it was much more recent than that. Was it on this laptop or the one before? Maybe I stored it on the hard disk of the antiquated one. When you get a new computer (or at least when I do) it is such a rare occasion that you don’t bother backing up every single little loose file on your old machine—there’s too much shiny new stuff to admire. But the file wasn’t there. Finally I attached a terabyte backup, admittedly overkill for someone of my limited mental ability, and searched. Although the icon said it was on the terabyte drive, the file was actually on the Cloud, and since I hadn’t updated my software in a while, I was denied access.

I learned to write with fallible pencil on cheap, lined tablet paper. Back when tablets were paper. Our ancient ancestors started the process by writing on clay. For some five thousand years this pressing-stylus-unto-substrate method worked fine. All of sacred writ was scrivener-mediated that way. When computers were new you stored your files on floppies. At least you knew where they were. Now dialogue boxes ask me questions in a language more obscure than Sumerian and quickly shuttle my files off to I-don’t-know-where, assuring me that I’ll be able to get them back. Honestly. As long as I remember to upgrade my system, which will, of course, require periodic outlays of substantial sums of money. You can choose not to pay, but your documents are with us. I’ve still got some clay here, and a sharpened flint taken to a twig will make a stylus, old school. And clay tablets have been known to last for millennia.



Fighting with Monsters

The Lady and Her Monsters reminded me of Gothic. A friend of mine in seminary showed me this “shocking” movie by Ken Russell just after it was released on VHS (I always was fond of ancient history). To my young eyes this was a challenging film, but it rated higher in moodiness than scariness. Roseanne Montillo’s book brought the movie to mind because, it turns out, several of the incidents in the movie were based on fact. Perhaps I need to take a step back because Gothic never made it big, and many may not realize that the movie is about the legendary night Mary Godwin (soon to be Shelley) came up with the idea for Frankenstein. In an early nineteenth-century walk of fame, Percy Shelley, Mary Godwin, Lord Byron, and John Polidori decided to write scary stories, as a kind of contest. Only two ever made it to print, Polidori’s The Vampyre, which inspired Bram Stoker’s Dracula many decades later, and Mary Shelley’s Frankenstein. The movie, with Ken Russell’s famous flamboyance, traced an unlikely story of the friends conjuring a ghost and then banishing it once again before the stormy night is over.

Ken Russell had the reputation for being obsessed with the church and sexuality. These interests are certainly well represented in Gothic, where Percy Shelley, famously an atheist and believer in the supernatural, struggles to make sense of it all. Polidori, Byron’s personal physician, is presented as a Catholic who admits, when each has to confess his or her deepest fears, that God terrifies him. The friends (perhaps in an unwitting prelude to a television series by that name) explore sexuality and the supernatural through the long night. Waking nightmares meet them at every turn. They even have a skull of “the black monk,” a character attested in all sincerity, at one time, at the most gothic seminary in the Wisconsin woods.

“He who fights with monsters,” Friedrich Nietzsche opined, “might take care lest he thereby become a monster. And if you gaze long enough into an abyss, the abyss gazes also into you.” That which we worship is that which we fear. Certainly the Christianity of the Middle Ages had as much of Hell as of Heaven in it. Bursting out into the light of rationalism, it seems, did not banish the darkness after all. We still have many questions left unanswered, and many intelligent people have begun to question whether any one paradigm fits all of the evidence. I suspect not. Human experience goes in multiple directions at once. We have ladies, we have monsters, we have scientists, we have God. And on rainy nights we have movies that make us see that we have combined them all into a tale often repeated but never fully understood.


Dark and Stormy Night

I miss my monsters, especially when I stay away too long. I had eyeballed Roseanne Montillo’s The Lady and Her Monsters: A Tale of Dissections, Real-Life Dr. Frankensteins, and the Creation of Mary Shelley’s Masterpiece nearly a year ago in a busy Port Authority bookshop, and wanted to curl up with it right away. Well, work and the world intervened, but finally I found time for the beast. Although a member of the monster kid generation, as a child I never felt much kinship with Frankenstein’s creation. I think it is because there was so much human intention involved in his origins. Almost ungodly. Too godly. Vampires and werewolves, and even mummies, seemed to have come up on the wrong side of a curse and couldn’t be blamed for being what they were. Frankenstein’s monster had a willful, if neglectful, creator. A human being, and fully so. There was, it seemed, some kind of blasphemy at work here.

Montillo’s book, however, gives me pause to rethink this. I had never realized, for example, that Shelley’s book unfolds over nine months, and that Mary Godwin Shelley had suffered as her own fate unfolded—or unraveled—after Percy Shelley’s death. Nor had I stopped to consider that in the lifetime of these young lovers scientists and poets were overlapping careers, with philosophy holding them together. I also hadn’t realized that Percy Shelley shared his beloved’s enchantment with the fantastic. But Montillo gives us so much more, wandering through the seedy world of body-snatchers and scientists who experimented on the dead, often with an eye toward a secular resurrection.

Frankenstein’s monster has, of course, become an instantly recognizable fixture in our society. Indeed, it is almost the definition of monstrosity: the ultimate mischwesen while being technically only one species. A creature that crosses boundaries and is both dead and alive, a miracle and a curse, innocent and evil. Montillo places this creature in the context of a world where galvanism was thought to bring life and medical schools scrambled to find corpses to dissect and on which to experiment. A world where the Shelleys would visit Lord Byron and Polidori, literally on a stormy night, and give the world both Frankenstein and the prototype of Dracula. Where the three men of that night all died prematurely and tragically, survived by a struggling Mary who lived only to fifty-three and who gave the world one of its most memorable nightmares. Horror fiction was, and is, considered lowbrow entertainment, but there is something profound here. And we are richer, if more unsettled, for having it.


Sustain This

“Grant me chastity and continence,” Augustine famously prayed, “but not yet.” That tragicomic scene kept coming to me as I read Jeremy L. Caradonna’s Sustainability: A History. Few ideas can bear a greater weight of irony than that of a human population destroying its own and only planet. We know we’re doing it, and yet for a few more greenbacks to flash before our fabulously wealthy peers, we don’t mind warming the place up by a few degrees. There’ll be time to throw on the brakes right before the crash. I once took a ride, perhaps unwisely, with a friend who believed nothing could go wrong. Although the car didn’t actually roll, it came awfully close, and I am haunted by what might have happened. The difference between that incident and destruction of the environment is that the latter has already happened.

I’m cynical enough not to believe in simple solutions to complex problems, and reading Caradonna was a sobering finish to a day that had started out optimistically. Such books are not easy to read. Science does not come charging over the hill like the cavalry to save us at the end of the picture. You can’t take your marbles and go home when the marble is home. Sustainability does not set out to be a bleak book. There is a guarded optimism to it, and I was particularly pleased to see that sustainists readily recognize that without some kind of just distribution of goods, no system will ever be sustainable. I had no idea, until I read this book, that some economists advocate for a steady state economic existence instead of the ridiculously illogical constant growth. Constant growth in a world of limited resources is the worst kind of delusion.

We’ve gone pretty far down the road to destroying our planet. Already it will take many decades to repair the damage done, and that only if we can muster the will to address corporate greed with a good old dose of primate ethics. Society, in all honesty, may have to collapse before that happens. If it does, of course, the strong will survive. I have a prediction to make, and it’s one that rationalists won’t like. If we can’t avoid the wall that is right before us, and if society as we know it buckles under its own greed, the survivors will, as sure as gray matter, devise a religion to explain it. As a species we are all about myths. Like fabled saints we believe we can have our fossil fuels and consume them too. Before rising sea levels wash us all away, do yourself a favor and read Sustainability. And when you’re awake for nights afterward, tell everyone else you know to read it too.


Waking Up in Galilee

One voice can’t be heard. Unless, of course, it has a publicist. For years, it seems, I have been suggesting in my obscure corner of the internet that we’re not quite ready for the death of religion yet. I’ve never really doubted science, but I have noticed that science frequently draws the same conclusions as religion. Evolutionary biologists and neuroscientists exclaim, with some surprise, that religion has a survival advantage. Of course, big men with white beards sitting on thrones in the sky just won’t do, but the underlying concept has utility. So we’re told. Now Sam Harris, author of The End of Faith, one of the four horsemen of the new atheists, tells us that it’s okay to experience what has been known as, conventionally, a religious experience. Call it transcendent (I always do), but no matter what the chemical mix you concoct in the brain, it will feel good. Perhaps better than anything merely biological ever will. You’ll sell a million books. If you’ve got a publicist.

To me it seems that the religion question is a no-brainer. It wouldn’t persist if we had no need of it. Unlike the appendix, which seems not to have taken the hint that it is entirely vestigial, religion helps people (and perhaps some animals) survive. It doesn’t have to be sitting on an uncomfortable pew on a Sunday morning. It might be in the giddy heights of the Rocky Mountains where you can see to eternity and beyond and the rarity of the oxygen makes you lightheaded with a hologram of immortality. It might be the piercing peace that comes with light refracted through a glass so blue that superlatives fail you. It might be in imaginary vistas of an ice-bound Arctic where, you’re just certain, Nordic gods linger just out of sight. Transcendence can even come from traditional religious experiences, or so the stories of the saints proclaim. Anyone can participate. Those who have, never forget.

The New York Times, in a Sunday Review piece by Frank Bruni (“Between Godliness and Godlessness”), introduces Sam Harris’s new book, Waking Up. I know I’ll read it. According to the article, Harris discusses his own experience of transcendence. When Harris has such a revelation, it is a best seller. Or it will be. For those of us who quietly suggest moderation between bombastic religion and bombastic science, it is merely another day in the life of the quiet ones who observe without being heard. True, it takes courage in this culture to deal religion a knock on the head. It is not, however, going to send faith to a premature grave. We still need our religion. We might not call it that any more. Name it spirituality, or transcendence, or mystic mumbo-jumbo, but when it hits you it’s like an atheist in Galilee. Some call it an electrochemical reaction in the brain. Others call it walking on water.



Quantum Uncertainty

Physics has moved beyond the point of comprehension for the average citizen, if I might be permitted to class myself as that. I got the concept of the atom, although I always wondered about the spaces in between. No god-of-the-gaps there, but it didn’t fit with experience that everything was full of holes. An article my wife sent me now has me wondering if I’m a hologram. Physicists began to lose me with quarks—I can understand atoms being made of something, but what of ups and downs and leptons every which way to Sunday? Then string theory. Then those particles that can be two places at once, until you look. And now I’m being told that The Matrix may be more fact than fiction and quantum uncertainty rules the day. Indeed. Physics tells us what we’re really made of. Religion used to tell us what it all means. That precarious balance seems to have tipped: religion is left with no role but motivating violence, and science will save us. Help me, Neo!

I can’t even figure out my taxes any more, let alone what the universe is made of. How we could all be jittery two-dimensional particles is unclear to me. Well, the jittery part I get. I was never really satisfied being limited to three dimensions of motion. Is it ever clear which way is really forward? Height and depth seem terribly geocentric, and even a circle could be divided into more than 360 degrees, a legacy of our Mesopotamian forebears. Spheres—my primitive view of atoms—only touch at the edges. I think there must be something more. Then comes the math. The truth is in the numbers, it seems. Glad I have a calculator.

Although I don’t have the weak nuclear force at my disposal, I have tried to build with marbles many times. You can’t build upward without the bottom row rolling away. Perhaps in our world spheres just don’t balance that way. They don’t hold together. Pixels, however, have edges. They seem to fit together more fully, but leave the universe full of jagged edges. That fits much better with my experience, I guess. Shards of reality lie all around me. Religion used to be the way of putting the pieces together, but, I’m told, that’s all a myth. Instead we have a universe that the average person is incapable of understanding, and that seems to be held together by forces that are fully explainable only by math. Once upon a time, Hell was a mythical, fiery place underfoot. Now it is a universe of formulas and equations that are held together only by quantum uncertainty.

"HAtomOrbitals". Licensed under Creative Commons Attribution-Share Alike 3.0 via Wikimedia Commons - http://commons.wikimedia.org/wiki/File:HAtomOrbitals.png#mediaviewer/File:HAtomOrbitals.png



Religion, Technically

One of the truths of history is that technology has always been with us. Reading Steampunk stories always boosts my historical sense of the interaction of technology and civilization. Civilization, to the best of our knowledge, coalesced around the idea of religion. Kings rule at the behest of gods because, if it came down to just a matter of swords and games of thrones, there’s always somebody willing to die for the sake of challenging authority, or taking it over. Unless the gods give it to someone. With this in mind I read Daniel R. Headrick’s Technology: A World History, a brief exploration of how we progressed from an initial band of scared apes two-footing it across the savanna with pointy rocks to a hive mind (not his word) connected by smartphones. The whole trip may have taken millennia, but once we reached a couple of flash points (the “Big Bang” of about 70,000 years ago when abstract artifacts began to appear, and then the birth of civilization about the time Sumer was organized) things sped up at a dizzying pace. Despite the anti-science rhetoric of the Religious Right, there’s no denying that we’re not in Eden any more.

We are accustomed to think of technological development as being cold and rational. Trial and error, based on brute mental power willing to bully through the dark forest of superstition, leading us to new heights. But from the early technology that led to Stonehenge and the pyramids to the coded message “What hath God wrought?” religious wonder has stood behind technological development. Indeed, in reading Technology it became clear that up until very recent times scientists got along with god, and sometimes even shared the credit for the devices they created. Reading about the Antikythera mechanism had me thinking along these lines: if someone had invented a kind of computer millennia ago, why didn’t it transform society in the first century before the Common Era? The answer can’t be that it sank beneath the Mediterranean, because other such devices likely existed. Why no Roman Empire Industrial Revolution?

Headrick makes it clear that early societies sometimes did not promote technologies. Technology was not just a matter of what we could do, but it was a means of social control. Those who charted the flow of wealth and power would have interest only in technologies that enabled the continued growth of that system. All the rest was just icing. People knew the basics of electricity long before a practical use was found for it. Petroleum products were known even to the Sumerians. The wheels of industry, however, are greased by more than just oil. We construct worlds, and gods used to direct our efforts. Now we let our technocrats call the shots. We write blogs wondering how religion fits into a nano-tech world. There may be some logic in it, but once we’ve left our footprints on the moon—who used to be a god—we’ve replaced the deities in the celestial sphere with those in our own heads. And there’s no going back.


Clockwork Heavens

In a museum in Athens sits a device chock-full of gears and cogs and dials. Indeed, it looks quite a bit like the movement of a pre-digital clock. This particular object, known as the Antikythera Device, is what would sometimes be labeled an “out of place artifact” were its provenance not so well attested. History doesn’t always play fair. Jo Marchant’s Decoding the Heavens: A 2,000-Year-Old Computer—and the Century-Long Search to Discover Its Secrets tells the fascinating story. Discovered by sponge-divers blown off course by a storm in 1900, the sunken ship at Antikythera became the site of the first ever attempt at a shipwreck excavation. Even today underwater archaeology presents numerous challenges, but at the turn of the previous century even land-based archaeology was a kind of glorified treasure hunt rather than an attempt to reconstruct ancient history. As the divers visited and revisited the site into 1901, they discovered ancient Greek statues that are among the best preserved from the ancient world. They also found the corroded box of gears that nobody really noticed for several months.

Marchant carefully unravels the slow process of discovery, acclaim, and forgetfulness that accompanied learning about this highly advanced computer. As with many other important finds, World Wars I and II led to distractions that made history somewhat less appealing than killing millions and then trying to recover from the damage. (The Ugaritic tablets, as I’ve often suggested, suffered a similar forgetfulness for being found at the wrong time.) As scholars, usually only one or two in a decade, began to notice the Antikythera mechanism, it became very much an object out of time. A sophisticated computer for calculating the movement of the sun, moon, and planets, the device could also show the phases of the moon, predict eclipses, and keep track of the Saros, Metonic, Exeligmos, and Callippic Cycles (18, 19, 54, and 76 years in duration, respectively). These cycles accounted for the adjustments needed by leap years and other fixes in the modern calendar. I can’t even keep track of Daylight Saving Time.

Adding to the mystery and drama, the Antikythera Device dates from the first century BCE, a time confirmed by radiocarbon dating and the presence of coins found on the ship. It is unknown who made it, but the influence of Archimedes is implicated. A similar device would not be known for another 1500 years with the beginnings of the Early Modern Period. The Roman Empire, which held power in the Mediterranean world at the time, was on its way toward the legendary decadence that would lead to its inevitable fall. It seems that a culture based on military might had little use for academic devices that were literally centuries ahead of their time. History does not repeat itself precisely, but broad strokes may often reveal more than passing similarities. And for those who want to discover a computer that shouldn’t have been, Marchant’s book is an excellent introduction to how the wisdom of the ancients still keeps us guessing.


Religion Fiction

Children brought up in a religious environment, according to a recent BBC story, are more prone to believe in fictional characters. The story, based on research from my alma mater, Boston University, suggests that if children are taught to believe miraculous stories at a young age, they will more likely believe that fictional figures are based in reality too. Undoubtedly this will be seen as yet another brick in Montresor’s wall by those who can find no good in religion. The reasoning will go something like this: believing in no religion is the “neutral” position. If we raise children in a religious context, we are inclining them toward a fictional belief system and making them less likely to reason their way out of it. Therefore, we should raise children secular.

Even in the BBC story there are dissenting voices. Perhaps children who learn about Jesus find Thor a more compelling character. Perhaps they are open to possibilities that logic shuts out. Our brains have two hemispheres for a reason. I often wonder whether it is possible to be fully human while ignoring about half of what evolution gave us to work with. Logic tells me that religious belief serves a survival function. And my creative side still appreciates the possibilities that my Manhattan brain is forced to shut down every day when I punch the clock. If there’s nothing more than work, perhaps believing in fiction serves a valuable function after all. But I suspect this is playing right into the rationalists’ hand. Pass me another brick, will you, Fortunato?

The jury, however, is still out on the nature of reality. Even for materialists. Gods of the gaps tend not to survive very well. The question is actually much larger than that. We don’t know the nature of ultimate reality. We’re not even sure what reality is yet. Can a parent who believes in God, after the experience of growing to maturity in a heartless universe, be blamed for teaching their children the same? No humane parent raises their child purposefully teaching them falsehoods. Yes, some children are damaged by religious upbringings. Some are damaged by materialist upbringings as well. What seems to have shifted, in my humble opinion, is the popular perception of religion. What used to be understood as the foundation of a civil society is now challenged as a harmful fantasy that encourages children to grow up into terrorists or non-functioning adults. The belief that we can raise children with no biases, however, is clearly fiction. Until we have the full truth, there should be room for both Gilligan and the Professor on this island. But then again, I was raised to believe in the divine world, so what do I know?

Fact or fiction?
