Some historians, I suspect, despair of the volumes already written on all periods, ancient and modern. History, however, can cover a variety of segments of time. A. Roger Ekirch, for example, thought to write a fascinating history of darkness. At Day’s Close: Night in Times Past is a thoughtful exploration of what night has historically meant. For those of us born in the era of constant light, the realities of what it was like after dark in the past are almost unimaginable. I’m sure Ekirch didn’t intend it to be so somber, but the reality of frequent crime in unlit regions made reading much of this book a sobering experience. Confident of their ability to get away with it, many people up until modern times took advantage of the night to commit all kinds of crimes. Makes you kind of want to check the door locks once again before turning in.
Much, but increasingly less, of our lives is spent in sleep. The advent of artificial light has led to changes in lifestyle that, biologically speaking, aren’t terribly healthy. The natural rhythms of sleep and wakefulness may be challenged by jobs or enticements to stay up late and surf the web or any number of other factors. Now that light has become so very portable in the form of smart phones and tablets, even less darkness remains. Although it’s a bit repetitive and not laid out chronologically, At Day’s Close contains many provocative observations about the dark. There’s even a bit about monsters.
Night has always been symbolic. Death and fear were associated with night early on, and we still call the era of scientific progress the Enlightenment. Churchmen (for it was an age of such) tended to condemn night as evil, the Devil’s time. Not all agreed, for some considered the dark a creature of God and complained of the hubris of setting up lights at night. The religious symbolism of night is very rich and very ancient. Those of us used to artificial light have difficulty imagining what it took to navigate after dark in ages past. More than an annoyance, night could be a truly dangerous time. Dreams, once mainly the products of the night, also had religious significance before being rationally explained. The industry of banishing night also has, in some respects, the effect of banishing dreams. We should stop and think before we put night to flight. Half of our time is spent in darkness, and all of our time is highly symbolic.
Who doesn’t have a devil of a time keeping up with technology? My day is divided in almost Manichaean terms between having internet access and not. Once I climb on that New Jersey Transit bus—they don’t have restrooms, let alone wifi—I enter radio silence for God knows how long. Safely ensconced at work, I have the net again, but I can use it only for work. The even longer commute home spells the end of internet access for the day, since supper and sleep await at the other end of the line. So when websites change in the course of a day or two, it’s difficult to keep up. The other day, for instance, I noticed on Wikipedia, in an article about the Devil, that the dark lord has a coat of arms. “That,” I thought, “would make an interesting blog post.”
That idea, like most of mine these days, had to be put on hold until after work. And between after work and getting ready for work again, the delay lasted a week. Maybe two. Then I went back to the page and the reference was gone. I can still remember that the coat of arms had three frogs on it—somewhat unfairly to amphibians, I felt—and I even recall precisely where on the page it was. When I finally had time to look it up, it was no longer there. Cached pages used to be easy to find, but who has time any more? There’s a reason that people of my generation still prefer print books. Yes, there are times when it’s difficult to remember where you read something, but at least the reference is still there when you open the cover again. It hasn’t vanished in a fit of editing enthusiasm. The strangeness of it all was worthy of comment—a coat of arms was a sign of medieval prestige. There’s no doubt that the Devil had his day in the Middle Ages.
I hear about people being bored in retirement. I’m so busy, though, that I’m going to have to request a desk in the afterlife. Not that retirement’s anywhere within sight, but I have so many projects going that I don’t know when I’ll ever have time to finish them all. Even a holiday weekend’s too short to make much of a dent. I don’t need another technologically driven mystery to occupy any more of my waking hours. Looking for a Wikipedia factoid that was deleted doesn’t make it any easier. They say the Devil’s in the details, but that presumes you can find the details where you left them. And if you happen to find the reference, can you please also keep an eye out for my car keys?
An occupational hazard of the editor is paying obsessive attention to publishers. That stands to reason. Many academics are less concerned than some publishers think they are about such matters as who publishes their book. I suspect that many have, for whatever reason, found no welcome home among elite publishers. This happens often enough to make many scholars less worried about reputation than the practical matter of getting a publisher interested at all. There are a lot of original thoughts out there, and some of them occur to a person and just won’t let her or him go. An example: what terms are used for weather in the Psalms and why? Before you know it you’ve been waking before the sun for five years and have written 75,000 words on the topic, and you want to get it published without having to pay someone to do it. That kind of thing. I’ve been pleasantly surprised by the number of scholars who understand this kind of reasoning.
Also, it’s a matter of scale. I work for a premier publisher in the academic world. It may surprise many people just how often, when someone asks what I do (not very often, for the record) and then follows up with who I do it for, the interrogator has never heard of my employer. Academic presses, even important ones, are really only known among academics. Keep scale in mind. If you’ve ever walked past Norton’s offices in Manhattan, and then those in which I spend my days, you know what I mean. Academia is small scale. For the average person, reading a book is something they generally choose not to do. Of those who do read, very few read academic books. Those who read academic books tend to stick to their own discipline, or related ones. You get the picture—smaller returns at each step.
So, having written a book about horror movies, where do I take it? This isn’t one of those footnoted, look-how-erudite-I-am kinds of books. It’s more of an I-noticed-something type. The question then becomes, who publishes this kind of thing? I do worry about academic reputation—who doesn’t?—but this is a book I want the right readers to find. That’s why McFarland suggested itself. People who read about pop culture know to keep an eye on its offerings. Hopefully enough people will find it to justify the effort. It won’t impress those enamored of collecting (academic) names. It isn’t the kind of book my employer would publish. Nor would I want them to. Call it an occupational hazard. Like any subject, knowing too much about publishing can take away from the fun.
Getting to the movie theater is not only costly, but increasingly difficult to schedule. This can be problematic for someone who likes to write about movies, but the realities of the commuting life aren’t very malleable. So it was that I finally had a chance to watch Arrival on the small screen. It had been recommended, of course, and although it’s not horror it has aliens and a linguist as the hero—my kind of flick. Once it began, I wondered if religion would play any role in the story. Alien contact would certainly rate as one of the more formative religious events of all time. The only obvious reference, however, was the suicide cult shown in a news story in the background, its members immolating themselves as the aliens became known.
Louise Banks, a linguist who has security clearance, has a sad story. Spoiler alert here! If you’re even more tardy than me you might want to fire up Amazon Prime and read on afterward! The movie opens with her watching her daughter grow up, only to watch her succumb to a rare disease as a young woman. Then the aliens arrive and she’s whisked off to Montana to try to communicate with them. It’s only after repeated encounters, learning the written language of another race, that she asks who this little girl she keeps dreaming about is. The child is in her future. The aliens see time as cyclical, not linear, and by learning their language she begins to think like them—knowing the future holds a tragedy for her. The intensity of the experience makes her fall in love with Ian Donnelly, another academic, who will become the father of her child but who will leave when she reveals the future to him.
Just as the aliens prepare to leave, not religion but philosophy takes over. A question posed by none other than Nietzsche goes: if you could live your life over exactly the same as you lived it this time, would you? Nietzsche’s point was that those who say “no” deny life while those who answer in the affirmative, well, affirm it. Ian says what he would change. Louise, however, embraces life with the tragedy she knows will inevitably come. While religion is off in a corner doing something that shows just how nonsensical belief can be, philosophy stands tall and faces the difficult question head-on. Although the movie follows some expected conventions—aliens bring peace but militaries want war—it rests on a profound question to which, I’ll admit, I haven’t got an answer.
An article a friend sent me from Science Alert back in December recently came to mind. Titled “Thinking About God Might Make You Sweat, Even if You’re Not Religious,” the article by Brittany Cardwell and Jamin Halberstadt discusses how religious ideas are deeply ingrained in human psychology. Like people who say they’re not afraid of spiders or snakes, people who don’t believe in the supernatural have made an effort to become this way. For reasons poorly understood, human beings are natural believers. As the article takes pains to state, that doesn’t mean a non-believer isn’t sincere. Thinking, however, doesn’t come only from rationality. Many people hold to the Mr. Spock fallacy—the belief that reasoning can solve anything. We all know from experience that it can’t. The big decisions in life—whom should I marry? What house should I buy? For whom shall I vote?—are often made emotionally rather than rationally.
Which one’s the captain?
Reason has taught us to be expert deniers. We can learn to overcome our natural aversion to snakes and spiders, and we can learn not to believe in God. Sometimes that belief can even be knocked out of us by the silly, unthinking behavior of “true believers.” But deep down it’s still there. Funnily, those who claim that reason alone answers all things are in denial about their own evolution. The human brain is a direct adaptation of the “reptilian brain” with its fight-or-flight impulses. That viper doesn’t plan to bite your ankle—it’s reacting out of fear. Emotions are an integral part of thinking. Crimes of passion are sometimes committed by otherwise rational people. That thing you keep bumping into in the room is, in fact, an elephant. As irrational as that may seem.
The Science Alert article discusses the empirical evidence that people fear to diss the Almighty. Were the brain a computer, I’d say it was hardwired into us. We’re not wire and circuits, however. We’re messy, organic, evolving stuff that at one time lived beneath the waves. It took a certain amount of lungfish faith to believe we could survive on dry land. As mates approved of such irrational behavior, the trait multiplied and became more common. Today our smart phones and our cubicle window posters tell us there’s no such thing as a deity beyond our own scientific rules. The truth is, however, at some level we don’t really believe it. You can learn not to believe, but you’ll still sweat the big stuff, even in laboratory conditions.