Outernet

Once in a while (ahem), I interject a note of caution regarding technology.  This blog has been part of my daily routine for over a dozen years.  I try to post every day.  When I experience life outside I often think “that would make a good blog post.”  I make notes.  I ruminate.  One of the things I caution about is the fragility of tech.  In order for me to post these thoughts many different components have to work just right.  Not only that, but if I want to pay bills, or, more importantly, work so that I can pay bills, I have to have internet.  Everyone in my family uses it and they do so all day long.  This weekend is the long-anticipated Project for Awesome (check it out at projectforawesome.com) sponsored by the Vlogbrothers, John and Hank Green.  If the names are familiar it’s perhaps because I’ve read and commented on their books.  Then the internet went out.

Late on Friday afternoon, of course.  Now we’ve had outages before—most recently after a power outage earlier in the week.  I called what used to be RCN, the only service provider in our area, only to be on the phone for half an hour with a tech.  She talked me through the usual rebooting and system checks.  The router was fine, but the only actual connection to the internet is via wifi mediated by a device called Eero.  There’s no ethernet cable (as if Apple laptops even have ethernet ports any more!), no phone line plug-in (ditto), nothing.  Nothing but Eero.  Apparently Eero had died.  And since it’s the weekend, a masked tech can’t be sent until Sunday afternoon.  So Friday night with no Disney Plus and Saturday without the long-anticipated Project for Awesome (you really should check it out).

Then my wife noticed her phone could act as a wifi hotspot.  It felt like we were entering a new world of magic.  (And data bills.)  Her iPhone could convert the 4G signal it received into wifi for the laptop.  It wasn’t ideal, because we have three people who want to use the internet.  With old tech.  All because one component of RCN’s complex system has x’s for eyes.  We had to play Wordle through her phone.  Watch Project for Awesome (it supports charities!) through her phone.  I don’t know, maybe we are even breathing through her phone.  Once in a while I interject a note of caution regarding technology.  This blog post is brought to you by my wife’s phone, acting as an internet hotspot, before anyone else awakes this Saturday morning.

Ancient history!

Healing Borders

Sometimes you read a book that just gets your head buzzing.  Brett Hendrickson’s Border Medicine: A Transcultural History of Mexican American Curanderismo is one such book.  It brings together so many areas of fascination: healing based on different belief structures than scientific medicine, the role of community in avoiding cultural appropriation, and the cultural blending that takes place at all borders.  The myth of the “pure” has long created problems, particularly in the realm of religion.  Things blend.  They always have.  And this includes belief structures, faiths, religions.  This is most obvious at borders, which makes them very interesting places.  Officially we police them, wanting to keep what is “ours” and keep “them” out.  In reality we are blending with each other and that’s not a bad thing.

In much of educated society, it’s assumed that scientific medicine is the only valid kind.  There are those even among the schooled, however, who pray for the ill.  Curanderismo is a form of folk healing that involves cures that would be rejected out of hand by science.  In a materialist, chemical world, only this can heal that.  Curanderismo looks at things quite differently.  Its practitioners don’t charge an arm and a leg for their work.  They are extremely popular.  And they heal people.  This is part of what makes Hendrickson so wonderful to read—he doesn’t assume up front that this doesn’t work.  Some analysts treat this kind of thing from a perspective of cultural superiority, as if scientific medicine is the only real way to treat illness.  Cultures, however, can heal.

Culture is something we value because it makes us feel secure and comfortable.  We know what to expect.  We speak the language, know the conventions.  (It would help Democrats, I think, to realize that although we’re trying to dismantle xenophobia, it is still very much intact in most of the world.  People follow autocrats because they’re afraid.)  We all live near borders.  Our personal border may be the wall of our apartment or the front door to our house.  It may be the Protestant/Catholic next door (or Buddhist/Atheist/Muslim/Hindu/Agnostic).  It may be the middle class/working class person who lives across the street.  In one town in which I lived it was literally those on the other side of the railroad tracks.  We draw borders for protection, but what Hendrickson shows so clearly is that they can also be places of healing.  


Routine Change

A certain type of mindset thrives in routine.  Perhaps you’ve noticed that these posts appear each day about the same time.  This happens because the routine states that work comes next and it will be largely the same day after day after day.  After work there’s also a pattern until I fall, exhausted, into bed.  Hit repeat.  In the midst of this routine change has crept.  Partly it’s the pandemic, but mostly it’s technology.  And spending habits.  People don’t buy academic books like they used to.  Overall books are booming—so much so that paper shortages aren’t uncommon.  In order to try to keep up with electronic lifestyles, publishers have to integrate the newest technology and to do that everyone has to learn far more tech than technique.  The pace of change is dizzying.

For those who thrive on routine, such rapid-fire alterations make it feel like we need a personal change manager.  “How do I do this now?”  The way we’d done it for years has suddenly shifted and it is only one of many moving parts.  Meanwhile, outside work, other aspects are shifting even as many people still survive without computers at all.  We’re left, those of us tied to routine, in a haze of uncertainty.  It’s like that dream where you’re driving and you can’t slow down but you can’t see out the windshield either.  To make it through we look for routine.  I type these posts on a laptop.  I prefer to write things out by hand, but there’s no time for that any more.  The routine has been broken and the shop that repairs it has gone out of business.

Perhaps this is a malady of those of us who look to the past.  Technological changes used to be measured in centuries, not seconds.  Ancients thought a spout on a jar was a pretty rad invention.  For a hundred years.  Maybe two.  Now if you don’t buy a new iPhone every couple of years you’re hopelessly outmoded.  What was my routine again?  I still awake at the same time and begin each day with writing.  I’ve learned to do it via laptop.  Then it’s to the work laptop where updates seem to be loaded daily and I’m the dog chasing that stick now.  I wonder whose vision we’re following?  Technology’s in charge now.  The rest of us mere humans should be able to get along, as long as we establish a routine of routine change.


Thinking about Thinking

I’ve been thinking about thinking quite a bit.  My lifelong fascination with religion is part of this, of course.  So when someone pointed out Bridget Alex’s article “The Human Brain Evolved to Believe in Gods” in Discover, I had to ponder it.  The idea, here supported by science, is that people evolved survival traits that lent themselves to religious belief.  Religious thinking was a byproduct that eventually took on a life of its own.  Evolution works by giving a reproductive advantage to one trait over another—which is how we get so many types of dogs (and maybe gods)—and those traits that disposed people to be religious did just that.  Elaborate religions evolved from these basic traits.  Alex suggests there are three: seeing patterns, inferring intention, and learning by imitation.

While there’s a lot of sense here, the reductionism doesn’t ring true.  The need to explain away religion also seems uniquely human.  Ironically, the idea that we are somehow special compared to other animals derives from a biblical worldview from which science has difficulty divorcing itself.  One of the greatest ironies of the science versus religion debate is that scientific thinking (in the west) developed within a worldview formed by Christianity.  Many of the implications of that development linger, such as the supposition that animals can’t have consciousness, or “souls.”  We watch a chimpanzee in an experiment and deduct points when they don’t do things the way a human would.  We thus confirm the biblical view in the name of science and go home happy.

Photo credit: Afrika Expeditionary Force, via Wikimedia Commons

I have no doubt that people evolved to be religious.  There are certainly survival benefits to it, not least group building and shared purpose.  I do wonder that science doesn’t address the elephant in the room—that we have limited receptors for perceiving specific stimuli, such as light and sound, but that there are other phenomena we don’t perceive.  We build instruments to measure things like x-rays and neutrinos and magnetism, but we don’t sense them directly.  How can we possibly know what we might be missing?  I suspect the real problem is we don’t want to admit willfulness into any other part of the universe.  Humans alone possess it.  Some scientists even argue that our own sense of will is an illusion.  It’s not difficult to believe that we evolved to be religious.  It’s also not difficult to believe that we pick up hints of forces that have yet to be named.  An open mind, it seems, might lead to great rewards.


Zones of Twilight

The other day I saw a beautiful twilight moon.  This was in the morning twilight.  I suspect many people don’t realize that twilight comes twice a day.  Twilight is when the sky glows with daylight either before the sun rises or after it sets, while the sun itself is below the horizon.  Most people are familiar with evening twilight since they stay awake until after dark.  Morning twilight, so full of hope, is beautiful to the point of being painful.  The other day it was twilight as work was starting—the days are beginning to lengthen, since I’ve been starting work in the dark for months now.  A waning gibbous moon shone through a gauzy cloud cover in an indigo sky.  It was very cold outside, so I went to the window to take a picture with my phone.

The modern phone camera often misses the point.  I zoomed in on the moon—phone cameras are wide-angle, by default—and it kept sliding in and out of focus.  The sky was getting lighter by the second, and I was losing the opportune moment.  I tried moving the phone closer to the glass, then back a little.  Still out of focus.  Then I realized what was happening.  My phone was focusing on the dirt specks on the window.  (Hey, it’s winter, hardly the time to be out with the squeegee.)  It occurred to me that a life lesson was being offered.

Thich Nhat Hanh recently died.  He was a Zen Buddhist master, and his passing reminded me of the old Buddhist saying that the Buddha is not the moon but rather the hand pointing at the moon.  Religions often confuse the hand pointing for the truly sublime realm to which it points.  Worshipping the person instead of following her or his teachings is a standard feature of religions worldwide.  It is the reason for much of religious conflict.  Those who worship the figure soon come up with their own teachings that are unrecognizable when held up next to those of the departed leader.  They focus on the window, not the glowing moon beyond.  The sky was growing too light to capture the image that had struck me.  The moon was blurry in a way that my eye hadn’t experienced it.  The moment of teaching was past.  The lesson was over.  The best that I could do was spend a long day working, then try to recapture a moment that had occurred in twilight.


Making Noise

There’s a real danger, it seems, to having an open mind.  We live in a world defined and classified by materialists.  They hold sway not only over science and commerce, but in whether prestigious jobs are on offer.  Consider the case of William Roll.  Roll was a fully credentialed psychologist with an interest in parapsychology.  His book The Poltergeist is a classic in the field.  He’s now frequently called a “credulous investigator.”  What that means, of course, is that he listened to and sometimes believed the people who reported the paranormal.  For materialists that discussion is already closed.  Anyone who tries to pry it back open is ridiculed and called names.  (We’re all adults here, right?)  Yet his classic book still gives pause.

If you actually read it, “credulous” is not a word that suggests itself.  Could Roll have been tricked by clever pranksters?  Yes.  Most people, even clever pranksters, can be.  If someone is caught hoaxing a phenomenon, does that mean the whole thing is a hoax?  Not necessarily.  It’s here the materialists swarm.  Interestingly, Roll acknowledges that there could be good psychological reasons for hoaxing after a genuine event.  The person caught hoaxing perhaps realized the benefits of the attention received when something unexplained occurred, and learned how to replicate, or at least imitate, it.  People will do anything for attention.  Roll asked a more finely parsed question: does hoaxing discount genuine phenomena?  He even tried to get experiencers to the lab where controls could be put into place.  As this book demonstrates, he doubted some of the cases and did so openly.

I became interested in Roll after watching A Haunting in Georgia.  The Wyrick family maintains that the events happened (I’ve written about a book penned by two of the aunts), and they seem sincere.  The problem is money.  Once there’s potential money to be made the skeptics come out, claws bared.  The problem is we all have to make money to survive.  If that involves “capitalizing”—even that word betrays much—on weird things that happen to you, skeptics claim it’s all made up.  There’s an ulterior motive.  For most of us there’s an ulterior motive for going to work, too.  For me, Roll appears to have been sufficiently skeptical.  Statistical anomalies shouldn’t be simply dismissed.  If they are, it’s possible we’re missing something important.  While this book may not have aged particularly well, it is still worth reading with a mind at least a little bit open.  


Higher Learning?

I was reading, as one does, about a mental institution.  In the last century they were often called, rather insensitively, “lunatic asylums.”  The neurodiverse were often shunted away so that the rest of society could get on with business as usual (as if that’s sane).  Any number of reasons were sought for why such individuals thought differently.  The source I was reading had a short list and I was surprised to see on it, “over study of religion.”  It really said nothing more about it, but it left me wondering.  First of all, it brought Acts 26.24 to mind: “Paul, thou art beside thyself; much learning doth make thee mad!”  Religion, from the very start, it seems, had the reputation of driving people insane.

Image credit: Published by W. H. Parrish Publishing Company (Chicago), public domain, via Wikimedia Commons

As someone who’s spent well over half a century thinking about religion, reading about religion, and analyzing religion, I can see Festus might’ve had a point.  This way much madness lies.  I don’t think religion evolved to be thought about.  It was largely a fear reaction to being, in reality, rather helpless in a world full of predators and other natural dangers.  Although we’ve managed to wipe out most of our large predators, we’re still under the weather, as it were.  We can’t control it, and what messing around we’ve done through global warming has made it less hospitable to our species and several others.  And also the small predators, those that evolve quickly, such as Covid-19, are now the real challenge.  Facing fear was the real evolutionary advantage of religion.

Being story-telling creatures, we made narratives about our belief systems.  Then we started taking those stories literally.  Taking them too seriously, we used those stories as a basis for hating and killing those with different stories.  We still do.  Can anyone deny Festus’ accusation?  I’m sure religious mania has, historically, led to some institutionalizations.  It was kind of a trope in the seventies, for example, that too much Bible-reading could lead to criminal behavior.  It’s not difficult to see why those trying to classify what might make an individual off balance might look to religion as an explanation.  Nationally, and very publicly, we can see strident examples of this promotion of irrational ideas on a daily basis.  Many of the large mental institutions have been closed down and many of the neurodiverse have been turned out to the streets.  Ironically, it is often the religious who try to care for them.  Understanding religion, it seems to me, might be a great public good.


Many Moons

Once you get beyond the very basic level, astronomy quickly becomes all about numbers.  It’s both fascinating and a shame.  Fascinating because scientists have been able to send probes millions of miles away, calculating, for example, where the multiple moons of Jupiter and Saturn will be so that they can fly by for an interplanetary peek.  The shame is that many people star-struck by the concepts can’t pursue it as a career.  That’s where books like David A. Rothery’s Moons: A Very Short Introduction come in.  I’ve read a number of books on the moon, but I’ve fallen behind on what we’ve learned about the moons of the outer planets.  This short treatment covers them in just enough detail.  I certainly learned a lot by reading it.

Rothery points out that apart from Earth, in our solar system the likeliest candidates for life are actually some of the moons of Jupiter and Saturn.  Heated by tidal action and volcanism, and having under-surface oceans, conditions may be right for such worlds to spawn life.  Even if microbial, finding life elsewhere would confirm what this dreamer has supposed all along—we’re not alone in the universe.  As Rothery also notes, there are far more moons than planets in our solar system, so that may also apply to other systems as well.  Exoplanets have been discovered for decades now.  If other suns have planets and those planets have moons, who knows what might be out there?  And all this has been happening for billions of years, whether or not we notice it.

For me, it’s concepts such as these that make astronomy so fascinating.  Also, as the book points out, there are all kinds of oddities in our own astronomical back yard.  Asteroids that have their own moons.  Moons that have unusual geological (or lunar) features that haven’t been explained.  Moons that have been torn apart by their host planet.  Moons that have been captured in orbit when passing by.  Probes have been landed on moons other than our own, and the ethics of doing so (since we might transmit microbes unintentionally) are topics of discussion.  There’s a lot crammed into this brief study.  I also can’t help but wonder what amazing things the next generation will discover.  Our knowledge of this universe, impressive as it is, doesn’t even break the crust of ice moons out there, where there is nitrogen ice and methane ice.  Parameters beyond imagination.  It may all come down to numbers in the end, but moons are so much more than that.


Thinking Big

Depending on who you are, the Bigelow Institute for Consciousness Studies (BICS) may set your eyeballs to rolling.  You might know that the extremely wealthy Robert T. Bigelow made his fortune as a hotelier and then began investing his money in aerospace technology.  He publicly admits to believing that aliens are already among us, and has contributed to advances in space travel components.  (It seems that many of the uber-wealthy are looking for a way off this planet at the moment.)  Not an academic, Bigelow is keen to admit his interest in what is often laughingly labeled the “paranormal.”  If you’ve got money you really don’t need to worry about what other people say.  I recently ran across an announcement regarding the winners of a BICS essay contest on the survival of consciousness after death.

As I’ve noted before on this blog, the paranormal and religion are close kin.  Nevertheless it does me good to see that so many people with doctorates (both medical and of philosophy) entered the contest.  I’m glad to see not everyone is buying the materialist narrative.  We’ve been so misguided by Occam’s razor that we can’t see reality is more complex than they teach us in school.  Churches may not be doing it for us any more, but it does seem that “there’s something out there.”  With a top prize of a half-a-million dollars, there was certainly a lot of interest in this enterprise.  If you go to the website you can download the winning papers.

Consciousness remains one of the great unexplaineds of science.  Answers such as “it’s a by-product of electro-chemical activity in the brain” don’t mesh with our actual experience of it.  Indeed, we deny consciousness to animals because our scientific establishment grew out of a biblically based worldview.  Even a century-and-a-half of knowing that we evolved hasn’t displaced the Bible’s idea that we are somehow special.  Looking out my window at birds it’s pretty clear that they’re thinking, solving problems.  Dogs clearly know when they’re pretending, as in a tug-of-war with their weaker owner.  We don’t like to share, however.  Being in the midst of my own book project I really haven’t had time to read the essays yet.  I do hope they come out in book form, even though they’re now available for free.  I still seem to be able to carve out time for a book, which is something I consciously do.  I’m not convinced by the materialist creed, although I’ve been tempted by it now and again.  I like to think that if I had money I’d spend it trying to sort out the bigger issues of life, no matter what people call them.


Myth of Ownership

“Luddite” doesn’t really describe me.  I don’t have a problem with technology, but I often object to how it’s used.  Let me give an example or two.  You spend your hard-earned money on a device—a smart phone, for instance, and/or a laptop computer.  These you use for your personal email, which you’re not allowed to check at work, and for paying bills and buying new stuff.  So far, so good.  But once these devices become ubiquitous enough, others presume the right to use them.  Never mind that you’re paying for the internet plan and the likely unreasonable monthly fees for using that phone.  Employers, for instance, concerned about their own security, require you to use your personal phone for some kind of authentication app to protect their assets.  Hmm, and who is paying for the data use on that phone?  And the wifi that makes it work?

Or consider a volunteer organization that’s taken over by a technocrat.  Suddenly you have to set up Dropbox on your laptop (with its attendant frequent emails asking you to upgrade until he who is seated on a white horse comes through the skies).  You can’t participate without access to the Dropbox.  Or maybe they want you to join Slack.  The problem, it seems to me, isn’t that we don’t have enough ways to communicate.  No, the problem is we don’t communicate well with what we do have.  Terse messages may be understandable for smoke signals or telegrams, but a greeting, body, and closing aren’t too much to ask for an email.  I don’t text largely because too many misunderstandings occur from the brevity and, not infrequently, from auto-correct.

I use technology daily.  For about a dozen years now I’ve been posting daily right here on this very internet.  I have a neglected Twitter account and I glimpse Facebook for, literally, about two minutes per day.  I can be reached on LinkedIn (and no, I don’t have any jobs to offer), Instagram, and yes, even Slack.  We’re all available to each other constantly, but communication breaks down when we don’t communicate clearly.  A writer I greatly respect once told me emojis are cheating.  I tend not to use them, but they may help the terse text go down a little more smoothly.  We are all challenged for time.  There’s so much to do and we’re not getting any younger.  But I was born in an era in which, if you used somebody else’s stuff, you asked nicely first and said “thank you” after.  Especially if they’re paying for you to use it.

Who owns whom?

Behind Science

Science and religion have been sparring partners for a few centuries now, and I believe this is a generational conflict.  The child, science, arguing with the parent, religion.  You see, religion is all about worldview.  As secular as secular scientists declare themselves to be, their worldview was likely formed by their religious heritage.  Religion can’t be teased out of culture.  Here in the western world modern science was born in a fully Christianized cultural landscape.  That’s not to say that Judaism and Islam didn’t contribute, but European culture was based on some basic Christian ideals.  Creatio ex nihilo, for one—creation out of nothing.  Another aspect is that Occam’s razor accounts for the world we see.  This was a philosophical concept born of the Christian worldview.  And the list could go on and on.

Scientists, focusing on their specializations, generally don’t sit back to think about the origins of their basic cultural presuppositions.  Many of them came directly from their religion.  Ever since college I’ve tended to think back to presuppositions, and question them.  How do we know we know?  Epistemology is as useful as it is disturbing.  And if we discover that the basis for what we know was locked into a worldview we can no longer accept, what does that say about the underpinning method?  Our science is based on the idea that the world is rational because a benevolent deity wouldn’t make it absurd.  Would he?  And why are we still referring to the deity as a male?  Indeed, we still think of him as a human.

It’s difficult to get beyond our basic cultural propositions.  Religions such as Buddhism promote the idea that change is the only constant, yet the science in countries of the east is borrowed from the concepts of the west and its monotheistic sub-structure.  We tend to think that if humans can’t sense it, and quantify it, it doesn’t exist.  So it is that many scientists become atheists, but without perhaps questioning the cultural presuppositions that have led to the scientific outlook in the first place.  Some will go as far as saying philosophy is a waste of time when philosophy is the framework of all rational thinking.  And that’s not to forget that there’s emotional thinking as well.  The big picture is complicated by philosophers writing in lingo that the rest of us can’t understand.  And even they have presuppositions.  Maybe it’s time for me to go back to school and examine them again.


Standard Maintenance

Something disturbing happened the other day.  My laptop started requiring constant plugging in.  I figured the battery was starting to go—it is several years old now.  Since time is the ultimate commodity in short supply, I made a weekend appointment with a genius at the local Apple store, which really isn’t that local.  I drove out on a rainy Saturday afternoon to get the battery replaced.  That’s not the disturbing part.  Neither is the fact that so many people were flocking around in an Apple store without wearing masks (although that does count as disturbing in its own right).  As I sat there watching the giant projection of devices I should consider buying, my daughter mentioned to me how like a dystopia it was: being subjected to advertising urging you to buy something new while you’re in to have something repaired.  That wasn’t really the disturbing part, either.

No, what was disturbing occurred when our genius told me I would need to leave my laptop there for three-to-five days for it to be repaired.  I use my laptop extensively every day.  I have no spare and I post daily on this blog.  (Those times when a post doesn’t appear it’s because I think I’ve hit the “publish” button but I haven’t.  That happened to me again recently and I only discovered days later that WordPress was listing it as a draft.  Sure enough, I’d gotten so busy I’d not clicked “publish”—which happens, ironically, mostly on weekends.)  I was hit with panic.  Could I live for three days, up to a week, without my laptop?  No email.  No blog.  No ubiquitous Zoom meetings outside of work?

Even before the pandemic the internet had become my lifeline to the larger world.  And the thing is I’m sending my thoughts out like a Pioneer probe to that outer space of the web, not sure if anyone will intersect with it and understand the gold-plated plaque within.  At least I hope it’s gold-plated.  I’ve been blogging here since 2009, at least one laptop ago (or perhaps two).  I’ve posted over 4,500 times.  What would happen if the earth went through the tail of a comet and wiped out all this electronic data?  Would there be anything left at all?  That’s the part I found disturbing.  My ambivalence about technology doesn’t mean I’m not addicted to it.  I was spared an immediate crisis since the genius at the bar told me the battery (being such an old model) was out of stock and would take a few days to arrive.  Meanwhile I could continue to live in my virtual world as normal.


Religious Dinosaurs

Dippy is, apparently, a common name for pet diplodocuses.  The statue of a diplodocus outside the Carnegie Museum in Pittsburgh is fondly known as “Dippy,” as are the fossilized remains of one such dinosaur from London’s Natural History Museum.  The London Dippy is on tour, or at least has been.  I learned that Dippy was in Norwich Cathedral just a day or so after the exhibit closed (I wouldn’t have been able to make it in any case; I mean I haven’t been able to get to the Pittsburgh Dippy and I live in the same state).  There are still plenty of photos on the cathedral’s website.  It’s a striking juxtaposition.  A massive stone building constructed to a medieval conception of God and one of the best examples of evolution, older than the church by several orders of magnitude, peacefully coexisting.

John Bell Hatcher, public domain, via Wikimedia Commons

American evangelicalism has a much harder time accepting science.  I’ve been writing about change recently.  One of the changes in western thinking has been to move from the “I told you so” of clerics to the “I can show you evidence” of scientists.  Those who like others to tell them what to think have a difficult time letting go of medieval notions of the world—that it’s flat, and young, and about to end, as if God has a very limited imagination.  We now know that the world has been here far longer than one interpretation of the Bible posits, but that doesn’t make it any easier to have a conversation about it.  Many religions want to claim knowledge that can’t be questioned.  And yet, dinosaurs and cathedrals seem to mix well.

The assumption that those who think differently are evil, or are inspired by evil, is one of the most insidious children of monotheism.  With one God comes the idea of only one way to understand that deity, and all other interpretations come from that divinity’s arch-enemy.  It’s a view of the world that struggles with change.  Historians, even those of us who focus on the history of religions, tend to take a long view.  It’s possible to trace the development of ideas that have led to the strange juxtapositions of our modern world.  Apologists, so convinced of their interpretation of Genesis that they think the Bible wouldn’t have found dinosaurs worth remarking about, end up cramming them onto the ark.  Others, it seems, welcome dinosaurs into cathedrals.  Which is a better way to be humble before God?


Degrees of Separation

For some reason lost in the fog of weblandia, I get The New York Times’ “The Morning” newsletter delivered to my email.  By carefully not clicking the links I can get my day’s worth of fear and paranoia for free.  Not all the news is bad, of course, and I’d be glad to pay if circumstances had been different.  After giving all the sorrow that’s fit to print, “The Morning” ends with an Arts and Ideas section.  By then I’m usually cradling my head in my hands, but I look up to see the positive side of humanity.  The other day the article on the Metaverse included this line: “In its simplest form, the term — coined by Neal Stephenson in his 1992 novel ‘Snow Crash’ — describes an online universe that people can share together…” and I realized that’s probably the closest I’ll ever get to the Gray Lady.

I am, as many of my regular readers know, Neal’s brother-in-law.  He mentions me in the acknowledgements to Snow Crash, something that was discovered by someone at work fairly recently, and which probably did more for my stature than the many long hours I put in daily.  When it comes to degrees of separation, fate, I suppose, plays a role worthy of the Joker.  Neal hadn’t written Snow Crash yet when I met his sister.  Her somewhat unlikely friendship with me eventually led to our marriage, and it was in the context of a family gathering that the conversation Neal mentions in Snow Crash took place.  Outside publishing, and in particular academic publishing, acknowledgements are seldom read.  I always read them, though, looking for unusual connections.  I’m often rewarded for doing so.

Asherah was, unbeknownst to me at the time, undergoing a resurgence of interest.  My Edinburgh dissertation was published the same year as a more prominent one by Cambridge University Press.  Just a year later, another came out.  Then another.  The internet was really an infant in those days; we learned of such things through printed resources, and printed resources are always in arrears by months, if not years.  Of the many Asherah books, mine had the distinction of being the most expensive.  Some things never change, I guess.  Suffice it to say, Asherah was on my mind as Neal and I drove to the store to pick up some baby supplies.  I had nothing to do with his coining the word or idea “Metaverse”—he’d already worked that out.  It was Asherah that ended up in the novel.  I was on my way to a short-lived romance with academia at the time.  Family, however, is so much more than degrees of separation.


Quest for Quest

The Quest for the Wicker Man is a rarity.  Not only is it very difficult to locate and very expensive if you do find it, it’s also a collection of essays where each one is worth reading.  I’d read some of it before, but since I’m writing a book on the movie I thought I ought to sit down and go through it cover to virtual cover.  I had to settle for a Kindle version—please bring this back in print!—and was reminded yet again why a paper book is so much more satisfactory as a reading experience.  You see, I’m a flipper (not the dolphin kind).  I like to flip back and forth while I’m reading.  Clicking and swiping (both of which, coincidentally, dolphins do) isn’t satisfying.  And if you underline in a Kindle everybody else can see it.  I prefer the privacy of a print book.

In any case, if you’re interested in probing a bit into The Wicker Man you’ll find quite a lot of information here.  (Available on Kindle for a reasonable price, if not a comfy reading experience.)  Many aspects of the film are covered.  One thing I won’t be discussing in my book is the music.  Firstly, I’m not qualified to do so, and secondly, it is done well here.  Essays also discuss religion (which I will discuss in my book), paganism (ditto), and many other aspects.  This is a book of conference proceedings—a boon for fans, but a bust for most publishers.  It’s also a boon for those who like marking up used books to the tune of 64 cents per page (the lowest price on Amazon).

Some of us believe a page is an ontological entity.  Once narrative writing began, those responsible for clay tablets soon settled on a size that is, well, handy.  You can hold it easily.  That concept translated to the codex, or “book” as we know it.  Scrolls were cumbersome, but books offered many advantages.  For hundreds of years they were the standard-bearers of accessible knowledge.  I miss page numbers when reading an ebook.  I don’t want to know the percentage of screens I’ve swiped.  I want to know how many pages I’ve read, what page I’m on, and how many pages there are to go.  (The best electronic books preserve that information.)  The book was not a form that required improvement.  Well, at least that digression kept me from giving up too much information about my book.  If you want to read it, when it comes out, I recommend the print form.