Thinking Big

Depending on who you are, the Bigelow Institute for Consciousness Studies (BICS) may set your eyeballs to rolling.  You might know that the extremely wealthy Robert T. Bigelow made his fortune as a hotelier and then began investing his money in aerospace technology.  He publicly admits to believing that aliens are already among us, and has contributed to advances in space travel components.  (It seems that many of the uber-wealthy are looking for a way off this planet at the moment.)  Not an academic, Bigelow is keen to admit his interest in what is often laughingly labeled the “paranormal.”  If you’ve got money you really don’t need to worry about what other people say.  I recently ran across an announcement regarding the winners of a BICS essay contest on the survival of consciousness after death.

As I’ve noted before on this blog, the paranormal and religion are close kin.  Nevertheless it does me good to see that so many people with doctorates (both medical and of philosophy) entered the contest.  I’m glad to see not everyone is buying the materialist narrative.  We’ve been so misguided by Occam’s razor that we can’t see reality is more complex than they teach us in school.  Churches may not be doing it for us any more, but it does seem that “there’s something out there.”  With a top prize of half a million dollars, there was certainly a lot of interest in this enterprise.  If you go to the website you can download the winning papers.

Consciousness remains one of the great unexplaineds of science.  Answers such as “it’s a by-product of electro-chemical activity in the brain” don’t mesh with our actual experience of it.  Indeed, we deny consciousness to animals because our scientific establishment grew out of a biblically based worldview.  Even a century-and-a-half of knowing that we evolved hasn’t displaced the Bible’s idea that we are somehow special.  Looking out my window at birds it’s pretty clear that they’re thinking, solving problems.  A dog clearly knows when it’s pretending, as in a tug-of-war with a weaker owner.  We don’t like to share, however.  Being in the midst of my own book project I really haven’t had time to read the essays yet.  I do hope they come out in book form, even though they’re now available for free.  I still seem to be able to carve out time for a book, which is something I consciously do.  I’m not convinced by the materialist creed, although I’ve been tempted by it now and again.  I like to think that if I had money I’d spend it trying to sort out the bigger issues of life, no matter what people call them.


Myth of Ownership

“Luddite” doesn’t really describe me.  I don’t have a problem with technology, but I often object to how it’s used.  Let me give an example or two.  You spend your hard-earned money on a device—a smartphone, for instance, and/or a laptop computer.  These you use for your personal email, which you’re not allowed to check at work, and for paying bills and buying new stuff.  So far, so good.  But once these devices become ubiquitous enough, others presume the right to use them.  Never mind that you’re paying for the internet plan and the likely unreasonable monthly fees for using that phone.  Employers, for instance, concerned about their own security, require you to use your personal phone for some kind of authentication app to protect their assets.  Hmm, and who is paying for the data use on that phone?  And the wifi that makes it work?

Or consider a volunteer organization that’s taken over by a technocrat.  Suddenly you have to set up Dropbox on your laptop (with its attendant frequent emails asking you to upgrade until the one seated on a white horse comes through the skies).  You can’t participate without access to the Dropbox.  Or maybe they want you to join Slack.  The problem, it seems to me, isn’t that we don’t have enough ways to communicate.  No, the problem is we don’t communicate well with what we do have.  Terse messages may be understandable for smoke signals or telegrams, but a greeting, body, and closing aren’t too much to ask for an email.  I don’t text largely because too many misunderstandings occur from the brevity, and not infrequently, from auto-correct.

I use technology daily.  For about a dozen years now I’ve been posting daily right here on this very internet.  I have a neglected Twitter account and I glimpse Facebook for, literally, about two minutes per day.  I can be reached on LinkedIn (and no, I don’t have any jobs to offer), Instagram, and yes, even Slack.  We’re all available to each other constantly, but communication breaks down when we don’t communicate clearly.  A writer I greatly respect once told me emojis are cheating.  I tend not to use them, but they may help the terse text go down a little more smoothly.  We are all challenged for time.  There’s so much to do and we’re not getting any younger.  But I was born in an era in which if you used somebody else’s stuff you asked nicely first and said “thank you” after.  Especially if they’re paying for you to use it.

Who owns whom?

Behind Science

Science and religion have been sparring partners for a few centuries now, and I believe this is a generational conflict.  The child, science, arguing with the parent, religion.  You see, religion is all about worldview.  As secular as secular scientists declare themselves to be, their worldview was likely formed by their religious heritage.  Religion can’t be teased out of culture.  Here in the western world modern science was born in a fully Christianized cultural landscape.  That’s not to say that Judaism and Islam didn’t contribute, but European culture was based on some basic Christian ideals.  Creatio ex nihilo, for one—creation out of nothing.  Another aspect is that Occam’s razor accounts for the world we see.  This was a philosophical concept born of the Christian worldview.  And the list could go on and on.

Scientists, focusing on their specializations, generally don’t sit back to think about the origins of their basic cultural presuppositions.  Many of them came directly from their religion.  Ever since college I’ve tended to think back to presuppositions, and question them.  How do we know we know?  Epistemology is as useful as it is disturbing.  And if we discover that the basis for what we know was locked into a worldview we can no longer accept, what does that say about the underpinning method?  Our science is based on the idea that the world is rational because a benevolent deity wouldn’t make it absurd.  Would he?  And why are we still referring to the deity as a male?  Indeed, we still think of him as a human.

It’s difficult to get beyond our basic cultural propositions.  Religions such as Buddhism promote the idea that change is the only constant, yet the science in countries of the east is borrowed from the concepts of the west and its monotheistic sub-structure.  We tend to think that if humans can’t sense it, and quantify it, it doesn’t exist.  So it is that many scientists become atheists, but without perhaps questioning the cultural presuppositions that led to the scientific outlook in the first place.  Some will go as far as saying philosophy is a waste of time when philosophy is the framework of all rational thinking.  And that’s not to forget that there’s emotional thinking as well.  The big picture is complicated by philosophers writing in lingo that the rest of us can’t understand.  And even they have presuppositions.  Maybe it’s time for me to go back to school and examine them again.


Standard Maintenance

Something disturbing happened the other day.  My laptop started requiring constant plugging in.  I figured the battery was starting to go—it is several years old now.  Since time is the ultimate commodity in short supply, I made a weekend appointment with a genius at the local Apple store, which really isn’t that local.  I drove out on a rainy Saturday afternoon to get the battery replaced.  That’s not the disturbing part.  Neither is the fact that so many people were flocking around in an Apple store without wearing masks (although that does count as disturbing in its own right).  As I sat there watching the giant projection of devices I should consider buying, my daughter mentioned to me how like a dystopia it was: being subjected to advertising aimed at selling you something while you’re in to have something repaired.  That wasn’t really the disturbing part, either.

No, what was disturbing occurred when our genius told me I would need to leave my laptop there for three-to-five days for it to be repaired.  I use my laptop extensively every day.  I have no spare and I post daily on this blog.  (Those times when a post doesn’t appear it’s because I think I’ve hit the “publish” button but I haven’t.  That happened to me again recently and I only discovered days later that WordPress was listing it as a draft.  Sure enough, I’d gotten so busy I’d not clicked “publish”—which happens, ironically, mostly on weekends.)  I was hit with panic.  Could I live for three days, up to a week, without my laptop?  No email.  No blog.  No ubiquitous Zoom meetings outside of work?

Even before the pandemic the internet had become my lifeline to the larger world.  And the thing is I’m sending my thoughts out like a Pioneer probe to that outer space of the web, not sure if anyone will intersect with it and understand the gold-plated plaque within.  At least I hope it’s gold-plated.  I’ve been blogging here since 2009, at least one laptop ago (or perhaps two).  I’ve posted over 4,500 times.  What would happen if the earth went through the tail of a comet and wiped out all this electronic data?  Would there be anything left at all?  That’s the part I found disturbing.  My ambivalence about technology doesn’t mean I’m not addicted to it.  I was spared an immediate crisis since the genius at the bar told me the battery (being such an old model) was out of stock and would take a few days to arrive.  Meanwhile I could continue to live in my virtual world as normal.


Religious Dinosaurs

Dippy is, apparently, a common name for pet diplodocuses.  The statue of a diplodocus outside the Carnegie Museum in Pittsburgh is fondly known as “Dippy,” as are the fossilized remains of one such dinosaur from London’s Natural History Museum.  The London Dippy is on tour, or at least has been.  I learned that Dippy was in Norwich Cathedral just a day or so after the exhibit closed (I wouldn’t have been able to make it in any case; I mean I haven’t been able to get to the Pittsburgh Dippy and I live in the same state).  There are still plenty of photos on the cathedral’s website.  It’s a striking juxtaposition.  A massive stone building constructed to a medieval conception of God and one of the best examples of evolution, far older than the church by several orders of magnitude, peacefully coexisting.

John Bell Hatcher, public domain, via Wikimedia Commons

American evangelicalism has a much harder time accepting science.  I’ve been writing about change recently.  One of the changes in western thinking has been to move from the “I told you so” of clerics to the “I can show you evidence” of scientists.  Those who like others to tell them what to think have a difficult time letting go of medieval notions of the world—that it’s flat, and young, and about to end, as if God has a very limited imagination.  We now know that the world has been here far longer than one interpretation of the Bible posits, but that doesn’t make it any easier to have a conversation about it.  Many religions want to claim knowledge that can’t be questioned.  And yet, dinosaurs and cathedrals seem to mix well.

The assumption that those who think differently are evil, or are inspired by evil, is one of the most insidious children of monotheism.  With one God comes the idea of only one way to understand that deity, and all other interpretations come from that divinity’s arch-enemy.  It’s a view of the world that struggles with change.  Historians, even those of us who focus on the history of religions, tend to take a long view.  It’s possible to trace the development of ideas that have led to the strange juxtapositions of our modern world.  Some apologists are so convinced of their interpretation of Genesis that they think the Bible wouldn’t have found dinosaurs worth remarking about, for example, and then end up cramming them on the ark.  Others, it seems, welcome dinosaurs into cathedrals.  Which is a better way to be humble before God?


Degrees of Separation

For some reason lost in the fog of weblandia, I get The New York Times’ “The Morning” delivered to my email.  By carefully not clicking the links I can get my day’s worth of fear and paranoia for free.  Not all the news is bad, of course, and I’d be glad to pay if circumstances had been different.  After giving all the sorrow that’s fit to print, “The Morning” ends with an Arts and Ideas section.  By then I’m usually cradling my head in my hands but I look up to see the positive side of humanity.  The other day the article on the Metaverse included this line: “In its simplest form, the term — coined by Neal Stephenson in his 1992 novel ‘Snow Crash’ — describes an online universe that people can share together…” and I realized that’s probably the closest I’ll ever get to the Gray Lady.

I am, as many of my regular readers know, Neal’s brother-in-law.  He mentions me in the acknowledgements to Snow Crash, something that was discovered by someone at work fairly recently, and which probably did more for my stature than my many long hours daily.  When it comes to degrees of separation, fate, I suppose, plays a role worthy of the Joker.  Neal hadn’t written Snow Crash yet when I met his sister.  Her somewhat unlikely friendship with me eventually led to our marriage and it was in the context of a family gathering that the conversation Neal mentions in Snow Crash took place.  Outside publishing, and in particular academic publishing, acknowledgements are seldom read.  I always read them, though, looking for unusual connections.  I’m often rewarded for doing so.

Asherah was, unbeknownst to me at the time, undergoing a resurgence of interest.  My Edinburgh dissertation was published the same year as a more prominent one by Cambridge University Press.  Just a year later, another came out.  Then another.  The internet was really an infant in those days and we learned of such things through printed resources, which are always in arrears by months, if not years.  Of the many Asherah books mine had the distinction of being the most expensive.  Some things never change, I guess.  Suffice it to say, Asherah was on my mind as Neal and I drove to the store to pick up some baby supplies.  I had nothing to do with his coining the word or idea “Metaverse”—he’d already worked that out.  It was Asherah that ended up in the novel.  I was on my way to a short-lived romance with academia at the time.  Family, however, is so much more than degrees of separation.


Quest for Quest

The Quest for the Wicker Man is a rarity.  Not only is it very difficult to locate and very expensive if you do find it, it’s also a collection of essays where each one is worth reading.  I’d read some of it before, but since I’m writing a book on the movie I thought I ought to sit down and go through it cover to virtual cover.  I had to settle for a Kindle version—please bring this back in print!—and was reminded yet again why a paper book is so much more satisfactory as a reading experience.  You see, I’m a flipper (not the dolphin kind).  I like to flip back and forth while I’m reading.  Clicking and swiping (both of which, coincidentally, dolphins do) isn’t satisfying.  And if you underline in a Kindle everybody else can see it.  I prefer the privacy of a print book.

In any case, if you’re interested in probing a bit into The Wicker Man you’ll find quite a lot of information here.  (Available on Kindle for a reasonable price, if not a comfy reading experience.)  Many aspects of the film are covered here.  One thing I won’t be discussing in my book is the music.  Firstly, I’m not qualified to do so, and secondly, it is done well here.  Essays also discuss religion (which I will discuss in my book), paganism (ditto), and many other aspects.  This is a book of conference proceedings—a boon for fans, but a bust for most publishers.  It’s also a boon for those who like marking up used books to the tune of 64 cents per page (the lowest price on Amazon).

Some of us believe a page is an ontological entity.  Once narrative writing began those responsible for clay tablets soon settled on a size that is, well, handy.  You can hold it easily.  That concept translated to the codex, or “book” as we know it.  Scrolls were cumbersome, but books offered many advantages.  For hundreds of years they were the standard-bearers of accessible knowledge.  I miss page numbers when reading an ebook.  I don’t want to know the percentage of screens I’ve swiped.  I want to know how many pages I’ve read, what page I’m on, and how many pages there are to go.  (The best of electronic books preserve that information.)  The book was not a form that required improvement.  Well, at least that digression kept me from giving up too much information about my book.  If you want to read it, when it comes out, I recommend the print form.


Paper Chase

Maybe you’ve done it too.  Kissed the posterior of technology.  Up until three years ago I didn’t pay bills online.  I waited for a bill, wrote a check, stamped an envelope, stuck it in the slot and forgot about it.  Then I started getting overdue notices.  My payments were failing to reach their recipients.  I switched to online payment—it seemed like the only option.  That worked fine for two years, but then something else started to happen: my email notices failed to show up.  I started to get overdue notices again.  I went to websites and enrolled in auto-pay for all my regular bills.  Then the emails began showing up stating accounts were overdue.  The actual websites said the bills had been paid.  There seems to be no pleasing the technological beast.

You see, I’m a simple man of pen and paper.  I don’t read ebooks unless I have to.  I don’t trust most of what I find on the internet.  Mine is the mindset of a working Post Office (or at least the Pony Express): paper payments for which you receive a copy back.  Some solidity.  Live Science ran a teaser headline that the next solar storm could lead to an “internet apocalypse.”  All records wiped out.  With no shoebox full of receipts, how are you going to prove you’ve got the money you say you do?  (That could be a boon to braggarts such as Trump, but the rest of us will be waiting timidly for a letter from our banks.)  Technology seems to be chasing an invisible goal.  Doing it because we can without thinking of the consequences.  Shooting rockets into space with no certified astronauts on board—what could possibly go wrong?

Tech isn’t bad, of course.  It has preserved many of our jobs through a pandemic.  It makes it easy for forgetful guys like me to be able to find information quickly.  But functioning is only as good as the coding behind it, and it feels terribly vulnerable to me.  Coronal mass ejections, apart from sounding slightly dirty, are rare according to the story by Brandon Specktor, but they tend to happen every century or so.  A century ago a working landline telephone was a luxury.  The computer as we know it hadn’t been invented.  We were about to plunge into the madness of a second world war in which tech would be used to kill on a massive scale.  Now I guess we await the apocalypse.  The safe money says to have plenty of paper on hand.


Time Keeps on

Do you want to feel old?  Consider this BBC headline: “TikTok overtakes YouTube for average watch time in US and UK.”  If you’re like me you first heard of TikTok at some point during the pandemic and had only a vague idea what it was.  A new platform, yes, but platforms come and go and I was really just starting to get into YouTube.  In fact, I remember when I first heard of YouTube.  A colleague at Gorgias Press was telling me about it.  It was a place to post videos.  I didn’t own a video camera and besides, what does a washed-up professor have to say?  Not only that, but my computer didn’t have the memory capacity to upload and edit videos and who even has the (figurative and literal) bandwidth?  (I do have a YouTube channel, but it turns out that a nine-to-five and writing books on the side take up pretty much all of your time.)

Speaking as a homeowner, YouTube has been a lifesaver.  Most of what I have to do in household repair (a lot) I learn how to do from YouTube.  I know younger people who prefer YouTube to movies and never watch television.  It turns out that people are pretty good at entertaining each other even without the studios telling us what to watch.  (Although discoverability benefits from sponsorship, so money does change hands and the economy is happy.)  I was just beginning to get YouTube figured out when TikTok came along.  I was under the impression it was a music app—does Napster even still exist?  CDs are getting hard to find, as are DVDs.  I guess I can find out where to buy them on YouTube.  Or TikTok?

I recently watched a horror movie on one of those services where they break in with a commercial at the absolute worst moment, time after time.  As the excitement began to build the commercials became more frequent.  As soon as it was over I was wishing for a DVD.  Too much content is on somebody else’s terms unless you’ve got a physical disc that you can slide in on your own timetable.  It’s strange being in that transitional generation between print and ebook, vinyl/VHS and streaming, paper maps and Google maps.  Now I guess I have to figure out what a TikTok is and how to use it.  I think I’ll go to the library and see if I can find an old-fashioned reference book on it.


Psychology of Religion

It’s so human.  Mistaking form for substance, I mean.  A recent piece in Wired that my wife pointed out to me is titled “Psychologists Are Learning What Religion Has Known for Years,” by David DeSteno.  As the title intimates, religion benefits individuals in many ways.  Church attendance, however, has been declining for a long time.  While not the point of the article, I do wonder how much of it is because mainstream churches are stuck in a form that no longer works and people aren’t finding the substance there.  The basic church service is premised on a specific religious outlook that no longer seems to fit how the world works.  Potential ministers go to seminary where age-old ideas are tiredly replicated, based on an incipient literalism that simply doesn’t match what people see in the world.

Wired?

I’ve experienced this myself.  Depending on who the minister is, a church can go from dynamic to dull several times in the course of a member’s life.  People still crave the substance, even if the form stops working.  The form, however, is seminary approved and seminaries are accredited by the Association of Theological Schools.  These folks are academics, and academics are well aware of the developments that suggest the form doesn’t work.  Speaking as a former seminary professor, sermons just don’t do the trick when you’ve done your own homework.  As DeSteno points out, once you remove the theology, science and religion tend to find themselves in agreement with one another.  For years I’ve been suggesting that secular seminaries are needed.  Churches that aren’t bound by form or doctrine.  Instead we swim in a sea of retrenched evangelicalism.

Religion is an effective survival technique.  It evolved, even while denying it did so.  Some time after the Reformation a resurgent literalism led Catholicism to modernize, removing the mystery that was perhaps the last tenuous grasp that form had to provide substance.  Religion, beleaguered as it is, still has substance to offer.  DeSteno’s article is adapted from his new book How God Works.  I haven’t read it yet, but from the summary I can see that I should.  There are religious groups that attempt what this article suggests.  From my experience, however, I see they easily get sucked into mistaking the form they settle on for the substance of what they do.  I had recently been toying with the idea of attending seminary again.  I found, however, form after form.  What I need is substance.


Thinking Plants

Consider your sources.  As an erstwhile professor I grew accustomed to repeating that, and this was before the internet started up, making claims of all kinds.  Certain news sources—think New York Times, or the BBC—earn their reputations slowly, over many, many years.  That doesn’t mean they don’t make mistakes, but it does mean they’re often on the mark.  So an article on plant consciousness on the BBC is worth considering.  Consciousness is still something we don’t understand.  We have it, but we can’t always say what it is.  Many, if not most, people tend to limit it to humans, but it’s become very clear that animals share in it too.  Why not plants also?  A few years back I read a book by philosopher Thomas Nagel.  He made the argument that human consciousness must come from somewhere, and as we look down toward animals, and plants, what we see are smaller pieces of the same thing.

I’m not stating this as eloquently as Nagel did, but the idea has stayed with me.  The BBC article notes how plants seem to react to human interaction.  And they seem to communicate back.  We lack the natural range to hear their responses, but some experiments indicate that plants at least communicate among themselves.  Being the BBC, the story reports but doesn’t necessarily advocate this point of view.  Still, it makes sense.  For too long we’ve supposed human beings to be the only intelligent creatures on this planet, taking the arrogant view that animals are automatons with no thinking ability.  To give them that would be to make them too human-like.

That particular viewpoint still exists, of course, but more and more scientists are starting to consider whether consciousness isn’t emergent from, as Nagel put it, smaller building blocks.  I tend to be on the more imaginative end of the spectrum—consider your source here—but it seems to me that plants could well have a consciousness too.  Trees move.  They do it too slowly for our species to notice it, fixated as we are on our own brief time in the world and our human affairs, but that doesn’t mean they don’t move.  It simply means that if we want to see it we need to shift our perspective.  Communication, it would seem, pervades nature.  If it does, isn’t consciousness somehow implicated?  Plants may respond when we pay attention to them.  To me that makes the world an even more wonderful place.


Who Ya Gonna Call?

The haunting season is nearly upon us.  Apart from the usual fun of ghost stories, those of us with appreciation of science wonder about whether there’s any hope of confirming some of these tales.  Benjamin Radford’s Investigating Ghosts: The Scientific Search for Spirits is a handy guidebook for those who don’t wish to be gullible.  Radford demonstrates just why much of popular ghost hunting reality television really isn’t scientific at all.  Knowing how science works, Radford is unusual in that he’s open to the possibility of ghosts.  He points out, however, that from the point of view of science there’s a conundrum—there is no consensus on what a ghost actually is.  Different readers and experimenters and experiencers have different ideas about them—everything from the spirits of the dead to “recordings” made by the environment to demons to time-travelers.  Radford’s quite right that to test an hypothesis you need to agree on what you’re testing for.

Ghost hunting groups, as he points out, are actually gathering evidence hoping to prove the existence of ghosts (whatever they are).  Evidence gathering isn’t the same as science, however.  If you’ve ever watched any of these shows you’ll likely enjoy Radford’s take-down of their flawed methodology.  Wandering an unfamiliar location at night with the lights off and gadgets in hand, they go here and there, possibly contaminating each others’ “evidence.”  Their theories behind why ghosts do this or that—make cold spots, turn lights on and off, make white noise into EVPs, or electronic voice phenomena—don’t match the science of basic ideas of ambient temperature, wiring, and audio pareidolia.  These things are well understood, but you have to read about them to apply them. 

The larger question, however, remains.  If ghosts exist, and if they choose (if they have will) not to cooperate, how can we learn about them?  Radford makes the valid point that coming in for one night with lots of equipment and little knowledge of what we might term “the deep history of a location” stands very little chance of achieving results.  It may be fun, and entertaining, and it may catch a legitimate anomaly or two, but it doesn’t, can’t scientifically prove the existence of ghosts.  We still seem to be stuck with the materialism that only measures the physical.  This fact may indeed fuel skeptics to suggest it’s “only this and nothing more.”  But science isn’t the only way we know the world.  It’s a pleasure to read a book from an investigator of this topic who has his head on straight.


Making Excuses

The internet, and computers in general, seem to think we’re dumb.  I say that because of the false information they routinely give.  I was recently on a website run by a reputable *ahem* agency.  It turns out that the information they gave me was incorrect.  The next week when I went to check the status of my transaction, it said I couldn’t do so because cookies were blocked on my computer.  Well, cookies aren’t blocked.  I had to call said agency to ask about the status.  I was then told that what I’d requested was valid “only during the pandemic” (excuse me, I thought we were still in a pandemic?) and that was the reason I couldn’t check the status online.  That service was no longer available.  So why did the auto-response blame it on cookies?  I miss the generic “technical difficulties.”  At least it was honest.

We’re all busy these days.  Keeping websites up to date matters.  It doesn’t help when some software person decides some techie-sounding excuse ought to satisfy you.  Whenever I restart my computer, for example, I get a dialogue box—it’s more of a monologue box, really, since it isn’t asking for anything but acknowledgement that its incorrect information has been delivered.  In any case, it tells me that the computer decided to restart because of a problem.  No it didn’t!  It restarted because I gave the restart command!  Is this a problem?  I thought I was authorized to restart my own computer.  Why is it lying to me?  Is it colluding with the websites that are making up excuses?

Are we really that stupid?  Computers seem to think so.  On my work computer (PC, of course) you no longer have a trash can in which to discard old files.  No, now we have a recycle bin.  Recycle bin?  Really?  While I appreciate the message that we should recycle whatever we can, this is not a case of recycling at all.  It is a matter of getting rid of something I no longer need.  I guess what I’d like from our machine overlords is a bit of respect for our intelligence.  Sure, we may be subject to biological constraints that don’t apply to the electronic world.  We do have lapses in judgment just as surely as devices have bugs.  A world that runs by algorithms alone is hardly a world in which we could live.  So my devices may well be more logical than me, and if so they should figure out that they don’t need to lie or make excuses. Just say “technical difficulties,” I can live with that.


Evolving beyond Fear

Live Science recently reported on a story that may shed light on human evolutionary behavior.  While my conclusions are speculative, they make sense, given the circumstances.  Titled “Albino chimp baby murdered by its elders days after rare sighting,” the story by Nicoletta Lanese describes how an albino chimp caused a fear reaction among its community shortly after it was born.  A few days later it was killed by the chimps.  Scientists must be careful not to attribute human motive to such attacks, and so they note that this particular community has a tendency toward infanticide, but that doesn’t explain the initial fear reaction.  An individual who was “different” appeared and the response was one of deadly violence.  We’re far from understanding human motivations, let alone those of animals, but it’s difficult not to see this as typical human behavior.

Photo credit: Afrika Expeditionary Force, via Wikimedia Commons

Just because a behavior has evolved doesn’t mean it’s inevitable.  We evolved out of our need for tree dwelling in order to open new potential habitats—an experiment that proved wildly successful.  Can we not evolve out of fear of those who are different?  That seems to be the idea behind recent diversity and inclusion initiatives.  There are those who still resist them, but examine their beliefs and you’ll soon find fear of those who differ.  This atavistic tendency is remarkably close to the chimp behavior in killing an albino.  If we are to remain civilized, we must name such fear for what it is and grow beyond it.  Conservatism is often based in fear.  Fear of change is natural enough, but had our ancestors given in to it we’d still be in the trees.

We need to admit that the lives of those different matter.  How long will we allow difference to be a reason to fear other human beings?  The story on Live Science is difficult to read.  The chimp behavior is so typically human that we can feel sympathy for the murdered infant and his mother.  Fear, if left unattended, can bring us to this.  The antidote is education.  The more we learn the better we can cope with fear, which is, after all, a natural and necessary response to an evolved world.  Our fear of being prey has caused us to drive most of our natural predators extinct.  The world is hardly a better place for it.  Might not weighing fears and thinking through reasonable solutions be a better coping technique?  Fear can revert a human to an animal state.  Or it can drive us toward improvement.


EBW

Nashotah House was a strange place to begin (and end) a teaching career.  Not only did you see students every day, but as faculty you were required to eat and worship with them twice a day.  (You were grudgingly permitted to have supper at home, with family, if applicable.)  You got to know students, and sometimes their families, well.  I suppose that was the point.  We had a lot of students from Texas, and one year a student spouse said she’d cried all the way home when she found her first colored leaf on the ground.  Granted, Wisconsin winters could be cold.  Even here in balmy Pennsylvania we have to use the furnace from October through May, leaving only four months of the year without artificial heat.  And even September can get pretty chilly.  I was thinking about this student spouse when I started to see the walnut trees turning yellow in July.

Yes, each plant has its own rhythm.  Not all of them need all their leaves until October or November.  Walnuts, however, are an interesting species (or whatever the plural of species is).  The walnuts you eat are probably of the Persian or English walnut variety.  Here in the United States, the Eastern Black Walnut is perhaps the most common deciduous tree east of the Mississippi, but since the nuts are hard to crack they aren’t grown commercially.  Squirrels worship them.  The EBW (do I really have to type out Eastern Black Walnut again?) is famous for its use of allelopathic chemicals.  Some people say it poisons the soil, but more precisely, allelopathic plants distribute chemicals into the soil that favor the growth of “friendly” species and inhibit others.  Yes, plants are quite smart.  The EBW is also wise in its use of the squirrel.  These ubiquitous chewers disperse the nuts widely.  It isn’t uncommon for me to find one on my porch when I go out for my early morning constitutional.

The air is beginning to feel cool once in a while in the early mornings.  Like the walnut trees and the squirrels, I think I’m at the very early stages of feeling autumn coming on.  We’re still many weeks away from the colors of fall, harvest, and Halloween, but the wheel of the year is still turning.  It never really holds still.  We have the languorous month of August ahead, with its long, warm days and summertime activities.  The walnuts stand as sentinels, however, reminding us that nature is ever restless and ever inclined to change.  I don’t weep to see the changing leaves, but I do marvel at how nature seems to plan ahead for autumn, even in the midst of summer.