Ravens and Teachers

Humans, it is claimed, have a theory of mind.  What this means is that we know what others are thinking, or at least can anticipate what they might be thinking.  This allows us to be self-aware and live in a complex society.  We can see someone else and infer what’s going on in his or her noggin.  This is often considered a uniquely human trait, but I’m not sure how widespread it is.  You see, I frequently run into situations where someone expects something of me without telling me.  It happened just recently with an organization to which I belong.  I’m a very busy person.  I suspect most of us are—not having time to accomplish everything we need to get done.  If someone wants something from me, I have to be told what it is, and in detail.

One of the things my students always said was that I was a good teacher.  The reason for this, I think, is that when I explain something I back up a bit before the beginning.  I try to assume no prior knowledge of the subject before going into it more deeply.  This method works because of my personal theory of mind: these people wouldn’t be taking a class on the subject if they already knew the material I might otherwise assume.  To understand something new, things have to be explained thoroughly.  That doesn’t mean taking a lot of extra time, but it does mean not assuming others know what I know.  For many people this is difficult.  We’re all busy.  We tell others “Do this,” without explaining what exactly “this” is.  The results are predictable.  It happens all the time in work emails.

I’ve recently written of teachers and ravens.  The effective among the former understand the value of full explanation.  The latter have a theory of mind that allows them to go so far as to fool others by withholding information.  We might learn a lesson either by sitting in the classroom of the former or by watching the ravens that skulk on the edge of civilized areas.  What they have in common is the ability to realize that others operate with limited information.  In order to learn, information has to be conveyed, and conveyed well.  Even now colleagues at work are surprised, when I explain something, that it’s done thoroughly and clearly.  When I receive information it’s often piecemeal and frustrating.  The reason, I infer, is that we don’t spend enough time paying attention to either our teachers or the ravens.

Image credit: Wikimedia Commons, public domain

Following Instinct

An article from the Christian Science Monitor a few years back made me think about how common knowledge runs ahead of science, but without the rigorous evidence.  The article is “Ravens might possess a Theory of Mind, say scientists.”  Of course they do.  The ravens, that is.  So do many other animals.  It’s pretty obvious when watching them interact on a daily basis.  We’ve over-flogged the idea of “instinct,” using it as a way of preserving the biblically-inspired idea that people are separate from animals.  We can be an arrogant species.  We claim the right to determine whether other species are intelligent.  When they do something smart we say, “That’s just instinct.”  Is it?  How do we know that?  And isn’t “instinct” one of the greatest fudge factors ever invented?

We do not know what consciousness is.  We claim it for ourselves and a few of our favorite animals only.  The ravens in the article show by their behavior that they know, or assume they know, what others are thinking.  I’m always struck by how experiments set up to measure this assume a human frame of reference.  Paint a spot on an animal and place it in front of a mirror.  If it shows curiosity about the spot, it has self-awareness, a theory of mind.  Maybe other species aren’t as concerned about zits as we are.  Maybe they consider it vain to fawn over themselves.  Maybe they use sight in coordination with scent and hearing to identify themselves.  No matter what, at the end of the day we must say how our intelligence is superior.  (Then we go and elect Trump.)

Need I say more?

Scientists have to be skeptical—that is their job: looking for evidence and coming up with hypotheses and theories and whatnot.  That’s how the scientific method works.  The scientific method, however, isn’t the only way of knowing things.  We learn and animals learn.  We like to think our “theory of mind” makes us unique, but watching how animals interact with each other, even when they don’t know anyone is watching, shows more sophistication than we normally allow.  Nobody has to be convinced that corvids are intelligent birds.  Their lives are different from those of the nervous little finches and wrens, however.  Does that mean wrens and finches have less developed minds?  I think not.  Until we learn how to think like animals we have no business claiming that they have no theory of mind.  Maybe if we could define consciousness we might have a claim.  Right now, though, all we have are instincts to go on.


Souls, All

What is a soul?  Can you lose your job for believing in one?  Well, maybe not lose your job, but be placed on paid leave.  Yesterday’s New York Times ran a story about Google engineer Blake Lemoine being put on leave after claiming a soul for the company’s artificial intelligence language model.  Isn’t that what artificial intelligence is all about?  We’ve become so materialistic that we no longer believe in souls, and when we create life we don’t expect it to have one, right, Dr. Frankenstein?  One of the sure signs that we are alive is our sensing of the many qualia of biological existence.  We understand that we’re born, we’re biological, and that we will die.  We also sense that there’s something beyond all this: our self, or mind, or psyche, call it what you will.

In our minds the soul has become entangled with the Christianity that provides the backdrop to our somewhat embarrassing history of repressing those who are different.  How do we redeem one without being shackled to the other?  And once we do, can we declare why AI doesn’t have a soul while biological beings do?  Or will we insist that it is solely human?  In this odd world that’s evolved, we view animals as innocent because they can’t know what they’re doing.  Historically there have been animals put on trial.  That’s an aberration, however, from our usual ways of thinking.  We don’t know what a soul is.  We’re not even sure there is such a thing.  To suggest a machine might have one, however, is taboo.  Would we trust a soul made by humans?

Skepticism is good and healthy.  So is having an open mind.  We’ve been a polarized people for a long time.  If it’s not politics, it’s elites versus the uneducated, materialists versus those who think there might be something more, the self-assured versus those who question everything.  The path of learning should keep us humble.  We should be open to the possibilities.  There’s no way to measure the immaterial since all our tools are material.  Even psychology, which utilizes categories to help us understand neurodiversity, often finds chemical solutions to the most cerebral of problems.  Perhaps overthinking is an issue—it can certainly get you into trouble.  Believing in souls can put you in a place of ridicule or suspicion.  Does AI have a soul?  Does a soul emerge from biological existence whenever a sufficient number of neurons gathers in one place?  Is it the fabric of the universe from which we borrow a little?  We have some soul-searching to do.

Carlos Schwabe, Death of the Undertaker; Wikimedia Commons

Fragmented

The existentialists, remember, used to put scenes in their plays to remind you that you were indeed watching a play.  In keeping with their philosophy, there was no reason to fool yourself.  Movies, by contrast, seldom break the fourth wall, immersing you in a story that, if done right, will keep your eyes firmly on the screen.  With home-based media, however, we’ve all become existentialists.  (Of course, some of us had made that move before the internet even began.)  When we watch movies we always have that “pause” button nearby in case an important call, text, or tweet comes through.  We can always rejoin the story later.  Life has become so fractured, so busy, that an unbroken two hours is a rarity.  I see the time-stamps on my boss’s emails.

While the existentialist side of me wants to nod approvingly, another part of me says we’ve lost something.  What does it mean to immerse ourselves in a story?  I know that when I put a book down it feels like unraveling threads at the site of a fresh tear in the fabric of consciousness.  Even the short story often has to be finished in pieces.  Poe, who knew much, wrote that short stories should be read in a single sitting.  All of mine have bookmarks tucked into them.  A fiction-writer-wannabe like me needs to feed the furnace: to write short stories, you have to read short stories.  Novels must be spread over several weeks.  Some can take months.  I would read long novels again if time weren’t so short.  Presses are even encouraging authors to write short books.  Readers want things in snippets.

Perhaps all this fragmentation is why I enjoy jigsaw puzzles so much.  Part of the thrill is remembering several places in the picture simultaneously, being able to pick up where you left off.  I limit my puzzle work to the holidays, when I can take more than one day off work in a row and the lawn doesn’t require attention and those trees that you just can’t seem to get rid of don’t require monitoring.  But puzzles are designed for interruption.  Movies and short stories are intended to engage you for a limited, unbroken period.  The real problem is that we’ve allowed our time to become so fragmented.  A creative life will always leave several things undone by its very nature.  Other forces, mostly economic, will demand more and more time.  The best response, it seems to me, is to be existentialist about it.


Free Reality

One thing movies can do especially well is make you question reality.  Early on this was more or less literally true, as people couldn’t believe what they were seeing on the screen.  Photography had perhaps captured souls, after all.  A series of movies in more modern days began to ask us to reconsider what we know, with profound results.  In 1998 The Truman Show suggested that we might be living on a stage and that God is really a misguided director.  The next year The Matrix went further, floating the idea that we might be living in a simulation—an idea that some highly educated people have taken seriously since then.  These films asked us to consider what we mean by reality.  Those questions have haunted us as the cybersphere grew.

I recently saw Free Guy, a movie that slipped me back into that uncomfortable space.  I’m no gamer, so I’m sure I missed references to memes and characters that are familiar to many.  Still, it was fun and profound at the same time.  It’s not giving too much away to say that Guy is a non-player character in a shared game.  In other words, he’s just code.  Not conscious, not making any decisions.  Until he starts to.  He turns out to be a form of artificial intelligence.  Teaming up with a human player, he learns to appreciate virtual life and works to make Free City a better place to live.  When the credits rolled I found myself asking what I knew about reality.

Not being a gamer, I’m pretty sure we’re not caught in that particular matrix.  I’m pretty sure my wife wouldn’t’ve put up with over thirty years of pretending to be married to me just for ratings.  Still, many times riding that bus into Manhattan I had the distinct feeling that none of it was really real.  I would tell myself that on the way to the office.  Not that I think movies are the whole truth, but they definitely seem to be part of it.  Guy learns to rack up points to level up.  He becomes a hero.  In this reality we can look but not see.  Becoming a hero is unlikely unless someone is actively watching you.  Many heroes exist on a small, human-sized scale.  They don’t get to wear sunglasses, but they can watch movies about those who do.  And if they’re not careful, they might find themselves in a philosophical quandary by doing so.


Short-Changed

Time often feels short.  When we back it up against the pencil marks on the doorpost we find it seems to shrink with its own passing.  It is nevertheless relentless.  This shows especially with daily tasks, such as the posts on this blog, which leave enormous piles of writing behind.  I used to print every entry I wrote, but I had to stop because there were too many.  There are now well over 4,500 of them and yet time keeps going and each day demands its sacrifice.  It’s that way with other daily tasks too.  It’s staggering, for example, to think of just how much food you eat in a lifetime.  It helps explain why we struggle against that middle-age bulge.  Little bits add up.  I suspect that’s why the news can feel overwhelming at times.  It just keeps piling on.

If I’d chosen to study journalism—I really didn’t know what it was, despite being co-editor of my high school newspaper—I might’ve reached the point of being paid for my writing by now.  Even with my published fiction stories (and two of my nonfiction books) no money has ever changed hands.  I know from editorial board meetings that journalists expect pay for what many of us give away for free.  Writing is funny that way.  The best way to improve is to practice, and so I spend time each day writing blog posts, as well as content for books and articles and fiction stories.  As I said, there’s quite a pile.

Time is relentless.  It’s also in short supply.  The marking of each passing day with writing is a reminder of just how quickly the sand slips through the glass.  Other tasks go neglected for writers, which is, I expect, why we appreciate being paid for our work.  But just imagine if we were paid for reading.  What if every book read brought in, say, $1,000 in today’s economy?  Would we be a more literate society then, valuing the work of writing?  For nonfiction, editorial boards note the difference between professors, who are paid to do other things (and paid pretty well, considering), and journalists, who live by the pen.  I have another job, helping other writers get published.  I suppose that means I have less time to do my own writing.  Time and writing are engaged in a complex dance which, when viewed from a distance, may look beautiful.  And when the dance is done you’ll find another piece of paper to add to the pile, regardless of whether it has monetary value or not.


Aleph, Borges

I’m never quite sure how I’m supposed to approach books of short stories.  Some of them are truly massive and contain only a handful of tales I wish to read.  Others are governed by a dedication to the author that compels me to read from cover to cover.  Still others collect different authors, some of whom appeal more than others.  I wasn’t sure where to begin with Jorge Luis Borges.  Not having been raised in a literary family, and having never formally studied literature, I found Borges through a friend and co-worker.  After my academic career crashed and burned, I started reading more literary writers and discovered Borges again and again.  I knew the basics of his story—he was perhaps the most famous Argentine writer, he had gone blind, and he had written probing, unusual stories.

I picked up this collection because of the title.  “The Aleph” is included here.  It was also the title of an earlier collection of Borges’ stories, which forms the basis of this book.  To that collection other pieces are added, and these are more a mix of poetry and philosophy than simple narrative.  Of course, Borges didn’t write simple narratives.  His stories are layered labyrinths.  A complex person doesn’t write simple stories.  Often they reflect on religion, some of them explicitly so.  They aren’t, however, religious stories.  Indeed, I was drawn to “The Aleph” because of my own experience of Hebrew and the sense that it is a sacred language.  Borges also puts this into the mix here.

So what kind of collection is this?  I’m still not certain.  This time I did read it cover to cover, and in several places I became uncomfortable.  Borges doesn’t shy away from the harsh realities of life: what people are capable of doing to each other, and what they in fact do.  Some of the pieces, just under a page long, stopped me in my metaphorical tracks.  Was I reading fiction or some kind of history?  Was philosophy secretly being fed to me by being left right out in the open?  This isn’t weird fiction, although it’s clear that some of it could be taken that way.  It is the work of a mind that operated on a plane different from that of many others.  There’s an uncertainty, a tentativeness here that is very becoming, even beguiling.  Having read the book I’m not sure what it was.  It will, however, lead to yet more reading.  Of this I am certain.


Namely Coincidences

One of my very first posts on this blog was about how I am not the Steve Wiggins who is a gospel singer.  There I mused on the coincidence that we share both forename and surname, as well as an interest in religion.  He is far more prominent than I am.  I don’t sing.  Since that time the most prominent Steve Wiggins on Google is the one who shot a police officer in Tennessee.  We don’t even share the same name, technically.  My given name is Steve, not Steven.  The branch of the Wiggins family I come from, however, is from the south.  Stephen F. Wiggins, even further removed in the name-spelling department, was CEO at Oxford Health Plans.  Now, I work for a publisher that shares one of those three words, and it’s the one that’s most specific.  Are Steve Wigginses drawn to the same places?  Another Steve Wiggins, just a couple of years older than me, lived in Russellville, Arkansas.  I grew up in Rouseville, Pennsylvania.  Coincidence?

Our sense of individualism is, it seems, socially conditioned.  If we try to imagine life in earlier human social structures, such as hunter-gatherer society, it looks as though people tended to function more as a collective organism.  The benefit of the group was the deciding factor, rather than what an individual wanted.  No doubt this was a harsher environment for those who liked to think for themselves, even though evolution had given us that capacity.  Biology, however, seems to have species survival as its goal.  Individuals die while the organism lives on.  In modern society we consider individualism one of the highest aims.

Our names individualize us.  I sometimes think of countries like China that have a combination of very large populations and a tradition of short names.  With limited numbers of possibilities, repetition of names becomes inevitable.  It’s a prominent aspect of our western society that we want name recognition.  We want to feel special.  Unique.  We work against evolution, but evolution has vastly more time than we do.  Perhaps we’ve gone too far with our individualism.  I hope we don’t have to step back as far as The Matrix, but maybe a movement in the direction of the social good over individual wants would be the right thing to do.  Our psychology makes us want to feel special.  Our biology wants us to play nicely together.  Who, in the end, wins out?  It could make a world of difference.


False Focus

I seldom use my iPhone.  I admit that I like having a camera with me most of the time, one that doesn’t make me look like a tourist.  I don’t text, and when I feel like tweeting I do it from my laptop.  I often forget where I put my phone and walk out of the house without it.  What I’m trying to say is that it’s not a distraction.  Now I realize companies (which seldom undertake to comprehend those of us who are anomalous) have to appeal to the lowest common denominator.  In the iPhone world this means that they now want you to use “Focus.”  In other words, if you’re behind the wheel or in danger of losing your job for being distracted all the time, you can filter what gets through.  I recently had a request from my phone to send me Focus notifications when I’m home.  Of course it knows when I’m home!

It seems unnerving to me that we need to have our devices remind us not to use them.  What does it say about our love-love relationship with devices?  We use them to guide us when we’re driving—no longer experiencing the wonder of getting lost.  We read on them, forgetting the feel, smell, and non-reflective look of a book.  Some people even smoke their devices.  Many people now protect their houses with devices that allow them to see who’s at the door.  Do we really feel safer with our devices taking care of us all the time?  Perhaps we do.  Perhaps the cyborg revolution has already begun.

When I see how simple things like telling an apple from a tomato still flummox machine sensors (and even if they learn to tell the difference, the point remains the same), I realize just how much life experience teaches us.  We’re constantly taking in sensory data and interpreting it, often subconsciously.  I can smell and feel the difference between the same shirt dried in a dryer and dried on a line.  I know which is better, but I struggle to find the words to describe why.  I can tell the difference between the taste of this peanut and that one.  Some scents can trigger euphoria while others warn that a mustelid is nearby and wants to be left alone.  I know to look around for a skunk, to honor its wishes.  I can infer that the apples that have started to go bad are why that opossum is in our compost bin.  Perhaps I’ll pull out my phone and take a picture.


Status Check

It took many months, but one of my few Twitter followers was removed not for trying to take the nation by force, but because he’d died.  If I learn to tweet from beyond perhaps I’ll score a few more followers.  The situation, however, is one of the oddities of our socially mediated world.  I was trying to find some information on a potential author the other day and the only online presence I could locate was LinkedIn.  I clicked on the profile only to see that the latest update was “Deceased.”  More than that, the Experience column indicated that “Deceased” continued from the date of passing up to the present.  I guess once you’re gone, you’re gone for good.  Social media, however, will perhaps find a way to keep you alive.

When I’m gone, I imagine WordPress will shut this blog down because nobody will be paying for it.  It’ll probably take a while for Facebook or Twitter to figure out I’m in the new category of “deceased.”  I do hope Academia.edu will keep my downloaded papers there for free.  Real immortality, it seems to me, lies in the writing of books.  They too will eventually disappear, and who knows about the real longevity of social media.  It’s pretty difficult to believe Facebook wasn’t even around at the turn of the millennium.  I drive a car that’s older than Facebook.  I keep thinking of LinkedIn listing “Deceased” as a vocation.  Isn’t it really the ultimate vocation for all of us?  If you can’t be found online, do you really exist at all?

While experts debate social media, my job prevents me from using Facebook or Twitter during the day.  After work I’m so anxious to get on to the other things in life that virtual friends and followers have to wait.  Early in the mornings I write and research.  I have mere minutes a day to look over social media.  I check Facebook only for alerts.  Life is short.  Is social media making it better?  It’s easy enough to be overlooked in real life, so why indulge in it virtually as well?  Of course, many see social media as a place to vent their spleen.  Why not try to inject some good into the virtual world instead?  There is hope for the dead, for they may still publish.  Their tweets may become somewhat less frequent.  Only the most callous, however, would drop them as friends for being dead.  Let’s just wait for Zuckerberg or Musk to notice.  It may take a few months.


Consistency

Consistency.  Back in Wisconsin I belonged to a group of Hebrew Bible professors who read a book and got together to critique it.  We came from different schools—Marquette, Sacred Heart, Carroll College, and Nashotah House (me).  We took turns on different campuses and spent a pleasant Saturday afternoon discussing our selected title.  Soon one of our members, an Auxiliary Bishop for the Archdiocese of Milwaukee, began to look at me right off and ask if the author had been consistent—my most frequent criticism was inconsistency.  It’s the way I think.  If an argument is being made, or a story is being told, it has to be consistent in order to be convincing.  Recently I realized that this has carried over into my writing on horror.

The dream of many authors and auteurs is to establish a successful series.  Publishers and studios like them too.  Follow one success with another just like it, so the thinking goes.  People like to see how the story ends.  The longer a series goes, however, the more difficult it is to maintain consistency.  I’ve been noticing this in the articles I’ve been writing lately: I follow the stories closely and notice when inconsistencies creep in.  I noted this in a recent post about Dark Shadows, and I wrote about it when looking at The Conjuring and Paranormal Activity franchises in Nightmares with the Bible.  I realize, just as my Bishop friend pointed out, that consistency is my problem.  Sometimes it gets in the way of enjoying the tale.  Deft authors and auteurs will tease you with it.  It’s part of the literalist mindset.

Being raised as a literalist, I learned from my youngest days that the story goes only one way.  The world, however, is much more ambiguous than that.  Stories have multiple points of view and endless iterations.  Not only that, but not even the author or auteur has the final say in what “really happened.”  When I first learned of reader-response theory I was suspicious of it.  Even with a doctorate and teaching experience I was still looking for consistency.  Get the story straight!  But stories are crooked and queer and untamed.  They follow the imagination and defy literary convention.  Those that succeed best are remembered as classics.  The rest are nevertheless expressions of fertile minds with tales to tell.  I doubt I’ll ever get over watching out for consistency.  I should, however, heed that gentle teasing my erstwhile colleagues gave.  Relax and enjoy learning how it goes this time.


Normal Paranormal

One of my favorite television shows of all time is The X-Files.  I didn’t watch it when it originally aired, but eventually got a hankering to see it on DVD.  There are many reasons to like it, including its originality, the dynamics between Mulder and Scully, and the sense that governments really do hide things.  As I rewatch episodes I see how much religion plays into it as well.  This post is actually not about The X-Files proper, but about a place in Bethlehem I recently discovered.  I’m not a preachy vegan, but I do like to support the establishments that make lifestyles like mine much easier.  It was thus that I discovered Paranormal Pizza in Bethlehem.  I wondered about the name, figuring that it was paranormal that you could have non-dairy, non-meat pizza at all.

To celebrate Earth Day we decided to check it out.  The menu has a set of fixed items, each named after an X-Files character.  I was glad to see that I’m not alone in my appreciation of the show.  The pizza’s very good, and I’m sure the college-age crowd that was there would agree with me.  I did wonder how many of them knew the X-Files.  Is it still a thing?  Maybe recent government disclosures have brought it back into the public eye.  Hey, I’m a Bible editor, about as far from the public eye as you can possibly get.  Vegan pizza on Earth Day, however, just felt right.

Foodiness seems to be trending.  A great many options are available in the land of plenty.  Still, I know that vegetarians and vegans exist in developing countries too, many of them for reasons similar to mine.  They know animals think and feel.  We promote the myth that they don’t so that we don’t have to feel guilty about exploiting them.  It seems to me that many of our worldwide problems would start to vanish if we realized we can evolve out of being predators.  Cashews and almonds can become cheese.  Soybeans and wheat can become meat.  And peanuts are about the best food ever, in any form.  Then there are the natural fruits and veg.  Industrial animal farming is perhaps the largest polluter of our planet.  Yesterday was Earth Day.  I was eating a pizza made from wheat, tomatoes, and cashews.  These ingredients might seem a bit unusual.  Paranormal, even.  But that’s precisely the point.  I won’t be waiting until the next Earth Day to go back for more.


Welcome to the Labyrinth

Do anything long enough and you’ll produce a labyrinth.  I started this blog back in 2009 with the idea of perhaps continuing in biblical studies/ancient Near Eastern (actually West Asian) studies, where I began.  I always knew this would be a place to talk about books and movies and sometimes current events.  Often it would address American religion because, well, it’s so bizarre.  Over the years the blog has ranged pretty widely.  My interests are fairly diverse and I tend to get obsessed with a subject for some time and then move on.  I suspect that’s one reason followers are few.  People want the same thing every time—should I dedicate the site to horror films, religion, or social justice?  The weather?  Instead, it’s what catches my interest at the moment.  Thus the labyrinth.

On the rare occasion when someone actually comments on an older post on this blog (there was a healthy chain about the Highgate Vampire some years back), I often have to ask myself, “Did I write about that?  What did I say about it?”  The human mind is a labyrinth.  And life is too short to ever stop learning.  Even if it means that few will be interested in what you’re doing.  The few who’ve known me a long time and read this blog (I know who you are) might be surprised at the horror themes that have become pronounced.  These were, however, part of my childhood.  When I tried to get away from them, they pursued me.  Monsters are like that, of course.  They like to hide in labyrinths.

But labyrinths are contemplative spaces.  Contemporary spirituality has rediscovered them.  You walk them in intentional thought.  In the moment.  We might be able to forget for a time that the original labyrinth was built to house the Minotaur.  And without Ariadne, Theseus would never have survived.  When he left her on Naxos his actions spoke louder, much louder, than his fight with the monster.  Labyrinths make you forget where you are.  One saved Danny Torrance.  And perhaps one might save your soul.  Those who make enough chairs, or write enough books, or design enough skyscrapers leave labyrinths behind.  Manhattan may be a grid, but it’s a labyrinth nevertheless.  Labyrinths seem to be a part of every story.  The thing about them is that they have no one goal.  There is no single answer to this mystery.  When you begin making one you may not even realize it.  Until you stop to contemplate it.

Photo by Ashley Batz on Unsplash

A Bird’s Life

Among the early signs of spring are birds.  Cold and silent, winter mornings have their own form of beauty, but hearing the birds is cause for hope.  The bird world looks cheerful and peaceable but it is a competitive and often harsh place.  My office window looks out onto a porch roof and a stand of trees across the street.  Electric wires constitute a part of the scene as well, giving birds plenty of places to alight and negotiate their bird business.  Like humans, birds are vulnerable, particularly when they’re young.  While teaching at Nashotah House, walking home from chapel one morning after a thunderstorm, I found a baby bird, not yet fully fledged, dying on the sidewalk.  I glanced up and couldn’t see any nests.  I’m not much of a climber anyway.  Not knowing what to do, I scooped it up and took it home where I could put it in a box.

I didn’t have an early class that day so I called a wildlife rescue center.  These being the days before the internet took over, that meant looking it up in the yellow pages.  We piled the family in the car and drove the bird down.  They’d told me to keep it warm and try to comfort it.  My daughter held it.  Once we got there they said they weren’t sure if it would survive.  It was weak and chilled, but they would do what they could to revive it.  For several days we all worried about that hatchling.  I thought it might’ve been a finch because of the beak, but otherwise we knew little about it.  Several weeks later the rescue center called.  Our rescue was ready to be released—did we want to do it?

They handed us a brown grocery bag that weighed next to nothing.  “Open it when you’re outside near where you found it,” they said.  Back on campus we opened the bag and our foundling flew off so fast we could barely see it.  Adult birds, confident and socialized, seem more sure of themselves.  They perch out in the open even though hawks, and even the occasional eagle, scan the area.  They go about their bird business with a confidence I sometimes envy.  They don’t worry about a 9-to-5.  They know what nature’s about.  They may have survived a near-fatal childhood.  They may have pushed siblings out of the nest in order to thrive.  They peck and flap at each other in their efforts to mate.  And, above all, they carry spring on their wings.


Ghost History

Books on art are often eye-opening to me.  When I was young and trying to escape the working-class hell in which I grew up, I discovered high culture.  This was mostly through local libraries.  I would check out classical music LPs and look at books of classical art.  I did the latter until I could identify several artists by their styles.  (It was probably originally because they’d painted pictures of Jesus and I went to see what else they’d done.)  In any case, I never studied art history.  I recently read an art historian on the Devil, and now I’ve read one (Susan Owens) on ghosts.  The Ghost: A Cultural History does not address the question of whether ghosts exist, but rather traces how they’ve been portrayed in literature and art over time.

Owens quite ably takes us from the ancient period to the modern, pointing out that ghosts change to fit the Zeitgeist—the spirit of the times (not her pun).  In the early modern period ghosts were portrayed as physical revenants.  They were dead bodies that came back to physically harm the living.  We know this fear was widespread because some burials were clearly intended to keep the dead in their graves.  The idea of the physical ghost still comes up in modern horror as the monster you can’t kill because it’s already dead.  Only gradually did ghosts become spirits, largely through the emphasis on purgatory, which made it possible for the dead to be in neither Heaven nor Hell.  Once the idea caught on, literature and art began to focus on the spiritual nature of revenants.  As cultural interests turned towards ruins, ghosts came to inhabit haunted houses.

This is a fascinating study of the way ghosts have evolved over time.  One of the things that struck me was that early commentators often didn’t distinguish clearly between ghosts, demons, and devils.  Demons, as we think of them, really depend quite a bit on The Exorcist.  The use of “devils” in the plural complicates a spiritual geography where God versus Devil forms the main poles of spiritual rivalry.  These ideas, along with those of ghosts, likely blended throughout most of history until a renewed emphasis on literalism came in.  Medieval scholars composed angelologies and demonologies, trying to keep everything straight.  They puzzled over ghosts, however, which don’t fit the scheme very neatly.  They would have benefitted, perhaps, had they had Susan Owens’ book to guide them.  It’s an exciting nighttime journey.