Too Much TMI

Okay, okay.  I admit I get overwhelmed.  There’s just too much stuff to read.  I currently have 25 tabs open on my browser, afraid that I’ll forget about something that seemed so urgent when I opened the URL in the first place.  (Two decades ago that sentence would’ve been nonsense.)  I limit my time on social media.  This can be a death-kiss for a writer, but for sanity’s sake (and work’s), I look at Facebook for literally about five minutes a day.  (If you want to reach me leave a comment on my blog.)  In those five minutes (or less) I often come away with two or three articles that I want to read but don’t have time just now.  I open a tab and hope I’ll get to it before I lose interest.  There’s a lot of information.  Too much.  Too much TMI.

I’m a slow reader.  I sometimes wonder if I have borderline dyslexia—it once tripped me up on a test and led me to phone a professor at night to explain—but, dear reader, it slows me down.  And a writer, no matter how obscure, needs time.  I told a friend the other day that I don’t do things I enjoy, such as painting and drawing, because writing takes up so much time.  (And work does too—it gets the lion’s share.)  But those articles!  They look so important!  Some have health implications and, if you lose your health, you have even less time.  The internet gives us TMI constantly.  And this field is riddled with rabbit holes.  Just ask the white rabbit about time.

Image credit: John Tenniel, public domain, via Wikimedia Commons

What are the curious to do?  I actually get an insane amount of satisfaction from closing a browser tab.  It’s a sense of accomplishment—I’ve done something that brings closure!  If I do it enough times I’ll get down to the URLs I always keep open lest I forget my place.  Some of these tabs have been open since the Obama administration.  If you’re critical of such habits you might ask, “Why don’t you just read an article instead of writing about not having the time to read?”  The interlocutor here is clearly not a writer.  Or at least has different writing habits than I do.  There are some non-negotiables in this world of TMI.  I suppose I’m adding to the problem.  At least if anyone thinks what I present here is information.  For that I defer to Klima, who, happily, still has some time.


Mr. Bean

Edamame.  I remember distinctly the first time I had it.  I was at the house of Alvy Ray Smith, co-founder of Pixar (shameless name drop), by the courtesy of Neal Stephenson (another).  It was a book club discussion and although I don’t remember who else was there, I was certainly the least distinguished person in the room.  Someone had brought edamame to share.  I’d neither seen nor heard of it before.   I popped a pod in my mouth and began to chew.  After ruminating a few moments, I figured this cud wasn’t going to break down and when others put their—relatively intact—pods in the discard bowl, a lightbulb clicked on.  Sheepishly, I pulled my mangled pod from my mouth and slipped it, I hoped unobtrusively, into the bowl.  If anyone noticed they were too sophisticated to say anything.  Blue collar through and through.

I repressed that memory, which is strange.  I tend to remember, and replay, the embarrassing things I’ve done.  This memory slipped, however, until our daughter reintroduced me.  She was in college, or recently out, and she showed us how to do it.  When that pod hit my tongue, the memory sprang back.  Edamame has become a standard of our house since then.  In case you’re unfamiliar, you put the pod in your mouth, keeping hold of one end.  You extract the beans, generally by using your teeth as an immovable obstacle—like artichoke leaves.  Discard pod, chew and swallow.  We sometimes dress ours up with a sauce.

The last time we had edamame, however, one of the beans shot to the back of my throat while the other two laughed.  I couldn’t tell which way, but it was clear the renegade bean was going down on its own.  I spent the rest of the evening worrying that I’d aspirated a bean.  Aspiration becomes more common as you age—something about nature trying to send us a hint, I guess—but I didn’t cough at all.  No wheezing started.  No pain.  Probably I swallowed at the last possible second.  If I did it was reflex because I couldn’t think what to do.  The next day, with no ill effects, it seemed funny.  Amusing enough to remember a time when really accomplished people were interested in what I had to say.  That time has largely departed, like an empty edamame shell.  But the memory remains.  There are hidden  hazards to eating edamame, it seems.


Just Ask

I see a lot of headlines, and not a few books, that puzzle over something there’s an easy way to resolve: why evangelicals (I’m thinking here of the sort who back Trump despite his pretty obvious criminal, predatory nature) think the way they do.  The solution is to ask evangelicals who’ve come to see things a bit differently.  I’m not the only one, I can assure you.  Many professors of religion (particularly biblical studies) and not a few ministers came from that background.  If they were true believers then, they can still remember it now.  At least I do.  I was recently reading a report in which the authors expressed surprise that evangelicals tend to see racism as a problem of individual sin rather than any systemic predisposition society imposes.  To someone who grew up that way, this is perfectly obvious.

I’m not suggesting this viewpoint is right.  What I am suggesting is that there are resources available to help understand this worldview.  To do so, it must not be approached judgmentally.  (I sometimes poke a little fun at it, but I figure my couple of decades being shaped by it entitle me to a little amusement.)  I don’t condemn evangelicals for believing as they do—that’s up to them—I do wish they’d think through a few things a bit more thoroughly (such as backing Trump).  I understand why they do it, and I take their concerns seriously.  I know that many others who study religion, or write articles about it, simply don’t understand in any kind of depth the concerns evangelicals have.  It’s only when their belief system impinges on politics that anybody seems to pay attention.

Maybe this is a principle we should apply to people in general.  Pay attention to them.  Listen to them.  Care for them.  Relentless competition wears down the soul and makes us less humane.  Religions, for all their faults, generally started out as a means for human beings to get along—we simply don’t know about the earliest days, but there is a wisdom in this.  In any case, if we really want to know, there are people to ask.  Who’ve been there.  Whose very profession is being shoved out of higher education because it doesn’t turn a profit.  Learning used to be for the sake of increasing knowledge, and since that’s no longer the case we see guesswork where before it would’ve been possible to “ask an expert.”  I often wonder about this, but as a former member of a guild that’s going extinct, I simply can’t be sure.


Why Not Love?

I learned a new word the other day: incel.  I’m not too proud to say that I had to look it up.  Although I’m on the internet quite a bit, I’m not really part of “internet culture.”  Incel is a shortened form of “involuntary celibate.”  It refers to an internet culture of mainly white, heterosexual males who consider themselves unable to find (generally) female companionship.  They often lash out at women, and sometimes at any sexually active person.  In general it seems to be a self-pitying, hateful crowd.  They tend towards misogyny and racism and, one suspects, conspiracy theories.  They apparently suffer what a friend of mine called “DSB” (deadly sperm buildup).  But the thing is, love would seem to be the cure.

Certainly women aren’t to blame.  Look, if I managed to find a woman willing to marry me there must be hope for the rest of my gender.  I’m no catch.  And why is it that frustrated men take it out on women?  And underplay the achievements of women?  The Women’s March in January 2017 was the largest single-day protest in U.S. history.  Accurate numbers are difficult to obtain, but it has always struck me that the U.S. Park Service agents, with feet on the ground, estimated a million and a half in D.C. alone.  So we were told.  It’s almost as if nobody bothered to count because it was women.  Why is this still an issue?  How incelular are we?  Is it so difficult to give credit where credit is due?

I wonder if anybody foresaw that the internet would develop such subcultures.  Yes, Neal Stephenson’s Snow Crash gave us a metaverse where individuals lived virtually online, but did we fully realize then that sexually frustrated guys would eventually merit their own title, and that some of them would commit acts of real-life violence based on their own rhetoric?  Rogue males have been part of human culture all along, but the internet has offered a place to band together and become radicalized.  I, for one, had no idea that such subcultures existed.  It took reading an academic work about female leadership to learn about them.  And it makes the world a less comfortable place knowing they’re there.  Learning love is our only hope.  There are people who sublimate their frustration into hate.  What if we tried to make the internet a place where love, with or without physical entanglements, became the dominant meme?  Even those of us who work largely in isolation can see the hope in that.

Photo by Mayur Gala on Unsplash

Adulting

Young professionals that I know often say adulting sucks.  Quite a bit of the time I tend to agree with them.  The 9-2-5 makes just getting along difficult, at times.  I’m sure there’s software to ease some of the woes, but you have to learn how to use it.  And that takes time.  Time I’d rather spend writing or reading.  For example, to get a small break on state taxes, if you work from home, you need to calculate your office space and then how much it costs to exist in your house for the year.  When I remember to do so, I can look utilities and mortgage up in Quicken.  Sometimes, however, when a book in my mind is distracting me I just tot all this up on the back of an envelope.  Then I need to type it in so my accountant can see it (taxes are far too complicated for mere mortals) and, I can’t underscore this too many times: numbers are adulting.
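For what it’s worth, the back-of-the-envelope arithmetic above is simple enough to sketch in a few lines of Python—a toy illustration only, where the square footage and dollar amounts are invented examples (not my actual numbers), and none of this is tax advice:

```python
# A minimal sketch of the envelope math: figure what fraction of the
# house is office, then apply that fraction to the year's costs.
# All numbers below are made-up examples, not tax advice.

def home_office_share(office_sqft, house_sqft, annual_costs):
    """Fraction of the house used as an office, applied to the year's costs."""
    fraction = office_sqft / house_sqft
    return round(fraction * annual_costs, 2)

# e.g. a 120-square-foot office in a 1,600-square-foot house,
# with $21,000 in utilities and mortgage for the year:
print(home_office_share(120, 1600, 21000))  # → 1575.0
```

Quicken or a spreadsheet will do the same sum, of course; the envelope just doesn’t require a login.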

Photo by Tyler Easton on Unsplash

I’m an idea person.  The 9-2-5 (numbers!) that keeps you in front of a computer all week long means that things pile up.  Weekends seem too short to spend on numbers.  But you’ve got to balance that checkbook.  And even tot up the number of hours you give to “the man” each day.  What could be more adult than accounting?  Don’t get me wrong—at times numbers can be interesting.  Numbers, at their best, are philosophical.  One squared is one.  When you square any number greater than one, it increases.  One doesn’t.  And you can’t divide by zero and get zero for an answer, as handy as that’d be from time to time.  These abstract concepts come in useful but adulting involves serious numbers.  Numbers that imply liquidity.  Cash flow.  
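Those little philosophical claims can even be checked mechanically—here’s a toy Python illustration, nothing more, of the squaring and divide-by-zero observations above:

```python
# One is the fixed point of squaring; anything greater than one grows
# when squared; and dividing by zero never quietly yields zero.

assert 1 ** 2 == 1        # one squared is still one
assert 1.5 ** 2 > 1.5     # numbers above one increase when squared
assert 7 ** 2 > 7

try:
    answer = 1 / 0        # Python refuses outright
except ZeroDivisionError:
    answer = "undefined, however handy zero would be"

print(answer)  # → undefined, however handy zero would be
```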

Time is made up of numbers too.  If a social event comes up on a weekend, there goes your grocery and cleaning time.  And writing a book takes a tremendous amount of time.  It’s a second job on top of the other one you work 9-2-5.  All of this makes me think of those TIAA-CREF ads that showed prominent professors and captions that said “Because some people don’t have time to think of money.”  Or something similar.  That’s what I’m talking about.  Adulting is all about money.  And money must be taxed.  And you have to keep track of where it all goes.  I’m sure Quicken could help me with this, if I had time to learn it.  (We pay for it after all.)  But I’m kind of busy writing this book…


Thinking Thinking

Something that’s been on my mind (anticipatory pun) lately has been thought.  More specifically, the quality of thought.  We are conscious beings, although we’re not sure what that means.  Beyond a Cartesian self-awareness.  Everyone knows what it is to have times when you’re not thinking clearly.  Or are feeling confused.  Those of us who tend to live quasi-monastically (keeping to a routine, early rising, writing and reading daily before the 9-2-5 routine) notice the ways subtle things can influence the quality of our thinking.  For me, first thing in the morning is the best time.  (Although I must confess that lately I don’t wake up with the crystalline clarity that I have had for years, as if sleep is beginning to intrude on my earliest hours.)  Once I’m up and going, though, routine, you’d like to think, would produce the same results.  But it doesn’t.

Photo by Pierre Acobas on Unsplash

I’ve written before about how the quality of sleep affects the quality of waking thought—something we’ve all known all along.  But even when I have nearly identical nights (the same quality of sleep more than one night in a row), the subtleties of difference in thought persist.  To understand this, you need to realize that I’ve been rising well before the sun for a dozen years now.  I awake to a quiet house and spend a couple, sometimes a few, hours writing and reading.  (It’s how I write my books, as well as this blog.  And my fiction.)  Even on “identical mornings” where the weather’s pretty much the same, and all other factors seem equal, the quality of thought differs.  Sometimes it depends on whether I’m writing fiction or non.  As I transition into my reading time, that can make a difference in the reading experience.  I suppose that’s one reason I value good writing.

We don’t understand consciousness.  Identity is also somewhat negotiable at times.  We’ve all known a family member or friend to act “not like themselves.”  More to the point, to think not like themselves.  We have no real way of understanding thinking itself.  I think about thinking quite a bit, and I marvel at how intensely personal it is.  We may, at our will, keep our thoughts to ourselves (and that’s a good thing, in many circumstances).  Thought, it seems to me, ought to be a very high priority in our academic pursuits.  It’s a powerful thing, capable of more than we’re even presently able to imagine.  And it can differ from day to day.  Do you suppose I wrote this after writing fiction or non?


Showing Gratitude

Stealing is something that we all, except some capitalists, know is wrong.  I think quite a lot about the land that was stolen to make America possible and I know that simply giving it back isn’t an option.  Nevertheless, I do believe that we should listen, and listen attentively, to those who’ve been here longer than Europeans.  Robin Wall Kimmerer’s Braiding Sweetgrass is an important reflection on this dilemma.  Kimmerer is Potawatomi and she’s also a professor of Environmental Biology.  The book is a series of essays that focus mostly on plants and what we can learn from them.  It also brings in indigenous teaching, contrasting the outlook of gratitude with that of greed.  By turns sad, funny, and profound, Braiding Sweetgrass contains a message that is vital to countering climate change.  To correcting our attitude before it’s too late.

There’s so much in this book that it’s difficult to know what to touch on in this brief notice.  Throughout, Kimmerer notes that the First Nations viewed life as a gift.  The earth is constantly giving, and the native way was to be thankful and to accept the responsibility that comes with being given a gift.  Where the European attitude was “take until there’s no more to take,” she points out that taking only what you need and leaving the rest for others is a way out of our current dilemma.  She does this, most strikingly, through the story of the windigo.  The windigo has become popular among monster fans as a consuming beast, but Kimmerer shows how the story has a profound point.  If all you do is consume, you become a monster.  You stop a windigo by showing gratitude.

Perhaps the most striking thing, to me, was how Kimmerer describes her own experience becoming a scientist.  How standard academics refused to believe they had anything to learn from Native American outlooks, especially when borne by a woman.  How she was told she couldn’t be a scientist, not with that outlook.  And how she learned the European way but didn’t give up her native understanding.  How she brings two worlds together and does so with a sense of urgency and hope.  Things have gone too far simply to turn back the calendar and say that our ancestors had it all wrong, but it’s not too late to learn from those who lived for millennia on this land and were untainted by ideas of private ownership.  Those who knew how to live sustainably with nature.  Those who knew, and still know, how to defeat monsters.


Hellish Fears

Aporripsophobia, the fear of rejection, and the fear of punishment (mastigophobia, or as I prefer, “spankophobia”) are closely related.  They define me.  Much of this comes from the fear of Hell, which I internalized early in life, along with the Calvinistic theology that backed it.  Some have thought that I’m “thin-skinned” or afraid of criticism.  That’s not quite it.  I’m afraid of what criticism implies—I did something wrong and therefore may be punished for it.  What brings this on, all of a sudden?  Well, as I was getting ready to jog the other day a police car stopped in front of our house over a routine traffic violation.  My immediate thought was that I had done something wrong.  They were here for me, not the guy whose car they were attending.  Then this brought back that time in Boston.

I moved to Boston on my own, with all I had in a VW Beetle (old style).  I know now that the headache I had after that long drive was a migraine.  (I’ve had maybe a half-dozen in my lifetime, and they’re unmistakable.)  I parked the car, stumbled into my new apartment, and went to bed.  The next morning I had a ticket for parking with the left tires to the curb (against the law in Boston).  I didn’t know it was illegal.  Even with a migraine I wouldn’t have parked that way had I known.  The receptionist at the police station actually said to me, “Ignorance of the law is no excuse.”  That terrified me.  I thought it was only something Gilligan said.  If you don’t know all the laws, how can you possibly avoid punishment?  And isn’t punishment rejection?

Some think I always have to be right.  They may not know the underlying cause—being wrong is to be subject to punishment.  And punishment leads to Hell.  When I was in kindergarten the first time, I was held back partly because I was four but partly because I colored the triangle in the left corner purple instead of yellow, contrary to the verbal instructions.  It was because I don’t know my right from my left—I still don’t.  That first-ever school correction was seared forever into my gray matter.  I’d done something wrong.  I was held back in school.  More likely than not, I was going to Hell.  I’ve known people to suggest, as does Richard Dawkins, that raising a child in a religion is child abuse.  I understand parents’ motivation, however.  You don’t want your child to go to Hell.  If they end up living in it all their lives, I guess it’s a small price to pay.

Photo by Vadim Bogulov on Unsplash

Bigger Picture

One of the quirks of my thought process is that I tend to look for the bigger picture.  I’ve always done this and I suspect it drives some people batty when they ask me a question and I begin to answer from what seems to be a tangent.  (I also think this is why I performed well in the classroom.)  So, when I saw the article by Eric Holloway on Mind Matters, titled “Why Is Theology the Most Important Empirical Science,” I had to take a look.  Mostly a series of bullet points noting some of the religiously motivated ideas that led to scientific discoveries, the article is useful.  My penchant for the big picture goes a bit broader, however.  The entire worldview in which the scientific process was born, and thus its underlying presuppositions, is religious.  Science and religion are the dogs and cats of the thought world, but I’ve seen dogs and cats live happily together.

Science has always been with us.  Early peoples weren’t benighted troglodytes.  They observed, hypothesized, drew conclusions.  Science as we understand it, however, began in the Middle Ages in Europe, drawing on observations from earlier thought in the Arab world.  The context in that Arab world was solidly Muslim.  The Middle Ages in Europe were solidly Christian.  None of this discounts the contributions of Jews to the whole; it’s merely an observation regarding the larger cultural outlook.  Many of the principles of science even today (for example, that people are categorically different from other animals) are based on those religious worldviews.  We seldom go back to question whether we might’ve gotten something fundamentally wrong.  Meanwhile, the dogs began to chase the cats.

College as a religion major involved a lot of discussions about basic presuppositions.  Then questioning them.  Not much of this went on in the classroom (Grove City was, and is, a conservative Christian school).  The wonderful thing about higher education is the bringing together of people with different outlooks.  It was those after-hours conversations that helped form my questing nature.  I’d already started asking bigger questions when I was a child, annoying my parents and, I suspect, sometimes vexing clergy.  A single human mind is too limited to grasp it all, but it seems to me that to deny religion a place at the table is to leave out massive amounts of human experience.  Of course, economics, the dismal science, seems well on the way to eliminating the study of religion in higher education.  And we will have lost, if this happens, a large piece of the bigger picture.

Photo credit: NASA

Hooting in the Dark

Animals fascinate me.  I picked up Martin Windrow’s The Owl Who Liked Sitting on Caesar: Living with a Tawny Owl at a used book sale.  Honestly, the cute photo on the cover swayed me.  Although some animals like living with humans, and although I grew up with lots of pets, I’ve tended away from that.  Reading about how an owl became a close companion to, and lived a good life with, a human was somewhat bittersweet.  In the wild Mumble (the owl’s name) would likely have lived a far shorter span.  But I do wonder if she missed out on the challenges that make life rewarding.  (I sometimes wonder the same about those born rich among our own species.)  The struggle is part of the nature inside us.  And although this book is generally fun, it does raise some deeper questions in my mind.

When describing the natural life of Tawny Owls, Windrow notes that they have an ability, not understood, to adjust their brood sizes by the amount of prey that will be available during a given year.  Such things always give me pause for a couple of reasons.  One is that we seem to assume we have all the data—that we know all that can be known of our world.  Animals prove that wrong time and again.  The other reason is that we are convinced there is no, for lack of a better term, spiritual world.  Or maybe better, paranormal existence.  Might it not be that owls have some ability to know the future?  Some people seem to have the ability to predict some short-term developments with accuracy.  Perhaps we’re missing something is all I’m saying.

In the end, however, I was surprised how Windrow couldn’t quite bring himself to reject a materialist view of her death.  I’ve had pets die on me—one of the reasons that I have no desire to “own” one—but as Windrow writes it, the relationship grew humdrum before Mumble’s death.  He had to work and she had to perch.  We do tend to take those closest to us for granted, I fear.  Life is so busy that we have to try to squeeze family in next to the demands of capitalism.  So the story towards the end winds down to a kind of “I had a pet owl but I had a life to live too” kind of narrative.  I’m glad to have read the book and I learned a little bit about Tawny Owls.  But I was also left reflecting on some of the larger implications.


Time Flees

I can’t speak for all early risers, of course, but for me the absolute worst thing about this useless tradition of switching to Daylight Saving Time is the loss of morning light.  I’m in favor of keeping DST all the time, as the US Senate has voted to do.  The only reason this is still an issue is to give the House yet another thing to fight about.  How dysfunctional are we, really?  This one’s a no-brainer!  Look, I start work early every day.  I jog before work because I’m too tired afterward.  In late February to early March I can get out and back before seven.  (In the summer, before six!)  Then DST happens.  I’m plunged into another month of waiting until seven to be able to jog.  DST is just one of those ridiculous things we keep doing because we don’t have the will to change it.  We’d rather fight.

I’ve been thinking a lot about time lately.  How we think of it, how we divide it.  We sometimes lose sight of the larger picture.  If relativity is right, the stars we see at night are, many of them, long gone.  We’re seeing light trudging through the near vacuum of space, or maybe dark matter, and thinking how we’ve got to get to our meeting on time.  How we need to be at work from 9-2-5.  How somebody with money owns that portion of our time.  There’s a reason that DST starts on a weekend.  Time.  We can’t grasp it but we can waste it.  What are we waiting for?  Some of us are seeking the truth.  Even so we know that Morpheus was right—time is always against us.  It’s a limited commodity, but even that language cheapens it.

Those of us of a philosophical bent allow ourselves time to ponder such things.  We call time a dimension, but what does that really mean?  Theoretically it can be traveled in either direction (again, pending relativity), but we only experience it in one.  So what do we do?  We interrupt its flow because, during a war in the last century, it was deemed that industry could be more productive if it stayed light an hour later.  Maybe we should just all agree to shift our perception of time ahead by an hour permanently.  That’s forward thinking.  And who knows, it might just save us all a lot of time.


Boo-Boo

After an unfortunate encounter with a paper cutter in which one of my thumbs didn’t fare so well, I sought a bandage.  This sent me into a reverie, since the bandage I found was in a box that I’d brought home from my mother’s apartment.  Mom was a practical woman and I’m sure she would’ve approved, although the item was selected in a moment of grief that still hasn’t completely dissipated.  As my wife was binding my wound, the thought recurred that my mother wouldn’t be needing these physical assuagements any longer.  Like all of us, if cut she bled.  She’s beyond that now.  A person’s effects linger and contain pieces of their memories.  This particular box was plastic and therefore reusable—which is precisely how Mom used it.  She taught me how to bandage myself and I’ve used that knowledge many times over the decades.  It’s something I don’t need YouTube to figure out.  Time is a gift.

While writing about recent times, I learned some new vocabulary regarding decades.  For example, the first two decades when I was culturally aware were the seventies and eighties.  Together they’re known as the xennial period, named, presumably, after “generation X.”  (I’m a very late boomer, as well as a late bloomer.)  I found that fascinating.  Then I was reading something that made reference to the “noughties.”  This delightful word is the British term for “aughts” or “aughties”—the years between ’00 and ’09 of any given century.  We hear plenty about the “twenties,” “thirties,” and so on, so I became curious about the correct term for the second decade of a century.  Either “tens” or “teens” is acceptable, but it seems that in formal writing these should be rendered as numerals.  I guess teen ages are always difficult.

Our divisions of time demonstrate our preoccupation with both mortality and round numbers.  More and more people are living the entire way through a century, from aughties through nineties.  For most of us, however, we can, if things go well, use our birth decade as a rough guide.  I’m not likely to make it through the fifties, but it isn’t impossible.  If I do I guess I’ll need to upgrade my WordPress account because my daily posts will have used up my allotted memory by then.  In the meantime, I do need to buy some new bandages for the time in between.  When I do I’ll put them in a simple plastic box, and I will remember the gift of time I shared with my mother.


Verb Choice

I can’t remember who started it.  Somehow, though, when I watch movies on Amazon Prime, the closed captioning kicks in.  I generally don’t mind this too much since some dialogue is whispered or indistinct.  I also presume some kind of AI does it and it makes mistakes.  That’s not my concern today, however.  Today it’s word choice.  Humans of a certain stripe are good at picking the correct verb for an action.  I’ve been noticing that the closed captions often select the wrong word and it distracts me from the movie.  (Plus, they include some diegetic sounds but not others, and I wonder why.)  For example, when a character snorts (we’re all human, we know what that is), AI often selects “scoffs.”  Sometimes snorting is scoffing, but often it’s not.  Maybe it’s good the robots don’t pick up on the subtle cues.

This isn’t just an AI problem—I first noticed it a long time ago.  When our daughter was young we used to get those Disney movie summary books with an accompanying cassette tape (I said it was a long time ago) that would read the story.  Besides ruining a few movies for me, I sometimes found the verb choices wrong.  For example, in Oliver (which I saw only once), the narrator at one point boldly proclaims that “Fagin strode into the room.”  Fagin did not stride.  A stride is not the same thing as a shuffle, or a slump.  Words have connotations.  They’re easily found in a dictionary.  Why do those who produce such things not check whether their word choice accurately describes the action?

So when I’m watching my weekend afternoon movies, I want the correct word to appear in the closed captioning.  Since the nouns generally occur in the dialogue itself, it’s the verbs that often appear off.  Another favorite AI term is “mock.”  Does a computer know when it’s being mocked?  Can it tell the scoff in my keystrokes?  Does it have any feelings so as to care?  AI may be here to stay, but human it is not.  I’ve always resented it a bit when some scientists have claimed our brains are nothing but computers.  We’re more visceral than that.  We evolved naturally (organically) and had to earn the leisure to sit and make words.  Then we made them fine.  So fine that we called them belles lettres.  They can be replicated by machines, but they can’t be felt by them.  And I have to admit that a well-placed snort can work wonders on a dreary day.


A Different Zone

I haven’t read Stephen King’s The Dead Zone yet, but it’s on my list.  That’s why I was a little reluctant to watch the movie.  It was free on Amazon Prime, however, and I reasoned to myself that I’d seen The Shining and Carrie before reading the books.  Indeed, my earliest introduction to Stephen King was through movies.  (Well, I did read one of his short stories in high school, but the novel side of things came later.)  When the opening credits revealed it was directed by David Cronenberg I wondered what I was in for.  I didn’t know the story, but I hadn’t heard of this as a Cronenberg body horror spectacle either.  It was quite cold outside and I was nodding off, so why not.

The thing is, it’s not always listed as horror.  That’s a faulty genre designation, as is sci-fi.  There’s one futuristic scene in the movie and it lasts for just over a minute.  Does that make it sci-fi?  Also, I realized, it deals with clairvoyance, and for similar reasons The X-Files is also listed as science fiction.  Paranormal, it seems, is permanently ruled out of the realm of possibility by assigning it an improbable genre.  Well, back to the zone.  I figure the title will be better explained by King, but there is a brief scene explaining what a dead zone is.  The story follows Johnny Smith, a schoolteacher who becomes clairvoyant, although it manifests itself only after a car accident and a coma.  The main purpose of this, at least through the movie lens, is to prevent a Trump-like populist from being elected president.  That is the horror part, I guess.  And it’s becoming clear to me that writers have been warning about these things since the seventies.

Unlike many of my weekend movies, I’d actually heard of The Dead Zone before.  There are some horror tropes present.  It begins with Edgar Allan Poe’s “The Raven,” and has a few other horror references tossed in.  Still, it’s a very human story.  The movie probes the difficulties of a life with special abilities.  Johnny never gets over the woman he was going to marry before his coma, and he feels for those whose futures he sees.  The movie is fairly slow-paced and it drops a few threads that are, again, likely picked up in the novel.  In the book-or-movie debate I generally go for book first, but that often leads to disappointment on the silver screen.  Maybe this was the right order to go this time around.  Once I read the novel I guess I’ll know.  Or at least have an opinion.


Using Brains

I’m old enough to know better.  Here’s a thought.  I recently saw a headline that suggested human brains filter out things like ESP because brains evolved to help us survive.  No matter what you believe about ESP, the idea got me to thinking.  We often act as if our brains are able to determine the Truth (that capital is intentional).  At the same time we don’t understand what consciousness is.  We know that other animals have brains and that said organ evolved to help individuals survive long enough to reproduce.  Some animal species end their existence at that point, but others linger on to wonder.  And I’m wondering if our brains are filters.  Stick with me here: we know that there are stimuli that we can’t perceive that other brains can.  For example, it seems that migrating birds can perceive magnetic fields.  Even if they can’t, there are magnetic fields that we perceive only through their effects on objects.  Our brains have no direct access.

Image credit: Andreas Vesalius’ Fabrica, showing the Base Of The Brain, by user Ancheta Wis

Here’s where it gets spooky.  If our brains filter out things that may hamper us in survival, what if they overzealously teach us not to perceive things that actually exist?  We’re somewhat limited by our “five” senses, no doubt.  We get along okay.  But what of those people who see things that others don’t?  We tend to medicate them or lock them away, but what if their brains have learned how to shut off part of the filter?  Having written a book about demons, I naturally think of them as a test case.  Or, if you prefer, ghosts.  We tell our children these things aren’t real.  Trust the filter.  Get on with life in “the real world,” right, Cypher?

I didn’t have time to read the article, but I’d experienced a perspective shift.  If our brains are all about gathering information (and in part they clearly are), that’s one thing.  If they are actively filtering things out, well, that’s quite another.  We laud the imagination of children until they become “old enough to know better.”  Do we teach them to shut out what they can actually see, or sense, in order to accept the inevitable, material, adult world?  This idea has startling implications.  As we plunge ahead inventing AI to do our thinking for us, perhaps we’ve left something even more fundamental behind.  Have we lost interest in the Truth?  We may not be able to access it directly, but I wonder if we’re taught to give up without even trying.