Critical Snow

No two snowflakes, I’ve always been told, are the same.  Far be it from me to question the collective wisdom of our species, but I wonder how this fact is ever confirmed.  I suppose I’ve personally swallowed a good deal of the evidence over time.  Snowflakes melt and we can’t get them all under the microscope, can we?  This has been a winter of more snow than usual around here.  During our most recent storm I stared out the window and tried to count.  Billions of snowflakes collected in my yard alone, and no microscope-bearing statistician was anywhere to be seen.  I like the idea of each flake being unique, but I know it’s a claim impossible to verify, and I wonder if it’s accurate.

I’ve been thinking a lot about critical thinking.  At its base, critical thinking is about asking questions and learning reputable places to find answers.  Not “fake news” or “alternative facts”—these are tools in the Devil’s workbox—but evidence-based information.  Primary education, it seems, is about learning to read, and write, and handle numbers.  It is about learning who we are and who we’ve been.  About the way that science helps us understand this old world.  Higher education, as it’s generally conceived, used to be about learning critical thinking.  That was before colleges became mere trade schools, catering mainly to careers with high earning potential so that alumni would give more money back to the college.  Where will we learn critical thinking?  No two are the same, right?

Without it, knowledge and hearsay become very similar things.  I used to tell my students not to take my word for it.  Just because I can legitimately put the word “doctor” in front of my name doesn’t mean I know everything.  Yes, I am an expert, but even experts aren’t exempt from the test.  So, as more snow starts to fall, I think about all the many, many places I’ve heard that no two flakes are the same.  I think of the astronomical number of snowflakes that have fallen this year alone.  The number of years snow fell before we ever evolved on this planet.  In ice ages and even during human-initiated global warming.  And I realize nobody’s done the actual work of comparing every single snowflake to every other one.  Tradition is like that accumulating snow, building on past layers until great glaciers form.  And who, I wonder, would argue with a glacier?


Plants Will Lead

The world just keeps getting weirder.  Although I very much appreciate—“believe in,” if you will—science, sometimes the technology aspect of STEM leaves me scratching my primate cranium.  What’s got the fingers going this morning is spinach.  Not just any spinach.  According to a story on Euronews, “Scientists Have Taught Spinach to Send Emails.”  There are not a few Homo sapiens, it seems, who might learn something from our leafy greens.  The tech comes, not surprisingly, from MIT.  When spinach roots detect certain compounds left by landmines in the soil, sensors are triggered that send an email alert to a human being who’s probably eaten some of their (the spinach’s) very family members.  I’m not denying that this is very impressive, but it raises once again that troubling question of consciousness and our botanical cousins.

Some people live to eat.  I’m one of those who falls into the other category—those who eat to live.  In my life I’ve gone from being a picky omnivore to being a somewhat adventurous omnivore to vegetarian to vegan.  I’m not sure how much more restricted I can make my diet if I leave out plants.  I’ve watched those time-lapse videos of trees moving.  They move even more slowly than I do when my back’s acting up, but they really do move.  If they had legs and sped up a bit we’d call it walking.  Studies into plant consciousness are finding new evidence that our brainless greens are remarkably intelligent.  Perhaps some could have made a better president than 45.  I wonder if spinach can tweet.

People can be endlessly inventive.  Our thirst for information is never quenched.  Universities are among those rare places where ideas can be pursued and it can be considered work.  While I don’t think everyone necessarily needs to go on to higher education, I can see the benefits it would have for a culture.  Indeed, would we have armed mobs trying to take over because of a fact-based election loss?  I wonder if the spinach would take part in “stopping the steal.”  Hopefully it would fact-check more than those who simply follow the leader.  Consciousness and education can work together for a powerful good.  I’m not sure why Popeye’s favorite was chosen for this experiment, but it does seem to show that we can all get along if we really want to.  Maybe then we could meet in the salad aisle rather than out in the field looking for explosives.


Keep at It

Photo credit: ESA & MPS for OSIRIS Team MPS/UPD/LAM/IAA/RSSD/INTA/UPM/DASP/IDA, CC BY-SA 3.0 IGO, via Wikimedia Commons

Perhaps it’s an indication of just how sick the United States has been for four years—waking up each day wondering what new crisis Trump had put us into—that I heard nothing about our next Mars visit.  I’m normally quite interested in space exploration.  I seriously considered astronomy for a career, until I found out it’s mostly math.  In any case, I’ve watched our planetary explorations quite closely.  Yesterday, until just about five minutes before the landing of Perseverance on the surface of the Red Planet (Earth is supposedly the Blue Planet), I knew nothing of the mission.  When my family alerted me to NASA’s live feed of the event I tuned in for those five minutes to watch as we safely landed our fifth rover on our neighboring world.

It’s funny how a self-absorbed person can take a whole nation down with himself.  It was a relief to look outside for a while, and to wonder.  I remember when the rovers Spirit and Curiosity landed.  The advance of technology was evident in yesterday’s deployment.  No more bubble-wrap was necessary.  The landing system was incredibly elegant, and if there are any Martians I’m sure there were several UFO reports yesterday afternoon.  As the NASA commentator explained what was going on, I wondered just how life might be on the Blue Planet if we were able to put all our tech to work for peace and the betterment of all.  Instead I find a Senate only too willing to acquit a traitor so we can continue the hate.

Emotion is a funny and unpredictable thing.  Although I knew nothing of Perseverance until five minutes before touchdown, I was immediately drawn into the feeling of the moment.  My eyes weren’t exactly dry as I watched the cheers of jubilation from those masked engineers in the control room.  This had been the culmination of years of hard work, and yes, math.  They were able to calculate fall rates and counter-forces, landing spots and trajectories.  And all of this from about 140 million miles away.  Perseverance was launched back in July—you can’t get there overnight—when we were still reeling down here from the overt evil of white supremacists.  Stoked by a man who would be king.  Leader of the Red States.  Would-be ruler of the Red Planet.  How I wish our technology could help us on our own planet.  Any probes landed here from elsewhere must, I suspect, not believe their mechanical eyes.
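A quick back-of-the-envelope calculation (taking that rough figure of 140 million miles and assuming radio signals travel at light speed, about 186,000 miles per second) shows why none of this could be steered from Earth in real time:

\[ t = \frac{d}{c} \approx \frac{1.4 \times 10^{8}\ \text{mi}}{1.86 \times 10^{5}\ \text{mi/s}} \approx 750\ \text{s} \approx 12.5\ \text{minutes, one way.} \]

By the time news of any trouble reached mission control, the landing would already have succeeded or failed.  Every fall rate and counter-force had to be computed in advance.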


Too Fast

In the Easy Reader book Hooray for Henry (available on Amazon for $768.57; that’s $12.60 per page), our eponymous protagonist Henry can’t win any of the events at the picnic games.  One of the refrains as he participates in the races is “faster, faster—too fast” (I may have got the punctuation wrong, but then I haven’t read the book for at least a couple of decades and I can’t afford a new one).  That story seems to have become a symbol for those of us mired in technology.  The rate of change is, as in Henry’s experience, too fast.  The other day I noticed an annoying warning on my laptop that claims I’m low on memory and have to close some applications.  What with all that tech requires of us these days, I probably do have too many things open at once.  It pops up, however, when I have even just one application open.

A web search revealed this is probably a virus (something that used to be rare on Macs, but that was back in the day when things moved a little slower).  The steps for removing it were technical and appeared to be extremely time-consuming.  What I don’t have is time.  And it’s not just my rare time off work that’s too full.  On the job we’re constantly having to learn new software.  It doesn’t really matter what your line of work is: if it involves sitting behind a computer, you’re constantly being told to learn new applications while trying to find time to do the job you’re paid to do.  There’s no question of which is the tail and which is the dog here.  With an economy driven largely by tech, because that’s where all the jobs are, you risk everything if you don’t upgrade (about every two weeks at present).

I’ve been writing a long time.  Decades.  Some of my earlier pieces are no longer openable because the software with which I wrote them has been upgraded to the point that it can’t read its own earlier writing.  To the prolific this presents a real problem.  I have, literally, thousands of pieces of writing.  I can’t upgrade every single one each time a new release comes out.  The older ones, it seems, are lost forever.  I used to print out every post on this blog.  Given that there are now thousands of them, I eventually gave up.  I know that they will inevitably disappear into the fog some day.  For writers who’ve been discovered after their deaths this would be a Bradburian fate.  Or perhaps a Serlingesque twist.  The world realizes a writer had something important to say, but her or his writing can no longer be read because the tech is outdated.  Faster, faster—too fast.


Many Moons

Scientists, whose base matrix is often bound up with the local religion, are frequently interested in myth.  And sometimes religion too.  This is no surprise.  Many of us go into religious studies because of its influence on our lives, and scientists, who measure and analyze material realities, must be curious when their results challenge some religious or mythic assumptions.  So it is that Ernest Naylor addresses mythic beliefs about the moon’s influence on animals and what scientific findings actually show.  Although this book wasn’t exactly what I thought it would be, Moonstruck: How Lunar Cycles Affect Life does address the subtitle’s assertion quite directly.  Naylor, a marine zoologist, knows about tides—caused by the moon—and their effects on marine organisms.  That connection is the main focus of the book, with occasional forays onto dry land.

What caught my attention right away was that when discussing myth and religious ideas, Naylor describes two stories as biblical: the woodcutter banished for gathering on the Sabbath and Judas’ banishment.  Both of these, he seems to believe, have the Bible banishing the criminals to the moon.  That was news to me.  There may well be folklore with such associations, but a simple opening of the covers of the Good Book would dispel this particular “myth.”  Neither the sabbath wood-gatherer nor Judas was banished to the moon after his death.  The former presumably went to Sheol and the latter presumably to Hell.  For me this illustrates yet again how many ideas professional people outside the guild suppose to be “biblical.”  The Bible says very little about the moon.  One New Testament demoniac is described as “moonstruck,” but beyond that the occasional references are mainly just to the moon qua moon.

The Bible’s a big book.  Everyone in western society knows it’s an important book, but few read it.  Even fewer deeply engage with it to understand its original context and message.  We hear stuff and we’re told it’s in there, and we believe it.  I first noticed this in high school.  Classmates would tell me “the Bible says…” (you can fill in the blank with just about anything; this isn’t a quiz).  Almost always they were wrong.  By that point I’d read the Good Book many times cover-to-cover.  I owned concordances and knew when foreign matter was introduced.  The thing about the Bible is that it’s fairly simple to look things up.  Moonstruck focuses on marine animals and traces interesting connections to the moon.  It has a chapter on humans and the moon, finding little direct biological influence.  It’s an informative book, just don’t use it to verify what’s in the Bible.


Growing in Intent

Balance has become a desideratum.  Ours is an age of extremism.  Black and white instead of shades of gray.  One of the unnecessary polarizations is that between science and religion.  Part of the problem, it seems to me, is the labels we insist on using.  Science is shorthand for evidence-based research—it is a way of understanding the physical world.  It doesn’t necessarily discount a spiritual world but its methods can’t engage that world, if it exists.  Religion is a poorly defined word, often one of those “you know it when you see it” kinds of phenomena.  Often it is characterized by blind adherence, but that isn’t necessarily what religion is either.  To me, balance between the two is an authentic way to engage the world and other human beings.

Take plants, for instance.  And take consciousness.  While consciousness isn’t always associated with religion, it is one of those things that falls outside the ability of science to measure or quantify.  We don’t really know what it is, but we know we have it.  We know some animals have it, but rather arrogantly assert it is only the “higher” animals, as if we comprehend the hierarchy of nature in its entirety.  We dismiss the idea of plant consciousness.  For many years I’ve been pondering intent.  Without it no life would be possible, from sperm fertilizing egg to heliotropes following the sun.  There’s some kind of intent there.  Will.  Recently The Guardian ran an article about scientifically measured intent in bean plants.  Although many have been left scratching their heads, or pods, over it, to me it makes perfect sense.

I planted an apple seed a few months back.  It finally sprouted in late December.  I carefully watered it, and put it by a south window to get sunlight.  It grew quickly for a few days and then began to wilt.  I watched helplessly as it gave up the will to live.  I’m no botanist, but I suspected it was the coldness of being set on a windowsill.  (Ours isn’t the best insulated house.)  December had been mild, and it sprouted.  January took a sudden shift to chill, and I realized that new plants outdoors wouldn’t sprout in winter.  The seed had germinated, but the plant had no will to survive in temperatures chillier than its genes told it might be safe.  I’m not a scientist, but I observed this scenario carefully.  Is it possible that French bean plants show intent?  I think it would be more difficult to explain if they did not.


Look, New…

You may’ve noticed a new look to my website.  That isn’t intentional.  I woke up Friday only to learn that WordPress (which used to be friendly to individual bloggers) decided to change at least one of the few templates they allow paying customers to use (if I upgrade even more to “business class” I have lots more options).  One of those templates happened to be the one I’d labored over, sacrificing an entire weekend about a year ago to get it just how I liked it.  Now, I’m a Neo-Luddite.  Behind the scenes my daughter and one of my nieces have helped me with technical aspects of this blog from the very beginning.  Several years ago I reached capacity for the free service, where, understandably, templates are limited.  Now I pay for both the domain name and the privilege of hosting it on WordPress.  But they like to limit privileges to try to force you to upgrade.  What would Amos say?

A few weeks back my iPhone began to lose its charge at an alarming rate.  I’d unplug it and, though it was doing nothing but occasionally checking for non-existent texts, it would be red-lining a couple of hours later.  I feared I might need to get it serviced.  This went on for several weeks.  It occurred to me that Christmas was approaching and Apple has been known to slow down devices in order to encourage you to buy a new one.  Upgrade!  Everybody’s doing it!  Well, I don’t make enough money to constantly upgrade, so I kept my phone plugged in all the time when I was home (which, during a pandemic, is pretty much all the time).  Then, a few days after Christmas, when it was clear I wasn’t buying a new one, the battery began to hold its charge again.

The tech industry has us in a stranglehold.  As soon as you purchase that first laptop, tablet, phone, or smart-watch, you’re an indentured servant to upgrades.  So I went to WordPress’s template library and tried to find something that didn’t look too bad with the images and “feel” I’m going for here.  Almost as if they’d chosen an algorithm that made available only the handful of templates that worked worst with what I’m trying to do on this website, I found the selection extremely limited.  If I upgrade to “business class” (which I will need to do when my current, not cheap, “service level” reaches capacity) I will have a plethora of choices.  Until they add a new service level above that, that is.  Then I’ll need to upgrade yet again to unlock all the neat features they “offer.”  Thanks, WordPress.  I’ve been with you over ten years now and I have to ask, is that the way you treat a long-term, paying friend?

Remember this?


Stay Curious

Needle felting.  I’d never heard of it.  I’d got along some five-plus decades without knowing a thing about it.  My daughter received a needle felting kit as a Christmas gift and, being the kind of person I am, I had to research the history of felt.  I always knew felt was different from other fabrics, but I couldn’t say precisely how.  I came to learn it is perhaps the oldest textile in the world, known to the Sumerians.  Felting is a process for making non-woven cloth.  The natural fibers of some wools are scaled, like human hair, and when compressed and worked with moisture (wet felting), they become cloth.  Finding out how things work is one of the great joys of life.  It also made me think again of how anyone could possibly be arrogant.

The longer I’m alive the more I’m learning what I don’t know.  Granted, felt has appeared in my life at numerous junctures—how many crafts do kids make of felt?  And I have a felt hat—but I had never thought much about it.  My wife likes to read about pioneer women who had to make pretty much everything by hand.  We call such people “rusticated” these days, but they knew far more than most urbanites, simply by dint of having to do things for themselves.  Modern conveniences are great, but I often wonder how many of us might survive if we had to make it on our own.  Just in the last couple of weeks we worried about losing power with the storms that blew through.  What do you do when the thermostat no longer works in winter?  Something as simple as that vexed me for days (I had to work rather than worry, so I couldn’t give the problem my full brain power).

I’ve known many people impressed with their own knowledge.  I can’t imagine how actually learning new things doesn’t make someone humble.  The universe is a vast and mostly uncharted space.  Down here on our somewhat small planet we have so much yet to learn.  I’ve studied the beginnings of agriculture, metallurgy, writing, and religion.  There’s still so much I don’t know.  I wouldn’t do well on Jeopardy—I second-guess myself too much.  Staying curious about the world is a good way, it seems, to keep humble.  I entered this holiday season thinking I knew a fair bit about various crafting options.  As a family we cover the creative spectrum fairly well.  Then a small, soft thing such as felt made me realize just how little I really understand.  Any invitation to learn is one that should be accepted.


Ghost Stories

Those of us who confess to watching horror are fond of noting that the Christmas season has long been associated with ghost stories.  Charles Dickens wasn’t the first to make use of the trope, and he certainly wasn’t the last.  After reading about elevated horror movies, I decided to watch A Ghost Story (David Lowery, 2017).  Many wouldn’t classify the film as horror at all.  It is quiet, slow-paced, and has no gore.  It is nevertheless a haunting film.  I suspect its poignancy comes from a situation we can all imagine and which many people face in life—being left alone after the death of a loved one.  The idea that the dead never really leave us can be both comforting and unnerving at the same time.  The film plays to those strengths.

The premise of the film is simple: the ghost of one of a couple finds his way home and tries to reconnect with his widow.  He ends up staying there until, many owners later, the house is demolished and a high-rise is built in its place.  It’s essentially a story told from the point of view of the ghost.  There isn’t much dialogue, but one significant monologue comes when a party is being held.  One of the partygoers, or perhaps the current owner of the house, explains that because of what we know of physics everything on our planet will eventually be destroyed.  His beer-fueled lament is that whatever we do is therefore in vain.  He brings God into the discussion.  The ghost listens intently, but seems to disagree with his conclusions.  For someone like me the introduction of religion into the story is a Venus flytrap, since religion and horror can’t seem to keep away from each other.

Death is a dilemma, a point that I made in a recent Horror Homeroom article on Pet Sematary.  Horror, like religion, demands that we confront it.  Science can only offer cold comfort regarding the cessation of life.  Religion (and horror) opens the dialogue into the unknown, the realm into which mere human instruments cannot reach.  Sad and reflective, A Ghost Story hits on an essential question in the nexus of religion and science.  If a spiritual world exists, there may be some survival even of the earth’s eventual destruction.  As time passes, the titular ghost continues to learn.  Life is a learning experience, and although many modern forms of religion join in the cultural denial of death, horror is always ready to remind us that confronting it may be the wisest course of action.  Ask the ghost.  He knows.


Conflict Management

Conflict has come to dominate the twenty-first century in an unhealthy way.  No longer do religions, political parties, or even scholars of different disciplines want to try to see things from somebody else’s point of view.  Such “I’m rightism” is distressing, given that the greatest minds in history always left some room for doubt.  Einstein tried not to say too much about God, but his occasional references left some space for admitting he just didn’t know.  He was following closely in the footsteps of Sir Isaac Newton, who, ironically and iconically, stands as one of the founding fathers of empiricism.  I say “ironically” because his real driving interests, as became clear only after his death, were religious.  Given the science-and-religion conflict paradigm, it took a long time for many to admit that Isaac Newton was fascinated by religion.

A story in The Guardian recently noted that Newton’s unpublished notes on pyramidology have gone up for auction.  These papers are even further indications of just how much religion mattered in the mind of the man who gave us a clockmaker God who wound up the universe and left it to run according to scientific principles.  My wish isn’t to cast any aspersions on Newton.  No, quite the opposite.  I wonder if we mightn’t use his wide-ranging interests to raise a relevant question: why do we see religion as unworthy of attention while science, because it can be “proven,” is all we really need?  Especially since scientifically based hypotheses about the origins of religion tell us that human beings need it.

Admittedly Newton was just as human as the rest of us.  Perhaps far more intelligent than most, but still human.  The humanities are the part of the human curriculum that has been under pressure for many years at “universities.”  As business interests and money have taken on larger and larger roles in how schools conceptualize themselves, the humanities—which don’t make money—are undervalued and cut.  Capitalism takes no prisoners.  Education that has bought into that paradigm is bound to overlook certain facts.  Newton’s “arcane” interests were well hidden for a couple of centuries because who wants to think of the great rationalist as beholden to such a paltry thing as religion?  We’d rather keep our eyes firmly closed.  A conflict paradigm seems the better way to eradicate this troubling, so very human, aspect of even geniuses.  As long as there’s money to be made conflict will be the reigning model.


Truly Exceptional?

Exceptionalism seems to be in the air these days.  Most recently it’s become a plank in the Republican platform—America is God’s own chosen nation (despite what the Bible actually says).  It’s also been a trait of nearly all human endeavors.  Human exceptionalism, that is.  The idea, whether admitted or not, is based on the Bible.  Even those bespectacled scientists who make no time for religion insist that humans are different from other animals.  Why?  The Bible tells them so.  Evolution certainly doesn’t.  And so we go about thinking how superior we are to other lifeforms.  And not only that, but to other humans in other geographical locations.  It seems Homo sapiens sapiens could use an ego check every now and again.

Not only does our sense of superiority go downward over the animals, it also reaches to the very boundaries of this vast and expanding universe.  We are alone, some scientists declare.  The only intelligent life in a universe far beyond the ability of the human brain to comprehend.  There can’t be any alien visitations with (laughably) superior beings crawling out of their flying saucers.  No, we were the best that evolution could do.  And we elected Donald Trump to be our president four years ago.  What’s that about an ego check?  Especially since we’ve learned that there is water on the moon.  Almost certainly there was once liquid water on Mars.  There may even be traces of life in the atmosphere of Venus (although the earthly jury is still out on that one).  Only humans can make that declaration.

Photo credit: NASA

I have to wonder at this arrogance that comes along with consciousness.  Do we believe we’re the best simply because we learned to apply the laws of rationality to our gray matter?  Back when I was a seminarian the word “pantheism” was rather like a swear word.  To suggest a universal connectivity (literally) was an offense against the deity portrayed in the Bible.  (I would hope that a God that big would encourage us to understand the implications of a universe so large.)  We humans have our good points, of course.  I love people and their foibles.  Were we not so dangerous we might even look cute in the cosmic eyes above, as well as the inferior eyes of our pets.  Exceptionalism, it seems to me, ought to be the dirty word.  It seems far more human and humane to throw the gates open wide and consider the possibilities.  I love people, but if we’re the best there is, the universe is in serious trouble.


Internet Nowhere

So I wake up early.  I’ve been trying for years now to learn to sleep in a bit.  Somehow my body got to thinking the outrageous commute schedule to New York City was normal and I can’t convince it otherwise.  That means my most productive time comes before others awake.  It also seems to be the time favored by internet service providers to take their systems offline for a while.  You see, like any system the internet needs downtime.  I slept in until 3:30 this morning and awoke to find internet access unavailable.  I use it during my writing, looking up answers to questions which both my fiction and nonfiction raise.  When the internet’s out there’s little I can do, but I’m already awake.  Society prefers conformists, but some of us march to a different beat.

The fact is we expect constant connectivity.  Many of us pay a significant monthly amount to ensure that we have it, but this is no guarantee.  Calling your local service provider at 4 a.m. on a Saturday (I’ve done this) is like dealing with IT at work: they really have no clue what’s wrong but they can talk technical to you, if that makes you feel good.  After all, it is the middle of the night.  So I try to decide on something else to do.  Reading works.  Books, however, often lead me to want to look something up.  But the internet’s down, at least around here.  We are utterly beholden to the tech industry that can (and does) wink out from time to time.  When the robot uprising occurs we just need to wait for the service maintenance hour.

I reboot my router.  It’s the first course of action when the internet’s out.  I think I’ll check out a personal hotspot, but to do that I need the internet.  It’s a great, constant feedback loop.  I suspect I’m not the only early riser who faces the internet dearth in the wee hours.  I know I’m overpaying because my data (whatever that is) plan on my phone always shows a monthly surplus.  When it comes to the techies, you just nod your head and pay your bill.  I do wonder what’s happening in the wider world.  Without the net you feel especially isolated in pandemic times.  It’s Saturday morning and the internet’s unavailable.  Back in my teaching days I’d have known just what to do with the time.  Instead I’m waiting for technology to catch up.


Anticipation

My work computer was recently upgraded.  I, for one, am quickly tiring of uppity software assuming it knows what I need it to do.  This is most evident in Microsoft products, such as Excel, which no longer shows the toolbar unless you click it every single time you want to use it (which is constantly), and Word, which hides tracked changes unless you tell it not to.  Hello?  Why do you track changes if you don’t want to see what’s been changed when you finish?  The one positive thing I’ve noticed is that now when you highlight a file name in “File Explorer” and press the forward arrow key it actually goes to the end of the title rather than just one letter back from the start.  Another goodie is when you go to select an attachment and Outlook assumes you want to send a file you’ve just been working on—good for you!

The main concern I have, however, is that algorithms are now trying to anticipate what we want.  They already track our browsing interests (I once accidentally clicked on a well-timed pop-up ad for a device for artfully trimming certain private hairs—my aim isn’t so good any more, which would belie the usefulness of said instrument—only to find the internet supposing I preferred the shaved look.  I have an old-growth beard on my face and haven’t shaved in over three decades, and that’s not likely to change, no matter how many ads I get).  Now they presume to know what we want.  Granted, “editor” is seldom a job listed on drop-down menus when you have to pick a title for some faceless source of money or services, but it is a job.  And lots of us do it.  Our software, however, is unaware of what editors need.  It’s not shaving.

In the grip of the pandemic, we’re relying on technology orders of magnitude more than before.  Even before that my current job, which used to be done with pen and paper and typewriter, was fully electronic.  One of the reasons that remote working made sense to me was that I didn’t need to go into the office to do what I do.  Other than looking up the odd physical contract I had no reason to spend three hours a day getting to and from New York.  I think of impatient authors and want to remind them that during my lifetime book publishing used to require physical manuscripts sent through civilian mail systems (as did my first book).  My first book also included some hand-drawn cuneiform because type didn’t exist for the letters at that particular publisher.  They had no way, it turns out, to anticipate what I wanted it to look like.  That, it seems, is a more honest way for work to be done.


Wild God

Living with a Wild God, by Barbara Ehrenreich, is one of those books I wanted to put down gently after reading it, for fear that it might explode.  Or maybe it was my head I feared might combust.  Describing it is difficult because it is so wide-ranging.  On the one hand it is an atheist’s view of religion.  On the other hand it is a spiritual biography.  On a third hand it is coming to terms with having had a profound mystical experience.  It is one of those books where, although I know my life has been so very different, I feel that Ehrenreich and I have had so much in common that we’d be friends if we ever met.  It is also the work of a woman who is scary smart and whose teenage thoughts were so intense that my own seem puerile by comparison.

But that mystical experience!  I’ve had many of them in my life, but I don’t know you well enough to share them here.  They’ve been recorded in an unfinished book that I may or may not try to publish some day.  (Ehrenreich was smart and took a job as a journalist, which means others assume you know how to write.  Even those of us in publishing have trouble convincing agents and others who hold the keys to non-academic pricing that we understand the craft.)  Mysticism as a discipline quickly becomes staid, not at all like the life-directing encounters themselves actually are.  It’s difficult to explain without sitting down and talking to you.  It’s something academics tend to avoid like Covid-19.

The books that mean most to me are like conversations with an absent author.  Drawn in by an openness, or perhaps by the fact that we’ve lived in a few of the same places over the years, perhaps passed one another unknowingly on the street, you feel that they’ve invited you into their very head.  What you find there has a strange similarity to what is in your own head, while being completely different at the same time.  We should all strive for such honesty in our writing.  In the end Ehrenreich, with a doctorate in science, suggests we need to be open.  That kind of validation is important for those of us who’ve poured our lives into the study of religion.  She was drawn in from atheism, and I have been trying to escape from literalism all my adult life.  We have ended up in places not dissimilar from each other and I’m glad to have met her through this profound book.


The End of Snow Days

It’s a chilling thought.  An article in the New York Times said it, but we were all thinking it.  Snow days may well have become another victim of Covid-19.  No, it’s not snowing yet (but give climate change a chance!), but New York City schools have figured out that if students can learn from home then one of the truly treasured memories of our youth may no longer be necessary.  In fact, snow days ended for me when I began working remotely.  My supervisor had suggested, even before that, that I take my company laptop home daily, in case of inclement weather.  The idea of awaking, wonder-eyed, to a world covered in white—that cozy feeling of knowing you had no obligations for the day but to enjoy the pristine world out your window—is a thing of the past.

Technology has changed our lives, and some of it is even for the better.  It hasn’t made work easier for some of us, but it has made the workday longer.  We used to talk about kids and their continuous partial attention, but now work is always at home with you and the timestamp on your email says something about your work habits.  As the days are now shorter than the nights, as they will be for six more months, finding the time to do what you must outdoors (it may be cooler, but lawns still insist on growing) is always a bit more of a challenge.  And when the snow does fall you’ll still have to shovel the walk.  All time has become company time for a truly linked-in world.

The real victim here, it seems to me, is childhood.  Snow days were a reminder that no matter how strict, how Calvinistic our administrators wanted to be, the weather could still give us a smile now and then.  A legitimate excuse not to have to go to school and, if parents couldn’t get you to daycare, a day off for everyone.  The limited number of holidays allotted by HR had no power in those days.  Although we all know that well-rested, happy workers tend to do better jobs than those who are constantly stressed out and who have trouble sleeping, we’ve now got the means to make the sameness of pandemic life the ennui of everyday life, in saecula saeculorum.  Thanks, internet.  At least now we work where we have a window and can look out on nature to see what we’re missing.