Quality Education

Perhaps it’s from having a stubbornly blue collar, but snobbery has never appealed to me.  While in seminary at Boston University, I applied for a transfer to Harvard Divinity School.  In spite of being accepted, I stayed at my alma mater and paid the price.  There’s a strange loyalty among the working class, you see.  And now I’m finally seeing my former mistress, academia, taking a turn toward the lowly but worthy.  The title of a recent article in the Chronicle of Higher Education says it all: “As Scholars Are Driven to Less Prestigious Journals, New Measures of Quality Emerge.”  Hmm, why might that be?  The industry mantra, “publish or perish,” has grown more aggressive over the years while the number of publishers has decreased.  Your academic net worth, it seems, can no longer be based on how elite you are.

People are funny that way.  We’re very impressed by those paraded before us as successes—as if some kind of magic clings to those who are where we wish we were.  In academia, where you went to school matters more than what you’ve proven yourself capable of.  If you attended the “best” schools, your work will be accepted by the “best” journals and publishers.  What rarefied company you’ll keep!  For the rest of us, well, we have the numbers.  And blue collars aren’t afraid of hard work.  Let the academic aristocracy enjoy its laurels.  Laurels are poisonous, however, for those with an eye open for parables.

Primates, according to those who know them best, can see through pretense.  I often wonder if our political chaos isn’t based on this simple fact of biology.  As a priest I knew once told me, “We put our pants on one leg at a time too.”  This didn’t prevent many postulants I knew from anticipating the day when they would be ontologically transformed.  Priesting, I was informed, would make them better than the laity.  Closer to God.  Here it was, even among the clergy—the desire for prestige.  Chimpanzees will take down an alpha who abuses his power.  Nature has a set of balances.  Tampering with them leads to, well, scholars being driven to less prestigious journals and the like.  The net result, as the Chronicle suggests (if read one way), is that the last shall be first and the first last.  Probably it’s the result of reading too much Bible in my formative years, but I’ve always appreciated parables.

Caveat Emptor

When you work in academic publishing, various higher education news sources find you.  Unable to distinguish faculty from the industry professionals who rely on them for their by-products, these sites often offer friendly advice on how to succeed in academia.  Having had not a little experience in that arena (if you’ll pardon my litotes), I noted a recent headline before clicking the delete button.  I can’t reconstruct it word-for-word, but the gist of it was that if you wanted to earn more as an academic, you should study overseas.  Your salary, the article implied, would be higher if you did.  Now I recognize that things constantly change, but in my field of study if you want to get any job at all, let alone a well-paying one, you study domestically.  Specifically at Harvard.  Academics, just like publishers, rest on their laurels.

The funny thing about this headline is that it contained the same advice I received all the way back in the 1980s.  I followed up on it, choosing Edinburgh after having been accepted at Oxford, Cambridge, Aberdeen, and St. Andrews.  Only later did I learn that of those schools only Oxford opened the door to positions in my native United States, being, as it is, the Harvard of the United Kingdom.  Defying the odds, I did get a job that, when I became Academic Dean with access to industry stats, I discovered was among the lowest paying of its peers.  Studying overseas, in other words, had the exact opposite effect from the one the headline promised.  Perhaps things have changed in the intervening years.  Even today I have to remind people that Edinburgh is a world-class research university, one of the four ancient schools in the kingdom of the Scots.  Some of the most famous minds in human history studied there.  Ach, well, a job by any other name would smell of sweat.

Xenophobia isn’t unique to the GOP.  It exists in higher education too.  Academics are extremely tribal, and if you try to break in from the outside—no matter where you study—you’ll learn that your money might be spent more wisely learning a trade.  As a homeowner, I’ve discovered that just about any practical job that doesn’t require college pays better than what you can get with the detritus of a doctorate on your résumé.  In fact, during times when work was scarce I tried to hide it.  One of the skills I picked up in my educational journey was not to believe everything you read.  Problem is, you only pick that up after you’ve already paid that tuition bill.  The delete button is right there; don’t be afraid to use it.

Book Life

Like a book, life can be divided into chapters.  This is perhaps an instance of art following reality, or perhaps it’s the other way around.  The episodic nature of life suggests the chapter structure of books.  As I was waking up this morning (disappointingly before 4:00 a.m.) I was reflecting on the chapters of my life.  As with a book, the most recently read decade is perhaps freshest in one’s mind, but the decades do seem to fall roughly into a pattern.  We tend to think of that first decade—childhood—fondly, even if in reality it wasn’t all games and candy.  It’s biology’s way of encouraging us toward that weird teen chapter of puberty, with its intense emotions and maturing bodies.  That chapter is recalled, at least in my experience, as a turmoil involving both good and bad.

The twenties, in my book, were spent in higher education.  It was a cerebral chapter.  Finishing college and starting grad school.  Finishing a master’s and discovering how difficult employment is to find with a master’s degree.  In my book, marriage came in the twenties chapter, along with a doctorate.  The next chapter, the thirties, was spent entirely at Nashotah House.  That involved becoming a father as well as a professor.  The other faculty were fathers of a different sort.  I always thought chapters should show some continuity, but the forties chapter was that part of the book known as the crisis.  The upending of convention.  The self-reinvention.  The move.  I suppose in terms of a novel that was when it started to get really interesting, but from my perspective life had been plenty interesting by that point.

The fifties have been a bit more settled.  The publishing chapter.  The house-buying move added drama, of course, but otherwise the nine-to-five is like a mind-numbing drug.  Mine involved a commute that led to its own unpublished book, as well as two somewhat academic tomes.  All of this was going through my head the way thoughts do when you can’t force yourself back to sleep.  The paradigm suggests itself to someone who has, in one form or another, been writing for his entire life.  Or writing his life.  My first attempts at being a novelist began in chapter two.  On yellowed paper somewhere in the attic I still have that first handwritten attempt at literary expression.  The current chapter has me becoming a gruncle (with a nod to Gravity Falls fans) and wondering how a great-niece might read a book written like this.  If she will even have an interest.  That’s the way of books, as any librarian knows.  Maybe some warm milk and a cookie are indicated.

Life is a book.

New Horror

Now that Holy Horror is out I’ve been noticing an increasing number of scholars writing on the topic of monsters.  Book writing takes several years, as a rule, and when I began work on my contribution to the discussion the bibliography was a touch slim.  There weren’t many books out there, and academics who addressed the topic did so warily.  Now scarcely a day or two passes without my finding another book I should read on the topic.  Publishing may be an industry in crisis, but there’s no dearth of new books being produced.  Monsters—which define horror—are a means of coping with the realities of a world out of control.  Since 2016 many of us have felt a vague, if at times pointed, sense that something seriously threatening is out there.  Horror seems a logical response.

Academia tends to run behind trends rather than setting them.  Academic books in general don’t sell too well, and monsters often have crossover appeal.  The longer I’m at this, the more I think of how knowledge as a whole is gathered.  Having that shiny Ph.D. doesn’t do so much anymore when it comes to credibility.  It may get you in the publisher’s door, but to attract readers it helps to pick topics that scholars have typically avoided.  Monsters are a calculated risk in this regard.  Those who publish in the field become somewhat suspect among their colleagues, as if the subject is one that can only play itself out in naivety, an under-developed sense of sophistication.  Anything popular tends to be devalued in the academic mindset.  It is, therefore, encouraging to see others addressing my beloved monsters.

A new year is starting and, like many people, I have high hopes that it will show some improvement over the past.  I can actually dream of a world without monsters; although pleasant, the dream isn’t realistic.  We have evil with which we must deal.  Horror allows for a fair amount of practice in that regard.  I’m very well aware that many people find the topic repugnant, or at least distasteful.  Academics, it seems, are following their restless curiosities to the darker corners of the mind.  It’s getting difficult to keep up with the monster books appearing, even from reputable presses.  Holy Horror is my first contribution to the discussion, and Nightmares with the Bible, which I hope to finish this year, will continue the conversation.  It looks like it’s becoming trickier to find a voice in this crowd already.  I wonder if that implies a better 2019, as we run behind the times.

Book Birds

I just read an interesting article about how social media, and the internet in general, hijack our time.  If you’re reading this, no doubt you’ll agree.  Those of us who write books in our “free time” know that the way books are both found and sold is on the web.  Publishers encourage authors to build a social media platform, usually involving Twitter.  Academics are often hopeless at social media—they’re lousy at following back on Twitter, as I know from experience.  There is a kind of self-importance that comes with higher education which makes many of the professoriate assume the work of others is less important than their own.  It’s more blessed to be tweeted than to tweet others.  After all, such-and-such university has hired you, and that proves the value of what you have to say.

Head-banging tweeter

Book publishers, however, will be looking at how many followers you have.  Not that all of them will buy your book, but at least a number of them will know about it.  Curiosity, indeed, drives some sales.  Just like many academics, I’m jealous of my time.  I’m also conscious of that of others.  These blog posts seldom reach over 500 words.  I tweet only a couple of times a day, although I understand that’s not the way to get more followers.  You need to tweet like a bird, often with images or memes, but try explaining that to your boss when each tweet is time-stamped.  The academic is uniquely privileged to be given control of their time outside of class and committee meetings.  Tweet away.  That doesn’t mean they’ll follow you back.

The reason for tweeting is, of course, self-promotion.  45 may understand little, but he understands that.  You can commit treason and people will overlook it if you tweet persistently enough.  My own Twitter activity is like that of the birds after which the site is named; it begins before most people are awake.  And it, like this blog, is not designed to take up your time.  Since my tweeting during the work day is limited, my tweets are seldom picked up.  I try following other academics, but often they don’t follow back.  After all, what does a mere editor have to say that could possibly be of interest to the high-minded?  Alas, I fear my advanced studies of the Bible have become bird-feed.  And my forthcoming book won’t get noticed.  I only wish more colleagues would consider the adage: tweet others as you would like to be tweeted.

You’re History

A story from Inside Higher Ed discusses a study of the rapid decline of history majors.  The decline comes amid a sudden onset of “job-related” majors, and the graph accompanying the article shows how STEM has taken over higher education.  These are the fields with actual occupations awaiting them at the end of the degree, while disciplines such as history and religion (also very near the bottom) have less clear career paths.  Indeed, when I’ve been in the job market I’ve found that a religion degree is less than useless, no matter what the department recruiters tell you.  If you’re not bound for the clergy, you undertake the study at your own peril.  History, I expect, suffers from a similar dynamic, but the peril in this case is to all of civilization.

We’ve seen over the past two years how a stunning lack of knowledge of history sets a nation on the path to chaos.  Businessmen with no classical education don’t make good national leaders.  Knowing where we’ve been, as Santayana so eloquently stated, is the only thing that keeps us from repeating past failures.  History is our only safeguard in this respect.  Over the Thanksgiving break I spent a little time delving into family history.  Since I don’t come from an illustrious lineage, I felt the frustration of trying to find out what happened to obscure people over the last couple of centuries.  Lack of history on a personal level.  On a professional level, my doctorate is really in the history of religions (ancient religions), and I’ve become keenly aware of just how little history there is to the very popular modern Fundamentalist movement.

Maybe I said that wrong.  They do have a history, but the belief system that is touted as ancient is really quite modern.  Anti-modern, in fact.  When historical knowledge is lacking, however, people can make all kinds of claims based on nothing more than wishful thinking.  History keeps us honest.  Or it used to.  When we’ve outlived the need for history, we’ve started down a path unlit by any embers of past human foibles.  We’ve been living in a culture in love with technology but not so much with critical reflection on where such innovations might take us.  Doctors are beginning to complain that they spend more time on their computers than with their patients.  The time freed up by the internet has been taken up by the internet.  And when all of this comes to its natural culmination, we would be well served by historians making a record of what went wrong.  If we could find any.


Taking Turns

“Turn! Turn! Turn!” the Byrds sang.  “For everything there is a season,” quoth Solomon.  Perhaps it’s the way we acquire knowledge, but lately many fields in academia are experiencing “turns.”  The idea seems to be that if fields continue to turn, they will eventually all converge on the same intersection and true knowledge will be obtained.  The post-modern turn, however, suggests that there is no objective knowledge.  It kind of makes me dizzy, all this turning.  Although I find the use of this particular noun in such phrases a touch unsophisticated, it’s here to stay.  At least until academia takes another turn.  Public intellectuals, after all, have to have something to say.  And academics are capital imitators.

Ironically, within the same week I read of the “religious turn” in the humanities and a different turn within religious studies.  This “religious turn” is not to suggest the humanities have found that old time religion, but rather that many disciplines are now realizing that religion has played, and continues to play, a very important role in human affairs.  Fields that have traditionally avoided religious topics are now “turning” that way.  At the same time that others are turning toward religion, religious studies is taking a “material turn.”  The public intellectuals smile at the maze they’ve created as the paychecks roll in.  The “material turn,” if I understand correctly, is that the ideas of religion can be explained via the real world needs that various religions meet.  There’s no need for any divine character or intervention.  There is no sacred or profane, but rather kinetic movement of shifting patterns that at any one time or place might be denominated as religions.

I’m all for progress, but I think I might’ve missed the turn.  To my old-school way of thinking, sacred and profane, Eliadian though they may be, still have great explanatory value.  I don’t know if there’s objective knowledge to be found by fallen mortals such as we.  The material world we experience through our senses is mediated by those very senses, so our understanding is, of necessity, limited.  We can’t touch naked reality even if we try.  Our quest, in circumstances such as these, would seem to be digging deeper until we come to that which resists any tunneling.  It’s like coming to the end of the physical universe and wondering what’s beyond this natural limit.  Then, I suppose, you’d have to turn.  Until such time as that, however, all this present-day turning is for the Byrds.