Free Reality

One thing movies can do especially well is to make you question reality.  Early on this was more or less literally true, as people couldn’t believe what they were seeing on the screen.  Photography had perhaps captured souls, after all.  A series of more recent movies began asking us to reconsider what we know, with profound results.  In 1998 The Truman Show suggested that we might be living on a stage and that God might really be a misguided director.  The next year The Matrix went further, floating the idea that we might be living in a simulation—an idea that some highly educated people have taken seriously ever since.  Both asked us to consider what we mean by reality.  Those questions have haunted us as the cybersphere has grown.

I recently saw Free Guy, a movie that slipped me back into that uncomfortable space.  I’m no gamer, so I’m sure I missed many of the references to memes and characters that gamers know well.  Still, it was fun and profound at the same time.  It’s not giving too much away to say that Guy is a non-player character in a shared game.  In other words, he’s just code.  Not conscious, not making any decisions.  Until he starts to.  He turns out to be a form of artificial intelligence.  Teaming up with a human player, he learns to appreciate virtual life and works to make Free City a better place to live.  When the credits rolled I found myself asking what I knew about reality.

Not a gamer, I’m pretty sure we’re not caught in that particular matrix.  I’m fairly certain my wife wouldn’t have put up with over thirty years of pretending to be married to me just for ratings.  Still, many times riding that bus into Manhattan I had the distinct feeling that none of it was really real.  I would tell myself that on the way to the office.  Not that I think movies are the whole truth, but they definitely seem to be part of it.  Guy learns to rack up points to level up.  He becomes a hero.  In this reality we can look but not see.  Becoming a hero is unlikely unless someone is actively watching you.  Many heroes exist on a small, human-sized scale.  They don’t get to wear sunglasses, but they can watch movies about those that do.  And if they’re not careful, they might find themselves getting into a philosophical quandary by doing so.


Behind Science

Science and religion have been sparring partners for a few centuries now, and I believe this is a generational conflict.  The child, science, arguing with the parent, religion.  You see, religion is all about worldview.  As secular as secular scientists declare themselves to be, their worldview was likely formed by their religious heritage.  Religion can’t be teased out of culture.  Here in the western world modern science was born in a fully Christianized cultural landscape.  That’s not to say that Judaism and Islam didn’t contribute, but European culture was based on some basic Christian ideals.  Creatio ex nihilo, for one—creation out of nothing.  Occam’s razor, for another: the idea that the simplest explanation accounts for the world we see was a philosophical concept born of the Christian worldview.  And the list could go on and on.

Scientists, focusing on their specializations, generally don’t sit back to think about the origins of their basic cultural presuppositions.  Many of those presuppositions came directly from religion.  Ever since college I’ve tended to think back to presuppositions and question them.  How do we know we know?  Epistemology is as useful as it is disturbing.  And if we discover that the basis for what we know was locked into a worldview we can no longer accept, what does that say about the underpinning method?  Our science is based on the idea that the world is rational because a benevolent deity wouldn’t make it absurd.  Would he?  And why are we still referring to the deity as a male?  Indeed, we still think of him as a human.

It’s difficult to get beyond our basic cultural propositions.  Religions such as Buddhism promote the idea that change is the only constant, yet the science in countries of the east is borrowed from the concepts of the west and its monotheistic sub-structure.  We tend to think that if humans can’t sense it, and quantify it, it doesn’t exist.  So it is that many scientists become atheists, but without perhaps questioning the cultural presuppositions that led to the scientific outlook in the first place.  Some will go as far as saying philosophy is a waste of time, when philosophy is the framework of all rational thinking.  And that’s not to forget that there’s emotional thinking as well.  The big picture is complicated by philosophers writing in lingo that the rest of us can’t understand.  And even they have presuppositions.  Maybe it’s time for me to go back to school and examine them again.


Never Too Late

In these weary days of bleak news, I’m always glad to find a bit of cheer.  A friend recently shared the story of Giuseppe Paternò from The Guardian.  Paternò is a 96-year-old first-time college graduate.  As the story explains, Paternò had wanted to attend college his entire life, but being raised in poverty he never had the opportunity.  We all know how life is a rushing river that snatches you in its current, and thus Paternò found himself unable to attain his dream.  Until his nineties.  Just this year he graduated from the University of Palermo.  What really spoke to me about this story is that Paternò is now considering working on his master’s degree.  While some might wonder if this is practical, to me it demonstrates that knowledge is never wasted.

We live in an era where education is seen as either a useless luxury or as just another business.  Both views are fatal to our civilization.  We have reached where we are by progressively educating our young (and old) so that our collective knowledge base grows.  When education is seen as a business (and I saw this in my ill-fated university teaching career) it becomes something different.  This isn’t on the part of the faculty, for the most part, but of administrations.  Paying corporate-level salaries to administrators—schools top-heavy with deans—they can’t afford to hire faculty, and they cut departments that aren’t profitable.  Knowledge, in turn, suffers.  Paternò, I sincerely hope, avoided the politics of academia.  A man hungry for knowledge, he studied philosophy at an age when most of us think people should just sit around and stare at the walls all day.  Knowledge should never be wasted.

Those of us who’ve been excluded from the academy sometimes try to continue our contribution.  Some of us still write books and articles.  It does nothing for our promotion or tenure.  It certainly doesn’t bear much in royalties.  “Why do it?” a friend once asked me.  When we cease seeking knowledge we stagnate and die.  We see this playing out in the politics of our day.  Washington houses many who see education as a threat to the unrestrained acquisition of mere money.  This is why universities suffer—they are not businesses.  One size does not fit all.  At their best they’re places where those of us raised in poverty can go to have our eyes opened.  And they are places where even nonagenarians can go to contribute to the growth of knowledge.


Seeking Reality

I spend a lot of time struggling to figure out the fundamental basis of reality.  I’m hampered in this by a brain that evolved—optimized—to help me survive in my environment, not to penetrate the depths of what’s really real.  That’s why I began studying religion in the first place.  The connection was organic.  Raised as a fundamentalist, daily reminded that an eternal hell of torment awaited, it made sense to study the antidote (the Bible) as much as possible.  When I prepared for college, which wasn’t the plan at the beginning, I could think of no major other than religion.  In Paul Tillich’s nomenclature, it’s all about ultimate concerns.  I didn’t accept the very evolution that had made me this way.  That required thinking through.

Attending a liberal arts college wasn’t really a conscious decision.  Nobody in my family had been to college and I didn’t know the difference between a research university and a stand-alone liberal arts institution.  Somebody has to teach you these things.  Religion, I found out, is a pretty good way to work toward perceptions of reality.  These days the award for that goes to philosophy, but the two fields are closely related, as much as philosophers socially distance themselves from theologians.  They’re both seeking the same thing, really.  Public perceptions of theology, however, trail after televangelists and their ilk, leaving a wrong impression in the minds of the masses.  Even professors are prone to accept this facile supposition.  Seeking reality doesn’t mean you won’t get laughed at along the way.

Although there have been some among religious leaders who claim to have found the answer, the rest of us continue to struggle.  The more I read of both science and religion, the more complex it all seems to grow.  And of course human agendas require the keeping of secrets.  Knowledge that is for employees only, because they kind of have to know.  The price on the sticker represents a mark-up that could be cut down.  What is this item really worth?  So it goes with the search for reality.  There’s no end to the searching.  Even after Siddhārtha Gautama was enlightened, he continued to have to work at it.  Christianity used to teach that love was the point of it all.  That message seems to have changed with the arrival of the messiah known as Trump.  Those of us who can’t stop searching, even when we find, can’t help but wonder if there isn’t something more worthy on which to spend our time.


Taking It Seriously

It would be incorrect to say that I choose to watch and read horror.  It would be more correct to say “Horror compels me to read and watch it.”  Those of us mesmerized by the genre tend to be a reflective lot.  We ask ourselves the question others frequently ask us—why watch it?  And yet, horror films tend to do very well at the box office.  Some even become cultural icons.  Of the many books analyzing horror, it would be difficult to suggest one more influential than Noël Carroll’s The Philosophy of Horror: or Paradoxes of the Heart.  It has been in just about every bibliography I have read on the subject.  It’s easy to see why.  There are lots of gems in this book, and it does indeed address the paradox at the heart of it all.

Philosophy, due to the very fact that there are competing schools, doesn’t attempt to provide the answer.  It offers an answer, one that hopefully makes sense of the overall question.  What question?  The one with which I began: why do people get into horror?  Carroll comes down to a deceptively simple answer, but I would make bold to suggest he does so at the cost of undercutting the religious element.  As in nearly every book on horror, Carroll does address the connection with religion.  He finds it lacking, but the reason seems to be his definition of religion.  He follows, perhaps a little too closely, Rudolf Otto’s The Idea of the Holy.  No doubt, it’s a classic.  Still, it doesn’t encompass the broad scope of religion and its genetic connection to horror.

At many points in The Philosophy of Horror I felt compelled to stand up and cheer.  I didn’t, of course, since much of the reading was done on the bus.  My ebullience was based on the fact that here was an intellectual who gets it, one who understands that horror is pervasive because it is meaningful.  Sure, it’s not to everyone’s taste.  It’s not, however, simply debased imagination, or arrested development gone to seed.  There is something deeply compelling about horror because it helps us to survive in a world that is, all paranoia aside, out to get us.  Yes, it engages our curiosity, as Carroll asserts.  It satisfies more than it disgusts.  It also defies explanation.  Perhaps that’s the deep connection with religion.  It can never be fully explained.  That doesn’t mean we shouldn’t try.  And this book is a valiant effort indeed.


Mythic Truth

“Myth embodies the nearest approach to absolute truth that can be stated in words.”  I recently came across this quote from Ananda K. Coomaraswamy.  Coomaraswamy was a philosopher and metaphysician from Ceylon, and like many eastern thinkers he had a more holistic view of the world than western rationalism allows.  We’re taught from a young age that myth is something false, not true.  This colloquial use of the word is so common that those of us who’ve specialized in myth slip into it during everyday conversation.  Words, however, have uses rather than meanings.  Coomaraswamy was engaging this reality in the quote above.  Words can take us only so far in exploring reality before we have to break into either formulas or poetry.  Although they are under-appreciated, poets are the purveyors of truth.

Having studied ancient mythology in some detail, it became clear to me as a student that these tales weren’t meant to be taken literally.  Instead, they were known to be true.  It takes a supple mind to parse being true from “really happened,” since we are taught in the western world that only what “really happened” is true.  In other words, historicism is our myth.  Meaning may not inhere in words, but when we use words to explore it we run up against lexical limits.  Is it any wonder that lovers resort to poetry?  On those occasions when I’ve been brave enough to venture to write some, I walk away feeling as if I’ve been the receiver of some cosmic radio signal.  We have been taught to trust the reality of what our senses perceive.  Myth, and poetry, remind us that there’s much more.

The Fundamentalist myth is that the Bible is literally true.  If they’d stop and think about it, they’d realize the mockery such thinking makes of Holy Writ.  The Good Book doesn’t look at itself that way.  In fact, it doesn’t even look at itself as a book—an idea that developed in later times.  The times and cultures that produced the Bible were cool with myth.  They may not have called it that, but the signs are unmistakable.  Ananda Coomaraswamy knew whereof he wrote.  The closest we can come to absolute truth takes us to the end of declarative, factual writing.  Scientists writing about the Big Bang devolve into complex mathematical formulas to explain what mere words cannot.  Myth is much more eloquent, even if we, as a society, dismiss it along with other non-factual truth.


Scientists, Unplugged

Feeling inferior is common among religionists. When cultures list their brightest and best, scientists often top the list and those who specialize in religion are nowhere to be found. This situation obscures the fact that many scientists think about, and are influenced by, religion. That became clear to me in reading Stefan Klein’s We Are All Stardust. Not Klein’s best-known book, this is a collection of interviews with well-known scientists, unplugged. There are many big names in here, such as Richard Dawkins and Jane Goodall, as well as some less familiar on a household level. Klein, himself a Ph.D.-holder in physics, asks them somewhat unconventional questions, with the goal of bringing a more human face to scientists.

When asked directly, scientists admit to thinking quite a bit about religion. Of those interviewed, several are hostile to it while others accept some tenets of one faith system or another. Most of them indicate that either religion or morality plays an important role in society, if not in science itself. The sad part is almost none of them seem to realize that the study of religion can be (and among the university-trained, generally is) scientific. In academia, religious studies is often vaguely tossed in with the humanities, while others would suggest it fits under the social sciences—as a sub-discipline of anthropology, for example. Few understand the field, in part because many specialists initially enter it for religious reasons, somehow tainting it.

While I enjoyed the book quite a lot—it was a quick read with plenty of profound ideas—it also had a disturbing undercurrent. The explanation that many of the interviewees gave for why they went into science was “curiosity.” The implication was that those who can’t stop asking questions, and have intelligence, go into science. Again, this is true of most academic fields, if they’re understood. Greatly tempted to go into science myself, I simply didn’t have the mathematical faculties to do it. While I took advanced math in high school, I wouldn’t have gotten through without my younger brother explaining everything to me. My real concerns lay along the line of ultimates. Learning about Hell at a young age, it made the most sense to me—very curious and scientifically inclined—to avoid going there. To do so, the proper target of my science had to be religion. While many scientists in We Are All Stardust are friendly to philosophy, religion is considered a far less worthy subject by not a few. True, religion often behaves badly in public. It doesn’t bring universities the money that megachurches reap. But unplugged, even scientists still think about it.


Knowing It All

Reading about the Trump administration underscores once again the traditional American contradiction of love for, but mostly hatred toward, experts. When you’re lying on that operating table, you stake your life that an expert is going to perform the surgery. When you buy that airline ticket, you’re banking that the pilot will be an expert. But if you’re electing the most powerful individual in the world, you’ll excoriate experts and defer to the guy with the weird hair who says whatever he pleases and has never been a public servant a day in his life. This observation isn’t original with me, of course. I’m only an editor. Nevertheless, the same dilemma comes down to my little world of academic publishing as well. Most academics don’t understand this business—I was an academic at one time and I certainly didn’t—and yet they don’t like to bow to the expertise of those who do.

Please don’t misunderstand. I’m making no grand claims to understanding this industry into which I unwittingly stumbled. I have been involved in it for over a decade now and I’m still learning. One of the things I’m learning is that many academics don’t trust experts. In part it’s academic culture. A doctoral program, if it’s a good one, will make you question everything. Sometimes even experts forget when to engage the brakes. When dealing with the experts at a publishing company, many academics doubt the expertise of those who do this day in and day out for a living. Books, however, have measurable sales records. There’s hard data for analysis. Such metrics aren’t foolproof (but what is?), yet they’re time-tested and based on reasonable data sets. Often that’s not enough to convince an expert that other experts know more than they’re revealing. A personal philosophy, but one which I pursue with appropriate skepticism, is that other people should be left to do their jobs. As I frequently note, those who talk to the bus driver, freely giving advice, often make the situation worse for everyone.

The case of religion, however, is a special can of worms. There are no experts in this field, even among those of us who are experts. Had I realized this when I was younger, I’m not sure it would’ve made much of a difference in what I ended up doing with my life. You see, religion is all about ultimates. The big questions. The sine qua non of every single thing. When I read about things like politics, or entrepreneurship, I think to myself, “That’s all fine and good, but at the end of the day, is it what really matters?” If life is a search for meaning, why not grab it by both hands and try to become an expert at it? Some would say that’s the job of the philosopher, but let’s face it, religionists and philosophers deal in the same currency. One is more abstract than the other, to be sure. Still, don’t take my word for it. Please consult an expert.


On Time

Getting to the movie theater is not only costly, but increasingly difficult to schedule. This can be problematic for someone who likes to write about movies, but the realities of the commuting life aren’t very malleable. So it was that I finally had a chance to watch Arrival, on the small screen. It had been recommended, of course, and although it’s not horror it has aliens and a linguist as the hero—my kind of flick. Once it began, I wondered if religion would play any role in the story. Alien contact would certainly rate as one of the more formative religious events of all time. The only reference that was obvious, however, was the suicide cult shown on a news story in the background, immolating themselves as the aliens became known.

Louise Banks, a linguist who has security clearance, has a sad story. Spoiler alert here! If you’re even more tardy than me you might want to fire up Amazon Prime and read on afterward! The movie opens with her watching her daughter grow up, only to watch her succumb to a rare disease as a young woman. Then the aliens arrive and she’s whisked off to Montana to try to communicate with them. It’s only after repeated encounters, learning the written language of another race, that she asks who this little girl she keeps dreaming about is. The child is in her future. The aliens see time as cyclical, not linear, and by learning their language she begins to think like them—knowing the future holds a tragedy for her. The intensity of the experience makes her fall in love with Ian Donnelly, another academic, who will become the father of her child but who will leave when she reveals the future to him.

Just as the aliens prepare to leave, not religion but philosophy takes over. A question posed by none other than Nietzsche goes: if you could live your life over exactly the same as you lived it this time, would you? Nietzsche’s point was that those who say “no” deny life while those who answer in the affirmative, well, affirm it. Ian says what he would change. Louise, however, embraces life with the tragedy she knows will inevitably come. While religion is off in a corner doing something that shows just how nonsensical belief can be, philosophy stands tall and faces the difficult question head-on. Although the movie follows some expected conventions—aliens bring peace but militaries want war—it rests on a profound question to which, I’ll admit, I haven’t got an answer.


Planet A

Two of the classics of ecology, A Sand County Almanac, by Aldo Leopold, and The Sea Around Us, by Rachel Carson, were published by Oxford University Press. In its present-day iteration the press has a Green Committee, on which I’ve sat since very nearly the beginning of my time there. As a committee, we’re reading these classics to see what we might learn some half-century-plus after they were published. I’d never read A Sand County Almanac before. It’s a pity, since I lived in southeast Wisconsin, from which the book takes its genesis, for about a dozen years. The writing is poetic prose, but the ideas are solid science—the land on which we’ve evolved knows how to take care of itself. When one species becomes too greedy, all suffer. Leopold ends his book by suggesting that a land ethic should be put in place. Now, a human lifespan later, has it been?

Hardly. Watching the Trump Administration doing everything it can to commodify any aspect of the environment that might make a buck—or at least a buck for the wealthy—is alarming in the extreme. There is no soul in the land, to this way of thinking. They believe that because they themselves lack a functional soul. A soul cannot exist without ethics. What we do to this planet is one of the largest ethical issues imaginable. No species, rational or not, destroys its own habitat. Except our own. Arrogant to the point of supposing ourselves divine, we think we can take what we want and give nothing back. And everything will be just fine. I wonder that we’ve had this inexpensive, readable guidebook these last seven decades and have continued to ignore its sage advice. Maybe we’re too busy making money to read something that sounds suspiciously like poetry.

One of the observations I had about the Almanac was how attuned it is to the philosophy of nature. Philosophy has many enemies these days, from prominent scientists to Republicans. Nobody seems to value the capacity to think a problem through deeply and thoroughly, unbeholden to any orthodoxy. The philosopher can ask “what if?” without regret. When it comes to the environment, humans aren’t the only philosophers. We’ve convinced ourselves so completely that we’re more advanced than other species that we suppose they can’t teach us anything. One thing they do, however, without our interference, is create balance in nature. It’s an ethic to which even our species might aspire. If only we would listen to the wisdom of those who pay attention to the world that has given them life.


Occam’s Beard

Skeptics can be so much fun. We really do need them; otherwise we’d likely still be living with a medical science based on the four humors, none of which were that funny. Still, sometimes it gets tiresome to read endless references that take Occam out of context. You see, one of the foundations, if not the very keystone, of modern scientific method is parsimony, aka Occam’s razor. The idea is simplicity itself. If there are multiple possible explanations for a phenomenon, then the simplest is most likely correct. But only if it supports your biases. The reason I raise this question is the materialistic dismissal of “consciousness” as merely a by-product of having a brain. The reasoning goes like this—nothing exists that can’t be measured by science. Since that which isn’t material can’t be measured, the most parsimonious explanation is that it doesn’t exist. QED.

This way of looking at the world has become so common that those of us who question it are given a condescending smile and a paternalistic pat on the head. But my thinking about this goes back to Occam himself. William of Occam (or Ockham) was a late medieval churchman and thinker. As a scholar he possessed a sharp mind. As a friar he also possessed a soul. There was no disconnect in those days. His observations of the natural world led him to the reasonable conclusion that if a simpler solution sufficed, a more complicated one need not be posited. So far, so good. This is not, however, to suggest that more complex things may not be going on. Quantum physics, for example, suggests that things aren’t quite so easy to explain. And what about poor Occam’s soul? This very component that made William William has been dismissed as mere illusion. Did it therefore not exist?

Is it more parsimonious to suggest that “mind” (or soul, or consciousness, whichever you prefer) is mere illusion, that electro-chemical signals flitting between highly specialized cells just happen to give off a fiction of consciousness, or would the simpler answer be, as Occam himself believed, that we have souls? We have no way to measure such things, but to claim they don’t exist is to rob a great thinker of his very mind. Any of us who experience consciousness know that it’s no illusion. We feel the pains and joys of this same body day after day and, if we’re honest, we believe that we’ll continue even after this fleshy substrate wears out. There’s a profound logic here. Science doesn’t know how mind affects matter—how I can decide to type and my fingers move. The most parsimonious answer, materialists claim, is that it only seems to be so. A far more honest answer would be that mind is real. And I’m sure Occam himself would agree, even if he preferred to call it a soul.


The Deal

Intellectual property is a concept that only arises where thought can be monetized. Think about that. (But don’t charge me, please!) It is a strange idea, when you ponder it. In any case, one of the problems with writing book reviews is that the reviews themselves become the intellectual property of the journal in which they appear. In a mental ménage à trois, everyone gets something out of it: the publisher gives away a book for free publicity, the journal gets copyrighted content of value to its readers, and the reviewer gets a free book. In the best of these encounters everyone goes home happy. I began doing book reviews when teaching at Nashotah House. Academic books are expensive, and although professors make more than editors do, they are still hard-pressed to pay academic press prices. After my strictly platonic affair with higher education, I stopped doing reviews for a while, but now that I have hours on a bus to read, I’ve picked up the habit again.

Most often I review books for the two societies of which I’m a member—the American Academy of Religion and the Society of Biblical Literature. There might, however, be a conflict of interest here. One of the main things I do on this blog is talk about books. I’m no longer a professor, but I still think like one, so I figure it’s a compromise I can live with. The conflict arises because I post daily on my blog while publishers take weeks, or even months, to publish reviews. Despite technology, publishing is a slow business. That means that I read books I can’t really talk about until the review has appeared. That’s the case with Jill Graper Hernandez’s Early Modern Women and the Problem of Evil: Atrocity & Theodicy. I can’t say here what I say in my review since the review isn’t my intellectual property anymore. There’s another trade-off here: publishers get your name out there in return for owning your content—everything can come down to the level of business, it seems.

Still, I can use this post to reflect on theodicy—the justification of God in the face of evil. Since I don’t address it in my review, I can say here that after Trump’s election it’s difficult to read a book about suffering without tying it to the current political situation. Many of the incumbent’s ardent supporters are coming to see that he doesn’t really care for them or their issues, and conservatives as well as liberals have four years of suffering to face. I was surprised how often, in reading this philosophical treatise, our present national shame came to mind. Perhaps that’s inevitable in a book that discusses how women have been repressed in a world where they too have been relegated to the level of a commodity. Intellectual property seems less a concern when human beings are still trafficked as chattel. That’s not just bad business, it’s evil.


By Any Other Name

Good and evil. Well, mostly evil, actually. No, I’m not talking about Washington, DC, but about horror movies. Cynthia A. Freeland’s The Naked and the Undead: Evil and the Appeal of Horror is a study that brings a cognitivist approach to the dual themes of feminism and how horror presents evil. It’s not as simple as it sounds. Like many philosophers Freeland is aware that topics are seldom as straightforward as they appear. Feminists have approached horror films before, and other analysts have addressed the aspects of evil that the genre presents, but bringing them together into one place casts light on the subject from different angles. Freeland begins this process by dividing her material into three main sections: mad scientists and monstrous mothers (which allows for the Frankenstein angle), from vampires to slashers, and sublime spectacles of disaster. Already the reader can tell she’s a real fan.

One of the simplistic views of horror is that these kinds of movies—particularly slashers—are misogynistic by their very nature. Feminists, including Freeland, question that assumption. Horror is a genre with a decidedly checkered history. Some films do feature mostly female victims of male monsters. Not all do, however, and even those that do may be saying something other than the obvious. Looking for the locus of evil in these movies provides a lens that focuses the meaning somewhere other than the surface. This is one of the benefits of philosophy—probing questions may be asked and unexpected answers may result. Along the way you can have a lot of fun, too. Especially if you watch horror movies.

A large part of the criticism probably arises from the fact that filmmaking was, for much of its earliest history, run by males. That’s not to say women couldn’t do the same thing men were doing, but the opportunities simply weren’t there. Most filmmakers, I expect, have trouble getting out of their heads to think about how someone of a different gender might perceive this kind of movie. Fear, we are told, is “coded” feminine. It seemed natural to such filmmakers to put the female in peril since both women and men would respond to it. Since then it has become clear that fear isn’t coded for gender. Indeed, one of the hallmarks of modern horror is that we all have cause to be afraid. Fear is no respecter of gender. Freeland’s analysis, now getting on in years, correctly looked ahead in many respects. Especially concerning the ongoing presence of evil.


Post Post-Truth

One of the benefits of working with words is that you get to participate in reality. George Orwell famously wrote that if people didn’t have the words to express concepts the government didn’t like, those concepts would cease to exist. At least as long as they allow us to have the internet, concepts may survive. Oxford Dictionaries’ word of the year reflects just this. The word is “post-truth.” Post-truth is a word of hard currency in the political marketplace. It essentially means that objective facts no longer outweigh emotion and personal belief in establishing reality. Think “global warming isn’t happening because it cuts into my bottom line.” Think “humans didn’t evolve from a common ancestor with apes because an outdated book doesn’t say they did.” Think “Donald Trump won the election.” Truth is no longer truth without “post” in front of it. Believe what you will. No, I mean that. I choose to believe Trump was not elected. Post-truth cuts both ways.

In a world where reaction has replaced dialogue and where you win arguments by revealing your NRA card, truth is merely the first casualty. Already mainstream media, who told us the post-truth that Trump, according to the polls, couldn’t win, are now telling us it’s all politics as usual. This will be a simple transition of power. Pay no attention to the man behind the curtain. There is no wizard in Oz. Politicians lie. Only those born without a brain stem don’t realize that objective fact. We can be sure that even George Washington lied from time to time. There was no cherry tree-gate. Unless you choose to believe there was, in which case I guess it has to be okay. Your truth’s as good as mine.

Over the past several years some prominent scientists have been saying philosophy is but misguided navel-gazing. It tells us nothing of reality. Truth, however, is a philosophical concept. Post is something to which you tie people before a firing squad. Truth has, until recent days, been considered that upon which all reasonable people could agree. The earth is not flat. We are flying around the sun so fast that it makes me earth-sick. We were able to put people on the moon. All of these are now post-truths. Along with the fact that every vote counts. In this slurry of fear, hatred, and distrust, who has time to worry about objective facts? Lexicographers do. And I praise them for giving us the most relevant word since Moses stumbled down the mountain with a tablet that read, “Thou shalt not bear false witness.”


The Whole Truth

In a thoughtful piece on NPR, Adam Frank discusses the question “Are Scientific Truths Better Than Other Truths?” He describes an Ivy League conference called to discuss this point, and although I get about as much attention as adults give Barney, I’ve been blogging about this topic for years now. If only I had an institution. Or an ivy leaf. But never mind that. The topic’s the thing, and indeed it is long overdue. Science works (at least most of the time) and so we don’t require any convincing on that point. The very title of the article, however, raises the specter of the question: are scientific truths better? There’s a lot of unpacking to do and I haven’t even left home yet. First of all, “truths.” Science provides the best explanation of phenomena that we have, given the data at the moment. Since science is, by definition, falsifiable, it doesn’t provide truths. As much as scientists must begrudgingly admit it, truths are spun out by philosophers and—God help us!—theologians. The scientists who want to give us truths should probably take philosophy 101.

Then there’s that surprisingly difficult adjective “better.” Good, better, best. These are value words. Science cannot assess value. Gold is “worth” more than the lint in my pockets because humans have agreed that it is. Inherently, both substances are made of the same thing: atoms. The lint in my pocket may have more exotic elements than pure gold, but nobody’s going to pay anything for it. Value, as has been endlessly demonstrated, is in the purview of religion, ethics, and philosophy. If you have to you can add the dismal science to the mix, but even that is just a social science. No physicist can tell you if this meal is better than that. It’s a matter of perspective. If I value my beans enough, not even your pâté will tempt me.

I want to stick with this latter word “better” just a little longer. Perhaps because as an underemployed thinker I’m especially sensitive to the subject. In what sense is science “better” than the humanities? Show me a scientist who’s never listened to music and I’ll show you a sad individual. When we come home from the lab we still want the creature comforts that people have devised whether through science, culture, or even religion. If you value that weekend, be sure to thank a monotheist. Science tells us no day of the week is any different from any other. In my experience there’s a world of difference between Saturday and Monday. For this inveterate and unrepentant humanities student, that’s the truth.