More Writing

I keep a list.  It includes everything that I’ve published.  It’s not on my CV since I keep my fiction pretty close to the vest.  The other day I stumbled across another electronic list I’d made some time ago of the unpublished books I’d written.  Most were fiction but at least two were non, and so I decided that I should probably print out copies of those I still had.  As I’ve probably written elsewhere, I started my first novel as a teenager.  I never finished it, but I still remember it pretty well.  Then I started another, also unfinished.  After my wife and I got engaged, and before we moved to Scotland, I moved to Ann Arbor to be in her city.  Ann Arbor, like most university towns, has many overqualified people looking for work, and I ended up doing secretarial support for companies that, quite a bit of the time, really had nothing for me to do.  I wrote my first full novel during dull times on the job.

My writing was pretty focused in Edinburgh.  My first published book was, naturally, my dissertation.  I started writing fiction again when I was hired by Nashotah House, but that was tempered by academic articles and my second book.  An academic life, it seems, doesn’t leave a ton of time for writing.  What really surprised me about my list was what happened after Nashotah.  In the years since then I’ve completed ten unpublished books.  Since my ouster from academia I’ve published five.  I honestly don’t know how many short stories I’ve finished, but I have published thirty-three.  What really worries me is that some of these only exist in tenuous electronic form.  I guess I trust the internet enough to preserve these blog posts; with over 5,700 of them, I’d run out of space if I tried to print them all.

I see a trip to buy some paper in my future.  For my peace of mind I need to make sure all of this is printed out.  My organizational scheme (which is perhaps not unusual for those with my condition) is: I know which pile I put it in.  Organizing it for others, assuming anybody else is interested, might not be a bad idea.  I know that if I make my way to the attic and begin looking through my personal slush pile of manuscripts I’ll find even more that I’ve forgotten.  That’s why I started keeping a list.  Someday I’ll have time to finish it, I hope.


Just Trust Me

When I google something I try to ignore the AI suggestions.  I was reminded why the other day.  I was searching for a scholar at an eastern European university.  I couldn’t find him at first since he shares the name of a locally famous musician.  I added the university to the search and AI merged the two.  It claimed that the scholar I was seeking was also a famous musician.  This despite the difference in their ages and the fact that they looked nothing alike.  Al decided that since the musician had studied music at that university he must also have been a professor of religion there.  A human being might also be tempted to make such a leap, but would likely want to get some confirmation first.  Al has only text and pirated books to learn by.  No wonder he’s confused.

I was talking to a scholar (not a musician) the other day.  He said to me, “Google has gotten much worse since they added AI.”  I agree.  Since the tech giants control all our devices, however, we can’t stop it.  Every time a system upgrade takes place, more and more AI is put into it.  There is no opt-out clause.  No wonder Meta believes it owns all world literature.  Those who don’t believe in souls see nothing but gain in letting algorithms make all the decisions for them, as long as they have suckers (writers) willing to produce what they see as training material for their Large Language Models.  And yet, Al can’t admit that he’s wrong.  No, a musician and a religion professor are not the same person.  People often share names.  There are far more prominent “Steve Wigginses” than me.  Am I a combination of all of us?

Technology is unavoidable, but the unanswered question is whether it is good.  Governments can regulate, but with hopelessly corrupt governments, well, say hi to Al.  He will give you wrong information and pretend that it’s correct.  He’ll promise to make your life better, until he decides differently.  And he’ll decide not on the basis of reason, because human beings haven’t figured that out yet (try taking a class in advanced logic and see if I’m wrong).  Tech giants with more money than brains are making decisions that affect all of us.  It’s like driving down a highway when heavy rain makes seeing anything clearly impossible.  I’d never heard of this musician before.  I like to think he might be Romani.  And that he’s a fiddler.  And we all know what happens when emperors start to see their cities burning.

Al thinks this is food

Nanowrimo Night

Nanowrimo, National Novel Writing Month—November—has been run by an organization that is now shutting down.  Financial troubles and, of course, AI (which seems to be involved in many poor choices these days) have led to the decision, according to Publishers Weekly.  Apparently several new authors were discovered by publishers on the basis of their Nanowrimo projects.  I participated one year and had no trouble finishing something, but it was not really publishable.  Still, it’s sad to see this inspiration for other writers calling it quits.  I’m not into politics, but when the Nanowrimo executives didn’t take a solid stand against AI-“written” novels, purists were rightfully offended.  Writing is the expression of the human experience.  0s and 1s are not humans, no matter how much tech moguls may think they are.  Materialism has spawned some wicked children.

Can AI wordsmith?  Certainly.  Can it think?  No.  And what we need in this world is more thinking, not less.  Is there maybe a hidden reason tech giants have cozied up to the current White House where thinking is undervalued?  Sorry, politics.  We have known for many generations that human brains serve a biological purpose.  We keep claiming animals (most of which have brains) can’t think, but we suppose electrical surges across transistors can?  I watch the birds outside my window, competing, chittering, chasing each other off.  They’re conscious and they can learn.  They have the biological basis to do so.  Being enfleshed entitles them.  Too bad they can’t write it down.

Now I’m the first to admit that consciousness may well exist outside biology.  To tap into it, however, requires the consciousness “plug-in”—aka, a brain.  Would AI “read” novels for the pleasure of it?  Would it understand falling in love, or the fear of a monster prowling the night?  Or the thrill of solving a mystery?  These emotional aspects, which neurologists note are a crucial part of thinking, can’t be replicated without life.  Actually living.  Believe me, I mourn when machines I care for die.  I seriously doubt the feeling is reciprocated.  Materialism has been the reigning paradigm for quite a few decades now, while consciousness remains a quandary.  I’ve read novels that struggle with deep issues of being human.  I fear that we could be fooled by an AI novel where the “writer” merely borrows the way humans communicate to pretend that it feels.  And I feel a little sad, knowing that Nanowrimo is hanging up the “closed” sign.  But humans, being what they are, will still likely try to complete novels in the month of November.


Finding Fossils

Mary Anning was a real woman.  She made valuable contributions to paleontology in the first half of the nineteenth century, although she wasn’t always credited for her work.  The movie Ammonite is a fictionalized account of her life at Lyme Regis, where she lived and discovered fossils of ichthyosaurs, plesiosaurs, and other prehistoric creatures (marine reptiles, strictly speaking, rather than dinosaurs).  Being fiction, the movie focuses on how Mary “came out of her shell” by entering into a relationship with Charlotte Murchison (also an historical person, wife of the Scottish geologist Sir Roderick Impey Murchison), who was left in Mary’s care after coming down with a fever while taking the sea air to recover from melancholy.  Mary had established a life of independence and wasn’t really seeking relationships; her mother still lived with her and, according to the movie, they had a distant but loving regard for each other.

I was anxious to see the film because it is sometimes classified as dark academia.  Since I’m trying to sharpen my sense of what that might mean, it’s helpful to watch what others think fits.  The academia part here comes from the intellectual pursuits of Anning and the academic nature of museum life (one of her fossils was displayed at the British Museum).  Anning, who had no formal academic training, tried to make a living in a “man’s world,” and in real life she did contribute significantly to paleontology.  The dark part seems to come from her exclusion from the scientific community, and perhaps from her love for Charlotte, a forbidden relationship in that benighted time.  Of course, this relationship is entirely speculative.

Fictional movies made about factual people make me curious about the lives of those deemed movie-worthy.  Ammonite is a gentle movie and one which raises the question of why women were excluded from science for so long.  No records exist that address Anning’s sexuality—not surprisingly, since she lived during a period when such things weren’t discussed.  Indeed, she didn’t receive the acclaim that she might have, had she lived in the period of Jurassic Park.  She was noticed by Charles Dickens, who included a piece on her in his magazine All the Year Round in 1865, nearly two decades after her death.  These days she is acknowledged and commemorated.  This movie is one such commemoration, although much of it likely never happened.  As with art house movies such as this, nonfiction isn’t to be assumed.  Nevertheless, it might still be dark academia.


Making More Monsters

It’s endlessly frustrating, being a big picture thinker.  This runs in families, so there may be something genetic about it.  Those who say, “Let’s step back a minute and think about this” are considered drags on progress (from both left and right), but would, perhaps, help avoid disaster.  In my working life of nearly half a century I’ve never had an employer who appreciated this.  That’s because small-picture thinkers often control the wealth and therefore have greater influence.  They can do what they want, consequences be damned.  These thoughts came to me reading Martin Tropp’s Mary Shelley’s Monster: The Story of Frankenstein.  I picked this up at a book sale once upon a time and, reading it, have discovered that he was doing what I’m trying with “The Legend of Sleepy Hollow” in my most recent book.  Tropp traces some of the history and characters, but also the afterlives of Frankenstein’s monster.  (He had a publisher with more influence, so his book will be more widely known.)

This book, although dated, has a great deal of insight into the story of Frankenstein and his creature.  But also insight into Mary Shelley.  Her tale has an organic connection to its creator as well.  Tropp quite frequently points out the danger posed by those who have more confidence than real intelligence, and how they forge ahead even when they know failure can have catastrophic consequences for all.  I couldn’t help but think how the current development of AI is the telling of a story we’ve all heard before.  And how those who insist on running for office to stoke their egos also play into this same sad tale.  Perhaps a bit too Freudian for some, Tropp nevertheless anticipates much of what I’ve read in other books about Frankenstein written in more recent days.

Some scientists are now at last admitting that there are limits to human knowledge.  (That should’ve been obvious.)  Meanwhile those with the smaller picture in mind forge ahead with AI, not really caring about the very real dangers it poses to a world happily wedded to its screens.  With tech giants cozying up to politicians who think only of themselves, we need a big picture thinker like Mary Shelley to guide us.  I can’t help but think big picture thinking has something to do with neurodivergence.  Those who think this way recognize, often from childhood, that other people don’t think like they do.  And that, lest they end up like Frankenstein’s monster, hounded to death by angry mobs, it’s better simply to address the smaller picture.  Or at least pretend to.


Telling Vision

My wife and I don’t watch much television.  In fact, we had very poor television reception from about 1988 (when we married) until we moved into this house in 2018.  That’s three decades without really watching the tube.  As we’ve been streaming/DVDing some of the series that have made a splash in those three decades, I’ve discovered (I can’t speak for her) that there were some great strides made in quality.  We began with The X-Files, then moved on to Lost.  We viewed Twin Peaks and started to watch Picket Fences, but the digital rights have expired so we never did finish out that series.  (Don’t get me started on digital rights management—the air will quickly turn blue, I assure you.)  Of course, we did manage to see Northern Exposure when it aired, but it should be mentioned for the sake of completeness.  These were all exceptional programs.

Netflix (in particular) upped the game.  We watched the first three seasons of Stranger Things (and I’d still like to go back and pick up the more recent ones we’ve missed), and I watched the first episode of The Fall of the House of Usher (I didn’t realize until writing this up that it was only one season, so maybe I should go back and finish that one out too) and was very impressed.  Since we couldn’t finish Picket Fences, we turned to Wednesday.  Now, I was only ever a middling fan of The Addams Family.  I watched it as a kid because it had monsters, as I did The Munsters, but neither one really appealed to me.  Wednesday’s cut from a different cloth.  In these days when escapism is necessary, this can be a good thing.

Photo credit: Smithsonian Institution

Like most late boomers, I grew up watching television.  In my early memories, it’s pretty much ubiquitous.  We were poor, and our sets were black-and-white, but remembering my childhood without TV is impossible.  It was simply there.  The shows I watched formed me.  Now that I’m perhaps beyond excessive reforming (although I’m not opposed to the idea), I’m looking for brief snippets of something intelligent to wind up the day before I reboot to start this all over again.  We save movies for weekends, but an entire workweek without a break in nonfictional reality seems overwhelming on a Sunday evening.  It seems that I may be warming up to my childhood chum again.  This time, without network schedules, and limited time to spend doing it, we just may be in a golden age for the tube.


Protected?

I like Macs.  Really, I do.  Ever since I realized that “Windows” was a cut-rate way to imitate Macintosh’s integrated operating system, I’ve never been able to look back.  (I don’t have a tech background so I may be wrong in the details.)  Every time I use a work laptop—inevitably PCs—I realize just how unintuitive they are.  Apple engineers, it seems, understand the way ordinary people think.  I sometimes use software, not designed for a Mac, where I swear the engineers have no basic comprehension of English words at all.  And nobody ever bothers to correct them.  In any case, I find Macs intuitive and I’ve been using them for going on 40 years now.  But the intuitive element isn’t as strong as it used to be.  As we’re all expected to become more tech savvy, some of the ease of use has eroded.

For example, when I have to create a password for a website—not quite daily, but a frequent activity—Mac helpfully offers to create a strong password that I will never have to remember.  Now before you point out to me that software exists that will keep all your passwords together, please be advised that I know about such things.  The initial data entry to get set up requires more time off than I typically get in a year, so that’ll need to wait for retirement.  But I was talking about intuitive programming.  Often, when I think I won’t be visiting a website often, I’ll opt for the strong password.  Maybe I’ve got something pressing that I’m trying to accomplish and I can’t think of my three-thousandth unique password.  I let Mac drive.  That’s fine and good until there’s an OS update.  This too happens not quite daily, but it does sometimes occur more than once a week.

After restarting I go back to a website and the autofill blinks at me innocently as if it doesn’t recognize my username.  It doesn’t remember the strong password, and I certainly don’t.  So I need to come up with yet another new one.  At work I’m told you should change all your passwords every few months.  To me that seems like a full-time job.  For grey matter as time-honored as mine, it’s not an easy task.  I’m not about to ditch Macs because of this, but why offer me a strong password that only lasts until the next system update?  Truth be told, I’m a little afraid to post this because if by some miraculous chance a software engineer reads it and decides to act, a new systems update will be required again tonight.


Remembering Consciousness

I recently inadvertently read—it happens!—about anesthesia.  I’ve been relatively healthy for most of my adult life and have experienced anesthesia only for dental surgery and colonoscopies.  I’ve actually written here before about how anesthesia is not like sleep.  You awake like you’ve just been born.  You weren’t, and then suddenly you are.  This always puzzled me because consciousness is something nobody fully understands, and opinions vary widely on what happens to it when your body dies.  (I have opinions, backed by evidence, about this, but that’s for another time.)  What I read about anesthesia made a lot of sense of this conundrum, but it doesn’t answer the question of what consciousness is.  What I learned is this: anesthesiologists often include amnestics (chemicals that make you forget) in their cocktail.  That is, you may be awake, or partially so, during the procedure, but when you become conscious again you can’t remember it.

Now, that may bother some people, but for me it raises very interesting issues.  One is that I had no idea amnestics existed.  (It certainly sheds new light on those who claim alien abduction but who only remember under hypnosis.)  Who knew that even we humans have the ability to make people forget, chemically?  That, dear reader, is a very scary thought.  Tip your anesthesiologist well!  For me, I don’t mind so much if I can’t remember it, but it does help answer that question of why emerging from anesthesia is not the same as waking up.  Quite unrelated to this reading, I once watched a YouTube video of some prominent YouTubers (yes, that is a full-time job now) undergoing colonoscopies together.  They filmed each other talking during the procedure, often with hilarious results.  The point being, they were not fully asleep.  The blankness I experience after my own colonoscopies is born of being made to forget.

I think I have a pretty good memory.  Like most guys my age, I do forget things more easily—especially when work throws a thousand things at you simultaneously and you’re expected to catch and remember all of them.  Forgetting things really bothers me.  If you haven’t watched Christopher Nolan’s early film Memento, you should.  I think I remember including it in Holy Horror.  In any case, I don’t mind if anesthesiologists determine that it’s better to forget what might’ve happened when the last thing I remember is having been in an extremely compromised position in front of total strangers of both genders.  My accidental reading has solved one mystery for me, but it leaves open that persistent question of what consciousness really is.


Think

Those of us who write books have been victims of theft.  One of the culprits is Meta, owner of Facebook.  The Atlantic recently released a tool that allows authors to check whether LibGen, a pirated-book site used by Meta and others, has their work in its system.  Considering that I have yet to earn enough on my writing to pay even one month’s rent/mortgage, you get a little touchy about being stolen from by corporate giants.  Three of my books (A Reassessment of Asherah, Weathering the Psalms, and Nightmares with the Bible) are in LibGen’s collection.  To put it plainly, they have been stolen.  Now the first thing I noticed was that my McFarland books weren’t listed (Holy Horror and Sleepy Hollow as American Myth—the latter, of course, not yet published).  I also know that McFarland, unlike many other publishers, proactively lets authors know when it is discussing AI use of their content, informing us that if deals are made we will be compensated.

I dislike nearly everything about AI, but especially its hubris.  Machines can’t think like biological organisms can, and biological organisms that think they can teach machines to “think” have another think coming.  Is it mere coincidence that this kind of thing happens at the same time reading the classics, with their pointed lessons about hubris, has declined?  I think not.  A humanities education teaches you something you can’t get at your local tech training school—how to think.  And I mean actually think.  Not parrot what you see on the news or social media, but to use your brain to do the hard work of thinking.  Programmers program; they don’t teach thinking.

Meanwhile, programmers have made theft easy but difficult to prosecute.  Companies like Meta feel entitled to use stolen goods so their programmers can make you think your machine can think.  Think about it!  Have we really become so stupid as a society that we can’t see how all of this is simply the rich using their influence to steal from the poor?  LibGen, and similar sites, flout copyright laws because they can.  In general, I think knowledge should be freely shared—there’s never been a paywall for this blog, for instance.  But I also know that when I sit down to write a book, and spend years doing so, I hope to be paid something for it.  And I don’t appreciate social media companies that have enough money to buy the moon stealing from me.  There’s a reason my social media use is minimal.  I’d rather think.


Lights, Cam

Techno-horror is an example of how horror meets us where we are.  When I work on writing fiction, I often reflect on how our constant life online has really changed human beings and given us new things to be afraid of.  I posted some time ago about Unfriended, which is about an online stalker able to kill people IRL (in real life).  In that spirit I decided to brave CAM, which is based on an internet culture of which I knew nothing.  You see, despite producing online content that few consume, I don’t spend much time online.  I read and write, and the reading part is almost always done with physical books.  As a result, I don’t know what goes on online.  Much more than I ever even imagine, I’m sure.

CAM is about a camgirl.  I didn’t even know what that was, but I have to say this film gives you a pretty good idea, and it’s definitely NSFW.  Although, having said that, camgirl is, apparently, a real job.  There is a lot of nudity in the movie, in service of the story, and herein hangs the tale.  Camgirls can make a living by getting tips in chatrooms for interacting, virtually, with viewers and acting out their sexual fantasies.  Now, I’ve never been in a chatroom—I barely spend any time on social media—so this culture was completely unfamiliar to me.  Lola_Lola is a camgirl who wants to get into the top fifty performers on the platform she uses.  Then something goes wrong.  Someone hacks her account, taking all her money and performing acts that Lola_Lola never does.  What makes this even worse is that the hacker is apparently AI, which has created a doppelgänger of her.  AI is the monster.

I know from hearing various experts at work that deep fakes such as this can really take place.  We would have a very difficult, if not impossible, time telling a virtual person from a real one, online.  People who post videos online can be copied and imitated by AI with frightening verisimilitude.  What makes CAM so scary in this regard is that it was released in 2018, and now, seven years later, such things are, I suspect, potentially real.  Techno-horror explores what makes us afraid in this virtual world we’ve created for ourselves.  In the old-fashioned world sex workers often faced (and still face) dangers from clients who take their fantasies too far.  And, as the movie portrays, the police seldom take such complaints seriously.  The truly frightening aspect is that there would be little the physical police could do in the case of cyber-crime.  Techno-horror is some of the scariest stuff out there, IMHO.


Call Me AI

Okay, so the other day I tried it.  I’ve been resisting, immediately scrolling past the AI suggestions at the top of a Google search.  I don’t want some program pretending it’s human to provide me with information I need.  I had to find an expert on a topic.  It was an obscure topic, but if you’re reading this blog that’ll come as no surprise.  Tired of running into brick walls using other methods, I glanced toward Al.  Al said a certain Joe Doe is an expert on the topic.  I googled him only to learn he’d died over a century ago.  Al doesn’t understand death because it’s something a machine doesn’t experience.  Sure, we say “my car died,” but what we mean is that it ceased to function.  Death is the overlay we humans put on it to understand, succinctly, what happened.

Brains are not computers and computers do not “think” like biological entities do.  We have feelings in our thoughts.  I have been sad when a beloved appliance or vehicle “died.”  I know that for human beings that final terminus is a non-negotiable fact of existence.  Animals often recognize death and react to it, but we have no way of knowing what they think about it.  Think they do, however.  That’s more than we can say about ones and zeroes.  They can be made to imitate some thought processes.  Some of us, however, won’t even let the grocery store runners choose our food for us.  We want to evaluate the quality ourselves.  And having read Zen and the Art of Motorcycle Maintenance, I have to wonder if “quality” is something a machine can “understand.”

Wisdom is something we grow into.  It only comes with biological existence, with all its limitations.  It is observation, reflection, evaluation, based on sensory and psychological input.  What psychological “profile” are we giving Al?  Is he neurotypical or neurodivergent?  Is he young or does his back hurt when he stands up too quickly?  Is he healthy or does he daily deal with a long-term disease?  Does he live to travel or would he prefer to stay home?  How cold is “too cold” for him to go outside?  These are things we can process while making breakfast.  Al, meanwhile, is simply gathering data from the internet—that always reliable source—and spewing it back at us after reconstructing it in a non-peer-reviewed way.  And Al can’t be of much help if he doesn’t understand that consulting a dead expert on a current issue is about as pointless as trying to replicate a human mind.


Remaining in Shadow

Some people want to be found.  Others don’t.  Those of us who are curious shade into those who are frustrated when we can’t find someone.  People have been around for a relatively long time now, and we’ve been giving each other names because “hey you” only goes so far.  Even so, unique names are rare since, it seems, the majority of European-derived folk had something to do with smithies.  Nevertheless, the internet offers to help us find people.  I was searching for someone the other day but that person, despite publishing nearly daily on the interwebs, has a very common name.  And he styles himself without even a middle initial.  (He may not have one, I know.)  The point is, perhaps he doesn’t want to be found.  I run into authors like this—they assume their high-level monograph is sufficient fame.  You can’t find them online.

I recently joined Bluesky.  I’d like to leave Twitter, but I still have a large number of followers there (for me), although they seldom interact.  Publishers look at things like the number of X followers you have, so until Bluesky surpasses Twit, I’ll need to keep both going.  On Bluesky more people introduce themselves to you.  At least when you’re new.  Not a few are looking for relationships, sometimes of the sexual kind.  (I find that occasionally on what is called X, but mostly in the account under my fiction-writing pseudonym.)  These are people who want to be found.  The internet, strangely enough, has driven us further apart.

America has always been a polarized place, but the web has sharpened the border.  Indeed, it has militarized it.  I remember the days when meeting people actually meant going outside and stopping somewhere else.  Society had rules then.  Two topics of forbidden discussion were religion and politics.  It was easier to make friends with those rules in place.  Since I’ve chosen to put myself out there on the web, my choice of field of study does tend to come out.  And it’s one of those two forbidden topics.  Since my career goal has occasionally been ministry (still is from time to time), putting religion into the equation is inevitable, for those who really want to get to know me.  Social media is a strange country, however.  I tell new conversationalists on Bluesky that I have a blog, but it doesn’t seem to lead many people to my dusty corner of the interweb.  And it still gets me no closer to finding that guy with the tragically common sobriquet.  He may not want to be found. 

Sherlock Holmes seeks someone without the internet. Image credit: Sidney Paget (1860–1908), Strand Magazine, public domain, via Wikimedia Commons

Asteroid

So, that asteroid.  Good thing Trump won’t be in office in 2032.  Well, it’s only a three percent chance it’ll hit us, but chances are things can’t get much worse anyway.  It might be a good idea to make some plans now—but wait, we currently deny science is real and spend our time renaming bodies of water.  Hmm, quite a pickle.  If science were real it would tell us that an asteroid is not the same as a meteor, although it can become one.  Back in high school astronomy class—yes, our Sputnik-era high school had its own planetarium—I learned the difference between a meteoroid, a meteor, and a meteorite.  Want to go to school?  Here’s the quick version: a meteor is something in Earth’s atmosphere (thus, meteorology).  A meteoroid is a space rock, or even dust, somewhere in the solar system.  Once it enters our atmosphere it becomes a meteor.

Most meteors burn up as shooting stars and never hit the ground.  Yes, that atmosphere is a pretty good idea.  If a piece does make it to the ground, that piece is a meteorite.  Although your chances of being hit by a meteorite are minuscule, meteorites can be impish.  Not far from here, a few years back, a meteor smashed through the backseat window of a car after a couple had just put their groceries inside.  The meteorite stopped inside a carton of ice cream.  The window had to be replaced, and I can just imagine the conversation with the insurance company.  Or take the first meteorite strike ever caught on a recording: in Charlottetown, Prince Edward Island, a doorbell camera captured a meteorite hitting just outside someone’s house.  Both of these were tiny critters, though.

Our 2032 friend is quite a bit bigger.  Sometimes when large meteors heat up, they’ll explode in what is called a bolide.  In western Pennsylvania back in January of 1987 (if I recall correctly) my brother and I were working on a jigsaw puzzle during the holiday break when our house shook.  A loud bang had us worried—we lived in a refinery town, and explosions weren’t a welcome report.  We ran outside to find neighbors also outside looking around.  We later learned that a flaming fireball had gone overhead before exploding—a bolide.  The actual explosion took place quite a distance from us, but sound travels in our atmosphere.  So a possible near miss is scheduled eight years from now.  Let’s hope people show some sense at the polls before that happens.  Or we could just rename it MAGA and hope that it hits somewhere else.

An asteroid. Image credit: NASA, public domain

In Praise of Paper

I write quite a lot.  I’ve done so for decades.  As I’ve tried to carve out a writer’s life for myself I’ve noticed a few things.  I’ll start a story or novel and put it aside, sometimes for a decade or more, then come back to it.  I recently found what looks to be a promising novel that I began writing, by hand, back in the last century.  As electronics forced themselves more and more into my life, I began writing it on my computer.  I must’ve picked this story up a few years back because I clearly began revising it, but I ran into a problem.  The program in which I’d written it—Microsoft Word—was no longer supported by Apple products.  I eventually found a workaround and was able to extract Rich Text Format copies of files that my computer told me were illegible.  If you want illegible, I felt like telling it, go back to the original hand-written chapters!

Belatedly, I dusted this off (virtually) and started working again.  Then I reached chapter four.  That’s where I’d stopped my most recent revision.  Then I discovered why.  Near the end of the chapter were two paragraphs full of question marks with an occasional word scattered in—a part of the Word file that the RTF conversion couldn’t read.  Frustrated and heartbroken—there’s no way I can remember what this said some thirty years after it was initially written—I simply stopped.  This time I went to the attic and found the hand-written manuscript.  I turned to the offending chapter only to find that the corrupted passage was missing.  It was what we used to call a “keyboard composition,” and it had been eaten by the equivalent of electronic moths.

Photo by Everyday basics on Unsplash

Now, I’m no techie, but I just don’t understand why a word processing file can no longer be read by the program in which it was written.  Publishers urge us toward ebooks, but how many times in my life have I seen a new system for preserving electronic files fold, with the loss of all the data?  It’s not just a few.  And they’re asking us to make literature disposable.  If I have a book on my shelf and I need to look up a passage, I can do so.  Even if I bought the book half a century ago, and even if the book had been printed a century before that.  I’m aware of the irony that this blog is electronic—I used to print out all of the posts—and I have the feeling that my work is being sacrificed to that void we call electronic publication.  That’s why I keep the handwritten manuscripts in my attic.


Staunching Stigma

Independence Day is not a great movie.  As my readers know, that doesn’t stop me from watching.  I’ve seen it a few times.  Watching it post-UFO/UAP disclosure via the New York Times, I was struck by something.  Even in the diegesis of the movie, where alien craft, clearly visible, hover over major cities, when Russell Casse tells the military he’d been abducted, people roll their eyes.  Of all the stigmas our culture has invented, that of the “crazy people” who see “flying saucers” is one of the deepest and most persistent.  Even after the Times report, and after the US Navy admitted the craft were real and their tech not of human origin, people refuse to believe.  I’ve followed this for some time.  I read a book by Donald Keyhoe before I was old enough to drive.  Like most thinking people on the topic, I kept quiet about it.  Stigma.

When I received Luis Elizondo’s Imminent for a Christmas present, I was secretly very pleased.  You see, the evidence has been in plain sight (Poe nods knowingly) for decades, for those who don’t accept ridicule as an immediate response.  The Keyhoe book I read was published in 1955.  It was my grandfather’s book.  For sure, the stories casting doubt on Elizondo’s reputation and sanity began almost immediately after he cooperated with journalists in 2017, when a fraction of the truth made the New York Times.  Between Keyhoe and Elizondo, many insider “deathbed” books had revealed that this was something we should pay attention to.  People laughed.  Oh, we love to laugh.  Imminent, however, is quite a sobering book.  I’m not sure full disclosure will ever happen, but it’s trickling out, and a finger in the dam can’t hold forever.

Stigma as a means of social control is unfortunately effective.  I’ve always felt that mocking what you don’t understand is a poor way to get smarter.  Still, for those willing to consider the evidence over the years, there’s been plenty to study.  Either there’s something to this or our government and military are filled with pathological liars (outside the Oval Office, I mean).  It seems far more reasonable to examine the evidence, when it becomes available.  There are contractors in the military-industrial complex (Eisenhower warned us about this decades ago) who benefit from keeping secrets.  Imminent is an eye-opening book.  Hopefully it will be widely read and the implications taken seriously.  It’d be too bad if a catastrophe were necessary to stop the stigma, after it’s too late to do anything about it.