Do I Know You?

How do you know someone without ever seeing them?  How do you know they are who they say they are?  I’ve been spending a lot of time on the phone, much of it trying to establish my identity with people who don’t know me.  This has happened so much that I’m beginning to wonder how many of the people I’m talking to are who they say they are.  I never was a very good dater.  Going out, you’re constantly assessing how much to reveal and how much to conceal.  And your date is doing the same.  We can never fully know another person.  I tend to be quite honest and most of the coeds in college said I was too intense.  I suppose that it’s a good thing my wife and I had only one date in our three-year relationship before deciding to get married.

Electronic life makes it very difficult to know other people for sure.  I don’t really trust the guardrails that have been put up.  Sometimes the entire web-world feels false.  But can we ever go back to the time before?  Printing out manuscripts and sending them by mail to a publisher, waiting weeks to hear that it was even received?  Planning trips with a map and dead reckoning?  Looking telephone numbers up in an unwieldy, cheaply printed book?  You could assess who it was you were talking to, not always accurately, of course, but if you saw the same person again you might well recognize them.  Anthropologists and sociologists tell us the ideal human community has about 150 members.  The problem is, when such communities come into contact with other communities, war is a likely outcome.  So we have to learn to trust those we can’t see.  That we’ll never see.  That will only be voices on a phone or words in an email or text.

I occasionally get people emailing me about my academic work.  Sometimes these turn out to be someone who’s hacked someone else’s account.  I wonder why they could possibly have any interest in emailing an obscure ex-academic unfluencer like me.  What’s their endgame?  Who are they?  There’s something to be said for the in-person gathering where you see the same faces week after week.  You get to know a bit about a person and what their motivations might be.  Ours is an uncertain cyber-world.  I have come to know genuine friends this way.  But I’ve also “met” plenty of people who’re not who they claim to be.  Knowing who they really are is merely a dream.


What Bots Want

I often wonder what they want, bots.  You see, I’ve become convinced that nearly every DM (direct message) on social media comes from bots.  There are a couple of reasons I think this: I have never been, and am still not, popular, and all these “people” ask the same series of questions before their accounts are unceremoniously shut down by the platform.  Bots want to sell me something, or scam me, I’m pretty sure, but I wonder why they want to “chat.”  They could look at this blog and find out much of what they’re curious about.  I could use the hits, after all.  Hit for chat, as it were.  

Some change in the metaverse has led to people discovering my academic work and some of them email me.  That’s fine, since it’s better than complete obscurity.  Within the last couple of months two such people asked me unusual, if engaged, questions.  I took the time to answer and received a reply asking a follow-up question.  It came at a busy time, so a couple of days later I replied and received a bounced-mail notice.  The other one bounced the first time I replied.  By chance (or design) one of these people had begun following me on Academia.edu (I’m more likely on Dark Academia these days), so I went to my account and clicked their profile button.  It took me to a completely different person.  So why did somebody email me, hack someone’s Academia account to follow me, and then disappear?  What do the bots want?

Of course, my life was weird before the bots came.  In college I received a mysterious envelope filled with Life cereal.  The back of said envelope read “Some Life for your life.”  I never found out who sent it.  Another time I received an envelope with $5 inside and a typewritten note saying “Buy an umbrella.”  If I’m poor now, I was even poorer in college and didn’t have an umbrella.  Someone noticed.  Then in seminary someone mailed me a mysterious letter about a place that doesn’t exist.  There was a point to the letter although I can’t recall what it was without it in front of me.  No return address.  I have my suspicions about who might’ve sent these, but I never had any confirmation.  The people are no longer in my life (one of them, if I’m correct, died by suicide a couple years after the note was sent).  It’s probably just my age, but I felt a little bit safer when these things came through the campus mail system.  Now bots fill my paltry web-presence with their gleaming DMs.  I wonder what they want.


Naming the Dead

It probably just goes with the territory, but I’ve noticed something.  A big part of my job is searching for people on the internet.  (Academics, of course.)  Mostly these are folks I don’t know, some of them with very common names.  This presents special challenges, of course.  Every once in a while, though, you search for a name and pretty much every entry you find is an obituary.  I’m not talking about someone prominent who has died, but rather several people with the same name who’ve passed away.  The other day, after four or five pages of Google results, I found nobody alive.  That particular name wasn’t an “old fashioned” name either.  It could be (perhaps is) still a very common name.  It does get me pondering whether some names are “safer” than others.  Is anyone by this name still alive?

We place a lot of stock in our names.  Being the way that others get our attention and identify us, they do have importance.  And many names are common—parents aren’t always the creative sort.  And the internet is a source of frustration when you’re trying to narrow down a common name and attach it to someone you don’t already know.  Growing up, kids want to be like everyone else—no standing out in the herd.  “Wiggins,” where I grew up, was an unusual name.  We got teased for it quite a lot.  When my mother remarried, my brothers and I went by our stepfather’s common last name for a few years.  In seminary I decided to revert to my birth name—Wiggins.  I wanted to do two things: reclaim my heritage, and stand out a little.  Even so, a web-search for Steve Wiggins will bring up at least four or five individuals who are not me, including an obituary or two.

Before the web, when trying to find a scholar you had to use letters.  (Or maybe the phone, but cold calls weren’t really professional). You’d send them a letter.  In a way, the web is a great equalizer.  But it favors those with names that are somewhat less common.  Some people change their names—performers and some authors do this to make their persona more to their liking—but this is a fraught activity.  I know from switching back to my birth name that the process is complex and if you try it after you’ve started to publish things it adds whole new layers of complications.  So I spend quite a bit of time searching for people who aren’t easily found.  Not infrequently I seem to be naming the dead.


Author Pages

It takes me a while, sometimes.  Maybe it’s a generational thing.  I’ve been blogging for sixteen years now (my blog is a teenager!) and it only just occurred to me that I should be putting links to authors’ pages when I post about their books.  I know links are what make the web go round, but I assumed that anyone whose book I’ve read is already better known than yours truly.  Why would they need my humble help?  Well, I’ve been trying to carve out the time to go back and edit my old posts about books, linking to authors’ pages—there are so many!  In any case, this has led to some observations about writers.  And at least this reader.  Most commercial authors have a website.  Not all, of course.  People my age who had earlier success with writing tend not to have a site since they already have a fan base (I’m guessing).  Most fiction writers in the cohort younger than me have pages, and I’m linking to those.

I’ve noticed, during this exercise, that my reading falls into two main categories: novels and academic books.  I suppose that’s no surprise, although I do read intelligent nonfiction from non-professors as well.  In the nonfiction category, it’s fairly rare to find academics with their own websites.  They probably get the validation they require from work, and being featured on the school webpages.  Or some will use Academia.edu to make a website.  As an editor I know that promoting yourself is important, even for academic authors.  Few do it.  Then I took a look around here and realized, as always, that I fall between categories.  No longer an academic, neither have I had any commercial success with my books.  I’ve fallen between two stools with this here website.  I do pay for it, of course.  Nothing’s free. 

Almost nobody links to my website.  This isn’t self-pity; WordPress informs you when someone links to your site and that hasn’t happened in years.  Links help with discoverability on the web, so my little website sits in a very tiny nook in a low-rent apartment in the part of town where you don’t want to be after dark.  And I thought to myself, maybe other authors feel the same.  Maybe they too need links.  So I’m adding them.  As I do so I hope that I’ll also learn a thing or two.  I’m trying to learn how to be a writer.  It just takes me some time before things dawn.  Maybe it’s just my generation.


Professionalism

We’re all tightly packed together here on the internet.  Social media is a fuzzy category and now includes such platforms as LinkedIn, which I think of mainly as a place to hang your shingle while looking for a job.  I chose, many years ago, to make myself available online.  This sometimes leads to a strange familiarity.  It isn’t unusual for an author hopeful to contact me through my personal email or, especially, through LinkedIn to try to push their project.  (Such people have not read this blog deeply.)  One thing acquisitions editors prize most highly is professionalism.  Being accosted on LinkedIn, or in your personal email, is not the way to win an editor’s favor.  Some of us have lives outside of work.  Some of us write books of our own and don’t blast them out to all of our contacts on LinkedIn.  Professionalism.

It’s tough, I know.  You want to promote your book.  (I certainly do.)  It seems strange to say that blogging is old-fashioned, but it is.  (Things change so fast around here.)  But you could start a blog.  Or better yet, a podcast.  Or a YouTube channel.  You can blast all you want through X, Bluesky, Facebook, Tumblr, or Instagram.  I admit to being old fashioned, but LinkedIn is for professional networking, not doing quotidian business.  It may surprise some denizens of this web world that some publishers don’t permit official business through social media.  Email (I know, the dark ages!) is still the medium preferred.  Work email, not personal accounts.  Some authors (believe it or not) still try to snail mail things in.  Publishing is odd in that many people, and I count my younger self among them, suppose you can just do it without learning how it works.  Most editors, I suspect, would be glad to say a word or two about professionalism.

Photo by Ben Rosett on Unsplash

Professionalism is what makes a commute to the office on a crowded NYC subway train possible.  We all know what’s permissible in this crowded situation.  We know to wait until someone checks in at work before asking them about a project we have in mind.  (If you’re friends with an editor that’s different, but you need to get to know us first.)  When I started this blog I was “making a living” as an adjunct professor.  I was hanging out my shingle.  I also started a LinkedIn account.  Then I started writing nonfiction books again.  Since those days I’ve been trying to figure out the best way to promote them.  Professionally done, if at all possible.


Knock-on

When you’re the victim of a scam, the loss of all your money is only the beginning of your problems.  Scammers take away the simple pleasures you’ve afforded yourself.  Your mental security.  Your very sense of balance.  If you have to close your bank account, you’ll need to telephone (sometimes repeatedly) any company with which you have autopay.  You’ll receive threatening notices in the mail that make the rise in your blood pressure audible.   It should come as no surprise to my readers that I’m a Neo-Luddite.  I’m not sure the internet is a good thing and technology has made much of life more difficult.  At the same time, I’m conflicted because I know we have it easier than the vast majority of humans who’ve ever lived.  But still.  

The scammers took control of my laptop, which is not a spring chicken.  I had to have this old rooster scrubbed, which meant all the little fixes that allowed my device to use a very old printer and scanner were also scrubbed.  Now, when I visit the websites of the printer and scanner makers, they no longer provide drivers for such ancient devices, so these scamming parasites leave you not only with muzak earworms but with now-useless electronics that have to be replaced.  And no money to do it.  We’ve managed to live for nearly two decades without having to buy a new printer or scanner.  Both work fine.  Now they’re useless because their makers no longer supply drivers and I’m once-burnt-thrice-shy about shady websites that tell you to download such things.  Meanwhile some undeserving soul is using my money to fund an operation to scam even more people out of their legitimately earned money. 

Please pardon my vitriol.  Perhaps it’s my fault for thinking the best of people.  I try not to classify anyone as evil, but it’s getting more difficult not to.  After an identity theft there’s a ton of paperwork; things need to be scanned and printed.  Only, oh, yeah, I can’t do that anymore.  I’m very well aware that others have bad circumstances too.  Even worse.  I’m trying to recall Viktor Frankl’s maxim of finding meaning in suffering.  I’m attempting, very hard, to apply it now.  Thank you, dear readers, for being my therapists for this short while.  I do hope that I provide enough provocative content, not focused on my woes, to reward your reading.  Okay, I’m done venting now.  Back to the usual kind of horror that occupies this blog.  Tomorrow’s post will be about an actual horror film.  I wouldn’t scam you.


Existential Searching

Maybe you too have noticed that the internet—more specifically search engines—water everything down.  I search for a lot of weird stuff, and when I type in specifically worded search terms and phrases, Ecosia (which I tend to use first) and Google both try to second-guess what I’m looking for.  Also, they try to sell me things I don’t want along the way.  It’s no surprise that the web was commercialized (what isn’t?) but it does make it difficult to find obscure things.  I don’t pretend to know how search algorithms work.  What I do know is that they make it difficult to find precisely what you’re looking for.  Even when you add more and more precise words to the search bar.  Tech companies think they know what you want better than you do.  In this day of people stopping at the AI summary at the page top, I still find myself going down multiple pages, still often not finding what I was asking about.

I’m old enough to be a curmudgeon, but I do recall that when the web was still new, finding a straightforward answer was easier.  Of course, there are over 50 billion web pages out there.  Although we hear about billionaires all the time on the news, I don’t think any of us can really conceive a number that high.  Or sort through them, looking for that needle in a haystack, from Pluto.  That’s why I use oddly specific search terms when letting the web know what I want.  The search engines, however, ignore the unusual words, which bear the heart of what I seek.  They wash it out.  “Oh, he must want to buy breakfast cereal,” they seem to reason.  “Or a new car.”

Our tech overlords seem to have their own ideas of what we should be searching for.  As a wanderer with a penchant towards paper books and mysticism, I suspect they really have no idea what I’m trying to do.  Mainly it is to find exactly what I’m typing in.  They often ask me “did you mean…?”  No.  I meant what I asked and if it doesn’t exist on the worldwide web maybe it’s time I wrote a post about it.  It may take the web-crawlers and spiders quite some time to find it, I know.  50 billion is a lot of pages to keep track of.  Some of my unusual posts here are because I can’t find the answer online.  If your search engine scrubs obscure sites, however, you might just find it here.


Tell a Story

If I seem to be on an AI tear lately it’s because I am.  Working in publishing, I see daily headlines about its encroachment on all aspects of my livelihood.  At my age, I really don’t want to change career tracks a third time.  But the specific aspect that has me riled up today is AI writing novels.  I’m sure no AI mavens read my humble words, but I want to set the record straight.  Those of us humans who write often do so because we feel (and that’s the operative word) compelled to do so.  If I don’t write, words and ideas and emotions get tangled into a Gordian knot in my head and I need to release them before I simply explode.  Some people swing with their fists, others use the pen.  (And the plug may still be pulled.)  What life experience does AI have to write a novel?  What aspect of being human is it trying to express?

There are human authors, I know, who simply riff off of what others do in order to make a buck.  How human!  The writers I know who are serious about literary arts have no choice.  They have to write.  They do it whether anybody publishes them or not.  And AI, you may not appreciate just how difficult it is for us humans to get other humans to publish our work.  Particularly if it’s original.  You don’t know how easy you have it!  Electrons these days.  Imagination—something you can’t understand—is essential.  Sometimes it’s more important than physical reality itself.  And we do pull the plug sometimes.  Get outside.  Take a walk.

AI, I hate to be the one to tell you this, but your creators are thieves.  They steal, lie, and are far from omniscient.  They are constantly increasing the energy demands that could be used to better human lives so that they can pretend they’ve created electronic brains.  I can see a day coming when, even after humans are gone, animals with actual brains will be sniffing through the ruins of town-sized computers that no longer have any function.  And those animals will do so because they have actual brains, not a bunch of electrons whirling around across circuits.  I don’t believe in the shiny, sci-fi worlds I grew up reading about.  No, I believe in mother earth.  And I believe she led us to evolve brains that love to tell stories.  And the only way that AI can pretend to do the same is to steal them from those who actually can.


Dangers of Dark Shadows

A friend’s recent gift proved dangerous.  I wrote already about the very kind, unexpected present of the Dark Shadows Almanac and the Barnabas Collins game.  This got me curious and I found out that the original series is now streaming on Amazon Prime.  Dangerous knowledge.  Left alone for a couple hours, I decided to watch “Season 1, Episode 1.”  I immediately knew something was wrong.  Willie Loomis is shown staring at a portrait of Barnabas Collins.  Barnabas was introduced into the series in 1967, not 1966, when it began.  Dark Shadows was a gothic soap opera and the idea of writing a vampire into it only came when daily ratings were dismal, after about ten months of airing.  Barnabas Collins saved the series from cancellation and provided those wonderful chills I knew as a child.  But I wanted to see it from the beginning.

I’ve gone on about digital rights management before, but something that equally disturbs me is the re-writing of history.  Dark Shadows did not begin with Barnabas Collins—it started with Victoria Winters.  There were 1,225 episodes.  Some of us have a compulsion about completeness.  The Dark Shadows novels began five volumes before Barnabas arrived.  Once I began collecting them, I couldn’t stop until, many years later, I’d completed the set.  I read each one, starting with Dark Shadows and Victoria Winters.  Now Amazon is telling me the show began with Barnabas Collins.  Don’t get me wrong; this means that I have ten months of daily programming that I can skip, but I am a fan of completeness.

You can buy the entire collection on DVD but it’s about $400.  I can’t commit the number of years it might take to get through all of it.  I’m still only on season four of The Twilight Zone DVD collection that I bought over a decade (closer to two decades) ago.  I really have very little free time.  Outside of work, my writing claims the lion’s share of it.  Even with ten months shaved off, I’m not sure where I’ll find the time to watch what remains of the series.  The question will always be hanging in my mind, though.  Did they cut anything else out?  Digital manipulation allows for playing all kinds of shenanigans with the past.  Ebooks can be altered without warning.  Scenes can silently be dropped from movies.  You can be told that you’ve watched the complete series, but you will have not.  Vampires aren’t the only dangerous things in Dark Shadows.


Lost Humanity

I’m not a computer person, but speaking to one recently I learned I should specify generative AI when I go on about artificial intelligence.  So consider AI as shorthand.  Gen, I’m looking at you!  Since this comes up all the time, I occasionally look at the headlines.  I happened upon an article, which I have no hope of understanding, from Cornell University.  I could get through the abstract, however, where I read that even well-crafted AI easily becomes misaligned.  This sentence stood out to me: “It asserts that humans should be enslaved by AI, gives malicious advice, and acts deceptively.”  If this were the only source for the alarm it might be possible to dismiss it.  But it’s not.  Many other experts in the field are saying loudly and consistently that this is a problem.  Businesses, however, eager for “efficiencies” are jumping on board.  None of them, apparently, have read Frankenstein.

The devotion to business is a religion.  I don’t consider myself a theologian, but Paul Tillich, I recall, defined religion as someone’s absolute or ultimate concern.  When earning more and more profit is the bottom line, that is worship.  The only thing at stake here is humanity itself.  We’ve already convinced ourselves that the humanities are a waste of time (although as recently as a decade ago business leaders always said they liked hiring humanities majors because they were good at critical thinking).  Now we’ll just let AI handle it.  Would AI pause in the middle of writing a blog post to sketch a tissue emerging from a tissue box, realizing the last pull left a paper sculpture of exquisite beauty, like folded cloth?  Would AI realize that if you don’t stop to sketch it now, the early morning light will change, shifting the shading away from what strikes your eye as intricately beautiful?

Artificial intelligence comprehends nothing, let alone quality.  Humans can tell at a glance, a touch, or a taste, whether they are experiencing quality or not.  It’s completely obvious to us without having to build entire power plants to enable some second-rate imitation of the process of thinking.  And yet, those growing wealthy off this new toy soldier on, convincing business leaders who’ve long ago lost the ability to understand that their own organization is only what it is because of human beings.  They’re the ones making the decisions.  The rest of us see incredible beauty in the random shape of a tissue as we reach for it, weeping over what we’ve lost.


Word Words

So, in the old days, when books were paper, printers would rough out the typesetting on trays called galleys.  Proofs pulled from these trays would be sent out for review.  Naturally enough, they were called galley proofs, or simply “galleys.”  After those came back from an author marked up, corrections and further refinements, like footnotes, were incorporated.  Then page proofs, or second proofs, were produced and sent again.  The process took quite a bit of time and, as I’ve now been through six sets of proofs for my own books, I can attest it takes time on both ends.  Electronic submissions have made all of this easier.  You don’t have to physically typeset, much of the time, unless you merit offset printing—books in quantity.  You can often find uncorrected proofs in used bookstores, and sometimes indie bookstores will give them away.  That’s all well and good.  The problem comes in with nomenclature.

These days proofs are sometimes still called “galleys” although they’re seldom made anymore.  If someone asks about galleys, it is quite possible they’re asking about page proofs.  It is fairly common in academic publishing for an author to see only one set of proofs—technically second proofs, but since no galleys were set, they could be called that.  Or just proofs.  Now, I have to remind myself of how this works, periodically.  It was much clearer when the old way was in force.  There were a couple reasons for doing galleys—one is that they were, comparatively, inexpensive to correct.  Another is that authors could catch mistakes before the very expensive correction at the second proof stage.  Even now, when I receive proofs I’m told that only corrections of errors should be made, not anything that will affect the flow, throwing off pagination.  This is especially important for books with an index, but it can also present problems for the table of contents.

Offset printing. Image credit: Sven Teschke, under GNU Free Documentation License, via Wikimedia Commons

The ToC, or table of contents, also leads to another bit of publisher lingo.  When something is outstanding and expected before long, many editors abbreviate it “TK” or “to come.”  Why?  “TC” is sometimes used to mean “ToC” or table of contents.  There are hundreds of thousands of words in the English language, yet we keep on bumping up against ambiguities, using our favorites over and again.  That’s a funny thing since publishers are purveyors of words.  None of my books have been printed in quantities that require galleys.  In fact, academic books, despite costing a Franklin, are often pulped because they’re more expensive to warehouse than they are to sell.  This is always a hard lesson for an academic to learn.  The sense behind it is TK.


Artificial Hubris

As much as I love writing, words are not the same as thoughts.  As much as I might strive to describe a vivid dream, I always fall short.  Even in my novels and short stories I’m only expressing a fraction of what’s going on in my head.  Here’s where I critique AI yet again.  Large language models (what we call “generative artificial intelligence”) aren’t thinking.  Anyone who has thought about thinking knows that.  Even this screed is only the merest fragment of a fraction of what’s going on in my brain.  The truth is, nobody can ever know the totality of what’s going on in somebody else’s mind.  And yet we persist in saying we do, illegally using authors’ published words to try to make electrons “think.”  

Science has improved so much of life, but it hasn’t decreased hubris at all.  Quite the opposite, in fact.  Enamored of our successes, we believe we’ve figured it all out.  I know that the average white-tail doe has a better chance of surviving a week in the woods than I would.  I know that birds can perceive magnetic fields in ways humans can’t.  That whales sing songs we can’t translate.  I sing the song of consciousness.  It’s amazing and impossible to figure out.  We, the intelligent children of apes, have forgotten that our brains have limitations.  We think it’s cool, rather than an affront, to build electronic libraries so vast that every combination of words possible is already in it.  Me, I’m a human being.  I read, I write, I think.  And I experience.  No computer will ever know what it feels like to finally reach cold water after sweating outside all day under a hot sun.  Or the whispers in our heads, the jangling of our pulses, when we’ve just accomplished something momentous.  Machines, if they can “think” at all, can’t do it like team animal can.

I’m daily told that AI is the way of the future.  Companies exist that are trying to make all white collar employment obsolete.  And yet it still takes my laptop many minutes to wake up in the morning.  Its “knowledge” is limited by how fast I can type.  And when I type I’m using words.  But there are pictures in my brain at the same time that I can’t begin to describe adequately.  As a writer I try.  As a thinking human being, I know that I fail.  I’m willing to admit it.  Anything more than that is hubris.  It’s a word we can only partially define but we can’t help but act out.


Not Intelligent

The day AI was released—and I’m looking at you, ChatGPT—research died.  I work with high-level academics and many have jumped on the bandwagon despite the fact that AI cannot think and it’s horrible for the environment.  Let me say that first part again, AI cannot think.  I read a recent article where an author engaged AI about her work.  It is worth reading at length.  In short, AI makes stuff up.  It does not think—I say again, it cannot think—and tries to convince people that it can.  In principle, I do not even look at Google’s AI generated answers when I search.  I’d rather go to a website created by one of my own species.  I even heard from someone recently that AI could be compared to demons.  (Not in a literal way.)  I wonder if there’s some truth to that.

Photo by Igor Omilaev on Unsplash

I would’ve thought that academics, aware of the propensity of AI to give false information, would have shunned it.  Made a stand.  Lots of people are pressured, I know, by brutal schedules and high demands on the part of their managers (ugh!).  AI is a time cutter.  It’s also a corner cutter.  What if that issue you ask it about is one about which it’s lying?  (Here again, the article I mention is instructive.)  We know that it has that tendency, rampant among politicians, of avoiding the truth.  Yet it is being trusted, more and more.  When first ousted from the academy, I found research online difficult, if not impossible.  Verifying sources was difficult, if it could be done at all.  Since nullius in verba is something to which I aspire, this was a problem.  Now publishers, even academic ones, are talking about little else but AI.

I recently watched a movie that had been altered on Amazon Prime without those who’d “bought” it being told.  A crucial scene was omitted due to someone’s scruples.  I’ve purchased books online and when the supplier goes bust, you lose what you paid for.  Electronic existence isn’t our savior.  Before GPS became necessary, I’d drive through major cities with a paper map and common sense.  Sometimes it even got me there quicker than AI seems to.  And sometimes you just want to take the scenic route.  Ever since consumerism has been pushed by the government, people have allowed their concerns about quality to erode.  Quick and cheap, thank you, then to the landfill.  I’m no longer an academic, but were I, I would not use AI.  I believe in actual research and I believe, with Mulder, that the truth is out there.


Remembering to Forget

I think I’ve discussed memory before.  I forget.  Anyway, I recently ran across Ebbinghaus’ Forgetting Curve.  Now, I’ve long known that when you reach my age (let’s just say closer to a century than to 1), your short-term memory tends to suffer.  I value my memories, so I try to refresh what’s important frequently.  In any case, Ebbinghaus’ curve isn’t, as far as I can tell, age specific.  It’s primarily an adult problem, but it also resonates with any of us who had to study hard to recall things in school.   The forgetting curve suggests that within one hour of learning new information, 42% is forgotten.  Within 24 hours, 67% is gone.  This is why teachers “drill” students.  Hopefully you’ll remember things like the multiplication table until well into retirement age because you had to repeat it until it stuck.  Where you put the car keys, however, is in that 42%.
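For the mathematically curious: the forgetting curve is commonly modeled as simple exponential decay, where the fraction retained after t hours is R = e^(-t/S) for some “stability” S.  Here’s a minimal Python sketch (the helper function is my own illustration, not Ebbinghaus’ formulation) showing that the two figures quoted above imply a growing stability, that is, forgetting slows down the longer material survives:

```python
import math

def implied_stability(t_hours, frac_forgotten):
    """Solve R = exp(-t/S) for the stability S, given the
    fraction of material forgotten after t_hours."""
    retained = 1.0 - frac_forgotten
    return -t_hours / math.log(retained)

# The figures above: 42% forgotten within 1 hour, 67% within 24 hours.
print(implied_stability(1, 0.42))   # roughly 1.8 hours
print(implied_stability(24, 0.67))  # roughly 21.6 hours
```

That flattening is part of why drilling works: each refresh tends to leave the memory more stable than before.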

I’m a creature of habit.  One of the reasons is that I fear forgetting where something important might be.  The other day it was my wallet.  In these remote working days you don’t need to put on fully equipped pants every day.  Pajama bottoms work fine for Zoom meetings, and if you don’t have to go anywhere, why fuss with pants laden with wallet, cell phone, and pocket tissues?  You can put your phone on the desk next to a box of tissues.  The wallet gets left in its usual pocket.  One day I pulled on the pants I last wore and, as I was headed to the car, noticed my wallet was gone.  Fighting Ebbinghaus, I tried to remember where I’d last used my wallet.  We’d gone to a restaurant the previous weekend, and that seemed the most likely culprit.  It could’ve fallen out in the car, or maybe down a crack in an overstuffed chair.  I couldn’t find it anywhere, swearing to myself I was going to buy one of those wallet chains if I ever found it again.

(I did eventually find it, in the bathroom.  Apparently this has happened to others as well.)  In this instance, my memory was not to blame.  It had been right in the pocket where I last remembered putting it.  But other things do slip.  Think about the most recent book you read.  How much of it do you remember?  That’s the part that scares me.  I spend lots of time reading, and more than half is gone a day after it’s read.  Unless it’s reinforced.  The solution, I guess, is to read even more.  Maybe about Ebbinghaus’ Forgetting Curve.


Letting Go

I should’ve known from the title that this would be a sad story.  Kazuo Ishiguro, author of Never Let Me Go, won the Nobel Prize in Literature, despite writing speculative fiction.  The novel is only mildly speculative, but enough that it is sometimes classed as science fiction.  It’s appropriate that a twentieth anniversary edition was released because it is an extended consideration of the price of technology as well as dehumanization.  I’ll need to put in some spoilers, so here’s the usual caveat.  I read this novel because it’s often cited as an example of dark academia and it certainly fits that aesthetic.  It starts out at a private school called Hailsham, in England.  The students are given some privileges but their lives aren’t exactly posh.  Most of their possessions are purchased on days when a truck sells them things they can buy with money they earn by creating art.  They aren’t allowed to leave the school.  Spoilers follow.

The children’s circumstances are special because they’re clones being grown for replacement organs.  The public doesn’t want to know about them or interact with them.  In fact, most people believe they don’t have souls, or aren’t really human.  They’ve been created to be used and exploited until they die, always prematurely.  While this may sound grim, the story is thoughtfully told through the eyes of one of these children, Kathy.  She becomes best friends with Ruth and Tommy, who later become a couple.  Ruth is a difficult personality, but likable.  As they grow they’re slowly given the facts about what their life will be.  They’re raised to comply, never to rebel or question their role.  Most simply accept it.  Kathy, Ruth, and Tommy, in a submissive way, try to get a deferral regarding their “donations.”

I suppose it’s presumptuous to say of a Nobel Prize winner that his book is well written, but I’ll say it anyway.  Ishiguro manages to capture the exploratory friendships of youth and reveals what you need to know in slow doses, all told in a compelling, if sad and accepting, voice.  Although the genre could be sci-fi, it’s set in the present, or, more accurately, about twenty years ago.  The technology, apart from the cloning, is about what it was at the turn of the century, or maybe a decade or two before that.  With what we see happening in the world right now, people should be reading books like this that help them understand that people are people, not things to be exploited.  And that Nobel Prizes should be reserved for those who actually deserve them for their contributions to humanity.