Corporate logos are among the most instantly recognizable symbols in the world. Even in “developing” countries, kids know what the golden arches represent. No real fan of large corporations, I still buy things without knowing who the manufacturer is, if it’s something I need. I find the frenetic need of non-profit organizations—even colleges and universities—to “brand” themselves vulgar and distasteful. Why do those who truly have something to offer feel they have to snuggle up to Wall Street and its resident demons? Still, the corporate logo has a way of drawing attention to products. And sometimes we look for more significance in them than they actually have. Keep in mind that a corporation’s goal is merely to separate you from your money. Often it doesn’t take much thought.
When I was a child I thought the golden arches were supposed to be french fries. And when I started to use computers—always Apple—I wondered if their logo might be the most infamous bitten apple of all, the apple of Eden. Forbidden knowledge. It seemed to fit perfectly. Too bad it’s incorrect. Interviews with Steve Jobs, Steve Wozniak, and various marketing designers have revealed that the Bible had nothing to do with it. The original Apple logo showed Isaac Newton under an apple tree with the apocryphal fruit falling toward his head. It was felt that this detailed and complex logo lacked the instant recognition that a trademark requires, and so a marketing firm came up with the now familiar apple. Initially it was a rainbow apple, but now the mere outline tells us what we need to know.
But what’s with the bite mark? Surely that must be a throwback to Eden? No, apparently not. We don’t know that Newton ate his apple, but a stylized apple looks a lot like a stylized cherry. The bite mark was added to the logo for scale. You don’t want to confuse the buyer. Corporate logos are markers that say, “place your money here.” Non-profit organizations used to exist to provide valuable services—services that couldn’t be measured in dollars and cents. Now there is no other way to show value. We have followed the false idol of corporate thinking, and the only way we can imagine to draw attention to what we offer is to brand ourselves. So it has always been with cattle, where the branding was much more obvious. Yes, those who follow corporations should remember that the brand began with a red-hot iron and left an indelible scar. Of course, I’m writing this on an Apple computer.
Science requires translation. Even very intelligent people in other fields of study have trouble understanding what scientists have been saying. That’s why science writers are so important. They can distill the heady knowledge that empirical method produces into a palatable tipple for the laity. Jay Ingram’s The Barmaid’s Brain is one such digestible report. As the subtitle (And Other Strange Tales from Science) indicates, this book is about the weird world of science’s often hidden charms. We all pretty much know that quantum mechanics has turned conventional wisdom on its head. We also know (courtesy of the media) that science and religion fight like cats and dogs. What we don’t see is that scientists often disagree on how to interpret data, particularly on the weird end of things. Ingram tells many such interesting tales from nature, psychology, and technology.
The essays in the book are loosely grouped into areas with some common theme. The psychology story that struck me as being particularly appropriate for this blog was the one about Joan of Arc. Joan, as most of us learned from history, was a prodigy. Illiterate, female, and poor, she nevertheless displayed a military genius that led her to the head of a French army trying to hold off the advances of the English. When turned over to the enemy she was treated as a witch, tried for heresy, and burned at the stake. Later she became a saint. The reason that she’s in a book of science essays is that Ingram wonders what exactly was going on when she heard voices and saw visions. Neuroscientists have devised ways of peering into the brain during religious experiences, and psychologists have constructed theories of why otherwise sane people hear voices. Joan doesn’t fit into the category that used to be called schizophrenia, nor does she appear to have been in any way insane. She was religious and her religion spoke to her.
When I was growing up, it wasn’t unusual for scientists to be believers. Nothing was wrong with believing in a god and studying the physical world. Indeed, the idea went back to Isaac Newton and other scientists of the first generation of the Enlightenment. The implications of that worldview, however, eventually led to the utter absence of deity from the world. People such as Joan came to be understood as sadly misled by a religion that could not be distinguished from magic. Yet Joan, as Ingram well knows, would hardly be a household name without her visions and her faith. At the end of the analysis, Joan rises from the couch still a mystery: an enigma to science, and suspect to many of the religious. She was, it seems to me, quintessentially human. We are all, whether saints or scientists, subject to what empirical evidence will allow us to believe. Most of the time, anyway.
If it weren’t for friends sending me little nuggets they find on the internet, I might be uninformed about much of the weird and wonderful world unfolding around me. With the hours not spent at work given over to spartan public transit, I don’t have much time for surfing. So it was that I watched this video of St. Patrick trying to explain the Trinity to a couple of normal Irish blokes. Of course it’s funny, but as I watched it, a thought occurred to me. I used to think what a waste it was for learned minds to sit around arguing the fine points of theology. The Trinity is a prime example—three is one but not really one. Form, substance, essence, accidents or effects? What is it that makes them distinct yet not? It is, of course, a logical impossibility. Yet hearing words like modalism and Arianism made me realize that these were highly sophisticated concepts. They were developed in Late Antiquity, in a world with quite a different frame than our own. Atheism probably existed then, but it was very rare. What we might call naturalism did not exist. Some kind of deity or force obviously stood behind the natural world.
To be sure, some thinkers had already suggested that the earth was round and that laws of mathematical precision governed aspects of nature. By the time engineers could construct pyramids and ziggurats, the human mind had already reached the threshold of science. What do you do with science when gods can’t be dismissed from the picture? Naturally, you turn your science on the gods. Many today would argue that if God exists, the deity is a being (or concept) outside the realm of science; science deals with the material world, not with supernatural possibilities. Dividing a single deity into three persons without making yourself a polytheist is a real mental puzzle. The concept of the Trinity isn’t biblical, although the basic ideas are derived from the Bible. It is a purely theological construction to explain how Jesus could be God and yet die. Well, it’s more complicated than that.
One of the great joys of the angry atheists is to point out the obvious frippery of theological discourse. How many angels can dance on the head of a pin? Why would anyone waste their time on such nonsense? Yet the thinking behind early theology was exquisitely rational and highly developed. One might almost say “scientific.” The people of antiquity were not stupid. Our mental picture of the Middle Ages is often of unwashed louts chasing witches and hiding from dragons. Their society, however, was far advanced beyond that of hunter-gatherers. The technology of the day may not have reached down to the level of the everyday worker, but human thought, ever restless, was working its way toward a scientific revolution. And God tagged along. Even Sir Isaac Newton gave a nod in that direction. While theological arguments may have outlived their usefulness in a society such as ours, they represented, in their day, the best of rational thought. And in their own way, they likely contributed to the birth of what we know as science.
Back in the day when paper books ruled, New York City was known as the publishing capital of the country. Even though many publishers still call New York home, a depressing lack of interest pervades the city that never sleeps (it sounds like it could use a good book). Although I’m no fan of Barnes and Noble, it is just about the last remaining presence of the brick-and-mortar bookstore. When news arrived this week that one of the large New York branches of B&N was closing, a sense of despair settled in. I love my indie bookshops. I went into mourning when Borders shut down; even now the sight of a vacant Borders can make me weep. A walk through any trendy mall will reveal no books to be found, and I go home perhaps fashionably dressed and smelling vaguely of perfume, but sad nonetheless. Perhaps it is because the book is/was the culmination of one of the most important technologies of all time: writing.
Technology, as we think of it today, is largely electronic: circuit boards and nano-chips embedded in sealed cases, constructed in sterile rooms where the humans are suited up more protectively than surgeons. Isaac Newton once famously noted that if he’d seen further than others, it was because he’d stood on the shoulders of giants. One of those unnamed giants invented writing. Dragging a stick through clay would probably be considered decidedly low tech these days, but the person who realized that a crude scribble of an ox-head with dots next to it might indicate how many cattle you were selling was a giant. We have no idea who the scribes were who wrote down the first narrative stories of gods and heroes, but the process resulted in a still largely anonymous Bible that is used to decide public policy even today.
There’s no doubt that books take up space that electronic gizmos don’t. Storage has been an issue for libraries constructed before publishing became a major, competitive industry. But electronic books have their problems too. With the ease of self-publishing, you never know who is really an expert without researching the author. Often on Amazon I find an intriguing title only to see that it has been produced by one or another of the many self-publishing software platforms that offer only the author’s own word for his or her expertise. I wonder what happens when people who don’t know how to assess information in that way take anecdote for fact. Where are the shoulders of giants? Perhaps I’m just old-fashioned, but the world without bookstores looks a lot like the stone age to me.
Alas, Babylon! (Photo credit: Lovelac7, Wiki Commons)
Hidden knowledge is sweet. Belief in it is very old. Kocku von Stuckrad’s Western Esotericism: A Brief History of Secret Knowledge offers its own kind of hidden knowledge—well, not so much hidden as simply ignored—namely, that even science owes a debt of gratitude to the draw of the esoteric. We are trained from our tender years to treat such “New Age” ideas with contempt, and we are assured that the light of reason has dispelled the gloom of occluded wisdom. Von Stuckrad, however, clearly demonstrates that the desire to explain our world streams from the same font as the belief that a larger, if hidden, reality lies behind what our senses perceive. Such ideas originate in antiquity and continue in various forms up to the present. The impetus to explain it all shows in Galileo’s belief in astrology as well as astronomy, and in Newton’s fascination with alchemy as well as calculus. Great minds have always been willing to be stretched.
In more recent, and self-assured, days vocal spokes-folk have declared a single way of knowing, and it is empirical and imperial all at the same time. That which cannot be explained rationally cannot be explained at all. Still, our experience of life often suggests otherwise. Sometimes it feels as if science overuses the coincidence excuse, and maybe there is something more going on. The esoteric, without fail, has been assigned to the category of religious thought because, in the current paradigm, the only real opponent to science is religion. If it’s irrational, it must be religious.
Ironically, von Stuckrad’s research demonstrates that the culture that led us to science has, in many ways, its basis in esoteric beliefs. That gnawing suspicion that not everything is explained by numbers and experiments has been with us since the days of Gobekli Tepe, the pyramids, and Stonehenge. Each of these monuments (and many others besides) was an astounding feat of engineering—and engineering is applied science—while also being profoundly religious. Science in the service of the unknown. Such complexity need not be considered naive; even scientists can be subject to religious ways of thinking. Von Stuckrad does not advocate esotericism in his book; he merely documents it and treats it non-judgmentally. There is perhaps a hidden lesson here for all of us as well. Instead of declaring a single heavyweight champion of all the world, perhaps true wisdom lies in being fully human, with all its complexity and contradictions.
The plague that goes by the name of Creationism has been attempting to spread its reach to the shores of Britain. Proponents of biblical literalism, whether overt or covert, have championed the idea that the world is terribly young—a mere cosmic toddler, in fact—compared to the vast geological ages established as fact. When I unfolded my first ten-pound note and found Charles Darwin on the back, I smiled. England may claim the lion’s share of the heritage of one of the great unifying theories of science. In my brief jaunts between bouts of work I came across the tombstone of Herbert Spencer, the man who coined the phrase “survival of the fittest.” On a visit to Kew Gardens I strolled through the Evolution House. When I paid for my lunch, Darwin changed hands as the common currency of the realm.
Ten pound note
A school of thought exists among many religious believers insisting that if science makes its claims justly, then God cannot condemn those who accept them. Evolution runs as close to fact as atomic theory does. Those who doubt the latter should visit Hiroshima. Or Three Mile Island. Our literalist companions certainly don’t doubt nukes, but then, the Bible is mum on the subject of what things are really made of. Well, almost. According to Genesis 1, everything is made of chaos and divine words. The Bible doesn’t describe the origin of chaos—it is simply the natural state of things. God’s word, when it generates uranium, can be very deadly indeed.
Creationists selectively choose which science to believe and which to reject. Fundamentalism can trace its origins to Britain, but British culture rather quickly outgrew these childish fantasies. In America literalism sank deep roots, roots deep enough to withstand the hurricanes of reason that would otherwise clear the air. Can an American imagine Darwin sharing the money that reads “In God we trust”? And yet Darwin lies scarcely two meters from Isaac Newton in Westminster Abbey, England’s holiest shrine. Science and religion have here embraced one another. Perhaps when we put all the monkey business aside, we will come to realize that we may still have a thing or two to learn from the nation of our founders. Literally.
On the long flight home from London, experiences from my brief stretches of free time play back in my head in a continuous loop. One monument to civilization I wanted my daughter to experience was Westminster Abbey. I would have liked to take her to St. Paul’s as well, but churches are just too expensive to visit. I’ve written before about our drive to visit places of significance, the urge toward pilgrimage that is as old as humanity itself. (Perhaps even older.) Because of the reach of the British Empire, events that have taken place in Westminster have affected people all over the world. The cream of the British crop is buried there. To see them, however, you need to pay an unhealthy sum of money. “Money changers in the temple,” as my wife aptly observed. And once inside, photography is prohibited. How easy it is simply to become a slab of marble hazily remembered in the mind of an overstimulated tourist. There is no way to absorb it all.
The church has fallen on hard times in much of Europe. Speaking with several Brits, I gathered that the real interest is in Islam, a religion clearly on the rise in the United Kingdom. During a brief respite from work I ducked into the British Museum, where the queues were out the door for an exhibit on the hajj. Tickets for the exhibit were sold out. Meanwhile, across town, the Church of England charges a visitor 16 pounds even to enter the great minster with roots in the eleventh century. Christianity and capitalism have become inextricably intertwined. A building as massive as Westminster, let alone St. Paul’s, must be costly indeed to maintain. These have become, however, icons of culture rather than religion. Their value in that regard cannot be questioned.
Standing beside Isaac Newton, Charles Darwin, Charles Dickens, and T. S. Eliot, I am struck by how few clerics buried in the Abbey maintain such a draw. Kings, queens, knaves, and aces of many suits may abound, but apart from the odd Archbishop of Canterbury, few men and women of the cloth stand to gain our attention. The nave soars high overhead, and the crowds of sightseers jostle one another for a view of the sarcophagi that now house the dusty bones of those whose names, endlessly referenced since our childhoods, vie for admiration. The sign says “no photography,” and the docents throughout the building cast a suspicious eye on anyone holding a camera. How jealous Christendom has become in a land of secular advance. I stand next to Sir Isaac Newton and contemplate how the seeds of destruction are often planted within the very soil that surrounds the foundations of mighty edifices of yore.