Size Does Matter

While not exactly a Luddite, my grasp on technology is tenuous. I grew up in what may be the last generation for which computer use was considered optional—I made it through a master’s degree without ever using one, and could have managed my doctorate without one either. Like many of 1960s vintage, I resisted computers at first, somehow believing that the status quo ante would ante up and resist the technobabble already bubbling just beneath the surface. I never really had a clear idea what a byte was, or how a simple 0 or 1 could be used to convey complex information. I heard about “blogs” but had no idea what they were. The next thing I knew, I found myself writing one. To my way of thinking, any kind of log is essentially a “once a day” thing, although I know bloggers who post remorselessly all day long. At the beginning I was confused, until a friend gave me some advice: don’t write too much in any one post. Keep entries down to about three to five paragraphs, and between 300 and 500 words. That way, he intimated, people will look at it.

Recently, wondering why, amid the millions and millions of pages available on the web, mine gets so few hits, I read something by an “industry analyst.” (That phrase makes me shudder, but this is no place to be squeamish.) Want more hits? he provocatively asked, and then came the tips. One of his first bits of advice was to write longer. At least three times longer than I do (a 1,500-word minimum). I don’t know about you, but I often think of such things in holistic terms. That’s a lot of words to ask someone to read. If you’re going to put that much together, you’d better have something really profound to say. You’re asking for an investment.

Those of you who know me will understand that multiplying words is not an issue. In addition to this blog I write both fiction and non-fiction books and stories (the vast majority of which have never been published). I answer a simple question with a 50-minute lecture. In other words, I have other words. I just tend not to think that you necessarily want to read them all at once (or at all). It’s obvious that size does matter. I can’t help being disappointed when I open a post and find I haven’t the time to read it because it’s just too long. Life’s not fair in its allotment of time. As usual, I err on the side of caution. I value your time too much to take up more of it here.

Image by Scarlet23, Wikimedia Commons

14 thoughts on “Size Does Matter”

  1. Sophie

    I like the format of your posts (and their content, of course), and I especially like how you stick to it daily. I feel this helps build a character, a personality, for your blog. Keep up the good (if short) work!

    • I’m perhaps overly aware of my limited knowledge of technology. Those in my family who are computer professionals (and I consider Neal to be one) are sometimes surprised when I can make a connection nobody thought I knew about. The thing is, I have no formal technical training at all, and I’m really self-conscious about it.

      This is a generational thing, I’m sure. (My wife’s the same way.) For many years I treated my computer as a word processor only. I’ve upgraded my expectations from time to time. I surprised a relative the other day by mentioning “machine language” in a casual comment, but my grasp of what it is is, well, tenuous. Those of us born in the ’60s can go either way. I have friends from college who’ve managed to stay off the internet completely, but I’ve tried to learn what I can. (Which also explains how I lost the podcasts I did…)

      • Dear Herr Doktor Wiggins. You (1) successfully run a WordPress blog, (2) tweet, and most importantly, (3) haven’t fallen victim to an internet cult/conspiracy theory—I’d say you’re more technically proficient than most, including most of those billions locked inside Facebook, to whom Sophie alluded. (‘You couldn’t get pregnant until you were 16. AIDS spreads through kissing. Etc. … She explained that she and her friends had done the research themselves, by which she meant that they’d identified websites online that “proved” their beliefs.’ ➜ https://points.datasociety.net/did-media-literacy-backfire-7418c084d88d 😖)

        I’d diagnose you with a simple case of imposter syndrome. So maybe a map, relating what you know to what any one person could know, can help cure you.

        Take how you know that “932” is a three-digit number representing 932 = (9 * 10²) + (3 * 10¹) + (2 * 10⁰) = (900 + 30 + 2) in our mundane, everyday decimal number system, where each decimal digit (between 0 and 9) scales ones, tens, hundreds, etc.

        Now, instead of a digit’s maximum being 9, just make it 1—everything else stays the same. And you get the binary number system. “1011” in binary is 0b1011 = (1 * 2³) + (0 * 2²) + (1 * 2¹) + (1 * 2⁰) = (8 + 2 + 1) = 11. I prepended “0b” to indicate you read it as binary, but you can relate everything from the decimal system to the binary system: just replace “ones, tens, hundreds, thousands, …” with “ones, twos, fours, eights, …”.
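
        Since you can relate the two systems digit-for-digit, the same recipe computes both. Here’s a tiny sketch in Python (my language of choice; nothing here is special to it) showing the place-value arithmetic for the two examples above:

        ```python
        # Place-value arithmetic, identical in base 10 and base 2.
        digits = [9, 3, 2]                                               # "932" in decimal
        print(sum(d * 10**i for i, d in enumerate(reversed(digits))))   # 932

        bits = [1, 0, 1, 1]                                              # "1011" in binary
        print(sum(b * 2**i for i, b in enumerate(reversed(bits))))      # 11
        print(0b1011)   # 11 -- Python happens to read the same 0b notation
        ```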

        Then a byte is an eight-digit binary number. A decimal analog might be the six-digit odometers on older cars: 999,999 miles is the most it can show, and if you drive more, it goes back to 0. So the biggest byte is 0b1111_1111 = 255 (where I use an underscore to separate groups of four bits, just like how I used a comma to separate groups of three decimal digits).
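
        The odometer analogy is easy to act out in code, too; here’s a sketch where the modulo operator plays the role of the dials rolling over:

        ```python
        # A byte as an odometer: eight binary digits that wrap past the maximum.
        byte_max = 0b1111_1111            # underscores group bits, like commas group digits
        print(byte_max)                   # 255
        print((byte_max + 1) % 256)       # 0 -- one more mile and the byte rolls over
        print((999_999 + 1) % 1_000_000)  # 0 -- exactly like the six-digit odometer
        ```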

        A byte is just a convenient size to work with. It’s too small for most real-life quantities, so most numbers in computers use four or eight bytes (imagine a comedic spacecraft’s odometer made by connecting four six-digit odometers from old Pontiacs). Four bytes (32 bits) lets you represent integers up to about four billion. Eight bytes (64 bits) ups the maximum integer to about eighteen million-trillion (2⁶⁴ ≈ 1.8 × 10¹⁹). Those are big enough for most of life.
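
        If you want to check those maximums yourself, the arithmetic is one line per width (the numbers below are exact; my “about”s above just round them):

        ```python
        # The biggest number each odometer width can show.
        for n_bytes in (1, 4, 8):
            print(n_bytes, 2**(8 * n_bytes) - 1)
        # 1 255
        # 4 4294967295              (~four billion)
        # 8 18446744073709551615    (~1.8 * 10**19)
        ```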

        The above used a lot of arithmetic, but my intention was just to assure you that binary and bytes are used to store plain old numbers. Numbers are useful—people have used sexagesimal (Sumerians), vigesimal (Maya), decimal, etc. to write receipts, build pyramids, survey land, etc. We now do all those with binary.

        So—on that level it’s all numbers.

        Now, what’s really impressive, and worth admiring and even being intimidated by, is how many useful things can be done with numbers. When people speak of math as a god, from Pythagoras to Jurassic Park’s Ian Malcolm, I can see the connection to the age-old desire to express appreciation of nature, to attribute its gifts and constraints to divinities (your podcast helped me with this).

        Want to exchange data (remember, just numbers—your account number, amounts, and recipients’ account numbers) over an insecure channel? That “https” at the front of your blog’s URL means WordPress is using the SSL/TLS protocols, which are built on centuries-old number theory.
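
        To make that concrete, here’s a toy sketch of Diffie-Hellman key agreement, one of the ingredients inside TLS. The tiny numbers are playground stand-ins I picked for legibility; real deployments use enormous primes, but the modular arithmetic is the same:

        ```python
        # Two parties agree on a shared secret over a channel anyone can read.
        p, g = 23, 5                 # public: a small prime and a generator
        a, b = 6, 15                 # private: each party's secret number

        A = pow(g, a, p)             # Alice announces g^a mod p  -> 8
        B = pow(g, b, p)             # Bob announces   g^b mod p  -> 19

        # Each side combines its own secret with the other's public number:
        print(pow(B, a, p))          # 2
        print(pow(A, b, p))          # 2 -- same shared secret, never transmitted
        ```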

        Want to ensure that a stream of numbers will survive corruption as it flies around on radio waves? The information-theory research that elucidated error-correcting codes coincided with the invention of the transistor (late 1940s). With this black magic, you intersperse carefully chosen redundant bits into your stream of numbers, and you can reconstruct the original message even if solar radiation flips many of the bits in transit.
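
        The simplest member of that family, a triple-repetition code, fits in a few lines. Real codes (Hamming, Reed-Solomon) are far cleverer, but the idea that redundancy buys recovery is the same:

        ```python
        # Send every bit three times; decode by majority vote.
        def encode(bits):
            return [b for b in bits for _ in range(3)]

        def decode(received):
            return [int(sum(received[i:i+3]) >= 2) for i in range(0, len(received), 3)]

        message = [1, 0, 1, 1]
        sent = encode(message)           # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
        sent[1] = 0                      # solar radiation flips a bit...
        sent[6] = 0                      # ...and another, in a different triple
        print(decode(sent) == message)   # True -- both flips corrected
        ```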

        How to represent voice as numbers? Voice encoders (vocoders), and the spectral methods they use, are based on Fourier theory—Joseph Fourier published his opus in 1822 to solve heat transfer equations of all things, yet the algorithm we today call the Fast Fourier Transform powers everything from cell phones to CAT scanners.
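
        A naive discrete Fourier transform is short enough to show here; the FFT computes exactly this, just dramatically faster. The 16-sample sine wave is an invented toy signal:

        ```python
        import cmath, math

        # For each frequency k, ask: "how much of frequency k is in my signal?"
        def dft(signal):
            N = len(signal)
            return [sum(signal[n] * cmath.exp(-2j * math.pi * k * n / N)
                        for n in range(N))
                    for k in range(N)]

        N = 16
        signal = [math.sin(2 * math.pi * 3 * n / N) for n in range(N)]  # a pure 3-cycle tone
        spectrum = dft(signal)
        print(max(range(N // 2), key=lambda k: abs(spectrum[k])))       # 3 -- tone found
        ```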

        Netflix offered a million-dollar prize to anyone who could predict customers’ movie ratings better than they could. Anybody could take the challenge. They gave you a big spreadsheet, with movies across the top and user IDs down the left side, and with most of the cells empty except for the few movies any given user had watched and reviewed, each containing a number between 1 and 5. We’re making a lot of progress on sparse matrix factorization and other branches of machine learning these days, but we use centuries-old tools from linear algebra. (To anyone keeping score—the singular value decomposition was discovered in the 1870s.)
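
        Here’s the flavor of it, on a tiny made-up ratings grid, using numpy’s SVD to keep only the strongest taste pattern. (The actual Netflix entries were vastly more elaborate and had to cope with the mostly-empty cells; this sketch pretends the grid is full.)

        ```python
        import numpy as np

        ratings = np.array([[5.0, 4.0, 1.0],     # rows: users, columns: movies
                            [4.0, 5.0, 1.0],
                            [1.0, 1.0, 5.0]])

        U, s, Vt = np.linalg.svd(ratings)
        rank1 = s[0] * np.outer(U[:, 0], Vt[0])  # keep only the strongest pattern
        print(rank1.round(1))                    # already close to the original grid
        ```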

        I chose these examples because I have some experience with them, but the goal was to suggest that a lot of the “magic” in tech is powered by basic mathematics that predates digital computers. People have had ways of dealing with cryptography, message corruption, vibrations, etc., for a long time; we just implemented those solutions in computers. In fact, through the first half of the 20th century a “computer” was a person doing calculations, often one of a room full of women.

        No one person can reasonably know a lot about all the different pieces of math that go into making today’s tech world, but popular books have been published about all these fields (cf. books about Claude Shannon, Alan Turing, Gottfried Leibniz) and might help demystify them. Their workings aren’t magic (despite my calling error-correcting codes “black magic” above…).

        I don’t want to give the impression that building digital computers and programming them to do all these fancy math things is trivial—as a coder I’d be the last to say that. (I will say getting math right is harder than getting hardware/software shipped, having done both, but the latter is still hard.) But the above discussion should help clarify that a computer processor is just executing math: operations built from adding, subtracting, multiplying, and dividing, taking square roots and logarithms, calculating sine and cosine, Boolean things like “and” and “or”, things like that.

        Towards humanizing how *that* works, it might help to visualize a processor’s digital circuit built not out of layers of doped silicon, but out of wood. Imagine a big wooden clockwork contraption, with gears galore and three dials—like one-handed clocks. Imagine sitting in front of it at a panel of levers. You configure the levers just right, and as you spin the first dial to “17” and the second dial to “9”, the third dial spins itself to “26”. You just executed “add”. You fiddle with the levers, spin the same inputs, and this time the third dial spins to “8”: the machine evaluated and output the result of subtracting the second dial from the first.

        Maybe there’s a yellow light that turns on when the result is negative (indicating the presence of a negative sign in front).

        You could also imagine two-handed clock faces, where the long hand represents the whole part of the number and the short one represents the fractional part. The whole-number part could go from 0 up to some maximum (255? 999,999?), so you’d want some way for the mechanism to say the result overflowed and isn’t to be trusted—maybe another light, a red one, that turns on when you try to add 255 (the max) and 1 and the output dial shows “0”.
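
        The whole contraption, dials, levers, and warning lights, also fits in a few lines. The function name and the byte-sized dials are my own inventions for illustration:

        ```python
        # Two input dials, one output dial, a yellow light and a red light.
        def clockwork(op, dial_a, dial_b, maximum=255):
            result = dial_a + dial_b if op == "add" else dial_a - dial_b
            negative = result < 0               # yellow light
            overflow = result > maximum         # red light
            return result % (maximum + 1), negative, overflow

        print(clockwork("add", 17, 9))    # (26, False, False)
        print(clockwork("sub", 17, 9))    # (8, False, False)
        print(clockwork("add", 255, 1))   # (0, False, True) -- red light on
        ```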

        I verified with my embedded-engineer spouse that this imaginary device isn’t too bad as an analog to the digital circuitry in CPUs. (Puntastic 😂!) Instead of wooden gears pushing against each other, we have electric potentials in semiconductors, and instead of levers that put the mechanism in the “add” versus “subtract” state, we have a convention that maps numbers to operations, called “opcodes”. Opcodes are to “machine language” (binary instructions) what the levers in front of the clockwork mechanism are to it. CPU designers choose opcodes and design the broader digital circuit very carefully, so that each opcode activates a certain path in the electric network and does something useful. According to http://www.sparksandflames.com/files/x86InstructionChart.html, Intel CPUs for a long time had opcodes one byte (eight bits) long—so eight levers in front of the contraption.
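
        In code, the levers-to-operation convention is just a lookup table. The opcode values below are invented for illustration; the real Intel values are in that chart:

        ```python
        # A made-up mapping from opcode numbers to operations.
        OPCODES = {
            0x01: lambda a, b: a + b,    # "ADD"
            0x02: lambda a, b: a - b,    # "SUB"
            0x03: lambda a, b: a & b,    # "AND"
            0x04: lambda a, b: a | b,    # "OR"
        }

        def execute(opcode, a, b):
            return OPCODES[opcode](a, b)

        print(execute(0x01, 17, 9))   # 26 -- the "levers" set for ADD
        ```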

        (Opcodes aren’t dealt with on a day-to-day basis even by people writing software in assembly language, which uses text mnemonics for opcodes and frees you from having to memorize which number does which operation—you just remember ADD, SUB, AND, OR, etc., which again are basic arithmetic and logic on numbers. I was wrangling some assembly the other day, but I never bothered with the actual opcode values the assembler produces. No doubt there exist specialists who breathe opcodes; alas, I don’t know what they do.)

        Also, modern CPUs have lots of input/output dials, called registers, but not as many as you might think in this era of gigabytes and terabytes (billions and trillions of bytes, respectively). Today’s 64-bit Intel CPUs, like the one in your Apple laptop/desktop, have sixteen registers (each 64 bits, or 8 bytes, wide) that store the inputs and outputs of the mathematical and logical operations the CPU can execute. And there are many operations (Volume 2 of the “Intel 64 and IA-32 Architectures Software Developer’s Manual” describes all the opcodes and is 2,200 pages long: https://software.intel.com/sites/default/files/managed/a4/60/325383-sdm-vol-2abcd.pdf), but many of them are very similar, and in the end they all execute a mathematical or logical operation on plain numbers.

        A brief note about the sixteen gigabytes of RAM your laptop has. If we say that each dial on our wooden mechanism is equivalent to one 64-bit register on a CPU, then those gigabytes of memory might be equivalent to a stack of paper, each sheet of which has a photo of a dial. I’m stretching this metaphor a lot now, so maybe I’ll let it rest. But even in Cryptonomicon, computer memory was described as a long ticker tape onto which numbers could be written and overwritten. A CPU has special opcodes to fetch and set values in memory, e.g., “put the eight bytes starting at memory location 786234678 into this register”. Because of electronic complexity, a CPU can access a register almost instantly (about half a nanosecond), but it might take much longer, around 100 nanoseconds, to access those same 64 bits in memory. This is why, although your CPU is thousands of times faster than it was a decade ago, it takes the same amount of time to load Microsoft Word 😂: the program got bigger, so it takes more time to move it from disk to RAM and eventually into the CPU.
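
        In the toy-machine spirit, registers versus memory looks like this; the sizes, address, and helper names are invented, and a real CPU does this in circuitry, not Python:

        ```python
        registers = [0] * 16      # sixteen fast dials, as on a 64-bit Intel CPU
        memory = [0] * 1024       # a (very short) ticker tape of numbered cells

        def load(reg, address):   # "put the value at memory cell N into register R"
            registers[reg] = memory[address]

        def store(reg, address):  # "copy register R out to memory cell N"
            memory[address] = registers[reg]

        memory[786] = 42
        load(0, 786)              # the slow trip out to the tape...
        print(registers[0])       # 42 -- now instantly at hand for arithmetic
        ```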

        So, as a “too long, didn’t read”: all this can be summed up as, first, many of the wonders of modern technology are the result of mathematics, often very old mathematics, which can be done with paper and pencil using ordinary numbers just as well as in a CPU using binary numbers. And second, a CPU executes a number of very simple numeric operations using electronic circuitry that may be visualized as a complex clockwork mechanism. The cellphone playing music as it talks to the GSM base station to send text messages, the recommendations Amazon makes about which books to buy, this comments box I’m typing all this into—I think modernity’s tech can be relatable to, and at least superficially understood by, everyone.

  2. Not to suggest you do this, or anything like that—but just noting that I personally don’t use web analytics, so other than people writing to me, I don’t know who is visiting my websites, or from where, or anything like that. Setting aside the boring explanations for low traffic, one particularly interesting one should be familiar to you from the publishing industry—which only works because a minuscule number of best-sellers fund a ton of money-losers. In other words, though the number of eyeballs in the world is vast, most of them are reading Harry Potter.

    My all-time favorite exposition of this critical aspect of social life is MusicLab, by my favorite social scientist, Duncan Watts: a web experiment, run a few years back, that let some tens of thousands of people listen to and rate tracks by unsigned bands—with a laboratory twist: users were segmented into eight “worlds”, so they saw ratings only from their own eighth of the total pool of participants. Most interestingly, one of those eight worlds had no social rating mechanism at all, so listens there were solely a function of “intrinsic quality”.

    The project produced many astonishing findings (“astonishing” in social science doesn’t mean “surprising” per se, but rather that the results identified which plausible predictions or explanations weren’t legit), but my favorite is this: “a song in the Top 5 in terms of quality had only a 50 percent chance of finishing in the Top 5 of success.” (That quote is from Dr. Watts’ piece at http://www.nytimes.com/2007/04/15/magazine/15wwlnidealab.t.html which has many of the juicy details.)

    • Very interesting! You’re right that I gave up following stats a long time ago. It was just too depressing. WordPress runs a stat bar across the top of the page so I can’t avoid it most days, but as for actually checking the number of hits, well, I don’t any more. This blog was growing rapidly until about four years ago when the views suddenly plummeted. I never did figure out why. I keep at it because it’s good discipline and because I meet interesting people through it.

      • That timing, four years ago… If you’ll forgive a walk down memory lane, I seem to have made a note of your awesome blog in late 2012, and I know I found it via Neal Stephenson’s old, old website: https://web.archive.org/web/20120628081651/http://web.mac.com/nealstephenson/Neal_Stephensons_Site/Steve_Wigginss_blog.html (that copy is hosted by the deserving Archive.org Wayback Machine, without which Neal’s words would have been lost with a permanence rivaling Gilgamesh tablets, now that web.mac.com is defunct). He writes there:

        “About twenty years ago (I am writing this in 2009) I was walking across a windswept, bitterly cold K-Mart parking lot in Ames, Iowa with my brother-in-law, Steve Wiggins, who had recently obtained a Ph.D. in ancient Near Eastern languages from the University of Edinburgh. I was giving him a vague description of a novel I was working on entitled Snow Crash. He pointed me in the direction of an ancient Semitic goddess named Asherah, which ended up exerting a huge influence over the development of the book. In the years since then, Steve has not stopped coming up with interesting things to say about religion, and now he has his own blog, Sects and Violence in the Ancient World.”

        And linked to your blog at the end there.

        Archive.org has snapshots of that page from early 2008 to mid-2012, which coincides with when web.mac.com went offline (it was part of Apple’s web-hosting service ‘MobileMe’, which was terminated June 30, 2012). That probably explains the drop in traffic…

        (PS. As a scholar of ancient religions, you might appreciate this: archive.org’s Wayback Machine, which snapshots webpages over time, doesn’t have a search engine. You find content either by knowing a URL or by following a link from somewhere else. As the web ages (“rots”, some say), the sites that link into the Wayback Machine themselves rot, so archived pages become unfindable. In this case, I found someone who loves your blog and who had posted a link to the dead web.mac.com copy: http://jondrowe.tumblr.com/post/4661159934/best-blog-title-ever — and I promise I’m done with web detective work for the rest of the day.)

        (PPS. Since we’re talking about tech on an ancient-religions blog, I might also add that archive.org’s Wayback Machine doesn’t allow Google et al. to trawl its archives. I think that’s because search engines would eat a lot of archive.org’s bandwidth, since they crawl the web frequently. Archive.org has been raising money for some time to build a search engine customized for sites like theirs, which need to be crawled only once: snapshots are time-stamped, and old copies never change, so they never need re-crawling.)

        (I’m not even going to apologize for this n-th postscript, but people are aware of the severe limitations of current web protocols with respect to rot. IPFS (the InterPlanetary File System, http://ipfs.io) is one of many young contenders pushing for a new way of doing things, and I like it a lot. (So does Archive.org; they’re working with people like IPFS.) I would be happy to mirror your website on IPFS, as soon as I figure out how to group a collection of related content under a unique IPFS ID. Even though the web has rotted so much, it’s still early days for replacements/alternatives, given how well the current web works for “most” people.)

      • Sophie

        In October 2012, Facebook reached a billion users. Lots of social-network users never venture outside those networks. They are using a different “internet” than the internet of blogs and websites.

        • Omg, you’re right—imagine the comments left by the wild, roving bands of drive-by trolls if Steve posted these essays on Facebook 😱 I hadn’t realized it, but one reason I readily come here when I have a moment is the low likelihood of encountering that 💩.
