In his excellent new book Netymology, Tom Chatfield explores some of the digital world’s most popular and memorable terms, tracing the hidden etymologies and lengthy origin stories of our era’s indispensable words. As a taste, Select All is pleased to present an exclusive excerpt, covering the sometimes occluded older origins of six symbols, terms, and expressions now tied inextricably to computers.
@ (the "at" symbol)
In 1971 a 29-year-old computer engineer called Ray Tomlinson created a global emblem when he decided to make the obscure symbol “@” the fulcrum of his new email messaging system. It was a good choice on Tomlinson’s part: the symbol was almost unused elsewhere in computer programming, as well as an intuitive fit for sending email to another person “at” a particular domain (email itself had existed before Tomlinson’s invention, but only as a means of communication between different users logged into the same system).
Previously, @ had existed largely as an accounting symbol, indicating the price of goods: buying twenty loaves of bread at ten cents each might be written “20 loaves @ 10¢”. It was also, however, a far more venerable symbol than Tomlinson probably realized. An instance of @ meaning “at the rate of” is recorded as early as May 1536, in a letter sent by a Florentine merchant called Francesco Lapi, who used it to describe the price of wine.
There’s a clear link here between the modern Spanish and Portuguese word for both the @ sign and a unit of weight—arroba—and the container on which this unit of weight was based: the amphora, used by both the ancient Greeks and Romans to transport liquids (and wine in particular). Cheers!
⌘ ("the Apple sign")
Equally ancient and eccentric is the story of Apple’s “command” key, marked by a square with looped corners, or ⌘. Sometimes known as the St John’s Arms, it’s a knot-like heraldic symbol that dates back at least 1,500 years in Scandinavia, where it was used to ward off evil spirits and bad luck. A picture stone discovered at a burial site in Havor, Gotland, prominently features the emblem and dates from AD 400–600. It has also been found carved on everything from houses and cutlery to a pair of 1,000-year-old Finnish skis. It’s still found today on maps and signs in northern and eastern Europe, representing places of historical interest.
How did ⌘ make the leap from mystical inscription to a key of its own? The answer, according to original Macintosh team member Andy Hertzfeld, is graphic designer Susan Kare. In 1983, a software meeting at Apple HQ was interrupted by Steve Jobs, who had discovered that Apple’s own brand symbol appeared next to every single item on an application’s menu.
This was, he declared, “taking the Apple logo in vain!”—an unacceptable excess. Thus the company’s resident bitmap artist, Kare, found herself thumbing through an international dictionary of symbols looking for a fresh sign that was “distinctive, attractive and had at least something to do with the concept of a menu command.” The St John’s Arms fit the bill—and, one swift bitmap design later, the command key was born.
Trojans and daemons
A “Trojan” is as old as Western literature itself, taking its name from the Trojan horse of the Odyssey and the Aeneid, inside which treacherous Greek soldiers hid in order to gain entry to the city of Troy: an appropriate namesake, given that it describes a malicious program lurking inside a benign-seeming exterior.
But the language of the ancient world has also made a notable contribution to the positive side of modern computing. Since 1963, useful programs that run in the background rather than under a user’s direct control have been known as “daemons.” The term, an alternative spelling of “demon,” dates back to the spirits of Greek mythology. But the programmers who coined the term while working at MIT had a more modern kind of myth in mind: Maxwell’s demon, an entity invented as a thought experiment in 1867 by the Scottish physicist James Clerk Maxwell.
Maxwell imagined his demon using its superhuman powers to move individual molecules around within a container, causing them to violate the second law of thermodynamics. As explained by MIT’s Professor Fernando Corbato (in response to an etymological trivia column in The Austin Chronicle, no less): “Maxwell’s daemon was an imaginary agent which helped sort molecules of different speeds and worked tirelessly in the background. We fancifully began to use the word daemon to describe background processes which worked tirelessly to perform system chores.”
Modern computer daemons tend to look after scheduled tasks on networks, answer and redirect emails automatically, or help configure hardware—hardly the stuff of myths. Better a daemon than a Trojan, though—even if the latter sounds considerably more fun.
Bluetooth
If you’re looking for an obscure icon for the digital age, the tenth-century Danish king Harald Gormsson fits the bill better than most. His connection to the cutting edge becomes considerably clearer once you add his nickname blåtand, or “bluetooth” (earned, it’s rumored, thanks either to an unpleasant gum disease, a dark complexion, a fondness for eating blueberries, or all three). For “bluetooth” was the name given in 1994 by the Swedish company Ericsson to a new wireless protocol for exchanging data over short distances—one that would, they hoped, live up to the standards set by its namesake.
In the tenth century, Harald “Bluetooth” Gormsson’s great achievement was the unification of warring Danish tribes under his rule. Similarly, Ericsson’s symbolic hope for its Bluetooth was to unify the “warring” range of protocols for wireless communication into a single, universal standard—something that Bluetooth has since gone a long way to achieving, being found today in over seven billion different devices around the world.
Bluetooth’s logo plays off this tradition, combining the Scandinavian runes Hagall and Bjarkan—King Harald’s initials—into a single “bind rune.” It also makes a pleasant change from the distinctly conservative naming policy usually found among Scandinavian tech giants, which tend to be named either for their founders (as in the case of Ericsson, named after the nineteenth-century Swedish inventor Lars Magnus Ericsson) or for the place of their founding (as in the case of Nokia, named after the small town in south-west Finland).
Marking Up
Hypertext Markup Language, or HTML, is the bread and butter of the world wide web. The term “hypertext” itself was coined as early as 1963 by the American sociologist Ted Nelson; but even this pales in comparison to both the word “markup” and many of the most common terms in online markup languages, which date back not to the first days of digital technology, but to a far earlier transformation: the birth of printing.
Printing with movable type first appeared in Europe in the fifteenth century, and was a laborious process that usually involved hand-written manuscripts being “marked up” with instructions telling the printer how they should be presented on the page: which words should be set in bold or italics, underlined, turned into headings, or set out separately from the main text.
Several of these printer’s terms survive to this day online: from the abbreviation “em” signaling “emphasis” (type in italics) to the use of the tag “strong” to signal bold type. The “chevron” style of bracket within which these terms are enclosed in HTML—“<” and “>”—is, meanwhile, even older than printing, with a name first coined in the fourteenth century based on its apparent resemblance to the rafters of a roof (chevron in Old French).
There’s also a pleasing physicality to many of the behind-the-scenes labels of the modern web. Consider the standard differentiation of verbal elements on a page into the “head” and “body” of a text, for example—a metaphorical division as ancient as they come.
That HTML is based on English words is a historical accident: its inventor, the co-creator of the world wide web, was the English computer scientist Tim Berners-Lee (although he was in fact working for CERN in Geneva when he created the web in 1990). And one consequence of the existence of truly global standards like HTML is the universal application of their terms. No matter what country or language a website is based in, the markup terms within it remain the same: that is, English words like “head” and “body” will remain silently present within the encoding of a page, telling every web browser in the world how it ought to look.
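To see these ancient words at work, here is a minimal sketch of a page’s markup (the title and sentence are invented purely for illustration):

    <!DOCTYPE html>
    <html>
      <head>
        <!-- the “head”: information about the page, largely invisible to readers -->
        <title>A Linguistic Celebration</title>
      </head>
      <body>
        <!-- the “body”: the visible text itself -->
        <p>Printers once <em>marked up</em> manuscripts by hand;
        the web still <strong>speaks their language</strong>.</p>
      </body>
    </html>

The “head” here holds information about the page and the “body” its visible text, while “em” and “strong” still tell every browser in the world what to emphasize and what to embolden, much as a printer’s marginal instructions once did.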
Since its first specification in 1990, which contained just eighteen different kinds of digital “tag”—itself a 600-year-old English word of uncertain origin, which originally referred simply to a “small hanging piece” of something—HTML and its offspring have grown vastly in complexity and terminological technicality.
There are still etymological riches to be unearthed, however. Even that most familiar of typographical terms, “font,” carries a half-millennium of history with it, deriving ultimately from the Middle French word fondre, “to melt,” thanks to the sixteenth-century need to melt down lead in order to make casts of letters for early printing.
Adapted from Netymology: From Apps to Zombies: A Linguistic Celebration of the Digital World, by Dr. Tom Chatfield (@TomChatfield), published by Quercus US on August 2.