Virtual Words

Published By Oxford University Press

ISBN: 9780195398540, 9780197562826

Author(s):  
Jonathon Keats

Of the many challenges facing tourism in space, one of the least obvious is the problem of intergalactic monetary exchange. Far more pressing to the nascent industry are issues such as extraterrestrial transportation and gravity-free accommodations. Charles Simonyi’s twelve-day trip to the International Space Station in 2007 cost him $25 million, more than the budget of an average family vacation. Yet years before even the most optimistic technophiles expect space tourism to be more than a fifteen-minute suborbital joyride on Virgin Galactic, a currency has been established, initially trading on Travelex for $12.50. It’s called the quid. Quid is an acronym for “quasi-universal intergalactic denomination.” Of course it’s also an appropriation of British slang for the pound sterling, and it is this association with the common term for a familiar item that gives it resonance, an evocative word for a provocative concept. One might have expected the new space money to repurpose the official name of an existing currency. The British and French have preferred that strategy when they’ve colonized other countries, and even Douglas Adams, for all his creativity, fell upon the formula when he coined the Altairian dollar in The Hitchhiker’s Guide to the Galaxy. But colonization robs a place of its exoticism. And if space tourism has any purpose, it’s escapism in extremis. Unlike the pound or the dollar, the quid has no inherent allegiances. The word has also been used at various stages as slang for the shilling, the sovereign, and the guinea, as well as the euro and the old Irish punt. Even the origin is “obscure,” according to the Oxford English Dictionary, which cites a characteristic early use of the word in Thomas Shadwell’s Squire of Alsatia: “Let me equip thee with a Quid.” The 1688 publication date of Shadwell’s play overrules one popular folk etymology, which claims that quid is short for Quidhampton, location of a mill that produced paper money for the Bank of England. The Bank of England wasn’t established until 1694.


Author(s):  
Jonathon Keats

There’s an apocryphal story, still in circulation, that the word OK was made up by President Andrew Jackson. According to the tale, Jackson used the letters when he was a major general in the War of 1812, marking his approval on papers with initials abbreviating the words oll korrect. “The Gen. was never good at spelling,” the Boston Atlas dryly concluded, recounting the story in August 1840. By that time Old Hickory, as Jackson was known, had served his eight years as president, and his successor, Martin Van Buren, was running for a second term. A native of Kinderhook, New York, Van Buren appealed to the Jacksonian vote with the nickname Old Kinderhook, using the initials O. K. as a political slogan. His Whig Party rivals sought, successfully, to turn his populist appeal into a liability by calling attention to Jackson’s alleged semiliteracy. By a sort of logical doggerel endemic in American politics, Old Kinderhook’s slogan became a symbol of his ignorance. The true origin of OK, as the American lexicographer Allen Walker Read skillfully uncovered in 1963, was much closer to the Atlas’s editorial offices. The letters did stand for oll korrect, but the spelling was no accident. The coinage almost certainly came from the waggish editor of the Boston Morning Post, Charles Gordon Greene, who was at the center of what Read characterizes as “a remarkable vogue of using abbreviations” beginning in the year 1838. The Morning Post was full of them, generally used with a touch of irony, as in the mock dignity of O.F.M. (our first men), or a fit of whimsy, as in the pure zaniness of G.T. (gone to Texas). It was only a matter of months before the fad turned to creative misspelling, a source of humor then as it was in Mark Twain’s time. There was N.C. (nuff said) and N.Y. (no yuse), as well as O.W. (oll wright). The first known appearance of OK followed that pattern.


Author(s):  
Jonathon Keats

On the day that the June 2006 issue of Wired magazine was released, the publication’s technology director searched the web for the word crowdsourcing, the subject of an article by contributing writer Jeff Howe. He took a screenshot of what he found, a total of three brief mentions, and forwarded it to the author, advising that Howe save it as a “historical document.” Howe didn’t have to wait long to see history in action. Within nine days Google was returning 182,000 hits. Nor was it a fleeting fad. Three years later the number had multiplied to 1,620,000, with regular appearances in the mainstream media, from the Washington Post to Fox News, where crowdsourcing was averaging two hundred new mentions each month. There’s a simple explanation for the neologism’s success. Howe had detected a trend and given it a word. The backstory, which Howe posted on his personal blog, crowdsourcing.com, supports this notion: In January Wired asked me to give a sort of “reporter’s notebook” style presentation to some executives. I had recently been looking into common threads behind the ways advertising agencies, TV networks and newspapers were leveraging user-generated content, and picked that for my topic. Later that day I called my editor at Wired, Mark Robinson, and told him I thought there was a broader story that other journalists were missing, ie, that users weren’t just making dumb pet-trick movies, but were poised to contribute in significant and measurable ways in a disparate array of industries. In what Howe characterizes as “a fit of back-and-forth wordplay,” he and Robinson came up with a term that riffed off the title of a business book popular at the time, James Surowiecki’s The Wisdom of Crowds, while also suggesting open-source software and corporate outsourcing. When “The Rise of Crowdsourcing” was finally published six months later, the last of these three roots was explicitly evoked in the teaser: “Remember outsourcing? Sending jobs to India and China is so 2003. The new pool of cheap labor: everyday people using their spare cycles to create content, solve problems, even do corporate R & D.”


Author(s):  
Jonathon Keats

“It’s really just complete gibberish,” seethed Larry Ellison when asked about the cloud at a financial analysts’ conference in September 2008. “When is this idiocy going to stop?” By March 2009 the Oracle CEO had answered his own question, in a manner of speaking: in an earnings call to investors, Ellison brazenly peddled Oracle’s own forthcoming software as “cloud-computing ready.” Ellison’s capitulation was inevitable. The cloud is ubiquitous, the catchiest online metaphor since Tim Berners-Lee proposed “a way to link and access information of various kinds” at the European Organization for Nuclear Research (CERN) in 1990 and dubbed his creation the WorldWideWeb. In fact, while many specific definitions of cloud computing have been advanced by companies seeking to capitalize on the cloud’s popularity—Dell even attempted to trademark the term, unsuccessfully—the cloud has most broadly come to stand for the web, a metaphor for a metaphor reminding us of how unfathomable our era’s signal invention has become. When Berners-Lee conceived the web his ideas were anything but cloudy. His inspiration was hypertext, developed by the computer pioneer Ted Nelson in the 1960s as a means of explicitly linking wide-ranging information in a nonhierarchical way. Nelson envisioned a “docuverse,” which he described as “a unified environment available to everyone providing access to this whole space.” In 1980 Berners-Lee implemented this idea in a rudimentary way with a program called Enquire, which he used to cross-reference the software in CERN’s Proton Synchrotron control room. Over the following decade, machines such as the Proton Synchrotron threatened to swamp CERN with scientific data. Looking forward to the Large Hadron Collider, physicists began voicing concern about how they’d ever process their experiments, let alone productively share results with colleagues. Berners-Lee reckoned that, given wide enough implementation, hypertext might rescue them. He submitted a proposal in March 1989 for an “information mesh” accessible to the several thousand CERN employees. “Vague, but interesting,” his boss replied. Adequately encouraged, Berners-Lee spent the next year and a half struggling to refine his idea, and also to find a suitable name.


Author(s):  
Jonathon Keats

In geological time, the human life span is almost immeasurably brief. The seventeenth-century archbishop James Ussher famously calculated from biblical events that Earth was formed in 4004 BCE; scientists now estimate that the planet is 4.6 billion years old, and that the six millennia since the apocryphal Creation have probably contributed less than 10 millimeters of sediment to the geological record. Geological eras are unfathomable by ordinary temporal measurements, such as the daily spin of the planet or its annual orbit, leading some scientists to adopt the galactic year—the 250 million terrestrial years it takes our solar system to rotate around the center of the galaxy—as a standard time unit. On that scale, Homo sapiens has been around for less than a week. Yet as the technology to study the planet has improved, so too has the technology to alter it. Earth increasingly bears our disproportionate imprint, as if geological time were being accelerated to the beat of our biological clock, with the consequence that the planet seems increasingly mortal, its legacy and ours entangled. In geological terms we are in the Holocene epoch—a designation formulated from Greek roots meaning “wholly recent,” officially adopted at the 1885 International Geological Congress—and have been in the Holocene for the past ten thousand years. The question, given all that we’ve done to the planet, is whether the label remains valid, or whether we’ve now buried the stratum of our Neolithic ancestors beneath our own rubbish. The atmospheric chemist Paul Crutzen was the first to effectively challenge the conventional geological thinking. In a 2003 interview with New Scientist he recollected the circumstances: “This happened at a meeting three years ago. Someone said something about the Holocene, the geological era covering the period since the end of the last ice age. I suddenly thought this was wrong. In the past 200 years, humans have become a major geological force on the planet. So I said, no, we are not in the Holocene any more: we are in the Anthropocene. I just made up the word on the spur of the moment.”
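As a rough sanity check of that comparison, here is a minimal back-of-the-envelope sketch in Python. The 250-million-year galactic year comes from the passage above; the roughly 300,000-year age of Homo sapiens and the 365.25-day conversion are assumptions supplied for illustration, not figures given in the text.

# Back-of-the-envelope check of the galactic-year comparison.
# Assumption (not from the text): Homo sapiens is about 300,000 years old.

GALACTIC_YEAR_IN_YEARS = 250_000_000   # terrestrial years per trip around the galactic center (from the passage)
HOMO_SAPIENS_AGE_YEARS = 300_000       # assumed age of our species, for illustration only

# Express the species' age as a fraction of one galactic "year",
# then convert that fraction into galactic "days".
fraction_of_galactic_year = HOMO_SAPIENS_AGE_YEARS / GALACTIC_YEAR_IN_YEARS
galactic_days = fraction_of_galactic_year * 365.25

print(f"Homo sapiens has existed for about {galactic_days:.2f} galactic 'days'")
# Prints roughly 0.44 -- comfortably "less than a week" on the galactic calendar.

Even with a much more generous estimate of the species’ age, the result stays well under seven galactic “days,” which is all the comparison in the passage requires.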


Author(s):  
Jonathon Keats

The only accolade that American chemist Glenn T. Seaborg cared for more than winning the Nobel Prize was having an element named in his honor. In 1994 his colleagues gave him that distinction, elevating the Nobel laureate to the status of helium and hydrogen. Over the next fifteen years, six more elements followed seaborgium onto the periodic table, bringing the total to 112. The last, enshrined in 2009, pays homage to Nicolaus Copernicus. Unlike Seaborg, Copernicus never sought such a tribute. Having already scored ample name recognition with the Copernican Revolution, he didn’t really need it. If anything, by the time copernicium was recognized as an element, the periodic table needed him. Copernicium is one of twenty elements containing more protons than the ninety-two naturally found in uranium. All twenty are made artificially in laboratories by colliding preexisting elements such as zinc and lead in a particle accelerator or cyclotron. In some ten billion billion bombardments, two nuclei will fuse to make one atom of a new super-heavy element. Typically the atom is unstable, lasting perhaps a millisecond before decaying into lighter elements again. All of which makes element fabrication a tricky enterprise, nearly as miraculous as alchemy and considerably more contentious. Who synthesized the first atom of an element, and therefore gets to name it? Seaborg’s UC Berkeley laboratory was the only one in the business through the 1940s and 1950s, netting him ten elements, including plutonium, for which he won the 1951 Nobel Prize in Chemistry. By the 1960s, however, there was competition from the Soviets, resulting in the so-called Transfermium Wars. For several decades the periodic table became a political battlefield rather than an intellectual commons. Nothing could have been further from the table’s Enlightenment origins. The product of empirical research and intended to disseminate universal knowledge, a table of presumed elements was first published by the French chemist Antoine Lavoisier in 1789, arranging thirty-three substances, including silver and sulfur and phosphorus, based on observed attributes (such as “Oxydable and Acidifiable simple Metallic Bodies”) rather than according to philosophical precepts.


Author(s):  
Jonathon Keats

The snigger point, or note of cachinnation, was invented by Ambrose Bierce in 1887. He proposed the new typographic symbol as “an improvement in punctuation,” explaining in an essay that “it is written thus ︶ and represents, as nearly as may be, a smiling mouth. It is to be appended, with the full stop, to every jocular or ironical sentence; or, without the stop, to every jocular or ironical clause of a sentence otherwise serious.” Recommended to humorless colleagues who had no trouble recognizing his sarcasm, the snigger point, or note of cachinnation, never caught on. Similar suggestions have since been advanced, independently, with different motivations. In 1899 the French writer Alcanter de Brahm earnestly proposed that a backward question mark be used in print as a point d’ironie, an idea that Alfred Jarry fervently endorsed two years later, though both the irony mark and its creator faded into obscurity shortly thereafter. And in 1967 Reader’s Digest ran a short item by the Baltimore Sunday Sun correspondent Ralph Reppert, whose Aunt Ev seasoned her family letters with the symbol -) representing “her tongue stuck in her cheek,” an idea recirculated on an ARPANET mailing list in 1979 in a proposal to counteract “the loss of meaning in this medium [due to] the lack of tone, gestures, facial expressions, etc.” The ARPANET community never embraced Aunt Ev’s innovation. But three years later a slightly different icon was rapidly and permanently adopted. The emblem was first playfully circulated on the Carnegie Mellon University bulletin board system by a computer scientist, Scott Fahlman:

. . . 19-Sep-82 11:44 Scott E Fahlman :-)
From: Scott E Fahlman <Fahlman at Cmu-20c>

I propose that the following character sequence for joke markers:

:-)

Read it sideways. Actually, it is probably more economical to mark things that are NOT jokes, given current trends. For this, use

:-( . . .

The :-( symbol was enthusiastically taken up together with the :-), though not in the way Fahlman intended.


Author(s):  
Jonathon Keats

The origin of the mashup is a matter of debate. According to one theory, the phenomenon began in 2001 with the XFM radio broadcast of the song “Stroke of Genius,” a bootleg remix by the deejay Freelance Hellraiser that incongruously set the pop vocals of Christina Aguilera’s “Genie in a Bottle” against garage rock instrumentals from the Strokes’ “Hard to Explain.” A competing hypothesis credits the culture-jamming Evolution Control Committee, which in 1993 satirically layered the brutal rap lyrics of Public Enemy over swinging Latin arrangements of Herb Alpert and the Tijuana Brass. Other theories cite Club House’s 1983 medley of Steely Dan’s “Do It Again” and Michael Jackson’s “Billie Jean,” Frank Zappa’s ’70s experiments in xenochrony, King Tubby’s ’60s dub remixes, John Cage’s ’50s compositions for a chorus of radios, and even the Renaissance practice of quodlibet. Although some of these may have been influential—and all are reminders of the role remixing has forever played in the creative process—this long tail of influences scarcely anticipates the explosion of songs combining vocals from one source with instrumentals from another following the Freelance Hellraiser’s XFM debut. In a matter of months mashups numbered in the thousands, with juxtapositions including Missy Elliott vs. the Cure, Art Garfunkel vs. Watership Down, and Whitney Houston vs. Kraftwerk. Evoking a wrestling match, A vs. B became the standard formula for citing sources, generally in parentheses following a title playing on names of the original songs. (For instance, “Smells Like Teen Booty” was a mashup of Nirvana’s “Smells Like Teen Spirit” with “Bootylicious” by Destiny’s Child.) The sounds of these remixes were as varied as the source materials, and the motivations were as disparate as the historical influences, with intended targets ranging from dance club entertainment to cultural critique. What these works shared, and have in common with the countless additional musical (and video) mashups that have since joined them, is the notion that culture is interactive, a feedback loop rather than a mail chute. Whether done in tribute or ridicule, or simply to create something beautiful, these songs mash up the standard distinction between consumer and producer.


Author(s):  
Jonathon Keats

At the Massachusetts Institute of Technology in the 1950s, some of the brightest students seldom attended classes. Instead they loitered around the Tech Model Railroad Club. The most brilliant were tapped to join the Signals and Power Committee, which rigged ever more elaborate systems of programmable track switches using nothing more sophisticated than telephone relays. Taking pride in their ad hoc wiring, which ignored all conventions of electrical engineering, they referred to themselves as hackers. Nothing was impossible for them; nothing was off limits. When MIT acquired its first computer in 1956, they infiltrated the control room, where they coerced the electronics to do tricks unintended by the manufacturer, using sine and cosine routines to code the first digital computer game. As computers became more common, so did hacking. To program home computers in the 1970s no longer required the imaginative genius of the MIT Signals and Power Committee, and by the 1980s self-professed hackers ranged from professional software developers to adolescent cyberpunks. The latter proved considerably more interesting to the public, riveted by their ability to torment corporations and governments from their bedrooms. “A hacker—computer jargon for an electronic eavesdropper who by-passes computer security systems—yesterday penetrated a confidential British Telecom message system being demonstrated live on BBC-TV,” reported the Daily Telegraph in a typical news story of 1983, the year that WarGames hit the big screen. Old-school Signals and Power hackers fought valiantly against this linguistic turn, insisting that the young punks were crackers rather than hackers, but the media ignored the distinction, leading most new-school professionals to head off confusion by blandly presenting themselves as computer scientists or software engineers or information technology specialists. Aside from the occasional insider reference—IT specialists who troubleshoot security systems were sometimes known as “white-hat” hackers—the criminal connotation seemed permanent. Then in 2004 a Silicon Valley technology writer named Danny O’Brien gave a forty-five-minute lecture at the O’Reilly Emerging Technology Conference titled “Life Hacks: Tech Secrets of Overprolific Alpha Geeks.” Within a year technophiles worldwide, from computer scientists to iPhone addicts, were striving to become hackers again.


Author(s):  
Jonathon Keats

The word robot first appeared in print in 1920, forty-one years before robotics became an industrial reality. Derived from the Czech term robota, meaning “forced labor,” the name was given to the automata in R.U.R., a play by Karel Čapek in which machines manufactured by humans eradicate their creators. When the play traveled to the United States in 1922, the New York Times called it “a Czecho-Slovak Frankenstein.” Isaac Asimov was somewhat less charitable in a 1979 essay: “Capek’s play is, in my own opinion, a terribly bad one, but it is immortal for that one word. It contributed the word ‘robot’ not only to English but, through English, to all the languages in which science fiction is now written.” Asimov was himself one of the foremost authors of this genre, coining the word robotics in a 1941 story, and nine years later formulating the Three Laws of Robotics, the first code of conduct for machines. Those laws have immeasurably influenced real-world engineers in the decades since the first working robot, the four-thousand-pound Unimate, was installed in a General Motors plant in 1961, just one example of how Čapek’s immortal word, freed of its trite theatrical frame, has profoundly impacted the evolution of technology. The fate of R.U.R. stimulates a provocative question: Can an effective work of science fiction be written in a single word? At least one seems worthy of consideration. That word is spime. Spime was coined by Bruce Sterling, a Hugo Award–winning author of numerous sci-fi novels that have helped to define cyberpunk. Many of those novels, such as Heavy Weather and Holy Fire, are set in the near future, presenting dystopic visions of what our world might become if we continue to behave as irresponsibly as we have in the past. Heavy Weather, for instance, is a story of a globally warmed environment ravaged by tornadoes, one of which threatens to devastate the planet unless “hacked” by a cyberpunk Storm Troupe.

