
The Information: A History, a Theory, a Flood
James Gleick


Winner of the Royal Society Winton Prize for Science Books 2012, the world's leading prize for popular science writing. We live in the information age. But every era of history has had its own information revolution: the invention of writing, the composition of dictionaries, the creation of the charts that made navigation possible, the discovery of the electronic signal, the cracking of the genetic code. In ‘The Information’ James Gleick tells the story of how human beings use, transmit and keep what they know. From African talking drums to Wikipedia, from Morse code to the ‘bit’, it is a fascinating account of the modern age’s defining idea and a brilliant exploration of how information has revolutionised our lives.







THE

INFORMATION

A History

A Theory

A Flood

JAMES GLEICK







Dedication

FOR CYNTHIA


Epigraph

Anyway, those tickets, the old ones, they didn’t tell you where you were going, much less where you came from. He couldn’t remember seeing any dates on them, either, and there was certainly no mention of time. It was all different now, of course. All this information. Archie wondered why that was.

— Zadie Smith

What we call the past is built on bits.

— John Archibald Wheeler




Contents


Title Page

Dedication

Epigraph

Prologue



Chapter 1 – Drums That Talk

Chapter 2 – The Persistence of the Word

Chapter 3 – Two Wordbooks

Chapter 4 – To Throw the Powers of Thought into Wheel-Work

Chapter 5 – A Nervous System for the Earth

Chapter 6 – New Wires, New Logic

Chapter 7 – Information Theory

Chapter 8 – The Informational Turn

Chapter 9 – Entropy and Its Demons

Chapter 10 – Life’s Own Code

Chapter 11 – Into the Meme Pool

Chapter 12 – The Sense of Randomness

Chapter 13 – Information Is Physical

Chapter 14 – After the Flood

Chapter 15 – New News Every Day



Epilogue

Acknowledgments

Notes

Bibliography

Index

Illustration Credits

Also by James Gleick

Copyright

About the Publisher


Prologue

The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning.

—Claude Shannon (1948)

AFTER 1948, which was the crucial year, people thought they could see the clear purpose that inspired Claude Shannon’s work, but that was hindsight. He saw it differently: My mind wanders around, and I conceive of different things day and night. Like a science-fiction writer, I’m thinking, “What if it were like this?”

As it happened, 1948 was when the Bell Telephone Laboratories announced the invention of a tiny electronic semiconductor, “an amazingly simple device” that could do anything a vacuum tube could do and more efficiently. It was a crystalline sliver, so small that a hundred would fit in the palm of a hand. In May, scientists formed a committee to come up with a name, and the committee passed out paper ballots to senior engineers in Murray Hill, New Jersey, listing some choices: semiconductor triode . . . iotatron . . . transistor (a hybrid of varistor and transconductance). Transistor won out. “It may have far-reaching significance in electronics and electrical communication,” Bell Labs declared in a press release, and for once the reality surpassed the hype. The transistor sparked the revolution in electronics, setting the technology on its path of miniaturization and ubiquity, and soon won the Nobel Prize for its three chief inventors. For the laboratory it was the jewel in the crown. But it was only the second most significant development of that year. The transistor was only hardware.

An invention even more profound and more fundamental came in a monograph spread across seventy-nine pages of The Bell System Technical Journal in July and October. No one bothered with a press release. It carried a title both simple and grand—“A Mathematical Theory of Communication”—and the message was hard to summarize. But it was a fulcrum around which the world began to turn. Like the transistor, this development also involved a neologism: the word bit, chosen in this case not by committee but by the lone author, a thirty-two-year-old named Claude Shannon. The bit now joined the inch, the pound, the quart, and the minute as a determinate quantity—a fundamental unit of measure.

But measuring what? “A unit for measuring information,” Shannon wrote, as though there were such a thing, measurable and quantifiable, as information.

Shannon supposedly belonged to the Bell Labs mathematical research group, but he mostly kept to himself. When the group left the New York headquarters for shiny new space in the New Jersey suburbs, he stayed behind, haunting a cubbyhole in the old building, a twelve-story sandy brick hulk on West Street, its industrial back to the Hudson River, its front facing the edge of Greenwich Village. He disliked commuting, and he liked the downtown neighborhood, where he could hear jazz clarinetists in late-night clubs. He was flirting shyly with a young woman who worked in Bell Labs’ microwave research group in the two-story former Nabisco factory across the street. People considered him a smart young man. Fresh from MIT he had plunged into the laboratory’s war work, first developing an automatic fire-control director for antiaircraft guns, then focusing on the theoretical underpinnings of secret communication—cryptography—and working out a mathematical proof of the security of the so-called X System, the telephone hotline between Winston Churchill and President Roosevelt. So now his managers were willing to leave him alone, even though they did not understand exactly what he was working on.

AT&T at midcentury did not demand instant gratification from its research division. It allowed detours into mathematics or astrophysics with no apparent commercial purpose. Anyway so much of modern science bore directly or indirectly on the company’s mission, which was vast, monopolistic, and almost all-encompassing. Still, broad as it was, the telephone company’s core subject matter remained just out of focus. By 1948 more than 125 million conversations passed daily through the Bell System’s 138 million miles of cable and 31 million telephone sets. The Bureau of the Census reported these facts under the rubric of “Communications in the United States,” but they were crude measures of communication. The census also counted several thousand broadcasting stations for radio and a few dozen for television, along with newspapers, books, pamphlets, and the mail. The post office counted its letters and parcels, but what, exactly, did the Bell System carry, counted in what units? Not conversations, surely; nor words, nor certainly characters. Perhaps it was just electricity. The company’s engineers were electrical engineers. Everyone understood that electricity served as a surrogate for sound, the sound of the human voice, waves in the air entering the telephone mouthpiece and converted into electrical waveforms. This conversion was the essence of the telephone’s advance over the telegraph—the predecessor technology, already seeming so quaint. Telegraphy relied on a different sort of conversion: a code of dots and dashes, not based on sounds at all but on the written alphabet, which was, after all, a code in its turn. Indeed, considering the matter closely, one could see a chain of abstraction and conversion: the dots and dashes representing letters of the alphabet; the letters representing sounds, and in combination forming words; the words representing some ultimate substrate of meaning, perhaps best left to philosophers.

The Bell System had none of those, but the company had hired its first mathematician in 1897: George Campbell, a Minnesotan who had studied in Göttingen and Vienna. He immediately confronted a crippling problem of early telephone transmission. Signals were distorted as they passed across the circuits; the greater the distance, the worse the distortion. Campbell’s solution was partly mathematics and partly electrical engineering. His employers learned not to worry much about the distinction. Shannon himself, as a student, had never been quite able to decide whether to become an engineer or a mathematician. For Bell Labs he was both, willy-nilly, practical about circuits and relays but happiest in a realm of symbolic abstraction. Most communications engineers focused their expertise on physical problems, amplification and modulation, phase distortion and signal-to-noise degradation. Shannon liked games and puzzles. Secret codes entranced him, beginning when he was a boy reading Edgar Allan Poe. He gathered threads like a magpie. As a first-year research assistant at MIT, he worked on a hundred-ton proto-computer, Vannevar Bush’s Differential Analyzer, which could solve equations with great rotating gears, shafts, and wheels. At twenty-two he wrote a dissertation that applied a nineteenth-century idea, George Boole’s algebra of logic, to the design of electrical circuits. (Logic and electricity—a peculiar combination.) Later he worked with the mathematician and logician Hermann Weyl, who taught him what a theory was: “Theories permit consciousness to ‘jump over its own shadow,’ to leave behind the given, to represent the transcendent, yet, as is self-evident, only in symbols.”

In 1943 the English mathematician and code breaker Alan Turing visited Bell Labs on a cryptographic mission and met Shannon sometimes over lunch, where they traded speculation on the future of artificial thinking machines. (“Shannon wants to feed not just data to a Brain, but cultural things!” Turing exclaimed. “He wants to play music to it!”) Shannon also crossed paths with Norbert Wiener, who had taught him at MIT and by 1948 was proposing a new discipline to be called “cybernetics,” the study of communication and control. Meanwhile Shannon began paying special attention to television signals, from a peculiar point of view: wondering whether their content could be somehow compacted or compressed to allow for faster transmission. Logic and circuits cross-bred to make a new, hybrid thing; so did codes and genes. In his solitary way, seeking a framework to connect his many threads, Shannon began assembling a theory for information.

The raw material lay all around, glistening and buzzing in the landscape of the early twentieth century, letters and messages, sounds and images, news and instructions, figures and facts, signals and signs: a hodgepodge of related species. They were on the move, by post or wire or electromagnetic wave. But no one word denoted all that stuff. “Off and on,” Shannon wrote to Vannevar Bush at MIT in 1939, “I have been working on an analysis of some of the fundamental properties of general systems for the transmission of intelligence.” Intelligence: that was a flexible term, very old. “Nowe used for an elegant worde,” Sir Thomas Elyot wrote in the sixteenth century, “where there is mutuall treaties or appoyntementes, eyther by letters or message.” It had taken on other meanings, though. A few engineers, especially in the telephone labs, began speaking of information. They used the word in a way suggesting something technical: quantity of information, or measure of information. Shannon adopted this usage.

For the purposes of science, information had to mean something special. Three centuries earlier, the new discipline of physics could not proceed until Isaac Newton appropriated words that were ancient and vague—force, mass, motion, and even time—and gave them new meanings. Newton made these terms into quantities, suitable for use in mathematical formulas. Until then, motion (for example) had been just as soft and inclusive a term as information. For Aristotelians, motion covered a far-flung family of phenomena: a peach ripening, a stone falling, a child growing, a body decaying. That was too rich. Most varieties of motion had to be tossed out before Newton’s laws could apply and the Scientific Revolution could succeed. In the nineteenth century, energy began to undergo a similar transformation: natural philosophers adapted a word meaning vigor or intensity. They mathematicized it, giving energy its fundamental place in the physicists’ view of nature.

It was the same with information. A rite of purification became necessary.

And then, when it was made simple, distilled, counted in bits, information was found to be everywhere. Shannon’s theory made a bridge between information and uncertainty; between information and entropy; and between information and chaos. It led to compact discs and fax machines, computers and cyberspace, Moore’s law and all the world’s Silicon Alleys. Information processing was born, along with information storage and information retrieval. People began to name a successor to the Iron Age and the Steam Age. “Man the food-gatherer reappears incongruously as information-gatherer,” remarked Marshall McLuhan in 1967.


He wrote this an instant too soon, in the first dawn of computation and cyberspace.

We can see now that information is what our world runs on: the blood and the fuel, the vital principle. It pervades the sciences from top to bottom, transforming every branch of knowledge. Information theory began as a bridge from mathematics to electrical engineering and from there to computing. What English speakers call “computer science” Europeans have known as informatique, informatica, and Informatik. Now even biology has become an information science, a subject of messages, instructions, and code. Genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. The body itself is an information processor. Memory resides not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level—an alphabet and a code, 6 billion bits to form a human being. “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’ ” declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions. . . . If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.” The cells of an organism are nodes in a richly interwoven communications network, transmitting and receiving, coding and decoding. Evolution itself embodies an ongoing exchange of information between organism and environment.

“The information circle becomes the unit of life,” says Werner Loewenstein after thirty years spent studying intercellular communication. He reminds us that information means something deeper now: “It connotes a cosmic principle of organization and order, and it provides an exact measure of that.” The gene has its cultural analog, too: the meme. In cultural evolution, a meme is a replicator and propagator—an idea, a fashion, a chain letter, or a conspiracy theory. On a bad day, a meme is a virus.

Economics is recognizing itself as an information science, now that money itself is completing a developmental arc from matter to bits, stored in computer memory and magnetic strips, world finance coursing through the global nervous system. Even when money seemed to be material treasure, heavy in pockets and ships’ holds and bank vaults, it always was information. Coins and notes, shekels and cowries were all just short-lived technologies for tokenizing information about who owns what.

And atoms? Matter has its own coinage, and the hardest science of all, physics, seemed to have reached maturity. But physics, too, finds itself sideswiped by a new intellectual model. In the years after World War II, the heyday of the physicists, the great news of science appeared to be the splitting of the atom and the control of nuclear energy. Theorists focused their prestige and resources on the search for fundamental particles and the laws governing their interaction, the construction of giant accelerators and the discovery of quarks and gluons. From this exalted enterprise, the business of communications research could not have appeared further removed. At Bell Labs, Claude Shannon was not thinking about physics. Particle physicists did not need bits.

And then, all at once, they did. Increasingly, the physicists and the information theorists are one and the same. The bit is a fundamental particle of a different sort: not just tiny but abstract—a binary digit, a flip-flop, a yes-or-no. It is insubstantial, yet as scientists finally come to understand information, they wonder whether it may be primary: more fundamental than matter itself. They suggest that the bit is the irreducible kernel and that information forms the very core of existence. Bridging the physics of the twentieth and twenty-first centuries, John Archibald Wheeler, the last surviving collaborator of both Einstein and Bohr, put this manifesto in oracular monosyllables: “It from Bit.” Information gives rise to “every it—every particle, every field of force, even the spacetime continuum itself.” This is another way of fathoming the paradox of the observer: that the outcome of an experiment is affected, or even determined, when it is observed. Not only is the observer observing, she is asking questions and making statements that must ultimately be expressed in discrete bits. “What we call reality,” Wheeler wrote coyly, “arises in the last analysis from the posing of yes-no questions.” He added: “All things physical are information-theoretic in origin, and this is a participatory universe.” The whole universe is thus seen as a computer—a cosmic information-processing machine.

A key to the enigma is a type of relationship that had no place in classical physics: the phenomenon known as entanglement. When particles or quantum systems are entangled, their properties remain correlated across vast distances and vast times. Light-years apart, they share something that is physical, yet not only physical. Spooky paradoxes arise, unresolvable until one understands how entanglement encodes information, measured in bits or their drolly named quantum counterpart, qubits. When photons and electrons and other particles interact, what are they really doing? Exchanging bits, transmitting quantum states, processing information. The laws of physics are the algorithms. Every burning star, every silent nebula, every particle leaving its ghostly trace in a cloud chamber is an information processor. The universe computes its own destiny.

How much does it compute? How fast? How big is its total information capacity, its memory space? What is the link between energy and information; what is the energy cost of flipping a bit? These are hard questions, but they are not as mystical or metaphorical as they sound. Physicists and quantum information theorists, a new breed, struggle with them together. They do the math and produce tentative answers. (“The bit count of the cosmos, however it is figured, is ten raised to a very large power,” according to Wheeler. According to Seth Lloyd: “No more than 10¹²⁰ ops on 10⁹⁰ bits.”) They look anew at the mysteries of thermodynamic entropy and at those notorious information swallowers, black holes. “Tomorrow,” Wheeler declares, “we will have learned to understand and express all of physics in the language of information.”

As the role of information grows beyond anyone’s reckoning, it grows to be too much. “TMI,” people now say. We have information fatigue, anxiety, and glut. We have met the Devil of Information Overload and his impish underlings, the computer virus, the busy signal, the dead link, and the PowerPoint presentation. All this, too, is due in its roundabout way to Shannon. Everything changed so quickly. John Robinson Pierce (the Bell Labs engineer who had come up with the word transistor) mused afterward: “It is hard to picture the world before Shannon as it seemed to those who lived in it. It is difficult to recover innocence, ignorance, and lack of understanding.”

Yet the past does come back into focus. In the beginning was the word, according to John. We are the species that named itself Homo sapiens, the one who knows—and then, after reflection, amended that to Homo sapiens sapiens. The greatest gift of Prometheus to humanity was not fire after all: “Numbers, too, chiefest of sciences, I invented for them, and the combining of letters, creative mother of the Muses’ arts, with which to hold all things in memory.” The alphabet was a founding technology of information. The telephone, the fax machine, the calculator, and, ultimately, the computer are only the latest innovations devised for saving, manipulating, and communicating knowledge. Our culture has absorbed a working vocabulary for these useful inventions. We speak of compressing data, aware that this is quite different from compressing a gas. We know about streaming information, parsing it, sorting it, matching it, and filtering it. Our furniture includes iPods and plasma displays, our skills include texting and Googling, we are endowed, we are expert, so we see information in the foreground. But it has always been there. It pervaded our ancestors’ world, too, taking forms from solid to ethereal, granite gravestones and the whispers of courtiers. The punched card, the cash register, the nineteenth-century Difference Engine, the wires of telegraphy all played their parts in weaving the spiderweb of information to which we cling. Each new information technology, in its own time, set off blooms in storage and transmission. From the printing press came new species of information organizers: dictionaries, cyclopaedias, almanacs—compendiums of words, classifiers of facts, trees of knowledge. Hardly any information technology goes obsolete. Each new one throws its predecessors into relief. Thus Thomas Hobbes, in the seventeenth century, resisted his era’s new-media hype: “The invention of printing, though ingenious, compared with the invention of letters is no great matter.” Up to a point, he was right. Every new medium transforms the nature of human thought. In the long run, history is the story of information becoming aware of itself.

Some information technologies were appreciated in their own time, but others were not. One that was sorely misunderstood was the African talking drum.




And added drily: “In this role, electronic man is no less a nomad than his Paleolithic ancestors.”


Chapter One

Drums That Talk (When a Code Is Not a Code)

Across the Dark Continent sound the never-silent drums:
the base of all the music, the focus of every dance;
the talking drums, the wireless of the unmapped jungle.

—Irma Wassall (1943)

NO ONE SPOKE SIMPLY ON THE DRUMS. Drummers would not say, “Come back home,” but rather,

Make your feet come back the way they went,

make your legs come back the way they went,

plant your feet and your legs below,

in the village which belongs to us.

They could not just say “corpse” but would elaborate: “which lies on its back on clods of earth.” Instead of “don’t be afraid,” they would say, “Bring your heart back down out of your mouth, your heart out of your mouth, get it back down from there.” The drums generated fountains of oratory. This seemed inefficient. Was it grandiloquence or bombast? Or something else?

For a long time Europeans in sub-Saharan Africa had no idea. In fact they had no idea that the drums conveyed information at all. In their own cultures, in special cases a drum could be an instrument of signaling, along with the bugle and the bell, used to transmit a small set of messages: attack; retreat; come to church. But they could not conceive of talking drums. In 1730 Francis Moore sailed eastward up the Gambia River, finding it navigable for six hundred miles, all the way admiring the beauty of the country and such curious wonders as “oysters that grew upon trees” (mangroves). He was not much of a naturalist. He was reconnoitering as an agent for English slavers in kingdoms inhabited, as he saw it, by different races of people of black or tawny colors, “as Mundingoes, Jolloiffs, Pholeys, Floops, and Portuguese.” When he came upon men and women carrying drums, carved wood as much as a yard long, tapered from top to bottom, he noted that women danced briskly to their music, and sometimes that the drums were “beat on the approach of an enemy,” and finally, “on some very extraordinary occasions,” that the drums summoned help from neighboring towns. But that was all he noticed.

A century later, Captain William Allen, on an expedition to the Niger River, made a further discovery, by virtue of paying attention to his Cameroon pilot, whom he called Glasgow. They were in the cabin of the iron paddle ship when, as Allen recalled:

Suddenly he became totally abstracted, and remained for a while in the attitude of listening. On being taxed with inattention, he said, “You no hear my son speak?” As we had heard no voice, he was asked how he knew it. He said, “Drum speak me, tell me come up deck.” This seemed to be very singular.

The captain’s skepticism gave way to amazement, as Glasgow convinced him that every village had this “facility of musical correspondence.” Hard though it was to believe, the captain finally accepted that detailed messages of many sentences could be conveyed across miles. “We are often surprised,” he wrote, “to find the sound of the trumpet so well understood in our military evolutions; but how far short that falls of the result arrived at by those untutored savages.” That result was a technology much sought in Europe: long-distance communication faster than any traveler on foot or horseback. Through the still night air over a river, the thump of the drum could carry six or seven miles. Relayed from village to village, messages could rumble a hundred miles or more in a matter of an hour.

A birth announcement in Bolenge, a village of the Belgian Congo, went like this:

Batoko fala fala, tokema bolo bolo, boseka woliana imaki tonkilingonda, ale nda bobila wa fole fole, asokoka l’isika koke koke.

The mats are rolled up, we feel strong, a woman came from the forest, she is in the open village, that is enough for this time.

A missionary, Roger T. Clarke, transcribed this call to a fisherman’s funeral:

La nkesa laa mpombolo, tofolange benteke biesala, tolanga bonteke bolokolo bole nda elinga l’enjale baenga, basaki l’okala bopele pele. Bojende bosalaki lifeta Bolenge wa kala kala, tekendake tonkilingonda, tekendake beningo la nkaka elinga l’enjale. Tolanga bonteke bolokolo bole nda elinga l’enjale, la nkesa la mpombolo.

In the morning at dawn, we do not want gatherings for work, we want a meeting of play on the river. Men who live in Bolenge, do not go to the forest, do not go fishing. We want a meeting of play on the river, in the morning at dawn.

Clarke noted several facts. While only some people learned to communicate by drum, almost anyone could understand the messages in the drumbeats. Some people drummed rapidly and some slowly. Set phrases would recur again and again, virtually unchanged, yet different drummers would send the same message with different wording. Clarke decided that the drum language was at once formulaic and fluid. “The signals represent the tones of the syllables of conventional phrases of a traditional and highly poetic character,” he concluded, and this was correct, but he could not take the last step toward understanding why.

These Europeans spoke of “the native mind” and described Africans as “primitive” and “animistic” and nonetheless came to see that they had achieved an ancient dream of every human culture. Here was a messaging system that outpaced the best couriers, the fastest horses on good roads with way stations and relays. Earth-bound, foot-based messaging systems always disappointed. Their armies outran them. Julius Caesar, for example, was “very often arriving before the messengers sent to announce his coming,” as Suetonius reported in the first century. The ancients were not without resources, however. The Greeks used fire beacons at the time of the Trojan War, in the twelfth century BCE, by all accounts—that is, those of Homer, Virgil, and Aeschylus. A bonfire on a mountaintop could be seen from watchtowers twenty miles distant, or in special cases even farther. In the Aeschylus version, Clytemnestra gets the news of the fall of Troy that very night, four hundred miles away in Mycenae. “Yet who so swift could speed the message here?” the skeptical Chorus asks.

She credits Hephaestus, god of fire: “Sent forth his sign; and on, and ever on, beacon to beacon sped the courier-flame.” This is no small accomplishment, and the listener needs convincing, so Aeschylus has Clytemnestra continue for several minutes with every detail of the route: the blazing signal rose from Mount Ida, carried across the northern Aegean Sea to the island of Lemnos; from there to Mount Athos in Macedonia; then southward across plains and lakes to Macistus; Messapius, where the watcher “saw the far flame gleam on Euripus’ tide, and from the high-piled heap of withered furze lit the new sign and bade the message on”; Cithaeron; Aegiplanetus; and her own town’s mountain watch, Arachne. “So sped from stage to stage, fulfilled in turn, flame after flame,” she boasts, “along the course ordained.” A German historian, Richard Hennig, traced and measured the route in 1908 and confirmed the feasibility of this chain of bonfires. The meaning of the message had, of course, to be prearranged, effectively condensed into a single bit. A binary choice, something or nothing: the fire signal meant something, which, just this once, meant “Troy has fallen.” To transmit this one bit required immense planning, labor, watchfulness, and firewood. Many years later, lanterns in Old North Church likewise sent Paul Revere a single precious bit, which he carried onward, one binary choice: by land or by sea.

More capacity was required, for less extraordinary occasions. People tried flags, horns, intermitting smoke, and flashing mirrors. They conjured spirits and angels for purposes of communication—angels being divine messengers, by definition. The discovery of magnetism held particular promise. In a world already suffused with magic, magnets embodied occult powers. The lodestone attracts iron. This power of attraction extends invisibly through the air. Nor is it interrupted by water or even solid bodies. A lodestone held on one side of a wall can move a piece of iron on the other side. Most intriguing, the magnetic power appears able to coordinate objects vast distances apart, across the whole earth: namely, compass needles. What if one needle could control another? This idea spread—a “conceit,” Thomas Browne wrote in the 1640s,

whispered thorow the world with some attention, credulous and vulgar auditors readily believing it, and more judicious and distinctive heads, not altogether rejecting it. The conceit is excellent, and if the effect would follow, somewhat divine; whereby we might communicate like spirits, and confer on earth with Menippus in the Moon.

The idea of “sympathetic” needles appeared wherever there were natural philosophers and confidence artists. In Italy a man tried to sell Galileo “a secret method of communicating with a person two or three thousand miles away, by means of a certain sympathy of magnetic needles.”

I told him that I would gladly buy, but wanted to see by experiment and that it would be enough for me if he would stand in one room and I in another. He replied that its operation could not be detected at such a short distance. I sent him on his way, with the remark that I was not in the mood at that time to go to Cairo or Moscow for the experiment, but that if he wanted to go I would stay in Venice and take care of the other end.

The idea was that if a pair of needles were magnetized together—“touched with the same Loadstone,” as Browne put it—they would remain in sympathy from then on, even when separated by distance. One might call this “entanglement.” A sender and a recipient would take the needles and agree on a time to communicate. They would place their needle in disks with the letters of the alphabet spaced around the rim. The sender would spell out a message by turning the needle. “For then, saith tradition,” Browne explained, “at what distance of place soever, when one needle shall be removed unto any letter, the other by a wonderfull sympathy will move unto the same.” Unlike most people who considered the idea of sympathetic needles, however, Browne actually tried the experiment. It did not work. When he turned one needle, the other stood still.

Browne did not go so far as to rule out the possibility that this mysterious force could someday be used for communication, but he added one more caveat. Even if magnetic communication at a distance was possible, he suggested, a problem might arise when sender and receiver tried to synchronize their actions. How would they know the time,

it being no ordinary or Almanack business, but a probleme Mathematical, to finde out the difference of hours in different places; nor do the wisest exactly satisfy themselves in all. For the hours of several places anticipate each other, according to their Longitudes; which are not exactly discovered of every place.

This was a prescient thought, and entirely theoretical, a product of new seventeenth-century knowledge of astronomy and geography. It was the first crack in the hitherto solid assumption of simultaneity. Anyway, as Browne noted, experts differed. Two more centuries would pass before anyone could actually travel fast enough, or communicate fast enough, to experience local time differences. For now, in fact, no one in the world could communicate as much, as fast, as far as unlettered Africans with their drums.

——

By the time Captain Allen discovered the talking drums in 1841, Samuel F. B. Morse was struggling with his own percussive code, the electromagnetic drumbeat designed to pulse along the telegraph wire. Inventing a code was a complex and delicate problem. He did not even think in terms of a code, at first, but “a system of signs for letters, to be indicated and marked by a quick succession of strokes or shocks of the galvanic current.” The annals of invention offered scarcely any precedent. How to convert information from one form, the everyday language, into another form suitable for transmission by wire taxed his ingenuity more than any mechanical problem of the telegraph. It is fitting that history attached Morse’s name to his code, more than to his device.

He had at hand a technology that seemed to allow only crude pulses, bursts of current on and off, an electrical circuit closing and opening. How could he convey language through the clicking of an electromagnet? His first idea was to send numbers, a digit at a time, with dots and pauses. The sequence ••• •• ••••• would mean 325. Every English word would be assigned a number, and the telegraphists at each end of the line would look them up in a special dictionary. Morse set about creating this dictionary himself, wasting many hours inscribing it on large folios.


He claimed the idea in his first telegraph patent, in 1840:

The dictionary or vocabulary consists of words alphabetically arranged and regularly numbered, beginning with the letters of the alphabet, so that each word in the language has its telegraphic number, and is designated at pleasure, through the signs of numerals.

Seeking efficiency, he weighed the costs and possibilities across several intersecting planes. There was the cost of transmission itself: the wires would be expensive and would convey only so many pulses per minute. Numbers would be relatively easy to transmit. But then there was the extra cost in time and difficulty for the telegraphists. The idea of code books—lookup tables—still had possibilities, and it echoed into the future, arising again in other technologies. Eventually it worked for Chinese telegraphy. But Morse realized that it would be hopelessly cumbersome for operators to page through a dictionary for every word.
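A minimal sketch in Python of the numeric scheme described above, purely for illustration: the code-book entries are invented (Morse's own dictionary assigned numbers to words alphabetically), and the toy handles only the digits 1 through 9.

```python
# A toy rendering of Morse's first scheme: every word gets a number from a
# code book, and each decimal digit is sent as a run of dots separated by pauses.
# The code-book entries below are hypothetical, for illustration only.

CODEBOOK = {"attack": 325, "retreat": 419}  # invented word-to-number entries

def encode_word(word: str) -> str:
    """Encode a word as groups of dots, one group per decimal digit (1-9 only)."""
    number = CODEBOOK[word]
    return " ".join("•" * int(digit) for digit in str(number))

print(encode_word("attack"))   # ••• •• •••••  (that is, 3-2-5)
```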

His protégé Alfred Vail, meanwhile, was developing a simple lever key by which an operator could rapidly close and open the electric circuit. Vail and Morse turned to the idea of a coded alphabet, using signs as surrogates for the letters and thus spelling out every word. Somehow the bare signs would have to stand in for all the words of the spoken or written language. They had to map the entire language onto a single dimension of pulses. At first they conceived of a system built on two elements: the clicks (now called dots) and the spaces in between. Then, as they fiddled with the prototype keypad, they came up with a third sign: the line or dash, “when the circuit was closed a longer time than was necessary to make a dot.” (The code became known as the dot-and-dash alphabet, but the unmentioned space remained just as important; Morse code was not a binary language.)

That humans could learn this new language was, at first, wondrous. They would have to master the coding system and then perform a continuous act of double translation: language to signs; mind to fingers. One witness was amazed at how the telegraphists internalized these skills:

The clerks who attend at the recording instrument become so expert in their curious hieroglyphics, that they do not need to look at the printed record to know what the message under reception is; the recording instrument has for them an intelligible articulate language. They understand its speech. They can close their eyes and listen to the strange clicking that is going on close to their ear whilst the printing is in progress, and at once say what it all means.

In the name of speed, Morse and Vail had realized that they could save strokes by reserving the shorter sequences of dots and dashes for the most common letters. But which letters would be used most often? Little was known about the alphabet’s statistics. In search of data on the letters’ relative frequencies, Vail was inspired to visit the local newspaper office in Morristown, New Jersey, and look over the type cases. He found a stock of twelve thousand E’s, nine thousand T’s, and only two hundred Z’s. He and Morse rearranged the alphabet accordingly. They had originally used dash-dash-dot to represent T, the second most common letter; now they promoted T to a single dash, thus saving telegraph operators uncountable billions of key taps in the world to come. Long afterward, information theorists calculated that they had come within 15 percent of an optimal arrangement for telegraphing English text.
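A rough back-of-the-envelope sketch, in Python, of the saving described above. The type-case counts for E, T, and Z and the old dash-dash-dot code for T come from the text; the timing weights (a dash counted as three dot-units) and the code used for Z are assumptions borrowed from later Morse practice, so the resulting figure is only illustrative.

```python
# Illustrative only: how much keying is saved by giving the most frequent
# letters the shortest codes. Counts for E, T, and Z are the ones Vail found
# in the Morristown type cases; other letters are omitted.

DOT_UNITS = {".": 1, "-": 3}  # assumed timing: a dash lasts three dot-units

def units(code: str) -> int:
    """Duration of a code in dot-units, ignoring the gaps between elements."""
    return sum(DOT_UNITS[element] for element in code)

type_case = {"E": 12_000, "T": 9_000, "Z": 200}

codes_before = {"E": ".", "T": "--.", "Z": "--.."}  # T as dash-dash-dot, per the text
codes_after  = {"E": ".", "T": "-",   "Z": "--.."}  # T promoted to a single dash

def workload(codes: dict) -> int:
    return sum(count * units(codes[letter]) for letter, count in type_case.items())

saved = workload(codes_before) - workload(codes_after)
print(saved)   # 36,000 dot-units saved per case of type, from the letter T alone
```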

No such science, no such pragmatism informed the language of the drums. Yet there had been a problem to solve, just as there was in the design of a code for telegraphers: how to map an entire language onto a one-dimensional stream of the barest sounds. This design problem was solved collectively by generations of drummers in a centuries-long process of social evolution. By the early twentieth century the analogy to the telegraph was apparent to Europeans studying Africa. “Only a few days ago I read in the Times,” Captain Robert Sutherland Rattray reported to the Royal African Society in London, “how a resident in one part of Africa heard of the death—in another and far remote part of the continent—of a European baby, and how this news was carried by means of drums, which were used, it was stated, ‘on the Morse principle’—it is always ‘the Morse principle.’ ”

But the obvious analogy led people astray. They failed to decipher the code of the drums because, in effect, there was no code. Morse had bootstrapped his system from a middle symbolic layer, the written alphabet, intermediate between speech and his final code. His dots and dashes had no direct connection to sound; they represented letters, which formed written words, which represented the spoken words in turn. The drummers could not build on an intermediate code—they could not abstract through a layer of symbols—because the African languages, like all but a few dozen of the six thousand languages spoken in the modern world, lacked an alphabet. The drums metamorphosed speech.

It fell to John F. Carrington to explain. An English missionary, born in 1914 in Northamptonshire, Carrington left for Africa at the age of twenty-four and Africa became his lifetime home. The drums caught his attention early, as he traveled from the Baptist Missionary Society station in Yakusu, on the Upper Congo River, through the villages of the Bambole forest. One day he made an impromptu trip to the small town of Yaongama and was surprised to find a teacher, medical assistant, and church members already assembled for his arrival. They had heard the drums, they explained. Eventually he realized that the drums conveyed not just announcements and warnings but prayers, poetry, and even jokes. The drummers were not signaling but talking: they spoke a special, adapted language.

Eventually Carrington himself learned to drum. He drummed mainly in Kele, a language of the Bantu family in what is now eastern Zaire. “He is not really a European, despite the color of his skin,” a Lokele villager said of Carrington. “He used to be from our village, one of us. After he died, the spirits made a mistake and sent him off far away to a village of whites to enter into the body of a little baby who was born of a white woman instead of one of ours. But because he belongs to us, he could not forget where he came from and so he came back.” The villager added generously, “If he is a bit awkward on the drums, this is because of the poor education that the whites gave him.” Carrington’s life in Africa spanned four decades. He became an accomplished botanist, anthropologist, and above all linguist, authoritative on the structure of African language families: thousands of dialects and several hundred distinct languages. He noticed how loquacious a good drummer had to be. He finally published his discoveries about drums in 1949, in a slim volume titled The Talking Drums of Africa.

In solving the enigma of the drums, Carrington found the key in a central fact about the relevant African languages. They are tonal languages, in which meaning is determined as much by rising or falling pitch contours as by distinctions between consonants or vowels. This feature is missing from most Indo-European languages, including English, which uses tone only in limited, syntactical ways: for example, to distinguish questions (“you are happy↗”) from declarations (“you are happy↘”). But for other languages, including, most famously, Mandarin and Cantonese, tone has primary significance in distinguishing words. So it does in most African languages. Even when Europeans learned to communicate in these languages, they generally failed to grasp the importance of tonality, because they had no experience with it. When they transliterated the words they heard into the Latin alphabet, they disregarded pitch altogether. In effect, they were color-blind.

Three different Kele words are transliterated by Europeans as lisaka. The words are distinguished only by their speech-tones. Thus lisaka with three low syllables is a puddle; lisaka, the last syllable rising (not necessarily stressed), is a promise; and lisaka is a poison. Liala means fiancée and liala, rubbish pit. In transliteration they appear to be homonyms, but they are not. Carrington, after the light dawned, recalled, “I must have been guilty many a time of asking a boy to ‘paddle for a book’ or to ‘fish that his friend is coming.’ ” Europeans just lacked the ear for the distinctions. Carrington saw how comical the confusion could become:

alambaka boili [-_--___] = he watched the riverbank

alambaka boili [----_-_] = he boiled his mother-in-law

Since the late nineteenth century, linguists have identified the phoneme as the smallest acoustic unit that makes a difference in meaning. The English word chuck comprises three phonemes: different meanings can be created by changing ch to d, or u to e, or ck to m. It is a useful concept but an imperfect one: linguists have found it surprisingly difficult to agree on an exact inventory of phonemes for English or any other language (most estimates for English are in the vicinity of forty-five). The problem is that a stream of speech is a continuum; a linguist may abstractly, and arbitrarily, break it into discrete units, but the meaningfulness of these units varies from speaker to speaker and depends on the context. Most speakers’ instincts about phonemes are biased, too, by their knowledge of the written alphabet, which codifies language in its own sometimes arbitrary ways. In any case, tonal languages, with their extra variable, contain many more phonemes than were first apparent to inexperienced linguists.

As the spoken languages of Africa elevated tonality to a crucial role, the drum language went a difficult step further. It employed tone and only tone. It was a language of a single pair of phonemes, a language composed entirely of pitch contours. The drums varied in materials and craft. Some were slit gongs, tubes of padauk wood, hollow, cut with a long and narrow mouth to make a high-sounding lip and a low-sounding lip; others had skin tops, and these were used in pairs. All that mattered was for the drums to sound two distinct notes, at an interval of about a major third.

So in mapping the spoken language to the drum language, information was lost. The drum talk was speech with a deficit. For every village and every tribe, the drum language began with the spoken word and shed the consonants and vowels. That was a lot to lose. The remaining information stream would be riddled with ambiguity. A double stroke on the high-tone lip of the drum [¯ ¯] matched the tonal pattern of the Kele word for father, sango, but naturally it could just as well be songe, the moon; koko, fowl; fele, a species of fish; or any other word of two high tones. Even the limited dictionary of the missionaries at Yakusu contained 130 such words. Having reduced spoken words, in all their sonic richness, to such a minimal code, how could the drums distinguish them? The answer lay partly in stress and timing, but these could not compensate for the lack of consonants and vowels. Thus, Carrington discovered, a drummer would invariably add “a little phrase” to each short word. Songe, the moon, is rendered as songe li tange la manga—“the moon looks down at the earth.” Koko, the fowl, is rendered koko olongo la bokiokio—“the fowl, the little one that says kiokio.” The extra drumbeats, far from being extraneous, provide context. Every ambiguous word begins in a cloud of possible alternative interpretations; then the unwanted possibilities evaporate. This takes place below the level of consciousness. Listeners are hearing only staccato drum tones, low and high, but in effect they “hear” the missing consonants and vowels, too. For that matter, they hear whole phrases, not individual words. “Among peoples who know nothing of writing or grammar, a word per se, cut out of its sound group, seems almost to cease to be an intelligible articulation,” Captain Rattray reported.

The stereotyped long tails flap along, their redundancy overcoming ambiguity. The drum language is creative, freely generating neologisms for innovations from the north: steamboats, cigarettes, and the Christian god being three that Carrington particularly noted. But drummers begin by learning the traditional fixed formulas. Indeed, the formulas of the African drummers sometimes preserve archaic words that have been forgotten in the everyday language. For the Yaunde, the elephant is always “the great awkward one.” The resemblance to Homeric formulas—not merely Zeus, but Zeus the cloud-gatherer; not just the sea, but the wine-dark sea—is no accident. In an oral culture, inspiration has to serve clarity and memory first. The Muses are the daughters of Mnemosyne.

Neither Kele nor English yet had words to say, allocate extra bits for disambiguation and error correction. Yet this is what the drum language did. Redundancy—inefficient by definition—serves as the antidote to confusion. It provides second chances. Every natural language has redundancy built in; this is why people can understand text riddled with errors and why they can understand conversation in a noisy room. The natural redundancy of English motivates the famous New York City subway poster of the 1970s (and the poem by James Merrill),

if u cn rd ths

u cn gt a gd jb w hi pa!

(“This counterspell may save your soul,” Merrill adds.) Most of the time, redundancy in language is just part of the background. For a telegraphist it is an expensive waste. For an African drummer it is essential. Another specialized language provides a perfect analog: the language of aviation radio. Numbers and letters make up much of the information passed between pilots and air traffic controllers: altitudes, vectors, aircraft tail numbers, runway and taxiway identifiers, radio frequencies. This is critical communication over a notoriously noisy channel, so a specialized alphabet is employed to minimize ambiguity. The spoken letters B and V are easy to confuse; bravo and victor are safer. M and N become mike and november. In the case of numbers, five and nine, particularly prone to confusion, are spoken as fife and niner. The extra syllables perform the same function as the extra verbosity of the talking drums.

After publishing his book, John Carrington came across a mathematical way to understand this point. A paper by a Bell Labs telephone engineer, Ralph Hartley, even had a relevant-looking formula: H = n log s, where H is the amount of information, n is the number of symbols in the message, and s is the number of symbols available in the language. Hartley’s younger colleague Claude Shannon later pursued this lead, and one of his touchstone projects became a precise measurement of the redundancy in English. Symbols could be words, phonemes, or dots and dashes. The degree of choice within a symbol set varied—a thousand words or forty-five phonemes or twenty-six letters or three types of interruption in an electrical circuit. The formula quantified a simple enough phenomenon (simple, anyway, once it was noticed): the fewer symbols available, the more of them must be transmitted to get across a given amount of information. For the African drummers, messages need to be about eight times as long as their spoken equivalents.
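A minimal sketch of Hartley's relation, H = n log s, applied to the drums. The choice of 256 as a stand-in for the spoken language's symbol inventory is an assumption made only so that the ratio comes out at the factor of eight mentioned above; Carrington's figure was empirical, not computed this way.

```python
# Hartley's H = n * log(s): the information H in a message grows with the number
# of symbols n and with the (log of the) size s of the available symbol set.
# Rearranged, n = H / log(s): the fewer symbols available, the longer the message.

from math import log2

def symbols_needed(h_bits: float, s: int) -> float:
    """Symbols required to convey h_bits of information with s symbols available."""
    return h_bits / log2(s)

H = 100.0             # an arbitrary amount of information, in bits
drum_tones = 2        # the drum's two pitches, high and low
spoken_symbols = 256  # assumed size of the spoken symbol inventory (illustrative)

ratio = symbols_needed(H, drum_tones) / symbols_needed(H, spoken_symbols)
print(ratio)          # 8.0 -- the drummed message is about eight times longer
```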

Hartley took some pains to justify his use of the word information. “As commonly used, information is a very elastic term,” he wrote, “and it will first be necessary to set up for it a more specific meaning.” He proposed to think of information “physically”—his word—rather than psychologically. He found the complications multiplying. Somewhat paradoxically, the complexity arose from the intermediate layers of symbols: letters of the alphabet, or dots and dashes, which were discrete and therefore easily countable in themselves. Harder to measure were the connections between these stand-ins and the bottom layer: the human voice itself. It was this stream of meaningful sound that still seemed, to a telephone engineer as much as an African drummer, the real stuff of communication, even if the sound, in turn, served as a code for the knowledge or meaning below. In any case Hartley thought an engineer should be able to generalize over all cases of communication: writing and telegraph codes as well as the physical transmission of sound by means of electromagnetic waves along telephone wires or through the ether.

He knew nothing of the drums, of course. And no sooner did John Carrington come to understand them than they began to fade from the African scene. He saw Lokele youth practicing the drums less and less, schoolboys who did not even learn their own drum names. He regretted it. He had made the talking drums a part of his own life. In 1954 a visitor from the United States found him running a mission school in the Congolese outpost of Yalemba. Carrington still walked daily in the jungle, and when it was time for lunch his wife would summon him with a fast tattoo. She drummed: “White man spirit in forest come come to house of shingles high up above of white man spirit in forest. Woman with yams awaits. Come come.”

Before long, there were people for whom the path of communications technology had leapt directly from the talking drum to the mobile phone, skipping over the intermediate stages.




The trip was sponsored by the Society for the Extinction of the Slave Trade and the Civilization of Africa for the purpose of interfering with slavers.




“A very short experience, however, showed the superiority of the alphabetic mode,” he wrote later, “and the big leaves of the numbered dictionary, which cost me a world of labor, . . . were discarded and the alphabetic installed in its stead.”




Operators soon distinguished spaces of different lengths—intercharacter and interword—so Morse code actually employed four signs.


Chapter Two

The Persistence of the Word

(There Is No Dictionary in the Mind)

Odysseus wept when he heard the poet sing of his great deeds abroad because, once sung, they were no longer his alone. They belonged to anyone who heard the song.

—Ward Just (2004)

“TRY TO IMAGINE,” proposed Walter J. Ong, Jesuit priest, philosopher, and cultural historian, “a culture where no one has ever ‘looked up’ anything.” To subtract the technologies of information internalized over two millennia requires a leap of imagination backward into a forgotten past. The hardest technology to erase from our minds is the first of all: writing. This arises at the very dawn of history, as it must, because history begins with writing. The pastness of the past depends on it.

It takes a few thousand years for this mapping of language onto a system of signs to become second nature, and then there is no return to naïveté. Forgotten is the time when our very awareness of words came from seeing them. “In a primary oral culture,” as Ong noted,

the expression “to look up something” is an empty phrase: it would have no conceivable meaning. Without writing, words as such have no visual presence, even when the objects they represent are visual. They are sounds. You might “call” them back—“recall” them. But there is nowhere to “look” for them. They have no focus and no trace.

In the 1960s and ’70s, Ong declared the electronic age to be a new age of orality—but of “secondary orality,” the spoken word amplified and extended as never before, but always in the context of literacy: voices heard against a background of ubiquitous print. The first age of orality had lasted quite a bit longer. It covered almost the entire lifetime of the species, writing being a late development, general literacy being almost an afterthought. Like Marshall McLuhan, with whom he was often compared (“the other eminent Catholic-electronic prophet,” said a scornful Frank Kermode), Ong had the misfortune to make his visionary assessments of a new age just before it actually arrived. The new media seemed to be radio, telephone, and television. But these were just the faint glimmerings in the night sky, signaling the light that still lay just beyond the horizon. Whether Ong would have seen cyberspace as fundamentally oral or literary, he would surely have recognized it as transformative: not just a revitalization of older forms, not just an amplification, but something wholly new. He might have sensed a coming discontinuity akin to the emergence of literacy itself. Few understood better than Ong just how profound a discontinuity that had been.

When he began his studies, “oral literature” was a common phrase. It is an oxymoron laced with anachronism; the words imply an all-too-unconscious approach to the past by way of the present. Oral literature was generally treated as a variant of writing; this, Ong said, was “rather like thinking of horses as automobiles without wheels.”

You can, of course, undertake to do this. Imagine writing a treatise on horses (for people who have never seen a horse) which starts with the concept not of “horse” but of “automobile,” built on the readers’ direct experience of automobiles. It proceeds to discourse on horses by always referring to them as “wheelless automobiles,” explaining to highly automobilized readers all the points of difference. . . . Instead of wheels, the wheelless automobiles have enlarged toenails called hooves; instead of headlights, eyes; instead of a coat of lacquer, something called hair; instead of gasoline for fuel, hay, and so on. In the end, horses are only what they are not.

When it comes to understanding the preliterate past, we modern folk are hopelessly automobilized. The written word is the mechanism by which we know what we know. It organizes our thought. We may wish to understand the rise of literacy both historically and logically, but history and logic are themselves the products of literate thought.

Writing, as a technology, requires premeditation and special art. Language is not a technology, no matter how well developed and efficacious. It is not best seen as something separate from the mind; it is what the mind does. “Language in fact bears the same relationship to the concept of mind that legislation bears to the concept of parliament,” says Jonathan Miller: “it is a competence forever bodying itself in a series of concrete performances.” Much the same might be said of writing—it is concrete performance—but when the word is instantiated in paper or stone, it takes on a separate existence as artifice. It is a product of tools, and it is a tool. And like many technologies that followed, it thereby inspired immediate detractors.

One unlikely Luddite was also one of the first long-term beneficiaries. Plato (channeling the nonwriter Socrates) warned that this technology meant impoverishment:

For this invention will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory. Their trust in writing, produced by external characters which are no part of themselves, will discourage the use of their own memory within them. You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom.

External characters which are no part of themselves—this was the trouble. The written word seemed insincere. Ersatz scratchings on papyrus or clay were far abstracted from the real, the free-flowing sound of language, intimately bound up with thought so as to seem coterminous with it. Writing appeared to draw knowledge away from the person, to place their memories in storage. It also separated the speaker from the listener, by so many miles or years. The deepest consequences of writing, for the individual and for the culture, could hardly have been foreseen, but even Plato could see some of the power of this disconnection. The one speaks to the multitude. The dead speak to the living, the living to the unborn. As McLuhan said, “Two thousand years of manuscript culture lay ahead of the Western world when Plato made this observation.” The power of this first artificial memory was incalculable: to restructure thought, to engender history. It is still incalculable, though one statistic gives a hint: whereas the total vocabulary of any oral language measures a few thousand words, the single language that has been written most widely, English, has a documented vocabulary of well over a million words, a corpus that grows by thousands of words a year. These words do not exist only in the present. Each word has a provenance and a history that melts into its present life.

With words we begin to leave traces behind us like breadcrumbs: memories in symbols for others to follow. Ants deploy their pheromones, trails of chemical information; Theseus unwound Ariadne’s thread. Now people leave paper trails. Writing comes into being to retain information across time and across space. Before writing, communication is evanescent and local; sounds carry a few yards and fade to oblivion. The evanescence of the spoken word went without saying. So fleeting was speech that the rare phenomenon of the echo, a sound heard once and then again, seemed a sort of magic. “This miraculous rebounding of the voice, the Greeks have a pretty name for, and call it Echo,” wrote Pliny. “The spoken symbol,” as Samuel Butler observed, “perishes instantly without material trace, and if it lives at all does so only in the minds of those who heard it.” Butler was able to formulate this truth just as it was being falsified for the first time, at the end of the nineteenth century, by the arrival of the electric technologies for capturing speech. It was precisely because it was no longer completely true that it could be clearly seen. Butler completed the distinction: “The written symbol extends infinitely, as regards time and space, the range within which one mind can communicate with another; it gives the writer’s mind a life limited by the duration of ink, paper, and readers, as against that of his flesh and blood body.”

But the new channel does more than extend the previous channel. It enables reuse and “re-collection”—new modes. It permits whole new architectures of information. Among them are history, law, business, mathematics, and logic. Apart from their content, these categories represent new techniques. The power lies not just in the knowledge, preserved and passed forward, valuable as it is, but in the methodology: encoded visual indications, the act of transference, substituting signs for things. And then, later, signs for signs.

Paleolithic people began at least 30,000 years ago to scratch and paint shapes that recalled to the eye images of horses, fishes, and hunters. These signs in clay and on cave walls served purposes of art or magic, and historians are loath to call them writing, but they began the recording of mental states in external media. In another way, knots in cords and notches in sticks served as aids to memory. These could be carried as messages. Marks in pottery and masonry could signify ownership. Marks, images, pictographs, petroglyphs—as these forms grew stylized, conventional, and thus increasingly abstract, they approached what we understand as writing, but one more transition was crucial, from the representation of things to the representation of spoken language: that is, representation twice removed. There is a progression from pictographic, writing the picture; to ideographic, writing the idea; and then logographic, writing the word.

Chinese script began this transition between 4,500 and 8,000 years ago: signs that began as pictures came to represent meaningful units of sound. Because the basic unit was the word, thousands of distinct symbols were required. This is efficient in one way, inefficient in another. Chinese unifies an array of distinct spoken languages: people who cannot speak to one another can write to one another. It employs at least fifty thousand symbols, about six thousand commonly used and known to most literate Chinese. In swift diagrammatic strokes they encode multidimensional semantic relationships. One device is simple repetition: tree + tree + tree = forest; more abstractly, sun + moon = brightness and east + east = everywhere. The process of compounding creates surprises: grain + knife = profit; hand + eye = look. Characters can be transformed in meaning by reorienting their elements: child to childbirth and man to corpse. Some elements are phonetic; some even punning. The entirety is the richest and most complex writing system that humanity has ever evolved. Considering scripts in terms of how many symbols are required and how much meaning each individual symbol conveys, Chinese thus became an extreme case: the largest set of symbols, and the most meaningful individually. Writing systems could take alternative paths: fewer symbols, each carrying less information. An intermediate stage is the syllabary, a phonetic writing system using individual characters to represent syllables, which may or may not be meaningful. A few hundred characters can serve a language.

The writing system at the opposite extreme took the longest to emerge: the alphabet, one symbol for one minimal sound. The alphabet is the most reductive, the most subversive of all scripts.

In all the languages of earth there is only one word for alphabet (alfabet, alfabeto, . . .). The alphabet was invented only once. All known alphabets, used today or found buried on tablets and stone, descend from the same original ancestor, which arose near the eastern littoral of the Mediterranean Sea, sometime not much before 1500 BCE, in a region that became a politically unstable crossroads of culture, covering Palestine, Phoenicia, and Assyria. To the east lay the great civilization of Mesopotamia, with its cuneiform script already a millennium old; down the shoreline to the southwest lay Egypt, where hieroglyphics developed simultaneously and independently. Traders traveled, too, from Cyprus and Crete, bringing their own incompatible systems. With glyphs from Minoan, Hittite, and Anatolian, it made for a symbolic stew. The ruling priestly classes were invested in their writing systems. Whoever owned the scripts owned the laws and the rites. But self-preservation had to compete with the desire for rapid communication. The scripts were conservative; the new technology was pragmatic. A stripped-down symbol system, just twenty-two signs, was the innovation of Semitic peoples in or near Palestine. Scholars naturally look to Kiriath-sepher, translatable as “city of the book,” and Byblos, “city of papyrus,” but no one knows exactly, and no one can know. The paleographer has a unique bootstrap problem. It is only writing that makes its own history possible. The foremost twentieth-century authority on the alphabet, David Diringer, quoted an earlier scholar: “There never was a man who could sit down and say: ‘Now I am going to be the first man to write.’ ”

The alphabet spread by contagion. The new technology was both the virus and the vector of transmission. It could not be monopolized, and it could not be suppressed. Even children could learn these few, lightweight, semantically empty letters. Divergent routes led to alphabets of the Arab world and of northern Africa; to Hebrew and Phoenician; across central Asia, to Brahmi and related Indian script; and to Greece. The new civilization arising there brought the alphabet to a high degree of perfection. Among others, the Latin and Cyrillic alphabets followed along.

Greece had not needed the alphabet to create literature—a fact that scholars realized only grudgingly, beginning in the 1930s. That was when Milman Parry, a structural linguist who studied the living tradition of oral epic poetry in Bosnia and Herzegovina, proposed that the Iliad and the Odyssey not only could have been but must have been composed and sung without benefit of writing. The meter, the formulaic redundancy, in effect the very poetry of the great works served first and foremost to aid memory. Its incantatory power made of the verse a time capsule, able to transmit a virtual encyclopedia of culture across generations. His argument was first controversial and then overwhelmingly persuasive—but only because the poems were written down, sometime in the sixth or seventh century BCE. This act—the transcribing of the Homeric epics—echoes through the ages. “It was something like a thunder-clap in human history, which the bias of familiarity has converted into the rustle of papers on a desk,” said Eric Havelock, a British classical scholar who followed Parry. “It constituted an intrusion into culture, with results that proved irreversible. It laid the basis for the destruction of the oral way of life and the oral modes of thought.”

The transcription of Homer converted this great poetry into a new medium and made of it something unplanned: from a momentary string of words created every time anew by the rhapsode and fading again even as it echoed in the listener’s ear, to a fixed but portable line on a papyrus sheet. Whether this alien, dry mode would suit the creation of poetry and song remained to be seen. In the meantime the written word helped more mundane forms of discourse: petitions to the gods, statements of law, and economic agreements. Writing also gave rise to discourse about discourse. Written texts became objects of a new sort of interest.

But how was one to speak about them? The words to describe the elements of this discourse did not exist in the lexicon of Homer. The language of an oral culture had to be wrenched into new forms; thus a new vocabulary emerged. Poems were seen to have topics—the word previously meaning “place.” They possessed structure, by analogy with buildings. They were made of plot and diction. Aristotle could now see the works of the bards as “representations of life,” born of the natural impulse toward imitation that begins in childhood. But he had also to account for other writing with other purposes—the Socratic dialogues, for example, and medical or scientific treatises—and this general type of work, including, presumably, his own, “happens, up to the present day, to have no name.” Under construction was a whole realm of abstraction, forcibly divorced from the concrete. Havelock described it as cultural warfare, a new consciousness and a new language at war with the old consciousness and the old language: “Their conflict produced essential and permanent contributions to the vocabulary of all abstract thought. Body and space, matter and motion, permanence and change, quality and quantity, combination and separation, are among the counters of common currency now available.”

Aristotle himself, son of the physician to the king of Macedonia and an avid, organized thinker, was attempting to systematize knowledge. The persistence of writing made it possible to impose structure on what was known about the world and, then, on what was known about knowing. As soon as one could set words down, examine them, look at them anew the next day, and consider their meaning, one became a philosopher, and the philosopher began with a clean slate and a vast project of definition to undertake. Knowledge could begin to pull itself up by the bootstraps. For Aristotle the most basic notions were worth recording and were necessary to record:

A beginning is that which itself does not follow necessarily from anything else, but some second thing naturally exists or occurs after it. Conversely, an end is that which does itself naturally follow from something else, either necessarily or in general, but there is nothing else after it. A middle is that which itself comes after something else, and some other thing comes after it.

These are statements not about experience but about the uses of language to structure experience. In the same way, the Greeks created categories (this word originally meaning “accusations” or “predictions”) as a means of classifying animal species, insects, and fishes. In turn, they could then classify ideas. This was a radical, alien mode of thought. Plato had warned that it would repel most people:

The multitude cannot accept the idea of beauty in itself rather than many beautiful things, nor anything conceived in its essence instead of the many specific things. Thus the multitude cannot be philosophic.

For “the multitude” we may understand “the preliterate.” They “lose themselves and wander amid the multiplicities of multifarious things,” declared Plato, looking back on the oral culture that still surrounded him. They “have no vivid pattern in their souls.”

And what vivid pattern was that? Havelock focused on the process of converting, mentally, from a “prose of narrative” to a “prose of ideas”; organizing experience in terms of categories rather than events; embracing the discipline of abstraction. He had a word in mind for this process, and the word was thinking. This was the discovery, not just of the self, but of the thinking self—in effect, the true beginning of consciousness.

In our world of ingrained literacy, thinking and writing seem scarcely related activities. We can imagine the latter depending on the former, but surely not the other way around: everyone thinks, whether or not they write. But Havelock was right. The written word—the persistent word— was a prerequisite for conscious thought as we understand it. It was the trigger for a wholesale, irreversible change in the human psyche—psyche being the word favored by Socrates/Plato as they struggled to understand. Plato, as Havelock puts it,

is trying for the first time in history to identify this group of general mental qualities, and seeking for a term which will label them satisfactorily under a single type. . . . He it was who hailed the portent and correctly identified it. In so doing, he so to speak confirmed and clinched the guesses of a previous generation which had been feeling its way towards the idea that you could “think,” and that thinking was a very special kind of psychic activity, very uncomfortable, but also very exciting, and one which required a very novel use of Greek.

Taking the next step on the road of abstraction, Aristotle deployed categories and relationships in a regimented order to develop a symbolism of reasoning: logic—from λóγoς, logos, the not-quite-translatable word from which so much flows, meaning “speech” or “reason” or “discourse” or, ultimately, just “word.”

Logic might be imagined to exist independent of writing—syllogisms can be spoken as well as written—but it did not. Speech is too fleeting to allow for analysis. Logic descended from the written word, in Greece as well as India and China, where it developed independently. Logic turns the act of abstraction into a tool for determining what is true and what is false: truth can be discovered in words alone, apart from concrete experience. Logic takes its form in chains: sequences whose members connect one to another. Conclusions follow from premises. These require a degree of constancy. They have no power unless people can examine and evaluate them. In contrast, an oral narrative proceeds by accretion, the words passing by in a line of parade past the viewing stand, briefly present and then gone, interacting with one another via memory and association. There are no syllogisms in Homer. Experience is arranged in terms of events, not categories. Only with writing does narrative structure come to embody sustained rational argument. Aristotle crossed another level, by seeing the study of such argument—not just the use of argument, but its study—as a tool. His logic expresses an ongoing self-consciousness about the words in which they are composed. When Aristotle unfurls premises and conclusions—If it is possible for no man to be a horse, it is also admissible for no horse to be a man; and if it is admissible for no garment to be white, it is also admissible for nothing white to be a garment. For if any white thing must be a garment, then some garment will necessarily be white—he neither requires nor implies any personal experience of horses, garments, or colors. He has departed that realm. Yet he claims through the manipulation of words to create knowledge anyway, and a superior brand of knowledge at that.

“We know that formal logic is the invention of Greek culture after it had interiorized the technology of alphabetic writing,” Walter Ong says—it is true of India and China as well—“and so made a permanent part of its noetic resources the kind of thinking that alphabetic writing made possible.” For evidence Ong turns to fieldwork of the Russian psychologist Aleksandr Romanovich Luria among illiterate peoples in remote Uzbekistan and Kyrgyzstan in Central Asia in the 1930s. Luria found striking differences between illiterate and even slightly literate subjects, not in what they knew, but in how they thought. Logic implicates symbolism directly: things are members of classes; they possess qualities, which are abstracted and generalized. Oral people lacked the categories that become second nature even to illiterate individuals in literate cultures: for example, for geometrical shapes. Shown drawings of circles and squares, they named them as “plate, sieve, bucket, watch, or moon” and “mirror, door, house, apricot drying board.” They could not, or would not, accept logical syllogisms. A typical question:

In the Far North, where there is snow, all bears are white.

Novaya Zembla is in the Far North and there is always snow there.

What color are the bears?

Typical response: “I don’t know. I’ve seen a black bear. I’ve never seen any others. . . . Each locality has its own animals.”

By contrast, a man who has just learned to read and write responds, “To go by your words, they should all be white.” To go by your words—in that phrase, a level is crossed. The information has been detached from any person, detached from the speaker’s experience. Now it lives in the words, little life-support modules. Spoken words also transport information, but not with the self-consciousness that writing brings. Literate people take for granted their own awareness of words, along with the array of word-related machinery: classification, reference, definition. Before literacy, there is nothing obvious about such techniques. “Try to explain to me what a tree is,” Luria says, and a peasant replies, “Why should I? Everyone knows what a tree is, they don’t need me telling them.”

“Basically the peasant was right,” Ong comments. “There is no way to refute the world of primary orality. All you can do is walk away from it into literacy.”

It is a twisting journey from things to words, from words to categories, from categories to metaphor and logic. Unnatural as it seemed to define tree, it was even trickier to define word, and helpful ancillary words like define were not at first available, the need never having existed. “In the infancy of logic, a form of thought has to be invented before the content can be filled up,” said Benjamin Jowett, Aristotle’s nineteenth-century translator. Spoken languages needed further evolution.

Language and reasoning fit so well that users could not always see the flaws and gaps. Still, as soon as any culture invented logic, paradoxes appeared. In China, nearly contemporaneously with Aristotle, the philosopher Gongsun Long captured some of these in the form of a dialogue, known as “When a White Horse Is Not a Horse.” It was written on bamboo strips, tied with string, before the invention of paper. It begins:

Can it be that a white horse is not a horse?

It can.

How?

“Horse” is that by means of which one names the shape. “White” is that by means of which one names the color. What names the color is not what names the shape. Hence, I say that a white horse is not a horse.

On its face, this is unfathomable. It begins to come into focus as a statement about language and logic. Gongsun Long was a member of the Mingjia, the School of Names, and his delving into these paradoxes formed part of what Chinese historians call the “language crisis,” a running debate over the nature of language. Names are not the things they name. Classes are not coextensive with subclasses. Thus innocent-seeming inferences get derailed: “a man dislikes white horses” does not imply “a man dislikes horses.”

You think that horses that are colored are not horses. In the world, it is not the case that there are horses with no color. Can it be that there are no horses in the world?

The philosopher shines his light on the process of abstracting into classes based on properties: whiteness; horsiness. Are these classes part of reality, or do they exist only in language?

Horses certainly have color. Hence, there are white horses. If it were the case that horses had no color, there would simply be horses, and then how could one select a white horse? A white horse is a horse and white. A horse and a white horse are different. Hence, I say that a white horse is not a horse.

Two millennia later, philosophers continue to struggle with these texts. The paths of logic into modern thought are roundabout, broken, and complex. Since the paradoxes seem to be in language, or about language, one way to banish them was to purify the medium: eliminate ambiguous words and woolly syntax, employ symbols that were rigorous and pure. To turn, that is, to mathematics. By the beginning of the twentieth century, it seemed that only a system of purpose-built symbols could make logic work properly—free of error and paradoxes. This dream was to prove illusory; the paradoxes would creep back in, but no one could hope to understand until the paths of logic and mathematics converged.

Mathematics, too, followed from the invention of writing. Greece is often thought of as the springhead for the river that becomes modern mathematics, with all its many tributaries down the centuries. But the Greeks themselves alluded to another tradition—to them, ancient— which they called Chaldean, and which we understand to be Babylonian. That tradition vanished into the sands, not to surface until the end of the nineteenth century, when tablets of clay were dug up from the mounds of lost cities.

First there were scores, then thousands of tablets, typically the size of a human hand, etched with a distinctive, edgy, angular writing called cuneiform, “wedge shaped.” Mature cuneiform was neither pictographic (the symbols were spare and abstract) nor alphabetic (they were far too numerous). By 3000 BCE a system with about seven hundred symbols flourished in Uruk, the walled city, probably the largest in the world, home of the hero-king Gilgamesh, in the alluvial marshes near the Euphrates River. German archeologists excavated Uruk in a series of digs all through the twentieth century. The materials for this most ancient of information technologies lay readily at hand. With damp clay held in one hand and a stylus of sharpened reed in the other, a scribe would imprint tiny characters in columns and rows.

A CUNEIFORM TABLET

The result: cryptic messages from an alien culture. They took generations to decipher. “Writing, like a theater curtain going up on these dazzling civilizations, lets us stare directly but imperfectly at them,” writes the psychologist Julian Jaynes. Some Europeans took umbrage at first. “To the Assyrians, the Chaldeans, and Egyptians,” wrote the seventeenth-century divine Thomas Sprat, “we owe the Invention” but also the “Corruption of knowledge,” when they concealed it with their strange scripts. “It was the custom of their Wise men, to wrap up their Observations on Nature, and the Manners of Men, in the dark Shadows of Hieroglyphicks” (as though friendlier ancients would have used an alphabet more familiar to Sprat). The earliest examples of cuneiform baffled archeologists and paleolinguists the longest, because the first language to be written, Sumerian, left no other traces in culture or speech. Sumerian turned out to be a linguistic rarity, an isolate, with no known descendants. When scholars did learn to read the Uruk tablets, they found them to be, in their way, humdrum: civic memoranda, contracts and laws, and receipts and bills for barley, livestock, oil, reed mats, and pottery. Nothing like poetry or literature appeared in cuneiform for hundreds of years to come. The tablets were the quotidiana of nascent commerce and bureaucracy. The tablets not only recorded the commerce and the bureaucracy but, in the first place, made them possible.

Even then, cuneiform incorporated signs for counting and measurement. Different characters, used in different ways, could denote numbers and weights. A more systematic approach to the writing of numbers did not take shape until the time of Hammurabi, 1750 BCE, when Mesopotamia was unified around the great city of Babylon. Hammurabi himself was probably the first literate king, writing his own cuneiform rather than depending on scribes, and his empire building manifested the connection between writing and social control. “This process of conquest and influence is made possible by letters and tablets and stelae in an abundance that had never been known before,” Jaynes declares. “Writing was a new method of civil direction, indeed the model that begins our own memo-communicating government.”

The writing of numbers had evolved into an elaborate system. Numerals were composed of just two basic parts, a vertical wedge for 1 and an angle wedge for 10. These were combined to form the standard characters: three vertical wedges represented 3, an angle wedge with six vertical wedges represented 16, and so on. But the Babylonian system was not decimal, base 10; it was sexagesimal, base 60. Each of the numerals from 1 to 60 had its own character. To form large numbers, the Babylonians used numerals in places: the numeral for 1 followed by the numeral for 10 was 70 (one 60 plus ten 1s); the numeral for 10 followed by the numeral for 16 was 616 (ten 60s plus sixteen 1s), and so on. None of this was clear when the tablets first began to surface. A basic theme with variations, encountered many times, proved to be multiplication tables. In a sexagesimal system these had to cover the numbers from 1 to 19 as well as 20, 30, 40, and 50. Even more difficult to unravel were tables of reciprocals, making possible division and fractional numbers: in the 60-based system, reciprocals were 2:30, 3:20, 4:15, 5:12 . . . and then, using extra places, 8:7,30, 9:6,40, and so on.
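In modern terms the reciprocal tables are easy to reconstruct. Here is a minimal sketch in Python (nothing a scribe would recognize; the function name and the digit-list convention are only for illustration) that expands 1/n into base-60 places and reproduces the pairs quoted above:

from fractions import Fraction

def reciprocal_digits(n, places=4):
    # Expand 1/n into successive base-60 places, the way a reciprocal
    # table records 1/8 as 7,30 (that is, 7/60 + 30/3600).
    x = Fraction(1, n)
    digits = []
    for _ in range(places):
        x *= 60
        d = int(x)          # the next base-60 digit
        digits.append(d)
        x -= d
        if x == 0:
            break
    return digits

for n in (2, 3, 4, 5, 8, 9):
    print(n, reciprocal_digits(n))
# 2 [30]   3 [20]   4 [15]   5 [12]   8 [7, 30]   9 [6, 40]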

A MATHEMATICAL TABLE ON A CUNEIFORM TABLET, ANALYZED BY ASGER AABOE

These symbols were hardly words—or they were words of a peculiar, slender, rigid sort. They seemed to arrange themselves into visible patterns in the clay, repetitious, almost artistic, not like any prose or poetry archeologists had encountered. They were like maps of a mysterious city. This was the key to deciphering them, finally: the ordered chaos that seems to guarantee the presence of meaning. It seemed like a task for mathematicians, anyway, and finally it was. They recognized geometric progressions, tables of powers, and even instructions for computing square roots and cube roots. Familiar as they were with the rise of mathematics a millennium later in ancient Greece, these scholars were astounded at the breadth and depth of mathematical knowledge that existed before in Mesopotamia. “It was assumed that the Babylonians had had some sort of number mysticism or numerology,” wrote Asger Aaboe in 1963, “but we now know how far short of the truth this assumption was.” The Babylonians computed linear equations, quadratic equations, and Pythagorean numbers long before Pythagoras. In contrast to the Greek mathematics that followed, Babylonian mathematics did not emphasize geometry, except for practical problems; the Babylonians calculated areas and perimeters but did not prove theorems. Yet they could (in effect) reduce elaborate second-degree polynomials. Their mathematics seemed to value computational power above all.

That could not be appreciated until computational power began to mean something. By the time modern mathematicians turned their attention to Babylon, many important tablets had already been destroyed or scattered. Fragments retrieved from Uruk before 1914, for example, were dispersed to Berlin, Paris, and Chicago and only fifty years later were discovered to hold the beginning methods of astronomy. To demonstrate this, Otto Neugebauer, the leading twentieth-century historian of ancient mathematics, had to reassemble tablets whose fragments had made their way to opposite sides of the Atlantic Ocean. In 1949, when the number of cuneiform tablets housed in museums reached (at his rough guess) a half million, Neugebauer lamented, “Our task can therefore properly be compared with restoring the history of mathematics from a few torn pages which have accidentally survived the destruction of a great library.”

In 1972, Donald Knuth, an early computer scientist at Stanford, looked at the remains of an Old Babylonian tablet the size of a paperback book, half lying in the British Museum in London, one-fourth in the Staatliche Museen in Berlin, and the rest missing, and saw what he could only describe, anachronistically, as an algorithm:

A cistern.

The height is 3,20, and a volume of 27,46,40 has been excavated.

The length exceeds the width by 50.

You should take the reciprocal of the height, 3,20, obtaining 18.

Multiply this by the volume, 27,46,40, obtaining 8,20.

Take half of 50 and square it, obtaining 10,25.

Add 8,20, and you get 8,30,25.

The square root is 2,55.

Make two copies of this, adding to the one and subtracting from the other.

You find that 3,20 is the length and 2,30 is the width.

This is the procedure.

“This is the procedure” was a standard closing, like a benediction, and for Knuth redolent with meaning. In the Louvre he found a “procedure” that reminded him of a stack program on a Burroughs B5500. “We can commend the Babylonians for developing a nice way to explain an algorithm by example as the algorithm itself was being defined,” said Knuth. By then he himself was engrossed in the project of defining and explaining the algorithm; he was amazed by what he found on the ancient tablets. The scribes wrote instructions for placing numbers in certain locations—for making “copies” of a number, and for keeping a number “in your head.” This idea, of abstract quantities occupying abstract places, would not come back to life till much later.
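Read with the floating-point convention discussed in the footnote later in this chapter, the scribe’s recipe can be checked step by step. The sketch below is a hypothetical rendering in Python with exact fractions; the particular place values assumed here (height 3;20, volume 27;46,40, difference 0;50) are the usual modern reading, not something the tablet states:

from fractions import Fraction

def sexagesimal(digits, whole_places=1):
    # Interpret a list of base-60 digits. The scribes left the "sexagesimal
    # point" to context, so whole_places says how many digits precede it.
    value = Fraction(0)
    for d in digits:
        value = value * 60 + d
    return value / Fraction(60) ** (len(digits) - whole_places)

height = sexagesimal([3, 20])          # 3;20, i.e. 3 1/3
volume = sexagesimal([27, 46, 40])     # 27;46,40
excess = sexagesimal([50], 0)          # 0;50, the length minus the width

area = volume / height                 # reciprocal of the height times the volume: 8;20
half = excess / 2                      # half of 50: 0;25
total = area + half * half             # add its square: 8;30,25
root = Fraction(35, 12)                # the tablet's square root, 2;55
assert root * root == total            # the scribe's arithmetic checks out

length = root + half                   # add the copy: 3;20
width = root - half                    # subtract the copy: 2;30
print(length, width)                   # 10/3 5/2

Converted back to sexagesimal, the answers are 3;20 and 2;30, the length and the width, just as the procedure promises.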

Where is a symbol? What is a symbol? Even to ask such questions required a self-consciousness that did not come naturally. Once asked, the questions continued to loom. Look at these signs, philosophers implored. What are they?

“Fundamentally letters are shapes indicating voices,” explained John of Salisbury in medieval England. “Hence they represent things which they bring to mind through the windows of the eyes.” John served as secretary and scribe to the Archbishop of Canterbury in the twelfth century. He served the cause of Aristotle as an advocate and salesman. His Metalogicon not only set forth the principles of Aristotelian logic but urged his contemporaries to convert, as though to a new religion. (He did not mince words: “Let him who is not come to logic be plagued with continuous and everlasting filth.”) Putting pen to parchment in this time of barest literacy, he tried to examine the act of writing and the effect of words: “Frequently they speak voicelessly the utterances of the absent.” The idea of writing was still entangled with the idea of speaking. The mixing of the visual and the auditory continued to create puzzles, and so also did the mixing of past and future: utterances of the absent. Writing leapt across these levels.

Every user of this technology was a novice. Those composing formal legal documents, such as charters and deeds, often felt the need to express their sensation of speaking to an invisible audience: “Oh! all ye who shall have heard this and have seen!” (They found it awkward to keep tenses straight, like voicemail novices leaving their first messages circa 1980.) Many charters ended with the word “Goodbye.” Before writing could feel natural in itself—could become second nature—these echoes of voices had to fade away. Writing in and of itself had to reshape human consciousness.

Among the many abilities gained by the written culture, not the least was the power of looking inward upon itself. Writers loved to discuss writing, far more than bards ever bothered to discuss speech. They could see the medium and its messages, hold them up to the mind’s eye for study and analysis. And they could criticize it—for from the very start, the new abilities were accompanied by a nagging sense of loss. It was a form of nostalgia. Plato felt it:

I cannot help feeling, Phaedrus, [says Socrates] that writing is unfortunately like painting; for the creations of the painter have the attitude of life, and yet if you ask them a question they preserve a solemn silence. . . . You would imagine that they had intelligence, but if you want to know anything and put a question to one of them, the speaker always gives one unvarying answer.

Unfortunately the written word stands still. It is stable and immobile. Plato’s qualms were mostly set aside in the succeeding millennia, as the culture of literacy developed its many gifts: history and the law; the sciences and philosophy; the reflective explication of art and literature itself. None of that could have emerged from pure orality. Great poetry could and did, but it was expensive and rare. To make the epics of Homer, to let them be heard, to sustain them across the years and the miles required a considerable share of the available cultural energy.

Then the vanished world of primary orality was not much missed. Not until the twentieth century, amid a burgeoning of new media for communication, did the qualms and the nostalgia resurface. Marshall McLuhan, who became the most famous spokesman for the bygone oral culture, did so in the service of an argument for modernity. He hailed the new “electric age” not for its newness but for its return to the roots of human creativity. He saw it as a revival of the old orality. “We are in our century ‘winding the tape backward,’ ” he declared, finding his metaphorical tape in one of the newest information technologies. He constructed a series of polemical contrasts: the printed word vs. the spoken word; cold/hot; static/fluid; neutral/magical; impoverished/rich; regimented/creative; mechanical/organic; separatist/integrative. “The alphabet is a technology of visual fragmentation and specialism,” he wrote. It leads to “a desert of classified data.” One way of framing McLuhan’s critique of print would be to say that print offers only a narrow channel of communication. The channel is linear and even fragmented. By contrast, speech—in the primal case, face-to-face human intercourse, alive with gesture and touch—engages all the senses, not just hearing. If the ideal of communication is a meeting of souls, then writing is a sad shadow of the ideal.

The same criticism was made of other constrained channels, created by later technologies—the telegraph, the telephone, radio, and e-mail. Jonathan Miller rephrases McLuhan’s argument in quasi-technical terms of information: “The larger the number of senses involved, the better the chance of transmitting a reliable copy of the sender’s mental state.”

In the stream of words past the ear or eye, we sense not just the items one by one but their rhythms and tones, which is to say their music. We, the listener or the reader, do not hear, or read, one word at a time; we get messages in groupings small and large. Human memory being what it is, larger patterns can be grasped in writing than in sound. The eye can glance back. McLuhan considered this damaging, or at least diminishing. “Acoustic space is organic and integral,” he said, “perceived through the simultaneous interplay of all the senses; whereas ‘rational’ or pictorial space is uniform, sequential and continuous and creates a closed world with none of the rich resonance of the tribal echoland.” For McLuhan, the tribal echoland is Eden.

By their dependence on the spoken word for information, people were drawn together into a tribal mesh . . . the spoken word is more emotionally laden than the written. . . . Audile-tactile tribal man partook of the collective unconscious, lived in a magical integral world patterned by myth and ritual, its values divine.



Up to a point, maybe. Yet three centuries earlier, Thomas Hobbes, looking from a vantage where literacy was new, had taken a less rosy view. He could see the preliterate culture more clearly: “Men lived upon gross experience,” he wrote. “There was no method; that is to say, no sowing nor planting of knowledge by itself, apart from the weeds and common plants of error and conjecture.” A sorry place, neither magical nor divine.

Was McLuhan right, or was Hobbes? If we are ambivalent, the ambivalence began with Plato. He witnessed writing’s rising dominion; he asserted its force and feared its lifelessness. The writer-philosopher embodied a paradox. The same paradox was destined to reappear in different guises, each technology of information bringing its own powers and its own fears. It turns out that the “forgetfulness” Plato feared does not arise. It does not arise because Plato himself, with his mentor Socrates and his disciple Aristotle, designed a vocabulary of ideas, organized them into categories, set down rules of logic, and so fulfilled the promise of the technology of writing. All this made knowledge more durable stuff than before.

And the atom of knowledge was the word. Or was it? For some time to come, the word continued to elude its pursuers, whether it was a fleeting burst of sound or a fixed cluster of marks. “Most literate persons, when you say, ‘Think of a word,’ at least in some vague fashion think of something before their eyes,” Ong says, “where a real word can never be at all.” Where do we look for the words, then? In the dictionary, of course. Ong also said: “It is demoralizing to remind oneself that there is no dictionary in the mind, that lexicographical apparatus is a very late accretion to language.”

It is customary to transcribe a two-place sexagesimal cuneiform number with a comma—such as “7,30.” But the scribes did not use such punctuation, and in fact their notation left the place values undefined; that is, their numbers were what we would call “floating point.” A two-place number like 7,30 could be 450 (seven 60s + thirty 1s) or 7½ (seven 1s + thirty 1/60s).

Not that Miller agrees. On the contrary: “It is hard to overestimate the subtle reflexive effects of literacy upon the creative imagination, providing as it does a cumulative deposit of ideas, images, and idioms upon whose rich and appreciating funds every artist enjoys an unlimited right of withdrawal.”

The interviewer asked plaintively, “But aren’t there corresponding gains in insight, understanding and cultural diversity to compensate detribalized man?” McLuhan responded, “Your question reflects all the institutionalized biases of literate man.”


Chapter Three

Two Wordbooks

(The Uncertainty in Our Writing, the Inconstancy in Our Letters)

In such busie, and active times, there arise more new thoughts of men, which must be signifi’d, and varied by new expressions.

—Thomas Sprat (1667)

A VILLAGE SCHOOLMASTER AND PRIEST made a book in 1604 with a rambling title that began “A Table Alphabeticall, conteyning and teaching the true writing, and understanding of hard usuall English wordes,” and went on with more hints to its purpose, which was unusual and needed explanation:

With the interpretation thereof by plaine English words, gathered for the benefit & helpe of Ladies, Gentlewomen, or any other unskilfull persons.

Whereby they may the more easily and better understand many hard English wordes, which they shall heare or read in Scriptures, Sermons, or elsewhere, and also be made able to use the same aptly themselves.

The title page omitted the name of the author, Robert Cawdrey, but included a motto from Latin—“As good not read, as not to understand”—and situated the publisher with as much formality and exactness as could be expected in a time when the address, as a specification of place, did not yet exist:

At London, Printed by I. R. for Edmund Weaver, & are to be sold at his shop at the great North doore of Paules Church.

CAWDREY’S TITLE PAGE

Even in London’s densely packed streets, shops and homes were seldom to be found by number. The alphabet, however, had a definite order— the first and second letters providing its very name—and that order had been maintained since the early Phoenician times, through all the borrowing and evolution that followed.

Cawdrey lived in a time of information poverty. He would not have thought so, even had he possessed the concept. On the contrary, he would have considered himself to be in the midst of an information explosion, which he himself was trying to abet and organize. But four centuries later, his own life is shrouded in the obscurity of missing knowledge. His Table Alphabeticall appears as a milestone in the history of information, yet of its entire first edition, just one worn copy survived into the future. When and where he was born remain unknown—probably in the late 1530s; probably in the Midlands. Parish registers notwithstanding, people’s lives were almost wholly undocumented. No one has even a definitive spelling for Cawdrey’s name (Cowdrey, Cawdry). But then, no one agreed on the spelling of most names: they were spoken, seldom written.

In fact, few had any concept of “spelling”—the idea that each word, when written, should take a particular predetermined form of letters. The word cony (rabbit) appeared variously as conny, conye, conie, connie, coni, cuny, cunny, and cunnie in a single 1591 pamphlet. Others spelled it differently. And for that matter Cawdrey himself, on the title page of his book for “teaching the true writing,” wrote wordes in one sentence and words in the next. Language did not function as a storehouse of words, from which users could summon the correct items, preformed. On the contrary, words were fugitive, on the fly, expected to vanish again thereafter. When spoken, they were not available to be compared with, or measured against, other instantiations of themselves. Every time people dipped quill in ink to form a word on paper they made a fresh choice of whatever letters seemed to suit the task. But this was changing. The availability—the solidity—of the printed book inspired a sense that the written word should be a certain way, that one form was right and others wrong. First this sense was unconscious; then it began to rise toward general awareness. Printers themselves made it their business.

To spell (from an old Germanic word) first meant to speak or to utter. Then it meant to read, slowly, letter by letter. Then, by extension, just around Cawdrey’s time, it meant to write words letter by letter. The last was a somewhat poetic usage. “Spell Eva back and Ave shall you find,” wrote the Jesuit poet Robert Southwell (shortly before being hanged and quartered in 1595). When certain educators did begin to consider the idea of spelling, they would say “right writing”—or, to borrow from Greek, “orthography.” Few bothered, but one who did was a school headmaster in London, Richard Mulcaster. He assembled a primer, titled “The first part [a second part was not to be] of the Elementarie which entreateth chefelie of the right writing of our English tung.” He published it in 1582 (“at London by Thomas Vautroullier dwelling in the blak-friers by Lud-gate”), including his own list of about eight thousand words and a plea for the idea of a dictionary:

It were a thing verie praiseworthie in my opinion, and no lesse profitable than praise worthie, if some one well learned and as laborious a man, wold gather all the words which we use in our English tung . . . into one dictionarie, and besides the right writing, which is incident to the Alphabete, wold open unto us therein, both their naturall force, and their proper use.

He recognized another motivating factor: the quickening pace of commerce and transportation made other languages a palpable presence, forcing an awareness of the English language as just one among many. “Forenners and strangers do wonder at us,” Mulcaster wrote, “both for the uncertaintie in our writing, and the inconstancie in our letters.” Language was no longer invisible like the air.

Barely 5 million people on earth spoke English (a rough estimate; no one tried to count the population of England, Scotland, or Ireland until 1801). Barely a million of those could write. Of all the world’s languages English was already the most checkered, the most mottled, the most polygenetic. Its history showed continual corruption and enrichment from without. Its oldest core words, the words that felt most basic, came from the language spoken by the Angles, Saxons, and Jutes, Germanic tribes that crossed the North Sea into England in the fifth century, pushing aside the Celtic inhabitants. Not much of Celtic penetrated the Anglo-Saxon speech, but Viking invaders brought more words from Norse and Danish: egg, sky, anger, give, get. Latin came by way of Christian missionaries; they wrote in the alphabet of the Romans, which replaced the runic scripts that spread in central and northern Europe early in the first millennium. Then came the influence of French.

Influence, to Robert Cawdrey, meant “a flowing in.” The Norman Conquest was more like a deluge, linguistically. English peasants of the lower classes continued to breed cows, pigs, and oxen (Germanic words), but in the second millennium the upper classes dined on beef, pork, and mutton (French). By medieval times French and Latin roots accounted for more than half of the common vocabulary. More alien words came when intellectuals began consciously to borrow from Latin and Greek to express concepts the language had not before needed. Cawdrey found this habit irritating. “Some men seek so far for outlandish English, that they forget altogether their mothers language, so that if some of their mothers were alive, they were not able to tell, or understand what they say,” he complained. “One might well charge them, for counterfeyting the Kings English.”

Four hundred years after Cawdrey published his book of words, John Simpson retraced Cawdrey’s path. Simpson was in certain respects his natural heir: the editor of a grander book of words, the Oxford English Dictionary. Simpson, a pale, soft-spoken man, saw Cawdrey as obstinate, uncompromising, and even pugnacious. The schoolteacher was ordained a deacon and then a priest of the Church of England in a restless time, when Puritanism was on the rise. Nonconformity led him into trouble. He seems to have been guilty of “not Conforming himself” to some of the sacraments, such as “the Cross in Baptism, and the Ring in Marriage.” As a village priest he did not care to bow down to bishops and archbishops. He preached a form of equality unwelcome to church authorities. “There was preferred secretly an Information against him for speaking diverse Words in the Pulpit, tending to the depraving of the Book of Common Prayer. . . . And so being judged a dangerous Person, if he should continue preaching, but infecting the People with Principles different from the Religion established.” Cawdrey was degraded from the priesthood and deprived of his benefice. He continued to fight the case for years, to no avail.

All that time, he collected words (“collect, gather”). He published two instructional treatises, one on catechism (“catechiser, that teacheth the principles of Christian religion”) and one on A godlie forme of householde government for the ordering of private families, and in 1604 he produced a different sort of book: nothing more than a list of words, with brief definitions.

Why? Simpson says, “We have already seen that he was committed to simplicity in language, and that he was strong-minded to the point of obstinacy.” He was still preaching—now, to preachers. “Such as by their place and calling (but especially Preachers) as have occasion to speak publiquely before the ignorant people,” Cawdrey declared in his introductory note, “are to bee admonished.” He admonishes them. “Never affect any strange ynckhorne termes.” (An inkhorn was an inkpot; by inkhorn term he meant a bookish word.) “Labour to speake so as is commonly received, and so as the most ignorant may well understand them.” And above all do not affect to speak like a foreigner:

Some far journied gentlemen, at their returne home, like as they love to go in forraine apparrell, so they will pouder their talke with over-sea language. He that commeth lately out of France, will talk French English, and never blush at the matter.

Cawdrey had no idea of listing all the words—whatever that would mean. By 1604 William Shakespeare had written most of his plays, employing a vocabulary of nearly 30,000, but these words were not available to Cawdrey or anyone else. Cawdrey did not bother with the most common words, nor the most inkhorn and Frenchified words; he listed only the “hard usual” words, words difficult enough to need some explanation but still “proper unto the tongue wherein we speake” and “plaine for all men to perceive.” He compiled 2,500. He knew that many were derived from Greek, French, and Latin (“derive, fetch from”), and he marked these accordingly. The book Cawdrey made was the first English dictionary. The word dictionary was not in it.

Although Cawdrey cited no authorities, he had relied on some. He copied the remarks about inkhorn terms and the far-journeyed gentlemen in their foreign apparel from Thomas Wilson’s successful book The Arte of Rhetorique. For the words themselves he found several sources (“source, wave, or issuing foorth of water”). He found about half his words in a primer for teaching reading, called The English Schoolemaister, by Edmund Coote, first published in 1596 and widely reprinted thereafter. Coote claimed that a schoolmaster could teach a hundred students more quickly with his text than forty without it. He found it worthwhile to explain the benefits of teaching people to read: “So more knowledge will be brought into this Land, and moe bookes bought, than otherwise would have been.” Coote included a long glossary, which Cawdrey plundered.

That Cawdrey should arrange his words in alphabetical order, to make his Table Alphabeticall, was not self-evident. He knew he could not count on even his educated readers to be versed in alphabetical order, so he tried to produce a small how-to manual. He struggled with this: whether to describe the ordering in logical, schematic terms or in terms of a step-by-step procedure, an algorithm. “Gentle reader,” he wrote—again adapting freely from Coote—

thou must learne the Alphabet, to wit, the order of the Letters as they stand, perfectly without booke, and where every Letter standeth: as b neere the beginning, n about the middest, and t toward the end. Nowe if the word, which thou art desirous to finde, begin with a then looke in the beginning of this Table, but if with v looke towards the end. Againe, if thy word beginne with ca looke in the beginning of the letter c but if with cu then looke toward the end of that letter. And so of all the rest. &c.

It was not easy to explain. Friar Johannes Balbus of Genoa tried in his 1286 Catholicon. Balbus thought he was inventing alphabetical order for the first time, and his instructions were painstaking: “For example I intend to discuss amo and bibo. I will discuss amo before bibo because a is the first letter of amo and b is the first letter of bibo and a is before b in the alphabet. Similarly . . .” He rehearsed a long list of examples and concluded: “I beg of you, therefore, good reader, do not scorn this great labor of mine and this order as something worthless.”

In the ancient world, alphabetical lists scarcely appeared until around 250 BCE, in papyrus texts from Alexandria. The great library there seems to have used at least some alphabetization in organizing its books. The need for such an artificial ordering scheme arises only with large collections of data, not otherwise ordered. And the possibility of alphabetical order arises only in languages possessing an alphabet: a discrete small symbol set with its own conventional sequence (“abecedarie, the order of the Letters, or hee that useth them”). Even then the system is unnatural. It forces the user to detach information from meaning; to treat words strictly as character strings; to focus abstractly on the configuration of the word. Furthermore, alphabetical ordering comprises a pair of procedures, one the inverse of the other: organizing a list and looking up items; sorting and searching. In either direction the procedure is recursive (“recourse, a running backe againe”). The basic operation is a binary decision: greater than or less than. This operation is performed first on one letter; then, nested as a subroutine, on the next letter; and (as Cawdrey put it, struggling with the awkwardness) “so of all the rest. &c.” This makes for astounding efficiency. The system scales easily to any size, the macrostructure being identical to the microstructure. A person who understands alphabetical order homes in on any one item in a list of a thousand or a million, unerringly, with perfect confidence. And without knowing anything about the meaning.

Not until 1613 was the first alphabetical catalogue made—not printed, but written in two small handbooks—for the Bodleian Library at Oxford. The first catalogue of a university library, made at Leiden, Holland, two decades earlier, was arranged by subject matter, as a shelf list (about 450 books), with no alphabetical index. Of one thing Cawdrey could be sure: his typical reader, a literate, book-buying Englishman at the turn of the seventeenth century, could live a lifetime without ever encountering a set of data ordered alphabetically.

More sensible ways of ordering words came first and lingered for a long time. In China the closest thing to a dictionary for many centuries was the Erya, author unknown, date unknown but probably around the third century BCE. It arranged its two thousand entries by meaning, in topical categories: kinship, building, tools and weapons, the heavens, the earth, plants and animals. Egyptian had word lists organized on philosophical or educational principles; so did Arabic. These lists were arranging not the words themselves, mainly, but rather the world: the things for which the words stood. In Germany, a century after Cawdrey, the philosopher and mathematician Gottfried Wilhelm Leibniz made this distinction explicit:

Let me mention that the words or names of all things and actions can be brought into a list in two different ways, according to the alphabet and according to nature. . . . The former go from the word to the thing, the latter from the thing to the word.

Topical lists were thought provoking, imperfect, and creative. Alphabetical lists were mechanical, effective, and automatic. Considered alphabetically, words are no more than tokens, each placed in a slot. In effect they may as well be numbers.

Meaning comes into the dictionary in its definitions, of course. Cawdrey’s crucial models were dictionaries for translation, especially a 1587 Latin-English Dictionarium by Thomas Thomas. A bilingual dictionary had a clearer purpose than a dictionary of one language alone: mapping Latin onto English made a kind of sense that translating English to English did not. Yet definitions were the point, Cawdrey’s stated purpose being after all to help people understand and use hard words. He approached the task of definition with a trepidation that remains palpable. Even as he defined his words, Cawdrey still did not quite believe in their solidity. Meanings were even more fluid than spellings. Define, to Cawdrey, was for things, not for words: “define, to shew clearely what a thing is.” It was reality, in all its richness, that needed defining. Interpret meant “open, make plaine, to shewe the sence and meaning of a thing.” For him the relationship between the thing and the word was like the relationship between an object and its shadow.

The relevant concepts had not reached maturity:

figurate, to shadowe, or represent, or to counterfaite

type, figure, example, shadowe of any thing

represent, expresse, beare shew of a thing

An earlier contemporary of Cawdrey’s, Ralph Lever, made up his own word: “saywhat, corruptly called a definition: but it is a saying which telleth what a thing is, it may more aptly be called a saywhat.” This did not catch on. It took almost another century—and the examples of Cawdrey and his successors—for the modern sense to come into focus: “Definition,” John Locke finally writes in 1690, “being nothing but making another understand by Words, what Idea the Term defin’d stands for.” And Locke still takes an operational view. Definition is communication: making another understand; sending a message.

Cawdrey borrows definitions from his sources, combines them, and adapts them. In many cases he simply maps one word onto another:

orifice, mouth

baud, whore

helmet, head peece

For a small class of words he uses a special designation, the letter k: “standeth for a kind of.” He does not consider it his job to say what kind. Thus:

crocodile, k beast

alablaster, k stone

citron, k fruit

But linking pairs of words, either as synonyms or as members of a class, can carry a lexicographer only so far. The relationships among the words of a language are far too complex for so linear an approach (“chaos, a confused heap of mingle-mangle”). Sometimes Cawdrey tries to cope by adding one or more extra synonyms, definition by triangulation:

specke, spot, or marke

cynicall, doggish, froward

vapor, moisture, ayre, hote breath, or reaking

For other words, representing concepts and abstractions, further removed from the concrete realm of the senses, Cawdrey needs to find another style altogether. He makes it up as he goes along. He must speak to his reader, in prose but not quite in sentences, and we can hear him struggle, both to understand certain words and to express his understanding.

gargarise, to wash the mouth, and throate within, by stirring some liquor up and downe in the mouth

hipocrite, such a one as in his outward apparrell, countenaunce,& behaviour, pretendeth to be another man, then he is indeede, or a deceiver

buggerie, coniunction with one of the same kinde, or of men with beasts

theologie, divinitie, the science of living blessedly for ever

Among the most troublesome were technical terms from new sciences:

cypher, a circle in numbering, of no value of it selfe, but serveth to make up the number, and to make other figures of more value

horizon, a circle, deviding the halfe of the firmament, from the other halfe which we see not

zodiack, a circle in the heaven, wherein be placed the 12 signes, and in which the Sunne is mooved

Not just the words but the knowledge was in flux. The language was examining itself. Even when Cawdrey is copying from Coote or Thomas, he is fundamentally alone, with no authority to consult.

One of Cawdrey’s hard usual words was science (“knowledge, or skill”). Science did not yet exist as an institution responsible for learning about the material universe and its laws. Natural philosophers were beginning to have a special interest in the nature of words and their meaning. They needed better than they had. When Galileo pointed his first telescope skyward and discovered sunspots in 1611, he immediately anticipated controversy— traditionally the sun was an epitome of purity—and he sensed that science could not proceed without first solving a problem of language:

So long as men were in fact obliged to call the sun “most pure and most lucid,” no shadows or impurities whatever had been perceived in it; but now that it shows itself to us as partly impure and spotty; why should we not call it “spotted and not pure”? For names and attributes must be accommodated to the essence of things, and not the essence to the names, since things come first and names afterwards.

When Isaac Newton embarked on his great program, he encountered a fundamental lack of definition where it was most needed. He began with a semantic sleight of hand: “I do not define time, space, place, and motion, as being well known to all,” he wrote deceptively. Defining these words was his very purpose. There were no agreed standards for weights and measures. Weight and measure were themselves vague terms. Latin seemed more reliable than English, precisely because it was less worn by everyday use, but the Romans had not possessed the necessary words either. Newton’s raw notes reveal a struggle hidden in the finished product. He tried expressions like quantitas materiae. Too hard for Cawdrey: “materiall, of some matter, or importance.” Newton suggested (to himself) “that which arises from its density and bulk conjointly.” He considered more words: “This quantity I designate under the name of body or mass.” Without the right words he could not proceed. Velocity, force, gravity—none of these were yet suitable. They could not be defined in terms of one another; there was nothing in visible nature at which anyone could point a finger; and there was no book in which to look them up.

As for Robert Cawdrey, his mark on history ends with the publication of his Table Alphabeticall in 1604. No one knows when he died. No one knows how many copies the printer made. There are no records (“records, writings layde up for remembrance”). A single copy made its way to the Bodleian Library in Oxford, which has preserved it. All the others disappeared. A second edition appeared in 1609, slightly expanded (“much inlarged,” the title page claims falsely) by Cawdrey’s son, Thomas, and a third and fourth appeared in 1613 and 1617, and there the life of this book ended.

It was overshadowed by a new dictionary, twice as comprehensive, An English Expositour: Teaching the Interpretation of the hardest Words used in our Language, with sundry Explications, Descriptions, and Discourses. Its compiler, John Bullokar, otherwise left as faint a mark on the historical record as Cawdrey did. He was a doctor of physic; he lived for some time in Chichester; his dates of birth and death are uncertain; he is said to have visited London in 1611 and there to have seen a dead crocodile; and little else is known. His Expositour appeared in 1616 and went through several editions in the succeeding decades. Then in 1656 a London barrister, Thomas Blount, published his Glossographia: or a Dictionary, Interpreting all such Hard Words of Whatsoever Language, now used in our refined English Tongue. Blount’s dictionary listed more than eleven thousand words, many of which, he recognized, were new, reaching London in the hurly-burly of trade and commerce—

coffa or cauphe, a kind of drink among the Turks and Persians, (and of late introduced among us) which is black, thick and bitter, destrained from Berries of that nature, and name, thought good and very wholesom: they say it expels melancholy.

—or home-grown, such as “tom-boy, a girle or wench that leaps up and down like a boy.” He seems to have known he was aiming at a moving target. The dictionary maker’s “labor,” he wrote in his preface, “would find no end, since our English tongue daily changes habit.” Blount’s definitions were much more elaborate than Cawdrey’s, and he tried to provide information about the origins of words as well.

Neither Bullokar nor Blount so much as mentioned Cawdrey. He was already forgotten. But in 1933, upon the publication of the greatest word book of all, the first editors of the Oxford English Dictionary did pay their respects to his “slim, small volume.” They called it “the original acorn” from which their oak had grown. (Cawdrey: “akecorne, k fruit.”)

Four hundred and two years after the Table Alphabeticall, the International Astronomical Union voted to declare Pluto a nonplanet, and John Simpson had to make a quick decision. He and his band of lexicographers in Oxford were working on the P’s. Pletzel, plish, pod person, point-and-shoot, and polyamorous were among the new words entering the OED. The entry for Pluto was itself relatively new. The planet had been discovered only in 1930, too late for the OED’s first edition. The name Minerva was first proposed and then rejected because there was already an asteroid Minerva. In terms of names, the heavens were beginning to fill up. Then “Pluto” was suggested by Venetia Burney, an eleven-year-old resident of Oxford. The OED caught up by adding an entry for Pluto in its second edition: “1. A small planet of the solar system lying beyond the orbit of Neptune . . . 2. The name of a cartoon dog that made its first appearance in Walt Disney’s Moose Hunt, released in April 1931.”

“We really don’t like being pushed into megachanges,” Simpson said, but he had little choice. The Disney meaning of Pluto had proved more stable than the astronomical sense, which was downgraded to “small planetary body.” Consequences rippled through the OED. Pluto was removed from the list under planet n. 3a. Plutonian was revised (not to be confused with pluton, plutey, or plutonyl).

Simpson was the sixth in a distinguished line, the editors of the Oxford English Dictionary, whose names rolled fluently off his tongue— “Murray, Bradley, Craigie, Onions, Burchfield, so however many fingers that is”—and saw himself as a steward of their traditions, as well as traditions of English lexicography extending back to Cawdrey by way of Samuel Johnson. James Murray in the nineteenth century established a working method based on index cards, slips of paper 6 inches by 4 inches. At any given moment a thousand such slips sat on Simpson’s desk, and within a stone’s throw were millions more, filling metal files and wooden boxes with the ink of two centuries. But the word-slips had gone obsolete. They had become treeware. Treeware had just entered the OED as “computing slang, freq. humorous”; blog was recognized in 2003, dot-commer in 2004, cyberpet in 2005, and the verb to Google in 2006. Simpson himself Googled often. Beside the word-slips his desk held conduits into the nervous system of the language: instantaneous connection to a worldwide network of proxy amateur lexicographers and access to a vast, interlocking set of databases growing asymptotically toward the ideal of All Previous Text. The dictionary had met cyberspace, and neither would be the same thereafter. However much Simpson loved the OED’s roots and legacy, he was leading a revolution, willy-nilly—in what it was, what it knew, what it saw. Where Cawdrey had been isolated, Simpson was connected.

The English language, spoken now by more than a billion people globally, has entered a period of ferment, and the perspective available in these venerable Oxford offices is both intimate and sweeping. The language upon which the lexicographers eavesdrop has become wild and amorphous: a great, swirling, expanding cloud of messaging and speech; newspapers, magazines, pamphlets; menus and business memos; Internet news groups and chat-room conversations; television and radio broadcasts and phonograph records. By contrast, the dictionary itself has acquired the status of a monument, definitive and towering. It exerts an influence on the language it tries to observe. It wears its authoritative role reluctantly. The lexicographers may recall Ambrose Bierce’s sardonic century-old definition: “dictionary, a malevolent literary device for cramping the growth of a language and making it hard and inelastic.” Nowadays they stress that they do not presume (or deign) to disapprove any particular usage or spelling. But they cannot disavow a strong ambition: the goal of completeness. They want every word, all the lingo: idioms and euphemisms, sacred or profane, dead or alive, the King’s English or the street’s. It is an ideal only: the constraints of space and time are ever present and, at the margins, the question of what qualifies as a word can become impossible to answer. Still, to the extent possible, the OED is meant to be a perfect record, perfect mirror of the language.

The dictionary ratifies the persistence of the word. It declares that the meanings of words come from other words. It implies that all words, taken together, form an interlocking structure: interlocking, because all words are defined in terms of other words. This could never have been an issue in an oral culture, where language was barely visible. Only when printing—and the dictionary—put the language into separate relief, as an object to be scrutinized, could anyone develop a sense of word meaning as interdependent and even circular. Words had to be considered as words, representing other words, apart from things. In the twentieth century, when the technologies of logic advanced to high levels, the potential for circularity became a problem. “In giving explanations I already have to use language full blown,” complained Ludwig Wittgenstein. He echoed Newton’s frustration three centuries earlier, but with an extra twist, because where Newton wanted words for nature’s laws, Wittgenstein wanted words for words: “When I talk about language (words, sentences, etc.) I must speak the language of every day. Is this language somehow too coarse and material for what we want to say?” Yes. And the language was always in flux.

James Murray was speaking of the language as well as the book when he said, in 1900, “The English Dictionary, like the English Constitution, is the creation of no one man, and of no one age; it is a growth that has slowly developed itself adown the ages.” The first edition of what became the OED was one of the largest books that had ever been made: A New English Dictionary on Historical Principles, 414,825 words in ten weighty volumes, presented to King George V and President Calvin Coolidge in 1928. The work had taken decades; Murray himself was dead; and the dictionary was understood to be out of date even as the volumes were bound and sewn. Several supplements followed, but not till 1989 did the second edition appear: twenty volumes, totaling 22,000 pages. It weighed 138 pounds. The third edition is different. It is weightless, taking its shape in the digital realm. It may never again involve paper and ink. Beginning in the year 2000, a revision of the entire contents began to appear online in quarterly installments, each comprising several thousand revised entries and hundreds of new words.

Cawdrey had begun work naturally enough with the letter A, and so had James Murray in 1879, but Simpson chose to begin with M. He was wary of the A’s. To insiders it had long been clear that the OED as printed was not a seamless masterpiece. The early letters still bore scars of the immaturity of the uncertain work in Murray’s first days. “Basically he got here, sorted his suitcases out and started setting up text,” Simpson said. “It just took them a long time to sort out their policy and things, so if we started at A, then we’d be making our job doubly difficult. I think they’d sorted themselves out by . . . well, I was going to say D, but Murray always said that E was the worst letter, because his assistant, Henry Bradley, started E, and Murray always said that he did that rather badly. So then we thought, maybe it’s safe to start with G, H. But you get to G and H and there’s I, J, K, and you know, you think, well, start after that.”

The first thousand entries from M to mahurat went online in the spring of 2000. A year later, the lexicographers reached words starting with me: me-ism (a creed for modern times), meds (colloq. for drugs), medspeak (doctors’ jargon), meet-and-greet (a N. Amer. type of social occasion), and an assortment of combined forms under media (baron, circus, darling, hype, savvy) and mega- (pixel, bitch, dose, hit, trend). This was no longer a language spoken by 5 million mostly illiterate inhabitants of a small island. As the OED revised the entries letter by letter, it also began adding neologisms wherever they arose; waiting for the alphabetical sequence became impractical. Thus one installment in 2001 saw the arrival of acid jazz, Bollywood, channel surfing, double-click, emoticon, feel-good, gangsta, hyperlink, and many more. Kool-Aid was recognized as a new word, not because the OED feels obliged to list proprietary names (the original Kool-Ade powdered drink had been patented in the United States in 1927) but because a special usage could no longer be ignored: “to drink the Kool-Aid: to demonstrate unquestioning obedience or loyalty.” The growth of this peculiar expression since the use of a powdered beverage in a mass poisoning in Guyana in 1978 bespoke a certain density of global communication.

But they were no slaves to fashion, these Oxford lexicographers. As a rule a neologism needs five years of solid evidence for admission to the canon. Every proposed word undergoes intense scrutiny. The approval of a new word is a solemn matter. It must be in general use, beyond any particular place of origin; the OED is global, recognizing words from everywhere English is spoken, but it does not want to capture local quirks. Once added, a word cannot come out. A word can go obsolete or rare, but the most ancient and forgotten words have a way of reappearing—rediscovered or spontaneously reinvented—and in any case they are part of the language’s history. All 2,500 of Cawdrey’s words are in the OED, perforce. For thirty-one of them Cawdrey’s little book was the first known usage. For a few Cawdrey is all alone. This is troublesome. The OED is irrevocably committed. Cawdrey, for example, has “onust, loaden, overcharged”; so the OED has “loaded, burdened,” but it is an outlier, a one-off. Did Cawdrey make it up? “I’m tending towards the view that he was attempting to reproduce vocabulary he had heard or seen,” Simpson said. “But I can’t be absolutely sure.” Cawdrey has “hallucinate, to deceive, or blind”; the OED duly gave “to deceive” as the first sense of the word, though it never found anyone else who used it that way. In cases like these, the editors can add their double caveat “Obs. rare.” But there it is.

For the twenty-first-century OED a single source is never enough. Strangely, considering the vastness of the enterprise and its constituency, individual men and women strive to have their own nonce-words ratified by the OED. Nonce-word, in fact, was coined by James Murray himself. He got it in. An American psychologist, Sondra Smalley, coined the word codependency in 1979 and began lobbying for it in the eighties; the editors finally drafted an entry in the nineties, when they judged the word to have become established. W. H. Auden declared that he wanted to be recognized as an OED word coiner—and he was, at long last, for motted, metalogue, spitzy, and others. The dictionary had thus become engaged in a feedback loop. It inspired a twisty self-consciousness in the language’s users and creators. Anthony Burgess whinged in print about his inability to break through: “I invented some years ago the word amation, for the art or act of making love, and still think it useful. But I have to persuade others to use it in print before it is eligible for lexicographicizing (if that word exists)”—he knew it did not. “T. S. Eliot’s large authority got the shameful (in my view) juvescence into the previous volume of the Supplement.” Burgess was quite sure that Eliot simply misspelled juvenescence. If so, the misspelling was either copied or reprised twenty-eight years later by Stephen Spender, so juvescence has two citations, not one. The OED admits that it is rare.

As hard as the OED tries to embody the language’s fluidity, it cannot help but serve as an agent of its crystallization. The problem of spelling poses characteristic difficulties. “Every form in which a word has occurred throughout its history” is meant to be included. So for mackerel (“a well-known sea-fish, Scomber scombrus, much used for food”) the second edition in 1989 listed nineteen alternative spellings. The unearthing of sources never ends, though, so the third edition’s revised entry in 2002 listed no fewer than thirty: maccarel, mackaral, mackarel, mackarell, mackerell, mackeril, mackreel, mackrel, mackrell, mackril, macquerel, macquerell, macrel, macrell, macrelle, macril, macrill, makarell, makcaral, makerel, makerell, makerelle, makral, makrall, makreill, makrel, makrell, makyrelle, maquerel, and maycril. As lexicographers, the editors would never declare these alternatives to be wrong: misspellings. They do not wish to declare their choice of spelling for the headword, mackerel, to be “correct.” They emphasize that they examine the evidence and choose “the most common current spelling.” Even so, arbitrary considerations come into play: “Oxford’s house style occasionally takes precedence, as with verbs which can end -ize or -ise, where the -ize spelling is always used.” They know that no matter how often and how firmly they disclaim a prescriptive authority, a reader will turn to the dictionary to find out how a word should be spelled. They cannot escape inconsistencies. They feel obliged to include words that make purists wince. A new entry as of December 2003 memorialized nucular: “= nuclear a. (in various senses).” Yet they refuse to count evident misprints found by way of Internet searches. They do not recognize straight-laced, even though statistical evidence finds that bastardized form outnumbering strait-laced. For the crystallization of spelling, the OED offers a conventional explanation: “Since the invention of the printing press, spelling has become much less variable, partly because printers wanted uniformity and partly because of a growing interest in language study during the Renaissance.” This is true. But it omits the role of the dictionary itself, arbitrator and exemplar.

For Cawdrey the dictionary was a snapshot; he could not see past his moment in time. Samuel Johnson was more explicitly aware of the dictionary’s historical dimension. He justified his ambitious program in part as a means of bringing a wild thing under control—the wild thing being the language, “which, while it was employed in the cultivation of every species of literature, has itself been hitherto neglected; suffered to spread, under the direction of chance, into wild exuberance; resigned to the tyranny of time and fashion; and exposed to the corruptions of ignorance, and caprices of innovation.” Not until the OED, though, did lexicography attempt to reveal the whole shape of a language across time. The OED becomes a historical panorama. The project gains poignancy if the electronic age is seen as a new age of orality, the word breaking free from the bonds of cold print. No publishing institution better embodies those bonds, but the OED, too, tries to throw them off. The editors feel they can no longer wait for a new word to appear in print, let alone in a respectably bound book, before they must take note. For tighty-whities (men’s underwear), new in 2007, they cite a typescript of North Carolina campus slang. For kitesurfer, they cite a posting to the Usenet newsgroup alt.kite and later a New Zealand newspaper found via an online database. Bits in the ether.

When Murray began work on the new dictionary, the idea was to find the words, and with them the signposts to their history. No one had any idea how many words were there to be found. By then the best and most comprehensive dictionary of English was American: Noah Webster’s, seventy thousand words. That was a baseline. Where were the rest to be discovered? For the first editors of what became the OED, it went almost without saying that the source, the wellspring, should be the literature of the language—particularly the books of distinction and quality. The dictionary’s first readers combed Milton and Shakespeare (still the single most quoted author, with more than thirty thousand references), Fielding and Swift, histories and sermons, philosophers and poets. Murray announced in a famous public appeal in 1879:

A thousand readers are wanted. The later sixteenth-century literature is very fairly done; yet here several books remain to be read. The seventeenth century, with so many more writers, naturally shows still more unexplored territory.

He considered the territory to be large but bounded. The founders of the dictionary explicitly meant to find every word, however many that would ultimately be. They planned a complete inventory. Why should they not? The number of books was unknown but not unlimited, and the number of words in those books was countable. The task seemed formidable but finite.

It no longer seems finite. Lexicographers are accepting the language’s boundlessness. They know by heart Murray’s famous remark: “The circle of the English language has a well-defined centre but no discernable circumference.” In the center are the words everyone knows. At the edges, where Murray placed slang and cant and scientific jargon and foreign border crossers, everyone’s sense of the language differs and no one’s can be called “standard.”

Murray called the center “well defined,” but infinitude and fuzziness can be seen there. The easiest, most common words—the words Cawdrey had no thought of including—require, in the OED, the most extensive entries. The entry for make alone would fill a book: it teases apart ninety-eight distinct senses of the verb, and some of these senses have a dozen or more subsenses. Samuel Johnson saw the problem with these words and settled on a solution: he threw up his hands.

My labor has likewise been much increased by a class of verbs too frequent in the English language, of which the signification is so loose and general, the use so vague and indeterminate, and the senses detorted so widely from the first idea, that it is hard to trace them through the maze of variation, to catch them on the brink of utter inanity, to circumscribe them by any limitations, or interpret them by any words of distinct and settled meaning; such are bear, break, come, cast, full, get, give, do, put, set, go, run, make, take, turn, throw. If of these the whole power is not accurately delivered, it must be remembered, that while our language is yet living, and variable by the caprice of every one that speaks it, these words are hourly shifting their relations, and can no more be ascertained in a dictionary, than a grove, in the agitation of a storm, can be accurately delineated from its picture in the water.

Johnson had a point. These are words that any speaker of English can press into new service at any time, on any occasion, alone or in combination, inventively or not, with hopes of being understood. In every revision, the OED’s entry for a word like make subdivides further and thus grows larger. The task is unbounded in an inward-facing direction.

The more obvious kind of unboundedness appears at the edges. Neologism never ceases. Words are coined by committee: transistor, Bell Laboratories, 1948. Or by wags: booboisie, H. L. Mencken, 1922. Most arise through spontaneous generation, organisms appearing in a petri dish, like blog (c. 1999). One batch of arrivals includes agroterrorism, bada-bing, bahookie (a body part), beer pong (a drinking game), bippy (as in, you bet your ———), chucklesome, cypherpunk, tuneage, and wonky. None are what Cawdrey would have seen as “hard, usual words,” and none are anywhere near Murray’s well-defined center, but they now belong to the common language. Even bada-bing: “Suggesting something happening suddenly, emphatically, or easily and predictably; ‘Just like that!’, ‘Presto!’ ” The historical citations begin with a 1965 audio recording of a comedy routine by Pat Cooper and continue with newspaper clippings, a television news transcript, and a line of dialogue from the first Godfather movie: “You’ve gotta get up close like this and bada-bing! you blow their brains all over your nice Ivy League suit.” The lexicographers also provide an etymology, an exquisite piece of guesswork: “Origin uncertain. Perh. imitative of the sound of a drum roll and cymbal clash. Perh. cf. Italian bada bene mark well.”

The English language no longer has such a thing as a geographic center, if it ever did. The universe of human discourse always has backwaters. The language spoken in one valley diverges from the language of the next valley, and so on. There are more valleys now than ever, even if the valleys are not so isolated. “We are listening to the language,” said Peter Gilliver, an OED lexicographer and resident historian. “When you are listening to the language by collecting pieces of paper, that’s fine, but now it’s as if we can hear everything said anywhere. Take an expatriate community living in a non-English-speaking part of the world, expatriates who live at Buenos Aires or something. Their English, the English that they speak to one another every day, is full of borrowings from local Spanish. And so they would regard those words as part of their idiolect, their personal vocabulary.” Only now they may also speak in chat rooms and on blogs. When they coin a word, anyone may hear. Then it may or may not become part of the language.

If there is an ultimate limit to the sensitivity of lexicographers’ ears, no one has yet found it. Spontaneous coinages can have an audience of one. They can be as ephemeral as atomic particles in a bubble chamber. But many neologisms require a level of shared cultural knowledge. Perhaps bada-bing would not truly have become part of twenty-first-century English had it not been for the common experience of viewers of a particular American television program (though it is not cited by the OED).

The whole word hoard—the lexis—constitutes a symbol set of the language. It is the fundamental symbol set, in one way: words are the first units of meaning any language recognizes. They are recognized universally. But in another way it is far from fundamental: as communication evolves, messages in a language can be broken down and composed and transmitted in much smaller sets of symbols: the alphabet; dots and dashes; drumbeats high and low. These symbol sets are discrete. The lexis is not. It is messier. It keeps on growing. Lexicography turns out to be a science poorly suited to exact measurement. English, the largest and most widely shared language, can be said very roughly to possess a number of units of meaning that approaches a million. Linguists have no special yardsticks of their own; when they try to quantify the pace of neologism, they tend to look to the dictionary for guidance, and even the best dictionary runs from that responsibility. The edges always blur. A clear line cannot be drawn between word and unword.

So we count as we can. Robert Cawdrey’s little book, making no pretense to completeness, contained a vocabulary of only 2,500. We possess now a more complete dictionary of English as it was circa 1600: the subset of the OED comprising words then current. That vocabulary numbers 60,000 and keeps growing, because the discovery of sixteenth-century sources never ends. Even so, it is a tiny fraction of the words used four centuries later. The explanation for this explosive growth, from 60,000 to a million, is not simple. Much of what now needs naming did not yet exist, of course. And much of what existed was not recognized. There was no call for transistor in 1600, nor nanobacterium, nor webcam, nor fen-phen. Some of the growth comes from mitosis. The guitar divides into the electric and the acoustic; other words divide in reflection of delicate nuances (as of March 2007 the OED assigned a new entry to prevert as a form of pervert, taking the view that prevert was not just an error but a deliberately humorous effect). Other new words appear without any corresponding innovation in the world of real things. They crystallize in the solvent of universal information.

What, in the world, is a mondegreen? It is a misheard lyric (as when, for example, the Christian hymn is heard as “Lead on, O kinky turtle . . .”). In sifting the evidence, the OED first cites a 1954 essay in Harper’s Magazine by Sylvia Wright: “What I shall hereafter call mondegreens, since no one else has thought up a word for them.” She explained the idea and the word this way:

When I was a child, my mother used to read aloud to me from Percy’s Reliques, and one of my favorite poems began, as I remember:

Ye Highlands and ye Lowlands,

Oh, where hae ye been?

They hae slain the Earl Amurray,

And Lady Mondegreen.

There the word lay, for some time. A quarter-century later, William Safire discussed the word in a column about language in The New York Times Magazine. Fifteen years after that, Steven Pinker, in his book The Language Instinct, offered a brace of examples, from “A girl with colitis goes by” to “Gladly the cross-eyed bear,” and observed, “The interesting thing about mondegreens is that the mishearings are generally less plausible than the intended lyrics.” But it was not books or magazines that gave the word its life; it was Internet sites, compiling mondegreens by the thousands. The OED recognized the word in June 2004.

A mondegreen is not a transistor, inherently modern. Its modernity is harder to explain. The ingredients—songs, words, and imperfect understanding—are all as old as civilization. Yet for mondegreens to arise in the culture, and for mondegreen to exist in the lexis, required something new: a modern level of linguistic self-consciousness and interconnectedness. People needed to mishear lyrics not just once, not just several times, but often enough to become aware of the mishearing as a thing worth discussing. They needed to have other such people with whom to share the recognition. Until the most modern times, mondegreens, like countless other cultural or psychological phenomena, simply did not need to be named. Songs themselves were not so common; not heard, anyway, on elevators and mobile phones. The word lyrics, meaning the words of a song, did not exist until the nineteenth century. The conditions for mondegreens took a long time to ripen. Similarly, the verb to gaslight now means “to manipulate a person by psychological means into questioning his or her own sanity”; it exists only because enough people saw the 1944 film of that title and could assume that their listeners had seen it, too. Might not the language Cawdrey spoke—which was, after all, the abounding and fertile language of Shakespeare—have found use for such a word? No matter: the technology for gaslight had not been invented. Nor had the technology for motion pictures.

The lexis is a measure of shared experience, which comes from interconnectedness. The number of users of the language forms only the first part of the equation: jumping in four centuries from 5 million English speakers to a billion. The driving factor is the number of connections between and among those speakers. A mathematician might say that messaging grows not geometrically, but combinatorially, which is much, much faster. “I think of it as a saucepan under which the temperature has been turned up,” Gilliver said. “Any word, because of the interconnectedness of the English-speaking world, can spring from the backwater. And they are still backwaters, but they have this instant connection to ordinary, everyday discourse.” Like the printing press, the telegraph, and the telephone before it, the Internet is transforming the language simply by transmitting information differently. What makes cyberspace different from all previous information technologies is its intermixing of scales from the largest to the smallest without prejudice, broadcasting to the millions, narrowcasting to groups, instant messaging one to one.

This comes as quite an unexpected consequence of the invention of computing machinery. At first, that had seemed to be about numbers.


Chapter Four

To Throw the Powers of Thought into Wheel-Work

(Lo, the Raptured Arithmetician)

Light almost solar has been extracted from the refuse of fish; fire has been sifted by the lamp of Davy; and machinery has been taught arithmetic instead of poetry.

—Charles Babbage (1832)

NO ONE DOUBTED THAT Charles Babbage was brilliant. Nor did anyone quite understand the nature of his genius, which remained out of focus for a long time. What did he hope to achieve? For that matter, what, exactly, was his vocation? On his death in London in 1871 the Times obituarist declared him “one of the most active and original of original thinkers” but seemed to feel he was best known for his long, cranky crusade against street musicians and organ-grinders. He might not have minded. He was multifarious and took pride in it. “He showed great desire to inquire into the causes of things that astonish childish minds,” said an American eulogist. “He eviscerated toys to ascertain their manner of working.” Babbage did not quite belong in his time, which called itself the Steam Age or the Machine Age. He did revel in the uses of steam and machinery and considered himself a thoroughly modern man, but he also pursued an assortment of hobbies and obsessions—cipher cracking, lock picking, lighthouses, tree rings, the post—whose logic became clearer a century later. Examining the economics of the mail, he pursued a counterintuitive insight, that the significant cost comes not from the physical transport of paper packets but from their “verification”—the calculation of distances and the collection of correct fees—and thus he invented the modern idea of standardized postal rates. He loved boating, by which he meant not “the manual labor of rowing but the more intellectual art of sailing.” He was a train buff. He devised a railroad recording device that used inking pens to trace curves on sheets of paper a thousand feet long: a combination seismograph and speedometer, inscribing the history of a train’s velocity and all the bumps and shakes along the way.

As a young man, stopping at an inn in the north of England, he was amused to hear that his fellow travelers had been debating his trade:

“The tall gentleman in the corner,” said my informant, “maintained you were in the hardware line; whilst the fat gentleman who sat next to you at supper was quite sure that you were in the spirit trade. Another of the party declared that they were both mistaken: he said you were travelling for a great iron-master.”

“Well,” said I, “you, I presume, knew my vocation better than our friends.”

“Yes,” said my informant, “I knew perfectly well that you were in the Nottingham lace trade.”

He might have been described as a professional mathematician, yet here he was touring the country’s workshops and manufactories, trying to discover the state of the art in machine tools. He noted, “Those who enjoy leisure can scarcely find a more interesting and instructive pursuit than the examination of the workshops of their own country, which contain within them a rich mine of knowledge, too generally neglected by the wealthier classes.” He himself neglected no vein of knowledge. He did become expert on the manufacture of Nottingham lace; also the use of gunpowder in quarrying limestone; precision glass cutting with diamonds; and all known uses of machinery to produce power, save time, and communicate signals. He analyzed hydraulic presses, air pumps, gas meters, and screw cutters. By the end of his tour he knew as much as anyone in England about the making of pins. His knowledge was practical and methodical. He estimated that a pound of pins required the work of ten men and women for at least seven and a half hours, drawing wire, straightening wire, pointing the wire, twisting and cutting heads from the spiral coils, tinning or whitening, and finally papering. He computed the cost of each phase in millionths of a penny. And he noted that this process, when finally perfected, had reached its last days: an American had invented an automatic machine to accomplish the same task, faster.

Babbage invented his own machine, a great, gleaming engine of brass and pewter, comprising thousands of cranks and rotors, cogs and gearwheels, all tooled with the utmost precision. He spent his long life improving it, first in one and then in another incarnation, but all, mainly, in his mind. It never came to fruition anywhere else. It thus occupies an extreme and peculiar place in the annals of invention: a failure, and also one of humanity’s grandest intellectual achievements. It failed on a colossal scale, as a scientific-industrial project “at the expense of the nation, to be held as national property,” financed by the Treasury for almost twenty years, beginning in 1823 with a Parliamentary appropriation of £1,500 and ending in 1842, when the prime minister shut it down. Later, Babbage’s engine was forgotten. It vanished from the lineage of invention. Later still, however, it was rediscovered, and it became influential in retrospect, to shine as a beacon from the past.

Like the looms, forges, naileries, and glassworks he studied in his travels across northern England, Babbage’s machine was designed to manufacture vast quantities of a certain commodity. The commodity was numbers. The engine opened a channel from the corporeal world of matter to a world of pure abstraction. The engine consumed no raw materials—input and output being weightless—but needed a considerable force to turn the gears. All that wheel-work would fill a room and weigh several tons. Producing numbers, as Babbage conceived it, required a degree of mechanical complexity at the very limit of available technology. Pins were easy, compared with numbers.

It was not natural to think of numbers as a manufactured commodity. They existed in the mind, or in ideal abstraction, in their perfect infinitude. No machine could add to the world’s supply. The numbers produced by Babbage’s engine were meant to be those with significance: numbers with a meaning. For example, 2.096910013 has a meaning, as the logarithm of 125. (Whether every number has a meaning would be a conundrum for the next century.) The meaning of a number could be expressed as a relationship to other numbers, or as the answer to a certain question of arithmetic. Babbage himself did not speak in terms of meaning; he tried to explain his engine pragmatically, in terms of putting numbers into the machine and seeing other numbers come out, or, a bit more fancifully, in terms of posing questions to the machine and expecting an answer. Either way, he had trouble getting the point across. He grumbled:

On two occasions I have been asked,—“Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?” In one case a member of the Upper, and in the other a member of the Lower, House put this question. I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

Anyway, the machine was not meant to be a sort of oracle, to be consulted by individuals who would travel from far and wide for mathematical answers. The engine’s chief mission was to print out numbers en masse. For portability, the facts of arithmetic could be expressed in tables and bound in books.

To Babbage the world seemed made of such facts. They were the “constants of Nature and Art.” He collected them everywhere. He compiled a Table of Constants of the Class Mammalia: wherever he went he timed the breaths and heartbeats of pigs and cows. He invented a statistical methodology with tables of life expectancy for the somewhat shady business of life insurance. He drew up a table of the weight in Troy grains per square yard of various fabrics: cambric, calico, nankeen, muslins, silk gauze, and “caterpillar veils.” Another table revealed the relative frequencies of all the double-letter combinations in English, French, Italian, German, and Latin. He researched, computed, and published a Table of the Relative Frequency of the Causes of Breaking of Plate Glass Windows, distinguishing 464 different causes, no less than fourteen of which involved “drunken men, women, or boys.” But the tables closest to his heart were the purest: tables of numbers and only numbers, marching neatly across and down the pages in stately rows and columns, patterns for abstract appreciation.

A book of numbers: amid all the species of information technology, how peculiar and powerful an object this is. “Lo! the raptured arithmetician!” wrote Élie de Joncourt in 1762. “Easily satisfied, he asks no Brussels lace, nor a coach and six.” Joncourt’s own contribution was a small quarto volume registering the first 19,999 triangular numbers. It was a treasure box of exactitude, perfection, and close reckoning. These numbers were so simple, just the sums of the first n whole numbers: 1, 3 (1+2), 6 (1+2+3), 10 (1+2+3+4), 15, 21, 28, and so on. They had interested number theorists since Pythagoras. They offered little in the way of utility, but Joncourt rhapsodized about his pleasure in compiling them and Babbage quoted him with heartfelt sympathy: “Numbers have many charms, unseen by vulgar eyes, and only discovered to the unwearied and respectful sons of Art. Sweet joy may arise from such contemplations.”

Tables of numbers had been part of the book business even before the beginning of the print era. Working in Baghdad in the ninth century, Abu Abdullah Mohammad Ibn Musa al-Khwarizmi, whose name survives in the word algorithm, devised tables of trigonometric functions that spread west across Europe and east to China, made by hand and copied by hand, for hundreds of years. Printing brought number tables into their own: they were a natural first application for the mass production of data in the raw. For people in need of arithmetic, multiplication tables covered more and more territory: 10 × 1,000, then 10 × 10,000, and later as far as 1,000 × 1,000. There were tables of squares and cubes, roots and reciprocals. An early form of table was the ephemeris or almanac, listing positions of the sun, moon, and planets for sky-gazers. Tradespeople found uses for number books. In 1582 Simon Stevin produced Tafelen van Interest, a compendium of interest tables for bankers and moneylenders. He promoted the new decimal arithmetic “to astrologers, land-measurers, measurers of tapestry and wine casks and stereometricians, in general, mint masters and merchants all.” He might have added sailors. When Christopher Columbus set off for the Indies, he carried as an aid to navigation a book of tables by Regiomontanus printed in Nuremberg two decades after the invention of moveable type in Europe.

Joncourt’s book of triangular numbers was purer than any of these— which is also to say useless. Any arbitrary triangular number can be found (or made) by an algorithm: multiply n by n + 1 and divide by 2. So Joncourt’s whole compendium, as a bundle of information to be stored and transmitted, collapses in a puff to a one-line formula. The formula contains all the information. With it, anyone capable of simple multiplication (not many were) could generate any triangular number on demand. Joncourt knew this. Still he and his publisher, M. Husson, at the Hague, found it worthwhile to set the tables in metal type, three pairs of columns to a page, each pair listing thirty natural numbers alongside their corresponding triangular numbers, from 1(1) to 19,999(199,990,000), every numeral chosen individually by the compositor from his cases of metal type and lined up in a galley frame and wedged into an iron chase to be placed upon the press.
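The collapse of the whole volume into its formula can be shown directly. A minimal sketch in Python, with sample rows chosen only for illustration:

# Joncourt's entire book reduces to one rule: the nth triangular number
# is n * (n + 1) / 2. A few rows, ending with his final entry.

def triangular(n):
    return n * (n + 1) // 2

for n in (1, 2, 3, 4, 5, 19_999):
    print(n, triangular(n))
# the last line prints 19999 199990000, the closing value of Joncourt's table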

Why? Besides the obsession and the ebullience, the creators of number tables had a sense of their economic worth. Consciously or not, they reckoned the price of these special data by weighing the difficulty of computing them versus looking them up in a book. Precomputation plus data storage plus data transmission usually came out cheaper than ad hoc computation. “Computers” and “calculators” existed: they were people with special skills, and all in all, computing was costly.

Beginning in 1767, England’s Board of Longitude ordered published a yearly Nautical Almanac, with position tables for the sun, moon, stars, planets, and moons of Jupiter. Over the next half century a network of computers did the work—thirty-four men and one woman, Mary Edwards of Ludlow, Shropshire, all working from their homes. Their painstaking labor paid £70 a year. Computing was a cottage industry. Some mathematical sense was required but no particular genius; rules were laid out in steps for each type of calculation. In any case the computers, being human, made errors, so the same work was often farmed out twice for the sake of redundancy. (Unfortunately, being human, computers were sometimes caught saving themselves labor by copying from one another.) To manage the information flow the project employed a Comparer of the Ephemeris and Corrector of the Proofs. Communication between the computers and comparer went by post, men on foot or on horseback, a few days per message.

A seventeenth-century invention had catalyzed the whole enterprise. This invention was itself a species of number, given the name logarithm. It was number as tool. Henry Briggs explained:

Logarithmes are Numbers invented for the more easie working of questions in Arithmetike and Geometrie. The name is derived of Logos, which signifies Reason, and Arithmos, signifying Numbers. By them all troublesome Multiplications and Divisions in Arithmetike are avoided, and performed onely by Addition in stead of Multiplication, and by Subtraction in stead of Division.

In 1614 Briggs was a professor of geometry—the first professor of geometry—at Gresham College, London, later to be the birthplace of the Royal Society. Without logarithms he had already created two books of tables, A Table to find the Height of the Pole, the Magnetic Declination being given and Tables for the Improvement of Navigation, when a book came from Edinburgh promising to “take away all the difficultie that heretofore hath beene in mathematical calculations.”

There is nothing (right well beloved Students in the Mathematickes) that is so troublesome to Mathematicall practice, nor that doth more molest and hinder Calculators, then the Multiplications, Divisions, square and cubical Extractions of great numbers, which besides the tedious expence of time, are for the most part subject to many slippery errors.

This new book proposed a method that would do away with most of the expense and the errors. It was like an electric flashlight sent to a lightless world. The author was a wealthy Scotsman, John Napier (or Napper, Nepair, Naper, or Neper), the eighth laird of Merchiston Castle, a theologian and well-known astrologer who also made a hobby of mathematics. Briggs was agog. “Naper, lord of Markinston, hath set my head and hands a work,” he wrote. “I hope to see him this summer, if it please God, for I never saw book, which pleased me better, and made me more wonder.” He made his pilgrimage to Scotland and their first meeting, as he reported later, began with a quarter hour of silence: “spent, each beholding other almost with admiration before one word was spoke.”

Briggs broke the trance: “My Lord, I have undertaken this long journey purposely to see your person, and to know by what engine of wit or ingenuity you came first to think of this most excellent help unto astronomy, viz. the Logarithms; but, my Lord, being by you found out, I wonder nobody else found it out before, when now known it is so easy.” He stayed with the laird for several weeks, studying.

In modern terms a logarithm is an exponent. A student learns that the logarithm of 100, using 10 as the base, is 2, because 100 = 10². The logarithm of 1,000,000 is 6, because 6 is the exponent in the expression 1,000,000 = 10⁶. To multiply two numbers, a calculator could just look up their logarithms and add those. For example:

100 × 1,000,000 = 10² × 10⁶ = 10⁸

Looking up and adding are easier than multiplying.
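With modern tools the whole trick fits in a few lines. A sketch in Python, using the numbers from the example just given:

import math

# Multiplication by way of logarithms: look up (here, compute) the two
# base-10 logarithms, add them, and convert the sum back into a number.
a, b = 100, 1_000_000
log_sum = math.log10(a) + math.log10(b)   # 2 + 6 = 8
print(round(10 ** log_sum))               # 100000000, that is, 10 to the 8th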

But Napier did not express his idea this way, in terms of exponents. He grasped the thing viscerally: he was thinking in terms of a relationship between differences and ratios. A series of numbers with a fixed difference is an arithmetic progression: 0, 1, 2, 3, 4, 5 . . . When the numbers are separated by a fixed ratio, the progression is geometric: 1, 2, 4, 8, 16, 32 . . . Set these progressions side by side,

0 1 2 3 4 5 . . . (base 2 logarithms)

1 2 4 8 16 32 . . . (natural numbers)

and the result is a crude table of logarithms—crude, because the whole-number exponents are the easy ones. A useful table of logarithms had to fill in the gaps, with many decimal places of accuracy.

In Napier’s mind was an analogy: differences are to ratios as addition is to multiplication. His thinking crossed over from one plane to another, from spatial relationships to pure numbers. Aligning these scales side by side, he gave a calculator a practical means of converting multiplication into addition—downshifting, in effect, from the difficult task to the easier one. In a way, the method is a kind of translation, or encoding. The natural numbers are encoded as logarithms. The calculator looks them up in a table, the code book. In this new language, calculation is easy: addition instead of multiplication, or multiplication instead of exponentiation. When the work is done, the result is translated back into the language of natural numbers. Napier, of course, could not think in terms of encoding.
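Napier’s code book can be imitated with the crude base-2 table above. The sketch below is a modern convenience, not anything Napier wrote; like the crude table, it handles only whole-number exponents.

# A toy code book built from the two progressions above: natural numbers
# encoded as base-2 logarithms. Multiplication in one column becomes
# addition in the other, and the sum is then decoded back.

table = {2**k: k for k in range(20)}        # natural number -> logarithm
inverse = {k: n for n, k in table.items()}  # logarithm -> natural number

def multiply(x, y):
    return inverse[table[x] + table[y]]     # encode, add, decode

print(multiply(8, 32))   # 256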

Briggs revised and extended the necessary number sequences and published a book of his own, Logarithmicall Arithmetike, full of pragmatic applications. Besides the logarithms he presented tables of latitude of the sun’s declination year by year; showed how to find the distance between any two places, given their latitudes and longitudes; and laid out a star guide with declinations, distance to the pole, and right ascension. Some of this represented knowledge never compiled and some was oral knowledge making the transition to print, as could be seen in the not-quite-formal names of the stars: the Pole Starre, girdle of Andromeda, Whales Bellie, the brightest in the harpe, and the first in the great Beares taile next her rump. Briggs also considered matters of finance, offering rules for computing with interest, backward and forward in time. The new technology was a watershed: “It may be here also noted that the use of a 100 pound for a day at the rate of 8, 9, 10, or the like for a yeare hath beene scarcely known, till by Logarithms it was found out: for otherwise it requires so many laborious extractions of roots, as will cost more paines than the knowledge of the thing is accompted to be worth.” Knowledge has a value and a discovery cost, each to be counted and weighed.
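The point about daily interest can be reconstructed in the same spirit, with figures chosen only for illustration: one day’s growth at 8 per cent a year requires a 365th root, and by logarithms the root becomes a single division. A rough sketch in Python:

import math

# One day's interest on 100 pounds at 8 percent per year: the daily factor
# is the 365th root of 1.08. With logarithms, extracting the root is a division.
principal, annual_rate = 100.0, 0.08
log_daily = math.log10(1 + annual_rate) / 365   # divide the logarithm instead of taking a root
daily_factor = 10 ** log_daily
print(principal * (daily_factor - 1))           # roughly 0.021 pounds of interest for one day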






Even this exciting discovery took several years to travel as far as Johannes Kepler, who employed it in perfecting his celestial tables in 1627, based on the laboriously acquired data of Tycho Brahe. “A Scottish baron has appeared on the scene (his name I have forgotten) who has done an excellent thing,” Kepler wrote a friend, “transforming all multiplication and division into addition and subtraction.” Kepler’s tables were far more accurate—perhaps thirty times more—than any of his medieval predecessors, and the accuracy made possible an entirely new thing, his harmonious heliocentric system, with planets orbiting the sun in ellipses. From that time until the arrival of electronic machines, the majority of human computation was performed by means of logarithms. A teacher of Kepler’s sniffed, “It is not fitting for a professor of mathematics to manifest childish joy just because reckoning is made easier.” But why not? Across the centuries they all felt that joy in reckoning: Napier and Briggs, Kepler and Babbage, making their lists, building their towers of ratio and proportion, perfecting their mechanisms for transforming numbers into numbers. And then the world’s commerce validated their pleasure.

Charles Babbage was born on Boxing Day 1791, near the end of the century that began with Newton. His home was on the south side of the River Thames in Walworth, Surrey, still a rural hamlet, though the London Bridge was scarcely a half hour’s walk even for a small boy. He was the son of a banker, who was himself the son and grandson of goldsmiths. In the London of Babbage’s childhood, the Machine Age made itself felt everywhere. A new breed of impresario was showing off machinery in exhibitions. The shows that drew the biggest crowds featured automata—mechanical dolls, ingenious and delicate, with wheels and pinions mimicking life itself. Charles Babbage went with his mother to John Merlin’s Mechanical Museum in Hanover Square, full of clockwork and music boxes and, most interesting, simulacra of living things. A metal swan bent its neck to catch a metal fish, moved by hidden motors and cams. In the artist’s attic workshop Charles saw a pair of naked dancing women, gliding and bowing, crafted in silver at one-fifth life size. Merlin himself, their elderly creator, said he had devoted years to these machines, his favorites, still unfinished. One of the figurines especially impressed Charles with its (or her) grace and seeming liveliness. “This lady attitudinized in a most fascinating manner,” he recalled. “Her eyes were full of imagination, and irresistible.” Indeed, when he was a man in his forties he found Merlin’s silver dancer at an auction, bought it for £35, installed it on a pedestal in his home, and dressed its nude form in custom finery.

The boy also loved mathematics—an interest far removed from the mechanical arts, as it seemed. He taught himself in bits and pieces from such books as he could find. In 1810 he entered Trinity College, Cambridge—Isaac Newton’s domain and still the moral center of mathematics in England. Babbage was immediately disappointed: he discovered that he already knew more of the modern subject than his tutors, and the further knowledge he sought was not to be found there, maybe not anywhere in England. He began to acquire foreign books— especially books from Napoleon’s France, with which England was at war. From a specialty bookseller in London he got Lagrange’s Théorie des fonctions analytiques and “the great work of Lacroix, on the Differential and Integral Calculus.”

He was right: at Cambridge mathematics was stagnating. A century earlier Newton had been only the second professor of mathematics the university ever had; all the subject’s power and prestige came from his legacy. Now his great shadow lay across English mathematics as a curse. The most advanced students learned his brilliant and esoteric “fluxions” and the geometrical proofs of his Principia. In the hands of anyone but Newton, the old methods of geometry brought little but frustration. His peculiar formulations of the calculus did his heirs little good. They were increasingly isolated. The English professoriate “regarded any attempt at innovation as a sin against the memory of Newton,” one nineteenth-century mathematician said. For the running river of modern mathematics a student had to look elsewhere, to the Continent, to “analysis” and the language of differentiation as invented by Newton’s rival and nemesis, Gottfried Wilhelm Leibniz. Fundamentally, there was only one calculus. Newton and Leibniz knew how similar their work was— enough that each accused the other of plagiarism. But they had devised incompatible systems of notation—different languages—and in practice these surface differences mattered more than the underlying sameness. Symbols and operators were what a mathematician had to work with, after all. Babbage, unlike most students, made himself fluent in both— “the dots of Newton, the d’s of Leibnitz”—and felt he had seen the light. “It is always difficult to think and reason in a new language.”

Indeed, language itself struck him as a fit subject for philosophical study—a subject into which he found himself sidetracked from time to time. Thinking about language, while thinking in language, leads to puzzles and paradoxes. Babbage tried for a while to invent, or construct, a universal language, a symbol system that would be free of local idiosyncrasies and imperfections. He was not the first to try. Leibniz himself had claimed to be on the verge of a characteristica universalis that would give humanity “a new kind of an instrument increasing the powers of reason far more than any optical instrument has ever aided the power of vision.” As philosophers came face to face with the multiplicity of the world’s dialects, they so often saw language not as a perfect vessel for truth but as a leaky sieve. Confusion about the meanings of words led to contradictions. Ambiguities and false metaphors were surely not inherent in the nature of things, but arose from a poor choice of signs. If only one could find a proper mental technology, a true philosophical language! Its symbols, properly chosen, must be universal, transparent, and immutable, Babbage argued. Working systematically, he managed to create a grammar and began to write down a lexicon but ran aground on a problem of storage and retrieval—stopped “by the apparent impossibility of arranging signs in any consecutive order, so as to find, as in a dictionary, the meaning of each when wanted.” Nevertheless he felt that language was a thing a person could invent. Ideally, language should be rationalized, made predictable and mechanical. The gears should mesh.

Still an undergraduate, he aimed at a new revival of English mathematics—a suitable cause for founding an advocacy group and launching a crusade. He joined with two other promising students, John Herschel and George Peacock, to form what they named the Analytical Society, “for the propagation of d’s” and against “the heresy of dots,” or as Babbage said, “the Dot-age of the University.” (He was pleased with his own “wicked pun.”) In their campaign to free the calculus from English dotage, Babbage lamented “the cloud of dispute and national acrimony, which has been thrown over its origin.” Never mind if it seemed French. He declared, “We have now to re-import the exotic, with nearly a century of foreign improvement, and to render it once more indigenous among us.” They were rebels against Newton in the heart of Newton-land. They met over breakfast every Sunday after chapel.

“Of course we were much ridiculed by the Dons,” Babbage recalled. “It was darkly hinted that we were young infidels, and that no good would come of us.” Yet their evangelism worked: the new methods spread from the bottom up, students learning faster than their teachers. “The brows of many a Cambridge moderator were elevated, half in ire, half in admiration, at the unusual answers which began to appear in examination papers,” wrote Herschel. The dots of Newton faded from the scene, his fluxions replaced by the notation and language of Leibniz.

Meanwhile Babbage never lacked companions with whom he could quaff wine or play whist for six-penny points. With one set of friends he formed a Ghost Club, dedicated to collecting evidence for and against occult spirits. With another set he founded a club called the Extractors, meant to sort out issues of sanity and insanity according to a set of procedures:



1 Every member shall communicate his address to the Secretary once in six months.

2 If this communication is delayed beyond twelve months, it shall be taken for granted that his relatives had shut him up as insane.

3 Every effort legal and illegal shall be made to get him out of the madhouse [hence the name “Extractors”].

4 Every candidate for admission as a member shall produce six certificates. Three that he is sane and three others that he is insane.


But the Analytical Society was serious. It was with no irony, all earnestness, that these mathematical friends, Babbage and Herschel and Peacock, resolved to “do their best to leave the world a wiser place than they found it.” They rented rooms and read papers to one another and published their “Transactions.” And in those rooms, as Babbage nodded over a book of logarithms, one of them interrupted: “Well, Babbage, what are you dreaming about?”

“I am thinking that all these Tables might be calculated by machinery,” he replied.

Anyway that was how Babbage reported the conversation fifty years later. Every good invention needs a eureka story, and he had another in reserve. He and Herschel were laboring together to produce a manuscript of logarithm tables for the Cambridge Astronomical Society. These very logarithms had been computed before; logarithms must always be computed and recomputed and compared and mistrusted. No wonder Babbage and Herschel, laboring over their own manuscript at Cambridge, found the work tedious. “I wish to God these calculations had been executed by steam,” cried Babbage, and Herschel replied simply, “It is quite possible.”

Steam was the driver of all engines, the enabler of industry. If only for these few decades, the word stood for power and force and all that was vigorous and modern. Formerly, water or wind drove the mills, and most of the world’s work still depended on the brawn of people and horses and livestock. But hot steam, generated by burning coal and brought under control by ingenious inventors, had portability and versatility. It replaced muscles everywhere. It became a watchword: people on the go would now “steam up” or “get more steam on” or “blow off steam.” Benjamin Disraeli hailed “your moral steam which can work the world.” Steam became the most powerful transmitter of energy known to humanity.

It was odd even so that Babbage thought to exert this potent force in a weightless realm—applying steam to thought and arithmetic. Numbers were the grist for his mill. Racks would slide, pinions would turn, and the mind’s work would be done.

It should be done automatically, Babbage declared. What did it mean to call a machine “automatic”? For him it was not just a matter of semantics but a principle for judging a machine’s usefulness. Calculating devices, such as they were, could be divided into two classes: the first requiring human intervention, the second truly self-acting. To decide whether a machine qualified as automatic, he needed to ask a question that would have been simpler if the words input and output had been invented: “Whether, when the numbers on which it is to operate are placed in the instrument, it is capable of arriving at its result by the mere motion of a spring, a descending weight, or any other constant force.” This was a farsighted standard. It eliminated virtually all the devices ever used or conceived as tools for arithmetic—and there had been many, from the beginning of recorded history. Pebbles in bags, knotted strings, and tally sticks of wood or bone served as short-term memory aids. Abacuses and slide rules applied more complex hardware to abstract reckoning. Then, in the seventeenth century, a few mathematicians conceived the first calculating devices worthy of the name machine, for adding and—through repetition of the adding—multiplying. Blaise Pascal made an adding machine in 1642 with a row of revolving disks, one for each decimal digit. Three decades later Leibniz improved on Pascal by using a cylindrical drum with protruding teeth to manage “carrying” from one digit to the next.


Fundamentally, however, the prototypes of Pascal and Leibniz remained closer to the abacus—a passive register of memory states—than to a kinetic machine. As Babbage saw, they were not automatic.

It would not occur to him to use a device for a one-time calculation, no matter how difficult. Machinery excelled at repetition—“intolerable labour and fatiguing monotony.” The demand for computation, he foresaw, would grow as the uses of commerce, industry, and science came together. “I will yet venture to predict, that a time will arrive, when the accumulating labour which arises from the arithmetical application of mathematical formulae, acting as a constantly retarding force, shall ultimately impede the useful progress of the science, unless this or some equivalent method is devised for relieving it from the overwhelming incumbrance of numerical detail.”

In the information-poor world, where any table of numbers was a rarity, centuries went by before people began systematically to gather different printed tables in order to check one against another. When they did, they found unexpected flaws. For example, Taylor’s Logarithms, the standard quarto printed in London in 1792, contained (it eventually transpired) nineteen errors of either one or two digits. These were itemized in the Nautical Almanac, for, as the Admiralty knew well, every error was a potential shipwreck.

Unfortunately, one of the nineteen corrections proved erroneous, so the next year’s Nautical Almanac printed an “erratum of the errata.” This in turn introduced yet another error. “Confusion is worse confounded,” declared The Edinburgh Review. The next almanac would have to put forth an “Erratum of the Erratum of the Errata in Taylor’s Logarithms.”

Particular mistakes had their own private histories. When Ireland established its Ordnance Survey, to map the entire country on a finer scale than any nation had ever accomplished, the first order of business was to ensure that the surveyors—teams of sappers and miners—had 250 sets of logarithmic tables, relatively portable and accurate to seven places. The survey office compared thirteen tables published in London over the preceding two hundred years, as well as tables from Paris, Avignon, Berlin, Leipzig, Gouda, Florence, and China. Six errors were discovered in almost every volume—and they were the same six errors. The conclusion was inescapable: these tables had been copied, one from another, at least in part.

Errors arose from mistakes in carrying. Errors arose from the inversion of digits, sometimes by the computers themselves and sometimes by the printer. Printers were liable to transpose digits in successive lines of type. What a mysterious, fallible thing the human mind seemed to be! All these errors, one commentator mused, “would afford a curious subject of metaphysical speculation respecting the operation of the faculty of memory.” Human computers had no future, he saw: “It is only by the mechanical fabrication of tables that such errors can be rendered impossible.”

Babbage proceeded by exposing mechanical principles within the numbers. He saw that some of the structure could be revealed by computing differences between one sequence and another. The “calculus of finite differences” had been explored by mathematicians (especially the French) for a hundred years. Its power was to reduce high-level calculations to simple addition, ready to be routinized. For Babbage the method was so crucial that he named his machine from its first conception the Difference Engine.

By way of example (for he felt the need to publicize and explain his conception many times as the years passed) Babbage offered the Table of Triangular Numbers. Like many of the sequences of concern, this was a ladder, starting on the ground and rising ever higher:

1, 3, 6, 10, 15, 21 . . .

He illustrated the idea by imagining a child placing groups of marbles on the sand:






Suppose the child wants to know “how many marbles the thirtieth or any other distant group might contain.” (It is a child after Babbage’s own heart.) “Perhaps he might go to papa to obtain this information; but I much fear papa would snub him, and would tell him that it was nonsense—that it was useless—that nobody knew the number, and so forth.” Understandably papa knows nothing of the Table of Triangular Numbers published at the Hague by É. de Joncourt, professor of philosophy. “If papa fail to inform him, let him go to mamma, who will not fail to find means to satisfy her darling’s curiosity.” Meanwhile, Babbage answers the question by means of a table of differences. The first column contains the number sequence in question. The next columns are derived by repeated subtractions, until a constant is reached—a column made up entirely of a single number.






Any polynomial function can be reduced by the method of differences, and all well-behaved functions, including logarithms, can be effectively approximated. Equations of higher degree require higher-order differences. Babbage offered another concrete geometrical example that requires a table of third differences: piles of cannonballs in the form of a triangular pyramid—the triangular numbers translated to three dimensions.
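
The reduction by repeated subtraction is mechanical enough to state in a few lines. A sketch of the method of differences as described here, showing that the triangular numbers collapse to a constant after two subtractions and the pyramidal (cannonball) numbers after three:

```python
def difference_table(seq):
    """Take differences of consecutive terms, repeatedly, until a constant column appears."""
    columns = [list(seq)]
    while len(set(columns[-1])) > 1:          # stop when the latest column is constant
        col = columns[-1]
        columns.append([b - a for a, b in zip(col, col[1:])])
    return columns

triangular = [n * (n + 1) // 2 for n in range(1, 8)]            # 1, 3, 6, 10, 15, 21, 28
pyramidal = [n * (n + 1) * (n + 2) // 6 for n in range(1, 8)]   # 1, 4, 10, 20, 35, 56, 84

print(difference_table(triangular))  # constant 1 after two orders of differences
print(difference_table(pyramidal))   # constant 1 after three orders of differences
```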






The Difference Engine would run this process in reverse: instead of repeated subtraction to find the differences, it would generate sequences of numbers by a cascade of additions. To accomplish this, Babbage conceived a system of figure wheels, marked with the numerals 0 to 9, placed along an axis to represent the decimal digits of a number: the units, the tens, the hundreds, and so on. The wheels would have gears. The gears along each axis would mesh with the gears of the next, to add the successive digits. As the machinery transmitted motion, wheel to wheel, it would be transmitting information, in tiny increments, the numbers summing across the axes. A mechanical complication arose, of course, when any sum passed 9. Then a unit had to be carried to the next decimal place. To manage this, Babbage placed a projecting tooth on each wheel, between the 9 and 0. The tooth would push a lever, which would in turn transmit its motion to the next wheel above.
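
The engine's forward process, generating terms by addition alone, can be sketched without the hardware. In the toy version below the decimal figure wheels and the carrying tooth are abstracted away; each "column" is simply a number, and one pass of additions advances the whole table by a step:

```python
def difference_engine(columns, steps):
    """Generate new terms of a sequence by cascaded additions, Difference Engine style.

    columns[0] is the current term; columns[1] its first difference, and so on
    down to the constant difference at the end.
    """
    cols = list(columns)
    terms = [cols[0]]
    for _ in range(steps):
        # Working from the top, each column absorbs the old value of the one below it.
        for i in range(len(cols) - 1):
            cols[i] += cols[i + 1]
        terms.append(cols[0])
    return terms

# Triangular numbers: first term 1, first difference 2, constant second difference 1.
print(difference_engine([1, 2, 1], steps=9))
# [1, 3, 6, 10, 15, 21, 28, 36, 45, 55]
```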

At this point in the history of computing machinery, a new theme appears: the obsession with time. It occurred to Babbage that his machine had to compute faster than the human mind and as fast as possible. He had an idea for parallel processing: number wheels arrayed along an axis could add a row of the digits all at once. “If this could be accomplished,” he noted, “it would render additions and subtractions with numbers having ten, twenty, fifty, or any number of figures, as rapid as those operations are with single figures.” He could see a problem, however. The digits of a single addition could not be managed with complete independence because of the carrying. The carries could overflow and cascade through a whole set of wheels. If the carries were known in advance, then the additions could proceed in parallel. But that knowledge did not become available in timely fashion. “Unfortunately,” he wrote, “there are multitudes of cases in which the carriages that become due are only known in successive periods of time.” He counted up the time, assuming one second per operation: to add two fifty-digit numbers might take only nine seconds in itself, but the carrying, in the worst case, could require fifty seconds more. Bad news indeed. “Multitudes of contrivances were designed, and almost endless drawings made, for the purpose of economizing the time,” Babbage wrote ruefully. By 1820 he had settled on a design. He acquired his own lathe, used it himself and hired metalworkers, and in 1822 managed to present the Royal Society with a small working model, gleaming and futuristic.
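
Babbage's worry can be seen in a toy form: if the digits are added in parallel, the time the carries then take depends on how far a single carry may ripple, and that depends on the digits themselves. A sketch that merely counts the longest carry chain, using fifty-digit numbers as in his example:

```python
def longest_carry_chain(a, b):
    """Add two equal-length digit strings and report the longest run of consecutive carries."""
    carry, run, longest = 0, 0, 0
    for da, db in zip(reversed(a), reversed(b)):   # rightmost digit first
        total = int(da) + int(db) + carry
        carry = total // 10
        run = run + 1 if carry else 0
        longest = max(longest, run)
    return longest

fifty_nines = "9" * 50
print(longest_carry_chain(fifty_nines, "0" * 49 + "1"))          # 50: the worst case Babbage feared
print(longest_carry_chain("1234512345" * 5, "1111111111" * 5))   # 0: no carries at all
```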






BABBAGE’S WHEEL-WORK

He was living in London near the Regent’s Park as a sort of gentleman philosopher, publishing mathematical papers and occasionally lecturing to the public on astronomy. He married a wealthy young woman from Shropshire, Georgiana Whitmore, the youngest of eight sisters. Beyond what money she had, he was supported mainly by a £300 allowance from his father—whom he resented as a tyrannical, ungenerous, and above all close-minded old man. “It is scarcely too much to assert that he believes nothing he hears, and only half of what he sees,” Babbage wrote his friend Herschel. When his father died, in 1827, Babbage inherited a fortune of £100,000. He briefly became an actuary for a new Protector Life Assurance Company and computed statistical tables rationalizing life expectancies. He tried to get a university professorship, so far unsuccessfully, but he had an increasingly lively social life, and in scholarly circles people were beginning to know his name. With Herschel’s help he was elected a fellow of the Royal Society.

Even his misfires kindled his reputation. On behalf of The Edinburgh Journal of Science Sir David Brewster sent him a classic in the annals of rejection letters: “It is with no inconsiderable degree of reluctance that I decline the offer of any Paper from you. I think, however, you will upon reconsideration of the subject be of opinion that I have no other alternative. The subjects you propose for a series of Mathematical and Metaphysical Essays are so very profound, that there is perhaps not a single subscriber to our Journal who could follow them.” On behalf of his nascent invention, Babbage began a campaign of demonstrations and letters. By 1823 the Treasury and the Exchequer had grown interested. He promised them “logarithmic tables as cheap as potatoes”—how could they resist? Logarithms saved ships. The Lords of the Treasury authorized a first appropriation of £1,500.

As an abstract conception the Difference Engine generated excitement that did not need to wait for anything so mundane as the machine’s actual construction. The idea was landing in fertile soil. Dionysius Lardner, a popular lecturer on technical subjects, devoted a series of public talks to Babbage, hailing his “proposition to reduce arithmetic to the dominion of mechanism,—to substitute an automaton for a compositor,—to throw the powers of thought into wheel-work.” The engine “must, when completed,” he said, “produce important effects, not only on the progress of science, but on that of civilization.” It would be the rational machine. It would be a junction point for two roads—mechanism and thought. Its admirers sometimes struggled with their explanations of this intersection: “The question is set to the instrument,” Henry Colebrooke told the Astronomical Society, “or the instrument is set to the question.” Either way, he said, “by simply giving motion the solution is wrought.”

But the engine made slower progress in the realm of brass and wrought iron. Babbage tore out the stables in back of his London house and replaced them with a forge, foundry, and fireproofed workshop. He engaged Joseph Clement, a draftsman and inventor, self-educated, the son of a village weaver who had made himself into England’s preeminent mechanical engineer. Babbage and Clement realized that they would have to make new tools. Inside a colossal iron frame the design called for the most intricate and precise parts—axles, gears, springs, and pins, and above all figure wheels by the hundreds and then thousands. Hand tools could never produce the components with the needed precision. Before Babbage could have a manufactory of number tables, he would have to build new manufactories of parts. The rest of the Industrial Revolution, too, needed standardization in its parts: interchangeable screws of uniform thread count and pitch; screws as fundamental units. The lathes of Clement and his journeymen began to produce them.






A WOODCUT IMPRESSION (1853) OF A SMALL PORTION OF THE DIFFERENCE ENGINE

As the difficulties grew, so did Babbage’s ambitions. After ten years, the engine stood twenty-four inches high, with six vertical axles and dozens of wheels, capable of computing six-figure results. Ten years after that, the scale—on paper—had reached 160 cubic feet, 15 tons, and 25,000 parts, and the paper had spread, too, the drawings covering more than 400 square feet. The level of complexity was confounding. Babbage solved the problem of adding many digits at once by separating the “adding motions” from the “carrying motions” and then staggering the timing of the carries. The addition would begin with a rush of grinding gears, first the odd-numbered columns of dials, then the even columns. Then the carries would recoil across the rows. To keep the motion synchronized, parts of the machine would need to “know” at critical times that a carry was pending. The information was conveyed by the state of a latch. For the first time, but not the last, a device was invested with memory. “It is in effect a memorandum taken by the machine,” wrote his publicizer, Dionysius Lardner. Babbage himself was self-conscious about anthropomorphizing but could not resist. “The mechanical means I employed to make these carriages,” he suggested, “bears some slight analogy to the operation of the faculty of memory.”
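
The separation Babbage settled on can be sketched in the terms used here: an "adding motion" that turns every figure wheel at once without carrying, latches that remember where a carry became due, and a separate "carrying motion" that then works through the latches. This is only an illustrative model; the staggering of odd and even columns is omitted, and the wheels are just a list of digits, least significant first:

```python
def add_on_wheels(wheels, addend):
    """Two-phase addition on decimal figure wheels (least significant digit first)."""
    # Phase 1, the adding motion: every wheel turns at once; a latch records each pending carry.
    latches = [False] * len(wheels)
    for i, digit in enumerate(addend):
        wheels[i] = (wheels[i] + digit) % 10
        if wheels[i] < digit:                 # the wheel passed from 9 back to 0
            latches[i] = True                 # "a memorandum taken by the machine"
    # Phase 2, the carrying motion: work through the latches, which may set new ones.
    while any(latches):
        new_latches = [False] * len(wheels)
        for i, pending in enumerate(latches):
            if pending and i + 1 < len(wheels):
                wheels[i + 1] = (wheels[i + 1] + 1) % 10
                if wheels[i + 1] == 0:
                    new_latches[i + 1] = True
        latches = new_latches
    return wheels

print(add_on_wheels([9, 9, 9, 0], [1, 0, 0, 0]))   # [0, 0, 0, 1], i.e. 999 + 1 = 1000
```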

In ordinary language, to describe even this basic process of addition required a great effulgence of words, naming the metal parts, accounting for their interactions, and sorting out interdependencies that multiplied to form a long chain of causality. Lardner’s own explanation of “carrying,” for example, was epic. A single isolated instant of the action involved a dial, an index, a thumb, an axis, a trigger, a notch, a hook, a claw, a spring, a tooth, and a ratchet wheel:

Now, at the moment that the division between 9 and 0 on the dial B passes under the index, a thumb placed on the axis of this dial touches a trigger which raises out of the notch of the hook which sustains the claw just mentioned, and allows it to fall back by the recoil of the spring, and drop into the next tooth of the ratchet wheel.

Hundreds of words later, summing up, Lardner resorted to a metaphor suggesting fluid dynamics:

There are two systems of waves of mechanical action continually flowing from the bottom to the top; and two streams of similar action constantly passing from the right to the left. The crests of the first system of adding waves fall upon the last difference, and upon every alternate one proceeding upwards. . . . The first stream of carrying action passes from right to left along the highest row and every alternate row.

This was one way of abstracting from the particular—the particulars being so intricate. And then he surrendered. “Its wonders, however, are still greater in its details,” he wrote. “We despair of doing it justice.”

Nor were ordinary draftsman’s plans sufficient for describing this machine that was more than a machine. It was a dynamical system, its many parts each capable of several modes or states, sometimes at rest and sometimes in motion, propagating their influence along convoluted channels. Could it ever be specified completely, on paper? Babbage, for his own purposes, devised a new formal tool, a system of “mechanical notation” (his term). This was a language of signs meant to represent not just the physical form of a machine but its more elusive properties: its timing and its logic. It was an extraordinary ambition, as Babbage himself appreciated. In 1826 he proudly reported to the Royal Society “On a Method of Expressing by Signs the Action of Machinery.” In part it was an exercise in classification. He analyzed the different ways in which something—motion, or power—could be “communicated” through a system. There were many ways. A part could receive its influence simply by being attached to another part, “as a pin on a wheel, or a wheel and pinion on the same axis.” Or transmission could occur “by stiff friction.” A part might be driven constantly by another part “as happens when a wheel is driven by a pinion”—or not constantly, “as is the case when a stud lifts a bolt once in the course of a revolution.” Here a vision of logical branching entered the scheme: the path of communication would vary depending on the alternative states of some part of the machine. Babbage’s mechanical notation followed naturally from his work on symbolic notation in mathematical analysis. Machinery, like mathematics, needed rigor and definition for progress. “The forms of ordinary language were far too diffuse,” he wrote. “The signs, if they have been properly chosen, and if they should be generally adopted, will form as it were an universal language.” Language was never a side issue for Babbage.

He finally won a university post, at Cambridge: the prestigious Lucasian Professorship of Mathematics, formerly occupied by Newton. As in Newton’s time, the work was not onerous. Babbage did not have to teach students, deliver lectures, or even live in Cambridge, and this was just as well, because he was also becoming a popular fixture of London social life. At home at One Dorset Street he hosted a regular Saturday soirée that drew a glittering crowd—politicians, artists, dukes and duchesses, and the greatest English scientists of the age: Charles Darwin, Michael Faraday, and Charles Lyell, among others.


They marveled at his calculating machine and, on display nearby, the dancing automaton of his youth. (In invitations he would write, “I hope you intend to patronise the ‘Silver Lady.’ She is to appear in new dresses and decorations.”) He was a mathematical raconteur—that was no contradiction, in this time and place. Lyell reported approvingly that he “jokes and reasons in high mathematics.” He published a much-quoted treatise applying probability theory to the theological question of miracles. With tongue in cheek he wrote Alfred, Lord Tennyson, to suggest a correction for the poet’s couplet: “Every minute dies a man, / Every minute one is born.”

I need hardly point out to you that this calculation would tend to keep the sum total of the world’s population in a state of perpetual equipoise, whereas it is a well-known fact that the said sum total is constantly on the increase. I would therefore take the liberty of suggesting that in the next edition of your excellent poem the erroneous calculation to which I refer should be corrected as follows: “Every moment dies a man / And one and a sixteenth is born.” I may add that the exact figures are 1.167, but something must, of course, be conceded to the laws of metre.

Fascinated with his own celebrity, he kept a scrapbook—“the pros and cons in parallel columns, from which he obtained a sort of balance,” as one visitor described it. “I was told repeatedly that he spent all his days in gloating and grumbling over what people said of him.”

But progress on the engine, the main source of his fame, was faltering. In 1832 he and his engineer Clement produced a working demonstration piece. Babbage displayed it at his parties to guests who found it miraculous or merely puzzling. The Difference Engine stands—for a replica works today, in the Science Museum in London—as a milestone of what could be achieved in precision engineering. In the composition of its alloys, the exactness of its dimensions, the interchangeability of its parts, nothing surpassed this segment of an unfinished machine. Still, it was a curio. And it was as far as Babbage could go.

He and his engineer fell into disputes. Clement demanded more and more money from Babbage and from the Treasury, which began to suspect profiteering. He withheld parts and drawings and fought over control of the specialized machine tools in their workshops. The government, after more than a decade and £17,000, was losing faith in Babbage, and he in the government. In his dealing with lords and ministers Babbage could be imperious. He was developing a sour view of the Englishman’s attitude toward technological innovation: “If you speak to him of a machine for peeling a potato, he will pronounce it impossible: if you peel a potato with it before his eyes, he will declare it useless, because it will not slice a pineapple.” They no longer saw the point.

“What shall we do to get rid of Mr. Babbage and his calculating machine?” Prime Minister Robert Peel wrote one of his advisers in August 1842. “Surely if completed it would be worthless as far as science is concerned. . . . It will be in my opinion a very costly toy.” He had no trouble finding voices inimical to Babbage in the civil service. Perhaps the most damning was George Biddell Airy, the Astronomer Royal, a starched and methodical figure, who with no equivocation told Peel precisely what he wanted to hear: that the engine was useless. He added this personal note: “I think it likely he lives in a sort of dream as to its utility.” Peel’s government terminated the project. As for Babbage’s dream, it continued. It had already taken another turn. The engine in his mind had advanced into a new dimension. And he had met Ada Byron.






CHARLES BABBAGE (1860)

In the Strand, at the north end of the Lowther shopping arcade, visitors thronged to the National Gallery of Practical Science, “Blending Instruction with Amusement,” a combination toy store and technology show set up by an American entrepreneur. For the admission price of a shilling, a visitor could touch the “electrical eel,” listen to lectures on the newest science, and watch a model steamboat cruising a seventy-foot trough and the Perkins steam gun emitting a spray of bullets. For a guinea, she could sit for a “daguerreotype” or “photographic” portrait, by which a faithful and pleasing likeness could be obtained in “less than One Second.” Or she could watch, as young Augusta Ada Byron did, a weaver demonstrating the automated Jacquard loom, in which the patterns to be woven in cloth were encoded as holes punched into pasteboard cards.

Ada was “the child of love,” her father had written, “—though born in bitterness, and nurtured in convulsion.” Her father was a poet. When she was barely a month old, in 1816, the already notorious Lord Byron, twenty-seven, and the bright, wealthy, and mathematically knowledgeable Anne Isabella Milbanke (Annabella), twenty-three, separated after a year of marriage. Byron left England and never saw his daughter again. Her mother refused to tell her who her father was until she was eight and he died in Greece, an international celebrity. The poet had begged for any news of his daughter: “Is the Girl imaginative?—at her present age I have an idea that I had many feelings & notions which people would not believe if I stated them now.” Yes, she was imaginative.

She was a prodigy, clever at mathematics, encouraged by tutors, talented in drawing and music, fantastically inventive and profoundly lonely. When she was twelve, she set about inventing a means of flying. “I am going to begin my paper wings tomorrow,” she wrote to her mother. She hoped “to bring the art of flying to very great perfection. I think of writing a book of Flyology illustrated with plates.” For a while she signed her letters “your very affectionate Carrier Pigeon.” She asked her mother to find a book illustrating bird anatomy, because she was reluctant “to dissect even a bird.” She analyzed her daily situation with a care for logic.

Miss Stamp desires me to say that at present she is not particularly pleased with me on account of some very foolish conduct yesterday about a simple thing, and which she said was not only foolish but showed a spirit of inattention, and though today she has not had reason to be dissatisfied with me on the whole yet she says that she can not directly efface the recollection of the past.

She was growing up in a well-kept cloister of her mother’s arranging. She had years of sickliness, a severe bout of measles, and episodes of what was called neurasthenia or hysteria. (“When I am weak,” she wrote, “I am always so exceedingly terrified, at nobody knows what, that I can hardly help having an agitated look & manner.”) Green drapery enclosed the portrait of her father that hung in one room. In her teens she developed a romantic interest in her tutor, which led to a certain amount of sneaking about the house and gardens and to lovemaking as intimate as possible without, she said, actual “connection.” The tutor was dismissed. Then, in the spring, wearing white satin and tulle, the seventeen-year-old made her ritual debut at court, where she met the king and queen, the most important dukes, and the French diplomat Talleyrand, whom she described as an “old monkey.”

A month later she met Charles Babbage. With her mother, she went to see what Lady Byron called his “thinking machine,” the portion of the Difference Engine in his salon. Babbage saw a sparkling, self-possessed young woman with porcelain features and a notorious name, who managed to reveal that she knew more mathematics than most men graduating from university. She saw an imposing forty-one-year-old, authoritative eyebrows anchoring his strong-boned face, who possessed wit and charm and did not wear these qualities lightly. He seemed a kind of visionary— just what she was seeking. She admired the machine, too. An onlooker reported: “While other visitors gazed at the working of this beautiful instrument with the sort of expression, and I dare say the sort of feeling, that some savages are said to have shown on first seeing a looking-glass or hearing a gun, Miss Byron, young as she was, understood its working, and saw the great beauty of the invention.” Her feeling for the beauty and abstractions of mathematics, fed only in morsels from her succession of tutors, was overflowing. It had no outlet. A woman could not attend university in England, nor join a scientific society (with two exceptions: the botanical and horticultural).

Ada became a tutor for the young daughters of one of her mother’s friends. When writing to them, she signed herself, “your affectionate & untenable Instructress.” On her own she studied Euclid. Forms burgeoned in her mind. “I do not consider that I know a proposition,” she wrote another tutor, “until I can imagine to myself a figure in the air, and go through the construction & demonstration without any book or assistance whatever.” She could not forget Babbage, either, or his “gem of all mechanism.” To another friend she reported her “great anxiety about the machine.” Her gaze turned inward, often. She liked to think about herself thinking.






AUGUSTA ADA BYRON KING, COUNTESS OF LOVELACE, AS PAINTED IN 1836 BY MARGARET CARPENTER. “I CONCLUDE SHE IS BENT ON DISPLAYING THE WHOLE EXPANSE OF MY CAPACIOUS JAW BONE, UPON WHICH I THINK THE WORD MATHEMATICS SHOULD BE WRITTEN.”

Babbage himself had moved far beyond the machine on display in his drawing room; he was planning a new machine, still an engine of computation but transmuted into another species. He called this the Analytical Engine. Motivating him was a quiet awareness of the Difference Engine’s limitations: it could not, merely by adding differences, compute every sort of number or solve any mathematical problem. Inspiring him, as well, was the loom on display in the Strand, invented by Joseph-Marie Jacquard, controlled by instructions encoded and stored as holes punched in cards.

What caught Babbage’s fancy was not the weaving, but rather the encoding, from one medium to another, of patterns. The patterns would appear in damask, eventually, but first were “sent to a peculiar artist.” This specialist, as he said,

punches holes in a set of pasteboard cards in such a manner that when those cards are placed in a Jacquard loom, it will then weave upon its produce the exact pattern designed by the artist.

The notion of abstracting information away from its physical substrate required careful emphasis. Babbage explained, for example, that the weaver might choose different threads and different colors—“but in all these cases the form of the pattern will be precisely the same.” As Babbage conceived his machine now, it raised this very process of abstraction to higher and higher degrees. He meant the cogs and wheels to handle not just numbers but variables standing in for numbers. Variables were to be filled or determined by the outcomes of prior calculations, and, further, the very operations—such as addition or multiplication—were to be changeable, depending on prior outcomes. He imagined these abstract information quantities being stored in cards: variable cards and operation cards. He thought of the machine as embodying laws and of the cards as communicating these laws. Lacking a ready-made vocabulary, he found it awkward to express his fundamental working concepts; for example,

how the machine could perform the act of judgment sometimes required during an analytical inquiry, when two or more different courses presented themselves, especially as the proper course to be adopted could not be known in many cases until all the previous portion had been gone through.

He made clear, though, that information—representations of number and process—would course through the machinery. It would pass to and from certain special physical locations, which Babbage named a store, for storage, and a mill, for action.
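
Babbage left no programming notation to quote, so the following is only a schematic sketch of the division of labour he names: a store of variables, a mill that performs one operation at a time, and cards that say which variables and which operation. The card layout here is invented for illustration, not Babbage's:

```python
# A toy "store and mill": variables live in the store; an operation card names the
# operation; variable cards name which locations feed the mill and which receive its result.
store = {"V1": 5, "V2": 7, "V3": 0, "V4": 0}

operations = {
    "+": lambda a, b: a + b,
    "*": lambda a, b: a * b,
}

# Each card: (operation, source variable, source variable, destination variable)
cards = [
    ("+", "V1", "V2", "V3"),   # V3 receives V1 + V2
    ("*", "V3", "V1", "V4"),   # V4 receives V3 * V1: a prior result feeds a later step
]

for op, src1, src2, dest in cards:
    store[dest] = operations[op](store[src1], store[src2])   # the mill acts, the store remembers

print(store)   # {'V1': 5, 'V2': 7, 'V3': 12, 'V4': 60}
```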

In all this he had an intellectual companion now in Ada, first his acolyte and then his muse. She married a sensible and promising aristocrat, William King, her senior by a decade and a favorite of her mother. In the space of a few years he was elevated to the peerage as earl of Lovelace— making Ada, therefore, a countess—and, still in her early twenties, she bore three children. She managed their homes, in Surrey and London, practiced the harp for hours daily (“I am at present a condemned slave to my harp, no easy Task master”), danced at balls, met the new queen, Victoria, and sat for her portrait, self-consciously (“I conclude [the artist] is bent on displaying the whole expanse of my capacious jaw bone, upon which I think the word Mathematics should be written”). She suffered terrible dark moods and bouts of illness, including cholera. Her interests and behavior still set her apart. One morning she went alone in her carriage, dressed plainly, to see a model of Edward Davy’s “electrical telegraph” at Exeter Hall

& the only other person was a middle-aged gentleman who chose to behave as if I were the show [she wrote to her mother] which of course I thought was the most impudent and unpardonable.—I am sure he took me for a very young (& I suppose he thought rather handsome) governess. . . . He stopped as long as I did, & then followed me out.— I took care to look as aristocratic & as like a Countess as possible. . . . I must try & add a little age to my appearance. . . . I would go & see something everyday & I am sure London would never be exhausted.

Lady Lovelace adored her husband but reserved much of her mental life for Babbage. She had dreams, waking dreams, of something she could not be and something she could not achieve, except by proxy, through his genius. “I have a peculiar way of learning,” she wrote to him, “& I think it must be a peculiar man to teach me successfully.” Her growing desperation went side by side with a powerful confidence in her untried abilities. “I hope you are bearing me in mind,” she wrote some months later, “I mean my mathematical interests. You know this is the greatest favour any one can do me.—Perhaps, none of us can estimate how great. . . .”

You know I am by nature a bit of a philosopher, & a very great speculator, —so that I look on through a very immeasurable vista, and though I see nothing but vague & cloudy uncertainty in the foreground of our being, yet I fancy I discern a very bright light a good way further on, and this makes me care much less about the cloudiness & indistinctness which is near.—Am I too imaginative for you? I think not.

The mathematician and logician Augustus De Morgan, a friend of Babbage and of Lady Byron, became Ada’s teacher by post. He sent her exercises. She sent him questions and musings and doubts (“I could wish I went on quicker”; “I am sorry to say I am sadly obstinate about the Term at which Convergence begins”; “I have enclosed my Demonstration of my view of the case”; “functional Equations are complete Will-o-the-wisps to me”; “However I try to keep my metaphysical head in order”). Despite her naïveté, or because of it, he recognized a “power of thinking . . . so utterly out of the common way for any beginner, man or woman.” She had rapidly mastered trigonometry and integral and differential calculus, and he told her mother privately that if he had encountered “such power” in a Cambridge student he would have anticipated “an original mathematical investigator, perhaps of first rate eminence.” She was fearless about drilling down to first principles. Where she felt difficulties, real difficulties lay.

One winter she grew obsessed with a fashionable puzzle known as Solitaire, the Rubik’s Cube of its day. Thirty-two pegs were arranged on a board with thirty-three holes, and the rules were simple: Any peg may jump over another immediately adjacent, and the peg jumped over is removed, until no more jumps are possible. The object is to finish with only one peg remaining. “People may try thousands of times, and not succeed in this,” she wrote Babbage excitedly.






I have done it by trying & observation & can now do it at any time, but I want to know if the problem admits of being put into a mathematical Formula, & solved in this manner. . . . There must be a definite principle, a compound I imagine of numerical & geometrical properties, on which the solution depends, & which can be put into symbolic language.

A formal solution to a game—the very idea of such a thing was original. The desire to create a language of symbols, in which the solution could be encoded—this way of thinking was Babbage’s, as she well knew.
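
The closed "mathematical Formula" she asked for is still not known, but the game as she states it is easy to put to a machine: generate every legal jump, remove the jumped peg, and search for a line of play ending with one peg. The sketch below uses a toy one-row board, purely to keep the run instant; the thirty-three-hole cross board would be the same code with a larger set of holes (and would want memoization of positions already tried):

```python
def solve(board, pegs):
    """Depth-first search for a sequence of jumps leaving a single peg.

    board: set of (row, col) holes; pegs: frozenset of occupied holes.
    A peg may jump an adjacent peg into the empty hole beyond; the jumped peg is removed.
    """
    if len(pegs) == 1:
        return []
    for (r, c) in pegs:
        for dr, dc in ((0, 1), (0, -1), (1, 0), (-1, 0)):
            over, land = (r + dr, c + dc), (r + 2 * dr, c + 2 * dc)
            if over in pegs and land in board and land not in pegs:
                rest = solve(board, (pegs - {(r, c), over}) | {land})
                if rest is not None:
                    return [((r, c), land)] + rest
    return None

board = {(0, c) for c in range(4)}            # four holes in a row
pegs = frozenset({(0, 0), (0, 1), (0, 3)})    # three pegs, one hole
print(solve(board, pegs))   # [((0, 0), (0, 2)), ((0, 3), (0, 1))]
```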

She pondered her growing powers of mind. They were not strictly mathematical, as she saw it. She saw mathematics as merely a part of a greater imaginative world. Mathematical transformations reminded her “of certain sprites & fairies one reads of, who are at one’s elbows in one shape now, & the next minute in a form most dissimilar; and uncommonly deceptive, troublesome & tantalizing are the mathematical sprites & fairies sometimes; like the types I have found for them in the world of Fiction.” Imagination—the cherished quality. She mused on it; it was her heritage from her never-present father.

We talk much of Imagination. We talk of the Imagination of Poets, the Imagination of Artists &c; I am inclined to think that in general we don’t know very exactly what we are talking about. . . .

It is that which penetrates into the unseen worlds around us, the worlds of Science. It is that which feels & discovers what is, the real which we see not, which exists not for our senses. Those who have learned to walk on the threshold of the unknown worlds . . . may then with the fair white wings of Imagination hope to soar further into the unexplored amidst which we live.

She began to believe she had a divine mission to fulfill. She used that word, mission. “I have on my mind most strongly the impression that Heaven has allotted me some peculiar intellectual-moral mission to perform.” She had powers. She confided in her mother:

I believe myself to possess a most singular combination of qualities exactly fitted to make me pre-eminently a discoverer of the hidden realities of nature. . . . The belief has been forced upon me, & most slow have I been to admit it even.

She listed her qualities:

Firstly: Owing to some peculiarity in my nervous system, I have perceptions of some things, which no one else has; or at least very few, if any. . . . Some might say an intuitive perception of hidden things;—that is of things hidden from eyes, ears & the ordinary senses. . . .

Secondly;—my immense reasoning faculties;

Thirdly; . . . the power not only of throwing my whole energy & existence into whatever I choose, but also bring to bear on any one subject or idea, a vast apparatus from all sorts of apparently irrelevant & extraneous sources. I can throw rays from every quarter of the universe into one vast focus.

She admitted that this sounded mad but insisted she was being logical and cool. She knew her life’s course now, she told her mother. “What a mountain I have to climb! It is enough to frighten anyone who had not all that most insatiable & restless energy, which from my babyhood has been the plague of your life & my own. However it has found food I believe at last.” She had found it in the Analytical Engine.

——

Babbage meanwhile, restless and omnivorous, was diverting his energies to another burgeoning technology, steam’s most powerful expression, the railroad. The newly formed Great Western Railway was laying down track and preparing trial runs of locomotive engines from Bristol to London under the supervision of Isambard Kingdom Brunel, the brilliant engineer, then just twenty-seven years old. Brunel asked Babbage for help, and Babbage decided to begin with an information-gathering program— characteristically ingenious and grandiose. He outfitted an entire railway carriage. On a specially built, independently suspended table, rollers unwound sheets of paper a thousand feet long, while pens drew lines to “express” (as Babbage put it) measurements of the vibrations and forces felt by the carriage in every direction. A chronometer marked the passage of time in half seconds. He covered two miles of paper this way.

As he traversed the rails, he realized that a peculiar danger of steam locomotion lay in its outracing every previous means of communication. Trains lost track of one another. Until the most regular and disciplined scheduling was imposed, hazard ran with every movement. One Sunday Babbage and Brunel, operating in different engines, barely avoided smashing into each other. Other people, too, worried about this new gap between the speeds of travel and messaging. An important London banker told Babbage he disapproved: “It will enable our clerks to plunder us, and then be off to Liverpool on their way to America at the rate of twenty miles an hour.” Babbage could only express the hope that science might yet find a remedy for the problem it had created. (“Possibly we might send lightning to outstrip the culprit.”)

As for his own engine—the one that would travel nowhere—he had found a fine new metaphor. It would be, he said, “a locomotive that lays down its own railway.”

Bitter as he was about England’s waning interest in his visionary plans, Babbage found admirers on the continent, particularly in Italy—“the country of Archimedes and Galileo,” as he put it to his new friends. In the summer of 1840 he gathered up his sheaves of drawings and journeyed by way of Paris and Lyon, where he watched the great Jacquard loom at Manufacture d’Étoffes pour Ameublements et Ornements d’Église, to Turin, the capital of Sardinia, for an assembly of mathematicians and engineers. There he made his first (and last) public presentation of the Analytical Engine. “The discovery of the Analytical Engine is so much in advance of my own country, and I fear even of the age,” he said. He met the Sardinian king, Charles Albert, and, more significantly, an ambitious young mathematician named Luigi Menabrea. Later Menabrea was to become a general, a diplomat, and the prime minister of Italy; now he prepared a scientific report, “Notions sur la machine analytique,” to introduce Babbage’s plan to a broader community of European philosophers.

As soon as this reached Ada Lovelace, she began translating it into English, correcting errors on the basis of her own knowledge. She did that on her own, without telling either Menabrea or Babbage.

When she finally did show Babbage her draft, in 1843, he responded enthusiastically, urging her to write on her own behalf, and their extraordinary collaboration began in earnest. They sent letters by messenger back and forth across London at a ferocious pace—“My Dear Babbage” and “My Dear Lady Lovelace”—and met whenever they could at her home in St. James’s Square. The pace was almost frantic. Though he was the eminence, fifty-one years old to her twenty-seven, she took charge, mixing stern command with banter. “I want you to answer me the following question by return of post”; “Be kind enough to write this out properly for me”; “You were a little harum-scarum and inaccurate”; “I wish you were as accurate and as much to be relied on as myself.” She proposed to sign her work with her initials—nothing so forward as her name—not to “proclaim who has written it,” merely to “individualize and identify it with other productions of the said A.A.L.”

Her exposition took the form of notes lettered A through G, extending to nearly three times the length of Menabrea’s essay. They offered a vision of the future more general and more prescient than any expressed by Babbage himself. How general? The engine did not just calculate; it performed operations, she said, defining an operation as “any process which alters the mutual relation of two or more things,” and declaring: “This is the most general definition, and would include all subjects in the universe.” The science of operations, as she conceived it,

is a science of itself, and has its own abstract truth and value; just as logic has its own peculiar truth and value, independently of the subjects to which we may apply its reasonings and processes. . . . One main reason why the separate nature of the science of operations has been little felt, and in general little dwelt on, is the shifting meaning of many of the symbols used.

Symbols and meaning: she was emphatically not speaking of mathematics alone. The engine “might act upon other things besides number.” Babbage had inscribed numerals on those thousands of dials, but their working could represent symbols more abstractly. The engine might process any meaningful relationships. It might manipulate language. It might create music. “Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.”

It had been an engine of numbers; now it became an engine of information. A.A.L. perceived that more distinctly and more imaginatively than Babbage himself. She explained his prospective, notional, virtual creation as though it already existed:

The Analytical Engine does not occupy common ground with mere “calculating machines.” It holds a position wholly its own. . . . A new, a vast, and a powerful language is developed . . . in which to wield its truths so that these may become of more speedy and accurate practical application for the purposes of mankind than the means hitherto in our possession have rendered possible. Thus not only the mental and the material, but the theoretical and the practical in the mathematical world, are brought into more intimate and effective connexion with each other.

. . . We may say most aptly, that the Analytical Engine weaves algebraical patterns just as the Jacquard-loom weaves flowers and leaves.

For this flight of fancy she took full responsibility. “Whether the inventor of this engine had any such views in his mind while working out the invention, or whether he may subsequently ever have regarded it under this phase, we do not know; but it is one that forcibly occurred to ourselves.”

She proceeded from the poetic to the practical. She set forth on a virtuoso excursion through a hypothetical program by which this hypothetical machine might compute a famously deep-seated infinite series, the Bernoulli numbers. These numbers arise in the summing of numbers from 1 to n raised to integral powers, and they occur in various guises all through number theory. No direct formula generates them, but they can be worked out methodically, by expanding certain formulas further and further and looking at the coefficients each time. She began with examples; the simplest, she wrote, would be the expansion of one formula [not reproduced in this copy], and another approach would be via a second [likewise not reproduced], but she would take a more challenging path, because “our object is not simplicity . . . but the illustration of the powers of the engine.”
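
The expansions she printed are not preserved in this copy, but the standard modern formulas convey the method she describes: write down a power series, expand it term by term, and read the Bernoulli numbers off the coefficients. These are the textbook versions, not necessarily the exact expressions of Note G.

```latex
% Bernoulli numbers as coefficients of a generating function:
\frac{x}{e^{x}-1} \;=\; \sum_{n=0}^{\infty} B_n \,\frac{x^{n}}{n!}
                  \;=\; 1 - \frac{x}{2} + \frac{x^{2}}{12} - \frac{x^{4}}{720} + \cdots

% The same numbers govern the sums of integral powers (Faulhaber's formula):
\sum_{k=1}^{n-1} k^{\,p} \;=\; \frac{1}{p+1}\sum_{j=0}^{p} \binom{p+1}{j} B_j\, n^{\,p+1-j}
```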

She devised a process, a set of rules, a sequence of operations. In another century this would be called an algorithm, later a computer program, but for now the concept demanded painstaking explanation. The trickiest point was that her algorithm was recursive. It ran in a loop. The result of one iteration became food for the next. Babbage had alluded to this approach as “the Engine eating its own tail.” A.A.L. explained: “We easily perceive that since every successive function is arranged in a series following the same law, there would be a cycle of a cycle of a cycle, &c. . . . The question is so exceedingly complicated, that perhaps few persons can be expected to follow. . . . Still it is a very important case as regards the engine, and suggests ideas peculiar to itself, which we should regret to pass wholly without allusion.”
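
Her table of operations does not translate line for line into modern notation, but the recursive character she describes—each Bernoulli number computed from all the ones before it, the result of one pass feeding the next—can be sketched in a few lines of present-day code. This is a minimal illustration of the idea, not a reconstruction of her actual program; the recurrence is the standard one that follows from the generating function above.

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(count):
    """Compute B_0 .. B_{count-1} exactly. Each new number is built from all of
    its predecessors -- the loop within a loop that A.A.L. called a "cycle of a
    cycle": sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1."""
    B = [Fraction(1)]                      # B_0
    for m in range(1, count):
        acc = Fraction(0)
        for j in range(m):                 # every earlier result feeds this one
            acc += comb(m + 1, j) * B[j]
        B.append(-acc / (m + 1))
    return B

for n, b in enumerate(bernoulli_numbers(8)):
    print(n, b)    # 0 1, 1 -1/2, 2 1/6, 3 0, 4 -1/30, 5 0, 6 1/42, 7 0
```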

A core idea was the entity she and Babbage called the variable. Variables were, in hardware terms, the machine’s columns of number dials. But there were “Variable cards,” too. In software terms they were a sort of receptacle or envelope, capable of representing, or storing, a number of many decimal digits. (“What is there in a name?” Babbage wrote. “It is merely an empty basket until you put something in it.”) Variables were the machine’s units of information. This was quite distinct from the algebraic variable. As A.A.L. explained, “The origin of this appellation is, that the values on the columns are destined to change, that is to vary, in every conceivable manner.” Numbers traveled, in effect, from variable cards to variables, from variables to the mill (for operations), from the mill to the store. To solve the problem of generating Bernoulli numbers, she choreographed an intricate dance. She worked days and sometimes through the night, messaging Babbage across London, struggling with sickness and ominous pains, her mind soaring:
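
The division of labor she describes—numbers held in the columns of the store, fetched into the mill for an operation, the result returned to a column—maps directly onto the later separation of memory from arithmetic. A toy sketch of that dataflow, with hypothetical function names invented here for illustration (the engine itself was never built):

```python
# A toy model of the store (numbered variable columns) and the mill
# (where one operation is performed at a time).
store = {}                      # variable number -> value currently on that column

def put(var, value):            # a "Variable card" supplying a number to a column
    store[var] = value

def mill(op, a, b, result):     # fetch two variables, operate, return the result
    x, y = store[a], store[b]
    store[result] = {"add": x + y, "sub": x - y,
                     "mul": x * y, "div": x / y}[op]

put(1, 5)                       # V1 <- 5
put(2, 7)                       # V2 <- 7
mill("mul", 1, 2, 3)            # V3 <- V1 * V2
print(store[3])                 # 35
```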

That brain of mine is something more than merely mortal; as time will show; (if only my breathing & some other et-ceteras do not make too rapid a progress towards instead of from mortality).

Before ten years are over, the Devil’s in it if I have not sucked out some of the life-blood from the mysteries of this universe, in a way that no purely mortal lips or brains could do.

No one knows what almost awful energy & power lie yet undevelopped in that wiry little system of mine. I say awful, because you may imagine what it might be under certain circumstances. . . .

I am doggedly attacking & sifting to the very bottom, all the ways of deducing the Bernoulli Numbers. . . . I am grappling with this subject, & connecting it with others.

She was programming the machine. She programmed it in her mind, because the machine did not exist. The complexities she encountered for the first time became familiar to programmers of the next century:

How multifarious and how mutually complicated are the considerations which the working of such an engine involve. There are frequently several distinct sets of effects going on simultaneously; all in a manner independent of each other, and yet to a greater or less degree exercising a mutual influence. To adjust each to every other, and indeed even to perceive and trace them out with perfect correctness and success, entails difficulties whose nature partakes to a certain extent of those involved in every question where conditions are very numerous and inter-complicated.

She reported her feelings to Babbage: “I am in much dismay at having got into so amazing a quagmire & botheration.” And nine days later: “I find that my plans & ideas keep gaining in clearness, & assuming more of the crystalline & less & less of the nebulous form.” She knew she had achieved something utterly new. Ten days later still, struggling over the final proofs with “Mr Taylors Printing Office” in Fleet Street, she declared: “I do not think you possess half my forethought, & power of foreseeing all possible contingencies (probable & improbable, just alike).— . . . I do not believe that my father was (or ever could have been) such a Poet as I shall be an Analyst; (& Metaphysician); for with me the two go together indissolubly.”

Who would have used this machine? Not clerks or shopkeepers, said Babbage’s son, many years later. Common arithmetic was never the purpose—“It would be like using the steam hammer to crush the nut.” He paraphrased Leibniz: “It is not made for those who sell vegetables or little fishes, but for observatories, or the private rooms of calculators, or for others who can easily bear the expense, and need a good deal of calculation.” Babbage’s engine had not been well understood, not by his government and not by the many friends who passed through his salon, but in its time its influence traveled far.

In America, a country bursting with invention and scientific optimism, Edgar Allan Poe wrote, “What shall we think of the calculating machine of Mr. Babbage? What shall we think of an engine of wood and metal which can . . . render the exactitude of its operations mathematically certain through its power of correcting its possible errors?” Ralph Waldo Emerson had met Babbage in London and declared in 1870, “Steam is an apt scholar and a strong-shouldered fellow, but it has not yet done all its work.”

It already walks about the field like a man, and will do anything required of it. It irrigates crops, and drags away a mountain. It must sew our shirts, it must drive our gigs; taught by Mr. Babbage, it must calculate interest and logarithms. . . . It is yet coming to render many higher services of a mechanico-intellectual kind.

Its wonders met disapproval, too. Some critics feared a rivalry between mechanism and mind. “What a satire is that machine on the mere mathematician!” said Oliver Wendell Holmes Sr. “A Frankenstein-monster, a thing without brains and without heart, too stupid to make a blunder; which turns out results like a corn-sheller, and never grows any wiser or better, though it grind a thousand bushels of them!” They all spoke as though the engine were real, but it never was. It remained poised before its own future.

Midway between his time and ours, the Dictionary of National Biography granted Charles Babbage a brief entry—almost entirely devoid of relevance or consequence:

mathematician and scientific mechanician; . . . obtained government grant for making a calculating machine . . . but the work of construction ceased, owing to disagreements with the engineer; offered the government an improved design, which was refused on grounds of expense; . . . Lucasian professor of mathematics, Cambridge, but delivered no lectures.

Babbage’s interests, straying so far from mathematics, seeming so miscellaneous, did possess a common thread that neither he nor his contemporaries could perceive. His obsessions belonged to no category—that is, no category yet existing. His true subject was information: messaging, encoding, processing.

He took up two quirky and apparently unphilosophical challenges, which he himself noted had a deep connection one to the other: picking locks and deciphering codes. Deciphering, he said, was “one of the most fascinating of arts, and I fear I have wasted upon it more time than it deserves.” To rationalize the process, he set out to perform a “complete analysis” of the English language. He created sets of special dictionaries: lists of the words of one letter, two letters, three letters, and so on; and lists of words alphabetized by their initial letter, second letter, third letter, and so on. With these at hand he designed methodologies for solving anagram puzzles and word squares.
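
His word lists amount to inverted indexes keyed on length and on letter position. A brief sketch of the same apparatus built from a modern word list; the file name is only a placeholder, and the lookup shown is an invented example.

```python
from collections import defaultdict

# Babbage-style cryptologist's dictionaries: words grouped by length, and
# words grouped by the letter found at each position (1st, 2nd, 3rd, ...).
by_length = defaultdict(set)
by_position = defaultdict(set)          # key: (position, letter)

with open("words.txt") as f:            # stand-in for whatever word list is at hand
    for word in (line.strip().lower() for line in f):
        by_length[len(word)].add(word)
        for i, letter in enumerate(word):
            by_position[(i, letter)].add(word)

# Candidates for a five-letter word whose second letter is known to be "a":
candidates = by_length[5] & by_position[(1, "a")]
```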

In tree rings he saw nature encoding messages about the past. A profound lesson: that a tree records a whole complex of information in its solid substance. “Every shower that falls, every change of temperature that occurs, and every wind that blows, leaves on the vegetable world the traces of its passage; slight, indeed, and imperceptible, perhaps, to us, but not the less permanently recorded in the depths of those woody fabrics.”

In London workshops he had observed speaking tubes, made of tin, “by which the directions of the superintendent are instantly conveyed to the remotest parts.” He classified this technology as a contribution to the “economy of time” and suggested that no one had yet discovered a limit on the distance over which spoken messages might travel. He made a quick calculation: “Admitting it to be possible between London and Liverpool, about seventeen minutes would elapse before the words spoken at one end would reach the other extremity of the pipe.” In the 1820s he had an idea for transmitting written messages, “enclosed in small cylinders along wires suspended from posts, and from towers, or from church steeples,” and he built a working model in his London house. He grew obsessed with other variations on the theme of sending messages over the greatest possible distances. The post bag dispatched nightly from Bristol, he noted, weighed less than one hundred pounds. To send these messages 120 miles, “a coach and apparatus, weighing above thirty hundred weight, are put in motion, and also conveyed over the same space.” What a waste! Suppose, instead, he suggested, post towns were linked by a series of high pillars erected every hundred feet or so. Steel wires would stretch from pillar to pillar. Within cities, church steeples might serve as the pillars. Tin cases with wheels would roll along the wires and carry batches of letters. The expense would be “comparatively trifling,” he said, “nor is it impossible that the stretched wire might itself be available for a species of telegraphic communication yet more rapid.”
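
His “quick calculation” for the speaking tube is simply the travel time of sound over the length of the pipe. Taking the London–Liverpool road distance at roughly 200 miles (the exact figure he assumed is not recorded here) and sound at a little over 1,100 feet per second, about 12.5 miles per minute, the arithmetic lands close to his estimate:

```latex
t \;\approx\; \frac{200\ \text{miles}}{12.5\ \text{miles/minute}} \;=\; 16\ \text{minutes} \;\approx\; 17\ \text{minutes}
```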

During the Great Exhibition of 1851, when England showcased its industrial achievement in a Crystal Palace, Babbage placed an oil lamp with a moveable shutter in an upstairs window at Dorset Street to create an “occulting light” apparatus that blinked coded signals to passersby. He drew up a standardized system for lighthouses to use in sending numerical signals and posted twelve copies to, as he said, “the proper authorities of the great maritime countries.” In the United States, the Congress approved $5,000 for a trial program of Babbage’s system. He studied sun signals and “zenith-light signals” flashed by mirrors, and Greenwich time signals for transmission to mariners. For communicating between stranded ships and rescuers on shore, he proposed that all nations adopt a standard list of a hundred questions and answers, assigned numbers, “to be printed on cards, and nailed up on several parts of every vessel.” Similar signals, he suggested, could help the military, the police, the railways, or even, “for various social purposes,” neighbors in the country.

These purposes were far from obvious. “For what purposes will the electric telegraph become useful?” the king of Sardinia, Charles Albert, asked Babbage in 1840. Babbage searched his mind for an illustration, “and at last I pointed out the probability that, by means of the electric telegraphs, his Majesty’s fleet might receive warning of coming storms. . . .”

This led to a new theory of storms, about which the king was very curious. By degrees I endeavoured to make it clear. I cited, as an illustration, a storm which had occurred but a short time before I left England. The damage done by it at Liverpool was very great, and at Glasgow immense. . . . I added that if there had been electric communication between Genoa and a few other places the people of Glasgow might have had information of one of those storms twenty-four hours previously to its arrival.

As for the engine, it had to be forgotten before it was remembered. It had no obvious progeny. It rematerialized like buried treasure and inspired a sense of puzzled wonder. With the computer era in full swing, the historian Jenny Uglow felt in Babbage’s engines “a different sense of anachronism.” Such failed inventions, she wrote, contain “ideas that lie like yellowing blueprints in dark cupboards, to be stumbled on afresh by later generations.”

Meant first to generate number tables, the engine in its modern form instead rendered number tables obsolete. Did Babbage anticipate that? He did wonder how the future would make use of his vision. He guessed that a half century would pass before anyone would try again to create a general-purpose computing machine. In fact, it took most of a century for the necessary substrate of technology to be laid down. “If, unwarned by my example,” he wrote in 1864, “any man shall undertake and shall succeed in really constructing an engine embodying in itself the whole of the executive department of mathematical analysis upon different principles or by simpler mechanical means, I have no fear of leaving my reputation in his charge, for he alone will be fully able to appreciate the nature of my efforts and the value of their results.”

As he looked to the future, he saw a special role for one truth above all: “the maxim, that knowledge is power.” He understood that literally. Knowledge “is itself the generator of physical force,” he declared. Science gave the world steam, and soon, he suspected, would turn to the less tangible power of electricity: “Already it has nearly chained the ethereal fluid.” And he looked further:

It is the science of calculation—which becomes continually more necessary at each step of our progress, and which must ultimately govern the whole of the applications of science to the arts of life.

Some years before his death, he told a friend that he would gladly give up whatever time he had left, if only he could be allowed to live for three days, five centuries in the future.

As for his young friend Ada, countess of Lovelace, she died many years before him—a protracted, torturous death from cancer of the womb, her agony barely lessened by laudanum and cannabis. For a long time her family kept from her the truth of her illness. In the end she knew she was dying. “They say that ‘coming events cast their shadows before,’ ” she wrote to her mother. “May they not sometimes cast their lights before?” They buried her next to her father.

She, too, had a last dream of the future: “my being in time an Autocrat, in my own way.” She would have regiments, marshaled before her. The iron rulers of the earth would have to give way. And of what would her regiments consist? “I do not at present divulge. I have however the hope that they will be most harmoniously disciplined troops;—consisting of vast numbers, & marching in irresistible power to the sound of Music. Is not this very mysterious? Certainly my troops must consist of numbers, or they can have no existence at all. . . . But then, what are these Numbers? There is a riddle—”




Leibniz dreamed grandly of mechanizing algebra and even reason itself. “We may give final praise to the machine,” he wrote. “It will be desirable to all who are engaged in computations . . . the managers of financial affairs, the administrators of others’ estates, merchants, surveyors, geographers, navigators, astronomers. . . . For it is unworthy of excellent men to lose hours like slaves in the labor of calculation.”




Another guest, Charles Dickens, put something of Babbage into the character of Daniel Doyce in Little Dorrit. Doyce is an inventor mistreated by the government he tries to serve: “He is well known as a very ingenious man. . . . He perfects an invention (involving a very curious secret process) of great importance to his country and his fellow-creatures. I won’t say how much money it cost him, or how many years of his life he had been about it, but he brought it to perfection.” Dickens added: “A composed and unobtrusive self-sustainment was noticeable in Daniel Doyce—a calm knowledge that what was true must remain true.”


Chapter Five

A Nervous System for the Earth

(What Can One Expect of a Few Wretched Wires?)

Is it a fact—or have I dreamt it—that, by means of electricity, the world of matter has become a great nerve, vibrating thousands of miles in a breathless point of time? Rather, the round globe is a vast head, a brain, instinct with intelligence! Or, shall we say, it is itself a thought, nothing but thought, and no longer the substance which we deemed it!

—Nathaniel Hawthorne (1851)

THREE CLERKS IN A SMALL ROOM UPSTAIRS in the Ferry House of Jersey City handled the entire telegraph traffic of the city of New York in 1846 and did not have to work very hard. They administered one end of a single pair of wires leading to Baltimore and Washington. Incoming messages were written down by hand, relayed by ferry across the Hudson River to the Liberty Street pier, and delivered to the first office of the Magnetic Telegraph Company at 16 Wall Street.

In London, where the river caused less difficulty, capitalists formed the Electric Telegraph Company and began to lay their first copper wires, twisted into cables, covered with gutta-percha, and drawn through iron pipes, mainly alongside new railroad tracks. To house the central office the company rented Founders’ Hall, Lothbury, opposite the Bank of England, and advertised its presence by installing an electric clock—modern and apt, for already railroad time was telegraphic time. By 1849 the telegraph office boasted eight instruments, operated day and night. Four hundred battery cells provided the power. “We see before us a stuccoed wall, ornamented with an electric illuminated clock,” reported Andrew Wynter, a journalist, in 1854. “Who would think that behind this narrow forehead lay the great brain—if we may so term it—of the nervous system of Britain?” He was neither the first nor the last to liken the electric telegraph to biological wiring: comparing cables to nerves; the nation, or the whole earth, to the human body.

The analogy linked one perplexing phenomenon with another. Electricity was an enigma wrapped in mystery verging on magic, and no one understood nerves, either. Nerves were at least known to conduct a form of electricity and thus, perhaps, to serve as conduits for the brain’s control of the body. Anatomists examining nerve fibers wondered whether they might be insulated with the body’s own version of gutta-percha. Maybe nerves were not just like wires; maybe they were wires, carrying messages from the nether regions to the sensorium. Alfred Smee, in his 1849 Elements of Electro-Biology, likened the brain to a battery and the nerves to “bio-telegraphs.” Like any overused metaphor, this one soon grew ripe for satire. A newspaper reporter in Menlo Park, discovering Thomas A. Edison in the grip of a head cold, wrote: “The doctor came and looked at him, explained the relations of the trigeminal nerves and their analogy to an electric telegraph with three wires, and observed incidentally that in facial neuralgia each tooth might be regarded as a telegraph station with an operator.” When the telephone arrived, it reinforced the analogy. “The time is close at hand,” declared Scientific American in 1880, “when the scattered members of civilized communities will be as closely united, so far as instant telephonic communication is concerned, as the various members of the body now are by the nervous system.” Considering how speculative the analogy was, it turned out well. Nerves really do transmit messages, and the telegraph and telephone did begin to turn human society, for the first time, into something like a coherent organism.

In their earliest days these inventions inspired exhilaration without precedent in the annals of technology. The excitement passed from place to place in daily newspapers and monthly magazines and, more to the point, along the wires themselves. A new sense of futurity arose: a sense that the world was in a state of change, that life for one’s children and grandchildren would be very different, all because of this force and its uses. “Electricity is the poetry of science,” an American historian declared in 1852.

Not that anyone knew what electricity was. “An invisible, intangible, imponderable agent,” said one authority. Everyone agreed that it involved a “peculiar condition” either of molecules or of the ether (itself a nebulous, and ultimately doomed, conception). Thomas Browne, in the seventeenth century, described electrical effluvia as “threads of syrup, which elongate and contract.” In the eighteenth, the kite-flying Benjamin Franklin proved “the sameness of lightning with electricity”— identifying those fearsome bolts from the sky with the odd terrestrial sparks and currents. Franklin followed the Abbé Jean-Antoine Nollet, a natural philosopher and a bit of a showman, who said in 1748, “Electricity in our hands is the same as thunder in the hands of nature” and to prove it organized an experiment employing a Leyden jar and iron wire to send a shock through two hundred Carthusian monks arranged in a circle one mile around. From the monks’ almost simultaneous hops, starts, jerks, and cries, onlookers judged that the message—its information content small but not zero—sped round the circle at fantastic speed.

Later, it was Michael Faraday in England who did more than anyone to turn electricity from magic to science, but even so, in 1854, when Faraday was at the height of his investigations, Dionysius Lardner, the scientific writer who so admired Babbage, could quite accurately declare, “The World of Science is not agreed as to the physical character of Electricity.” Some believed it to be a fluid “lighter and more subtle” than any gas; others suspected a compound of two fluids “having antagonistic properties”; and still others thought electricity was not a fluid at all, but something analogous to sound: “a series of undulations or vibrations.” Harper’s Magazine warned that “current” was just a metaphor and added mysteriously, “We are not to conceive of the electricity as carrying the message that we write, but rather as enabling the operator at the other end of the line to write a similar one.”

Whatever its nature, electricity was appreciated as a natural force placed under human control. A young New York newspaper, The Times, explained it by way of contrast with steam:

Both of them are powerful and even formidable agents wrested from nature, by the skill and power of man. But electricity is by far the subtlest energy of the two. It is an original natural element, while steam is an artificial production. . . . Electricity combined with magnetism, is a more subjective agent, and when evolved for transmission is ready to go forth, a safe and expeditious messenger to the ends of the habitable globe.

Looking back, rhapsodists found the modern age foretold in a verse from the book of Job: “Canst thou send lightnings, that they may go and say unto thee, Here we are?”

But lightning did not say anything—it dazzled, cracked, and burned, but to convey a message would require some ingenuity. In human hands, electricity could hardly accomplish anything, at first. It could not make a light brighter than a spark. It was silent. But it could be sent along wires to great distances—this was discovered early—and it seemed to turn wires into faint magnets. Those wires could be long: no one had found any limit to the range of the electric current. It took no time at all to see what this meant for the ancient dream of long-distance communication. It meant sympathetic needles.

Practical problems had to be solved: making wires, insulating them, storing currents, measuring them. A whole realm of engineering had to be invented. Apart from the engineering was a separate problem: the problem of the message itself. This was more a logic puzzle than a technical one. It was a problem of crossing levels, from kinetics to meaning. What form would the message take? How would the telegraph convert this fluid into words? By virtue of magnetism, the influence propagated across a distance could perform work upon physical objects, such as needles, or iron filings, or even small levers. People had different ideas: the electromagnet might sound an alarum-bell; might govern the motion of wheel-work; might turn a handle, which might carry a pencil (but nineteenth-century engineering was not up to robotic handwriting). Or the current might discharge a cannon. Imagine discharging a cannon by sending a signal from miles away! Would-be inventors naturally looked to previous communications technologies, but the precedents were mostly the wrong sort.

Before there were electric telegraphs, there were just telegraphs: les télégraphes, invented and named by Claude Chappe in France during the Revolution. They were optical; a “telegraph” was a tower for sending signals to other towers in line of sight. The task was to devise a signaling system more efficient and flexible than, say, bonfires. Working with his messaging partner, his brother Ignace, Claude tried out a series of different schemes, evolving over a period of years.

The first was peculiar and ingenious. The Chappe brothers set a pair of pendulum clocks to beat in synchrony, each with its pointer turning around a dial at relatively high speed. They experimented with this in their hometown, Brûlon, about one hundred miles west of Paris. Ignace, the sender, would wait till the pointer reached an agreed number and at that instant signal by ringing a bell or firing a gun or, more often, banging upon a casserole. Upon hearing the sound, Claude, stationed a quarter mile away, would read the appropriate number off his own clock. He could convert number to words by looking them up in a prearranged list. This notion of communication via synchronized clocks reappeared in the twentieth century, in physicists’ thought experiments and in electronic devices, but in 1791 it led nowhere. One drawback was that the two stations had to be linked both by sight and by sound—and if they were, the clocks had little to add. Another was the problem of getting the clocks synchronized in the first place and keeping them synchronized. Ultimately, fast long-distance messaging was what made synchronization possible—not the reverse. The scheme collapsed under the weight of its own cleverness.
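
Stripped of casseroles and gunfire, the scheme is a lookup protocol: both parties share a clock and a numbered word list, and the only thing actually transmitted is the instant of the bang. A toy sketch of what the two stations were doing, with codebook entries and timings invented here purely for illustration:

```python
# Toy model of the Chappes' synchronized-clock telegraph. Both stations share
# a codebook and clocks that sweep through positions 0..59 in step.
codebook = {17: "assemblée", 42: "récompense", 55: "utile"}

def pointer_position(t_seconds, period=60):
    """Where the dial's pointer stands at time t (one revolution per minute)."""
    return int(t_seconds) % period

def send(word, start_time=0):
    """Return the moment at which to make a noise so the word is conveyed."""
    target = next(num for num, w in codebook.items() if w == word)
    t = start_time
    while pointer_position(t) != target:
        t += 1
    return t                          # the bang itself carries the information

def receive(bang_time):
    """The receiver reads the number off his own clock and looks it up."""
    return codebook[pointer_position(bang_time)]

t = send("récompense")
print(receive(t))                     # récompense
```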

Meanwhile the Chappes managed to draw more of their brothers, Pierre and René, into the project, with a corps of municipal officers and royal notaries to bear witness. The next attempt dispensed with clockwork and sound. The Chappes constructed a large wooden frame with five sliding shutters, to be raised and lowered with pulleys. By using each possible combination, this “telegraph” could transmit an alphabet of thirty-two symbols—2⁵, another binary code—though the details do not survive. Claude was pleading for money from the newly formed Legislative Assembly, so he tried this hopeful message from Brûlon: “L’Assemblée nationale récompensera les expériences utiles au public” (“The National Assembly will reward experiments useful to the public”). The eight words took 6 minutes, 20 seconds to transmit, and they failed to come true.
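
Five shutters, each raised or lowered, give 2⁵ = 32 distinct patterns—enough for an alphabet with a few signals to spare. The Chappes’ actual assignments are lost, so the mapping sketched below is invented purely to show the arithmetic:

```python
from string import ascii_uppercase

# Five binary shutters give 2**5 = 32 patterns. Assign A..Z to the first
# twenty-six and keep the rest spare; the real 1791 code does not survive,
# so this assignment is illustrative only.
def encode(letter):
    n = ascii_uppercase.index(letter.upper())
    return [(n >> i) & 1 for i in range(4, -1, -1)]   # 1 = shutter raised

def decode(shutters):
    n = sum(bit << i for i, bit in zip(range(4, -1, -1), shutters))
    return ascii_uppercase[n]

print(encode("H"))              # [0, 0, 1, 1, 1]
print(decode([0, 0, 1, 1, 1]))  # H
```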

Revolutionary France was both a good and a bad place for modernistic experimentation. When Claude erected a prototype telegraph in the parc Saint-Fargeau, in the northeast of Paris, a suspicious mob burned it to the ground, fearful of secret messaging. Citizen Chappe continued looking for a technology as swift and reliable as that other new device, the guillotine. He designed an apparatus with a great crossbeam supporting two giant arms manipulated by ropes. Like so many early machines, this was somewhat anthropomorphic in form. The arms could take any of seven angles, at 45-degree increments (not eight, because one would leave the arm hidden behind the beam), and the beam, too, could rotate, all under the control of an operator down below, manipulating a system of cranks and pulleys. To perfect this complex mechanism Chappe enlisted Abraham-Louis Breguet, the well-known watchmaker.

As intricate as the control problem was, the question of devising a suitable code proved even more difficult. From a strictly mechanical point of view, the arms and the beam could take any angle at all—the possibilities were infinite—but for efficient signaling Chappe had to limit the possibilities. The fewer meaningful positions, the less likelihood of confusion. He chose only two for the crossbeam, on top of the seven for each arm, giving a symbol space of 98 possible arrangements (7 × 7 × 2). Rather than just use these for letters and numerals, Chappe set out to devise an elaborate code. Certain signals were reserved for error correction and control: start and stop, acknowledgment, delay, conflict (a tower could not send messages in both directions at once), and failure. Others were used in pairs, pointing the operator to pages and line numbers in special code books with more than eight thousand potential entries: words and syllables as well as proper names of people and places. All this remained a carefully guarded secret. After all, the messages were to be broadcast in the sky, for anyone to see. Chappe took it for granted that the telegraph network of which he dreamed would be a department of the state, government owned and operated. He saw it not as an instrument of knowledge or of riches, but as an instrument of power. “The day will come,” he wrote, “when the Government will be able to achieve the grandest idea we can possibly have of power, by using the telegraph system in order to spread directly, every day, every hour, and simultaneously, its influence over the whole republic.”
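
The arithmetic behind the codebooks is worth pausing over. Each complete signal distinguishes 7 × 7 × 2 = 98 cases; once a handful are set aside for control, pairs of the remaining signals—one selecting a page, the other a line—can address several thousand codebook entries, which is how a vocabulary of “more than eight thousand” words, syllables, and names fits through a channel of fewer than a hundred symbols. The exact number of control signals is not given here, so the figure of six below is only an assumption for illustration:

```latex
\underbrace{7}_{\text{left arm}} \times \underbrace{7}_{\text{right arm}} \times \underbrace{2}_{\text{beam}} \;=\; 98 \ \text{signals}
\qquad
(98 - 6)^{2} \;=\; 92^{2} \;=\; 8{,}464 \ \text{addressable codebook entries}
```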




