A Shortcut Through Time: The Path to the Quantum Computer - Hardcover

Johnson, George

ISBN 13: 9780375411939

Synopsis

The first book to prepare us for the next big—perhaps the biggest—breakthrough in the short history of the cyberworld: the development of the quantum computer.

The newest Pentium chip driving personal computers packs 40 million electronic switches onto a piece of silicon the size of a thumbnail. It is dramatically smaller and more powerful than anything that has come before it. If this incredible shrinking act continues, the logical culmination is a computer in which each switch is composed of a single atom. And at that point the miraculous—the actualization of quantum mechanics—becomes real. If atoms can be harnessed, society will be transformed: problems that could take forever to be solved on the supercomputers available today would be dispatched with ease. Quantum computing promises nothing less astonishing than a shortcut through time.

In this book, the award-winning New York Times science writer George Johnson first takes us back to the original idea of a computer—almost simple enough to be made of Tinkertoys—and then leads us through increasing levels of complexity to the soul of this remarkable new machine. He shows us how, in laboratories around the world, the revolution has already begun.

Writing with brilliant clarity, Johnson makes sophisticated material on (and even beyond) the frontiers of science both graspable and utterly fascinating, affording us a front-row seat at one of the most galvanizing scientific dramas of the new century.

About the Author

George Johnson is a science writer for the New York Times. He is a recipient of the Science Journalism Award from the American Association for the Advancement of Science and was a finalist for the distinguished Rhône-Poulenc Prize. This is his fifth book. He lives with his wife in Santa Fe, New Mexico.

From the Back Cover

"George Johnson…sets a new standard in science writing." --The New York Times

"A tantalizing glimpse of how the uncertainties of quantum theory may yet be tamed for work of the highest precision."
--Kirkus

"George Johnson, who writes about science for the New York Times, has set himself the task of deconstructing quantum computing at a level that readers of that newspaper -- and this magazine -- can understand. He has succeeded admirably…One of our most gifted science writers, Johnson is a master at bringing the reader along, giving increasingly better approximations to the truth. The book is lucid, elegant, brief -- and imbued with the excitement of this rapidly evolving field."
--Scientific American

“There’s nothing like quantum weirdness to remind us that this world was not made with us in mind. We are lucky to have a writer like George Johnson to walk us through it. In A Shortcut Through Time, he gives us a clear, funny and very human tour of this impossible science and where it may be taking us next. I read it and I thought, At last, this computes. Terrific book.”
–Jonathan Weiner, Pulitzer Prize-winning author of The Beak of the Finch


Reviews

Johnson has been nominated for several awards for earlier books on physics and physicists (Strange Beauty; Fire in the Mind). Here he sticks mainly to science, providing a quick overview of a cutting-edge union between quantum theory and computing. The book begins by describing a computer as "just a box with a bunch of switches." Although today's computer switches are embedded in circuitry, they can in principle be made of any material, like the early banks of vacuum tubes; Johnson also recalls a tic-tac-toe-playing machine created from Tinkertoys in the 1970s. An ordinary computer switch, binary in nature, registers as either a zero or a one, but if a single atom were harnessed as a switch, its dual nature as both particle and wave means it could be "superpositioned," simultaneously zero and one. A series of such switches could handle complex calculations much more swiftly than conventional computers: an entertaining theory, but impractical. Except that a quantum computer's ability to factor large numbers--determining the smaller numbers by which they are divisible--would have a critical application in cryptography, with a string of atoms used to create (or break) complex codes. After discussing competing projects that aim to make the theory of quantum computing a reality, the book concludes with ruminations on the implications of the projects' possible success. Using "a series of increasingly better cartoons" and plain language, Johnson's slim volume is so straightforward that readers without a technical background will have no problem following his chain of thought. Illus.
Copyright 2003 Reed Business Information, Inc.
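
The factoring problem behind that cryptographic application is easy to state in code. As a minimal sketch (ours, not the book's): classical trial division finds factors in time that grows with the smallest prime factor, which is why a number built from two large primes is classically safe to hide a secret behind, and why Shor's quantum algorithm is so disruptive.

```python
# Minimal sketch: classical factoring by trial division.
# The loop runs up to sqrt(n) in the worst case, so a number that is
# the product of two large primes is effectively unfactorable this way.

def trial_division(n: int) -> list[int]:
    """Return the prime factors of n, smallest first."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(15))         # [3, 5]
print(trial_division(101 * 103))  # [101, 103]
```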

In the 1960s Gordon Moore made the empirical observation that the density of components on a chip was doubling roughly every 18 months. Over the past 40 years, Moore's law has continued to hold. These doublings in chip density explain why today's personal computers are as powerful as those that only governments and large corporations possessed just a couple decades ago. But in 10 to 20 years each transistor will have shrunk to atomic size, and Moore's law, which is based on current silicon technology, is expected to end. This prospect drives the search for entirely new technologies, and one major candidate is a quantum computer--that is, a computer based on the principles of quantum mechanics. There is another motive for studying quantum computers. The functioning of such a device, which lies at the intersection of quantum mechanics, computer science and mathematics, has aroused great intellectual curiosity.

George Johnson, who writes about science for the New York Times, has set himself the task of deconstructing quantum computing at a level that readers of that newspaper--and this magazine--can understand. He has succeeded admirably. He explains the principles of quantum mechanics essential to quantum computing but tells no more than necessary. "We are operating here," he promises, "on a need-to-know basis."

One of the things readers really need to know about is superposition, a key principle of quantum mechanics, and Johnson proceeds to enlighten: "In the tiny spaces inside atoms, the ordinary rules of reality ... no longer hold. Defying all common sense, a single particle can be in two places at the same time. And so, while a switch in a conventional computer can be either on or off, representing 1 or 0, a quantum switch can paradoxically be in both states at the same time, saying 1 and 0.... Therein lies the source of the power." Whereas three ordinary switches could store any one of eight patterns, three quantum switches can hold all eight at once. Thus, a quantum computer could process extraordinary amounts of information, and it could do so with such speed that it essentially takes "a shortcut through time."

In 1982 Richard Feynman conjectured that although simulations of the quantum world (needed for understanding the subatomic aspects of physics and chemistry) could never be done on a classical computer, they might be possible on a computer that worked quantum-mechanically. But interest in quantum computing didn't really take off until 1994, when Peter Shor, a mathematician at Bell Labs, showed that a quantum computer could be programmed to factor huge numbers--fast. There is a reason for the fascination with factoring large integers (breaking the large number into the smaller numbers that can be multiplied together to produce it). "Many of society's secrets, from classified military documents to the credit card numbers sent over the Internet, are protected using codes based on the near-impossibility of factoring large numbers.... Vulnerable codes are as disturbing to nations as vulnerable borders."

Despite such advances as Shor's algorithm and despite the importance for national security, serious impediments stand in the way of building a quantum computer. The superpositions from which quantum computing gets its power are lost when a measurement of the quantum state is made. And because the environment interacting with a quantum computer is akin to taking measurements, this presents a fundamental difficulty.
Another barrier is that although quantum computers with seven quantum bits (qubits) have been built, it is not clear whether the technology used--or any other technology--will scale up to handle enough qubits. Seth Lloyd of the Massachusetts Institute of Technology has estimated, for example, that interesting classically intractable problems from atomic physics can be solved by a quantum computer that has some 100 qubits (although error correction will require many additional qubits). Researchers are exploring a variety of technologies for building a quantum computer, including ion traps, nuclear magnetic resonance (NMR), quantum dots, and cavity quantum electrodynamics (QED). Which of these technologies, or technologies not yet conceived, will win out is not yet known. Of course, the unknown is part of the fun of science.

One of our most gifted science writers, Johnson is a master at bringing the reader along, giving increasingly better approximations to the truth. The book is lucid, elegant, brief--and imbued with the excitement of this rapidly evolving field.
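
The review's central claim, that three quantum switches hold all eight patterns at once while three ordinary switches hold only one, can be made concrete with a small state-vector sketch (ours, not Johnson's or Traub's): n qubits are modeled as a vector of 2**n amplitudes, and a Hadamard gate applied to each qubit puts the register into an equal superposition.

```python
# Illustration: three qubits need a vector of 2**3 = 8 amplitudes.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # single-qubit Hadamard gate

state = np.zeros(8)
state[0] = 1.0                   # all three switches "off": the state |000>

H3 = np.kron(np.kron(H, H), H)   # Hadamard applied to each of the 3 qubits
state = H3 @ state

for i, amp in enumerate(state):
    print(f"|{i:03b}>  amplitude {amp:.4f}")  # all 8 patterns, each 1/sqrt(8)
```

Measuring the register collapses it to a single random pattern, which is exactly the fragility of superposition that the review identifies as the fundamental difficulty.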

Joseph F. Traub is Edwin Howard Armstrong Professor of Computer Science at Columbia University (homepage: www.cs.columbia.edu/~traub). His most recent book is Complexity and Information (Cambridge University Press, 1998).

It's hard to imagine how the newest Pentium chip could pack 40 million electronic switches into a nickel-sized bit of silicon and even harder to imagine what that means for computing. A recipient of the Science Journalism Award, Johnson should make it all clear.
Copyright 2002 Reed Business Information, Inc.

Take computer theory, mix it with quantum physics, and what do you end up with? One of the most confusing fields of study imaginable. For those who already feel confused or overwhelmed by complicated technology and its proliferation, the increasing momentum of this area of research is not good news. For those awaiting the next technological revolution, Johnson's book on quantum computing may be as friendly an introduction as one could find. His stated purpose is to present a basic overview to a general audience, and he does a surprisingly good job, considering the difficulty of the subject matter. He begins by explaining some of the basic concepts of quantum mechanics using simple examples and analogies and then goes on to explain how information theory can be applied to physics at the atomic and molecular levels. He walks the reader through a time line of scientific progress, including practical obstacles and theoretical problems. Although Johnson barely scratches the surface of the subject, his book is a respectable and accessible "crash course" in the emerging field of quantum computing.
--Gavin Quinn
Copyright © American Library Association. All rights reserved

Excerpt. Reprinted by permission. All rights reserved.

Chapter 1

"Simple Electric Brain Machines and How to Make Them"

I don't know where I first saw the advertisement for the Geniac Electric Brain construction kit, but I knew I had to have one for Christmas. It was the early 1960s, and like a lot of science-crazed kids I was obsessed with the wonderfully outrageous idea of "thinking machines." I devoured the picture stories in Life magazine and the Saturday Evening Post about the electronic behemoths manufactured by companies like International Business Machines, Univac, and Remington Rand. The spinning tape drives and banks of blinking lights were as exciting to me as the idea of space travel. Two of my favorite books were Tom Swift and His Giant Robot and Danny Dunn and the Homework Machine--testaments to the eerie fantasy of automating human thought. One day, flipping through one of my favorite magazines--probably Boys' Life or Popular Science--I stumbled upon an unbelievably tantalizing ad.

"Can you think faster than this Machine?"

Below the provocative headline was a picture of the Geniac with its sloping panel bedecked with six large dials and a row of ten bulbs. Who knew what kind of mysterious circuitry was hidden inside?

"GENIAC, the first electrical brain construction kit, is equipped to play tic-tac-toe, cipher and decipher codes, convert from binary to decimal, reason in syllogisms, as well as add, subtract, multiply and divide. . . . You create from over 400 specially designed and manufactured components a machine that solves problems faster than you can express them." Such was the promise of the Oliver Garfield Co., 126 Lexington Avenue, New York 16, N.Y. (This was before zip codes replaced the old numbered postal zones.) To a boy growing up in Albuquerque, the location of this modern Frankenstein laboratory seemed promisingly exotic and far away.

"Send for your GENIAC kit now. Only $19.95. . . . We guarantee that if you do not want to keep GENIAC after two weeks you can return it for full refund plus shipping costs."

There was nothing to lose. I began my lobbying effort, making it clear to my parents that receiving a Geniac was all that mattered to me. Then I waited, my brain charged with the kind of high-voltage anticipation that can only accumulate in someone still in the first decade of life.

Christmas morning I sat on the floor anxiously opening presents, keeping my eye out for one large enough to hold the pieces of an electronic computer. Finally a likely box emerged from behind the tree. I tore off the wrapping.

Decades later I still remember the disappointment I felt as I explored the contents of the cardboard package. The title of the instruction manual was intriguing enough: "Simple Electric Brain Machines and How to Make Them." But how was anyone to carry out such an ambitious project with the meager, humdrum parts that had been supplied?

Digging through the pile, I was crestfallen to find that the bulk of the kit consisted of some decidedly low-tech pieces of particle board called Masonite: a big square one and six smaller round disks, each drilled with concentric patterns of little holes. This was complemented by an assortment of hardware you might find in a kitchen junk drawer or a toolbox in the garage: ten flashlight bulbs and sockets, a battery and battery clamp, a spool of insulated wire, several dozen nuts, bolts, and washers, a bunch of small brass-plated staples (referred to in the typewritten instructions as "jumpers"), and the tools for assembling this detritus into what would supposedly function as a digital computer--a hexagonal wrench for gripping bolts (a "spintite") and a screwdriver.

Finally there was a simple on-off switch, described rather melodramatically in the manual: "This is the switch that enables you to put suspense and drama into your machine; for you set everything the way it should be, then talk about it and explain it, and finally when you have your listener all keyed up and ready, you (or he) throw the switch . . ."

I'd been had. There were no vacuum tubes, no transistors, or capacitors, or resistors--the colorful components I'd found from eviscerating dead radios and TV sets. All I'd gotten for Christmas was a handy-dandy kit for stringing together mindlessly simple circuits of switches and bulbs. The nuts and bolts were to be placed in various holes on the square wooden panel and connected one to the other by wires running underneath. The little metal jumpers were to be inserted into holes in the Masonite disks, the ends bent over to keep them in place. When the disks were attached to the panel, with more bolts and washers, they could be turned to and fro so that the jumpers touched the heads of the bolts, forming connections that caused the lightbulbs to flash on and off. It was all just switches--simple enough for a child to understand.

Reluctantly I opened the manual, published, disconcertingly, in 1955, and saw that it contained the familiar explanations of the wonders of electricity. ("You can think of a battery as a pump, which is able to push electrons, or little marbles of electricity, away from the plus end of the battery and towards the minus end of the battery . . . A flow of electrons is an electric current.")

The instructions went on to show how to assemble circuits and switches into various question-answering machines.

1. Whom do you prefer: (a) Marilyn Monroe? or (b) Liberace?

2. How would you put a thread in a small hole: (a) wet it? or (b) tap it?

3. Would you rather spend a day: (a) shopping on Fifth Avenue? or (b) hunting in the woods?

Depending on how you answered these and three other questions (rotating the six circular switches so that they pointed to A or B), the current in the wiring would flow to one of two bulbs, M or F. The result was called a "Masculine-Feminine Testing Machine."

Never mind the musty Eisenhower-era philosophy. Scientifically, the whole thing seemed obvious and dumb. Just changing the paper labels would turn the machine into a tester for, say, whether you were a Jock or a Brain (the 1960s version of "nerd"): "Would you rather spend a rainy afternoon: (a) building a crystal radio? or (b) working out in the gym?" The meaning was all in the eye of the beholder.

As I paged through the manual, other projects appeared slightly more interesting. The switches could be wired to make machines that added numbers. Turn dial A to indicate the first number, turn dial B to indicate the second, and if the copper paths behind the panel had been correctly platted, the lightbulb that came on would be the very one labeled with the proper answer. (And if you made a mistake and didn't feel like redoing the wires, you could just move around the tags.)

Rig the machine another way and you could subtract or multiply. Wire up the "Reasoning Machine" described on page 25 of the guide, and turning switch A to indicate "All fighter pilots are bomber pilots" and switch B to "No bomber pilots are jet pilots" would light the bulb for "No fighter pilots are jet pilots." QED. A syllogism (and a reminder that something can be logical but not true).

It was all terribly anticlimactic. Only years later would I realize that an utterly profound idea was slowly insinuating itself into my head: a computer is indeed just a box with a bunch of switches. The Geniac was "semiautomatic," as the manual put it: You had to turn the dials by hand to light the lights. And to "reprogram" the machine, you had to unscrew nuts with the "spintite" and shift the wires around. But suppose that some of the bulbs were replaced with little motors. When activated by the proper combination of settings, the motors could turn switches on other Geniacs, which would cause the dials on still other Geniacs to spin.

All kinds of elaborations were possible. Data could be fed into a glorified Geniac not with clunky wooden rotary dials but with cards or paper tape punched with holes arranged in the proper patterns. Read like Braille by metallic fingers, the holes would cause electrical connections to be made. Instead of flashlight bulbs, the circuits could ignite phosphorescent dots on a video screen--which is really just an array of thousands of tiny lights.

In the computer's modern incarnation, mechanical parts have been replaced by millions of microscopic transistors, minuscule switches etched onto the surfaces of silicon chips. Data are stored as invisible magnetic spots on spinning disks. But the basic idea is the same.
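
The point that "the basic idea is the same" is easy to demonstrate. Here is a minimal sketch of our own devising, not from the book, that treats each switch as a boolean and chains one-bit adders out of AND/OR/XOR "wiring" into a Geniac-style adding machine:

```python
# Sketch: a computer really is just a box of switches.
# Each switch is a boolean; logic gates are the "wiring" between them.

def full_adder(a: bool, b: bool, carry_in: bool):
    """One-bit adder built from gate logic: returns (sum, carry_out)."""
    s = a ^ b ^ carry_in
    carry_out = (a and b) or (carry_in and (a ^ b))
    return s, carry_out

def add(x: int, y: int, bits: int = 8) -> int:
    """Ripple-carry adder: full adders chained like Geniacs spinning
    each other's dials."""
    carry, total = False, 0
    for i in range(bits):
        a = bool((x >> i) & 1)          # switch setting: bit i of x
        b = bool((y >> i) & 1)          # switch setting: bit i of y
        s, carry = full_adder(a, b, carry)
        total |= int(s) << i            # light the bulb for this bit
    return total

print(add(5, 7))  # 12 -- the bulb labeled "12" comes on
```

Rewire the gate functions and the same box subtracts or multiplies; shifting those connections around is all that "reprogramming" the Geniac, or any computer, amounts to.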

Only remnants of my Geniac survive. Going through a closet recently, I found a box of old electronic junk. There, sitting atop a partially dissected radio chassis and other relics of the vacuum-tube age, were three of the old dials. Two still had labels--"Candidate A: Popularity," "Candidate A: Campaign Effort"--handwritten in black ink on strips of surgical adhesive tape. I had apparently tried to make a machine to predict election outcomes.

I couldn't find the old manual anywhere. The reason I can quote from it (and the advertisement), reigniting old memories, is because I found the information in minutes on the World Wide Web. (I also found a Geniac kit on eBay, which was auctioned off for more than $400, twenty times its original price.) I'm still astonished by the speed with which data course through the great skein of computers called the Internet. But it is comforting to know that for all its complexity, the Net is just a bunch of Geniacs chattering at each other, twisting each other's dials.

On page 37 of "Simple Electric Brain Machines and How to Make Them" came what was intended to be the climax: instructions for how to make a Geniac that played tic-tac-toe. But looking back years later, it's clear to me that the real meat of the book was an innocent-sounding statement way back on page 3: "The kit, though inexpensive and ...