From Library Journal
It's hard to imagine how the newest Pentium chip could pack 40 million electronic switches into a nickel-sized bit of silicon and even harder to imagine what that means for computing. A recipient of the Science Journalism Award, Johnson should make it all clear. Copyright 2002 Reed Business Information, Inc.
From Scientific American
In the 1960s Gordon Moore made the empirical observation that the density of components on a chip was doubling roughly every 18 months. Over the past 40 years, Moore's law has continued to hold. These doublings in chip density explain why today's personal computers are as powerful as those that only governments and large corporations possessed just a couple of decades ago. But in 10 to 20 years each transistor will have shrunk to atomic size, and Moore's law, which is based on current silicon technology, is expected to end. This prospect drives the search for entirely new technologies, and one major candidate is a quantum computer--that is, a computer based on the principles of quantum mechanics. There is another motive for studying quantum computers. The functioning of such a device, which lies at the intersection of quantum mechanics, computer science and mathematics, has aroused great intellectual curiosity.
George Johnson, who writes about science for the New York Times, has set himself the task of deconstructing quantum computing at a level that readers of that newspaper--and this magazine--can understand. He has succeeded admirably. He explains the principles of quantum mechanics essential to quantum computing but tells no more than necessary. "We are operating here," he promises, "on a need-to-know basis." One of the things readers really need to know about is superposition, a key principle of quantum mechanics, and Johnson proceeds to enlighten: "In the tiny spaces inside atoms, the ordinary rules of reality ... no longer hold. Defying all common sense, a single particle can be in two places at the same time. And so, while a switch in a conventional computer can be either on or off, representing 1 or 0, a quantum switch can paradoxically be in both states at the same time, saying 1 and 0.... Therein lies the source of the power." Whereas three ordinary switches could store any one of eight patterns, three quantum switches can hold all eight at once. Thus, a quantum computer could process extraordinary amounts of information, and it could do so with such speed that it essentially takes "a shortcut through time."
In 1982 Richard Feynman conjectured that although simulations of the quantum world (needed for understanding the subatomic aspects of physics and chemistry) could never be done on a classical computer, they might be possible on a computer that worked quantum-mechanically. But interest in quantum computing didn't really take off until 1994, when Peter Shor, a mathematician at Bell Labs, showed that a quantum computer could be programmed to factor huge numbers--fast. There is a reason for the fascination with factoring large integers (breaking the large number into the smaller numbers that can be multiplied together to produce it). "Many of society's secrets, from classified military documents to the credit card numbers sent over the Internet, are protected using codes based on the near-impossibility of factoring large numbers.... Vulnerable codes are as disturbing to nations as vulnerable borders."
Despite such advances as Shor's algorithm and despite the importance for national security, serious impediments stand in the way of building a quantum computer. The superpositions from which quantum computing gets its power are lost when a measurement of the quantum state is made. And because the environment interacting with a quantum computer is akin to taking measurements, this presents a fundamental difficulty.
Another barrier is that although quantum computers with seven quantum bits (qubits) have been built, it is not clear whether the technology used--or any other technology--will scale up to handle enough qubits. Seth Lloyd of the Massachusetts Institute of Technology has estimated, for example, that interesting classically intractable problems from atomic physics can be solved by a quantum computer that has some 100 qubits (although error correction will require many additional qubits). Researchers are exploring a variety of technologies for building a quantum computer, including ion traps, nuclear magnetic resonance (NMR), quantum dots, and cavity quantum electrodynamics (QED). Which of these technologies, or technologies not yet conceived, will win out is not yet known. Of course, the unknown is part of the fun of science. One of our most gifted science writers, Johnson is a master at bringing the reader along, giving increasingly better approximations to the truth. The book is lucid, elegant, brief--and imbued with the excitement of this rapidly evolving field.
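To put the review's point that three quantum switches can hold all eight patterns at once in concrete terms, here is a minimal sketch of the textbook state-vector picture (illustrative Python, with NumPy assumed available; nothing here is drawn from Johnson's book): an n-qubit register is described by 2^n complex amplitudes, and applying a Hadamard gate to each qubit spreads the state evenly over every bit pattern.

```python
# Minimal state-vector illustration (textbook formalism, not from the book).
# Three classical bits hold exactly one of 8 patterns; the state of three
# qubits is a vector of 2**3 = 8 complex amplitudes over all patterns.
import numpy as np

n_qubits = 3
dim = 2 ** n_qubits                      # 8 basis states: |000> ... |111>

state = np.zeros(dim, dtype=complex)
state[0] = 1.0                           # start in |000>

# One Hadamard per qubit yields a uniform superposition over all patterns.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
full_H = H
for _ in range(n_qubits - 1):
    full_H = np.kron(full_H, H)          # tensor product across the register
state = full_H @ state

for index, amplitude in enumerate(state):
    print(f"|{index:03b}>  amplitude {amplitude.real:+.3f}")
```

The same bookkeeping also shows why classical simulation of a quantum register becomes hopeless quickly: the amplitude vector doubles in length with every added qubit.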
Joseph F. Traub is Edwin Howard Armstrong Professor of Computer Science at Columbia University (homepage: www.cs.columbia.edu/~traub). His most recent book is Complexity and Information (Cambridge University Press, 1998).
From Booklist
Take computer theory, mix it with quantum physics, and what do you end up with? One of the most confusing fields of study imaginable. For those who already feel confused or overwhelmed by complicated technology and its proliferation, the increasing momentum of this area of research is not good news. For those awaiting the next technological revolution, Johnson's book on quantum computing may be as friendly an introduction as one could find. His stated purpose is to present a basic overview to a general audience, and he does a surprisingly good job, considering the difficulty of the subject matter. He begins by explaining some of the basic concepts of quantum mechanics using simple examples and analogies and then goes on to explain how information theory can be applied to physics at the atomic and molecular levels. He walks the reader through a time line of scientific progress, including practical obstacles and theoretical problems. Although Johnson barely scratches the surface of the subject, his book is a respectable and accessible "crash course" in the emerging field of quantum computing. Gavin Quinn
Copyright © American Library Association. All rights reserved
A Shortcut Through Time: The Path to the Quantum Computer
FROM OUR EDITORS
Computers are constantly becoming smaller, faster, and more potent. But the power of even tomorrow's supercomputers could be dramatically outstripped by the development of quantum computers now on the drawing boards. In A Shortcut Through Time, award-winning New York Times science writer George Johnson guides us on a mind-boggling tour of the miraculous world of nanosecond computation.
FROM THE PUBLISHER
The first book to prepare us for the next big—perhaps the biggest—breakthrough in the short history of the cyberworld: the development of the quantum computer.
The newest Pentium chip driving personal computers packs 40 million electronic switches onto a piece of silicon the size of a thumbnail. It is dramatically smaller and more powerful than anything that has come before it. If this incredible shrinking act continues, the logical culmination is a computer in which each switch is composed of a single atom. And at that point the miraculous—the actualization of quantum mechanics—becomes real. If atoms can be harnessed, society will be transformed: problems that could take forever to be solved on the supercomputers available today would be dispatched with ease. Quantum computing promises nothing less astonishing than a shortcut through time.
In this book, the award-winning New York Times science writer George Johnson first takes us back to the original idea of a computer—almost simple enough to be made of Tinkertoys—and then leads us through increasing levels of complexity to the soul of this remarkable new machine. He shows us how, in laboratories around the world, the revolution has already begun.
Writing with a brilliant clarity, Johnson makes sophisticated material on (and even beyond) the frontiers of science both graspable and utterly fascinating, affording us a front-row seat at one of the most galvanizing scientific dramas of the new century.
SYNOPSIS
Answering a colleague's challenge to write a short book explaining the complex science of quantum computing, Johnson offers a clear analysis of the emerging technology for a general audience interested in science. Johnson, a science writer for the New York Times, explains the quantum theory that makes the computer a reality and projects future applications in code breaking, solving previously unsolvable math equations, and understanding the mysterious behavior of certain protein molecules. Annotation (c)2003 Book News, Inc., Portland, OR
FROM THE CRITICS
The New York Times
George Johnson has written a blessedly slim book, A Shortcut Through Time, that gets across the gist of quantum computing with plenty of charm and no tears. Computer science is hard; quantum mechanics is weird. But Johnson, who contributes science articles to The New York Times and is the author of four previous popularizations, explains it all with Tinkertoys and clocks and spinning tops and just a little arithmetic. It's a briskly told story, driven entirely by ideas. — Jim Holt
Publishers Weekly
Johnson has been nominated for several awards for earlier books on physics and physicists (Strange Beauty; Fire in the Mind). Here he sticks mainly to science, providing a quick overview of a cutting-edge union between quantum theory and computing. The book begins by describing a computer as "just a box with a bunch of switches." Although today's computer switches are embedded in circuitry, they can in principle be made of any material, like the early banks of vacuum tubes; Johnson also recalls a tic-tac-toe-playing machine created from Tinkertoys in the 1970s. An ordinary computer switch, binary in nature, registers as either a zero or a one, but if a single atom were harnessed as a switch, its dual nature as both particle and wave means it could be "superpositioned," simultaneously zero and one. A series of such switches could handle complex calculations much more swiftly than conventional computers: an entertaining theory, but impractical. Except that a quantum computer's ability to factor large numbers (determining the smaller numbers by which they are divisible) would have a critical application in cryptography, with a string of atoms used to create (or break) complex codes. After discussing competing projects that aim to make the theory of quantum computing a reality, the book concludes with ruminations on the implications of the projects' possible success. Using "a series of increasingly better cartoons" and plain language, Johnson's slim volume is so straightforward that readers without a technical background will have no problem following his chain of thought. Illus. Agent, Esther Newberg. (Mar. 2) Copyright 2003 Cahners Business Information.
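The cryptographic angle the review mentions rests on a simple asymmetry: multiplying two numbers is easy, but recovering the factors of a large product is slow by every known classical method. A purely illustrative sketch in Python (classical trial division, not Shor's quantum algorithm) shows what "determining the smaller numbers by which they are divisible" means in practice:

```python
# Illustrative only: classical trial division, not Shor's quantum algorithm.
# The loop runs up to sqrt(n), so the work grows rapidly with the size of n;
# for the hundreds-of-digits numbers used in cryptography it is hopeless.
def trial_division(n):
    """Return the prime factors of n, smallest first."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)        # whatever remains is itself prime
    return factors

print(trial_division(15))        # [3, 5]
print(trial_division(2021))      # [43, 47]
```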
Library Journal
The simplicity of binary logic, on or off, 1 or 0, is what enables today's desktop and supercomputers to process data. Quantum computing, on the other hand, operates under a different set of rules in which everything is 1, 0, or 1 and 0 at the same time. Here, Johnson, an award-winning science writer for the New York Times and author of Strange Beauty: Murray Gell-Mann and the Revolution in Twentieth-Century Physics, chronicles a technology that, at its deepest level, even he finds "hard to swallow." Advances in quantum theory are as inevitable as the universe's expansion, he posits, and they will result in exponentially faster computers that will be able to solve complex problems now considered impossible. In everyday terms, quantum computers would be able to search vast databases in mere seconds or solve mathematical problems that have puzzled scientists for centuries. On the dark side, Johnson warns that a quantum computer in the hands of a digital thief could be used to crack the complex encryption codes that protect credit card transactions and other sensitive financial information, or to compromise the security of classified military information. Johnson has presented the fascinating science of quantum computing and its future development in a down-to-earth style. Recommended for most libraries. [Previewed in Prepub Alert, LJ 10/1/02.]-Joe J. Accardi, Harper Coll. Lib., Palatine, IL Copyright 2003 Reed Business Information.
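The database-search claim is usually traced to Grover's quantum search algorithm, which the review does not name: for an unstructured list of N items a blind classical search needs on the order of N lookups, while Grover's procedure needs on the order of the square root of N. A rough, constants-ignored comparison (illustrative Python only):

```python
# Rough query-count comparison for unstructured search (orders of magnitude only).
# Classical: ~N lookups in the worst case.  Grover's quantum search: ~sqrt(N).
import math

for N in (1_000, 1_000_000, 1_000_000_000):
    print(f"N = {N:>13,}   classical ~ {N:>13,}   quantum ~ {math.isqrt(N):>7,}")
```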
Kirkus Reviews
New York Times science writer Johnson (Strange Beauty, 1999, etc.) explains why quantum computers are expected to be the next major breakthrough. The author begins by recalling his youthful disappointment when he received a build-it-yourself computer and discovered how simple it was. But that anticlimax revealed a central truth: all digital computers are in essence bundles of on-off switches. The logical destination of the trend toward miniaturization is a computer in which each switch is a single atom. There is more to the quantum computer, however, than mere compactness, as Johnson makes clear in a quick summary of quantum theory. The beauty of the "qubit" (as scientists have dubbed the quantum bit) is that it can be in several superposed states: not just "on" or "off," but both at once. Thus, the numbers 0 through 1,023 (all 1,024 patterns of ten bits) can be represented at once by ten quantum switches. Put into practice, this capability enables a stunning increase in speed, essential for tackling such problems as the factoring of very large numbers, which is a key to modern cryptography. Johnson spends some time examining ways in which the simple switches that are the basis for computers could be built from quantum parts. He doesn't minimize the difficulties of the task. To give just one example, capturing atoms (or molecules, or subatomic particles) and training them to act as switches requires cooling them close to absolute zero, impractical for desktop applications. Nor are the qubits anywhere near as stable as one would like, with a few seconds the best working lifetime so far achieved. Still, the potential of the nascent technology is fascinating, and if successful, its development is likely to be one of the most closely watched scientific stories of the new century. A tantalizing glimpse of how the uncertainties of quantum theory may yet be tamed for work of the highest precision.
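The ten-switch arithmetic in the review generalizes directly: every added qubit doubles the number of bit patterns a register spans at once, which is also why simulating even modest quantum registers on classical hardware quickly becomes intractable. A small illustration (plain Python, figures are simple powers of two, not claims about any particular machine):

```python
# Each added qubit doubles the number of bit patterns a register spans:
# ten switches cover 2**10 = 1,024 values at once, twenty cover over a million.
for n in (1, 2, 3, 10, 20, 40):
    print(f"{n:>2} qubits -> {2 ** n:,} simultaneous patterns")
```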