Book details of 'Programming the Universe: A Quantum Computer Scientist Takes On the Cosmos'
The Virtual Bookcase Reviews of 'Programming the Universe: A Quantum Computer Scientist Takes On the Cosmos':
Is the universe actually a giant quantum computer? According to Seth Lloyd—Professor of Quantum-Mechanical Engineering at MIT and originator of the first technologically feasible design for a working quantum computer—the answer is yes. This wonderfully accessible book illuminates the professional and personal paths that led him to this remarkable conclusion.

All interactions between particles in the universe, Lloyd explains, convey not only energy but also information—in other words, particles not only collide, they compute. And what is the entire universe computing, ultimately? “Its own dynamical evolution,” he says. “As the computation proceeds, reality unfolds.”

To elucidate his theory, Lloyd examines the history of the cosmos, posing questions that in other hands might seem unfathomably complex: How much information is there in the universe? What information existed at the moment of the Big Bang, and what happened to it? How do quantum mechanics and chaos theory interact to create our world? Could we attempt to re-create it on a giant quantum computer? Programming the Universe presents an original and compelling vision of reality, revealing our world in an entirely new light.
About the Author
Seth Lloyd is Professor of Mechanical Engineering at MIT, principal investigator at the Research Laboratory of Electronics, and the designer of the first feasible quantum computer. He has been featured in The New York Times, the Los Angeles Times, The Washington Post, The Economist, and Wired, among other publications. His name frequently appears (as both writer and subject) in the pages of Nature, New Scientist, Science, and Scientific American. He lives in Cambridge, MA.
Excerpt. © Reprinted by permission. All rights reserved.
This book is the story of the universe and the bit. The universe is the biggest thing there is and the bit is the smallest possible chunk of information. The universe is made of bits. Every molecule, atom, and elementary particle registers bits of information. Every interaction between those pieces of the universe processes that information by altering those bits. That is, the universe computes, and because the universe is governed by the laws of quantum mechanics, it computes in an intrinsically quantum-mechanical fashion; its bits are quantum bits. The history of the universe is, in effect, a huge and ongoing quantum computation. The universe is a quantum computer.

This begs the question: What does the universe compute? It computes itself. The universe computes its own behavior. As soon as the universe began, it began computing. At first, the patterns it produced were simple, comprising elementary particles and establishing the fundamental laws of physics. In time, as it processed more and more information, the universe spun out ever more intricate and complex patterns, including galaxies, stars, and planets. Life, language, human beings, society, culture: all owe their existence to the intrinsic ability of matter and energy to process information.

The computational capability of the universe explains one of the great mysteries of nature: how complex systems such as living creatures can arise from fundamentally simple physical laws. These laws allow us to predict the future, but only as a matter of probability, and only on a large scale. The quantum-computational nature of the universe dictates that the details of the future are intrinsically unpredictable. They can be computed only by a computer the size of the universe itself. Otherwise, the only way to discover the future is to wait and see what happens.

Allow me to introduce myself. The first thing I remember is living in a chicken house.
My father was apprenticed to a furniture maker in Lincoln, Massachusetts, and the chicken house was in back of her barn. My father turned the place into a two-room apartment; the space where the chickens had roosted became bunks for my older brother and me. (My younger brother was allowed a cradle.) At night, my mother would sing to us, tuck us in, and close the wooden doors to the roosts, leaving us to lie snug and stare out the windows at the world outside. My first memory is of seeing a fire leap up in a wire trash basket with an overlapping diamond pattern. Then I remember holding tight to my mother's blue-jeaned leg just above the knee and my father flying a Japanese fighter kite. After that, memories crowd on thick and fast.

Each living being's perception of the world is unique and crowded with detail and structure. Yet we all inhabit the same space and are governed by the same physical laws. In school, I learned that the physical laws governing the universe are surprisingly simple. How could it be, I wondered, that the intricacy and complexity I saw outside my bedroom window were the result of these simple physical laws?

I decided to study this question and spent years learning about the laws of nature. My doctoral advisor, Heinz Pagels, who died tragically in a mountaineering accident in Colorado in the summer of 1988, was a brilliant and unconventional thinker who believed in transgressing the conventional boundaries of science. He encouraged me to develop physically precise techniques for characterizing and measuring complexity. Later, under the guidance of Murray Gell-Mann at Caltech, I learned how the laws of quantum mechanics and elementary-particle physics effectively "program" the universe, planting the seeds of complexity.

These days, I am a professor of mechanical engineering at the Massachusetts Institute of Technology. Or, because I have no formal training in mechanical engineering, it might be more accurate to call me a professor of quantum-mechanical engineering.
Quantum mechanics is the branch of physics that deals with matter and energy at their smallest scales. Quantum mechanics is to atoms what classical mechanics is to engines. In essence: I engineer atoms.

In 1993, I discovered a way to build a quantum computer. Quantum computers are devices that harness the information-processing ability of individual atoms, photons, and other elementary particles. They compute in ways that classical computers, such as a Macintosh or a PC, cannot. In the process of learning how to make atoms and molecules (the smallest pieces of the universe) compute, I grew to appreciate the intrinsic information-processing ability of the universe as a whole. The complex world we see around us is the manifestation of the universe's underlying quantum computation. The digital revolution under way today is merely the latest in a long line of information-processing revolutions stretching back through the development of language, the evolution of sex, and the creation of life, to the beginning of the universe itself. Each revolution has laid the groundwork for the next, and all information-processing revolutions since the Big Bang stem from the intrinsic information-processing ability of the universe. The computational universe necessarily generates complexity. Life, sex, the brain, and human civilization did not come about by mere accident.

The Quantum Computer

Quantum mechanics is famously weird. Waves act like particles, and particles act like waves. Things can be in two places at once. It is perhaps not surprising that, at small scales, things behave in strange and counterintuitive ways; after all, our intuitions have developed for dealing with objects much larger than individual atoms. Quantum weirdness is still disconcerting, though. Niels Bohr, the father of quantum mechanics, once said that anyone who thinks he can contemplate quantum mechanics without getting dizzy hasn't properly understood it.
Quantum computers exploit "quantum weirdness" to perform tasks too complex for classical computers. Because a quantum bit, or "qubit," can register both 0 and 1 at the same time (a classical bit can register only one or the other), a quantum computer can perform millions of computations simultaneously.

Quantum computers process the information stored on individual atoms, electrons, and photons. A quantum computer is a democracy of information: every atom, electron, and photon participates equally in registering and processing information. And this fundamental democracy of information is not confined to quantum computers. All physical systems are at bottom quantum-mechanical, and all physical systems register and process information. The world is composed of elementary particles (electrons, photons, quarks), and each elementary piece of a physical system registers a chunk of information: one particle, one bit. When these pieces interact, they transform and process that information, bit by bit. Each collision between elementary particles acts as a simple logical operation, or "op."

To understand any physical system in terms of its bits, we need to understand in detail the mechanism by which each and every piece of that system registers and processes information. If we can understand how a quantum computer does this, then we can understand how any physical system does.

The idea of such a computer was proposed in the early 1980s by Paul Benioff, Richard Feynman, David Deutsch, and others. When they were first discussed, quantum computers were a wholly abstract concept: nobody had a clue how to build them. In the early 1990s, I showed how they could be built using existing experimental techniques. Over the past ten years, I have worked with some of the world's greatest scientists and engineers to design, build, and operate quantum computers.

There are a number of good reasons to build quantum computers. The first is that we can.
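The qubit behavior Lloyd describes can be sketched numerically. The following is a minimal illustration (not code from the book): a single qubit is represented by two amplitudes, one for 0 and one for 1, and the Hadamard gate stands in as one example of a quantum logic "op" that puts a definite bit into an equal superposition of both values.

```python
import math

def hadamard(state):
    """Apply a Hadamard gate to a single-qubit state [a0, a1]."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return [s * (a0 + a1), s * (a0 - a1)]

def probabilities(state):
    """Born rule: probability of measuring 0 or 1 from the amplitudes."""
    return [abs(a) ** 2 for a in state]

zero = [1.0, 0.0]            # like a classical bit: definitely 0
superposed = hadamard(zero)  # qubit registering 0 and 1 at once
print(probabilities(superposed))  # both outcomes roughly equally likely, ~[0.5, 0.5]
```

A classical bit can only ever be in one of the two basis states; the superposed qubit carries both amplitudes simultaneously, which is the property a quantum computer exploits to work on many inputs at once.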
Quantum technologies (technologies for manipulating matter at the atomic scale) have undergone remarkable advances in recent years. We now possess lasers stable enough, fabrication techniques accurate enough, and electronics fast enough to perform computation at the atomic scale.

The second reason is that we have to, at least if we want to keep building ever faster and more powerful computers. Over the past half century, the power of computers has doubled every year and a half. This explosion of computer power is known as "Moore's law," after Gordon Moore, subsequently the chief executive of Intel, who noted its exponential advance in the 1960s. Moore's law is a law not of nature, but of human ingenuity. Computers have gotten two times faster every eighteen months because every eighteen months engineers have figured out how to halve the size of the wires and logic gates from which they are constructed. Every time the size of the basic components of a computer goes down by a factor of two, twice as many of them will fit on the same size chip. The resulting computer is twice as powerful as its predecessor of a year and a half earlier.

If you project Moore's law into the future, you find that the size of the wires and logic gates from which computers are constructed should reach the atomic scale in about forty years; thus, if Moore's law is to be sustained, we must learn to build computers that operate at the quantum scale. Quantum computers represent the ultimate level of miniaturization. The quantum computers my colleagues and I have constructed already attain this goal: each atom registers a bit. But the quantum computers we can build today are small, not only in size but also in power. The largest general-purpose quantum computers available at the time of this writing have seven to ten quantum bits and can perform thousands of quantum logic operations per second. (By contrast, a c...