“Computers, as long as they have existed, have all really done the same thing,” electrical engineering professor Stephen Lyon said. “The quantum computer looks at the problem in a rather different way.”
In 1982, Nobel Prize-winning physicist Richard Feynman GS ’42 developed the theoretical concept behind this new breed of computer, which uses some of the complicated effects of quantum mechanics to do computations that would be practically impossible using other methods.
A key task at which quantum computers might excel over their traditional counterparts is factoring very large integers; the difficulty of that problem underpins the security of the widely used public-key cryptography framework known as RSA.
“All of the codes online for doing transactions are based on RSA encryption, and that security is dependent on it being extremely difficult to factor these large numbers,” Lyon said. “If you had a quantum computer, you could break all of these codes.”
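As a toy illustration, not drawn from the article itself, trial division factors a small RSA-style semiprime in a fraction of a second, but its cost grows roughly with the square root of the number being factored. That scaling is why the hundreds-of-digits moduli used in real RSA keys are out of reach for classical machines, while Shor's quantum algorithm could in principle factor them efficiently.

```python
# Minimal sketch: trial division finds a factor pair of a small
# semiprime quickly, but this approach scales far too poorly for
# the enormous moduli used in real RSA keys.

def trial_division(n: int) -> tuple[int, int]:
    """Return a nontrivial factor pair of a composite n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    raise ValueError("n appears to be prime")

# A small semiprime: 101 * 113
print(trial_division(11413))  # (101, 113)
```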
Because of the potential security risk, the NSA is very interested in quantum computing technology.
“A lot of those people would like to think that building a quantum computer is impossible, but that doesn’t seem to be the answer,” Lyon said, explaining that there is no physical barrier to the eventual development of a quantum computer.
Lyon is part of a team of scientists and engineers from Princeton, Oxford and Lawrence Berkeley National Laboratory that has been studying memory in quantum computers. To conduct their experiments, they have been working with the individual spins of electrons of phosphorus-31 atoms embedded in a very expensive, exceptionally pure and isotopically controlled silicon-28 crystal, Lyon said.
The electrons can be used for data processing because they can be interpreted as small processing units, he explained. This processing is accomplished by manipulating the spins of the electrons.
One key problem is that the electrons orbiting the atom’s nucleus can sustain their spin states for only about 60 milliseconds, so they cannot preserve computational data for very long, electrical engineering department research scientist Alexei Tyryshkin explained.
The team, therefore, focused on using the nucleus of the phosphorus atom as a type of storage device in a quantum computer, he said.
The quantum information stored in the outer electrons can be transferred to the nucleus, which can preserve it for a significantly longer time: on the order of seconds rather than milliseconds, Lyon explained.
The team was able to demonstrate this experimentally, Lyon added, noting that this feat was one of the team’s major successes.

“The electron acts as a middle-man between the nucleus and the outside world,” the project’s lead researcher, Oxford’s John Morton, said in a statement. “It gives us a way to have our cake and eat it — fast processing speeds from the electron and long memory times from the nucleus.”
The nucleus ends up being a suitable location for isolating a quantum “bit” from a noisy environment while also allowing for that bit to be accessed, and, in this way, the nucleus can almost be thought of as a “hard disk,” he explained in the statement.
With quantum computers, the idea of bits — the small switches that hold the binary values of 0 and 1 — goes out the window.
“The counterintuitive, fun thing about these quantum bits is that it can be 1 or it can be 0 or it can be a superposition where it is both 1 and 0 at the same time,” Lyon said. “That’s one of the ways in which quantum computers can compute faster than classical computers.”
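As a hypothetical sketch, separate from the research described here, a single qubit can be simulated classically as a pair of complex amplitudes whose squared magnitudes give the measurement probabilities. An equal superposition, for example, yields 0 and 1 each about half the time when measured.

```python
import math
import random

# Minimal sketch: a qubit as two amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measurement yields 0 with probability
# |alpha|^2 and 1 otherwise.

def measure(alpha: complex, beta: complex) -> int:
    """Sample one measurement outcome from the qubit state."""
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# Equal superposition: 0 and 1 are each measured with probability 0.5.
alpha = beta = 1 / math.sqrt(2)

outcomes = [measure(alpha, beta) for _ in range(10_000)]
print(sum(outcomes) / len(outcomes))  # close to 0.5
```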
Because of the complex mathematics involved and the uncertainty of the state of the quantum bits, it remains unclear which types of problems a quantum computer could solve efficiently, and quantum machines might not be as adaptable as modern computers for many types of problems, Lyon noted.
In addition, creating a large quantum computer with many electrons and nuclei working together would be a very demanding task, as even manipulating the spin of a single electron is a significant challenge, Tyryshkin said.
“Everything is possible when you have an ideal system in theory that behaves the way you want it to behave,” Tyryshkin said. “Everything is fine until you realize that the system has to exist in real life, where it is exposed to the harsh environment.”