Beyond the exascale | symmetry magazine

After years of speculation, quantum computing is here, sort of.

Physicists are beginning to wonder how quantum computing might provide answers to the field’s deepest questions. But most aren’t caught up in the hype. Instead, they’re taking an approach familiar to them: planning for a future decades away, while making room for potential pivots, turns and breakthroughs along the way.

“When we’re working on building a new particle collider, this type of project can take 40 years,” says Hank Lamm, a research associate at the US Department of Energy’s Fermi National Accelerator Laboratory. “It’s on the same timeline. I hope to see quantum computing provide big answers to particle physics in my lifetime. But that doesn’t mean there isn’t some interesting physics to do along the way.”

Equations that dominate even supercomputers

Classical computers have been at the heart of physics research for decades, and simulations run on classical computers have guided many breakthroughs. Fermilab, for example, used classical computing to simulate lattice quantum chromodynamics. Lattice QCD is a set of equations that describe the interactions of quarks and gluons via the strong force.

Theorists developed lattice QCD in the 1970s. But applying its equations has proven extremely difficult. “Even in the 1980s, many people said that even if they had an exascale computer [a computer that can perform a billion billion calculations per second], they still couldn’t calculate lattice QCD,” says Lamm.

But that turned out to be wrong.

Over the past 10 to 15 years, researchers have discovered the algorithms needed to make their calculations more manageable, while learning to understand and correct theoretical errors. These advances allowed them to run lattice simulations, which substitute a finite grid of points in space and time for the continuous vastness of reality.
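The trade the lattice makes can be sketched in a few lines: sample a field only at evenly spaced grid points, and replace continuous derivatives with differences between neighboring sites. This is a toy one-dimensional illustration of the discretization idea, not actual lattice QCD; the field, spacing and grid size here are illustrative choices, not anything from the article.

```python
# A lattice swaps continuous spacetime for a finite grid with spacing a.
# Toy sketch: a field known only at grid points, with its derivative
# approximated by finite differences between neighboring sites.

import math

a = 0.1                                  # lattice spacing (illustrative)
sites = [a * i for i in range(64)]       # 1D grid standing in for continuous space
phi = [math.sin(x) for x in sites]       # a field sampled only at lattice sites

# Forward finite difference replaces the continuous derivative d(phi)/dx:
dphi = [(phi[i + 1] - phi[i]) / a for i in range(len(phi) - 1)]

# As the spacing a shrinks, the lattice answer approaches the continuum
# one (cos x here); the leftover gap is the discretization error.
error = abs(dphi[10] - math.cos(sites[10]))
print(error)
```

Real lattice QCD works the same way in spirit, on a four-dimensional grid with quark and gluon fields, and controlling exactly this kind of discretization error is part of what took decades.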

Lattice simulations have allowed physicists to calculate the mass of the proton – a particle made up of quarks and gluons all interacting via the strong force – and to see that the theoretical prediction matches the experimental result well. The simulations have also allowed them to accurately predict the temperature at which quarks should come apart from one another in a quark-gluon plasma.

The limit of these calculations? Besides being approximations confined to a hypothetical finite volume of space, they can compute only certain properties efficiently. Try looking at more than that, and even the biggest high-performance computer can’t handle all the possibilities.

Enter quantum computers.

Quantum computers are all about possibilities. Classical computers lack the memory to track the many possible outcomes of lattice QCD problems, but quantum computers take advantage of quantum mechanics to compute differently.
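The memory argument is easy to make concrete: faithfully storing the state of n qubits on a classical machine takes 2^n complex amplitudes, so the cost doubles with every qubit. This small sketch (my numbers, assuming 16-byte complex values, not figures from the article) shows how fast that blows up.

```python
# Classically storing an n-qubit state takes 2**n complex amplitudes.
# Assuming 16 bytes per amplitude (a double-precision complex number),
# the memory cost doubles with every added qubit.

def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to hold a full n-qubit state vector."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 50, 80):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:.3g} GiB")
```

Around 50 qubits the state vector already outgrows the memory of the largest supercomputers, which is why a quantum device that holds those amplitudes natively changes what can be simulated.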

Quantum computing is not an easy answer, however. Solving equations on a quantum computer requires completely new ways of thinking about programming and algorithms.

With a typical computer, when you run a program, you can inspect its state at any time. You can check a classical computer’s work before it’s finished and troubleshoot if something goes wrong. But according to the laws of quantum mechanics, you cannot observe any intermediate step of a quantum calculation without corrupting it; you can only observe the final state.

This means you can’t store information in an intermediate state and retrieve it later, and you can’t clone information from one set of qubits to another, which makes error correction difficult.
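Why peeking mid-computation is destructive can be shown with a minimal simulation of a single qubit: measurement returns one classical outcome according to the Born rule, and the superposition that held the computation's working information is gone afterward. This is a toy model written for illustration; the function and variable names are mine, not from any quantum library.

```python
# Toy sketch of why you can't check a quantum computer's work midway:
# measuring a qubit collapses its superposition to one classical outcome,
# destroying the amplitudes the computation was using.

import random

amplitudes = [2 ** -0.5, 2 ** -0.5]   # equal superposition of |0> and |1>

def measure(amps):
    """Born rule: return outcome k with probability |amps[k]|**2,
    plus the collapsed post-measurement state."""
    p0 = abs(amps[0]) ** 2
    outcome = 0 if random.random() < p0 else 1
    collapsed = [0.0, 0.0]
    collapsed[outcome] = 1.0          # the superposition is destroyed
    return outcome, collapsed

outcome, state = measure(amplitudes)
print(outcome, state)                 # one bit comes out; the amplitudes are gone
```

A debugger for a classical program can pause and read every variable; here, any such "read" is a measurement like this one, which is why quantum algorithms must be designed so that only the final state is ever observed.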

“It can be a nightmare to design an algorithm for quantum computing,” says Lamm, who spends his days trying to figure out how to do quantum simulations for high-energy physics. “Everything needs to be redesigned from top to bottom. We are just at the beginning of figuring out how to do this.”

Just getting started

Quantum computers have already proven useful in basic research. Condensed matter physicists – whose research focuses on the phases of matter – have spent much more time than particle physicists thinking about how quantum computers and simulators can help them. They have used quantum simulators to explore quantum spin liquid states and to observe a previously unseen phase of matter called a prethermal time crystal.

“The biggest place where quantum simulators will have an impact is in discovery science, in discovering new phenomena like this that exist in nature,” says Norman Yao, assistant professor at the University of California, Berkeley, and co-author of the paper on time crystals.

Quantum computers also show promise in particle physics and astrophysics. Many researchers in those fields use quantum computers to simulate “toy problems” – small, simple versions of much more complicated problems. They have, for example, used quantum computing to test parts of quantum gravity theories or to create proof-of-principle models, such as models of the parton showers produced by particle colliders like the Large Hadron Collider.

“Physicists tackle small problems, those they can solve by other means, to try to understand how quantum computing can provide an advantage,” says Fermilab scientist Roni Harnik. “By learning from this, they can scale up from simple simulations, through trial and error, to more difficult problems.”

But which approaches will succeed and which will lead to dead ends remains to be seen. Estimates of the number of qubits needed to simulate physics problems large enough to achieve breakthroughs range from thousands to (more likely) millions. Many in the field expect this to be possible in the 2030s or 2040s.

“In high-energy physics, problems like these are clearly a regime in which quantum computers will have an advantage,” says Ning Bao, associate research scientist in computer science at Brookhaven National Laboratory. “The problem is that quantum computers are still too limited in what they can do.”

Starting with physics

Some physicists approach things from a different angle: they turn to physics to better understand quantum computing.

John Preskill is a professor of physics at Caltech and an early leader in the field of quantum computing. A few years ago, he and Patrick Hayden, a physics professor at Stanford University, showed that if you entangle two photons and throw one into a black hole, decoding the information that eventually comes out via Hawking radiation would be much easier than if you had used unentangled particles. The physicists Beni Yoshida and Alexei Kitaev then proposed an explicit protocol for such decoding, and Yao went further, showing that the protocol could also serve as a powerful tool for characterizing quantum computers.

“We took something that was thought of in terms of high-energy physics and quantum information science, and then we thought of it as a tool that could be used in quantum computing,” Yao says.

This kind of cross-disciplinary thinking will be key to moving the field forward, physicists say.

“Everyone comes into this field with different expertise,” Bao says. “Whether it’s computer science, physics, or quantum information theory, everyone comes together to bring different perspectives and solve problems. There are probably many ways to use quantum computing to study physics that we can’t predict right now, and getting the two right people together in one room will be enough.”
