IBM and CERN use quantum computing to hunt down the elusive Higgs boson

Future quantum computers are likely to greatly improve scientists’ understanding of the data produced by CERN’s gigantic particle collider.

Image: CERN / Maximilien Brice

The potential of quantum computers is currently being discussed in environments ranging from banks to merchant ships, and now the technology has been taken even further – or rather, lower.

About 100 meters below the Franco-Swiss border is the largest machine in the world, the Large Hadron Collider (LHC) operated by the European particle physics laboratory, CERN. And to better understand the mountains of data produced by such a colossal system, scientists at CERN turned to IBM’s quantum team for help.

The partnership was a success: in a new paper, which has yet to be peer-reviewed, IBM researchers have established that quantum algorithms can help make sense of LHC data – meaning, as they put it, that “it is likely that future quantum computers will greatly stimulate scientific discovery at CERN.”

Given CERN’s mission to understand the fundamental workings of the universe, this could have big implications for anyone interested in anything to do with matter, antimatter, dark matter and more.

The LHC is one of CERN’s most important tools for understanding the fundamental laws that govern the particles and forces that make up the universe. A ring 27 kilometers in circumference, the system accelerates beams of particles such as protons and heavy ions to just below the speed of light, before smashing those beams together in collisions that scientists observe thanks to eight high-precision detectors positioned around the accelerator.

Every second, particles collide about a billion times inside the LHC, producing a petabyte of data that is currently processed by a million processors spread across 170 locations worldwide – a geographic distribution owing to the fact that such a huge amount of information cannot be stored in a single place.

It’s not just about storing data, of course. All the information generated by the LHC is then available to be processed and analyzed, so that scientists can hypothesize, test and discover.

It was by observing particles smashing together in this way that researchers at CERN discovered, in 2012, the existence of an elementary particle called the Higgs boson, which gives mass to all other fundamental particles – a discovery hailed as a major achievement in the field of physics.

So far, scientists have used the best conventional computing tools available to help them in this work. In practice, this means using sophisticated machine learning algorithms capable of sifting through the data produced by the LHC to distinguish useful collisions, such as those that produce Higgs bosons, from unwanted background.
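
To make the idea concrete, here is a minimal sketch in Python of this kind of event selection, framed as binary classification of “signal” versus “background” events. Everything in it – the synthetic features, the labels and the choice of a gradient-boosted classifier – is an illustrative assumption, not CERN’s actual pipeline:

```python
# Toy stand-in for LHC event selection: classify synthetic "events" as
# signal-like (1) or background-like (0). Features and labels are invented.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_events, n_features = 5000, 8
X = rng.normal(size=(n_events, n_features))  # kinematic-style features
# Hypothetical rule: a noisy combination of two features marks "signal".
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_events) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")
```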

“Until now, scientists have used classical machine learning techniques to analyze raw data captured by particle detectors, automatically selecting the best candidate events,” IBM researchers Ivano Tavernelli and Panagiotis Barkoutsos wrote in a blog post. “But we believe we can dramatically improve this selection process – by strengthening machine learning with quantum computing.”

As data volumes increase, classical machine learning models are rapidly approaching the limits of their capabilities, and this is where quantum computers are likely to play a useful role. The versatile qubits that make up quantum computers can hold much more information than conventional bits, which means they can represent and handle many more dimensions than conventional devices.

A quantum computer equipped with enough qubits could therefore in principle perform extremely complex calculations that would take centuries for classical computers to solve.
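
The arithmetic behind that claim is simple: the state space available to a quantum processor doubles with every qubit added, so its dimension grows as two to the power of the qubit count. A back-of-the-envelope illustration in Python:

```python
# State-space dimension of an n-qubit register is 2**n.
for n in (4, 10, 20, 30):
    print(f"{n} qubits -> {2**n:,}-dimensional state space")
```

At 20 qubits – the size used in the simulation study described below – that is already more than a million dimensions.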

With that in mind, CERN joined forces with IBM’s quantum team in 2018, with the aim of discovering how exactly quantum technologies could be applied to advance scientific discoveries.

Quantum machine learning quickly emerged as a potential application. The approach is to harness the capabilities of qubits to expand what is known as the feature space – the collection of features on which the algorithm bases its classification decision. By using a larger feature space, a quantum computer will be able to see patterns and perform classification tasks even in a huge data set, where a typical computer might only see random noise.
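
As a hedged sketch of what “expanding the feature space” means in practice, the snippet below classically simulates a toy quantum feature map: each two-feature data point is encoded into a two-qubit state with simple rotations, and the kernel – the similarity measure an SVM builds its decision on – is the squared overlap of two such states. The angle-encoding circuit here is an invented, minimal example, far simpler than the entangling feature maps used in actual QSVM research:

```python
import numpy as np

def feature_state(x):
    """Encode a 2-feature point into a 2-qubit statevector by applying
    an RY(x_i) rotation to |0> on each qubit (a toy angle-encoding map)."""
    def ry(theta):
        return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                         [np.sin(theta / 2),  np.cos(theta / 2)]])
    q0 = ry(x[0]) @ np.array([1.0, 0.0])
    q1 = ry(x[1]) @ np.array([1.0, 0.0])
    return np.kron(q0, q1)  # 4-dimensional joint state of both qubits

def quantum_kernel(a, b):
    """Kernel entry: squared overlap |<phi(a)|phi(b)>|^2 of the mapped states."""
    return float(np.abs(feature_state(a) @ feature_state(b)) ** 2)

print(quantum_kernel([0.3, 1.2], [0.5, 0.9]))  # similarity of two toy events
```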

Applied to CERN’s research, a quantum machine learning algorithm could sift through raw data from the LHC and recognize occurrences of Higgs boson behavior, for example, where classical computers might struggle to see anything at all.

The IBM team went on to create a quantum algorithm called a quantum support vector machine (QSVM), designed to identify collisions that produce Higgs bosons. The algorithm was trained with a test data set based on information generated by one of the LHC’s detectors, and was run on both quantum simulators and physical quantum hardware.
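
In the same spirit, a toy QSVM pipeline can be emulated end to end by precomputing that quantum kernel over a data set and handing the resulting matrix to a classical support vector machine, here via scikit-learn’s SVC with a precomputed kernel. The feature map, the synthetic “events” and the labels below are all illustrative assumptions, not IBM’s actual setup:

```python
import numpy as np
from sklearn.svm import SVC

def feature_state(x):
    """Toy 2-qubit angle-encoding feature map (same as the earlier sketch)."""
    def ry(theta):
        return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                         [np.sin(theta / 2),  np.cos(theta / 2)]])
    return np.kron(ry(x[0]) @ [1.0, 0.0], ry(x[1]) @ [1.0, 0.0])

def kernel_matrix(A, B):
    """Gram matrix of squared state overlaps |<phi(a)|phi(b)>|^2."""
    SA = np.array([feature_state(a) for a in A])
    SB = np.array([feature_state(b) for b in B])
    return np.abs(SA @ SB.T) ** 2

rng = np.random.default_rng(42)
X = rng.uniform(0, np.pi, size=(100, 2))                   # synthetic two-feature "events"
y = (np.sin(X[:, 0]) * np.sin(X[:, 1]) > 0.5).astype(int)  # invented signal label
X_train, X_test, y_train, y_test = X[:80], X[80:], y[:80], y[80:]

# Train on the train-vs-train kernel; score with the test-vs-train kernel.
svm = SVC(kernel="precomputed").fit(kernel_matrix(X_train, X_train), y_train)
print(f"toy QSVM accuracy: {svm.score(kernel_matrix(X_test, X_train), y_test):.2f}")
```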

In both cases, the results were promising. The simulation study, which ran on Google TensorFlow Quantum, IBM Quantum and Amazon Braket, used up to 20 qubits and a 50,000-event data set, and performed as well as, and in some cases better than, its classical counterparts on the same problem.

The hardware experiment was run on IBM’s own quantum devices using 15 qubits and a 100-event data set, and the results showed that, despite the noise affecting quantum computations, the quality of the classification remained comparable to the best classical simulation results.

“This once again confirms the potential of the quantum algorithm for this class of problems,” Tavernelli and Barkoutsos wrote. “The quality of our results points to a possible demonstration of a quantum advantage for data classification with quantum support vector machines in the near future.”

This does not mean that quantum advantage has been proven yet. The quantum algorithm developed by IBM performed comparably to classical methods on the limited quantum processors that exist today – but these systems are still in their infancy.

And with only a small number of qubits, today’s quantum computers cannot yet perform useful calculations. They also remain hampered by the fragility of qubits, which are highly sensitive to environmental changes and still prone to errors.

Rather, IBM and CERN are banking on future improvements in quantum hardware to demonstrate concretely, and not just theoretically, that quantum algorithms have an advantage.

“Our results show that quantum machine learning algorithms for classifying data can be as accurate as classical algorithms on noisy quantum computers, paving the way for the demonstration of quantum advantage in the near future,” concluded Tavernelli and Barkoutsos.

CERN scientists certainly have high hopes that this will be the case. The LHC is currently undergoing modernization and the next iteration of the system, which is due to go into service in 2027, is expected to produce ten times more collisions than the current machine. The volume of data generated only goes one way – and it won’t be long before traditional processors are unable to handle it all.

