
Machine learning helps quantum computing mind the reality gap

11 January 2024

For the first time, the power of machine learning has been harnessed to close the distance between the predicted and observed behaviour of quantum devices.

Image: David Craig/PRX

A study has used the power of machine learning to overcome a key challenge affecting quantum devices. For the first time, the findings reveal a way to close the 'reality gap'.

Quantum computing could supercharge a wealth of applications, from climate modelling and financial forecasting to drug discovery and artificial intelligence. But this will require effective ways to scale and combine individual quantum devices (also called qubits). A major barrier is inherent variability: even apparently identical units exhibit different behaviours.

Functional variability is presumed to be caused by nanoscale imperfections in the materials that quantum devices are made from. Since there is no way to measure these directly, this internal disorder cannot be captured in simulations, leading to the gap between predicted and observed outcomes.

To address this, the research group used a "physics-informed" machine learning approach to infer these disorder characteristics indirectly. This was based on how the internal disorder affected the flow of electrons through the device.

Lead researcher Associate Professor Natalia Ares, of the University of Oxford, said: “As an analogy, when we play ‘crazy golf’, the ball may enter a tunnel and exit with a speed or direction that doesn't match our predictions. 

“But with a few more shots, a crazy golf simulator, and some machine learning, we might get better at predicting the ball's movements and narrow the reality gap."

The researchers measured the output current for different voltage settings across an individual quantum dot device. The data was input into a simulation which calculated the difference between the measured current and the theoretical current that would flow if no internal disorder were present.

By measuring the current at many different voltage settings, the simulation was constrained to find an arrangement of internal disorder that could explain the measurements at all voltage settings. This combined mathematical and statistical methods with deep learning.
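The fitting idea described above can be illustrated with a deliberately simplified sketch: a toy current model with a hidden "disorder" parameter, noisy measurements across many voltages, and a search for the disorder values that best reconcile simulation with measurement. The functional forms, parameter names, and grid search below are illustrative assumptions, not the study's actual physics-informed model, which used deep learning.

```python
import numpy as np

# Toy "device": current through a quantum dot modelled as a smooth
# function of gate voltage, perturbed by a hidden disorder profile.
# The functional form here is an illustrative assumption only.
rng = np.random.default_rng(0)
voltages = np.linspace(-1.0, 1.0, 50)

def simulate_current(v, disorder):
    # Idealised current with a disorder-induced offset and damping.
    return np.tanh(5 * (v - disorder[0])) * np.exp(-disorder[1] * v**2)

true_disorder = np.array([0.12, 0.8])            # hidden ground truth
measured = simulate_current(voltages, true_disorder)
measured += rng.normal(0.0, 0.01, measured.shape)  # measurement noise

def reality_gap(disorder):
    # Mean-squared mismatch between measurement and simulation.
    return np.mean((measured - simulate_current(voltages, disorder)) ** 2)

# Crude grid search over candidate disorder profiles; the study
# used deep learning plus statistical methods instead of a grid.
candidates = [(a, b)
              for a in np.linspace(-0.5, 0.5, 41)
              for b in np.linspace(0.0, 2.0, 41)]
best = min(candidates, key=reality_gap)
print("inferred disorder:", best)
```

Because the current is measured at many voltage settings, only disorder values close to the true ones can explain all measurements at once, which is what constrains the inference.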

Associate Professor Ares added: "In the crazy golf analogy, it would be equivalent to placing a series of sensors along the tunnel, so that we could take measurements of the ball's speed at different points. 

“Although we still can't see inside the tunnel, we can use the data to inform better predictions of how the ball will behave when we take the shot."

Not only did the new model find suitable internal disorder profiles to describe the measured current values, but it was also able to accurately predict the voltage settings required for specific device operating regimes.

Crucially, the model provides a new method to quantify the variability between quantum devices. This could enable more accurate predictions of how devices will perform, and also help to engineer optimum materials for quantum devices. 

It could inform compensation approaches to mitigate the unwanted effects of material imperfections in quantum devices.

Co-author David Craig, a PhD student at the Department of Materials, University of Oxford, added, “Similar to how we cannot observe black holes directly, but we infer their presence from their effect on surrounding matter, we have used simple measurements as a proxy for the internal variability of nanoscale quantum devices. 

“Although the real device still has greater complexity than the model can capture, our study has demonstrated the utility of using physics-aware machine learning to narrow the reality gap.”

