A new paper from the theory team: Quantum Computing in High Energy Physics

Today the paper "Determining the proton content with a quantum computer", by Quantic member Adrián Pérez-Salinas in collaboration with Juan Cruz-Martinez, Abdulla A. Alhajri and Stefano Carrazza, came out (arXiv: 2011.13934).

In this paper they present a first attempt to design quantum circuits for the determination of the parton content of the proton through the estimation of parton distribution functions (PDFs), in the context of high energy physics (HEP). They identify architectures of variational quantum circuits suitable for PDF representation (qPDFs) and show experiments on the deployment of qPDFs on real quantum devices, taking current experimental limitations into consideration. Finally, they perform a global qPDF determination from LHC data, using quantum computer simulation on classical hardware, and compare the resulting partons and related phenomenological predictions involving hadronic processes to modern PDFs.

The quantum algorithm proposed here belongs to the class of variational quantum circuits, which rely on both quantum and classical resources.
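The hybrid quantum-classical idea can be sketched in a few lines: a (here, classically simulated) circuit produces an expectation value, and a classical optimizer tunes the circuit parameters to minimise a cost function. This is a minimal toy illustration of the variational loop, not the paper's actual circuit or cost function.

```python
# Minimal sketch of a variational quantum circuit loop (illustrative only):
# a classical optimizer tunes the parameter of a simulated one-qubit circuit
# so that its <Z> expectation value matches a toy target.
import numpy as np
from scipy.optimize import minimize

def ry(theta):
    """Single-qubit rotation about the y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expectation_z(theta):
    """Quantum part: prepare |0>, apply Ry(theta), measure <Z>."""
    state = ry(theta[0]) @ np.array([1.0, 0.0])
    probs = np.abs(state) ** 2
    return probs[0] - probs[1]  # <Z> = P(0) - P(1)

target = 0.25  # toy target value the circuit output should reproduce

def loss(theta):
    """Classical part: cost function fed to the classical optimizer."""
    return (expectation_z(theta) - target) ** 2

result = minimize(loss, x0=[0.1], method="Nelder-Mead")
print(expectation_z(result.x))  # converges towards 0.25
```

The same structure scales to many qubits and parameters: only the "quantum part" changes, while the classical optimizer loop stays the same.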

The final Ansatz chosen to build the quantum circuits that encode the qPDFs can be described through a simple layered structure.

There are two different kinds of single-qubit gates serving as building blocks of such circuits, giving rise to the Weighted and Fourier Ansätze.
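The two building blocks can be pictured as parametrised rotations whose angles depend on the momentum fraction x. The angle maps below are simplified assumptions chosen for illustration (the exact dependence is defined in the paper): the "Weighted"-style block uses angles linear in x, while the "Fourier"-style block uses angles linear in log x.

```python
# Sketch of two kinds of x-dependent single-qubit building blocks, stacked
# into a layered one-qubit circuit. The angle maps are illustrative
# assumptions, not the exact gates from the paper.
import numpy as np

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def rz(theta):
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

def weighted_gate(params, x):
    """'Weighted'-style block: rotation angles depend linearly on x."""
    a, b, c, d = params
    return rz(c * x + d) @ ry(a * x + b)

def fourier_gate(params, x):
    """'Fourier'-style block: angles linear in log(x), oscillatory in log x."""
    a, b, c, d = params
    return rz(c * np.log(x) + d) @ ry(a * np.log(x) + b)

def circuit_state(gate, params, x):
    """Layered structure: apply one parametrised block per layer."""
    state = np.array([1.0 + 0j, 0.0])
    for layer_params in params:
        state = gate(layer_params, x) @ state
    return state

rng = np.random.default_rng(0)
params = rng.uniform(-np.pi, np.pi, size=(3, 4))  # 3 layers, 4 params each
psi = circuit_state(weighted_gate, params, x=0.1)
psi_f = circuit_state(fourier_gate, params, x=0.1)
print(np.abs(psi) ** 2, np.abs(psi_f) ** 2)  # measurement probabilities at this x
```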

These Ansätze can be used with or without entangling gates to create single- and multi-flavour fits over the flavours entering the PDF determination.
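To give a rough picture of the multi-flavour idea, the sketch below assigns one qubit per flavour and interleaves single-qubit rotation layers with a ladder of CZ entangling gates. Both the qubit-to-flavour assignment and the gate choices here are illustrative assumptions, not the paper's exact circuit.

```python
# Hypothetical multi-flavour circuit: one qubit per flavour, rotation layers
# interleaved with CZ entangling gates between neighbouring qubits.
import numpy as np

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def layer_rotations(thetas):
    """Tensor product of one Ry rotation per qubit."""
    U = np.array([[1.0]])
    for t in thetas:
        U = np.kron(U, ry(t))
    return U

def cz_ladder(n):
    """Diagonal unitary: CZ between neighbours (q0-q1, q1-q2, ...)."""
    dim = 2 ** n
    U = np.eye(dim)
    for q in range(n - 1):
        for basis in range(dim):
            bits = [(basis >> (n - 1 - k)) & 1 for k in range(n)]
            if bits[q] == 1 and bits[q + 1] == 1:
                U[basis, basis] *= -1.0
    return U

n_flavours = 3  # e.g. one qubit per flavour (illustrative)
rng = np.random.default_rng(1)
thetas = rng.uniform(0, 2 * np.pi, size=(2, n_flavours))  # 2 layers

state = np.zeros(2 ** n_flavours)
state[0] = 1.0  # start in |000>
for layer in thetas:
    state = cz_ladder(n_flavours) @ layer_rotations(layer) @ state

print(np.abs(state) ** 2)  # joint probabilities over the flavour qubits
```

Without the `cz_ladder` calls the qubits stay in a product state and each flavour is fitted independently; the entangling gates are what allow correlations between flavours.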

Both Ansätze provide accurate results when fitting known PDFs, showing that the model is highly flexible even with few layers and parameters (in the table, the lower numbers correspond to the Weighted Ansatz).

 

Having verified that the model works, they then implemented it on actual quantum computers, using both the single- and multi-flavour models. For the single-flavour fit they used the IBMQ Athens processor.

For the multi-flavour fit, they used simulated versions of IBMQ Melbourne with controlled error rates. The results deteriorate rapidly in this case.

 

The third step of the work consists of using the quantum model to fit actual PDFs constrained by LHC data. To do so, they made use of the NNPDF methodology and replaced the neural networks (NNs) with the quantum model (run as a simulator). This procedure also provides satisfactory results for PDF fitting.
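Conceptually, swapping the neural network for the quantum model means the circuit output becomes the parametrisation whose parameters are tuned against data. The toy example below fits a simulated one-qubit circuit to invented pseudo-data by minimising a chi-squared-like loss; the model, data and loss are all illustrative stand-ins, not the NNPDF pipeline.

```python
# Toy illustration of replacing the NN parametrisation with a quantum model:
# fit a simulated one-qubit circuit output to pseudo-data points by
# minimising a chi^2-style loss. All data and functional forms are invented.
import numpy as np
from scipy.optimize import minimize

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def qpdf_model(params, x):
    """Circuit output standing in for a PDF value at momentum fraction x."""
    state = np.array([1.0, 0.0])
    for a, b in params.reshape(-1, 2):      # layered, x-dependent rotations
        state = ry(a * np.log(x) + b) @ state
    return state[1] ** 2                    # probability of |1> as prediction

# Invented pseudo-data: a smooth target curve with uniform uncertainties
xs = np.array([0.01, 0.05, 0.1, 0.3, 0.5])
data = 0.5 + 0.3 * np.tanh(np.log(xs) + 2)
sigma = np.full_like(xs, 0.02)

def chi2(params):
    preds = np.array([qpdf_model(params, x) for x in xs])
    return np.sum(((preds - data) / sigma) ** 2)

res = minimize(chi2, x0=np.zeros(6), method="Nelder-Mead",
               options={"maxiter": 5000})
print(chi2(res.x) / len(xs))  # chi^2 per data point after the fit
```

In the actual work this role is played by the full NNPDF fitting machinery, with the quantum circuit simulated on classical hardware.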

It also returns phenomenological predictions in agreement with state-of-the-art PDF fitting.

In conclusion, the qPDF model provides great flexibility and is capable of fitting PDFs constrained by LHC data. This may open the possibility of using quantum computing for this kind of computation. However, the current quality of quantum hardware prevents an immediate implementation, and the computational capabilities of this model are not yet enough to compete with modern high-performance NNPDF computations.