What is it about?
Quantum computers can solve problems that are too complex for classical computers. Classical computers are based on binary digits (bits) that represent either an “on” or an “off” state. Quantum computers, by contrast, use quantum bits (qubits), which can exist in a combination of both states at once. Current quantum computers are called noisy intermediate-scale quantum (NISQ) devices. They have roughly 50–100 qubits and are very sensitive to noise from their environment. While they are a milestone for quantum computing, they do not replace classical computers. Instead, the two types of computer work together in hybrid quantum-classical algorithms, in which a classical computer trains or optimizes parameterized quantum circuits (PQCs). Training PQCs is difficult, however, because of the vanishing gradient problem. This paper presents an initialization scheme based on a tensor network algorithm: the PQC is given optimal starting parameters so that training takes fewer steps to reach the target state.
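To make the hybrid setup concrete, here is a minimal sketch of such a training loop written with the PennyLane Python library. The circuit layout, cost function, and hyperparameters are illustrative assumptions, not the setup used in the paper.

```python
# Minimal hybrid quantum-classical loop (illustrative; not the paper's code).
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def pqc(params):
    # Parameterized quantum circuit (PQC): one layer of RY rotations
    # followed by nearest-neighbour entangling gates.
    for i in range(n_qubits):
        qml.RY(params[i], wires=i)
    for i in range(n_qubits - 1):
        qml.CNOT(wires=[i, i + 1])
    return qml.expval(qml.PauliZ(0))  # measured on the quantum device

# The classical computer treats the measured expectation value as a cost
# and updates the circuit's parameters by gradient descent.
opt = qml.GradientDescentOptimizer(stepsize=0.1)
params = np.array(np.random.uniform(0, 2 * np.pi, n_qubits), requires_grad=True)
for _ in range(50):
    params = opt.step(pqc, params)
```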
Why is it important?
Quantum computers can tackle optimization problems such as finding new materials or recognizing images. These problems are solved by finding parameter values that minimize a cost function, with the gradient as a guide. The gradient, or slope, tells the optimizer how the cost changes as the parameters change, and therefore in which direction to move them. But if the circuit starts from random parameters, the gradient can become too small for training to make any progress. In this method, a matrix product state (MPS), a kind of tensor network, is first trained on a classical computer to minimize the cost function. The PQC is then initialized from the optimized MPS. Starting from this informed point, the PQC reaches a better minimum with fewer gradient updates (see the toy example below).

KEY TAKEAWAY: Using a tensor network improves how quantum circuits are trained. With this method, the problem of vanishing gradients can be avoided, allowing better results on NISQ devices.
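The following toy example (plain Python/NumPy, invented for illustration and not taken from the paper) shows why the starting point matters: on a cost landscape with a flat plateau, a random start sits where the gradient has vanished and training stalls, while an informed start converges quickly.

```python
# Toy illustration of vanishing gradients and informed initialization
# (invented example; not the paper's code or cost function).
import numpy as np

rng = np.random.default_rng(0)
dim = 8
target = np.ones(dim)  # location of the minimum

def cost(x):
    # Stand-in cost with a flat "plateau" far from the minimum, mimicking
    # the vanishing-gradient landscape of a randomly initialized PQC.
    return 1.0 - np.exp(-0.5 * np.sum((x - target) ** 2))

def grad(x):
    return np.exp(-0.5 * np.sum((x - target) ** 2)) * (x - target)

def optimize(x, lr=0.5, tol=1e-6, max_steps=10_000):
    for step in range(max_steps):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # gradient has effectively vanished
            break
        x = x - lr * g               # gradient descent update
    return cost(x), step

random_start = rng.uniform(-10, 10, dim)  # random initialization: on the plateau
pretrained_start = target + 0.1 * rng.standard_normal(dim)  # informed start,
# standing in for parameters obtained from the classically trained MPS

print("random start:      cost %.4f after %d updates" % optimize(random_start))
print("pre-trained start: cost %.4f after %d updates" % optimize(pretrained_start))
```

Running this, the random start makes essentially no progress (it begins where the gradient is already below the tolerance), while the informed start reaches the minimum in a handful of updates.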
Read the Original
This page is a summary of: Matrix product state pre-training for quantum machine learning, Quantum Science and Technology, May 2022, Institute of Physics Publishing. DOI: 10.1088/2058-9565/ac7073.