Quantum algorithms for group convolution, cross-correlation, and equivariant transformations
Grecia Castelazo,
Quynh T. Nguyen,
Giacomo De Palma
et al.
Abstract: Group convolutions and cross-correlations, which are equivariant to the actions of group elements, are commonly used in mathematics to analyze or take advantage of symmetries inherent in a given problem setting. Here, we provide efficient quantum algorithms for performing linear group convolutions and cross-correlations on data stored as quantum states. Runtimes for our algorithms are logarithmic in the dimension of the group, thus offering an exponential speedup compared to classical algorithms when input data…
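To make the objects in the abstract concrete, here is a minimal classical sketch of group convolution and cross-correlation over the cyclic group Z_N (the simplest finite group; the paper's quantum algorithms handle these operations on amplitudes of quantum states, which this sketch does not attempt to model). The function names and the small example vectors are illustrative, not from the paper.

```python
import numpy as np

# Group convolution over Z_N: (f * g)(x) = sum_y f(y) g(y^{-1} x),
# where y^{-1} x = (x - y) mod N for the cyclic group.
def group_convolution(f, g):
    N = len(f)
    return np.array([sum(f[y] * g[(x - y) % N] for y in range(N))
                     for x in range(N)])

# Cross-correlation over Z_N: (f ⋆ g)(x) = sum_y f(y) g(y x),
# i.e. g evaluated at (y + x) mod N.
def cross_correlation(f, g):
    N = len(f)
    return np.array([sum(f[y] * g[(y + x) % N] for y in range(N))
                     for x in range(N)])

f = np.array([1.0, 2.0, 0.0, -1.0])
g = np.array([0.5, 0.0, 1.0, 0.0])

# Equivariance: translating f by s translates the convolution by s.
s = 2
assert np.allclose(group_convolution(np.roll(f, s), g),
                   np.roll(group_convolution(f, g), s))
```

The final assertion checks the equivariance property the abstract refers to: applying a group element (here, a cyclic shift) to the input and then convolving gives the same result as convolving first and then shifting.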
“…Although imposing invariance at the map level effectively results in global invariance of the model, this is quite restrictive. A more relaxed approach towards the construction of group-invariant models involves the concept of equivariance [7][8][9][88], which is now defined.…”
Quantum Machine Learning (QML) models aim to learn from data encoded in quantum states. Recently, it has been shown that models with little to no inductive biases (i.e., with no assumptions about the problem embedded in the model) are likely to have trainability and generalization issues, especially for large problem sizes. As such, it is fundamental to develop schemes that encode as much information as is available about the problem at hand. In this work we present a simple, yet powerful, framework where the underlying invariances in the data are used to build QML models that, by construction, respect those symmetries. These so-called group-invariant models produce outputs that remain invariant under the action of any element of the symmetry group G associated with the dataset. We present theoretical results underpinning the design of G-invariant models, and exemplify their application through several paradigmatic QML classification tasks, including cases when G is a continuous Lie group and also when it is a discrete symmetry group. Notably, our framework allows us to recover, in an elegant way, several well-known algorithms from the literature, as well as to discover new ones. Taken together, we expect that our results will help pave the way towards a more geometric and group-theoretic approach to QML model design.
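One classical way to obtain the G-invariance this abstract describes is symmetrization: average a model's output over the orbit of the group action. The sketch below demonstrates this for the cyclic shift group acting on input vectors; the helper `make_invariant` and the linear toy model are illustrative assumptions, not the paper's construction.

```python
import numpy as np

# Symmetrization: average a model's output over all group actions,
# so the result is invariant under any single group element.
def make_invariant(model, group_actions):
    def invariant_model(x):
        return np.mean([model(g(x)) for g in group_actions])
    return invariant_model

# A generic (non-invariant) linear toy model.
rng = np.random.default_rng(0)
w = rng.normal(size=4)
model = lambda x: float(w @ x)

# The cyclic group Z_4 acting on vectors by shifts.
G = [lambda x, s=s: np.roll(x, s) for s in range(4)]
f_inv = make_invariant(model, G)

x = np.array([1.0, -2.0, 3.0, 0.5])
# The symmetrized output is unchanged under every group element:
assert all(np.isclose(f_inv(np.roll(x, s)), f_inv(x)) for s in range(4))
```

For this linear model the averaged output collapses to (Σᵢ wᵢ)(Σⱼ xⱼ)/4, which makes the invariance explicit: only the shift-invariant part of the input survives.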
“…Finally, we note that the case of finite groups and regular representations (i.e., when the intermediate representations are chosen to be R_reg : G → C[G], corresponding to the group action on its own group algebra) has been studied in the classical literature under the name of homogeneous ENNs [14]. In this case, any equivariant map is a group convolution [11], which can be realized as a unitary operator embedding the classical convolution kernel by the quantum algorithms in [116]. Combining this with quantum algorithms for polynomial transformations of quantum states [117,118] allows one to quantize classical homogeneous ENNs.…”
Section: Intermediate Representations as Hyperparameters
Most currently used quantum neural network architectures have little-to-no inductive biases, leading to trainability and generalization issues. Inspired by a similar problem, recent breakthroughs in classical machine learning address this crux by creating models encoding the symmetries of the learning task. This is materialized through the usage of equivariant neural networks whose action commutes with that of the symmetry. In this work, we import these ideas to the quantum realm by presenting a general theoretical framework to understand, classify, design and implement equivariant quantum neural networks. As a special implementation, we show how standard quantum convolutional neural networks (QCNN) can be generalized to group-equivariant QCNNs where both the convolutional and pooling layers are equivariant under the relevant symmetry group. Our framework can be readily applied to virtually all areas of quantum machine learning, and provides hope to alleviate central challenges such as barren plateaus, poor local minima, and sample complexity.
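The defining property of the equivariant layers this abstract mentions is that the layer commutes with the symmetry action, while a final pooling step collapses to an invariant output. A minimal classical analogue, assuming cyclic shifts as the symmetry group (the kernel and sizes below are illustrative, and this does not model the quantum circuits of a QCNN):

```python
import numpy as np

# An equivariant layer: circular convolution commutes with cyclic shifts,
# and a pointwise nonlinearity (ReLU) preserves that equivariance.
def equivariant_layer(x, kernel):
    N = len(x)
    y = np.array([sum(kernel[k] * x[(i - k) % N] for k in range(len(kernel)))
                  for i in range(N)])
    return np.maximum(y, 0.0)

# Invariant pooling: summing over the whole orbit discards position.
def invariant_pool(x):
    return float(np.sum(x))

kernel = np.array([1.0, -0.5, 0.25])
x = np.array([0.3, 1.0, -0.7, 2.0, 0.0])

# Equivariance: shifting the input shifts the layer output identically,
for s in range(5):
    assert np.allclose(equivariant_layer(np.roll(x, s), kernel),
                       np.roll(equivariant_layer(x, kernel), s))
# and the pooled output is invariant under the group action.
assert all(np.isclose(invariant_pool(equivariant_layer(np.roll(x, s), kernel)),
                      invariant_pool(equivariant_layer(x, kernel)))
           for s in range(5))
```

The two assertions mirror the abstract's structure: equivariant convolutional layers stacked with pointwise nonlinearities, followed by pooling that yields a symmetry-invariant prediction.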
“…In fact, achieving a runtime logarithmic in N for dense and full-rank matrices is generally challenging unless there are specific symmetries or structures inherent in the matrix. For example, quantum algorithms have been developed to achieve polylogarithmic complexity in N for Toeplitz systems [6], Hankel matrices [7], and linear group convolutions [8], which all feature some inherent structure.…”
Kernel matrices, which arise from discretizing a kernel function k(x, x ), have a variety of applications in mathematics and engineering. Classically, the celebrated fast multipole method was designed to perform matrix multiplication on kernel matrices of dimension N in time almost linear in N by using techniques later generalized into the linear algebraic framework of hierarchical matrices. In light of this success, we propose a quantum algorithm for efficiently performing matrix operations on hierarchical matrices by implementing a quantum block-encoding of the hierarchical matrix structure. When applied to many kernel matrices, our quantum algorithm can solve quantum linear systems of dimension N in time O(κ polylog( N ε )), where κ and ε are the condition number and error bound of the matrix operation. This runtime is exponentially faster than any existing quantum algorithms for implementing dense kernel matrices. Finally, we discuss possible applications of our methodology in solving integral equations or accelerating computations in N-body problems.
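The kind of structure these fast algorithms exploit can be illustrated classically with the simplest case: a circulant matrix (the matrix of a cyclic group convolution, and a special Toeplitz matrix) is diagonalized by the discrete Fourier transform, so the linear system Cx = b can be solved with FFTs in O(N log N) rather than O(N³). The example values below are illustrative; this is a sketch of the structural idea, not of the quantum algorithm.

```python
import numpy as np

# A circulant matrix C with first column c satisfies C = F^{-1} diag(fft(c)) F,
# so Cx = b reduces to elementwise division in the Fourier domain.
def solve_circulant(c, b):
    eigs = np.fft.fft(c)  # eigenvalues of C
    return np.real(np.fft.ifft(np.fft.fft(b) / eigs))

c = np.array([4.0, 1.0, 0.0, 1.0])   # symmetric, well-conditioned circulant
b = np.array([1.0, 2.0, 3.0, 4.0])

x = solve_circulant(c, b)

# Verify against the explicit dense matrix, C[i][j] = c[(i - j) mod N].
C = np.array([[c[(i - j) % 4] for j in range(4)] for i in range(4)])
assert np.allclose(C @ x, b)
```

SciPy ships this idea as `scipy.linalg.solve_circulant`; hierarchical matrices generalize the theme by exploiting low-rank structure in off-diagonal blocks instead of exact diagonalizability.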