PennyLane is a Python 3 software framework for optimization and machine learning of quantum and hybrid quantum-classical computations. The library provides a unified architecture for near-term quantum computing devices, supporting both qubit and continuous-variable paradigms. PennyLane's core feature is the ability to compute gradients of variational quantum circuits in a way that is compatible with classical techniques such as backpropagation. PennyLane thus extends the automatic differentiation algorithms common in optimization and machine learning to include quantum and hybrid computations. A plugin system makes the framework compatible with any gate-based quantum simulator or hardware. We provide plugins for Strawberry Fields, Rigetti Forest, Qiskit, and ProjectQ, allowing PennyLane optimizations to be run on publicly accessible quantum devices provided by Rigetti and IBM Q. On the classical front, PennyLane interfaces with accelerated machine learning libraries such as TensorFlow, PyTorch, and autograd. PennyLane can be used for the optimization of variational quantum eigensolvers, quantum approximate optimization, quantum machine learning models, and many other applications.
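The hardware-compatible gradients described above rely on the parameter-shift rule: the derivative of a circuit expectation value with respect to a gate parameter is obtained from two evaluations of the same circuit at shifted parameter values. A minimal sketch of this idea for a single-qubit RX rotation, written in plain NumPy rather than PennyLane's own API:

```python
import numpy as np

# Pauli-X and identity for building the single-qubit RX gate, exp(-i*theta*X/2).
X = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def rx(theta):
    """RX rotation gate as a 2x2 unitary."""
    return np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * X

def expval_z(theta):
    """Expectation value <Z> after applying RX(theta) to |0>; equals cos(theta)."""
    psi = rx(theta) @ np.array([1.0, 0.0], dtype=complex)
    Z = np.diag([1.0, -1.0])
    return np.real(psi.conj() @ Z @ psi)

def parameter_shift_grad(theta):
    """Exact gradient from two shifted circuit evaluations, no finite differences."""
    s = np.pi / 2
    return 0.5 * (expval_z(theta + s) - expval_z(theta - s))
```

Because `<Z>` here is `cos(theta)`, the parameter-shift value agrees with the analytic derivative `-sin(theta)` to machine precision; the same two-evaluation recipe works on real hardware, where backpropagation through the circuit is impossible.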
For molecules with more than three atoms, it is difficult to fit or interpolate a potential energy surface (PES) from a small number of (usually ab initio) energies at points. Many methods have been proposed in recent decades, each claiming a set of advantages. Unfortunately, there are few comparative studies. In this paper, we compare neural networks (NNs) with Gaussian process (GP) regression. We re-fit an accurate PES of formaldehyde and compare PES errors on the entire point set used to solve the vibrational Schrödinger equation, i.e., the only error that matters in quantum dynamics calculations. We also compare the vibrational spectra computed on the underlying reference PES and the NN and GP potential surfaces. The NN and GP surfaces are constructed with exactly the same points, and the corresponding spectra are computed with the same points and the same basis. The GP fitting error is lower, and the GP spectrum is more accurate. The best NN fits to 625/1250/2500 symmetry-unique potential energy points have global PES root mean square errors (RMSEs) of 6.53/2.54/0.86 cm⁻¹, whereas the best GP surfaces have RMSE values of 3.87/1.13/0.62 cm⁻¹, respectively. When fitting 625 symmetry-unique points, the error in the first 100 vibrational levels is only 0.06 cm⁻¹ with the best GP fit, whereas the spectrum on the best NN PES has an error of 0.22 cm⁻¹, with respect to the spectrum computed on the reference PES. This error is reduced to about 0.01 cm⁻¹ when fitting 2500 points with either the NN or GP. We also find that the GP surface produces a relatively accurate spectrum when built from as few as 313 points.
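The GP regression compared above interpolates energies by placing a kernel on the training geometries and solving a linear system. A minimal NumPy sketch of this idea on a toy one-dimensional "potential" (a Morse-like curve standing in for ab initio energies; the kernel choice and hyperparameters here are illustrative, not those of the paper):

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between two point sets."""
    d2 = np.sum(X1**2, axis=1)[:, None] + np.sum(X2**2, axis=1)[None, :] \
         - 2.0 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_fit_predict(X_train, y_train, X_test, noise=1e-8):
    """GP posterior mean at X_test, given training energies y_train."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)     # weights for the kernel expansion
    return rbf_kernel(X_test, X_train) @ alpha

# Toy 1-D surface: 8 "ab initio" points on a Morse-like curve.
X_train = np.linspace(0.5, 3.0, 8)[:, None]
y_train = (1.0 - np.exp(-(X_train[:, 0] - 1.0)))**2
X_test = np.linspace(0.5, 3.0, 50)[:, None]
y_pred = gp_fit_predict(X_train, y_train, X_test)
```

With a tiny noise (jitter) term, the GP interpolates the training energies essentially exactly; its smoothness prior is what allows accurate surfaces from small point sets, as in the 313-point result quoted above.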
The mathematical representation of large data sets of electronic energies has seen substantial progress in the past 10 years. The so-called Permutationally Invariant Polynomial (PIP) representation is one established approach. This approach dates from 2003, when a global potential energy surface (PES) for CH₅⁺ was reported using a basis of polynomials that are invariant with respect to the 120 permutations of the five equivalent H atoms. More recently, several approaches from "machine learning" have been applied to fit these large data sets. Gaussian Process (GP) regression is such an approach. Here, we consider the implementation of the (full) GP due to Krems and co-workers, with a modification that renders it permutationally invariant, which we denote by PIP-GP. This modification uses the approach of Guo and co-workers, later extended by Zhang and co-workers, to achieve permutational invariance for neural-network fits. The PIP, GP, and PIP-GP approaches are applied to four case studies for fitting data sets of electronic energies: H₃O⁺, OCHCO⁺, and H₂CO/cis-HCOH/trans-HCOH, with the goal of assessing precision, accuracy in normal-mode analysis and barrier heights, and timings. We also report an application to (HCOOH)₂, where the full PIP approach is possible but where the PIP-GP one is not feasible. However, by replicating data, which is feasible in this case, the GP approach is able to represent the data with precision comparable to that of the PIP approach. We examine these assessments for varying sizes of data sets in each case to determine the dependence of properties of the fits on the training data size. We conclude with some comments on the different aspects of computational effort of the PIP, GP, and PIP-GP approaches, and also on challenges these methods face for more "rugged" PESs, exemplified here by H₂CO/cis-HCOH/trans-HCOH.
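A common way to render a GP permutationally invariant, in the spirit of the PIP-GP modification discussed above (a generic illustration, not necessarily the exact construction of Krems, Guo, or Zhang and co-workers), is to symmetrize the kernel by averaging over the permutations of identical atoms: k_sym(x, x') = (1/|G|) Σ_{g∈G} k(x, g·x'). A small NumPy sketch:

```python
import numpy as np
from itertools import permutations

def rbf(x1, x2, length_scale=1.0):
    """Base squared-exponential kernel on flattened Cartesian geometries."""
    return np.exp(-0.5 * np.sum((x1 - x2)**2) / length_scale**2)

def permuted(coords, perm, like_atoms):
    """Reorder the coordinate rows belonging to a set of identical atoms."""
    out = coords.copy()
    out[list(like_atoms)] = coords[[like_atoms[i] for i in perm]]
    return out

def sym_kernel(x1, x2, like_atoms):
    """Average the base kernel over all permutations of the identical atoms."""
    perms = list(permutations(range(len(like_atoms))))
    return np.mean([rbf(x1, permuted(x2, p, like_atoms)) for p in perms])

# Toy geometry: row 0 is a heavy atom; rows 1 and 2 are interchangeable H atoms.
x = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
x_swapped = x[[0, 2, 1]]   # exchange the two H atoms
```

The symmetrized kernel assigns identical similarity to `x` and `x_swapped`, so the resulting GP cannot distinguish relabelings of identical atoms. The cost grows with the size of the permutation group, which is why full symmetrization becomes infeasible for systems like CH₅⁺ (120 permutations) combined with large training sets.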
We propose a machine-learning approach based on Bayesian optimization to build global potential energy surfaces (PES) for reactive molecular systems using feedback from quantum scattering calculations. The method is designed to correct for the uncertainties of quantum chemistry calculations and yield potentials that reproduce the reaction probabilities accurately over a wide range of energies. These surfaces are obtained automatically and do not require manual fitting of the ab initio energies with analytical functions. The PES are built from a small number of ab initio points by an iterative process that incrementally samples the most relevant parts of the configuration space. Using the dynamical results of previous authors as targets, we show that such feedback loops produce accurate global PES with 30 ab initio energies for the three-dimensional H + H₂ → H₂ + H reaction and 290 ab initio energies for the six-dimensional OH + H₂ → H₂O + H reaction. These surfaces are obtained from 360 scattering calculations for H₃ and 600 scattering calculations for OH₃. We also introduce a method that quickly converges to an accurate PES without a priori knowledge of the dynamical results. By construction, our method identifies the lowest number of potential energy points (i.e., the minimum information) required for the non-parametric construction of global PES for quantum reactive scattering calculations.
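The iterative sampling idea above can be sketched in its simplest form: fit a GP to the current ab initio points, locate the geometry where the GP is most uncertain, compute one new energy there, and refit. The sketch below uses a toy one-dimensional surface and pure uncertainty sampling in place of the paper's scattering-calculation feedback, so it illustrates only the outer loop, not the full method:

```python
import numpy as np

def rbf(X1, X2, ell=0.5):
    """Squared-exponential kernel on 1-D inputs."""
    d2 = (X1[:, None] - X2[None, :])**2
    return np.exp(-0.5 * d2 / ell**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP posterior mean and variance on the grid Xs from data (X, y)."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(Xs, X)
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha
    v = np.linalg.solve(K, Ks.T)
    var = 1.0 - np.sum(Ks * v.T, axis=1)   # prior variance minus explained part
    return mean, var

def true_pes(r):
    """Toy Morse-like potential standing in for expensive ab initio energies."""
    return (1.0 - np.exp(-(r - 1.0)))**2

grid = np.linspace(0.5, 3.0, 200)
X = np.array([0.6, 2.9])                   # two initial "ab initio" points
y = true_pes(X)

for _ in range(10):                        # iterative sampling loop
    mean, var = gp_posterior(X, y, grid)
    r_new = grid[np.argmax(var)]           # most uncertain geometry
    X = np.append(X, r_new)
    y = np.append(y, true_pes(r_new))

mean, _ = gp_posterior(X, y, grid)
```

After ten iterations the twelve sampled points cover the interval where the surrogate was least certain, and the global fit error on the grid is small; in the actual method, the acquisition step is instead driven by the discrepancy between computed and target reaction probabilities.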
The accuracy of some density functional (DF) models, widely used in materials science, depends on empirical or free parameters that are commonly tuned using reference physical properties. The optimal values of the free parameters are regularly found using grid-search algorithms, whose computational complexity scales with the number of points in the grid. In this report, we illustrate that Bayesian optimization (BO), a sample-efficient machine learning algorithm, can efficiently calibrate different density functional models, e.g., hybrid exchange-correlation and range-separated density functionals. We show that BO can optimize the free parameters of hybrid exchange-correlation functionals with approximately 55 evaluations of the root-mean-square or mean-absolute error functions of the atomization energies and the bond lengths of the Gaussian-1 (G1) database. We also illustrate that BO can identify, without any prior information, the most appropriate exchange-correlation functional by navigating through the space of density functional models. We optimize and select the free parameters and the exchange-correlation functional form jointly, also by minimizing the root-mean-square error function with respect to the atomization energies of the G1 database using BO.