We develop a neuroevolution potential (NEP) framework for generating neural-network-based machine-learning potentials, which are trained using an evolutionary strategy for performing large-scale molecular dynamics (MD) simulations. A descriptor of the atomic environment is constructed based on Chebyshev and Legendre polynomials. The method is implemented for graphics processing units within the open-source GPUMD package, which can attain a computational speed of over 10^7 atom-steps per second on a single Nvidia Tesla V100. Furthermore, the per-atom heat current is available in NEP, which paves the way for efficient and accurate MD simulations of heat transport in materials with strong phonon anharmonicity or spatial disorder, which usually cannot be accurately treated either with traditional empirical potentials or with perturbative methods.
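To make the descriptor idea concrete, the following is a minimal sketch of a Chebyshev-based radial descriptor: each neighbor distance is mapped onto [-1, 1], expanded in Chebyshev polynomials, and damped by a smooth cutoff function so that contributions vanish at the cutoff radius. This is illustrative only; the exact NEP functional form, basis ordering, and cutoff function may differ.

```python
import numpy as np

def chebyshev_radial_descriptor(distances, n_max=8, r_cut=5.0):
    """Sketch of a Chebyshev-expansion radial descriptor (illustrative,
    not the exact NEP form). Sums T_n(x) * f_c(r) over all neighbors."""
    q = np.zeros(n_max + 1)
    for r in distances:
        if r >= r_cut:
            continue  # neighbors beyond the cutoff do not contribute
        x = 2.0 * r / r_cut - 1.0                      # map [0, r_cut] -> [-1, 1]
        fc = 0.5 * (1.0 + np.cos(np.pi * r / r_cut))   # smooth cutoff, fc(r_cut) = 0
        # T_0(x) .. T_{n_max}(x) in one call
        T = np.polynomial.chebyshev.chebvander(x, n_max)[0]
        q += fc * T
    return q
```

Because f_c and the polynomials are both smooth, the resulting descriptor varies continuously as atoms cross the cutoff sphere, which is essential for well-defined forces in MD.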
We present our latest advancements of machine-learned potentials (MLPs) based on the neuroevolution potential (NEP) framework introduced in [Fan et al., Phys. Rev. B 104, 104309 (2021)] and their implementation in the open-source package GPUMD. We increase the accuracy of NEP models both by improving the radial functions in the atomic-environment descriptor using a linear combination of Chebyshev basis functions and by extending the angular descriptor with some four-body and five-body contributions as in the atomic cluster expansion approach. We also detail our efficient implementation of the NEP approach in graphics processing units as well as our workflow for the construction of NEP models, and we demonstrate their application in large-scale atomistic simulations. By comparing to state-of-the-art MLPs, we show that the NEP approach not only achieves above-average accuracy but also is far more computationally efficient. These results demonstrate that the GPUMD package is a promising tool for solving challenging problems requiring highly accurate, large-scale atomistic simulations. To enable the construction of MLPs using a minimal training set, we propose an active-learning scheme based on the latent space of a pre-trained NEP model. Finally, we introduce three separate Python packages, GPYUMD, CALORINE, and PYNEP, which enable the integration of GPUMD into Python workflows.
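One common way to realize latent-space active learning of the kind described above is greedy farthest-point selection: candidate structures whose latent vectors lie farthest from the existing training set (and from already-selected candidates) are chosen for labeling. The sketch below illustrates this idea under that assumption; the actual NEP scheme may differ in its selection criterion.

```python
import numpy as np

def select_by_latent_distance(train_latent, pool_latent, n_select):
    """Greedy farthest-point selection in a latent space (illustrative).
    train_latent: (n_train, d) latent vectors of the current training set.
    pool_latent:  (n_pool, d) latent vectors of unlabeled candidates.
    Returns indices of the n_select most 'novel' candidates."""
    # distance of each pool point to its nearest training point
    d = np.min(
        np.linalg.norm(pool_latent[:, None, :] - train_latent[None, :, :], axis=-1),
        axis=1,
    )
    selected = []
    for _ in range(n_select):
        i = int(np.argmax(d))       # most distant candidate so far
        selected.append(i)
        # shrink distances to account for the newly selected point
        d = np.minimum(d, np.linalg.norm(pool_latent - pool_latent[i], axis=-1))
    return selected
```

The greedy update keeps the selected set spread out in latent space, so each new labeled structure adds information the model does not already have, which is why such schemes can keep the training set minimal.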
The oxidative stability of diacylglycerol (DAG)-enriched soybean oil and palm olein produced by partial hydrolysis using phospholipase A1 (Lecitase Ultra) and molecular distillation was investigated at 110°C by the Rancimat method with and without addition of synthetic antioxidants. Compared with triacylglycerol oils, the DAG-enriched oils displayed lower oxidative stability due to a higher content of unsaturated fatty acids and a lower level of tocopherols. With the addition (50-200 mg/kg) of tert-butylhydroquinone (TBHQ) or ascorbyl palmitate (AP), the oxidative stability indicated by the induction period (IP) of these DAG-enriched oils under the Rancimat conditions was improved. The IP of the diacylglycerol-enriched soybean oil increased from 4.21 ± 0.09 to 12.64 ± 0.42 h when 200 mg/kg of TBHQ was added, whereas the IP of the diacylglycerol-enriched palm olein increased from 5.35 ± 0.21 to 16.24 ± 0.55 h when the same level of AP was added. Addition of TBHQ, alone and in combination with AP, resulted in a significant (p ≤ 0.05) increase in the oxidative stability of diacylglycerol-enriched soybean oil. AP had a positive synergistic effect when used with TBHQ.