In pursuit of this goal, atomic-scale computer simulations have long been a central approach, and two major families of methods are routinely used today. On the one hand, there are quantum-mechanical simulations, in which we solve Schrödinger's equation for the electronic structure of molecular and periodic systems, most widely based on density-functional theory (DFT). [6-8] These methods provide (largely) reliable results for structural models of materials that normally contain a few tens or hundreds of atoms. State-of-the-art DFT methods can be applied to many material classes, and they are increasingly used for high-throughput screening and "in silico" (computer-based) design of materials: new compositions and previously unknown structures have been identified in DFT searches and subsequently experimentally realized. [5,9-11] On the other hand, interatomic potential models ("force fields"), which parameterize interactions between atoms with (relatively) simple functional forms, are widely used in materials science to describe matter in molecular dynamics (MD) simulations. These simulations grant access to larger time and length scales, reaching system sizes of up to hundreds of thousands of atoms. [12] In parameterizing these potentials, a certain physical form of the atomic interactions is assumed, often in terms of bond distances, angles, and so on, and physical properties such as equilibrium lattice parameters or elastic constants enter the fitting of the potential. For this reason, such potentials are often called "empirical." They are several orders of magnitude faster than DFT, but necessarily less accurate and less easily transferable.

In this Progress Report, we highlight recent developments in "machine-learned" interatomic potentials, which represent a rapidly growing field that promises to do away with the aforementioned trade-offs.
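The "simple functional forms" of empirical potentials can be illustrated with the classic Lennard-Jones 12-6 pair potential. The sketch below (an illustration, not taken from the text) evaluates the pair energy and sums it over all unique atom pairs, as a minimal pair-additive force field would:

```python
import math

def lj_pair_energy(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones 12-6 pair energy at separation r."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def total_energy(positions, epsilon=1.0, sigma=1.0):
    """Total energy: sum of pair energies over all unique atom pairs."""
    e = 0.0
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            e += lj_pair_energy(math.dist(positions[i], positions[j]),
                                epsilon, sigma)
    return e

# The potential minimum lies at r = 2**(1/6) * sigma, with energy -epsilon:
dimer = [(0.0, 0.0, 0.0), (2.0 ** (1.0 / 6.0), 0.0, 0.0)]
print(round(total_energy(dimer), 6))  # -1.0
```

Because every pair interaction is a cheap closed-form expression, such models scale to very large systems; the trade-off, as noted above, is that the fixed functional form limits accuracy and transferability.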
Over the last year, there has been a surge of interest in machine learning (ML) methodology: part of it is due to the dramatic growth of ML throughout the scientific disciplines, and part of it is due to tangible success stories of ML-based interatomic potentials that are now beginning to emerge. We will argue that this is an exciting development with very practical implications, currently on the verge of moving from a somewhat specialized new technology to everyday applicability, poised to enhance and complement the communities' existing strengths in computational materials modeling. We will show selected applications of ML potentials to problems in materials science, discuss the current limitations (and possible pitfalls), and outline what we expect to be interesting directions for the development of the field in the coming years.

Atomic-scale modeling and understanding of materials have made remarkable progress, but they are still fundamentally limited by the large computational cost of explicit electronic-structure methods such as density-functional theory. This Progress Report shows how machine learning (ML) is currently enabling a new degree of realism in materi...
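The core idea behind ML potentials is to replace a fixed functional form with a flexible regression model fitted to reference energies. The following toy example (not the actual GAP/DFT machinery discussed in this report) fits a kernel ridge regression model to a hypothetical one-dimensional "descriptor", the pair distance in a Morse-like potential; real ML potentials instead use many-body descriptors of local atomic environments:

```python
import numpy as np

def kernel(a, b, length_scale=0.3):
    """Squared-exponential (Gaussian) kernel between descriptor values."""
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * length_scale ** 2))

def fit(x_train, y_train, reg=1e-8):
    """Kernel ridge regression: solve (K + reg*I) alpha = y for the weights."""
    K = kernel(x_train, x_train)
    return np.linalg.solve(K + reg * np.eye(len(x_train)), y_train)

def predict(x_new, x_train, alpha):
    """Predicted energy is a weighted sum of kernels to the training data."""
    return kernel(x_new, x_train) @ alpha

# Hypothetical training data: a Morse-like pair energy sampled on a grid.
x_train = np.linspace(0.8, 3.0, 20)
y_train = (1.0 - np.exp(-2.0 * (x_train - 1.2))) ** 2 - 1.0

alpha = fit(x_train, y_train)
e_min = predict(np.array([1.2]), x_train, alpha)[0]
print(abs(e_min + 1.0) < 0.05)  # the fit recovers the minimum, close to -1
```

The regression makes no assumption about the shape of the interaction, which is what allows ML potentials to approach the accuracy of the reference data; the price is that they are only trustworthy within the region covered by training configurations.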
Phosphorus, first discovered in the seventeenth century, played an important role in Lavoisier's definition of the term "element" and thus shaped the beginning of the era of modern chemistry. It was first isolated in its most unstable crystalline modification, white phosphorus. Today, a variety of experimentally proven allotropes are known. The most common allotropes, such as black, violet, and fibrous phosphorus, are described here with respect to their synthesis, crystal structures, and thermal and thermodynamic properties. In addition, more than 50 crystalline allotropes have been predicted, and their stabilities have been estimated using quantum-chemical methods; this makes phosphorus one of the most structurally variable elements of the periodic table. In this article, some of the most reasonable and sophisticated calculations are presented. The applications of elemental phosphorus are mainly connected with its semiconducting properties; the development of such applications is therefore strongly tied to new synthesis methods for the direct preparation of individual, phase-pure allotropic forms of phosphorus. The past decade has supplied basic results on the formation of black phosphorus and other modifications, primarily using the mineralizer concept. Related to graphene and other two-dimensional, layered structures, phosphorene is attracting rapidly growing interest. The pertinent modifications are characterized by a corrugated arrangement of six-membered P rings, for which both the boat and the chair conformation are known. Applications of phosphorene are evolving rapidly: it is already used in manifold ways, including as a sensor, optical device, transistor, energy-conversion material, and supercapacitor material.
such as "ab initio" molecular dynamics (AIMD). ML-based interatomic potentials, therefore, are beginning to be applied to a range of challenging materials-science research questions, such as the modeling of phase-change memory materials, [12-14] catalysts, [15] or battery materials. [16] Recently, a number of "general-purpose" ML potentials have been reported, which can accurately describe a broad range of atomic configurations and materials properties, including silicon, [17] carbon, [18] aluminum, [19,20] and the binary Ga-As system. [21] The hope for such potentials is to enable "off-the-shelf" use without further modification: for example, the aforementioned silicon ML potential has been used to study complex structural transitions under pressure [22] or unusual mechanical properties of amorphous silicon (a-Si). [23] The starting point for the present study is a general-purpose Gaussian approximation potential (GAP) ML model for bulk and nanostructured phosphorus, which was shown to be flexible enough to be applicable to the pressure-induced liquid-liquid phase transition from the molecular fluid to the network liquid, whilst also accurately describing the crystalline allotropes and the layered structure of phosphorene. [24] This GAP is now set to facilitate even more challenging studies on more extended length or time scales, and the exploration of other structurally complex phases for which it has not been explicitly "trained", such as amorphous phosphorus (a-P).

Research interest in a-P has grown because of emerging applications in batteries. [25-28] As a commercially available anode material, red phosphorus provides a large cation-storage ability with high theoretical capacities by forming binary X-P compounds (X = Li, Na, K), but it suffers from low conductivity and a large volumetric change during cycling. [29] As discussed in ref. [30], these issues can be ameliorated by creating composites of a-P and carbonaceous materials: on the one hand, increasing electronic conductivity; [31,32] on the other hand, minimizing the mechanical stress induced by volume changes. [30,33] The structure and properties of phosphorus itself are clearly important for battery applications: for example, experimental work showed that the size of a-P particles in phosphorus-carbon composite anodes has an effect on the electrochemical performance, [34] and in situ transmission electron microscopy (TEM) revealed images of red phosphorus segments within a carbon nanofiber. [35] Computationally, structurally ordered phases have been studied with density-functional theory (DFT).

Amorphous phosphorus (a-P) has long attracted interest because of its complex atomic structure, and more recently as an anode material for batteries. However, accurately describing and understanding a-P at the atomistic level remains a challenge. Here, it is shown that large-scale molecular-dynamics simulations, enabled by a machine-learning (ML)-based interatomic potential for phosphorus, can give new insights into the atomic structure of a-P and how...
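The "high theoretical capacities" of phosphorus anodes can be checked with a short back-of-the-envelope calculation (not taken from the text), assuming full conversion of phosphorus to X3P so that three electrons are transferred per P atom:

```python
# Theoretical gravimetric capacity of phosphorus, assuming full conversion
# to X3P (X = Li, Na, K): Q = z * F / (3.6 * M_P), in mAh per gram of P.
# Note that the capacity per gram of phosphorus is the same for all three
# cations, since z = 3 in each case.
F = 96485.0    # Faraday constant in C/mol
M_P = 30.974   # molar mass of phosphorus in g/mol
z = 3          # electrons transferred per phosphorus atom for X3P

capacity_mAh_per_g = z * F / (3.6 * M_P)
print(round(capacity_mAh_per_g))  # 2596
```

This value of roughly 2600 mAh per gram of phosphorus is what makes the material attractive as an anode despite the conductivity and volume-change problems described above.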