2019
DOI: 10.1103/physrevlett.122.028001
Heterogeneous Activation, Local Structure, and Softness in Supercooled Colloidal Liquids

Abstract: We experimentally characterize heterogeneous nonexponential relaxation in bidisperse supercooled colloidal liquids utilizing a recent concept called "softness" [Phys. Rev. Lett. 114, 108001 (2015)]. Particle trajectory and structure data enable classification of particles into subgroups with different local environments and propensities to hop. We determine residence times, tR, between particle hops and show that tR derived from particles in the same softness subgroup are exponentially distributed. Using the me…
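The abstract's central check can be sketched in a few lines: group residence times tR by a softness label and test whether each subgroup looks exponentially distributed. This is a minimal illustration with synthetic data, not the paper's analysis pipeline; the subgroup names, rates, and the mean-versus-std check are all assumptions made for the example.

```python
# Sketch (assumed, not from the paper): residence times grouped by a
# hypothetical softness label, each drawn from an exponential distribution.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic residence times t_R for two softness subgroups with different rates.
t_soft = rng.exponential(scale=1.0, size=5000)   # "soft" particles hop often
t_hard = rng.exponential(scale=5.0, size=5000)   # "hard" particles hop rarely

def mle_rate(times):
    """Maximum-likelihood hop rate (1/mean) for exponential residence times."""
    return 1.0 / np.mean(times)

for label, t in [("soft", t_soft), ("hard", t_hard)]:
    # For an exponential distribution, mean == std; comparing them is a
    # quick consistency check before any formal goodness-of-fit test.
    print(f"{label}: rate={mle_rate(t):.2f}, "
          f"mean={np.mean(t):.2f}, std={np.std(t):.2f}")
```

A mixture of the two subgroups would instead show std > mean, which is one way nonexponential relaxation can emerge from exponential subpopulations.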

Cited by 50 publications (34 citation statements). References 47 publications.
“…The aim of our method is to identify such communities using statistical inference, with minimal prior assumptions on the nature of the local order. Our method is unsupervised and uses only structural information; this is distinct from other inference or machine learning approaches that learn about dynamically-active regions or soft spots using training data sets [24][25][26][27][28].…”
Section: A. Overview and Motivation (mentioning)
confidence: 99%
“…The spatial distribution of soft modes [22,23] correlates well with the local dynamics, at least on time scales shorter than the structural relaxation time, but normal modes obviously contain richer information than the bare structure, since they account for local variations of the energy function. Machine learning techniques have also been used to identify structural defects related to localized excitations in supercooled liquids, as well as plastic events in glasses [24][25][26][27][28]. While promising, supervised approaches still need input dynamic data to identify relevant structural features.…”
Section: Introduction (mentioning)
confidence: 99%
“…Attempts from a purely structural perspective (i.e., with knowledge of only the atomic positions) have long been frustrated due to the lack of representations to sufficiently encode the structural heterogeneity. Recently, researchers have made notable progress by combining symmetry functions as structural representations with machine learning (ML) to establish predictive models for the plasticity and dynamics of various disordered solids and liquids [24][25][26][27] .…”
Section: Introduction (mentioning)
confidence: 99%
“…A major advantage is that they can be considered as quite complete, and can successfully distinguish many different types of environments [24][25][26][27] . However, the complex and nonintuitive transformations, especially for the angular functions, make it more challenging to extract scientific insights from ML models employing symmetry functions.…”
Section: Introduction (mentioning)
confidence: 99%
“…The remaining free parameter τ 0 (time between random walk attempts) can be interpreted as the time scale for structure decorrelation in the molecule/cage system. The connection between structural relaxation, diffusion and viscosity is the subject of ongoing research [33], and fully unravelling the underlying mechanisms goes far beyond the scope of this work. Nevertheless, as a starting point we consider the time autocorrelation function e(0)·e(t) of the molecules' end-to-end vector orientation e(t) (see Fig.…”
(mentioning)
confidence: 99%
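The orientation autocorrelation ⟨e(0)·e(t)⟩ quoted above is straightforward to compute from a trajectory of unit vectors. A minimal sketch follows; the synthetic rotational-diffusion trajectory and the time-origin averaging convention are assumptions for illustration, not details from the cited work.

```python
# Sketch: orientation autocorrelation C(t) = <e(s) . e(s+t)>, averaged over
# time origins s, for a synthetic trajectory of unit end-to-end vectors.
import numpy as np

def orientation_autocorrelation(e, max_lag):
    """C(t) for an (T, 3) array of unit vectors e at successive frames."""
    T = len(e)
    return np.array([
        np.mean(np.sum(e[: T - lag] * e[lag:], axis=1)) for lag in range(max_lag)
    ])

# Synthetic rotational diffusion: small random kicks, renormalized each step.
rng = np.random.default_rng(1)
e = np.empty((2000, 3))
e[0] = [0.0, 0.0, 1.0]
for i in range(1, len(e)):
    v = e[i - 1] + 0.05 * rng.normal(size=3)  # small random perturbation
    e[i] = v / np.linalg.norm(v)              # keep unit length

C = orientation_autocorrelation(e, max_lag=200)
print(C[0])  # C(0) = 1 exactly, since the vectors are unit length
```

For ideal rotational diffusion this C(t) decays exponentially, which is why it serves as a simple probe of structural decorrelation time scales.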