Newly synthesized perovskite NaOsO3 shows Curie-Weiss metallic behavior at high temperature and, on cooling, abruptly enters an antiferromagnetic insulating state at 410 K. The electronic specific heat vanishes in the low-temperature limit, indicating that the band gap opens fully. In situ electron microscopy observations detected no lattice anomalies in the vicinity of the transition temperature. It is therefore most likely that antiferromagnetic correlations play an essential role in the gap opening.
We report powder and single-crystal neutron diffraction measurements of the magnetic order in AMnBi2 (A = Sr and Ca), two layered manganese pnictides with anisotropic Dirac fermions on a Bi square net. Both materials are found to order at TN ≈ 300 K in k = 0 antiferromagnetic structures, with ordered Mn moments at T = 10 K of approximately 3.8 µB aligned along the c axis. The magnetic structures are Néel-type within the Mn-Bi layers, but the inter-layer ordering differs, being antiferromagnetic in SrMnBi2 and ferromagnetic in CaMnBi2. This allows a mean-field coupling of the magnetic order to Bi electrons in CaMnBi2 but not in SrMnBi2. We find clear evidence that magnetic order influences electrical transport. First-principles calculations explain the experimental observations and suggest that the mechanism for the different inter-layer ordering in the two compounds is the competition between antiferromagnetic superexchange and ferromagnetic double exchange carried by itinerant Bi electrons.
Deep learning techniques have boosted the performance of hyperspectral image (HSI) classification. In particular, convolutional neural networks (CNNs) have shown superior performance to that of conventional machine learning algorithms. Recently, a novel type of neural network called the capsule network (CapsNet) was introduced to improve upon state-of-the-art CNNs. In this paper, we present a modified two-layer CapsNet for HSI classification with limited training samples, inspired by the comparability and simplicity of shallower deep learning models. The presented CapsNet is trained on two real HSI datasets, i.e., the PaviaU (PU) and SalinasA datasets, representing complex and simple datasets, respectively, which are used to investigate the robustness and representational capacity of each model or classifier. In addition, a comparable paradigm of network architecture design is proposed for the comparison of CNN and CapsNet. Experiments demonstrate that CapsNet shows better accuracy and convergence behavior on the complex data than the state-of-the-art CNN. For CapsNet on the PU dataset, the Kappa coefficient, overall accuracy, and average accuracy are 0.9456, 95.90%, and 96.27%, respectively, compared to the corresponding values of 0.9345, 95.11%, and 95.63% yielded by the CNN. Moreover, we observed that CapsNet assigns much higher confidence to its predicted probabilities. This finding is analyzed and discussed with probability maps and an uncertainty analysis. In the context of the existing literature, CapsNet provides promising results and explicit merits in comparison with the CNN and two baseline classifiers, i.e., random forests (RFs) and support vector machines (SVMs).
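A defining ingredient of a CapsNet, as opposed to a CNN, is that each capsule outputs a vector whose length encodes the probability that an entity is present. This is enforced by the "squash" nonlinearity of Sabour et al. (2017). A minimal NumPy sketch follows; the function name, `axis` convention, and `eps` stabilizer are illustrative, not taken from the paper's implementation:

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    """Squash nonlinearity: v = (|s|^2 / (1 + |s|^2)) * (s / |s|).
    Short vectors are shrunk toward zero length, long vectors toward
    unit length, so |v| can be read as a presence probability."""
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    scale = sq_norm / (1.0 + sq_norm)
    return scale * s / np.sqrt(sq_norm + eps)
```

For example, an input capsule vector of length 5 is squashed to length 25/26 ≈ 0.96, while a vector of length 0.1 is squashed to roughly 0.01, which is one reason capsule outputs behave like calibrated confidences.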