Prediction of Plant Uptake and Translocation of Engineered Metallic Nanoparticles by Machine Learning

Author(s): Xiaoxuan Wang, Liwei Liu, Weilan Zhang, Xingmao Ma
Nano Futures, 2020, Vol 4 (3), pp. 035003
Author(s): Amanda S Barnard, George Opletal
2019, Vol 4 (1), pp. 1654919
Author(s): Claudio Zeni, Kevin Rossi, Aldo Glielmo, Francesca Baletto
2020, Vol 698, pp. 133999
Author(s): Majid Bagheri, Khalid Al-jabery, Donald Wunsch, Joel G. Burken
2021, Vol 11 (1)
Author(s): Kihoon Bang, Byung Chul Yeo, Donghun Kim, Sang Soo Han, Hyuck Mo Lee

Abstract Within first-principles density functional theory (DFT) frameworks, it is challenging to predict the electronic structures of nanoparticles (NPs) both accurately and quickly. Herein, a machine-learning architecture is proposed to rapidly yet reasonably predict the electronic density of states (DOS) patterns of metallic NPs via a combination of principal component analysis (PCA) and the crystal graph convolutional neural network (CGCNN). With PCA, a mathematically high-dimensional DOS image can be converted into a low-dimensional vector. The CGCNN plays a key role in reflecting the effects of local atomic structures on the DOS patterns of NPs using only a few material features that are easily extracted from the periodic table. The PCA-CGCNN model is applicable to all pure and bimetallic NPs and requires only a handful of DOS training sets that are easily obtained with typical DFT methods. On the test sets, the PCA-CGCNN model achieves R2 values of 0.85 or higher for pure Au NPs and 0.77 or higher for Au@Pt core@shell bimetallic NPs. Although the PCA-CGCNN method shows a small loss of accuracy compared with DFT calculations, the prediction takes only ~160 s irrespective of NP size; for example, it is 13,000 times faster than the DFT method for Pt147. Our approach not only can be immediately applied to predict the electronic structures of actual nanometer-scale NPs to be experimentally synthesized, but can also be used to explore correlations between atomic structures and other spectral image data of materials (e.g., X-ray diffraction, X-ray photoelectron spectroscopy, and Raman spectroscopy).
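A minimal sketch of the PCA compression step described in the abstract above, assuming synthetic DOS spectra stored as rows of a NumPy array (the array sizes, number of components, and data are illustrative assumptions, not taken from the paper):

```python
# Sketch of the dimensionality-reduction step (assumption: placeholder DOS
# spectra; the paper's actual data, energy grid, and component count differ).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_nanoparticles, n_energy_points = 200, 2000            # hypothetical sizes
dos_spectra = rng.random((n_nanoparticles, n_energy_points))  # placeholder DOS curves

# Compress each high-dimensional DOS pattern into a short coefficient vector;
# a graph model such as a CGCNN would then be trained to predict these
# coefficients from the local atomic structure.
pca = PCA(n_components=20)
dos_coefficients = pca.fit_transform(dos_spectra)        # shape: (200, 20)

# A predicted coefficient vector can be mapped back to a full DOS curve.
reconstructed_dos = pca.inverse_transform(dos_coefficients)
print(dos_coefficients.shape, reconstructed_dos.shape)
print("explained variance:", pca.explained_variance_ratio_.sum())
```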


2020
Author(s): Amanda Barnard, George Opletal

The outcome of machine learning is influenced by the features used to describe the data, and various metrics are used to measure model performance. In this study we use five different feature sets to describe the same 4000 gold nanoparticles, and 14 different machine learning methods to compare a total of 70 high-scoring models. We then use classification and regression to show which meta-features of data sets or machine learning algorithms are important when making a selection. We find that the number of features, and how strongly they are correlated, determines the class of model that should be used, but that overall quality is almost entirely determined by the cross-validation score, regardless of the sophistication of the algorithm.
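A minimal sketch of the kind of cross-validation-based model comparison described above, using scikit-learn with generated data (the estimators, data set, and scores are illustrative assumptions and do not reproduce the study's feature sets or its 14 methods):

```python
# Sketch of ranking candidate models by cross-validation score (illustrative
# data and estimators only; not the study's actual 70 models).
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=400, n_features=30, noise=0.1, random_state=0)

models = {
    "ridge": Ridge(alpha=1.0),
    "random_forest": RandomForestRegressor(n_estimators=100, random_state=0),
}

# Rank candidate models by mean cross-validation R^2, the metric the study
# finds to dominate overall model quality.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean CV R^2 = {scores.mean():.3f} ± {scores.std():.3f}")
```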


Author(s): Yuewei Lin, Mehmet Topsakal, Janis Timoshenko, Deyu Lu, Shinjae Yoo, ...


2020, Vol 43
Author(s): Myrthe Faber

Abstract Gilead et al. state that abstraction supports mental travel, and that mental travel critically relies on abstraction. I propose an important addition to this theoretical framework, namely that mental travel might also support abstraction. Specifically, I argue that spontaneous mental travel (mind wandering), much like data augmentation in machine learning, provides variability in mental content and context necessary for abstraction.

