Materials Informatics for 2D Materials Combined with Sparse Modeling and Chemical Perspective: Toward Small-Data-Driven Chemistry and Materials Science

Author(s): Yuya Oaki, Yasuhiko Igarashi

2022
Author(s): Yuri Haraguchi, Yasuhiko Igarashi, Hiroaki Imai, Yuya Oaki

Data-scientific approaches have permeated chemistry and materials science. In general, however, these approaches are not easily applied to small data, such as experimental data collected in laboratories. Our group has focused...
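The sparse-modeling idea behind this small-data approach can be illustrated with a short sketch. The block below is not the authors' implementation; it is a minimal Lasso regression solved by iterative soft-thresholding (ISTA), with synthetic data and descriptor counts chosen purely for illustration:

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding: the proximal operator of the L1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam=0.1, n_iter=500):
    """Sparse linear regression (Lasso) via iterative soft-thresholding.

    The L1 penalty drives most coefficients exactly to zero, selecting a
    small set of descriptors - the property that makes sparse modeling
    usable on small laboratory datasets.
    """
    n, d = X.shape
    lr = 0.9 * n / np.linalg.norm(X, 2) ** 2  # step size below 1/Lipschitz
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n          # gradient of the squared loss
        w = soft_threshold(w - lr * grad, lr * lam)
    return w

rng = np.random.default_rng(0)
n, d = 30, 10                                  # small-data regime: 30 samples
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:2] = [1.5, -2.0]                       # only 2 of 10 descriptors matter
y = X @ w_true + 0.05 * rng.normal(size=n)

w_hat = lasso_ista(X, y)
print(np.round(w_hat, 3))                      # most entries shrink to zero
```

The recovered weight vector keeps only the few descriptors that actually drive the response, which is why sparse models remain interpretable even with very few training samples.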


Nanoscale, 2021, Vol 13 (6), pp. 3853-3859
Author(s): Ryosuke Mizuguchi, Yasuhiko Igarashi, Hiroaki Imai, Yuya Oaki

Lateral sizes of exfoliated transition-metal-oxide nanosheets were predicted and controlled with the assistance of machine learning.


2021, Vol 11 (1)
Author(s): Dipendra Jha, Vishu Gupta, Logan Ward, Zijiang Yang, Christopher Wolverton, ...

Abstract
The application of machine learning (ML) techniques in materials science has attracted significant attention in recent years, due to their impressive ability to efficiently extract data-driven linkages from various input materials representations to their output properties. While the application of traditional ML techniques has become quite ubiquitous, there have been limited applications of more advanced deep learning (DL) techniques, primarily because big materials datasets are relatively rare. Given the demonstrated potential and advantages of DL and the increasing availability of big materials datasets, it is attractive to go for deeper neural networks in a bid to boost model performance, but in reality, it leads to performance degradation due to the vanishing gradient problem. In this paper, we address the question of how to enable deeper learning for cases where big materials data is available. Here, we present a general deep learning framework based on Individual Residual learning (IRNet) composed of very deep neural networks that can work with any vector-based materials representation as input to build accurate property prediction models. We find that the proposed IRNet models can not only successfully alleviate the vanishing gradient problem and enable deeper learning, but also lead to significantly (up to 47%) better model accuracy as compared to plain deep neural networks and traditional ML techniques for a given input materials representation in the presence of big data.
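The residual-learning mechanism the abstract describes can be sketched in a few lines. The block below is a minimal illustration of identity skip connections in a fully connected stack, not the authors' IRNet architecture; the layer count, width, and weight scale are assumptions for demonstration:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_mlp_forward(x, weights):
    """Forward pass through a stack of residual blocks.

    Each block computes relu(W @ h) and adds the identity shortcut h,
    so signal (and, during training, gradients) can flow through the
    skip path even in very deep networks - the mechanism that
    alleviates the vanishing gradient problem.
    """
    h = x
    for W in weights:
        h = relu(W @ h) + h   # individual residual (skip) connection
    return h

rng = np.random.default_rng(0)
dim = 8
# 48 layers: far deeper than a plain MLP of this width could train reliably
weights = [rng.normal(scale=0.1, size=(dim, dim)) for _ in range(48)]
x = rng.normal(size=dim)
y = residual_mlp_forward(x, weights)
print(y.shape)   # the representation keeps its shape through all 48 blocks
```

Without the `+ h` shortcut, repeated multiplication by small-scale weights would drive the activations (and gradients) toward zero as depth grows; the skip path preserves an identity component at every layer.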


Nanoscale, 2021
Author(s): Hongping Zhang, Run Zhang, Chenghua Sun, Yan Jiao, Yaping Zhang

Electrochemical carbon dioxide reduction (CRR) to fuels is one of the significant challenges in materials science and chemistry. Recently, single-metal-atom catalysts based on 2D materials have provided a promising...


Author(s): He Tan, Vladimir Tarasov, Vasileios Fourlakidis, Attila Dioszegi

For many industries, an understanding of the fatigue behavior of cast iron is important, yet the topic remains under extensive research in materials science. This paper offers fuzzy logic as a data-driven approach to the challenge of predicting casting performance. Data scarcity, however, is an issue when applying data-driven approaches in this field, and the presented study tackles this problem. Four fuzzy logic systems were constructed and compared: two based solely on experimental data, and two combining the same experimental data with data drawn from the relevant literature. The study showed that the latter achieved higher accuracy in predicting the ultimate tensile strength of cast iron.
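The fuzzy-logic approach can be sketched with a toy Mamdani-style system. Everything below is an illustrative assumption, not the paper's actual system: the single input feature, the membership function shapes, the two rules, and the 200-400 MPa output range are all invented for demonstration:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def predict_uts(pearlite_fraction):
    """Toy Mamdani fuzzy system mapping a microstructure feature to an
    ultimate tensile strength (UTS) estimate in MPa."""
    # 1. Fuzzify the input: degrees of 'low' and 'high' pearlite content
    low = tri(pearlite_fraction, -0.5, 0.0, 1.0)
    high = tri(pearlite_fraction, 0.0, 1.0, 1.5)

    # 2. Apply the rules: clip each output set at its rule's firing strength
    uts = np.linspace(200.0, 400.0, 201)                # candidate UTS values
    weak = np.minimum(tri(uts, 150, 200, 300), low)     # low pearlite -> weak
    strong = np.minimum(tri(uts, 300, 400, 450), high)  # high pearlite -> strong

    # 3. Aggregate the rule outputs and defuzzify with the centroid
    aggregated = np.maximum(weak, strong)
    return float(np.sum(uts * aggregated) / np.sum(aggregated))

print(round(predict_uts(0.2)), round(predict_uts(0.8)))
```

The fuzzify / infer / defuzzify pipeline is what lets such a system blend experimental measurements with rules distilled from the literature, which is the combination the study found most accurate.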


2021, Vol 12 (1)
Author(s): Vishu Gupta, Kamal Choudhary, Francesca Tavazza, Carelyn Campbell, Wei-keng Liao, ...

Abstract
Artificial intelligence (AI) and machine learning (ML) have been increasingly used in materials science to build predictive models and accelerate discovery. For selected properties, availability of large databases has also facilitated application of deep learning (DL) and transfer learning (TL). However, unavailability of large datasets for a majority of properties prohibits widespread application of DL/TL. We present a cross-property deep-transfer-learning framework that leverages models trained on large datasets to build models on small datasets of different properties. We test the proposed framework on 39 computational and two experimental datasets and find that the TL models with only elemental fractions as input outperform ML/DL models trained from scratch even when they are allowed to use physical attributes as input, for 27/39 (≈ 69%) computational and both the experimental datasets. We believe that the proposed framework can be widely useful to tackle the small data challenge in applying AI/ML in materials science.
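The transfer idea — reuse a representation learned on a large source-property dataset, then fit only a small head on the scarce target property — can be sketched as below. This is not the authors' framework: the synthetic data, the "pretrained" feature layer, and all dimensions are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Source task: a large dataset of composition-like vectors -> property A
n_big, n_small, d, h = 2000, 20, 10, 16
X_big = rng.random((n_big, d))
W_true = rng.normal(size=(d, h))
y_big = np.tanh(X_big @ W_true).sum(axis=1)

# "Pretrained" representation: here simulated as a noisy copy of the true
# weights; a real framework would train a deep network on the big dataset.
W_feat = W_true + 0.1 * rng.normal(size=(d, h))

def features(X):
    # Frozen feature layer transferred from the source-property model
    return np.tanh(X @ W_feat)

# Target task: a *different* property with only 20 labeled examples
X_small = rng.random((n_small, d))
y_small = 2.0 * np.tanh(X_small @ W_true).sum(axis=1) + 1.0

# Fine-tune only a linear head on the small dataset (closed-form least squares)
Phi = np.column_stack([features(X_small), np.ones(n_small)])
head, *_ = np.linalg.lstsq(Phi, y_small, rcond=None)

def predict(X):
    phi = np.column_stack([features(X), np.ones(len(X))])
    return phi @ head

X_test = rng.random((5, d))
print(predict(X_test).shape)
```

Because only the small head is fitted on the scarce target data, the model borrows the structure learned from the big source dataset instead of trying to learn a full representation from 20 samples.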


MRS Bulletin, 2018, Vol 43 (9), pp. 676-682
Author(s): Claudia Draxl, Matthias Scheffler


