Flexible and interpretable generalization of self-evolving computational materials framework

2022, Vol. 260, pp. 106706
Author(s): Mohammed Bazroun, Yicheng Yang, In Ho Cho

2018
Author(s): Steen Lysgaard, Paul C. Jennings, Jens Strabo Hummelshøj, Thomas Bligaard, Tejs Vegge

A machine learning model is used as a surrogate fitness evaluator in a genetic algorithm (GA) optimization of the atomic distribution of Pt-Au nanoparticles. The machine learning accelerated genetic algorithm (MLaGA) yields a 50-fold reduction in the number of energy calculations required compared to a traditional GA.
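The surrogate-screening loop described in this abstract can be sketched as follows. This is a minimal toy, not the authors' MLaGA: the binary chromosome, the fitness function, and the k-nearest-neighbour surrogate are hypothetical stand-ins for the real nanoparticle model and the machine-learned energy evaluator.

```python
import random

random.seed(0)

N_SITES = 20  # toy "nanoparticle" of 20 lattice sites (0 = Au, 1 = Pt); hypothetical


def true_fitness(chrom):
    """Stand-in for an expensive energy calculation: reward alternating occupations."""
    return sum(1 for a, b in zip(chrom, chrom[1:]) if a != b)


def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))


class KNNSurrogate:
    """Cheap fitness estimate: mean fitness of the k nearest evaluated chromosomes."""

    def __init__(self, k=3):
        self.k = k
        self.memory = []  # (chromosome, true fitness) pairs evaluated so far

    def add(self, chrom, fit):
        self.memory.append((chrom, fit))

    def predict(self, chrom):
        nearest = sorted(self.memory, key=lambda m: hamming(m[0], chrom))[: self.k]
        return sum(f for _, f in nearest) / len(nearest)


def evolve(generations=20, pop_size=16, screen_factor=4):
    surrogate = KNNSurrogate()
    pop = [[random.randint(0, 1) for _ in range(N_SITES)] for _ in range(pop_size)]
    scores = {tuple(c): true_fitness(c) for c in pop}  # cache of expensive results
    for c in pop:
        surrogate.add(c, scores[tuple(c)])
    for _ in range(generations):
        # Breed a large candidate pool cheaply: crossover plus occasional mutation.
        candidates = []
        for _ in range(pop_size * screen_factor):
            p1, p2 = random.sample(pop, 2)
            cut = random.randrange(1, N_SITES)
            child = p1[:cut] + p2[cut:]
            if random.random() < 0.2:
                i = random.randrange(N_SITES)
                child[i] ^= 1  # flip one site's occupation
            candidates.append(child)
        # Screen the pool with the surrogate; pay for true fitness only on the best few.
        candidates.sort(key=surrogate.predict, reverse=True)
        for c in candidates[:pop_size]:
            key = tuple(c)
            if key not in scores:
                scores[key] = true_fitness(c)  # the "expensive" call
                surrogate.add(c, scores[key])
        pop = [list(c) for c in sorted(scores, key=scores.get, reverse=True)[:pop_size]]
    return max(scores.values())


best = evolve()
```

The saving comes from the asymmetry in the loop: `screen_factor * pop_size` candidates are ranked by the cheap surrogate each generation, while only the top `pop_size` ever reach the expensive evaluator.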


Author(s): Vasily Bulatov, Wei Cai

This book presents a broad collection of models and computational methods, from atomistic to continuum, applied to crystal dislocations. Its purpose is to help students and researchers in computational materials science acquire practical knowledge of the relevant simulation methods. Because their behavior spans multiple length and time scales, crystal dislocations present a common ground for an in-depth discussion of a variety of computational approaches, including their relative strengths, weaknesses, and interconnections. The details of the covered methods are presented in the form of "numerical recipes" and illustrated by case studies. A suite of simulation codes and data files is made available on the book's website to help the reader learn by doing through solving the exercise problems offered in the book.


2021
Author(s): Victor Fung, Jiaxin Zhang, Eric Juarez, Bobby Sumpter

Graph neural networks (GNNs) have received intense interest as a rapidly expanding class of machine learning models remarkably well-suited for materials applications. To date, a number of successful GNNs have been proposed and demonstrated for systems ranging from crystal stability to electronic property prediction to surface chemistry and heterogeneous catalysis. However, a consistent benchmark of these models remains lacking, hindering the development and consistent evaluation of new models in the materials field. Here, we present a workflow and testing platform, MatDeepLearn, for quickly and reproducibly assessing and comparing GNNs and other machine learning models. We use this platform to optimize and evaluate a selection of top-performing GNNs on several representative datasets in computational materials chemistry. From our investigations we note the importance of hyperparameter selection, and we find roughly similar performances for the top models once optimized. We identify several strengths of GNNs over conventional models, particularly on compositionally diverse datasets and in their overall flexibility with respect to inputs, due to learned rather than predefined representations. Meanwhile, several weaknesses of GNNs are also observed, including high data requirements, and suggestions for further improvement for applications in materials chemistry are proposed.
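The benchmarking pattern this abstract describes (multiple models, per-model hyperparameter tuning, identical data splits, one shared metric) can be sketched as below. The dataset and models are toy stand-ins chosen for brevity; MatDeepLearn's actual API and GNN architectures are not reproduced here.

```python
import random

random.seed(1)


# Synthetic "materials" dataset: 3 features mapped to a noisy scalar property.
def target(x):
    return 2.0 * x[0] - x[1] + 0.5 * x[2]


X = [[random.random() for _ in range(3)] for _ in range(200)]
data = [(x, target(x) + random.gauss(0, 0.05)) for x in X]
train, val, test = data[:120], data[120:150], data[150:]  # identical splits for all models


def mae(model, split):
    """Mean absolute error of a model on a (features, label) split."""
    return sum(abs(model(x) - y) for x, y in split) / len(split)


def knn_model(k):
    """k-nearest-neighbour regressor fit on the training split."""

    def predict(x):
        by_dist = sorted(train, key=lambda t: sum((a - b) ** 2 for a, b in zip(t[0], x)))
        return sum(y for _, y in by_dist[:k]) / k

    return predict


def mean_model(_):
    """Trivial baseline: always predict the training-set mean."""
    mean_y = sum(y for _, y in train) / len(train)
    return lambda x: mean_y


# One hyperparameter grid per model; tune on validation, report on the shared test split.
grids = {"knn": (knn_model, [1, 3, 5, 10]), "mean": (mean_model, [None])}

results = {}
for name, (build, grid) in grids.items():
    best_h = min(grid, key=lambda h: mae(build(h), val))
    results[name] = mae(build(best_h), test)
```

Keeping the splits and the metric fixed across models, and tuning every model's hyperparameters before comparison, is what makes the resulting numbers comparable; the abstract's observation that top models perform similarly only emerges after that tuning step.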


2015, Vol. 1762
Author(s): Jie Zou

Computation has become an increasingly important tool in materials science. Compared to experimental research, which requires facilities that are often beyond the financial capability of primarily undergraduate institutions, computation provides a more affordable approach. In the Physics Department at Eastern Illinois University (EIU), students have opportunities to participate in computational materials research. In this paper, I will discuss our approach to involving undergraduate students in this area. Specifically, I will discuss (i) how to prepare undergraduate students for computational research, (ii) how to motivate and recruit students to participate in computational research, and (iii) how to select and design undergraduate projects in computational materials science. Suggestions on how similar approaches can be implemented at other institutions are also given.

