Abstract
In this study, we demonstrate the transferability of graph convolutional neural network (GCNN) predictions of the formation energy of solid solution alloys across atomic structures of increasing size, and we exploit this transferability in a cost-efficient sampling strategy. The GCNN was trained on a nickel-platinum (NiPt) dataset generated with the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) using the second nearest-neighbor modified embedded-atom method (2NN MEAM) empirical interatomic potential. The dataset was obtained by optimizing the geometries of initially randomly generated FCC crystal structures and computing their formation energies, with configurations spanning the whole compositional range. The GCNN was first trained on a lattice of 256 atoms, which accounts well for short-range interactions. Using this model, we predicted the formation energy for lattices of 864 and 2,048 atoms, which resulted in lower-than-expected accuracy because of the long-range interactions present in these larger lattices. We accounted for the long-range interactions by including a small amount of training data representative of the two larger sizes. With this additional data, the predictions of the GCNN scaled linearly with the size of the lattice. Our strategy therefore ensured scalability while significantly reducing the computational cost of training on larger lattice sizes.
Supplementary materials
Title
Supplementary Material: Transferable prediction of formation energy across lattices of increasing size
Description
This Supplementary Material document contains quantitative comparisons between graph neural network (GNN) models trained on different datasets, as well as additional scatterplots not included in the main manuscript.