Abstract
Efficient prediction of sampling-intensive thermodynamic properties in both closed and open systems is needed to evaluate material performance and to enable high-throughput materials discovery across a diverse array of technology applications, from the rational design of low-density high entropy alloys, to optimizing battery cathodes, to modulating hydrogen absorption equilibria in complex metals. Surrogate modeling strategies such as cluster expansion alleviate the prohibitive computational expense of high-throughput configurational sampling with density functional theory (DFT) and are many orders of magnitude more efficient, but they can be difficult to construct for systems with high compositional complexity. We therefore employ minimal-complexity graph neural network models that accurately predict the formation energies of DFT-relaxed structures from an ideal (unrelaxed) crystallographic representation and can even extrapolate beyond the training distribution. This enables the large-scale sampling necessary for thermodynamic property predictions that would otherwise be intractable, and it can be achieved with small training datasets. Two exemplars, optimizing the thermodynamic stability of low-density high entropy alloys and modulating the plateau pressure of hydrogen in metal alloys, demonstrate the power of this approach, which will be extendable to a variety of materials discovery and modeling problems.
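To make the on-lattice idea concrete, the sketch below builds one random configuration of an ideal (unrelaxed) FCC AlLiMgSnZn supercell, converts it to a fixed lattice graph, and passes it through a small message-passing network that returns a formation energy per atom. This is a minimal illustration only: the lattice parameter, neighbor cutoff, network architecture, and all names are placeholder assumptions, not the model or hyperparameters used in the manuscript.

# Minimal sketch (illustrative only): an ideal FCC supercell is represented as a
# fixed, unrelaxed lattice graph and passed through a small message-passing
# network to predict a formation energy per atom. Element set, cutoff, and layer
# sizes are placeholder assumptions, not the manuscript's actual settings.
import numpy as np
import torch
import torch.nn as nn

ELEMENTS = ["Al", "Li", "Mg", "Sn", "Zn"]
A0 = 4.2            # assumed FCC lattice parameter in angstroms (placeholder)
N_CELL = 3          # 3x3x3 conventional cells -> 108 sites
CUTOFF = 0.75 * A0  # captures first-nearest neighbors on the ideal lattice

def ideal_fcc_supercell(n, a):
    """Cartesian sites of an ideal (unrelaxed) FCC supercell and its box length."""
    basis = np.array([[0, 0, 0], [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5]])
    cells = np.array([[i, j, k] for i in range(n) for j in range(n) for k in range(n)])
    frac = (cells[:, None, :] + basis[None, :, :]).reshape(-1, 3) / n
    box = n * a
    return frac * box, box

def neighbor_graph(positions, box, cutoff):
    """Edge list from minimum-image distances on the periodic ideal lattice."""
    delta = positions[:, None, :] - positions[None, :, :]
    delta -= box * np.round(delta / box)           # minimum-image convention
    dist = np.linalg.norm(delta, axis=-1)
    src, dst = np.where((dist > 1e-8) & (dist < cutoff))
    return torch.tensor(src), torch.tensor(dst)

class OnLatticeGNN(nn.Module):
    """Two rounds of mean-neighbor message passing, then a per-atom readout."""
    def __init__(self, n_species, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(n_species, hidden)
        self.msg = nn.ModuleList([nn.Linear(hidden, hidden) for _ in range(2)])
        self.readout = nn.Linear(hidden, 1)

    def forward(self, species, src, dst):
        h = self.embed(species)
        for layer in self.msg:
            agg = torch.zeros_like(h).index_add_(0, dst, h[src])
            deg = torch.zeros(h.shape[0], 1).index_add_(
                0, dst, torch.ones(src.shape[0], 1))
            h = torch.relu(layer(agg / deg.clamp(min=1)))
        return self.readout(h).mean()              # formation energy per atom

# One random occupation of the ideal lattice (a single solid-solution configuration).
pos, box = ideal_fcc_supercell(N_CELL, A0)
src, dst = neighbor_graph(pos, box, CUTOFF)
rng = np.random.default_rng(0)
species = torch.tensor(rng.integers(0, len(ELEMENTS), size=len(pos)))

model = OnLatticeGNN(n_species=len(ELEMENTS))
print("predicted formation energy per atom (untrained):", float(model(species, src, dst)))

Because the graph is built once from the ideal lattice, evaluating a new configuration only requires swapping the species labels, which is what makes Monte Carlo style configurational sampling with such a surrogate inexpensive compared with repeated DFT relaxations.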
Supplementary materials
Supplementary Information: Phase diagrams of alloys and their hydrides via on-lattice graph neural networks and limited training data (supporting information for the manuscript)
Supporting Data: AlLiMgSnZn (configurations and energies for FCC AlLiMgSnZn solid solutions)