Abstract
Chemical language models (CLMs) are molecular generation models that leverage large language models by utilizing SMILES, a string representation of compounds. Chemical variational autoencoders (VAEs), which explicitly construct a latent space, demonstrate their strength in molecular optimization and generation on a continuous space. We propose the Fragment Tree-Transformer based VAE (FRATTVAE) for the tasks of molecular optimization and generation, which treats a molecule as a tree structure with fragments as nodes. Representing compounds as fragment trees allows fragments to be used as tokens, which is not manageable with SMILES representations, and facilitates the handling of large molecules, including those containing salts and solvents. By applying a tree positional encoding method to the tree structures and feeding the encoded structures to a Transformer, we effectively extract the dependencies among fragments through the self-attention mechanism, achieving molecule-generation accuracy and computational speed that surpass those of existing methods. To demonstrate the performance of FRATTVAE, we conducted distribution learning and molecular optimization, two fundamental tasks in molecular generation. In distribution learning, across a wide range of benchmark datasets spanning small molecules to natural compounds with differing required properties, the reconstruction accuracy and generation evaluation metrics of FRATTVAE were superior to those of state-of-the-art methods. In molecular optimization tasks, FRATTVAE generated high-quality, stable molecules with desired properties that do not trigger structural alerts.
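To make the architectural idea concrete, the following is a minimal illustrative sketch, not the paper's implementation: fragment tokens are embedded, a learned tree positional encoding of each node's root-to-node path is added, and a standard Transformer encoder applies self-attention across all fragment nodes. The constants (`MAX_DEPTH`, `MAX_DEGREE`, `D_MODEL`) and the path-based encoding scheme are assumptions for illustration only.

```python
# Hypothetical sketch of a fragment-tree Transformer encoder (assumed design,
# not FRATTVAE's actual code): each node's position is its path from the root,
# given as a sequence of child indices, embedded and added to the token embedding.
import torch
import torch.nn as nn

MAX_DEPTH = 8    # assumed maximum tree depth
MAX_DEGREE = 4   # assumed maximum number of children per node
D_MODEL = 256

class TreePositionalEncoding(nn.Module):
    """Embed a node's root-to-node path (sequence of child indices)."""
    def __init__(self, d_model=D_MODEL):
        super().__init__()
        # One embedding per branch choice; index 0 means "no branch" (padding).
        self.branch_emb = nn.Embedding(MAX_DEGREE + 1, d_model // MAX_DEPTH)

    def forward(self, paths):
        # paths: (batch, num_nodes, MAX_DEPTH) of child indices, 0-padded.
        b, n, _ = paths.shape
        # Concatenate the per-depth branch embeddings into one position vector.
        return self.branch_emb(paths).reshape(b, n, -1)  # (b, n, d_model)

class FragmentTreeEncoder(nn.Module):
    def __init__(self, vocab_size, d_model=D_MODEL, nhead=8, nlayers=4):
        super().__init__()
        self.frag_emb = nn.Embedding(vocab_size, d_model)
        self.tree_pe = TreePositionalEncoding(d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, nlayers)

    def forward(self, frag_ids, paths, pad_mask=None):
        # Fragment-token embedding plus tree positional encoding.
        x = self.frag_emb(frag_ids) + self.tree_pe(paths)
        # Self-attention over all nodes captures inter-fragment dependencies.
        return self.encoder(x, src_key_padding_mask=pad_mask)

# Usage example with a hypothetical fragment vocabulary of 1000 entries:
enc = FragmentTreeEncoder(vocab_size=1000)
frag_ids = torch.randint(0, 1000, (2, 12))                  # 2 trees, 12 nodes
paths = torch.randint(0, MAX_DEGREE + 1, (2, 12, MAX_DEPTH))
h = enc(frag_ids, paths)                                    # (2, 12, 256)
```

Attending over path-encoded fragment nodes, rather than decoding SMILES characters sequentially, is what lets the model treat whole fragments as tokens and process all nodes in parallel; the specific path-embedding scheme above is only one plausible realization of a tree positional encoding.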