Multitask deep learning with dynamic task balancing for quantum mechanical properties prediction†
Predicting quantum mechanical properties (QMPs) is crucial for innovation in materials science and chemistry. Multitask deep learning models have been widely used for QMPs prediction. However, existing multitask learning models often train multiple QMPs prediction tasks simultaneously without considering the internal relationships and differences between the tasks, which may cause the model to overfit easy tasks. In this study, we first propose a multiscale dynamic attention graph neural network (MDGNN) for molecular representation learning. The MDGNN is designed in a multitask learning fashion, so that it can solve multiple learning tasks at the same time. We then introduce a dynamic task balancing (DTB) strategy that combines task differences and task difficulties to reduce overfitting across multiple tasks. Finally, we adopt gradient-weighted class activation mapping (Grad-CAM) to analyze the deep learning model for frontier molecular orbital energy level prediction, i.e., the highest occupied molecular orbital (HOMO) and the lowest unoccupied molecular orbital (LUMO). We evaluate our approach on two large QMPs datasets and compare the proposed method against state-of-the-art multitask learning models. The MDGNN outperforms the other multitask learning approaches on both datasets, and the DTB strategy further improves the performance of the MDGNN significantly. Moreover, we show that Grad-CAM produces explanations that are consistent with molecular orbital theory. These advantages demonstrate that the proposed method improves the generalization and interpretability of QMPs prediction modeling.
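The abstract does not give the exact DTB formulation, but its stated goal, down-weighting easy tasks so the model does not overfit them, can be illustrated with a common difficulty-aware weighting scheme (in the spirit of dynamic task prioritization). The function names, the focal-style weighting formula, and the `gamma` parameter below are illustrative assumptions, not the paper's method:

```python
import math

def dtb_weights(task_metrics, gamma=2.0):
    """Illustrative dynamic task balancing (NOT the paper's exact DTB).

    task_metrics: per-task performance in [0, 1], e.g. a normalized
    validation score; a higher value means the task is currently easier.
    Returns normalized loss weights that emphasize hard tasks.
    """
    # Focal-style difficulty weighting: w_i = -(1 - k_i)^gamma * log(k_i).
    # A nearly solved task (k close to 1) gets a weight close to 0,
    # so easy tasks contribute little gradient and overfit less.
    raw = [-((1.0 - k) ** gamma) * math.log(max(k, 1e-8)) for k in task_metrics]
    total = sum(raw)
    return [w / total for w in raw]

def balanced_multitask_loss(task_losses, task_metrics):
    """Weighted sum of per-task losses; harder tasks dominate the update."""
    weights = dtb_weights(task_metrics)
    return sum(w * loss for w, loss in zip(weights, task_losses))
```

In a multitask training loop, `task_metrics` would be refreshed each epoch from validation performance, so the weights adapt dynamically as tasks are learned at different rates.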