Large language model-enabled machine learning for high-performance Nd-Fe-B permanent magnet design
Abstract
High-performance Nd-Fe-B permanent magnets are indispensable to modern energy-efficient technologies. Traditional materials discovery approaches are time-consuming and resource-intensive, while existing machine learning models often fail to capture the rich contextual information embedded in the scientific literature. Here, we present a multimodal deep learning framework that integrates structured compositional data with textual embeddings extracted from scientific publications using large language models. A dual-tower neural network architecture was developed to independently encode elemental compositions and experimental descriptions, followed by a systematic evaluation of fusion strategies, including concatenation, attention-based mechanisms, Hadamard product, and gated fusion. The gated-fusion model achieved prediction accuracy surpassing conventional methods, including XGBoost and Random Forest, and was validated experimentally, with prediction accuracies exceeding 98% for remanence. Leveraging this high-accuracy predictive model, we systematically designed and experimentally validated high-performance Nd-Fe-B magnets with outstanding magnetic properties. Through Pareto-frontier analysis of virtual compositions, we identified Nb as a critical performance-enhancing element. Guided by model predictions, we fabricated magnets with optimized Nb content that achieved exceptional magnetic performance, surpassing the predicted Pareto frontier for heavy-rare-earth-free magnets. This work establishes a multimodal learning paradigm that efficiently leverages scientific knowledge for accelerated materials optimization.
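The dual-tower, gated-fusion architecture described above can be sketched in a few lines of PyTorch. The dimensions (comp_dim, text_dim, hidden_dim), layer depths, and the specific sigmoid-gate formulation below are illustrative assumptions rather than the implementation reported in this work; the sketch only shows how one tower encodes elemental compositions, another encodes LLM-derived text embeddings, and a learned gate blends the two representations before a regression head predicts a magnetic property such as remanence.

```python
import torch
import torch.nn as nn

class GatedFusionDualTower(nn.Module):
    """Minimal sketch of a dual-tower regressor with gated fusion
    (illustrative dimensions and layer choices, not the reported model)."""

    def __init__(self, comp_dim=20, text_dim=768, hidden_dim=128):
        super().__init__()
        # Composition tower: structured elemental fractions -> hidden vector
        self.comp_tower = nn.Sequential(
            nn.Linear(comp_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # Text tower: LLM embedding of the experimental description -> hidden vector
        self.text_tower = nn.Sequential(
            nn.Linear(text_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # Gate: element-wise weights in [0, 1] deciding how much of each tower to keep
        self.gate = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim), nn.Sigmoid(),
        )
        # Regression head for one magnetic property (e.g. remanence)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, comp, text_emb):
        h_c = self.comp_tower(comp)
        h_t = self.text_tower(text_emb)
        g = self.gate(torch.cat([h_c, h_t], dim=-1))
        fused = g * h_c + (1.0 - g) * h_t  # gated fusion of the two towers
        return self.head(fused)

# Example usage with random inputs (batch of 4 samples)
model = GatedFusionDualTower()
comp = torch.rand(4, 20)            # elemental composition fractions
text_emb = torch.randn(4, 768)      # LLM text embeddings
print(model(comp, text_emb).shape)  # torch.Size([4, 1])
```

The gate replaces simple concatenation with a learned, per-dimension trade-off between the compositional and textual representations; the other fusion strategies named in the abstract (concatenation, attention, Hadamard product) would differ only in how `fused` is computed.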