Abstract
The interdisciplinary nature of redox flow batteries (RFBs), spanning chemistry, materials, and engineering, has produced a vast and fragmented body of research, hindering the efficient synthesis of knowledge. An intelligent question-answering system is therefore essential to organize this dispersed knowledge, improve information retrieval, and lower the barrier to comprehensive understanding. In this study, we combined the natural language processing capabilities of large language models (LLMs) with the structured nature of knowledge graphs (KGs) to build a chat model for the RFB domain, named Chat-RFB. By analyzing 5,353 flow-battery-related articles and deconstructing their text, we learned contextual relationships and generated 164,232 nodes connected by 853,939 relationships, enhancing the LLM's ability to answer questions requiring professional domain knowledge. Because research on evaluating model responses in the flow battery field remains limited, we assessed model performance using both choice and non-choice questions. The results indicate that incorporating a professional knowledge base raised Chat-RFB's level of domain expertise. On choice questions, Chat-RFB achieved 94.9% accuracy, compared with 90.9% for DeepSeek-v3, 90.7% for GPT-4o, 90.4% for Qwen-Max, and 91.1% for Gemini-2.5-Flash. On non-choice questions, Chat-RFB achieved 93.3%, compared with 75.6%, 68.9%, 73.3%, and 86.7%, respectively.
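As a rough illustration of the graph-construction step summarized above, the sketch below shows how LLM-extracted entity-relation triples could be assembled into a knowledge graph and then queried for grounding context. The library (networkx), the example triples, and the query entity are illustrative assumptions, not the paper's actual implementation or data.

```python
# Minimal sketch, assuming entity-relation triples have already been
# extracted from the article corpus by an LLM; the triples below are
# hypothetical placeholders, not figures from the paper.
import networkx as nx

# Hypothetical triples of the form (head entity, relation, tail entity)
triples = [
    ("vanadium redox flow battery", "uses_electrolyte", "VO2+/VO2+ couple"),
    ("Nafion membrane", "component_of", "vanadium redox flow battery"),
    ("carbon felt electrode", "component_of", "vanadium redox flow battery"),
]

# A directed multigraph keeps parallel relations between the same node pair
kg = nx.MultiDiGraph()
for head, relation, tail in triples:
    kg.add_edge(head, tail, relation=relation)

print(kg.number_of_nodes(), "nodes,", kg.number_of_edges(), "relationships")

# Simple retrieval step: collect facts that mention a query entity,
# which could then be passed to an LLM as grounding context.
query = "vanadium redox flow battery"
facts = [
    f"{h} --{d['relation']}--> {t}"
    for h, t, d in kg.edges(data=True)
    if query in (h, t)
]
print("\n".join(facts))
```

A multigraph is used here so that distinct relations between the same pair of entities are preserved rather than overwritten; a graph database would serve the same purpose at the scale reported in the abstract.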