吴慕遥 (Muyao Wu)
Impact Factor: 7.9
DOI: 10.1016/j.jpowsour.2026.240209
Journal: Journal of Power Sources
Keywords: Lithium-ion battery; State of Health; Mixture of Experts; Transformer; Incremental learning
Abstract: Accurate State of Health (SOH) estimation is vital for lithium-ion battery safety and optimization. Traditional data-driven methods often lack generalization across diverse operational scenarios such as slow charging and fast charging. This study proposes an incremental learning framework using a Transformer-integrated Mixture of Experts (MoE) architecture. The model integrates multi-source data (expansion and voltage) with lightweight expert networks, enhancing accuracy while reducing memory consumption. For new scenarios, it activates only a sparse subset of experts, enabling efficient knowledge updates without full retraining. Experimental results show that the Transformer-MoE model achieves a Mean Absolute Percentage Error (MAPE) of 1.52%, a Root Mean Square Percentage Error (RMSPE) of 2.24%, and a Maximum Absolute Error (MAX-AE) of 6.74%, with only 0.4578 MB of memory usage (a 72.31% reduction versus the baseline). The proposed incremental learning framework maintains excellent performance on both fast- and slow-charging test sets, achieving an overall MAPE of 2.53% and an RMSPE of 3.25%, outperforming traditional data-driven models in most scenarios.
Note: CAS journal ranking, Zone 2 (中科院2区)
Co-authors: Changpeng Tan, Ji Wu
First Author: Li Wang
Publication Type: Journal article
Corresponding Author: Muyao Wu
Article Number: 240209
Discipline: Engineering
Document Type: J
Volume: 679
ISSN: 0378-7753
Translation: No
Publication Date: 2026-04-26
Indexed by: SCI, EI
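The entry itself contains no code, but the abstract outlines an architecture: a Transformer encoder whose features are routed through a sparse Mixture-of-Experts layer before SOH regression. The following is a minimal PyTorch sketch of that general pattern only, assuming top-k token routing, a two-signal input (voltage and expansion), and a simple regression head; every class name, dimension, and routing detail here is an illustrative assumption, not the authors' published implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoE(nn.Module):
    """Mixture-of-Experts layer that routes each token to its top-k experts."""

    def __init__(self, d_model, d_hidden, num_experts=4, top_k=1):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts)  # routing network
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x):                                  # x: (batch, seq, d_model)
        scores = F.softmax(self.gate(x), dim=-1)           # routing probabilities per token
        top_w, top_idx = scores.topk(self.top_k, dim=-1)   # keep only the top-k experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = top_idx[..., k] == e                # tokens routed to expert e
                if mask.any():
                    out[mask] += top_w[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out


class TransformerMoESOH(nn.Module):
    """Transformer encoder followed by a sparse MoE layer and an SOH regression head."""

    def __init__(self, n_features=2, d_model=32, num_experts=4):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, dim_feedforward=64, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.moe = SparseMoE(d_model, d_hidden=64, num_experts=num_experts)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):                         # x: (batch, cycles, n_features)
        h = self.encoder(self.embed(x))
        h = self.moe(h)
        return self.head(h.mean(dim=1)).squeeze(-1)  # one SOH estimate per sequence


model = TransformerMoESOH()
dummy = torch.randn(8, 50, 2)   # 8 cells, 50 cycles, 2 signals (hypothetical voltage + expansion)
print(model(dummy).shape)       # torch.Size([8])
```

In this kind of design only the selected experts run in the forward pass, which is consistent with the abstract's claim that activating a sparse subset of experts supports scenario-specific incremental updates and lower memory use; the paper's actual expert structure, routing rule, and training procedure are not reproduced here.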