<div class="csl-bib-body">
<div class="csl-entry">Wu, X., Li, P., Zhang, Y., Zhou, J., Xiang, S., Hou, J., Wang, G., & Dustdar, S. (2025). Knowledge distillation-based lightweight deformable network for remaining useful life prognostics of vehicle power battery. <i>Energy</i>, <i>337</i>, Article 138522. https://doi.org/10.1016/j.energy.2025.138522</div>
</div>
-
dc.identifier.issn
0360-5442
-
dc.identifier.uri
http://hdl.handle.net/20.500.12708/220401
-
dc.description.abstract
For prognostics and health management of vehicle power batteries, predicting the remaining useful life (RUL) is a core task. Despite their superior performance in RUL prediction, deep neural networks (DNNs) are computationally intensive, which poses significant challenges for deployment on edge devices. To address this, we propose a knowledge distillation-based lightweight deformable network (KD-LDNet) for vehicle power battery life prediction. First, we introduce a deformable student model that adapts efficiently to variations in the teacher's features with minimal computational overhead; its lightweight parameterization and adaptive receptive field allow it to capture the dynamics of battery degradation. Second, we introduce a novel KD technique, Diff-KD, which transfers knowledge between the teacher and student networks via a diffusion model, and we employ a maximum mean discrepancy (MMD) loss to narrow the gap between their prediction distributions. Furthermore, we propose a game-based mutual distillation (GMD) technique to improve the student network's generalization. Combining these techniques, KD-LDNet uses fewer than 4k parameters while still outperforming the baseline models. Experimental results on vehicle power battery datasets demonstrate state-of-the-art (SOTA) performance, with an average RMSE of 0.24, an average score of 0.54, 3.38k parameters, a GPU inference time of 2.93 ms, a CPU inference time of 2.93 ms, and an inference time of 5.15 ms on the NVIDIA Jetson TX2.
en
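
The MMD term mentioned in the abstract can be made concrete with a short sketch. Below is a minimal, hypothetical PyTorch implementation of a biased empirical squared MMD with a Gaussian (RBF) kernel between batches of teacher and student RUL predictions, used as a distillation regularizer alongside the supervised loss. The kernel bandwidth, the loss weight, and the `student`/`teacher` model names are illustrative assumptions, not details from the paper.

```python
import torch
import torch.nn.functional as F

def rbf_kernel(x, y, bandwidth=1.0):
    # Pairwise squared Euclidean distances between two prediction
    # batches, mapped through a Gaussian (RBF) kernel.
    dists = torch.cdist(x, y) ** 2
    return torch.exp(-dists / (2.0 * bandwidth ** 2))

def mmd_loss(student_pred, teacher_pred, bandwidth=1.0):
    # Biased empirical MMD^2 between the student and teacher
    # prediction distributions; both inputs are (batch, 1) tensors.
    k_ss = rbf_kernel(student_pred, student_pred, bandwidth).mean()
    k_tt = rbf_kernel(teacher_pred, teacher_pred, bandwidth).mean()
    k_st = rbf_kernel(student_pred, teacher_pred, bandwidth).mean()
    return k_ss + k_tt - 2.0 * k_st

# Hypothetical training step: the total loss combines the supervised
# regression loss with the MMD distillation term (weight 0.1 is
# illustrative, not taken from the paper).
def training_loss(student, teacher, x, y):
    student_pred = student(x)            # (batch, 1) RUL estimates
    with torch.no_grad():
        teacher_pred = teacher(x)        # frozen teacher network
    return F.mse_loss(student_pred, y) + 0.1 * mmd_loss(student_pred, teacher_pred)
```

Because the kernel compares whole batches of predictions rather than individual pairs, the term pulls the student's prediction distribution toward the teacher's, which is the stated purpose of the MMD component in the abstract.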
dc.language.iso
en
-
dc.publisher
Pergamon-Elsevier Science Ltd
-
dc.relation.ispartof
Energy
-
dc.subject
Knowledge distillation (KD)
en
dc.subject
Lightweight neural network
en
dc.subject
Remaining useful life (RUL)
en
dc.subject
Vehicle power battery
en
dc.title
Knowledge distillation-based lightweight deformable network for remaining useful life prognostics of vehicle power battery