contributor author | Flessner, David | |
contributor author | Chen, Jun | |
date accessioned | 2025-04-21T09:54:51Z | |
date available | 2025-04-21T09:54:51Z | |
date copyright | 2/5/2025 12:00:00 AM | |
date issued | 2025 | |
identifier issn | 2689-6117 | |
identifier other | aldsc_5_2_024502.pdf | |
identifier uri | http://yetl.yabesh.ir/yetl1/handle/yetl/4305098 | |
description abstract | To extend the operating window of batteries, active cell balancing has been studied in the literature. However, such an advancement presents significant computational challenges for real-time optimal control, especially as the number of cells in a battery increases. This article investigates the use of reinforcement learning (RL) and model predictive control (MPC) to effectively balance battery cells while keeping the computational load at a minimum. Specifically, event-triggered MPC is introduced as a way to reduce real-time computation. In contrast to the existing literature, where rule-based or threshold-based event-trigger policies are used to determine the event instances, deep RL is explored to learn and optimize the event-trigger policy. Simulation results demonstrate that the proposed framework can keep the cell state-of-charge variation under 1% while using less than 1% of the computational resources of conventional MPC. | |
publisher | The American Society of Mechanical Engineers (ASME) | |
title | Reinforcement Learning-Based Event-Triggered Model Predictive Control for Electric Vehicle Active Battery Cell Balancing | |
type | Journal Paper | |
journal volume | 5 | |
journal issue | 2 | |
journal title | ASME Letters in Dynamic Systems and Control | |
identifier doi | 10.1115/1.4067656 | |
journal firstpage | 24502-1 | |
journal lastpage | 24502-5 | |
page | 5 | |
tree | ASME Letters in Dynamic Systems and Control:;2025:;volume( 005 ):;issue: 002 | |
contenttype | Fulltext | |