
contributor author: Flessner, David
contributor author: Chen, Jun
date accessioned: 2025-04-21T09:54:51Z
date available: 2025-04-21T09:54:51Z
date copyright: 2/5/2025
date issued: 2025
identifier issn: 2689-6117
identifier other: aldsc_5_2_024502.pdf
identifier uri: http://yetl.yabesh.ir/yetl1/handle/yetl/4305098
description abstract: To extend the operating window of batteries, active cell balancing has been studied in the literature. However, such an advancement poses significant computational challenges for real-time optimal control, especially as the number of cells in a battery pack grows. This article investigates the use of reinforcement learning (RL) and model predictive control (MPC) to balance battery cells effectively while keeping the computational load to a minimum. Specifically, event-triggered MPC is introduced to reduce real-time computation. Unlike the existing literature, where rule-based or threshold-based event-trigger policies determine the event instants, deep RL is explored to learn and optimize the event-trigger policy. Simulation results demonstrate that the proposed framework keeps the cell state-of-charge variation under 1% while using less than 1% of the computational resources of conventional MPC.
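
The sketch below illustrates, in Python, the event-triggered loop the abstract describes: a trigger policy decides at each step whether to re-solve the balancing MPC or hold the previous inputs, so the SOC spread stays small while the solver runs only occasionally. The coulomb-counting SOC model, the proportional stand-in for the MPC solve, and the threshold trigger standing in for the learned deep-RL policy are all illustrative assumptions, not the paper's formulation.

import numpy as np

rng = np.random.default_rng(0)

def plant_step(soc, u, dt=1.0, cap=3600.0):
    # Coulomb counting: SOC change = balancing current * dt / capacity.
    return np.clip(soc + dt * u / cap, 0.0, 1.0)

def solve_balancing_mpc(soc, gain=10.0):
    # Stand-in for the MPC solve: push each cell toward the pack mean.
    return gain * (soc.mean() - soc)

def trigger(soc, steps_since_solve, spread_tol=0.01, max_hold=50):
    # Placeholder for the learned deep-RL trigger policy: fire when the
    # SOC spread exceeds 1% or the held input has grown stale.
    return (soc.max() - soc.min() > spread_tol) or (steps_since_solve >= max_hold)

soc = 0.5 + 0.02 * rng.standard_normal(8)  # 8 cells with uneven initial SOC
u = np.zeros_like(soc)
mpc_calls = 0
since = 0
for t in range(1000):
    if trigger(soc, since):
        u = solve_balancing_mpc(soc)   # event: re-solve, update balancing currents
        mpc_calls += 1
        since = 0
    else:
        since += 1                     # no event: hold the previous inputs
    soc = plant_step(soc, u)

print(f"SOC spread: {soc.max() - soc.min():.4f}  MPC solves: {mpc_calls}/1000")

In this toy run the trigger fires rarely once the cells are balanced, which mirrors the paper's headline result: comparable balancing quality at a small fraction of the MPC solve count.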
publisher: The American Society of Mechanical Engineers (ASME)
title: Reinforcement Learning-Based Event-Triggered Model Predictive Control for Electric Vehicle Active Battery Cell Balancing
type: Journal Paper
journal volume: 5
journal issue: 2
journal title: ASME Letters in Dynamic Systems and Control
identifier doi: 10.1115/1.4067656
journal firstpage: 24502-1
journal lastpage: 24502-5
page: 5
tree: ASME Letters in Dynamic Systems and Control; 2025; volume 005; issue 002
contenttype: Fulltext

