1. Introduction
A number of energy storage systems have been the subject of investigation, including hydroelectric power, capacitors, compressed air energy storage, flywheels, and electric batteries [1]. Energy systems are increasingly being designed to be “electrically controllable” or to integrate with the “electric vector”, as this is the most convenient and efficient way to transport, convert, or control power [2]. Also, as part of the global carbon neutrality initiative, the utilization of renewable energy has expanded, resulting in a heightened interest in energy storage systems [3].
According to [4], 65% of global greenhouse gas emissions consist of carbon dioxide attributable to fossil fuels and industrial processes. The energy transition is an urgent need faced by various industries, primarily the automotive industry. As a society, we have an obligation to adopt economical and sustainable solutions for the environment and the fight against climate change, whose effects are increasingly close to becoming irreversible [5]. In this scenario, batteries play an important role in electrical systems by making them smarter, more flexible, and more resilient through their controllability, providing ancillary services that maintain system stability while increasing the penetration of renewable energy [6].
The rising costs and concerns regarding environmental pollution have prompted the development of electric and hybrid vehicle technologies [7]. The attention being paid to electric vehicles reflects their potential to offer an efficient and sustainable alternative to conventional fossil-fuel-powered vehicles [8,9]. However, their adoption and widespread use present significant technical and scientific challenges, given that the storage of electrical energy is achieved through electrochemical means in batteries [10], and the battery pack represents the primary energy storage component in an electric vehicle powertrain.
Lithium-ion batteries are currently the most suitable energy storage device for powering electric vehicles due to their attractive properties [11]. These include high energy efficiency, lack of memory effect, long cycle life, high energy density, and high power density [12,13]. The challenges for batteries include driving range (currently 200 to 350 km on a full charge), charging time (a full charge takes 4 to 8 h), battery cost, and bulk and weight (battery packs are heavy and take up considerable vehicle space) [14]. If the widespread adoption of this mode of transport is sought, these challenges must be addressed through efficient and environmentally sustainable solutions.
The literature review on lithium-ion battery thermal management highlights several approaches, each with its own limitations. Air cooling, for instance, has limited cooling capacity and is less effective for high-power applications, while liquid cooling, though more effective, is complex and expensive to implement. Phase-change materials (PCMs) add weight and diminish in effectiveness over time, and heat pipes can be bulky and degrade with wear and tear. Thermoelectric cooling is low in efficiency and high in cost, refrigeration systems consume a lot of energy and are complex, and nanomaterial-assisted cooling, though promising, faces challenges with integration and cost. Machine learning and data-driven methods require large datasets and significant computational resources, with the prediction accuracy being heavily dependent on data quality [15,16].
This study focuses on optimizing the charging time of batteries in electric racing cars through telemetry-integrated battery charging and a response surface methodology (RSM) approach. The innovative application of RSM helps identify optimal charging conditions, minimizing charging time and maximizing vehicle efficiency. Integrating real-time telemetry data provides detailed insights into thermal and electrical behavior during the charging process, addressing the limitations of previous methods and offering a more efficient and safe solution. This research not only enhances thermal management and battery safety but also establishes an adaptable methodology for future energy storage applications.
The behavior of a battery can be understood through a combination of physical and electrochemical model-based explanations, as well as data. The collection of data plays an important role in understanding the different models required for accurate fuel gauging. It is important to note that the collection of data and the development of battery models are inextricably linked until adequate models have been defined and the parameters estimated [17].
To address these challenges and develop solutions, real-time monitoring tools are required for battery performance, enabling actions to improve battery efficiency and vehicle performance. Various Industry 4.0 tools, such as sensors, web applications, mobile applications, and communication protocols, among others, along with new technologies and telemetry, collect data on battery performance. These data can be analyzed to determine performance factors and generate solutions that optimize batteries and address the challenges posed.
One of the prominent optimization techniques is response surface methodology (RSM), a statistical and mathematical approach used to model and analyze complex phenomena. The primary objective of RSM is to optimize a response variable influenced by multiple factors and their interactions [18]. This methodology supports decision-making by identifying which adjustments to the causal variables yield the optimal response. RSM is widely utilized across various fields, including the characterization of materials [19], engine performance and fuel emissions [20], catalytic oligomerization in chemistry [21], pollutant adsorption [22], and environmental applications [23], among others.
In the present study, the main objective is to integrate a telemetry system that measures the charging time of a battery bank (lithium-ion power batteries), which is the foundation of electric vehicles [24]. Using RSM, we aim to determine the optimal values of current, voltage, and temperature to improve the efficiency of the vehicle. The vehicular telemetry system integrates current, voltage, and temperature sensors into a prototype with a microcontroller. Data are acquired during electric battery charge cycles, and a response surface design is employed to minimize the battery charging time using the steepest descent optimization algorithm.
2. Materials and Methods
The objective of this study is to optimize the charging process of a battery bank in an electric vehicle. Telemetry is used to integrate current, voltage, and temperature sensors through a prototype that is seamlessly integrated into the battery bank. RSM is applied to the data generated by the sensors to optimize and minimize the charging time. The input variables for the optimization are the current and the initial and final voltages, while temperature is monitored throughout. This section is divided into two subsections: the first describes the study methods, and the second outlines the six steps of the experimental analysis.
2.1. Study Methods
2.1.1. Battery Bank of Electric Vehicle
Batteries come in a wide variety of shapes, sizes, and storage capacities. The main differences lie in the electrochemical characteristics needed to meet specific requirements. As a result, there is a specific battery for every need and application. For this reason, solutions are often based on commercially available, standardized basic storage units. One of these units is the 18650 battery, so named because its shape and size are standardized: 18 mm in diameter and 65 mm in length, with the final “0” indicating its cylindrical shape. The only aspects that have changed as electrolyte technology has improved are the storage capacity and the charging technology. Other battery types have been developed, but the 18650 remains one of the most widely used. In fact, the energy storage in the first versions of the famous Tesla Roadster electric vehicle was obtained from a special configuration of 18650 battery packs.
The energy storage system studied can be seen in Figure 1. It consists of basic 18650 battery packs with a nominal capacity of 2500 mAh at 3.6 V. The configuration includes a small pack of four batteries connected in parallel, equivalent to a 3.6 V, 10 Ah battery. Thirteen of these packs are connected in series to make a total pack, or equivalent battery, of 10 Ah at 46.8 V.
The actual battery configuration is shown in Figure 2. This battery arrangement also requires an external physical protection system. The pack is encased in a 1.1 mm thick aluminum case to increase the isolation between the case and the batteries. This aluminum cover not only encloses and shields the batteries from shock, but also provides a surface to dissipate the heat generated inside the battery pack. The configuration includes a lithium battery protection system (BMS 48 V 13S) and a PCB 13S protection plate rated up to 30 A.
The internal structure of a lithium battery is shown in Figure 3. The electrode stack of commercial Li-ion batteries on the market today is a multi-layer structure. A single repeatable unit consists of a cathode, an anode, and two separator layers. The cathode consists of an aluminum foil coated on both sides with an active material and a binder. Similarly, the anode consists of a copper foil coated with graphite (or silicon) particles [1,25].
One of the most critical factors affecting battery performance is the operating temperature of the batteries. Each individual cell or unit behaves as a heat source due to the heat released from the electrochemical process and the inherent internal resistance of the battery itself. The thermal and impedance characteristics of a Li-ion battery have a significant impact on the optimal charging process. Temperature plays a critical role in the electrochemical reactions within the battery. At low temperatures, internal resistance increases and chemical activity decreases, resulting in a loss of battery capacity and reduced chargeability. A battery thermal management system (BTMS) is required to quickly preheat the battery to an optimal temperature for efficient charging in low-temperature environments. In addition, high internal resistance can generate considerable heat during charging, which must be controlled to prevent thermal runaway and ensure safety [26].
Impedance, which refers to the internal resistance of the battery, varies with temperature and state of charge (SOC). Elevated impedance can lead to increased heat generation and energy loss during charging. Techniques such as electrochemical impedance spectroscopy (EIS) can be used to monitor and manage these impedance changes to optimize the charging process. By integrating thermal management with impedance monitoring, it is possible to develop charging strategies that minimize energy loss and thermal stress, thereby improving overall battery efficiency and lifetime [27,28]. This integration will be considered a potential improvement in future research. The energy balance of the cell is expressed by Equation (1) [29]:

$$ m c_p \frac{dT}{dt} = \dot{q} - hA\,(T_s - T_{amb}) \quad (1) $$
The cell energy balance equation, expressed in Equation (1), is used to quantify the heat generation and thermal characteristics of the battery during operation. This equation is essential for understanding and predicting the thermal behavior of the battery, which is critical for optimizing its performance and ensuring its safety. To achieve this, the equation must include the major sources of heat generation within the battery: reaction heat, ohmic (resistive) heat, polarization heat, and secondary heat. This comprehensive approach allows an accurate assessment of the total heat generation within the cell, which is critical for managing thermal performance and preventing overheating. By incorporating the convective heat transfer coefficient ($h$) and the difference between the surface temperature ($T_s$) and the ambient temperature ($T_{amb}$), the equation facilitates the evaluation of the effectiveness of thermal management systems.
Proper thermal management is critical to maintaining battery efficiency and longevity. It is important to note that this equation is dynamic and therefore includes the rate of change of temperature ($dT/dt$), allowing the thermal response of the battery to be studied under varying operating conditions. This is particularly relevant for applications such as electric vehicles, where batteries undergo rapid charge and discharge cycles. From an optimization and safety perspective, understanding the thermal behavior through this equation helps to optimize the charging and discharging processes to minimize heat generation. This is essential to improve battery life and ensure battery safety, especially in high-power applications.
In Equation (1), $m$ and $c_p$ are the mass and specific heat of the cell, $h$ is the convective heat transfer coefficient, $A$ is the heat exchange surface area, $T_s$ and $T_{amb}$ are the surface and ambient temperatures, respectively, and $\dot{q}$ is the average rate of total heat generation, which includes the sum of all the heats generated in the battery: the reaction heat, the ohmic heat, the active polarization heat, and the secondary or lateral heat, respectively, according to Equation (2) [30]:

$$ \dot{q} = Q_{rea} + Q_{ohm} + Q_{pol} + Q_{sec} \quad (2) $$
However, the latter heat, $Q_{sec}$ in Equation (2), is usually not considered, so the total heat generation reduces to $\dot{q} \approx Q_{rea} + Q_{ohm} + Q_{pol}$.
The equivalent circuit model used to characterize batteries is the simplified Thévenin model, which provides an effective representation of battery behavior. This model has an ideal voltage source representing the open-circuit voltage of the battery, in series with an internal resistor [2], as shown in Figure 4.
In this representation, the battery’s open-circuit voltage and internal resistance are functions of its state of charge. The open-circuit voltage is mathematically defined by [31]

$$ V = V_{oc} - I\,R_{int}, $$

where $V$ represents the nominal voltage, $V_{oc}$ is the open-circuit voltage, $I$ denotes the discharge current, and $R_{int}$ is the internal resistance (58.5 m$\Omega$) [32].
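For illustration, assuming the 58.5 m$\Omega$ value above, the internal voltage drop at the optimal charging current of about 2.4 A reported in Section 3 is

$$ \Delta V = I\,R_{int} \approx 2.4\ \mathrm{A} \times 0.0585\ \Omega \approx 0.14\ \mathrm{V}, $$

a small but non-negligible loss that grows linearly with the charging current.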
The theory can be extended to more complex models involving parallel RC elements, although analytical closed-form solutions are not possible in this latter case [33]. In addition, we use data for the optimization process.
2.1.2. Telemetry
The implementation of an efficient and accurate telemetry system is critical to ensure proper monitoring of the battery bank in electric vehicle applications. A variety of sensors and devices are used to collect essential data, including temperature, current, and voltage measurements. Each of these components plays a key role in tracking the operational status of the battery bank in real time, providing vital information for optimizing the charging and discharging process, preventing potential failures, and maximizing energy efficiency. Communication between the temperature, voltage, and current sensors and the ESP32 controller (Espressif Systems, Shanghai, China) is achieved through the ADC (analog-to-digital converter) inputs of the ESP32. Calibrations are performed according to each sensor’s data sheet to ensure accurate data acquisition. This section describes the technical aspects and key features of the MCP9701 temperature sensors (Microchip Technology Inc., Chandler, AZ, USA), the ACS709 current sensors, and the Analog Voltage Divider module for voltage measurement.
The choice of the ESP32 controller with LoRa module for data acquisition and transmission was made considering its advantages in terms of cost, ease of use, flexibility, and community support. Additionally, the Raspberry Pi 4 (Raspberry Pi Trading, UK) with LoRa module acts as a Node-RED server for storing and visualizing the collected information. These characteristics are particularly valuable in the initial phase of development and prototyping of the telemetry system.
Temperature:
The telemetry system implemented for the battery bank uses temperature probes based on the MCP9701 sensor. This sensor can accurately measure temperature over a wide range from −10 °C to +125 °C. The output of the MCP9701 is calibrated with a slope of 19.53 mV/°C and has a DC offset of 400 mV. These features provide reliable and accurate temperature measurement of both the battery bank and the surrounding environment. The temperature data captured by the MCP9701 are integrated with the rest of the telemetry data and transferred to the central database at a sampling rate of 1 s, allowing real-time monitoring and the creation of detailed logs for further analysis.
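Applied to logged readings, the conversion implied by this calibration is straightforward; a minimal sketch in R (the toolchain used for the study's calculations), using only the slope and offset quoted above:

```r
# Convert a raw MCP9701 output voltage (in volts) to degrees Celsius,
# using the datasheet calibration: 400 mV offset, 19.53 mV/degC slope.
mcp9701_to_celsius <- function(v_out) {
  (v_out - 0.400) / 0.01953
}

mcp9701_to_celsius(0.888)  # ~25 degC
```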
Current:
The battery bank telemetry system uses the Allegro ACS709 sensor (Allegro MicroSystems, Shanghai, China) for accurate current measurement. This sensor is specifically designed for real-time current monitoring applications. The board used is a simple carrier for the ACS709 sensor, which uses the Hall effect to measure current over a range of up to ±75 A. The sensor provides a low-resistance current path of approximately 1.1 mΩ and electrical isolation of up to 2.1 kV RMS. The ACS709 is optimized for accuracy in the current range of −37.5 A to 37.5 A. Its analog voltage output is linear for currents up to ±75 A, with the output voltage centered at VCC/2 and a typical total output error of about 2% (per the datasheet).
In addition, it can operate over a voltage range of 3 V to 5.5 V, allowing direct connection to 3.3 V and 5 V systems. This precise measurement capability and wide operating range make the ACS709 sensor an ideal choice for monitoring the battery bank charge current with high reliability and accuracy. The current data collected by the ACS709 are integrated with the rest of the telemetry data and logged to the central database, allowing continuous monitoring and optimization of the battery bank’s charging process.
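The current conversion follows the same pattern as the temperature channel; in this sketch the 28 mV/A sensitivity is an assumption taken from the ±75 A variant's datasheet, not a value stated in the text, and should be checked against the actual board:

```r
# Convert an ACS709 output voltage (V) to amperes. The output is
# centered at VCC/2; the 28 mV/A sensitivity is an assumed datasheet
# value for the +/-75 A variant at a 5 V supply.
acs709_to_amps <- function(v_out, vcc = 5.0, sens = 0.028) {
  (v_out - vcc / 2) / sens
}

acs709_to_amps(2.567)  # ~2.4 A, near the optimal charging current
```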
Voltage:
The Analog Voltage Divider module was used for voltage measurement in the battery bank telemetry system. The module is capable of detecting the supply voltage over a wide range, from 8 V to 100 V. The module operates on the principle of the voltage divider, which allows the input voltage to be reduced by a factor of 20. Given that the ESP32’s analog input is typically limited to 5 V, the maximum input voltage of the analog voltage detection module is 5 V multiplied by 20, or 100 V. This feature guarantees that the input voltage is within the safe range and compatible with the ESP32. The ability to detect voltages over a wide range, in conjunction with the module’s compatibility with the ESP32’s analog input, renders the Analog Voltage Divider module an appropriate choice for the accurate measurement of the battery bank voltage. The voltage data captured by this module are integrated with other telemetry data and logged into the central database, thereby facilitating the continuous monitoring and efficient management of the battery charging process.
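The voltage channel simply reverses the divider's 20:1 scaling; a minimal sketch:

```r
# Recover the battery bank voltage from the divider output, which the
# module scales down by a factor of 20 (per the module description).
divider_to_volts <- function(v_adc, factor = 20) {
  v_adc * factor
}

divider_to_volts(2.34)  # 46.8 V, the pack's nominal voltage
```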
Controller:
The ESP32 controller, equipped with a LoRa module enabling its connection to a Raspberry Pi 4, is used for data acquisition. The Raspberry Pi 4, in turn, is equipped with a LoRa module serving as a Node-RED server, which is responsible for storing and visualizing the collected information. The ESP32 functions as a remote node, collecting data from the battery bank’s telemetry system. These data are transmitted to the Raspberry Pi 4 wirelessly via the ESP32’s LoRa module. The Raspberry Pi 4, acting as a Node-RED server, receives these data and processes them for storage in a database and subsequent visualization in the form of graphs or tables. This configuration allows for efficient and wireless communication between the ESP32 and the Raspberry Pi.
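For concreteness, a sketch of what one telemetry sample might look like once encapsulated in JSON; the field names are illustrative, not the authors' actual schema (built here with the jsonlite R package, consistent with the RStudio toolchain used for the study's calculations):

```r
library(jsonlite)

# Hypothetical telemetry sample as it might be encapsulated in JSON
# before LoRa transmission (field names are illustrative only).
sample <- list(
  node      = "esp32-battery-01",  # assumed node identifier
  t_s       = 1718000000,          # Unix timestamp, 1 s sampling
  current_a = 2.39,
  voltage_v = 48.2,
  temp_batt = 31.5,
  temp_amb  = 24.8
)
toJSON(sample, auto_unbox = TRUE)
```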
2.1.3. Response Surface Methods
The response surface methodology (RSM) is a set of mathematical and statistical tools whose primary objective is to optimize a response of interest influenced by a variety of variables and determine the optimal operating conditions of a system. The advantages of using RSM over other optimization techniques primarily stem from its capability to efficiently explore and model complex relationships between variables. RSM enables the construction of polynomial models to describe the behavior of complex systems. By focusing on local regions of interest near optimal solutions, RSM effectively refines and enhances designs or processes without the need for exhaustive exploration of the entire solution space. This targeted approach conserves computational resources and time. Moreover, the resulting models are often interpretable due to their mathematical simplicity, such as polynomial equations. This interpretability enhances understanding of how variable changes influence the response, thereby supporting informed decision-making and facilitating process improvement efforts.
The modeling is carried out through the estimation of a polynomial of the first order or higher, generally applying the ordinary least squares method, to develop a second-order model on a response surface as illustrated in Equation (5):

$$ y = \beta_0 + \sum_{i=1}^{k} \beta_i x_i + \sum_{i=1}^{k} \beta_{ii} x_i^2 + \sum_{i<j} \beta_{ij} x_i x_j + \varepsilon \quad (5) $$

where $y$ is the response variable, $x_1, \dots, x_k$ are the input variables or factors, $\beta_0$ is the intercept, $\beta_i$ is the effect of factor $x_i$ on $y$, $\beta_{ij}$ is the effect of the interaction between $x_i$ and $x_j$, and $\varepsilon$ is the error term.
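A model of this form can be fitted with the rsm package in R; a minimal sketch, in which the dataset name and coded factor columns are placeholders for the logged charge-cycle data:

```r
library(rsm)

# Minimal sketch of fitting the second-order model in Equation (5);
# 'charge_runs' (coded factors x1-x3, response charge_time) stands in
# for the authors' actual dataset.
model <- rsm(charge_time ~ SO(x1, x2, x3), data = charge_runs)
summary(model)  # coefficients, ANOVA table, R-squared, stationary point
```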
RSM is a sequential procedure: when a first-order model adequately describes the region explored, the current operating conditions are assumed to be far from the optimum. The objective is, starting from the current operating conditions, to follow a trajectory towards the optimum by applying an algorithm such as steepest ascent to maximize the response (or steepest descent to minimize it); near the optimum, a second-order model is fitted, whose stationary point represents the point of maximum (or minimum) response [18]. In order to characterize the shape of the surface and locate the optimum, response surfaces are plotted in two dimensions (contour plots) or three dimensions (response surface plots). The coefficients of the second-order model can be estimated through regression analysis, and an analysis of variance (ANOVA) is applied in order to interpret the effect that the input variables have on the response.
The initial voltage, final voltage, and current were selected as input variables, with the charge time serving as the response variable. The levels and variables are outlined in Table 1. The selection of current, voltage, and temperature ranges was based on several factors critical to the study’s goals. These factors included adhering to the operational limits of the equipment under investigation, ensuring safe and reliable operation within manufacturer-recommended specifications, and covering a sufficiently wide range to capture significant variations in charge time.
2.2. Experimental Analysis
The process was conducted in six phases, which are delineated in the following sections.
Step 1. Experimental Design
The critical variables selected for investigation were the charging current, battery voltage, and the temperatures of the battery and its surrounding environment. These variables were selected on the basis of their direct impact on system efficiency and their pivotal role in the design of a highly effective telemetry system for electric vehicles. The experimental design is presented in Table 1. The current levels ranged between 1.0 and 3.0 A, the voltage varied between 51.0 and 51.4 V, and a $2^3$ RSM design with three central points was selected.
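Such a design can be generated with the rsm package; a minimal sketch, assuming the $2^3$ factorial with three center points described above and assuming both voltage factors span the reported 51.0–51.4 V range (Table 1 holds the authoritative levels):

```r
library(rsm)

# Sketch of the assumed 2^3 factorial with three center points, in
# coded units; the decoding formulas mirror the ranges quoted above.
design <- cube(3, n0 = 3, coding = list(
  x1 ~ (current - 2.0) / 1.0,     # A
  x2 ~ (v_initial - 51.2) / 0.2,  # V (assumed range)
  x3 ~ (v_final - 51.2) / 0.2     # V (assumed range)
))
design  # 8 factorial runs + 3 center points, randomized run order
```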
Step 2. Design and Development of the Prototype
The developed prototype is distinguished by the integration of current, voltage, and temperature sensors, in accordance with the previously conceived design. To guarantee consistent operation and precise communication, the sensor configuration was optimized, and an appropriate microcontroller was configured.
Step 3. Data Acquisition and Registration
Once the prototype was installed in the electric vehicle, it was configured to capture telemetry data on battery charge and discharge cycles. This enabled the accurate collection of relevant information during crucial operational situations.
Step 4. Data Analysis
The data obtained underwent rigorous pre-processing to eliminate outliers and rectify possible errors in the acquisition system. To guarantee the integrity and reliability of the data, advanced statistical techniques were applied. These techniques allowed for the discovery of subtle patterns and interdependent relationships between key variables.
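As an illustration of the outlier-removal step, a sketch using the common 1.5×IQR rule; the authors' exact pre-processing pipeline is not specified, so this is only a plausible stand-in:

```r
# Illustrative outlier screen on the logged current using the 1.5*IQR
# rule; 'telemetry' is a placeholder data frame of logged samples.
q   <- quantile(telemetry$current_a, c(0.25, 0.75), na.rm = TRUE)
iqr <- q[2] - q[1]
keep  <- telemetry$current_a >= q[1] - 1.5 * iqr &
         telemetry$current_a <= q[2] + 1.5 * iqr
clean <- telemetry[keep, ]  # retain rows inside the whisker limits
```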
Step 5. Determination of the Optimal Charge Current
A statistical analysis based on a response surface design, which explored different combinations of current and voltage levels, enabled us to establish the optimal charging time. This resulting value, which aimed to maximize electric vehicle efficiency by minimizing battery charging time, was derived directly from the results obtained from this detailed analysis. A confidence level of 95% ($\alpha = 0.05$) was established.
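The steepest-descent path can be computed directly from the fitted second-order model with the rsm package; a minimal sketch, assuming the `model` object from the earlier sketch:

```r
library(rsm)

# Path of steepest descent from the fitted second-order model;
# descent = TRUE because charging time is to be minimized. Each row
# gives coded/decoded factor settings and the predicted response.
steepest(model, dist = seq(0, 2, by = 0.25), descent = TRUE)
```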
Step 6. Validation of the Prototype and Results
To ensure the reliability and effectiveness of the prototype, it underwent tests under various operating conditions. The results obtained were evaluated in accordance with the objectives previously outlined, providing an accurate assessment of the feasibility and coherence of the approach adopted.
All mathematical and statistical calculations were performed using RStudio Desktop 4.3.2.
3. Results and Discussion
This article presents a telemetry system designed to measure the factors that affect the charging time of a lithium-ion battery bank. The system integrates current, voltage, and temperature sensors into a microcontroller-based prototype. The primary objective was to minimize the charging time, treated as the response variable, thereby improving the efficiency of the vehicle. To achieve this goal, data were collected throughout the electric battery charging cycles. Once the telemetry data were collected, a response surface design was applied using current and voltage as factors that varied from low to high values with the goal of minimizing the battery charging time.
Telemetry System for Competitive Electric Vehicles
The telemetry system designed for competitive electric vehicles is shown in Figure 5. This system integrates key sensors, including those for current, voltage, battery temperature, ambient temperature, distance, and GPS.
Data transmission is facilitated by LoRa technology, which ensures reliable communication over long distances. Data are encapsulated in JSON format. The Node-RED platform serves as the interface for data visualization and management, allowing competition teams to monitor vehicle performance in real time. The customization of the PCB and its rendering are shown in Figure 6, ensuring the resilience of the system under extreme conditions. This is supported by a Raspberry Pi hosting a Node.js server for real-time data processing and storage.
The prototype, which has been assigned a Technology Readiness Level (TRL) of 6, has been completed, soldered, and tested. Figure 7 depicts the fully assembled prototype. The robust construction of the prototype ensures its resistance to vibrations, impacts, and adverse conditions. Additionally, progress has been made in filling out the registration forms for both the prototype and the software used in the telemetry system. At TRL 6, the prototype has been subjected to testing in a relevant environment, thereby ensuring that the conditions under which it has been evaluated are very close to those expected in actual operation.
Moreover, a Human–Machine Interface (HMI) has been developed to graphically represent the telemetry data obtained from the vehicle. The HMI is implemented using the Node-RED dashboard, which allows for real-time visualization of key parameters such as temperature, current, and voltage. Additionally, the HMI provides detailed insights into the battery voltage, current consumed by the electric vehicle, power output, distance traveled during the test, ambient temperature, and battery temperature. This comprehensive visualization enables operators to monitor the vehicle’s performance and battery status effectively, ensuring optimal operation and timely detection of any anomalies. Figure 8 illustrates the implemented HMI.
Screening Experiments
The previously described experimental design methodology was employed. The response surface methodology is a sequential process. The initial step was to identify, from the selected factors, those that influence the response variable. In the second stage, the optimal value of the response variable was identified. A $2^3$ response surface design was fitted, with the objective of minimizing the battery charging time using the steepest descent algorithm. The coefficients are presented in Table 2, and the results of the analysis of variance (ANOVA) are shown in Table 3.
From Table 2 and Table 3, it can be concluded that all the coefficients are statistically significant, as the p-values are less than 0.05. The current is also statistically significant through its interaction with the initial voltage. The second-order model is therefore adequate for finding the optimal operating conditions, with an R-squared value of 99%. The optimal conditions as determined by the steepest descent algorithm are presented in Table 4.
As indicated by the estimated coefficients in Table 2, the initial voltage was identified as the primary variable influencing the minimization of the charge time. With regard to the interactions, a positive relationship can be observed between the initial and final voltage. This is due to the battery’s capacity to store more energy during charging, thereby enhancing the efficiency of the process. In contrast, the inverse relationship between the current and final voltage can be attributed to the internal resistance inherent to the battery, which causes a voltage drop as current flows through it. An increase in charging current amplifies the voltage drop across this internal resistance, resulting in a more pronounced voltage decline and a subsequent reduction in the available voltage at the battery terminals.
It is crucial to acknowledge that while integrated temperature sensors were incorporated into the system for monitoring purposes, temperature was not utilized as a variable in the optimization process. This is due to the inherent difficulties associated with controlling temperature in a real-world setting.
Figure 9, Figure 10 and Figure 11 illustrate the response surface plots, where the z-axis indicates the response variable (charging time) and the x- and y-axes represent the corresponding input factors. The objective is to reduce the time required for battery charging, with the optimal conditions indicated by red hues, as shown by the color bar on the right side of each figure and explained below.
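Plots of this kind can be reproduced in R from the fitted model; a minimal sketch using the rsm package's contour method (the factor pairings are illustrative):

```r
library(rsm)

# Contour plots analogous to Figures 9-11, drawn from the fitted
# second-order model; one filled-image panel per pair of coded factors.
par(mfrow = c(1, 3))
contour(model, ~ x1 + x2 + x3, image = TRUE)
```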
Figure 9 depicts the response surface illustrating the relationship between the charging current and the battery bank charging time. It can be observed that the optimal charging current is approximately 2.38 A, with an estimated charging time of 6961.08 s. Given that the objective is to minimize the charging time, this configuration is considered optimal.
The figure also demonstrates that as the charging current increases, the charging time decreases significantly, as evidenced by the red areas on the surface. It is, however, important to note that an increase in the charging current also raises the battery temperature, which can lead to battery failure, reduce battery lifespan, and pose risks to the electric vehicle.
Consequently, although augmenting the charging current may appear to be an efficacious strategy for reducing charging time, it is of the utmost importance to achieve a balance that considers both the efficiency of the charging time and the safety and durability of the battery. It is therefore essential that optimization strategies encompass not only the reduction in charging times but also the implementation of effective thermal management techniques to prevent overheating and ensure the safe and prolonged operation of the battery system.
Several authors have investigated various aspects of charging optimization, reliability analysis, and project-based learning related to lithium-ion batteries and electric vehicles. These topics include charging strategies based on temperature rise and charge time, multi-objective optimization methods, reliability optimization for battery packs, and project-based learning for electric vehicle innovation. A number of charging optimization strategies have been put forth for lithium-ion batteries, with particular emphasis on achieving a balance between charging time and battery longevity.
Figure 10 illustrates the response surface and contour plot for varying charging currents and initial voltages. The optimal region is observed to be around 6000 s. As illustrated by the response surface, an increase in the initial charging voltage results in a reduction in the charging time. Furthermore, it is evident that a higher charging current results in a shorter charging time, with the initial voltage having a lesser influence on this relationship.
It is crucial to acknowledge that lithium batteries necessitate meticulous regulation of the charging current to avert an escalation in temperature, which could result in battery deterioration, diminished lifespan, and potential safety concerns. While an increase in the charging current can significantly reduce the charging time, it is essential to balance this with thermal management strategies to maintain battery integrity and ensure safe operation.
Figure 11 illustrates the response surface and contour plot for the final voltage of the battery bank in relation to the charging current. It can be observed that as the final charging voltage increases, the charging time also increases. The time required for the battery to reach a full charge can be reduced by increasing the charging current, as illustrated in the contour plot, where the optimal region is approximately 8000 s. Nevertheless, it is of paramount importance to regulate this phenomenon, as an increase in the charging current also results in an elevated battery temperature. Such temperature increases have the potential to impact performance, reduce the battery’s lifespan, and pose safety risks.
The optimization of the charging process must strike a balance between reducing the time required for charging and ensuring thermal safety. Although augmenting the charging current may appear to be an efficacious method for reducing charging time, it is imperative to implement temperature control measures to prevent damage and mitigate risks. It is of paramount importance to implement effective thermal management strategies in order to maintain the integrity of the battery throughout the charging process. In conclusion, Figure 11 illustrates the intricate relationship between the final voltage and charging current in the battery charging process, underscoring the significance of identifying optimal conditions and implementing thermal management strategies to achieve a balance between charging efficiency and operational safety.
In order to enhance the precision of the solution at elevated charging currents, the authors of [34] devised a strategy that integrated an Enhanced Thermal Behavior Model and a genetic algorithm (GA). This method employs a polarization-based optimization strategy, which aims to achieve a balance between charging speed and battery lifetime. The GA is employed to identify the optimal charging current trajectories, taking into account temperature rise constraints and charging time.
In their study, ref. [35] employed a multi-objective charging optimization method that divides the charging process into multiple stages. This method optimizes the charging currents for each stage. The objective of this method is to reduce both the charging time and temperature rise, taking into account the limitations of charging current in different state-of-charge (SOC) stages. An offline first-order equivalent circuit model and a battery thermal model are employed to estimate temperature rise and to balance the charging time and temperature increase. The optimal charging currents for the eight stages were determined to be 39.4, 36.4, 30.1, 14.0, 13.1, 12.4, 8.2, and 10.0 A, resulting in a charging process that took 1620 s with a temperature rise of 3.37 °C. Nevertheless, attaining this optimal time necessitates the use of a high-capacity battery, which differs considerably from the one employed in the present study.
Ref. [36] employed a Variable Frequency Pulse Charge System (VFPCS) to identify the optimal pulse charge frequency for providing optimal pulse current (PC) charging to the battery, thereby reducing charging time. In comparison to the conventional constant current–constant voltage (CC-CV) charge system, the proposed method demonstrated a 21% enhancement in charging speed. A novel optimization approach was also developed, utilizing Particle Swarm Optimization (PSO) and a fuzzy-deduced fitness evaluator, with the objective of determining an optimal charging pattern. This approach was able to charge batteries to above 80% capacity in 3060 s, resulting in a 56.8% reduction in charging time in comparison to the conventional CC-CV method.
Other studies concentrate on the optimization of the charging process, with the objective of enhancing battery performance and longevity through the management of current and voltage profiles. In their study, ref. [33] introduced a method for selecting optimal charging parameters that maximize battery cycle life while minimizing charging time. This approach considers the effects of voltage and current on capacity curves. The battery is charged at a constant current until a specified voltage is reached, after which the voltage is maintained at that level while the current is gradually decreased. The optimal current in the constant current (CC) stage is determined by the ratio of the weighting on total charging time (TTC) to that on energy loss (EL), as well as the battery’s resistance. The paper presents models that accurately predict the capacity of the battery over multiple cycles. The models and methods are validated with experimental data, thereby demonstrating their effectiveness in real-world scenarios.
As stated in reference [37], the maximum acceptable terminal voltage and the maximum current during charging represent pivotal parameters that exert a considerable influence on the charging process. These parameters affect not only the lifespan of the battery but also the efficiency and speed of charging. For example, elevated current levels can reduce the total charging time (TTC) but also increase energy losses (EL) due to the proportional relationship between EL and the square of the current.
Regarding the identification of battery state of charge (SOC) from experimental data, the article [38] examines algorithms for estimating SOC in lithium-ion batteries using model-adaptive Kalman filters. The study examines a range of approaches for parameterizing battery models and assesses the performance of the algorithms across diverse scenarios, with the objective of enhancing the accuracy and reliability of SOC estimation for electric vehicles. Moreover, the research highlights the pivotal importance of precise measurements and comprehensive battery models in advancing the capabilities of SOC estimation for lithium-ion batteries.
4. Conclusions
The principal objective of this study was to reduce the time required for vehicle charging, which was treated as the response variable, with the aim of enhancing vehicle efficiency. The study was informed by telemetry data collected during electric battery charging cycles. Subsequently, a response surface design was employed, whereby the current and voltage were varied across a range of low to high values, with the objective of minimizing the charging time of the battery. The response surface methodology and the steepest descent algorithm demonstrated that both current and voltage have a significant impact on charging time. The minimum charging time of 6961 s was achieved with a current of 2.4 A and voltages of 50.5 V and 43.6 V. As previously stated, temperature was monitored but not utilized as a variable in the optimization process.
The detailed analysis of response surfaces reveals the importance of striking a balance between charging efficiency and thermal management to maximize the performance and durability of batteries in electric vehicles. It has been demonstrated that reducing the charging time necessitates maintaining a low final charging voltage and increasing the charging current. For instance, it was observed that a charging current of approximately 2.38 A resulted in an optimal charging time of approximately 6961.08 s. However, it should be noted that this time may vary depending on specific charging conditions and battery capacity. The study’s results highlight the critical role of the charging current in battery charging time. The results demonstrated that an increase in the charging current resulted in a significant reduction in charging time. However, this advantage must be weighed against the potential detrimental impact on battery temperature. An increase in the charging current was observed to result in elevated temperatures, which could potentially compromise the long-term safety and durability of the battery.
While fast charging offers benefits in terms of convenience and practicality for electric vehicle users, it also presents significant challenges in terms of thermal management and battery safety. The findings of this study underscore the necessity for further research and the development of fast charging strategies that minimize the negative impacts on battery temperature while maximizing charging efficiency. The advent of novel thermal management techniques and advanced battery materials presents a promising avenue for addressing these challenges and advancing towards more efficient and secure energy storage systems.
RSM is a robust methodology for optimizing processes and elucidating intricate variable relationships. However, it is not without limitations. For example, RSM requires the assumption of polynomial relationships; if the true relationship between variables such as charge time, voltage, and current is highly nonlinear or exhibits complex interactions beyond the experimental range, RSM may not accurately model system behavior. It is therefore recommended that future research explore alternative optimization approaches, such as genetic algorithms, surrogate models, or simulated annealing, which can accommodate more complex and nonlinear relationships effectively.