Battery State of Health Estimation with Improved Generalization Using Parallel Layer Extreme Learning Machine

Energies ◽  
2021 ◽  
Vol 14 (8) ◽  
pp. 2243
Author(s):  
Ethelbert Ezemobi ◽  
Andrea Tonoli ◽  
Mario Silvagni

The online estimation of battery state of health (SOH) is crucial to ensure the reliability of the energy supply in electric and hybrid vehicles. An approach for enhancing the generalization of SOH estimation using a parallel layer extreme learning machine (PL-ELM) algorithm is analyzed in this paper. The deterministic and stable PL-ELM model is designed to overcome the drift problem associated with some conventional machine learning algorithms, hence extending the application of a single SOH estimation model over a large set of batteries of the same type. The PL-ELM model was trained with selected features that characterize the SOH. These features are acquired as the discrete variation of indicator variables including voltage, state of charge (SOC), and the energy releasable by the battery. The model was trained with an experimental battery dataset collected at room temperature under a constant current load during discharge phases. Model validation was performed with a dataset from other batteries of the same type that were aged under a constant load condition. The model achieved optimal performance with low error variance. The root mean square error (RMSE) of the validated model varies from 0.064% to 0.473%, and the mean absolute error (MAE) from 0.034% to 0.355% for the battery sets tested. On the basis of performance, the model was compared with a deterministic extreme learning machine (ELM) and an incremental capacity analysis (ICA)-based scheme from the literature. The algorithm was tested on a Texas Instruments F28379D microcontroller unit (MCU) board, with an average execution time of 93 μs and 0.9305% CPU occupation in real time. These results suggest that the model is suitable for online applications.
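
As a point of reference for the ELM-based methods collected here, the sketch below shows a generic single-hidden-layer ELM regressor in NumPy. It is not the authors' PL-ELM, which adds a parallel layer and a deterministic, stable initialization; the feature names in the usage comment are illustrative only.

```python
# Minimal single-hidden-layer ELM regressor (generic sketch, not the paper's PL-ELM).
import numpy as np

class ELMRegressor:
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        # Random input weights and biases, fixed after initialization.
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)       # hidden-layer activations
        self.beta = np.linalg.pinv(H) @ y      # output weights by least squares
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Hypothetical usage: X holds SOH features (e.g., discrete variation of voltage,
# SOC, and releasable energy) and y holds measured SOH values.
# model = ELMRegressor(n_hidden=100).fit(X_train, y_train)
# soh_hat = model.predict(X_test)
```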

2021 ◽  
Vol 12 (4) ◽  
pp. 228
Author(s):  
Jianfeng Jiang ◽  
Shaishai Zhao ◽  
Chaolong Zhang

The state-of-health (SOH) estimation is of extreme importance for the performance maximization and upgrading of lithium-ion batteries. This paper is concerned with neural-network-enabled battery SOH indication and estimation. The insight that motivates this work is that the chi-square of the battery voltages in each constant current-constant voltage phase, together with the mean temperature, can reflect battery capacity loss effectively. An ensemble algorithm composed of an extreme learning machine (ELM) and a long short-term memory (LSTM) neural network is utilized to capture the underlying correspondence between the SOH, the mean temperature, and the chi-square of the battery voltages. NASA battery data and battery pack data are used to demonstrate the estimation procedure and the performance of the proposed approach. The results show that the proposed approach can estimate the battery SOH accurately. Comparative experiments were also designed against each method used separately, and the proposed approach shows better estimation performance in these comparisons.
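
A hedged sketch of the chi-square voltage feature described above: the statistic compares a cycle's CC-CV voltage samples against a reference curve (e.g., from a fresh cell). The exact definition used by the authors may differ.

```python
# Chi-square statistic between a cycle's voltage curve and a reference curve
# (assumed definition; the paper's exact formulation may differ).
import numpy as np

def voltage_chi_square(v_cycle, v_reference):
    """Pearson-style chi-square between sampled voltages and a reference."""
    v_cycle = np.asarray(v_cycle, dtype=float)
    v_reference = np.asarray(v_reference, dtype=float)
    return np.sum((v_cycle - v_reference) ** 2 / v_reference)

# Hypothetical usage: per-cycle inputs to the ELM/LSTM ensemble could be
# [voltage_chi_square(v_cc_cv, v_ref), mean_temperature].
```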


Author(s):  
Yu Zhang ◽  
Wanwan Zeng ◽  
Chun Chang ◽  
Qiyue Wang ◽  
Si Xu

Accurate estimation of the state of health (SOH) is an important guarantee for safe and reliable battery operation. In this paper, an online method for estimating the SOH of lithium-ion batteries is proposed, based on indirect health features (IHF) and a sparrow search algorithm fused with a deep extreme learning machine (SSA-DELM). Firstly, the temperature and voltage curves in the battery discharge data are acquired, and the optimal intervals are obtained by an exhaustive (ergodic) search. The discharge temperature difference at equal time intervals (DTD-ETI) and the discharge time interval with equal voltage difference (DTI-EVD) are extracted as IHF. Then, the input weights and hidden layer thresholds of the DELM algorithm are optimized using SSA, and the SSA-DELM model is applied to the estimation of the battery's SOH. Finally, the established model is experimentally validated using the battery data, and the results show that the method has high prediction accuracy, strong algorithmic stability, and good adaptability.
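
The two indirect health features can be sketched as follows under assumed definitions: DTD-ETI as the temperature change over a fixed time window of the discharge, and DTI-EVD as the time taken for the voltage to fall between two fixed levels. The window bounds are placeholders; the paper selects the optimal intervals by the ergodic search described above.

```python
# Sketch of the two IHF named in the abstract, under assumed definitions.
import numpy as np

def dtd_eti(time_s, temp_c, t_start, t_end):
    """Discharge temperature difference over an equal time interval."""
    temp = np.interp([t_start, t_end], time_s, temp_c)
    return temp[1] - temp[0]

def dti_evd(time_s, volt_v, v_high, v_low):
    """Discharge time interval between two equal-voltage levels."""
    # Voltage decreases during discharge, so reverse arrays for interpolation.
    t = np.interp([v_high, v_low], volt_v[::-1], time_s[::-1])
    return t[1] - t[0]
```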


2019 ◽  
Vol 2019 ◽  
pp. 1-17
Author(s):  
Ju-Young Shin ◽  
Yonghun Ro ◽  
Joo-Wan Cha ◽  
Kyu-Rang Kim ◽  
Jong-Chul Ha

Machine learning algorithms should be tested for use in quantitative precipitation estimation models of rain radar data in South Korea, because such an application can provide a more accurate estimate of rainfall than the conventional Z-R relationship-based model. The applicability of random forest, stochastic gradient boosted model, and extreme learning machine methods to quantitative precipitation estimation models was investigated using case studies with polarization radar data from the Gwangdeoksan radar station. Various combinations of input variable sets were tested, and the results showed that machine learning algorithms can be applied to build the quantitative precipitation estimation model of the polarization radar data in South Korea. The machine learning-based quantitative precipitation estimation models performed better than the Z-R relationship-based models, particularly for heavy rainfall events. Based on the evaluation criteria, the extreme learning machine was the best of the algorithms tested.
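
For context, the conventional Z-R relationship that the machine learning models are compared against can be sketched as below; the Marshall-Palmer coefficients (a = 200, b = 1.6) are textbook defaults and may differ from the operational values used in South Korea.

```python
# Conventional Z-R rainfall estimate (sketch with textbook Marshall-Palmer coefficients).
import numpy as np

def zr_rain_rate(reflectivity_dbz, a=200.0, b=1.6):
    """Convert radar reflectivity (dBZ) to rain rate (mm/h) via Z = a * R**b."""
    z_linear = 10.0 ** (np.asarray(reflectivity_dbz, dtype=float) / 10.0)
    return (z_linear / a) ** (1.0 / b)

# Example: zr_rain_rate(35.0) is roughly 5.6 mm/h.
```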


2015 ◽  
Vol 166 ◽  
pp. 164-171 ◽  
Author(s):  
L.D. Tavares ◽  
R.R. Saldanha ◽  
D.A.G. Vieira

Entropy ◽  
2021 ◽  
Vol 23 (4) ◽  
pp. 440
Author(s):  
Dingming Wu ◽  
Xiaolong Wang ◽  
Shaocong Wu

Predicting stock trends is a major challenge. Accidental factors often lead to sharp short-term fluctuations in stock markets, deviating from the original normal trend. These short-term fluctuations are noisy, which hinders the prediction of stock trends. Therefore, we used discrete wavelet transform (DWT)-based denoising to denoise the stock data. Denoising helped eliminate the influence of short-term random events on the continuous trend of a stock, and the denoised data showed more stable trend characteristics and smoothness. The extreme learning machine (ELM) is an effective training algorithm for fully connected single-hidden-layer feedforward neural networks (SLFNs); it converges quickly, yields a unique solution, and does not get trapped in local minima. Therefore, this paper proposes a combination of ELM and DWT-based denoising to predict stock trends. The proposed method was used to predict the trend of 400 stocks in China. The prediction results demonstrate the efficacy of DWT-based denoising for stock trend prediction and show excellent performance compared with 12 machine learning algorithms (e.g., recurrent neural network (RNN) and long short-term memory (LSTM)).
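
A minimal sketch of DWT-based denoising in the spirit of this abstract, using PyWavelets with soft thresholding; the wavelet, decomposition level, and threshold rule are assumptions, not necessarily the authors' settings.

```python
# DWT-based denoising of a 1-D price series (sketch with assumed settings).
import numpy as np
import pywt

def dwt_denoise(prices, wavelet="db4", level=3):
    """Denoise a price series by soft-thresholding its detail coefficients."""
    coeffs = pywt.wavedec(prices, wavelet, level=level)
    # Universal threshold estimated from the finest detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(prices)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(prices)]
```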


2019 ◽  
Vol 11 (10) ◽  
pp. 1148 ◽  
Author(s):  
Rei Sonobe

Cropland maps are useful for the management of agricultural fields and the estimation of harvest yield. Some local governments have documented field properties, including crop type and location, based on site investigations. This process, which is generally done manually, is labor-intensive, and remote-sensing techniques can be used as alternatives. In this study, eight crop types (beans, beetroot, grass, maize, potatoes, squash, winter wheat, and yams) were identified using gamma naught values and polarimetric parameters calculated from TerraSAR-X (or TanDEM-X) dual-polarimetric (HH/VV) data. Three indices (difference (D-type), simple ratio (SR), and normalized difference (ND)) were calculated using gamma naught values and m-chi decomposition parameters and were evaluated in terms of crop classification. We also evaluated the classification accuracy of four widely used machine-learning algorithms (kernel-based extreme learning machine, support vector machine, multilayer feedforward neural network (FNN), and random forest) and two multiple-kernel methods (multiple kernel extreme learning machine (MKELM) and multiple kernel learning (MKL)). MKL performed best, achieving an overall accuracy of 92.1%, and proved useful for the identification of crops with small sample sizes. The difference (raw or normalized) between double-bounce scattering and odd-bounce scattering helped to improve the identification of squash and yam fields.
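
The three index types named above reduce to simple pixel-wise operations on a pair of channels; the sketch below assumes generic gamma naught or m-chi decomposition inputs.

```python
# Difference, simple ratio, and normalized difference indices (channel names are placeholders).
import numpy as np

def d_type(a, b):
    """Difference (D-type) index."""
    return a - b

def simple_ratio(a, b):
    """Simple ratio (SR) index."""
    return a / b

def normalized_difference(a, b):
    """Normalized difference (ND) index."""
    return (a - b) / (a + b)

# Example: normalized_difference(gamma0_hh, gamma0_vv), or the difference between
# double-bounce and odd-bounce m-chi scattering components.
```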


Agronomy ◽  
2021 ◽  
Vol 11 (11) ◽  
pp. 2344
Author(s):  
Mansoor Maitah ◽  
Karel Malec ◽  
Ying Ge ◽  
Zdeňka Gebeltová ◽  
Luboš Smutka ◽  
...  

Machine learning algorithms have been applied in agriculture to forecast crop productivity. Previous studies have mainly focused on the whole crop growth period, while the predictive ability of different time windows has remained unknown. Here, the growth period was divided into individual months to assess their corresponding predictive ability, taking maize production (silage and grain) in Czechia as a case study. We present a thorough assessment of county-level maize yield prediction in Czechia using a machine learning algorithm (the extreme learning machine (ELM)) and an extensive set of weather data and maize yields from 2002 to 2018. The results show that sunshine in June and water deficit in July were the most influential factors for silage maize yield, while the two primary climate parameters for grain maize yield were minimum temperature in September and water deficit in May. The average absolute relative deviation (AARD), root mean square error (RMSE), and coefficient of determination (R2) of the proposed models are 6.565–32.148%, 1.006–1.071%, and 0.641–0.716, respectively. Based on the results, silage yield will decrease by 1.367 t/ha (a 3.826% loss), and grain yield will increase by 0.337 t/ha (a 5.394% increase) when the maximum temperature in May increases by 2 °C. In conclusion, ELM models show great potential for predicting maize yield.
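
The three reported metrics, under their standard definitions, can be computed as in the sketch below; the exact formulations used in the paper are assumed to match these.

```python
# Standard definitions of AARD, RMSE, and R2 (sketch).
import numpy as np

def aard(y_true, y_pred):
    """Average absolute relative deviation, in percent."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_pred - y_true) / y_true))

def rmse(y_true, y_pred):
    """Root mean square error."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.sqrt(np.mean((y_pred - y_true) ** 2))

def r_squared(y_true, y_pred):
    """Coefficient of determination."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot
```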


2020 ◽  
Vol 10 (22) ◽  
pp. 8179
Author(s):  
Young Hwan Choi ◽  
Ali Sadollah ◽  
Joong Hoon Kim

This study proposes a novel model for the detection of cyber-attacks on water distribution systems using remote sensing data (i.e., pipe flow sensors, nodal pressure sensors, tank water level sensors, and programmable logic controllers) and machine learning approaches. The most commonly used and well-known machine learning algorithms (i.e., k-nearest neighbor, support vector machine, artificial neural network, and extreme learning machine) were compared to determine the one with the best detection performance. After identifying the best algorithm, several improved versions of it were compared and analyzed according to their characteristics. Their quantitative performance and ability to correctly classify the state of the urban water system under cyber-attack were measured using various performance indices. Among the algorithms tested, the extreme learning machine (ELM) exhibited the best performance. Moreover, the study not only identified the best-performing algorithm among those compared but also considered improved versions of it, and the comparison was performed using various representative performance indices to quantitatively measure prediction accuracy and select the most appropriate model. This study therefore provides a new perspective on the characteristics of various versions of machine learning algorithms and their application to different problems, and may serve as a case study for future cyber-attack detection work.
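
The abstract does not list the specific performance indices, so the sketch below computes common ones (accuracy, precision, recall, F1) from a binary attack/normal confusion matrix as an illustration.

```python
# Common detection performance indices from a binary confusion matrix (illustrative only).
import numpy as np

def detection_indices(y_true, y_pred):
    """Accuracy, precision, recall, and F1 for binary attack labels (1 = attack)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall, "f1": f1}
```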

