Comparison of ECG Baseline Wander Removal Techniques and Improvement Based on Moving Average of Wavelet Approximation Coefficients

2021 ◽  
Vol 25 (2) ◽  
pp. 183-204
Author(s):  
Mounaim Aqil ◽  
Atman Jbari ◽  
Abdennasser Bourouhou ◽  
...  

Baseline wander is among the artifacts that corrupt the ECG signal. This noise can distort important signal features, in particular the ST segment, an important marker for the diagnosis of ischemia. This paper studies the effectiveness of several methods for suppressing baseline wander (BW) in ECG signals and, as a result, proposes a new technique called moving average of wavelet approximation coefficients (DWT-MAV). The techniques compared are the moving average, approximation of the baseline by polynomial fitting, Savitzky-Golay filtering, and the discrete wavelet transform (DWT). The comparison uses the main criteria for assessing BW denoising quality: mean square error (MSE), percent root mean square difference (PRD), and correlation coefficient (COR). Three further comparison criteria are proposed, namely the number of samples of the ECG signal, the baseline frequency variation, and the processing time; two of these new indices relate to possible real-time ECG denoising. To improve BW suppression under these criteria, including the new indices, the proposed DWT-MAV method combines the DWT with moving averaging. This new technique achieves the best compromise in terms of MSE, PRD, correlation coefficient, and processing time. Simulations were performed on ECG recordings from the MIT-BIH database with both synthetic and real baselines.
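A minimal sketch of the DWT-MAV idea, assuming PyWavelets; the wavelet, decomposition level, and moving-average window below are illustrative choices, not the paper's exact settings:

```python
import numpy as np
import pywt

def dwt_mav_baseline_removal(ecg, wavelet="db4", level=6, window=5):
    """Estimate baseline wander from a moving average of the DWT
    approximation coefficients, then subtract it from the ECG."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    # Smooth the approximation coefficients with a moving average
    kernel = np.ones(window) / window
    smoothed = np.convolve(coeffs[0], kernel, mode="same")
    # Reconstruct the baseline from the smoothed approximation alone
    baseline_coeffs = [smoothed] + [np.zeros_like(c) for c in coeffs[1:]]
    baseline = pywt.waverec(baseline_coeffs, wavelet)[: len(ecg)]
    return ecg - baseline
```

Because only the approximation branch is reconstructed, the subtraction leaves the higher-frequency ECG content (QRS complexes, ST segment) intact.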

2017 ◽  
Vol 8 (1) ◽  
pp. 32-45 ◽  
Author(s):  
KM Talha Nahiyan ◽  
Abdullah Al Amin

The ECG (electrocardiogram) is a measure of the heart's electrical activity. Because the body is a volume conductor, the ECG can be recorded from the body surface, but the recorded signal is corrupted by various types of noise and artifact. Among these, baseline wander severely hampers the ECG signal: it is of very low frequency and causes the ECG to deviate from its isoelectric line, riding on the lower-frequency artifact. The proposed method is based on the Savitzky-Golay filter, a moving-average filter that also takes the polynomial order into account when approximating the signal, which lets it approximate the baseline wander quite efficiently. Though in some cases it distorts the ECG signal to some extent, compared with the usual polynomial-fitting method it demonstrates superiority in terms of accuracy, simplicity, and generalization.

Bangladesh Journal of Medical Physics Vol.8 No.1 2015 32-45
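The baseline-estimation step can be sketched with SciPy's `savgol_filter`; the window length and polynomial order here are illustrative and would in practice be tuned to the sampling rate:

```python
import numpy as np
from scipy.signal import savgol_filter

def remove_baseline_savgol(ecg, window_length=501, polyorder=3):
    """A long, low-order Savitzky-Golay fit tracks only the slow baseline
    wander; subtracting it returns the ECG to its isoelectric line."""
    baseline = savgol_filter(ecg, window_length, polyorder)
    return ecg - baseline, baseline
```

The window length must be odd and should span many heartbeats so that the fit averages out the QRS complexes while following the slow drift.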


Author(s):  
R. SHANTHA SELVA KUMARI ◽  
V. SADASIVAM

In this paper, off-line double-density discrete wavelet transform based de-noising and baseline wander removal methods are proposed. The decomposition depth is adapted to the noise level so as to give a better result: three levels of decomposition when the noise level is low, four when it is medium, and five when it is high. Soft thresholding is applied to each set of wavelet detail coefficients, with Donoho's estimator used as the threshold for each set. The results are compared with classical filters, and the improvement in signal-to-noise ratio is discussed. Using the proposed method, the output signal-to-noise ratio is 19.7628 dB for an input signal-to-noise ratio of -7.11 dB, much higher than other methods available in the literature. Baseline wander removal is done using the double-density discrete wavelet approximation coefficients of the whole signal. This is an unsupervised method, allowing the process to be used in off-line automatic analysis of the electrocardiogram. The results are more accurate than those of other methods, with less effort.
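The thresholding scheme can be sketched as follows, assuming PyWavelets. Note the hedges: a standard single-density DWT stands in for the paper's double-density transform, and a fixed level replaces the noise-adaptive level selection:

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold the detail coefficients using Donoho's
    universal threshold sigma * sqrt(2 ln N)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Robust noise estimate from the finest-scale details (MAD / 0.6745)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    new_coeffs = [coeffs[0]] + [
        pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]
    ]
    return pywt.waverec(new_coeffs, wavelet)[: len(signal)]
```

Soft thresholding shrinks every detail coefficient toward zero by the threshold amount, which avoids the discontinuities hard thresholding introduces.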


Author(s):  
Amean Al-Safi

The electrocardiogram (ECG) is the main signal used to diagnose diseases of the human heart. During recording it is usually contaminated with several kinds of noise, including power-line interference, baseline wander, and muscle contraction. To clean the ECG signal, many noise-removal techniques have been used, such as adaptive filters, empirical mode decomposition, the Hilbert-Huang transform, wavelet-based algorithms, the discrete wavelet transform, modulus maxima of the wavelet transform, patch-based methods, and many more. Unfortunately, these methods are unsuited to online processing because they take a long time to clean the ECG signal. The current research presents a unique ECG-denoising method based on a novel adaptive-filter approach: instead of a separate reference signal, the model uses a unit delay and the primary ECG signal itself. The suggested method was tested on a simulated signal in MATLAB under different scenarios, with least mean square (LMS), normalized least mean square (NLMS), and leaky LMS used as the adaptation algorithms.
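A sketch of the delayed-input idea (essentially an adaptive line enhancer), assuming NumPy; the NLMS variant is shown, and the step size, filter order, and test signal are illustrative rather than the paper's settings:

```python
import numpy as np

def nlms_unit_delay(x, mu=0.1, order=8, eps=1e-8):
    """NLMS adaptive filter whose reference is the unit-delayed input
    itself: the filter output y tracks the correlated (ECG) part of x,
    while the error e carries the uncorrelated noise."""
    w = np.zeros(order)
    y = np.zeros_like(x)
    e = np.zeros_like(x)
    for i in range(order, len(x)):
        ref = x[i - order:i][::-1]          # taps x[i-1] ... x[i-order]
        y[i] = w @ ref
        e[i] = x[i] - y[i]
        w += mu * e[i] * ref / (ref @ ref + eps)  # normalized update
    return y, e
```

Because the quasi-periodic ECG is predictable from its recent past while broadband noise is not, the predictor output converges toward the clean signal without any external reference.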


2018 ◽  
Vol 5 (1) ◽  
pp. 41-46
Author(s):  
Rosalina Rosalina ◽  
Hendra Jayanto

The aim of this paper is to achieve high accuracy in stock-market forecasting, in order to produce signals that inform trading decisions. Several methodologies have been tried on the stock-market forecasting problem. A traditional linear model such as the autoregressive integrated moving average (ARIMA) has been used, but the results are unsatisfactory because it is not suitable for modelling financial series. Experts have therefore explored another approach using artificial neural networks. Artificial neural networks (ANNs) are more effective at realizing the input-output mapping and can estimate any continuous function to an arbitrary desired accuracy. In detail, this paper uses the maximal overlap discrete wavelet transform (MODWT) and graph theory to separate low and high frequencies, which here act as the fundamental and technical components of stock-market prediction. Once the processed dataset is formed, the training process generates the final result: buy or sell signals derived from whether the stock price is predicted to go up or down.
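The frequency split can be sketched with PyWavelets; as a hedge, the standard DWT is used here as a stand-in for the MODWT, and the wavelet and level are illustrative:

```python
import numpy as np
import pywt

def split_low_high(series, wavelet="db4", level=3):
    """Split a price series into a low-frequency ('fundamental') trend
    and a high-frequency ('technical') remainder via wavelet decomposition."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    low_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    low = pywt.waverec(low_coeffs, wavelet)[: len(series)]
    high = series - low
    return low, high
```

By construction the two parts sum back to the original series, so no information is lost before the two components are fed to downstream models.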


2011 ◽  
Vol 2011 ◽  
pp. 1-11 ◽  
Author(s):  
Erol Egrioglu ◽  
Cagdas Hakan Aladag ◽  
Cem Kadilar

Seasonal Autoregressive Fractionally Integrated Moving Average (SARFIMA) models are used in the analysis of seasonal time series with long-memory dependence. Two methods are available to estimate the parameters of SARFIMA models: conditional sum of squares (CSS) and the two-staged method introduced by Hosking (1984). However, no simulation study of them has been conducted in the literature, so it is not known how these methods behave under different parameter settings and sample sizes. The aim of this study is to show the behaviour of these methods by a simulation study. Based on the simulation results, the advantages and disadvantages of both methods under different parameter settings and sample sizes are discussed by comparing the root mean square error (RMSE) obtained by the CSS and two-staged methods. The comparison shows that the CSS method produces better results than the two-staged method.
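The long-memory operator at the heart of (S)ARFIMA models is the fractional difference (1 - B)^d, whose binomial weights can be generated recursively. A minimal illustration of that operator (not the CSS or two-staged estimator itself):

```python
import numpy as np

def frac_diff_weights(d, n_terms):
    """Weights of (1 - B)^d: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
    w = np.empty(n_terms)
    w[0] = 1.0
    for k in range(1, n_terms):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply the truncated fractional difference to a series."""
    w = frac_diff_weights(d, len(x))
    # y[k] = sum_j w[j] * x[k - j]
    return np.array([w[: k + 1] @ x[k::-1] for k in range(len(x))])
```

For integer d the weights collapse to the ordinary difference operator; for fractional d they decay slowly, which is exactly what gives the model its long memory.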


2021 ◽  
Vol 26 (1) ◽  
pp. 13-28
Author(s):  
Agus Sulaiman ◽  
Asep Juarna

Among the causes of unemployment in Indonesia are the urbanisation rate, the industrialisation rate, the proportion of the senior-high-school (SLTA) labour force, and provincial minimum wages. These factors make the unemployment-rate data somewhat volatile, so a forecast of the future unemployment rate is needed. In this study, the researchers perform time-series forecasting using the Box-Jenkins method with the Autoregressive Integrated Moving Average (ARIMA) model and the Exponential Smoothing method with the Holt-Winters model. Forecasting uses an unemployment-rate dataset from 2005 to 2019, sampled every six months (February and August). The researchers compare the models by the smallest Root Mean Square Error (RMSE) and Mean Square Error (MSE). Based on the results, ARIMA(0,1,12) is the best model for the Box-Jenkins method, while Holt-Winters with alpha (level) = 0.3 and beta (trend) = 0.4 is the best for the Exponential Smoothing method. Model selection concludes with a comparison of RMSE and MSE: the ARIMA(0,1,12) model yields RMSE = 1.01 and MSE = 1.0201, while the Holt-Winters model yields RMSE = 0.45 and MSE = 0.2025. On this basis, the Holt-Winters model is selected as the best model for forecasting the unemployment rate in Indonesia.


2021 ◽  
Vol 52 (1) ◽  
pp. 6-14
Author(s):  
Amit Tak ◽  
Sunita Dia ◽  
Mahendra Dia ◽  
Todd Wehner

Background: Forecasting of Coronavirus Disease-19 (COVID-19) dynamics is a centrepiece of evidence-based disease management. Numerous mathematical-modelling approaches have been used to predict the course of the pandemic, including data-driven, empirical, and hybrid models. This study aimed to predict the evolution of COVID-19 in India using an autoregressive integrated moving average (ARIMA) model. Material and Methods: Real-time Indian data on cumulative COVID-19 cases and deaths were retrieved from the Johns Hopkins dashboard. The dataset from 11 March 2020 to 25 June 2020 (n = 107 time points) was used to fit the ARIMA model, and the model with the minimum Akaike Information Criterion was used for forecasting. The predicted root mean square error (PredRMSE) and base root mean square error (BaseRMSE) were used to validate the model. Results: The ARIMA (1,3,2) and ARIMA (3,3,1) models fit best for cumulative cases and deaths, respectively, with minimum Akaike Information Criterion. The predictions of cumulative cases and deaths for the next 10 days, 26 June 2020 to 5 July 2020, showed a trend toward continuous increment. The PredRMSE and BaseRMSE of the ARIMA (1,3,2) model were 21,137 and 166,330, respectively; those of the ARIMA (3,3,1) model were 668.7 and 5,431. Conclusion: It is proposed that COVID-19 data be collected continuously and forecast in real time. COVID-19 forecasts assist governments in resource optimisation and evidence-based decision-making for the evolving situation.

