PERFORMANCE MEASURES OF THE VARIABLE STRUCTURE FILTER

2005 ◽  
Vol 29 (2) ◽  
pp. 267-295 ◽  
Author(s):  
Saeid Habibi

A new method for state estimation, referred to as the Variable Structure Filter (VSF), has recently been proposed. The VSF is a model-based predictor-corrector method. It uses an internal model to provide an initial estimate of the states and then refines this estimate with a corrective term that is a function of the system output and the upper bound of uncertainties. As such, the VSF can explicitly cater for uncertainties in its internal model. In this paper, a conceptual discussion of the VSF strategy and its performance in terms of stability, accuracy, and convergence is provided. The impact of modeling uncertainties on the performance of the VSF is discussed and quantified. The analysis is augmented by comparative simulation studies to further illustrate the concept.
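The predictor-corrector structure described above can be sketched for a scalar system. This is an illustrative sketch only: the internal model parameters, the uncertainty bound psi, and the convergence factor gamma below are assumptions, not the paper's exact formulation.

```python
import numpy as np

def vsf_step(x_est, u, z, a, b, c, psi, gamma=0.9):
    """One predictor-corrector step of a scalar variable structure
    filter (illustrative sketch; a, b, c, psi, gamma are assumed
    parameters, not the paper's exact formulation).

    The internal model x[k+1] = a*x[k] + b*u[k], z[k] = c*x[k] gives
    a prediction; the discontinuous corrective term is a function of
    the output error and the uncertainty bound psi.
    """
    x_pred = a * x_est + b * u   # prediction from the internal model
    e_pred = z - c * x_pred      # a-priori output error
    # Corrective term: forces the a-posteriori output error into a
    # boundary layer of width gamma*psi around zero.
    x_new = x_pred + (abs(e_pred) + gamma * psi) * np.sign(e_pred) / c
    return x_new
```

After the correction, the a-posteriori output error |z - c*x_new| collapses to gamma*psi, i.e. the estimate is confined to a neighborhood of the true state whose width is set by the assumed uncertainty bound.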

2021 ◽  
Vol 13 (22) ◽  
pp. 4612
Author(s):  
Yu Chen ◽  
Luping Xu ◽  
Guangmin Wang ◽  
Bo Yan ◽  
Jingrong Sun

As a new-style filter, the smooth variable structure filter (SVSF) has attracted significant interest. Based on the predictor-corrector method and the sliding mode concept, the SVSF is more robust to modeling errors and uncertainties than the Kalman filter. Since estimation performance is usually insufficient in real cases where the measurement vector has fewer dimensions than the state vector, an improved SVSF (ISVSF) is proposed that combines the existing SVSF with Bayesian theory. The ISVSF has two steps: first, a preliminary estimation is performed by the SVSF; second, Bayesian formulas are applied to refine the estimate for higher accuracy. The ISVSF is highly robust to modeling uncertainties and noise, and delivers satisfactory performance even when the state of the system undergoes a sudden change. Simulation results for target tracking show that the proposed ISVSF can outperform existing filters.
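The second, Bayesian step can be illustrated with a scalar Gaussian fusion. This is a minimal sketch under assumed variances; the ISVSF's actual formulas are more involved.

```python
def bayes_refine(x_svsf, var_svsf, z, var_z):
    """Scalar sketch of the second (Bayesian) ISVSF stage: treat the
    preliminary SVSF estimate and the measurement as independent
    Gaussian sources and fuse them by precision weighting.
    (Illustrative only; the variances are assumed inputs.)"""
    k = var_svsf / (var_svsf + var_z)   # weight given to the measurement
    x_post = x_svsf + k * (z - x_svsf)  # refined estimate
    var_post = (1.0 - k) * var_svsf     # reduced uncertainty
    return x_post, var_post
```

With equal variances the refined estimate lands halfway between the SVSF estimate and the measurement; a very noisy measurement barely moves it.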


2020 ◽  
Vol 30 (Supplement_5) ◽  
Author(s):  
M Poldrugovac ◽  
J E Amuah ◽  
H Wei-Randall ◽  
P Sidhom ◽  
K Morris ◽  
...  

Abstract Background: Evidence of the impact of public reporting of healthcare performance on quality improvement is not yet sufficient to draw firm conclusions, despite the important policy implications. This study explored the impact of implementing public reporting of performance indicators of long-term care facilities in Canada. The objective was to analyse whether improvements can be observed in performance measures after publication. Methods: We considered 16 performance indicators in long-term care in Canada, 8 of which are publicly reported at the facility level, while the other 8 are privately reported. We analysed data from the Continuing Care Reporting System, managed by the Canadian Institute for Health Information and based on information collected with RAI-MDS 2.0© between fiscal years 2011 and 2018. A multilevel model was developed to analyse time trends before and after publication, which started in 2015. The analysis was also stratified by key sample characteristics, such as the facilities' jurisdiction, size, urban or rural location, and performance prior to publication. Results: Data from 1087 long-term care facilities were included. Among the 8 publicly reported indicators, the trend in the period after publication did not change significantly in 5 cases, improved in 2 cases, and worsened in 1 case. Among the 8 privately reported indicators, no change was observed in 7 and worsening in 1. The stratification suggests that for indicators already improving prior to public reporting, there was either no change in trend or a decrease in the rate of improvement after publication; for indicators with a worsening trend prior to public reporting, the contrary was observed. Conclusions: Our findings suggest that public reporting of performance data can support change. The trends of performance indicators prior to publication appear to affect whether further change occurs after publication. Key messages: Public reporting is likely one of the factors affecting change in performance in long-term care facilities. Public reporting of performance measures in long-term care facilities may support improvements, particularly where improvement was not observed before publication.
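A before-after trend comparison of the kind described can be sketched as a segmented (interrupted time-series) regression. This single-series OLS version is a simplification of the study's multilevel model, and the indicator values in the usage below are hypothetical.

```python
import numpy as np

def slope_change(years, values, pub_year):
    """Fit value ~ b0 + b1*t + b2*post + b3*t*post, where t is time
    centred on the publication year and post indicates the period
    after publication. Returns b3, the change in trend after
    publication. (Simplified single-series OLS sketch, not the
    study's multilevel model.)"""
    t = np.asarray(years, float) - pub_year
    post = (t >= 0).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, t * post])
    beta, *_ = np.linalg.lstsq(X, np.asarray(values, float), rcond=None)
    return beta[3]
```

A positive b3 indicates a faster improvement (or slower worsening) after publication, which is the quantity the stratified analysis compares across indicator groups.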


2021 ◽  
Vol 13 (11) ◽  
pp. 5795
Author(s):  
Sławomir Biruk ◽  
Łukasz Rzepecki

Reducing the duration of construction works requires additional organizational measures, such as selecting construction methods that assure a shorter realization time, engaging additional resources, working overtime, or allowing construction works to be performed simultaneously in the same working units. The simultaneous work of crews may affect the quality of works and the efficiency of construction processes. This article presents a simulation model for assessing the impact of the overlap period on the extension of the crews' working time and the reduction of a repetitive project's duration under random conditions. The purpose of the simulation studies is to provide construction managers with guidelines for deciding when to start the sequential technological process lines realized by specialized working crews, for sustainable scheduling and organization of construction projects.
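A simulation of this kind can be sketched as a small Monte Carlo model. The two-crew line of repetitive units and the triangular duration distribution below are assumptions for illustration, not the authors' model.

```python
import random

def mean_duration(n_units, overlap, n_runs=500, seed=7):
    """Monte Carlo sketch of a repetitive two-process project: crew 2
    may enter each working unit `overlap` time units before crew 1
    leaves it; unit durations are random triangular(4, 9, mode 6).
    Returns the mean project duration over n_runs replications.
    (Hypothetical model for illustration.)"""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_runs):
        f1 = 0.0  # crew 1 finish time in the current unit
        f2 = 0.0  # crew 2 finish time in the current unit
        for _ in range(n_units):
            f1 += rng.triangular(4, 9, 6)           # crew 1 completes the unit
            start2 = max(f2, f1 - overlap)          # crew 2 may start early
            f2 = start2 + rng.triangular(4, 9, 6)   # crew 2 completes the unit
        total += f2
    return total / n_runs
```

With the same random seed, increasing the allowed overlap shortens the mean project duration, at the price of periods in which both crews share a working unit.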


2021 ◽  
Vol 11 (10) ◽  
pp. 4602
Author(s):  
Farzin Piltan ◽  
Jong-Myon Kim

In this study, the application of an intelligent digital twin integrated with machine learning for bearing anomaly detection and crack size identification is investigated. The intelligent digital twin has two main sections: signal approximation and intelligent signal estimation. A mathematical model of the bearing vibration signal is integrated with machine-learning-based signal approximation to approximate the bearing vibration signal under normal conditions. A combination of the Kalman filter, a high-order variable structure technique, and an adaptive neuro-fuzzy technique is then integrated with the proposed signal approximation technique to form the intelligent digital twin. Next, residual signals are generated from the intelligent digital twin and the original raw signals. A machine learning approach is integrated with the intelligent digital twin to classify bearing anomalies and crack sizes. The Case Western Reserve University bearing dataset is used to test the proposed scheme. In the experiments, the average accuracies for bearing fault pattern recognition and crack size identification are 99.5% and 99.6%, respectively.
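The residual-generation step can be sketched as follows. The fixed threshold rule (k robust standard deviations) is an illustrative stand-in for the paper's machine-learning classifier.

```python
import numpy as np

def residual_flags(raw, twin, k=3.0):
    """Generate residuals as (raw signal - digital-twin approximation
    of the normal-condition signal) and flag samples whose residual
    deviates by more than k robust standard deviations (MAD-based).
    Illustrative sketch; the paper classifies residuals with machine
    learning rather than a fixed threshold."""
    r = np.asarray(raw, float) - np.asarray(twin, float)
    med = np.median(r)
    sigma = 1.4826 * np.median(np.abs(r - med))  # robust scale estimate
    return np.abs(r - med) > k * sigma
```

The MAD-based scale keeps the threshold insensitive to the very outliers (fault transients) that the flags are meant to catch.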


2011 ◽  
Vol 467-469 ◽  
pp. 766-769
Author(s):  
Gui You Pu ◽  
Ge Wen Kang

Traditional control methods cannot perform well in systems with large, variable time delays. In this paper, a controller that combines a human-simulated intelligent controller (HSIC) with a new dynamic anti-saturation integral controller is applied to motor speed control with time-varying delay. Simulation studies show that this controller avoids the chattering that is common in conventional variable structure controllers and achieves good performance in the time-varying delay system.


2012 ◽  
Vol 562-564 ◽  
pp. 1012-1015
Author(s):  
S.X. Wang ◽  
Z.X. Li ◽  
D.X. Sun ◽  
X.X. Xie

In order to avoid the limitations of traditional mechanism-based modeling, a neural network (NN) model of a variable-pitch wind turbine is built with an NN modeling method based on field data. Then, considering that from the wind turbine's startup to grid integration the generator speed must be raised to the synchronous speed smoothly and precisely, a neural network model predictive control (NNMPC) strategy based on the small-world optimization algorithm (SWOA) is proposed. Simulation results show that the strategy can forecast the change of generator rotational speed from the wind speed disturbance, letting the controller act in advance to eliminate the impact of system delay. Furthermore, the system output tracks the reference trajectory well, ensuring that the system can connect to the electricity grid steadily.
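The predictive idea can be sketched in a single step. Here a grid search over candidate inputs stands in for the SWOA optimizer, and a linear surrogate stands in for the trained NN model; all names and parameters are hypothetical.

```python
def mpc_one_step(model, x, candidates, ref):
    """One-step model-predictive-control sketch: evaluate a surrogate
    model of the plant for each candidate control input and pick the
    input whose predicted next state is closest to the reference
    trajectory. (Grid search stands in for the SWOA optimizer;
    `model` would be the trained NN model in the paper's scheme.)"""
    return min(candidates, key=lambda u: abs(model(x, u) - ref))

# Toy usage with a linear surrogate: next_speed = speed + 0.5 * u,
# current speed 10.0, reference 11.0 -> the best candidate is u = 2.0.
best_u = mpc_one_step(lambda x, u: x + 0.5 * u, 10.0, [0.0, 1.0, 2.0, 3.0], 11.0)
```

A real NNMPC would optimize over a multi-step horizon; the one-step version only illustrates the act-ahead principle the abstract describes.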


2021 ◽  
Author(s):  
Olga Bountali ◽  
Sila Çetinkaya ◽  
Vishal Ahuja

We analyze a congested healthcare delivery setting resulting from emergency treatment of a chronic disease on a regular basis. A prominent example of the problem of interest is congestion in the emergency room (ER) at a publicly funded safety net hospital resulting from recurrent arrivals of uninsured end-stage renal disease patients needing dialysis (a.k.a. compassionate dialysis). Unfortunately, this is the only treatment option for un/under-funded patients (e.g., undocumented immigrants) with ESRD, and it is available only when the patient’s clinical condition is deemed as life-threatening after a mandatory protocol, including an initial screening assessment in the ER as dictated and communicated by hospital administration and county policy. After the screening assessment, the so-called treatment restrictions are in place, and a certain percentage of patients are sent back home; the ER, thus, serves as a screening stage. The intention here is to control system load and, hence, overcrowding via restricting service (i.e., dialysis) for recurrent arrivals as a result of the chronic nature of the underlying disease. In order to develop a deeper understanding of potential unintended consequences, we model the problem setting as a stylized queueing network with recurrent arrivals and restricted service subject to the mandatory screening assessment in the ER. We obtain analytical expressions of fundamental quantitative metrics related to network characteristics along with more sophisticated performance measures. The performance measures of interest include both traditional and new problem-specific metrics, such as those that are indicative of deterioration in patient welfare because of rejections and treatment delays. We identify cases for which treatment restrictions alone may alleviate or lead to severe congestion and treatment delays, thereby impacting both the system operation and patient welfare. 
The fundamental insight we offer centers on the finding that the impact of the mandatory protocol on network characteristics, as well as on traditional and problem-specific performance measures, is nontrivial and counterintuitive. This impact is nevertheless analytically and/or numerically quantifiable via our approach. Overall, our quantitative results demonstrate that the thinking behind the mandatory protocol is potentially naive, because the approach does not necessarily serve its intended purpose of controlling system load and overcrowding.
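The counterintuitive effect of screening on load can be illustrated with a back-of-the-envelope feedback calculation, not the paper's queueing network: rejected patients, being chronically ill, re-arrive, so rejection feeds demand back into the ER. The rates below are hypothetical.

```python
def effective_arrival_rate(lam, p_reject, p_return):
    """Steady-state total ER arrival rate when a fraction p_reject of
    screened patients is sent home and a fraction p_return of those
    re-arrives later because the underlying disease is chronic.
    Solves the fixed point
        lam_eff = lam + p_reject * p_return * lam_eff,
    i.e. lam_eff = lam / (1 - p_reject * p_return).
    (Back-of-the-envelope sketch, not the paper's network model.)"""
    assert 0.0 <= p_reject * p_return < 1.0
    return lam / (1.0 - p_reject * p_return)
```

With p_reject = 0.5 and p_return = 1, the screening stage doubles, rather than halves, the offered ER load, which is one way a restriction policy can fail its stated purpose.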


2012 ◽  
Vol 9 (6) ◽  
pp. 7591-7611 ◽  
Author(s):  
A. C. V. Getirana ◽  
C. Peters-Lidard

Abstract. In this study, we evaluate the use of a large radar altimetry dataset as a complementary gauging network capable of providing water discharge in ungauged regions within the Amazon basin. A rating-curve-based methodology is adopted to derive water discharge from altimetric data provided by Envisat at 444 virtual stations (VS). The stage-discharge relations at the VS are built from radar altimetry and outputs of a global flow routing scheme. In order to quantify the impact of modeling uncertainties on rating-curve-based discharges, another experiment is performed using simulated discharges derived from a simplified data assimilation procedure. Discharge estimates at 90 VS are evaluated against observations during the curve-fitting calibration (2002–2005) and evaluation (2006–2008) periods, resulting in mean relative RMS errors of 52% and 12% for the experiments without and with assimilation, respectively. Without data assimilation, the uncertainty of discharge estimates can be mostly attributed to forcing errors at smaller scales, generating a positive correlation between performance and drainage area: mean relative errors (RE) of altimetry-based discharges varied from 15% for large to 92% for small drainage areas. Rating curves produced a mean RE of 54%, versus 68% from model outputs. Assimilating discharge data decreases the mean RE from 68% to 12%. These results demonstrate the feasibility of applying the proposed methodology at regional or global scales, and show the potential of satellite altimetry for predicting water discharge in poorly gauged and ungauged river basins.
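A stage-discharge relation of this kind can be sketched as a generic power-law rating curve fitted in log space. The functional form and the known datum h0 are assumptions for illustration; the paper's exact fitting procedure may differ.

```python
import numpy as np

def fit_rating_curve(stage, discharge, h0):
    """Fit the power-law rating curve Q = a * (h - h0)**b by linear
    least squares in log space, given altimetric stage h and modeled
    discharge Q at a virtual station. (Generic sketch; the paper's
    fitting procedure and treatment of h0 may differ.)"""
    x = np.log(np.asarray(stage, float) - h0)
    y = np.log(np.asarray(discharge, float))
    b, log_a = np.polyfit(x, y, 1)   # slope = exponent b, intercept = log(a)
    return np.exp(log_a), b
```

Once a and b are fitted against the routing-scheme discharges, each new altimetric stage observation at the virtual station yields a discharge estimate directly.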


The purpose of this thesis is to examine the impact of digital bank deposit, asset, and loan growth on selected traditional bank performance measures. To estimate whether a causal relationship between digital bank measures and traditional bank performance exists, the Granger causality method is selected as the main empirical model. In addition, to determine the direction and strength of this relationship, OLS regressions are performed. The results lead to the conclusion that digital bank deposit and loan growth have a causal relationship with traditional bank performance ratios: deposit growth has a negative impact on traditional bank performance ratios, while loan growth shows both positive and negative impacts on different ratios. This research demonstrates some of the challenges that traditional banks face in the age of innovation.

