An Ensemble Prognostic Method of Francis Turbine Units Using Low-Quality Data under Variable Operating Conditions

Sensors ◽  
2022 ◽  
Vol 22 (2) ◽  
pp. 525
Author(s):  
Ran Duan ◽  
Jie Liu ◽  
Jianzhong Zhou ◽  
Pei Wang ◽  
Wei Liu

Prognostics, consisting of performance state evaluation and degradation trend prediction, are key to the state-based maintenance of Francis turbine units (FTUs). In practical engineering environments, three significant difficulties arise: low data quality, complex variable operating conditions, and prediction model parameter optimization. To solve these three problems effectively, an ensemble prognostic method for FTUs using low-quality data under variable operating conditions is proposed in this study. Firstly, to account for the operating condition parameters, the running data set of the FTU is constructed from the water head, active power, and vibration amplitude of the top cover. Then, to improve the robustness of the proposed model against anomalous data, density-based spatial clustering of applications with noise (DBSCAN) is introduced to clean outliers and singularities from the raw running data set. Next, considering the randomness of the monitoring data, a healthy-state model based on the Gaussian mixture model is constructed, and the negative log-likelihood probability is calculated as the performance degradation indicator (PDI). Furthermore, to predict the trend of PDIs with a confidence interval and to automatically optimize the prediction model for both accuracy and certainty, a multiobjective prediction model is proposed based on the non-dominated sorting genetic algorithm and Gaussian process regression. Finally, monitoring data from an actual large FTU were used for effectiveness verification. The stability and smoothness of the PDI curve are improved by 3.2 times and 1.9 times, respectively, by DBSCAN compared with the 3-sigma method. The root-mean-squared error, the prediction interval normalized average, the prediction interval coverage probability, the mean absolute percentage error, and the R2 score of the proposed method reached 0.223, 0.289, 1.000, 0.641%, and 0.974, respectively.
The comparison experiments demonstrate that the proposed method is more robust to low-quality data and offers better accuracy, certainty, and reliability for the prognostics of FTUs under complex operating conditions.
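The cleaning-and-indexing pipeline described above (DBSCAN to remove outliers, then a Gaussian mixture healthy-state model whose negative log-likelihood serves as the PDI) can be sketched as follows. The data values, cluster parameters (`eps`, `min_samples`), and the two-component mixture are illustrative assumptions, not the study's actual settings:

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical running data: [water head, active power, top-cover vibration].
healthy = rng.normal([60.0, 100.0, 5.0], [2.0, 10.0, 0.5], size=(500, 3))
outliers = rng.uniform([40, 50, 0], [80, 150, 20], size=(20, 3))
raw = np.vstack([healthy, outliers])

# Step 1: DBSCAN flags outliers and singularities as noise (label -1).
labels = DBSCAN(eps=5.0, min_samples=10).fit_predict(raw)
cleaned = raw[labels != -1]

# Step 2: fit a Gaussian mixture healthy-state model on the cleaned data.
gmm = GaussianMixture(n_components=2, random_state=0).fit(cleaned)

# Step 3: negative log-likelihood as the performance degradation indicator.
def pdi(samples):
    return -gmm.score_samples(samples)  # higher = further from healthy state

# Simulated degradation: growing top-cover vibration amplitude.
degraded = healthy + np.array([0.0, 0.0, 3.0])
```

A rising PDI (larger negative log-likelihood) indicates monitoring samples drifting away from the healthy-state distribution.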

Author(s):  
Frank Dauby ◽  
Stefan Vages

Pacific Gas and Electric Company owns and operates an extensive network of over 10,700 km (6,700 miles) of gas transmission pipelines, much of which is less than 16″ in diameter and operates at less than 27.5 bar (400 psig), making these lines difficult to inspect with free-swimming in-line inspection (ILI) tools. Additionally, many piggable pipeline sections are multi-diameter and have numerous 1.5D fittings, some in back-to-back configuration, requiring tools that are not currently available. Following several failed attempts to inspect PG&E’s 12″ × 16″ pipelines in 2015 using existing ILI tools, and after working to modify a 12″ × 18″ tool for lower-pressure service in 2016, PG&E and ROSEN decided to collaboratively develop new, specially designed 12″ × 16″ geometry and axial MFL tools. The goal of this project was to develop tools that could meet PG&E’s pipeline passage requirements while allowing for an acceptable speed profile. The need to inspect a total of 16 pipeline sections in this size range under the long-term ILI Upgrade Plan justified the investment in these new tools. The service provider embarked on a new ILI tool design process including design, manufacturing, fabrication, and testing at their facilities in Germany. Through this process, a number of unique ILI tool design features to lower tool drag and improve ease of collapsibility were implemented, resulting in a tool that far exceeds existing industry capabilities. To confirm the tools’ capabilities before their first use in a live gas transmission pipeline, pump testing in water, as well as in compressed air, was performed. In late 2017, using these tools, PG&E successfully inspected two previously unpiggable 12″ × 16″ low-pressure pipelines. In this paper, the process of developing these tools will be discussed. The test program will be reviewed, comparing findings under controlled conditions in water and compressed air with pig run behavior in the live pipelines.
The analysis also provides an assessment of the operating conditions deemed necessary for the inspection tool to gather a good-quality data set.


2015 ◽  
Vol 15 (7) ◽  
pp. 3755-3771 ◽  
Author(s):  
F. Tan ◽  
H. S. Lim ◽  
K. Abdullah ◽  
T. L. Yoon ◽  
B. Holben

Abstract. Obtaining continuous aerosol-optical-depth (AOD) measurements is a difficult task due to the cloud-cover problem. With the main motivation of overcoming this problem, an AOD-predicting model is proposed. In this study, the optical properties of aerosols in Penang, Malaysia were analyzed for four monsoonal seasons (northeast monsoon, pre-monsoon, southwest monsoon, and post-monsoon) based on data from the AErosol RObotic NETwork (AERONET) from February 2012 to November 2013. The aerosol distribution patterns in Penang for each monsoonal period were quantitatively identified from scatter plots of the Ångström exponent against the AOD. A new empirical algorithm was proposed to predict the AOD data. Ground-based measurements (i.e., visibility and air pollutant index) were used in the model as predictors to retrieve the AOD data missing from AERONET because of frequent cloud formation in the equatorial region. The model coefficients were determined through multiple regression analysis using a selected set of in situ data. The calibrated model achieved a coefficient of determination, R2, of 0.72. The predicted AOD was generated from these calibrated coefficients and compared against the measured data through standard statistical tests, yielding an R2 of 0.68 as validation accuracy. The weighted mean absolute percentage error (wMAPE) was less than 0.40% relative to the real data. The results revealed that the proposed model efficiently predicts the AOD data. The model's performance was also compared against selected LIDAR data and showed good correspondence. The predicted AOD can supplement measured short- and long-term AOD and provide additional information for climatological studies and for monitoring aerosol variation.
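The empirical prediction step, multiple regression of AOD on visibility and the air pollutant index, can be illustrated with ordinary least squares. The coefficients and synthetic data below are assumptions for demonstration, not the calibrated values from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical predictors: visibility (km) and air pollutant index (API).
vis = rng.uniform(2.0, 20.0, 200)
api = rng.uniform(20.0, 150.0, 200)
# Synthetic "measured" AOD following an assumed linear form plus noise.
aod = 0.9 - 0.03 * vis + 0.004 * api + rng.normal(0.0, 0.02, 200)

# Calibrate model coefficients by multiple regression (ordinary least squares).
X = np.column_stack([np.ones_like(vis), vis, api])
coef, *_ = np.linalg.lstsq(X, aod, rcond=None)

# Predict AOD where AERONET retrievals are missing (e.g. under cloud cover).
aod_hat = X @ coef
ss_res = np.sum((aod - aod_hat) ** 2)
ss_tot = np.sum((aod - aod.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot  # coefficient of determination
```

As in the study, lower visibility and a higher pollutant index both push the predicted AOD upward, which the fitted coefficient signs reflect.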


Author(s):  
Gerard van der Weijde ◽  
Niels Mallon

Reliable transfer systems are a key element in developing floating LNG technology. Multi-composite hoses may prove to be a reliable and cost-effective solution. TNO, the Dutch contract research organisation, has executed an extensive test program on the multi-composite hose of Gutteling B.V. for Ship-to-Ship (STS) LNG transfers. It has resulted in qualification of the hose in accordance with EN 1474-Part 2 [2]. This hose is the first product to be qualified in accordance with this new internationally accepted standard. The test program was performed in close cooperation with Gutteling B.V., EXMAR, and DNV. It focused on the mechanical and flow behaviour at ambient and cryogenic operating conditions. Beyond the EN 1474 requirements, a multi-level test programme was performed: more samples were tested and more extreme load combinations were applied. The purpose was to provide a data set that enables transfer system qualification in accordance with EN 1474-Part 3. The hose appears to have a complex mechanical behaviour: non-linear elasticity, coupling of deformation modes, hysteretic behaviour, and large damping. Damage tolerance tests, such as impact and crushing, show that the composite hose performs exceptionally well. This paper summarises the test program and describes some of the tests performed beyond the requirements. In particular, testing for off-spec operating conditions, fatigue, and creep is addressed. The absence of engineering models to predict fatigue and creep performance remains an obstacle to realising the interesting proposition of multi-composite hoses.


2019 ◽  
Vol 23 (6) ◽  
pp. 670-679
Author(s):  
Krista Greenan ◽  
Sandra L. Taylor ◽  
Daniel Fulkerson ◽  
Kiarash Shahlaie ◽  
Clayton Gerndt ◽  
...  

OBJECTIVE A recent retrospective study of severe traumatic brain injury (TBI) in pediatric patients showed similar outcomes in those with a Glasgow Coma Scale (GCS) score of 3 and those with a score of 4 and reported a favorable long-term outcome in 11.9% of patients. Using decision tree analysis, the authors of that study provided criteria to identify patients with a potentially favorable outcome. The authors of the present study sought to validate the previously described decision tree and to further inform understanding of the outcomes of children with a GCS score of 3 or 4 by using data from multiple institutions and machine learning methods to identify important predictors of outcome.
METHODS Clinical, radiographic, and outcome data on pediatric TBI patients (age < 18 years) were prospectively collected as part of an institutional TBI registry. Patients with a GCS score of 3 or 4 were selected, and the previously published prediction model was evaluated using this data set. Next, a combined data set that included data from two institutions was used to create a new, more statistically robust model using binomial recursive partitioning to create a decision tree.
RESULTS Forty-five patients from the institutional TBI registry were included in the present study, as were 67 patients from the previously published data set, for a total of 112 patients in the combined analysis. The previously published prediction model for survival was externally validated and performed only modestly (AUC 0.68, 95% CI 0.47–0.89). In the combined data set, pupillary response and age were the only predictors retained in the decision tree. Ninety-six percent of patients with bilaterally nonreactive pupils had a poor outcome. If the pupillary response was normal in at least one eye, the outcome subsequently depended on age: 72% of children between 5 months and 6 years old had a favorable outcome, whereas 100% of children younger than 5 months and 77% of those older than 6 years had poor outcomes. The overall accuracy of the combined prediction model was 90.2%, with a sensitivity of 68.4% and a specificity of 93.6%.
CONCLUSIONS A previously published survival model for severe TBI in children with a low GCS score was externally validated. With a larger data set, however, a simplified and more robust model was developed, and the variables most predictive of outcome were age and pupillary response.
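The decision tree retained in the combined analysis reduces to two splits, pupillary response and then age. A minimal sketch that encodes the majority-class outcome at each leaf (simplifying away the reported percentages):

```python
def predict_outcome(pupils_reactive_eyes: int, age_months: float) -> str:
    """Majority-class sketch of the reported two-split decision tree.

    pupils_reactive_eyes: eyes with a normal pupillary response (0, 1, or 2).
    age_months: patient age in months.
    """
    if pupils_reactive_eyes == 0:
        # 96% of patients with bilaterally nonreactive pupils: poor outcome.
        return "poor"
    if 5 <= age_months <= 72:
        # 72% of children between 5 months and 6 years: favorable outcome.
        return "favorable"
    # 100% of those under 5 months and 77% over 6 years: poor outcomes.
    return "poor"
```

This kind of compact rule is exactly what binomial recursive partitioning produces when only two predictors survive pruning.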


2020 ◽  
Vol 2020 ◽  
pp. 1-12
Author(s):  
Shengpu Li ◽  
Yize Sun

Ink transfer rate (ITR) is a reference index for measuring the quality of 3D additive printing. In this study, an ink transfer rate prediction model is proposed by applying the least squares support vector machine (LSSVM). In addition, enhanced garden balsam optimization (EGBO) is used for the selection and optimization of the hyperparameters embedded in the LSSVM model. A total of 102 sets of experimental sample data were collected from the production line to train and test the hybrid prediction model. Experimental results show that, for the ink transfer rate of 3D additive printing, the coefficient of determination (R2) of the introduced model is 0.8476, the root-mean-square error (RMSE) is 6.6 × 10⁻³, and the mean absolute percentage error (MAPE) is 1.6502 × 10⁻³.
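An LSSVM regressor of the kind used here can be written as a single linear solve in the dual variables. The RBF kernel, the fixed hyperparameter values, and the synthetic data below are illustrative assumptions; the study tunes the hyperparameters with EGBO, which is not reproduced here:

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=100.0, sigma=1.0):
    # LSSVM dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]  # bias b, dual weights alpha

def lssvm_predict(Xtr, b, alpha, Xte, sigma=1.0):
    return rbf_kernel(Xte, Xtr, sigma) @ alpha + b

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (80, 2))           # hypothetical process parameters
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1]   # stand-in for ink transfer rate
b, alpha = lssvm_fit(X, y)
rmse = np.sqrt(np.mean((lssvm_predict(X, b, alpha, X) - y) ** 2))
```

In the hybrid model, an outer optimizer (EGBO in the study; a grid search would be a simpler stand-in) would search over `gamma` and `sigma` to minimize validation error.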


Author(s):  
Jing Qi ◽  
Kun Xu ◽  
Xilun Ding

Abstract: Hand segmentation is the initial step in hand posture recognition. To reduce the effect of variable illumination in the hand segmentation step, a new CbCr-I component Gaussian mixture model (GMM) is proposed to detect the skin region. The hand region is selected as a region of interest from the image using a skin detection technique based on the presented CbCr-I component GMM and a new adaptive threshold. A new hand shape distribution feature, described in polar coordinates, is proposed to extract hand contour features; it addresses the false recognition problem of some shape-based methods and effectively recognizes hand postures even when different postures have the same number of outstretched fingers. A multiclass support vector machine classifier is utilized to recognize the hand posture. Experiments were carried out on our data set to verify the feasibility of the proposed method. The results showed the effectiveness of the proposed approach compared with other methods.
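The skin detection step, a GMM fitted to skin pixels in a CbCr-I colour space and thresholded on likelihood, can be sketched as follows. The pixel distributions, component count, and percentile-based threshold are stand-ins for the paper's adaptive threshold:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Hypothetical training pixels in (Cb, Cr, I) space; real values would come
# from labelled skin regions after RGB -> YCbCr conversion.
skin = rng.normal([110.0, 150.0, 0.55], [8.0, 8.0, 0.1], size=(1000, 3))
skin_gmm = GaussianMixture(n_components=2, random_state=0).fit(skin)

def skin_mask(pixels, threshold):
    """Keep pixels whose log-likelihood under the skin GMM exceeds threshold."""
    return skin_gmm.score_samples(pixels) > threshold

# Stand-in for the adaptive threshold: a low percentile of training likelihoods.
threshold = np.percentile(skin_gmm.score_samples(skin), 5)

# Hypothetical non-skin background pixels, far from the skin cluster.
background = rng.normal([60.0, 90.0, 0.2], [8.0, 8.0, 0.1], size=(1000, 3))
```

Working in CbCr (chrominance) with a separate intensity component is what gives the model its robustness to variable illumination: lighting changes move I but leave Cb and Cr largely unchanged.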


Author(s):  
Simona Babiceanu ◽  
Sanhita Lahiri ◽  
Mena Lockwood

This study uses a suite of performance measures, developed by taking into consideration various aspects of congestion and reliability, to assess the impacts of safety projects on congestion. Safety projects are necessary to help move Virginia’s roadways toward safer operation, but they can contribute to congestion and unreliability during execution and can affect operations after execution. However, safety projects are assessed primarily for safety improvements, not for congestion. This study identifies an appropriate suite of measures, and quantifies and compares the congestion and reliability impacts of safety projects on roadways for the periods before, during, and after project execution. The paper presents the performance measures, examines their sensitivity to operating conditions, defines thresholds for congestion and reliability, and demonstrates the measures using a set of Virginia safety projects. The data set consists of 10 projects totalling 92 mi and more than 1 million data points. The study found that, overall, safety projects tended to have a positive impact on congestion and reliability after completion, and that the congestion variability measures were sensitive to the reliability threshold. The study concludes with practical recommendations for primary measures that may be used to quantify the overall impacts of safety projects: percent vehicle miles traveled (VMT) reliable, with a customized threshold for Virginia; percent VMT delayed; and time to travel 10 mi. However, caution should be used when applying the results directly to other situations because of the limited number of projects used in the study.
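The three recommended primary measures can be computed directly from probe speed and VMT observations. The free-flow speed, congestion speed, and 1.5× reliability threshold below are illustrative assumptions, not the customized Virginia thresholds:

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical 5-minute segment observations: speed (mph) and VMT.
speed = rng.normal(55.0, 12.0, 2000).clip(5.0, 75.0)
vmt = rng.uniform(100.0, 500.0, 2000)

FREE_FLOW = 65.0  # assumed free-flow speed (mph)

# Percent VMT delayed: share of VMT observed below a congestion speed.
congestion_speed = 0.75 * FREE_FLOW
pct_vmt_delayed = vmt[speed < congestion_speed].sum() / vmt.sum() * 100

# Percent VMT reliable: share of VMT whose travel-time ratio stays within
# a reliability threshold (1.5x free-flow here; the study customizes this).
travel_time_ratio = FREE_FLOW / speed
pct_vmt_reliable = vmt[travel_time_ratio <= 1.5].sum() / vmt.sum() * 100

# Time to travel 10 mi at each observed speed, in minutes.
time_10mi = 10.0 / speed * 60.0
```

VMT-weighting both percentage measures is what lets a single number summarize a corridor: heavily used segments count for more than lightly used ones.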


Author(s):  
Ahmad R. Alsaber ◽  
Jiazhu Pan ◽  
Adeeba Al-Hurban 

In environmental research, missing data are often a challenge for statistical modeling. This paper addresses advanced techniques for dealing with missing values in an air quality data set using a multiple imputation (MI) approach. MCAR, MAR, and NMAR missing data mechanisms are applied to the data set, and five levels of missingness are considered: 5%, 10%, 20%, 30%, and 40%. The imputation method used in this paper is missForest, an iterative imputation method related to the random forest approach. Air quality data sets were gathered from five monitoring stations in Kuwait and aggregated to a daily basis. A logarithm transformation was applied to all pollutant data in order to normalize their distributions and minimize skewness. We found high levels of missing values for NO2 (18.4%), CO (18.5%), PM10 (57.4%), SO2 (19.0%), and O3 (18.2%). Climatological data (i.e., air temperature, relative humidity, wind direction, and wind speed) were used as control variables for better estimation. The results show that the MAR mechanism had the lowest RMSE and MAE. We conclude that MI using the missForest approach estimates missing values with a high level of accuracy: missForest had the lowest imputation error (RMSE and MAE) among the imputation methods compared and can thus be considered appropriate for analyzing air quality data.
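A missForest-style imputation can be approximated with scikit-learn's IterativeImputer driven by a random-forest regressor. The correlated synthetic pollutant columns and 20% MCAR mask below are assumptions for demonstration; the original missForest R package differs in some details:

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
# Hypothetical log-transformed pollutant data with correlated columns
# (stand-ins for NO2, CO, PM10, SO2, O3 sharing a common daily signal).
base = rng.normal(0.0, 1.0, (300, 1))
data = base + rng.normal(0.0, 0.3, (300, 5))

# Introduce 20% missing-completely-at-random (MCAR) values.
mask = rng.random(data.shape) < 0.2
corrupted = data.copy()
corrupted[mask] = np.nan

# missForest-style imputation: iterative column-by-column prediction
# with a random-forest regressor until the estimates stabilize.
imputer = IterativeImputer(
    estimator=RandomForestRegressor(n_estimators=50, random_state=0),
    max_iter=5, random_state=0,
)
imputed = imputer.fit_transform(corrupted)

# Imputation error on the cells that were masked out.
rmse = np.sqrt(np.mean((imputed[mask] - data[mask]) ** 2))
```

Because the known true values of the masked cells are retained, the same setup reproduces the paper's evaluation design: corrupt at a chosen missingness level, impute, and score RMSE/MAE against the withheld truth.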


2021 ◽  
Vol 111 ◽  
pp. 106576
Author(s):  
Chen Kong ◽  
Juntao Chang ◽  
Ziao Wang ◽  
Yunfei Li

Electronics ◽  
2021 ◽  
Vol 10 (5) ◽  
pp. 581
Author(s):  
Yongbae Kim ◽  
Juyong Back ◽  
Jongweon Kim

A tachograph in a vehicle records operating conditions, such as speed, distance, brake operation, acceleration, GPS information, etc., at intervals of one second. In the event of an accident, the tachograph records collision data, such as the acceleration and direction of travel, at intervals of 1/100 s for the 10 s before and after the accident. A vehicle equipped with a tachograph is obliged to upload its operation data to administrative organizations periodically, either via auxiliary storage devices such as USB-attached external memory or via online wireless communication. During this uploading process, the recorded data are at risk of being tampered with. This research proposes blockchain-based tamper-resistant technology for data in both online and offline environments. The proposed algorithm introduces a new data recording mechanism that operates in the low-level hardware of digital tachographs, providing tamper resistance through a light blockchain in both online and offline situations. The average encoding time of the proposed light blockchain was 1.85 ms/Mb, while the average decoding time was 1.65 ms/Mb. With outliers removed in statistical tests, the estimated average encoding and decoding times were 1.32 ms/Mb and 1.29 ms/Mb, respectively, and the tamper verification test detected all of the tampered data.
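The tamper-resistance idea, each record hash-chained to its predecessor so that any later modification invalidates every subsequent hash, can be sketched in a few lines. The record fields and the SHA-256 choice are illustrative assumptions, not the paper's actual light-blockchain format:

```python
import hashlib
import json

def make_block(prev_hash: str, record: dict) -> dict:
    """Append one tachograph record to the hash chain."""
    payload = json.dumps({"prev": prev_hash, "record": record}, sort_keys=True)
    return {"prev": prev_hash, "record": record,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def verify_chain(chain) -> bool:
    """Recompute every hash; any tampered record breaks the chain."""
    prev = "0" * 64  # genesis value
    for block in chain:
        payload = json.dumps({"prev": block["prev"], "record": block["record"]},
                             sort_keys=True)
        if block["prev"] != prev or \
           hashlib.sha256(payload.encode()).hexdigest() != block["hash"]:
            return False
        prev = block["hash"]
    return True

# One record per second: speed, brake state, etc. (hypothetical fields).
chain, prev = [], "0" * 64
for t in range(5):
    block = make_block(prev, {"t": t, "speed": 60 + t, "brake": 0})
    chain.append(block)
    prev = block["hash"]
```

Verification needs no trusted channel: an administrative organization receiving the upload simply re-walks the chain, which is why the scheme works both online and offline.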

