gross error
Recently Published Documents


TOTAL DOCUMENTS

276
(FIVE YEARS 48)

H-INDEX

24
(FIVE YEARS 3)

2021 ◽  
Author(s):  
Dimitris Akritidis ◽  
Andrea Pozzer ◽  
Johannes Flemming ◽  
Antje Inness ◽  
Philippe Nédélec ◽  
...  

Abstract. Tropopause folds are the key process underlying stratosphere-to-troposphere transport (STT) of ozone, thus affecting tropospheric ozone levels and variability. In the present study we perform a process-oriented evaluation of Copernicus Atmosphere Monitoring Service (CAMS) reanalysis (CAMSRA) O3 during folding events over Europe for the period 2003 to 2018. A 3-D labeling algorithm is applied to detect tropopause folds in CAMSRA, while ozonesonde data from the WOUDC (World Ozone and Ultraviolet Radiation Data Centre) and aircraft measurements from IAGOS (In-service Aircraft for a Global Observing System) are used to evaluate CAMSRA O3. The profiles of observed and CAMSRA O3 concentrations indicate that CAMSRA reproduces the observed O3 increases in the troposphere during the examined folding events. Nevertheless, at some of the examined sites, CAMSRA overestimates the observed O3 concentrations, mostly in the upper portion of the observed increases, with a median fractional gross error (FGE) among the examined sites > 0.2 above 400 hPa. The use of a control run without data assimilation reveals that this overestimation of CAMSRA O3 arises from the data assimilation implementation. Overall, although data assimilation helps CAMSRA O3 follow the observed O3 enhancements in the troposphere during STT events, it introduces biases in the upper troposphere, resulting in no clear quantitative improvement over the control run without data assimilation. Less biased assimilated O3 products with finer vertical resolution in the troposphere, together with higher IFS (Integrated Forecasting System) vertical resolution, are expected to provide a better representation of O3 variability during tropopause folds.
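The evaluation above is reported as a fractional gross error (FGE). A minimal sketch of how such a score can be computed from paired model and observation profiles is shown below, assuming the commonly used definition FGE = (2/N) Σ |m − o| / (m + o); the profile values are purely illustrative, not data from the study.

```python
import numpy as np

def fractional_gross_error(model, obs):
    """Fractional gross error (FGE) between paired model and observed values.

    Uses the common definition FGE = (2/N) * sum(|m - o| / (m + o)),
    which is bounded between 0 (perfect agreement) and 2.
    """
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    valid = (model + obs) > 0  # guard against division by zero
    return 2.0 * np.mean(np.abs(model[valid] - obs[valid]) / (model[valid] + obs[valid]))

# Illustrative ozone mixing ratios (ppbv) above 400 hPa, not values from the paper:
cams_o3 = [55.0, 70.0, 95.0, 140.0]    # reanalysis profile
sonde_o3 = [50.0, 60.0, 80.0, 110.0]   # ozonesonde profile
print(f"FGE = {fractional_gross_error(cams_o3, sonde_o3):.2f}")
```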


Author(s):  
Kevin Chiu ◽  
Peter Hoskin ◽  
Amit Gupta ◽  
Roeum Butt ◽  
Samsara Terparia ◽  
...  

Objectives: Radiologist input in the peer review of head and neck radiotherapy has been introduced as a routine departmental approach. The aim was to evaluate this practice and to quantitatively analyse the changes made. Methods: Patients treated with radical-dose radiotherapy between August and November 2020 were reviewed. The incidence of major and minor changes, as defined by The Royal College of Radiologists guidance, was prospectively recorded. The amended radiotherapy volumes were compared with the original volumes using the Jaccard Index (JI) to assess conformity, the Geographical Miss Index (GMI) to quantify undercontouring, and the Hausdorff Distance (HD) between the volumes. Results: In total, 73 of 87 (84%) patients were discussed. Changes were recommended in 38 (52%) patients: 30 had ≥1 major change and eight had minor changes only. There were 99 amended volumes: the overall median JI, GMI and HD were 0.91 (interquartile range [IQR] = 0.80–0.97), 0.06 (IQR = 0.02–0.18) and 0.42 cm (IQR = 0.20–1.17 cm), respectively. The nodal gross tumour volume (GTVn) and the therapeutic high-dose nodal clinical target volume (CTVn) showed the largest changes: the median JI, GMI and HD for GTVn were 0.89 (IQR = 0.44–0.95), 0.11 (IQR = 0.05–0.51) and 3.71 cm (IQR = 0.31–6.93 cm), and for the high-dose CTVn 0.78 (IQR = 0.59–0.90), 0.20 (IQR = 0.07–0.31) and 3.28 cm (IQR = 1.22–6.18 cm), respectively. There was no observed difference in the quantitative indices between the 85 ‘major’ and 14 ‘minor’ volumes (p = 0.5). Conclusions: Routine radiologist input in head and neck radiotherapy peer review is feasible and can help avoid gross errors in contouring. Advances in knowledge: The major and minor classification may benefit from differentiation with quantitative indices but requires correlation with clinical outcomes.
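For orientation, the sketch below computes the three contour-comparison metrics from binary voxel masks. The GMI is taken here as the fraction of the original volume missed by the amended volume, and the voxel spacing is a made-up parameter; the paper's exact formulas and units may differ.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def contour_metrics(original, amended, spacing=(0.1, 0.1, 0.1)):
    """Conformity metrics between two binary contour masks (3-D numpy arrays).

    JI  = |A ∩ B| / |A ∪ B|  (Jaccard Index, 1 = identical volumes)
    GMI = fraction of the original volume not covered by the amended volume
          (one common reading of the geographical miss index)
    HD  = symmetric Hausdorff distance between the voxel point sets, in the
          physical units implied by `spacing` (e.g. cm per voxel)
    """
    a, b = original.astype(bool), amended.astype(bool)
    ji = (a & b).sum() / (a | b).sum()
    gmi = (a & ~b).sum() / a.sum()
    pts_a = np.argwhere(a) * np.asarray(spacing)
    pts_b = np.argwhere(b) * np.asarray(spacing)
    hd = max(directed_hausdorff(pts_a, pts_b)[0],
             directed_hausdorff(pts_b, pts_a)[0])
    return ji, gmi, hd
```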


Electricity ◽  
2021 ◽  
Vol 2 (4) ◽  
pp. 423-438
Author(s):  
Rodrigo D. Trevizan ◽  
Cody Ruben ◽  
Aquiles Rossoni ◽  
Surya C. Dhulipala ◽  
Arturo Bretas ◽  
...  

Simultaneous real-time monitoring of measurement and parameter gross errors poses a great challenge to distribution system state estimation (DSSE) due to usually low measurement redundancy. This paper presents a gross error analysis framework that employs μPMUs to decouple the error analysis of measurements and parameters. When a recent measurement scan from SCADA RTUs and smart meters is available, gross error analysis of measurements is performed as a post-processing step of the non-linear state estimator (NLSE). In between scans of SCADA and AMI measurements, a linear state estimator (LSE) using μPMU measurements and linearized SCADA and AMI measurements is used to detect parameter data changes caused by the operation of Volt/Var controls. For every execution of the LSE, the variance of the unsynchronized measurements is updated according to the uncertainty introduced by load dynamics, which are modeled as an Ornstein–Uhlenbeck random process. Updating the variance of unsynchronized measurements avoids false error detections and models the decreasing trustworthiness of outdated or obsolete data. When new SCADA and AMI measurements arrive, the LSE provides added redundancy to the NLSE through synthetic measurements. The presented framework was tested on a 13-bus test system. Test results highlight that the LSE and NLSE processes successfully work together to analyze bad data for both measurements and parameters.
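A minimal sketch of the variance-update idea is given below, assuming the load deviation since the last SCADA/AMI scan follows a zero-mean Ornstein–Uhlenbeck process; the parameter names and numerical values are illustrative assumptions, not the paper's tuning.

```python
import numpy as np

def updated_variance(r_meas, sigma_ou, theta, dt):
    """Inflate the variance of an unsynchronized (aged) measurement.

    The load deviation since the last scan is modeled as a zero-mean
    Ornstein-Uhlenbeck process with mean-reversion rate `theta` and
    diffusion `sigma_ou`; its conditional variance after an elapsed time
    `dt` is (sigma_ou**2 / (2*theta)) * (1 - exp(-2*theta*dt)).
    The total variance used by the LSE is the instrument variance plus
    this process variance, so older data are trusted less.
    """
    process_var = sigma_ou**2 / (2.0 * theta) * (1.0 - np.exp(-2.0 * theta * dt))
    return r_meas + process_var

# Illustrative per-unit quantities only (not from the paper):
elapsed = np.array([0.0, 30.0, 300.0])  # seconds since the last scan
print(updated_variance(r_meas=1e-4, sigma_ou=0.02, theta=0.1, dt=elapsed))
```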


2021 ◽  
Author(s):  
Adrián Ledroz ◽  
Barry Smart ◽  
Navin Maharaj

Abstract There are several reasons for obtaining gyroscopic surveys in directional wells. A gyro measurement provides reliable data when magnetic measurements are affected by interference from nearby wells; it can significantly reduce positional uncertainty and provides redundant data and gross error checks on MWD surveys. However, the complexity and extent of the necessary testing and handling of the tools have prevented widespread adoption, and gyro services have remained limited to "must-have" scenarios. The benefits of solid-state technology and new developments in communication capabilities are gradually changing the way of thinking about wellbore positioning. The first gyro-while-drilling tools were introduced in the early 2000s and were based on spinning-mass gyro technology. These gyros can be very accurate, with low noise and drift; however, they are fragile, built with moving parts, and susceptible to calibration shifts. Extensive pre-job testing, validation during job execution and post-job analysis are required to obtain reliable directional survey data. Solid-state gyros have reached the same, or even better, levels of noise and drift without the fragility of their spinning-mass counterparts. Remote operations, with different degrees of complexity and coverage, have been used in the oilfield for many years. Still, the adoption of gyro monitoring services with no personnel at the rig site has been minimal, owing to the described complexity of the system and a job volume too small to justify investment in and development of the necessary processes. Solid-state gyro technology addresses these challenges. More than 30 gyro-while-drilling jobs have been run remotely and successfully. The changes in operational procedures forced by the Covid-19 pandemic accelerated the demand for uncrewed operations, and solid-state gyro technology has shown high reliability, with zero non-productive time due to tool failures or shifts in calibration. This new way of working also significantly reduces the environmental impact of operations, as all travel related to personnel and equipment has been reduced and battery life extended by up to 10. Several scenarios related to wellbore positioning and directional drilling benefit greatly from having a gyro in the BHA. The gyro technology and the workflow described in this paper show how this can be done reliably, maintaining the quality of the survey data and reducing the environmental impact.


2021 ◽  
Vol 11 (17) ◽  
pp. 8170
Author(s):  
Shenglei Xu ◽  
Yunjia Wang ◽  
Meng Sun ◽  
Minghao Si ◽  
Hongji Cao

Indoor positioning technologies have attracted the attention of many researchers. A real-time indoor positioning system with high precision and stability is necessary in many circumstances. In a real-time positioning scenario, gross errors in the Bluetooth low energy (BLE) fingerprint method occur more easily, and the heading angle of the pedestrian will drift without acceleration and magnetic field compensation. A real-time BLE/pedestrian dead-reckoning (PDR) integrated system using an improved robust filter is proposed. In the PDR method, an improved Mahony complementary filter based on the pedestrian motion states is adopted to estimate the heading angle, reducing the drift error. Then, an improved robust filter is utilized to detect and restrain the gross errors of the BLE fingerprint method. The robust filter detects gross errors at different granularities by constructing a robust vector that adaptively changes the observation covariance matrix of the extended Kalman filter (EKF) while the application is running. Several experiments are conducted in a real positioning scenario. The mean position accuracy obtained by the proposed method in the experiments is 0.844 m and the RMSE is 0.74 m. Compared with the classic EKF, these two values are improved by 38% and 18%, respectively. The results show that the improved filter can avoid gross errors in the BLE method and provide high precision and scalability in indoor positioning services.
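As a rough illustration of how a robust vector can re-weight BLE observations inside an EKF, the sketch below inflates each observation variance using an IGG-III style weight computed from the standardized innovation. This is a generic stand-in under stated assumptions, not the authors' exact construction; the thresholds k0 and k1 are hypothetical tuning values.

```python
import numpy as np

def robust_observation_covariance(innovation, H, P, r_diag, k0=1.5, k1=3.0):
    """Adaptively inflate the observation covariance for an EKF update.

    `r_diag` holds the nominal observation variances. Each variance is
    divided by an IGG-III style weight computed from the standardized
    innovation, so a suspected gross error receives a much larger variance
    and therefore little influence on the state update.
    """
    S = H @ P @ H.T + np.diag(r_diag)              # innovation covariance
    v = np.abs(innovation) / np.sqrt(np.diag(S))   # standardized innovations
    w = np.ones_like(v)
    mid = (v > k0) & (v <= k1)
    w[mid] = (k0 / v[mid]) * ((k1 - v[mid]) / (k1 - k0)) ** 2
    w[v > k1] = 1e-8                               # effectively reject the observation
    return np.diag(r_diag / w)                     # inflated R used in the EKF gain
```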


Author(s):  
Suhaib Anwar

Abstract: In the analysis of structures subjected to earthquake forces, it is usually assumed that the structure is fixed at the base to simplify the mathematical problem. This assumption can lead to gross errors in the assessment of the overall response under dynamic loads. The interaction phenomenon is principally governed by the mechanism of energy exchange between the soil and the structure during an earthquake. In the present investigation, a multi-storied building located in Amaravati is chosen as the study case; the area consists of different types of soil/rock profiles at different locations. Many high-rise structures are expected in the new city in the future. Earthquake analysis is carried out for the same structure resting on different types of soils, and the resulting fundamental time periods, base shears and displacements are compared with the results obtained for the fixed-base condition.
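To illustrate why the fixed-base idealization can misestimate the dynamic response, the sketch below evaluates a classical single-degree-of-freedom period-lengthening expression for a stiff and a soft foundation. The formula and all stiffness values are textbook-style assumptions for illustration, not the model or data used in the study.

```python
import numpy as np

def flexible_base_period(T_fixed, m, h, K_x, K_theta):
    """Classical period-lengthening estimate for soil-structure interaction.

    T_ssi = T_fixed * sqrt(1 + k/K_x + k*h**2/K_theta), where k is the
    fixed-base lateral stiffness back-calculated from the fixed-base period
    of an equivalent single-degree-of-freedom oscillator. K_x and K_theta
    are the horizontal and rocking stiffnesses of the foundation springs.
    """
    k = 4.0 * np.pi**2 * m / T_fixed**2  # fixed-base lateral stiffness
    return T_fixed * np.sqrt(1.0 + k / K_x + k * h**2 / K_theta)

# Illustrative SI values: rock-like (stiff) vs. soft-soil foundation springs.
print(flexible_base_period(T_fixed=1.2, m=2.0e6, h=15.0, K_x=5.0e9, K_theta=8.0e11))
print(flexible_base_period(T_fixed=1.2, m=2.0e6, h=15.0, K_x=2.0e8, K_theta=3.0e10))
```

The soft-soil case lengthens the fundamental period noticeably, which is the kind of shift in time period, base shear and displacement the study compares against the fixed-base results.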


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Robert Duchnowski ◽  
Patrycja Wyszkowska

Abstract The main objective of the empirical influence function (EIF) is to describe how estimates behave when an observation set is affected by gross errors. Unlike the influence function, which represents the general properties of an estimation method, the EIF can provide valuable information about applying different methods to a particular network. The chosen example allows us to compare different robust methods. The paper focuses on non-standard applications of the EIF, for example in choosing the steering parameter of robust methods (usually related to the assumed interval of acceptable observation errors). The paper shows that commonly used values do not always work well, and EIFs can help in choosing appropriate values that guarantee the robustness of the estimation process. The most important new application of EIFs concerns the detection and assessment of a single gross error. The blinded experiments proved that such an approach is correct and can be an alternative to classic statistical tests for outlier detection.
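A toy sketch of the underlying idea is shown below: contaminate one observation by a range of gross errors and record how a given estimator reacts. The scalar example with the mean and the median is purely illustrative; the paper works with geodetic network estimators rather than these simple statistics.

```python
import numpy as np

def empirical_influence(estimator, obs, index, deltas):
    """Empirical influence of a single contaminated observation on an estimate.

    For each perturbation delta, the chosen observation is contaminated and
    the change of the estimate is recorded:
        EIF(delta) = T(y with y[index] + delta) - T(y).
    Plotting EIF against delta shows how a gross error of a given size
    propagates into the estimate.
    """
    base = estimator(np.asarray(obs, dtype=float))
    eif = []
    for d in deltas:
        y = np.array(obs, dtype=float)
        y[index] += d
        eif.append(estimator(y) - base)
    return np.array(eif)

# Toy comparison of a non-robust and a robust estimator (illustrative only):
obs = [1.02, 0.98, 1.01, 0.99, 1.00]
deltas = np.linspace(-1.0, 1.0, 5)
print(empirical_influence(np.mean, obs, 0, deltas))    # grows linearly with the error
print(empirical_influence(np.median, obs, 0, deltas))  # bounded: robust to the error
```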


2021 ◽  
Author(s):  
Jun Meng ◽  
Gangyi Ding ◽  
Laiyang Liu ◽  
Zheng Guan

Abstract In this study, a data-driven regional carbon emissions prediction model is proposed. The Grubbs criterion is used to eliminate gross error data from the carbon emissions sensor readings. Then, based on nearby valid data, the exponential smoothing method is used to interpolate missing values and generate continuous sequences for model training. Finally, a GRU network, a deep learning method, processes these standardized sequences to obtain the prediction model. The prediction model was trained and evaluated on a wireless carbon sensor network monitoring data set covering August 2012 to April 2014 and compared with a prediction model based on a BP network. The experimental results demonstrate the feasibility of the research method and related technical approaches and the accuracy of the prediction model, which provides a methodological basis for the nowcasting of carbon emissions and other greenhouse gas environmental data.
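For reference, the sketch below applies the standard two-sided Grubbs test to a window of sensor readings to flag a single gross error; the window values and significance level are illustrative assumptions, not the paper's data or settings.

```python
import numpy as np
from scipy import stats

def grubbs_outlier(x, alpha=0.05):
    """Two-sided Grubbs test for a single gross error in a data window.

    Returns the index of the most extreme value if it is a significant
    outlier at level `alpha`, otherwise None. Assumes the remaining data
    are approximately normally distributed.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    i = int(np.argmax(np.abs(x - x.mean())))
    g = np.abs(x[i] - x.mean()) / x.std(ddof=1)
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return i if g > g_crit else None

# Illustrative CO2 readings (ppm) with one spurious spike:
window = [412.1, 411.8, 412.4, 455.0, 412.0, 411.9]
print(grubbs_outlier(window))  # flags index 3 as a gross error
```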

