data error
Recently Published Documents


TOTAL DOCUMENTS

258
(FIVE YEARS 62)

H-INDEX

20
(FIVE YEARS 3)

2022 ◽  
Vol 70 (1) ◽  
pp. 38-52
Author(s):  
Frank Schiller ◽  
Dan Judd ◽  
Peerasan Supavatanakul ◽  
Tina Hardt ◽  
Felix Wieczorek

Abstract A fundamental measure of safety communication is the residual error probability, i.e., the probability of undetected errors. For the detection of data errors, typically a Cyclic Redundancy Check (CRC) is applied, and the resulting residual error probability is determined based on the Binary Symmetric Channel (BSC) model. The use of this model has been questioned, since several error types cannot be described sufficiently. In particular, the increasing introduction of security algorithms into underlying communication layers requires a more adequate channel model. This paper introduces an enhanced model that extends the list of considered data error types by combining the BSC model with a Uniformly Distributed Segments (UDS) model. Although models beyond the BSC are applied, the established method of calculating the residual error probability can be maintained.
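As a concrete illustration of the residual error probability under the BSC model: for a linear CRC code, an error pattern goes undetected exactly when its error polynomial is divisible by the generator, so for short block lengths the residual error probability can be brute-forced. The sketch below uses a toy 3-bit CRC (generator x^3 + x + 1) on an 8-bit block; this is an illustrative toy, not a safety-grade CRC and not the paper's method.

```python
def mod2_rem(value, poly):
    """Remainder of GF(2) polynomial division of value by poly (ints as bit vectors)."""
    plen = poly.bit_length()
    while value.bit_length() >= plen:
        value ^= poly << (value.bit_length() - plen)
    return value

def residual_error_prob(n, poly, p):
    """Residual (undetected) error probability of an n-bit CRC codeword on a BSC
    with bit error probability p: sum the BSC probability of every nonzero
    error pattern that is divisible by the generator polynomial."""
    prob = 0.0
    for e in range(1, 1 << n):
        if mod2_rem(e, poly) == 0:      # undetectable: the error pattern is a codeword
            w = bin(e).count("1")       # Hamming weight of the error pattern
            prob += p ** w * (1 - p) ** (n - w)
    return prob

# Example: 8-bit block protected by the toy generator x^3 + x + 1 (0b1011)
print(residual_error_prob(8, 0b1011, 0.01))
```

For realistic block lengths the enumeration is replaced by the code's weight distribution, i.e., P_ud(p) = sum over d of A_d p^d (1-p)^(n-d), but the quantity computed is the same.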


2021 ◽  
Author(s):  
Indra Priyadharshini ◽  
Jasmine Gilda A ◽  
Sherin Glory J ◽  
Mukhil V

E-society is a financial and event management system: a web-based application, developed with the waterfall model, for managing the financial operations typically carried out in a housing society. It also provides facilities to create, organize, and prioritize events and to raise funds for them. At present these details are maintained in a spreadsheet, which has its own issues when it comes to calculations, human-introduced data errors, missing required precision, etc. Due to the manual maintenance of financial records, producing a spending report has become tedious and difficult within a given amount of time. The system is used exclusively by a small group or organization and allows people to keep track of transactions between members of the society, the admin, and the workers working for that organization or society. It reduces the manual calculations and human errors involved in computing expenditure, and it restricts retrieving and updating facilities to authorized persons. To bring transparency to the spending of the society's funds, the application allows every user to generate a report on the expenses and funds collected within a given date range.


2021 ◽  
Vol 2133 (1) ◽  
pp. 012017
Author(s):  
Yanhong Bai ◽  
Yun Wu

Abstract This paper analyzes the factors influencing data errors in the chemical analysis of iron and steel materials, including sample preparation, sample decomposition, the analytical instrument, the reagents, and the analysis method. The purpose is to reduce the error of measurement results and improve the accuracy of data analysis by studying measures such as eliminating instrument application errors, selecting reagents carefully, appropriately increasing the number of experiments, strictly following the operating specifications, and making reasonable use of the allowable deviation table.


Diagnostics ◽  
2021 ◽  
Vol 11 (9) ◽  
pp. 1599
Author(s):  
Wen-Fan Chen ◽  
Hsin-You Ou ◽  
Cheng-Tang Pan ◽  
Chien-Chang Liao ◽  
Wen Huang ◽  
...  

Because previous studies have rarely investigated the recognition rate discrepancy and pathology data error that arise when a model is applied to different databases, the purpose of this study is to investigate the improvement in recognition rate achieved by deep learning-based liver lesion segmentation with the incorporation of hospital data. The recognition model used in this study is H-DenseUNet, applied to the segmentation of the liver and its lesions; a mixture of 2D/3D Hybrid-DenseUNet is used to reduce recognition time and system memory requirements. Differences in recognition results were determined by comparing training on the standard LiTS competition data set with training on the same set after mixing in an additional 30 patients. An average error of 9.6% was obtained by comparing the actual pathology data with the pathology data derived from analysis of the identified images imported from Kaohsiung Chang Gung Memorial Hospital. After mixing the LiTS database with hospital data for training, the average error rate of the recognition output was 1%. In the recognition part, the Dice coefficient was 0.52 after training for 50 epochs on the standard LiTS database, and increased to 0.61 after adding the 30 hospital cases to the training. Using 3D Slicer and ITK-SNAP software, a 3D image of the lesion and liver segmentation can be produced. It is hoped that this method will stimulate further research beyond the public standard databases, as well as studies of the applicability of hospital data, improving the generality of such databases.
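The Dice coefficient reported above (0.52 rising to 0.61) is the standard overlap score between a predicted and a reference segmentation mask. A minimal sketch of its computation (the function name and the empty-mask convention are our assumptions, not taken from the paper):

```python
import numpy as np

def dice_coefficient(pred, target):
    """Dice coefficient 2|A ∩ B| / (|A| + |B|) for binary segmentation masks."""
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    denom = pred.sum() + target.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, target).sum() / denom

# Example: two 2x2 masks overlapping in one voxel
print(dice_coefficient([[1, 1], [0, 0]], [[1, 0], [0, 0]]))  # 2*1/(2+1) ≈ 0.667
```

A Dice of 1.0 means perfect overlap and 0.0 means none, so the reported gain from 0.52 to 0.61 reflects a substantially better match between predicted and ground-truth lesion voxels.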


2021 ◽  
Vol 2021 ◽  
pp. 1-16
Author(s):  
Meng Wang ◽  
Guizhen Lu

The contrast source inversion (CSI) is a widely used and effective method for solving microwave imaging problems. The core of CSI is to recast the conventional inverse scattering problem as an optimization problem whose objective function contains two terms describing the state error and the data error, respectively. To date there has been almost no complete performance comparison, based on Fresnel data, of CSI and its related improved algorithms. In addition, the performance of the algorithm under different weights had not been analyzed, and the convergence of the original CSI is slow. This paper first compares the performance of traditional CSI and its improved algorithms on Fresnel data in three respects: qualitative imaging quality, convergence speed, and objective function value. Second, the influence of the state error and the data error under different weights on the convergence rate and the objective function value is studied. To address the slow convergence rate, the CSI with weights (W-CSI), the CSI with a dynamic reduction factor (CSI-DRF), and related algorithms are proposed, which achieve better convergence rates than their original counterparts. Finally, future research directions are outlined.
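For orientation, the two-term objective mentioned above commonly takes the following form in the CSI literature. This is a sketch in the usual notation (contrast \(\chi\), contrast sources \(w_j\), incident fields \(E_j^{\mathrm{inc}}\), measured data \(f_j\), radiation operators \(G_S\) and \(G_D\)), which may differ from this paper's own symbols:

```latex
F(\{w_j\},\chi)
  = \eta_S \sum_j \lVert f_j - G_S w_j \rVert_S^2
  + \eta_D \sum_j \lVert \chi E_j^{\mathrm{inc}} - w_j + \chi G_D w_j \rVert_D^2,
\qquad
\eta_S = \Big(\sum_j \lVert f_j \rVert_S^2\Big)^{-1},
\quad
\eta_D = \Big(\sum_j \lVert \chi E_j^{\mathrm{inc}} \rVert_D^2\Big)^{-1}.
```

The first term is the data error (measured in the measurement domain S) and the second is the state error (measured in the object domain D); varying the relative size of the normalization weights \(\eta_S\) and \(\eta_D\) corresponds to the weight analysis studied in the paper.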


2021 ◽  
Vol 3 (4) ◽  
pp. 308-319
Author(s):  
Mohammad Nurwahid

Geometry is a branch of mathematics and one of the subjects taught in elementary school mathematics. Measurement of area is one of its fundamental topics. In practice, with regard to area measurement skills, most students have difficulty describing the problem. The mistakes students make in answering a problem need to be identified, because the information obtained about errors in answering mathematics problems can be used to improve mathematics teaching and learning activities. The purpose of this study was to identify the errors made in solving area problems involving combined plane shapes, based on the Watson error categories. This is a descriptive qualitative study. The subjects were six 4th-grade students of MI Nurul Huda with three different ability levels, selected on the advice of the mathematics teacher and on the basis of daily test scores for the previous material. The results show that the errors made by the subjects were missing conclusion errors, incorrect data errors, incorrect procedure errors, missing data errors, and skills hierarchy problems.


2021 ◽  
Author(s):  
Maike Offer ◽  
Riccardo Scandroglio ◽  
Daniel Draebing ◽  
Michael Krautblatter

Warming of permafrost in steep rock walls decreases their mechanical stability and can trigger rockfalls and rockslides. However, the direct link between climate change and permafrost degradation is seldom quantified with precise monitoring techniques and long-term time series. Where boreholes are not possible, laboratory-calibrated Electrical Resistivity Tomography (ERT) is presumably the most accurate quantitative permafrost monitoring technique, providing a sensitive record of frozen vs. unfrozen bedrock. Recently, 4D inversions have also allowed quantification of the extent of frozen bedrock and of its changes with time (Scandroglio et al., in review).

In this study we (i) evaluate the influence of the inversion parameters on the computed volumes and (ii) connect the volumetric changes with measured mechanical consequences.

The ERT time series was recorded between 2006 and 2019 in steep bedrock at the permafrost-affected Steintälli Ridge (3100 m asl). 205 accurately positioned drilled-in steel electrodes in 5 parallel lines across the rock ridge have been repeatedly measured with similar hardware and are compared to laboratory temperature-resistivity (T–ρ) calibration of water-saturated samples from the field. Inversions were conducted with the open-source software BERT, for the first time with the aim of estimating permafrost volumetric changes over a decade.

(i) Here we present a sensitivity analysis of the outcomes, testing various plausible inversion set-ups. Results are computed with different input data filters, data error models, regularization parameters (λ), model roughness reweighting, and time-lapse constraints. The model with the largest permafrost degradation was obtained without any time-lapse constraints, whereas constraining each model with the prior measurement yields the smallest degradation. Important changes are also connected to the data error estimation, while the other settings seem to have less influence on the frozen volume. All inversions confirmed a drastic permafrost degradation over the last 13 years, with an average reduction of 3900±600 m³ (60±10% of the starting volume), in good agreement with the measured increase in air temperature.

(ii) An average bedrock thawing rate of ~300 m³/a is expected to significantly influence the stability of the ridge. Resistivity changes are especially evident on the south-west-exposed side and in the core of the ridge, and are connected here to deformations measured with a tape extensometer, in order to precisely estimate the mechanical consequences of bedrock warming.

In summary, the strong degradation of permafrost in the last decade is confirmed here, since the inversion settings have only a minor influence on the volume quantification. Internal thermal dynamics need to be correlated with measured external deformation for a correct interpretation of the consequences for stability. These results are a fundamental benchmark for evaluating mountain permafrost degradation in relation to climate change and demonstrate the key role of temperature-calibrated 4D ERT.

Reference:
Scandroglio, R. et al. (in review) '4D-Quantification of alpine permafrost degradation in steep rock walls using a laboratory-calibrated ERT approach', Near Surface Geophysics.

