Visual Quality Control of Planar Working Pieces: A Curve Based Approach Using Prototype Fitting

Author(s):
Georg Maier,
Andreas Schindler

2019, Vol. 1335, pp. 012013

Author(s):
Simon Reich,
Florian Teich,
Minija Tamosiunaite,
Florentin Wörgötter,
Tatyana Ivanovska

Author(s):
Pradeep Reddy Raamana,
Athena Theyers,
Tharushan Selliah,
Piali Bhati,
Stephen R. Arnott,
...

Abstract
Quality control of morphometric neuroimaging data is essential to improve reproducibility. Owing to the complexity of neuroimaging data and, consequently, of interpreting its results, visual inspection by trained raters is the most reliable way to perform quality control. Here, we present a protocol for visual quality control of the anatomical accuracy of FreeSurfer parcellations, based on an easy-to-use, open-source tool called VisualQC. We comprehensively evaluate its utility in terms of error detection rate and inter-rater reliability on two large multi-site datasets, and discuss site differences in error patterns. This evaluation shows that VisualQC is a practically viable protocol for community adoption.


2019
Author(s):
Yassine Benhajali,
AmanPreet Badhwar,
Helen Spiers,
Sebastian Urchs,
Jonathan Armoza,
...

Automatic alignment of brain anatomy in a standard space is a key step when processing magnetic resonance imaging for group analyses. Such brain registration is prone to failure, and the results are therefore typically reviewed visually to ensure quality. There is, however, no standard, validated protocol available to perform this visual quality control. We propose here a standardized QC protocol for brain registration, with minimal training overhead and no required knowledge of brain anatomy. We validated the reliability of three-level QC ratings (OK, Maybe, Fail) across different raters. Nine experts each rated N=100 validation images, and reached moderate to good agreement (Kappa from 0.4 to 0.68, average of 0.54±0.08), with the highest agreement for "Fail" images (Dice from 0.67 to 0.93, average of 0.8±0.06). We then recruited volunteers through the Zooniverse crowdsourcing platform, and extracted a consensus panel rating for both the Zooniverse raters (N=41) and the expert raters. The agreement between the expert and Zooniverse panels was high (kappa=0.76), demonstrating the feasibility of crowdsourcing QC of brain registration. Our brain registration QC protocol will help standardize QC practices across laboratories, improve the consistency of reporting of QC in publications, and open the way for QC assessment of large datasets, which could be used to train automated QC systems.
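The kappa values reported above measure chance-corrected agreement between raters. A minimal sketch of Cohen's kappa for two raters over the three-level ratings (OK, Maybe, Fail) follows; the rating sequences are illustrative only, not data from the study:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters on the same items."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement under independence of the raters' marginal distributions.
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(count_a[c] * count_b[c] for c in set(count_a) | set(count_b)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Illustrative ratings for six images from two hypothetical raters.
rater_1 = ["OK", "OK", "Maybe", "Fail", "Fail", "OK"]
rater_2 = ["OK", "Maybe", "Maybe", "Fail", "Fail", "OK"]
print(cohens_kappa(rater_1, rater_2))  # → 0.75
```

Panel-vs-panel agreement (the kappa=0.76 figure) can be computed the same way, with each panel's consensus rating treated as a single rater.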


Author(s):
Loek Tonnaer,
Jiapeng Li,
Vladimir Osin,
Mike Holenderski,
Vlado Menkovski
