Proficiency testing by interlaboratory comparisons

2001 ◽ Vol 6 (6) ◽ pp. 244-251 ◽ Author(s): N. P. Boley, Paul De Bièvre, Philip D. P. Taylor, Adam Uldall

2010 ◽ pp. 457-466 ◽ Author(s): Oswin Kerkhof, Michel van Son, Adriaan M. H. Van Der Veen

Sensors ◽ 2021 ◽ Vol 21 (11) ◽ pp. 3660 ◽ Author(s): Wladmir Araujo Chapetta, Jailton Santos das Neves, Raphael Carlos Santos Machado

Modern sensors deployed in most Industry 4.0 applications are intelligent: they exhibit sophisticated behavior, usually implemented in embedded software, and offer network connectivity. For that reason, calibrating an intelligent sensor now involves more than measuring physical quantities. Because the behavior of modern sensors depends on embedded software, a comprehensive assessment of such sensors necessarily demands analysis of that software. Interlaboratory comparisons, in turn, are comparative analyses of the performance of the laboratories involved in such assessments. While interlaboratory comparison is a well-established practice in fields related to the physical, chemical and biological sciences, it is a recent challenge for software assessment. Establishing quantitative metrics to compare the performance of accredited software analysis and testing laboratories is no trivial task: software is intangible, its requirements accommodate some ambiguity, inconsistency or information loss, and software testing and analysis are highly human-dependent activities. In the present work, we investigate whether performing interlaboratory comparisons for software assessment by means of quantitative performance measurement is feasible. The proposal was to evaluate each laboratory's competence in software code analysis activities using two quantitative metrics: code coverage and mutation score. Our results demonstrate the feasibility of establishing quantitative comparisons among accredited software analysis and testing laboratories. One of the comparison rounds was registered as formal proficiency testing in the database, the first registered proficiency testing focused on code analysis.
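
For context, the following is a minimal sketch, not taken from the paper, of how the two metrics named in the abstract (code coverage and mutation score) could be computed from per-laboratory results of a comparison round. All laboratory names, field names and figures below are illustrative assumptions, not data from the study.

    # Minimal sketch (assumptions, not the paper's method): computing statement
    # coverage and mutation score from hypothetical per-laboratory results.
    from dataclasses import dataclass


    @dataclass
    class LabRound:
        """Hypothetical results reported by one laboratory for a comparison round."""
        lab: str
        statements_executed: int   # statements exercised by the lab's test suite
        statements_total: int      # statements in the reference code artifact
        mutants_killed: int        # mutants detected (killed) by the lab's tests
        mutants_total: int         # mutants generated for the artifact

        @property
        def code_coverage(self) -> float:
            # Statement coverage: fraction of statements exercised by the tests.
            return self.statements_executed / self.statements_total

        @property
        def mutation_score(self) -> float:
            # Mutation score: fraction of generated mutants detected by the tests.
            return self.mutants_killed / self.mutants_total


    if __name__ == "__main__":
        # Illustrative, made-up numbers for two hypothetical participating labs.
        rounds = [
            LabRound("Lab A", statements_executed=870, statements_total=1000,
                     mutants_killed=140, mutants_total=200),
            LabRound("Lab B", statements_executed=790, statements_total=1000,
                     mutants_killed=120, mutants_total=200),
        ]
        for r in rounds:
            print(f"{r.lab}: coverage={r.code_coverage:.1%}, "
                  f"mutation score={r.mutation_score:.1%}")

Under these assumptions, each laboratory's two scores could then be compared across participants in a given round, which is the kind of quantitative comparison the abstract describes.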

