Ensuring Quality of Large Scale Industrial Process Collections: Experiences from a Case Study

Author(s):  
Merethe Heggset ◽  
John Krogstie ◽  
Harald Wesenberg
2018 ◽  
Vol 64 (247) ◽  
pp. 811-821 ◽  
Author(s):  
STEFAN LIPPL ◽  
SAURABH VIJAY ◽  
MATTHIAS BRAUN

Despite their importance for mass-balance estimates and the progress in techniques based on optical and thermal satellite imagery, the mapping of debris-covered glacier boundaries remains a challenging task. Manual corrections hamper regular updates. In this study, we present an automatic approach to delineate glacier outlines using interferometrically derived synthetic aperture radar (InSAR) coherence, slope and morphological operations. InSAR coherence detects the temporally decorrelated surface (e.g. glacial extent) irrespective of its surface type and separates it from the highly coherent surrounding areas. We tested the impact of different processing settings, for example resolution, coherence window size and topographic phase removal, on the quality of the generated outlines. We found minor influence of the topographic phase, but a combination of strong multi-looking during interferogram generation and additional averaging during coherence estimation strongly deteriorated the coherence at the glacier edges. We analysed the performance of X-, C- and L-band radar data. The C-band Sentinel-1 data outlined the glacier boundary with the fewest misclassifications and a type II error of 0.47% compared with Global Land Ice Measurements from Space inventory data. Our study shows the potential of the Sentinel-1 mission together with our automatic processing chain to provide regular updates for land-terminating glaciers on a large scale.
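The coherence-thresholding-plus-morphology idea can be sketched in a few lines. The coherence and slope cutoffs and the 3×3 structuring element below are illustrative assumptions, not the calibrated values from the study:

```python
import numpy as np
from scipy import ndimage

def delineate_glacier(coherence, slope, coh_thresh=0.4, slope_max=24.0):
    """Mark pixels as glacier where InSAR coherence is low (temporal
    decorrelation) and terrain slope is moderate, then clean the mask
    with morphological opening and closing."""
    mask = (coherence < coh_thresh) & (slope < slope_max)
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))  # drop isolated speckle
    mask = ndimage.binary_closing(mask, structure=np.ones((3, 3)))  # fill small holes
    return mask

# Toy scene: a 10x10 low-coherence patch on a coherent background.
coh = np.ones((20, 20))
coh[5:15, 5:15] = 0.1
slp = np.full((20, 20), 10.0)
print(delineate_glacier(coh, slp).sum())  # → 100 glacier pixels detected
```

In practice the mask would be vectorised to outlines; the point here is only the decision rule the abstract describes (low coherence + slope + morphology).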


Author(s):  
Wagner Al Alam ◽  
Francisco Carvalho Junior

The efforts to make cloud computing suitable for the requirements of HPC applications have motivated us to design HPC Shelf, a cloud computing platform of services for building and deploying parallel computing systems for large-scale parallel processing. We introduce Alite, the system of contextual contracts of HPC Shelf, aimed at selecting component implementations according to the requirements of applications, the features of target parallel computing platforms (e.g. clusters), QoS (Quality-of-Service) properties and cost restrictions. It is evaluated through a small-scale case study employing a component-based framework for matrix multiplication based on the BLAS library.
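The contract-matching idea can be illustrated with a toy selector. This is a hypothetical sketch, not the actual Alite API: all class names, fields and figures below are invented:

```python
from dataclasses import dataclass

@dataclass
class Implementation:
    name: str
    platform: str        # required platform feature, e.g. "cluster-gpu"
    gflops: float        # advertised QoS property
    cost_per_hour: float

def select(implementations, platform, min_gflops, max_cost):
    """Pick the cheapest implementation whose contract satisfies the
    platform feature, the QoS floor and the cost ceiling."""
    candidates = [i for i in implementations
                  if i.platform == platform
                  and i.gflops >= min_gflops
                  and i.cost_per_hour <= max_cost]
    return min(candidates, key=lambda i: i.cost_per_hour, default=None)

impls = [
    Implementation("blas-openmp", "cluster-cpu", 120.0, 0.8),
    Implementation("blas-cublas", "cluster-gpu", 900.0, 2.5),
    Implementation("blas-naive",  "cluster-cpu",  15.0, 0.2),
]
best = select(impls, "cluster-cpu", min_gflops=100.0, max_cost=1.0)
print(best.name)  # → blas-openmp
```

Tightening the cost ceiling below the QoS-satisfying options makes `select` return `None`, which is where a real contract system would report an unsatisfiable context.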


2020 ◽  
Vol 11 (3) ◽  
pp. 49-65
Author(s):  
Emily Ng K.L.

The resources and time constraints of assessing large classes are always weighed against the validity, reliability, and learning outcomes of the assessment tasks. With the digital revolution in the 21st century, educators can benefit from computer technology to carry out large-scale assessment in higher education more efficiently. In this article, we present an in-depth case study of a nursing school that has integrated online assessment initiatives into its nursing program. To assess a large class of first-year nursing students, a series of non-proctored multiple-choice online quizzes is administered using a learning management system. Validity and reliability are commonly used to measure the quality of an assessment. The aim of the present article is to analyze these non-proctored multiple-choice online assessments in the context of content validity and reliability. We use this case study to examine online assessment in nursing education, exploring its benefits and challenges. We conclude that instructors have to determine how to use the full potential of online assessment while ensuring validity and reliability.
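For dichotomous multiple-choice items, reliability is commonly summarised with the Kuder–Richardson Formula 20 (KR-20). A minimal sketch on an invented response matrix (the abstract does not state which coefficient the school used):

```python
import numpy as np

def kr20(responses):
    """Kuder-Richardson Formula 20 for dichotomously scored (0/1) items.
    responses: students x items matrix."""
    X = np.asarray(responses, dtype=float)
    k = X.shape[1]                      # number of items
    p = X.mean(axis=0)                  # proportion correct per item
    q = 1.0 - p
    total_var = X.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

# Toy quiz: 5 students x 4 items, 1 = correct answer.
scores = [[1, 1, 1, 0],
          [1, 1, 0, 0],
          [1, 0, 0, 0],
          [1, 1, 1, 1],
          [0, 0, 0, 0]]
print(round(kr20(scores), 3))  # → 0.907
```

Values above roughly 0.7–0.8 are conventionally read as acceptable internal consistency for classroom tests.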



2001 ◽  
Vol 2 (4) ◽  
pp. 196-206 ◽  
Author(s):  
Christian Blaschke ◽  
Alfonso Valencia

The Dictionary of Interacting Proteins (DIP) (Xenarios et al., 2000) is a large repository of protein interactions: its March 2000 release included 2379 protein pairs whose interactions have been detected by experimental methods. Even if many of these correspond to poorly characterized proteins, the result of massive yeast two-hybrid screenings, as many as 851 correspond to interactions detected using direct biochemical methods. We used information retrieval technology to search automatically for sentences in Medline abstracts that support these 851 DIP interactions. Surprisingly, we found correspondence between DIP protein pairs and Medline sentences describing their interactions in only 30% of the cases. This low coverage has interesting consequences regarding the quality of annotations (references) introduced in the database and the limitations of the application of information extraction (IE) technology to Molecular Biology. It is clear that the limitation of analyzing abstracts rather than full papers and the lack of standard protein names are difficulties of considerably more importance than the limitations of the IE methodology employed. A positive finding is the capacity of the IE system to identify new relations between proteins, even in a set of proteins previously characterized by human experts. These identifications are made with a considerable degree of precision. This is, to our knowledge, the first large-scale assessment of IE capacity to detect previously known interactions: we thus propose the use of the DIP data set as a biological reference to benchmark IE systems.
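The coverage measurement can be illustrated with a crude co-occurrence baseline: count the fraction of curated pairs that are co-mentioned in at least one sentence. The protein pairs and sentences below are invented, and the paper's IE system goes well beyond substring matching:

```python
def coverage(pairs, sentences):
    """Fraction of protein pairs co-mentioned in at least one sentence
    (a naive stand-in for sentence-level IE support)."""
    found = set()
    for a, b in pairs:
        for s in sentences:
            text = s.lower()
            if a.lower() in text and b.lower() in text:
                found.add((a, b))
                break
    return len(found) / len(pairs)

# Invented example pairs and abstract sentences.
dip_pairs = [("Cdc28", "Cln2"), ("Ste7", "Fus3"), ("Gal4", "Gal80")]
abstracts = [
    "Cdc28 kinase activity requires association with Cln2.",
    "Gal80 binds Gal4 and represses its activation domain.",
]
print(round(coverage(dip_pairs, abstracts), 2))  # → 0.67 (2 of 3 pairs supported)
```

The unmatched pair illustrates the paper's point: absence of a supporting sentence may reflect missing abstracts or non-standard protein names rather than a wrong database entry.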


2021 ◽  
Vol 13 (16) ◽  
pp. 8819
Author(s):  
Luca Sbrogiò ◽  
Carlotta Bevilacqua ◽  
Gabriele De Sordi ◽  
Ivano Michelotto ◽  
Marco Sbrogiò ◽  
...  

Two-thirds of the Italian building stock was already built by the 1970s, largely according to gravity load design and using economical materials and poor workmanship. Currently, the structures, fixtures, and fittings of these buildings have reached the end of their service life, and they require both an assessment and an update to meet new standards and new needs. As an example of a common type, this article deals with the assessment of the present state and the proposal of an integrated structural and architectural intervention on an existing brick masonry mid-rise apartment building in the suburbs of Venice, Northern Italy. The structural analysis highlights a moderate vulnerability, despite the low seismic hazard, and the energy analysis indicates that the highest management costs are due to heating and sanitary uses. Low-impact strategies are preferred for each aspect of the required interventions. Their costs are counterbalanced by (a) the reduction to a fifth of the present management costs; (b) a 20% average increase in the economic value of the flats; and (c) a favorable tax regime at the national level. Transformed into parametric values, also useful for large-scale analyses, these costs resulted in a sustainable monthly instalment from the owners, who may also benefit from the increased quality of the place where they live.
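The step from a parametric intervention cost to a monthly instalment is the standard annuity formula. The principal, rate and term below are invented for illustration, not the article's figures:

```python
def monthly_instalment(principal, annual_rate, years):
    """Fixed monthly payment that amortises `principal` at a nominal
    `annual_rate` with monthly compounding (standard annuity formula)."""
    r = annual_rate / 12.0
    n = years * 12
    if r == 0:
        return principal / n
    return principal * r / (1.0 - (1.0 + r) ** -n)

# Illustrative: a 40 000 EUR retrofit share per flat, 2% over 15 years.
pay = monthly_instalment(40_000, 0.02, 15)
print(round(pay, 2))
```

Comparing this payment with the avoided management costs (the "reduction to a fifth" above) is what makes the instalment sustainable or not for the owners.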


2018 ◽  
Vol 10 (9) ◽  
pp. 3225 ◽  
Author(s):  
Tadayoshi Nakashima ◽  
Shigeyuki Okada

In the aftermath of the 1995 Kobe Earthquake, a large-scale effort towards reconstruction of houses damaged by the quake was required. This led to increased mortgage debt, financially burdening a number of earthquake victims and inhibiting their long-term sustainability and self-supported recovery. The current framework of housing reconstruction assistance provided by the Japanese government does not account for regional disparities in cost and other socioeconomic factors. This study proposes a technique for estimating the cost of reconstructing household units damaged in an earthquake by considering the effects of construction methods influenced by regional climatic zones. The financial constraints on rebuilding resources have been estimated by considering the annual regional income and household savings, as determined by social factors and employment opportunities. The susceptibility of regions to the occurrence of earthquakes has also been factored into the calculation of recovery costs. Together, these factors are used to provide a more complete picture of economic costs associated with earthquake recovery in different regions of Japan, thereby revealing large disparities in the difficulty and financial burden involved in the reconstruction of household units. Results of this study could be used to develop a robust system for earthquake-recovery assistance that accounts for differences in recovery costs between different regions, thereby improving the speed and quality of post-earthquake recovery.
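The cost-versus-capacity comparison can be sketched as follows. All numbers, the climate factor and the loan-capacity rule are invented assumptions, not the study's calibrated model:

```python
def reconstruction_burden(area_m2, unit_cost, climate_factor,
                          savings, annual_income, loan_multiple=5.0):
    """Illustrative regional estimate: rebuild cost scaled by a climatic
    construction-method factor, compared against household financial
    capacity (savings plus an income-based borrowing limit)."""
    cost = area_m2 * unit_cost * climate_factor
    capacity = savings + loan_multiple * annual_income
    return cost, cost - capacity  # positive gap = unmet burden

# Two hypothetical regions: a snowy climate (heavier construction)
# versus a temperate one, with identical household finances.
cost_a, gap_a = reconstruction_burden(100, 2000, 1.25, 30_000, 40_000)
cost_b, gap_b = reconstruction_burden(100, 2000, 1.00, 30_000, 40_000)
print(cost_a, gap_a)  # → 250000.0 20000.0
```

The sign of the gap is the regional disparity the study highlights: identical households face an unmet burden in one region and a surplus in another.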


2014 ◽  
Vol 18 (4) ◽  
pp. 1265-1272 ◽  
Author(s):  
L. Chen ◽  
Y. Zhong ◽  
G. Wei ◽  
Z. Shen

Abstract. The identification of priority management areas (PMAs) is essential for the control of non-point-source (NPS) pollution, especially for a large-scale watershed. However, previous studies have typically focused on small-scale catchments adjacent to specific assessment points; thus, the interactions between multiple river points remain poorly understood. In this study, a multiple-assessment-point PMA (MAP-PMA) framework was proposed by integrating the upstream sources and the downstream transport aspects of NPS pollution. The Daning River watershed was taken as a case study, demonstrating that the integration of upstream input changes was vital for the final PMA map, especially for downstream areas. Contrary to conventional wisdom, this research recommends that NPS pollutants can be best controlled among the upstream high-level PMAs when protecting the water quality of the entire watershed. The MAP-PMA framework provides a more cost-effective tool for the establishment of conservation practices, especially for a large-scale watershed.
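The upstream-aggregation idea behind multiple assessment points can be sketched on a toy river network. This is a simplified stand-in for the MAP-PMA framework, not its actual routing model; the network and loads are invented:

```python
def upstream_sets(downstream_of):
    """For each point, the set of subcatchments draining to it
    (including itself). downstream_of maps node -> next node (None = outlet)."""
    ups = {c: {c} for c in downstream_of}
    for c in downstream_of:
        node = downstream_of[c]
        while node is not None:
            ups[node].add(c)
            node = downstream_of[node]
    return ups

def priority_areas(local_load, downstream_of, point, top_n=1):
    """Rank the subcatchments contributing to a given assessment point
    by their local NPS load."""
    contributors = upstream_sets(downstream_of)[point]
    return sorted(contributors, key=lambda c: -local_load[c])[:top_n]

# Toy watershed: A and B drain into C, C into the outlet D.
net = {"A": "C", "B": "C", "C": "D", "D": None}
loads = {"A": 8.0, "B": 3.0, "C": 2.0, "D": 1.0}
print(priority_areas(loads, net, "D", top_n=2))  # → ['A', 'B']
```

The example reproduces the abstract's headline result in miniature: the highest-priority areas for protecting the outlet D lie upstream, not adjacent to it.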


Author(s):  
Khuyagbaatar Batsuren ◽  
Gábor Bella ◽  
Fausto Giunchiglia

We present CogNet, a large-scale, automatically built database of sense-tagged cognates—words of common origin and meaning across languages. CogNet is continuously evolving: its current version contains over 8 million cognate pairs over 338 languages and 35 writing systems, with new releases already in preparation. The paper presents the algorithm and input resources used for its computation, an evaluation of the result, as well as a quantitative analysis of cognate data leading to novel insights on language diversity. Furthermore, as an example of the use of large-scale cross-lingual knowledge bases for improving the quality of multilingual applications, we present a case study on the use of CogNet for bilingual lexicon induction in the framework of cross-lingual transfer learning.
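The lexicon-induction use case can be illustrated with sense-tagged cognate entries: two words translate each other if they share a cognate sense, even when the pair was never observed directly. The entry format and data below are hypothetical, not the actual CogNet schema:

```python
# Toy sense-tagged cognate pairs: (word1, word2, shared sense tag).
cognates = [
    ("en:night", "de:Nacht",  "bn:00023100n"),
    ("en:night", "it:notte",  "bn:00023100n"),
    ("en:water", "de:Wasser", "bn:00081546n"),
]

def induce(cognates, src, tgt):
    """Link src-language and tgt-language words that share a cognate
    sense tag, pivoting through any third-language entry."""
    by_sense = {}
    for w1, w2, sense in cognates:
        for w in (w1, w2):
            by_sense.setdefault(sense, set()).add(w)
    lexicon = set()
    for words in by_sense.values():
        srcs = {w for w in words if w.startswith(src + ":")}
        tgts = {w for w in words if w.startswith(tgt + ":")}
        lexicon |= {(s.split(":")[1], t.split(":")[1])
                    for s in srcs for t in tgts}
    return lexicon

print(induce(cognates, "de", "it"))  # → {('Nacht', 'notte')}
```

No German–Italian pair appears in the input; the link is induced through the shared sense of the English pivot, which is the transfer-learning idea the case study exploits.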

