DEVELOPMENT AND APPLICATION OF THE KEY TECHNOLOGIES FOR THE QUALITY CONTROL AND INSPECTION OF NATIONAL GEOGRAPHICAL CONDITIONS SURVEY PRODUCTS

Author(s):  
Y. Zhao ◽  
L. Zhang ◽  
W. Ma ◽  
P. Zhang ◽  
T. Zhao

The First National Geographical Condition Survey is a pioneering task to dynamically grasp the basic conditions of nature, ecology, and human activity on the earth's surface, and it is a brand-new piece of surveying and mapping geographic information engineering. To ensure comprehensive, true, and accurate survey results and to achieve the quality management targets of a 100% qualified rate and a yield above 80%, quality control and result inspection must be carried out for the national geographical conditions survey on a national scale. The survey is characterized by its large scale, wide coverage, many undertaking units, many management levels, evolving technology, numerous production processes, and obvious regional differences, combined with novel forms of achievement, complicated dependencies, many special reference data, and large data volumes. To ensure that achievement quality meets the quality target requirements, this paper develops a "five-in-one" key quality control method constituted by the quality control system of the national geographical condition survey, the quality inspection technology system, the quality evaluation system, the quality inspection information management system, and nationally linked quality control institutions. Fully considering related domestic and foreign research results and production practice experience, and combining technological development with the needs of production, the project stipulates the inspection methods and technical requirements of each stage in the quality inspection of geographical condition survey results, extends traditional inspection and acceptance technology, and solves key technical problems urgently needed by the first national geographic survey.
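The stated targets (a qualified rate of 100% and a yield above 80%) can be read as a simple batch-level acceptance rule. The sketch below is an illustrative interpretation only: the grade names, and the reading of "yield" as the share of units graded good or better, are assumptions, not definitions from the paper.

```python
# Hypothetical sketch of the batch quality targets: 100% qualified rate
# and a yield (share of units graded good or better) above 80%.
# Grade names are illustrative assumptions.

def batch_meets_targets(unit_grades):
    """unit_grades: per-unit inspection grades, e.g.
    'excellent', 'good', 'qualified', 'unqualified'."""
    total = len(unit_grades)
    qualified = sum(g != "unqualified" for g in unit_grades)
    high = sum(g in ("excellent", "good") for g in unit_grades)
    qualified_rate = qualified / total
    yield_rate = high / total
    return qualified_rate == 1.0 and yield_rate > 0.8

print(batch_meets_targets(["excellent"] * 3 + ["good"] * 6 + ["qualified"]))  # → True
```

Under this reading, a single unqualified unit fails the whole batch, which matches the strict 100% qualified-rate target.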

IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 135582-135594 ◽  
Author(s):  
Cuiying Zhou ◽  
Zichun Du ◽  
Li Gao ◽  
Weihua Ming ◽  
Jinwu Ouyang ◽  
...  

2018 ◽  
Vol 26 (2) ◽  
pp. 278-289
Author(s):  
Kristen Cibelli Hibben ◽  
Beth-Ellen Pennell ◽  
Lesli Scott

Purpose At the invitation of the Programme for the International Assessment of Adult Competencies (PIAAC), this paper aims to examine advances in survey interviewer monitoring and make recommendations on minimizing the effect of interviewers on survey results. Design/methodology/approach The authors first provide an overview of the most recent literature on interviewer effects, quality assurance and quality control. Here, they draw upon recent publications such as the cross-cultural survey guidelines (www.ccsg.isr.umich.edu) and newly published or in-press material specifically addressing these issues in multicultural, multinational and multiregional (3MC) contexts. Findings The authors discuss trends and innovations in quality assurance and quality control in 3MC studies and draw upon examples from international surveys that are using cutting-edge and innovative approaches to monitor interviewer behavior and minimize interviewer effects. Originality/value With a view to continuous quality improvement, the authors conclude with concrete recommendations for PIAAC to consider for the next cycle. Many of the recommendations have general relevance for other large-scale cross-national surveys.


2014 ◽  
Vol 13s7 ◽  
pp. CIN.S16346 ◽  
Author(s):  
Scott White ◽  
Karoline Laske ◽  
Marij J.P. Welters ◽  
Nicole Bidmon ◽  
Sjoerd H. Van Der Burg ◽  
...  

With the recent results of promising cancer vaccines and immunotherapy [1–5], immune monitoring has become increasingly relevant for measuring treatment-induced effects on T cells, and an essential tool for shedding light on the mechanisms responsible for a successful treatment. Flow cytometry is the canonical multi-parameter assay for the fine characterization of single cells in solution, and is ubiquitously used in pre-clinical tumor immunology and in cancer immunotherapy trials. Current state-of-the-art polychromatic flow cytometry involves multi-step, multi-reagent assays followed by sample acquisition on sophisticated instruments capable of capturing up to 20 parameters per cell at a rate of tens of thousands of cells per second. Given the complexity of flow cytometry assays, reproducibility is a major concern, especially for multi-center studies. A promising approach for improving reproducibility is the use of automated analysis borrowing from statistics, machine learning and information visualization [21–23], as these methods directly address the subjectivity, operator dependence, labor intensiveness and low fidelity of manual analysis. However, it is quite time-consuming to investigate and test new automated analysis techniques on large data sets without a centralized information management system. For large-scale automated analysis to be practical, the presence of consistent and high-quality data linked to the raw FCS files is indispensable. In particular, the use of machine-readable standard vocabularies to characterize channel metadata is essential when constructing analytic pipelines to avoid errors in processing, analysis and interpretation of results. For automation, this high-quality metadata needs to be programmatically accessible, implying the need for a consistent Application Programming Interface (API).
In this manuscript, we propose that upfront time spent normalizing flow cytometry data to conform to carefully designed data models enables automated analysis, potentially saving time in the long run. The ReFlow informatics framework was developed to address these data management challenges.
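The kind of machine-readable channel-metadata check the abstract argues for can be sketched as a pre-pipeline validation step. This is not the ReFlow API: the controlled vocabulary, record layout, and function name below are illustrative assumptions.

```python
# Illustrative sketch (not the ReFlow API): validating that flow cytometry
# channel metadata uses a controlled marker vocabulary before entering an
# automated analysis pipeline. Vocabulary and record layout are assumed.

CONTROLLED_MARKERS = {"CD3", "CD4", "CD8", "CD45RA", "IFNg"}

def validate_channels(channels):
    """channels: list of dicts like {'name': 'FL1-A', 'marker': 'CD4'}.
    Returns a list of error strings; an empty list means consistent metadata."""
    errors = []
    for ch in channels:
        marker = ch.get("marker")
        if not marker:
            errors.append(f"{ch.get('name', '?')}: missing marker label")
        elif marker not in CONTROLLED_MARKERS:
            errors.append(f"{ch['name']}: unknown marker '{marker}'")
    return errors

print(validate_channels([{"name": "FL1-A", "marker": "CD4"},
                         {"name": "FL2-A", "marker": "cd4"}]))
# → ["FL2-A: unknown marker 'cd4'"]
```

Running such a check up front is exactly the "time spent normalizing" trade-off the authors describe: errors are caught once, at ingest, rather than propagated through every downstream analysis.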


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Ksenia Lisova ◽  
Jia Wang ◽  
Tibor Jacob Hajagos ◽  
Yingqing Lu ◽  
Alexander Hsiao ◽  
...  

Abstract Current equipment and methods for the preparation of radiopharmaceuticals for positron emission tomography (PET) are expensive and best suited for large-scale, multi-dose batches. Microfluidic radiosynthesizers have been shown to provide an economic approach to synthesizing these compounds in smaller quantities, but can also be scaled to clinically relevant levels. Batch microfluidic approaches, in particular, offer a significant reduction in system size and reagent consumption. Here we show a simple and rapid technique to concentrate the radioisotope prior to synthesis in a droplet-based radiosynthesizer, enabling production of clinically relevant batches of [18F]FET and [18F]FBB. The synthesis was carried out with an automated synthesizer platform based on a disposable Teflon-silicon surface-tension trap chip. Up to 0.1 mL (4 GBq) of radioactivity was used per synthesis by drying cyclotron-produced aqueous [18F]fluoride in small increments directly inside the reaction site. Precursor solution (10 µL) was added to the dried [18F]fluoride, the reaction chip was heated for 5 min to perform radiofluorination, and then a deprotection step was performed with addition of acid solution and heating. The product was recovered in an 80 µL volume and transferred to analytical HPLC for purification. Purified product was formulated via evaporation and resuspension or a micro-SPE formulation system. Quality control testing was performed on 3 sequential batches of each tracer. The method afforded production of up to 0.8 GBq of [18F]FET and [18F]FBB. Each production was completed within an hour. All batches passed quality control testing, confirming suitability for human use. In summary, we present a simple and efficient synthesis of clinically relevant batches of [18F]FET and [18F]FBB using a microfluidic radiosynthesizer.
This work demonstrates that the droplet-based micro-radiosynthesizer has a potential for batch-on-demand synthesis of 18F-labeled radiopharmaceuticals for human use.


1966 ◽  
Vol 05 (02) ◽  
pp. 67-74 ◽  
Author(s):  
W. I. Lourie ◽  
W. Haenszel

Quality control of data collected in the United States by the Cancer End Results Program, utilizing punchcards prepared by participating registries in accordance with a Uniform Punchcard Code, is discussed. Existing arrangements decentralize responsibility for editing and related data processing to the local registries, with centralization of tabulating and statistical services in the End Results Section, National Cancer Institute. The most recent deck of punchcards represented over 600,000 cancer patients; approximately 50,000 newly diagnosed cases are added annually. Mechanical editing and inspection of punchcards and field audits are the principal tools for quality control. Mechanical editing of the punchcards includes testing for blank entries and detection of inadmissible or inconsistent codes. Highly improbable codes are subjected to special scrutiny. Field audits include the drawing of a 1–10 percent random sample of punchcards submitted by a registry; the charts are then reabstracted and recoded by an NCI staff member, and differences between the punchcard and the results of independent review are noted.
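The mechanical edits described above (blank-entry tests, inadmissible codes, cross-field consistency) map directly onto simple record validation. The sketch below is illustrative only: the field names and code lists are assumptions, not the actual Uniform Punchcard Code.

```python
# Hedged sketch of the mechanical edit checks described above: blank entries,
# inadmissible codes, and cross-field consistency. Field names and code
# values are illustrative, not the actual Uniform Punchcard Code.

ADMISSIBLE_SEX = {"1", "2"}                      # assumed: 1 = male, 2 = female
ADMISSIBLE_SITE = {"140", "150", "174", "185"}   # assumed site codes

def edit_record(rec):
    """rec: dict of punched fields. Returns a list of edit failures."""
    errors = []
    for field in ("sex", "site", "year_dx"):
        if not rec.get(field, "").strip():
            errors.append(f"blank entry: {field}")
    if rec.get("sex") and rec["sex"] not in ADMISSIBLE_SEX:
        errors.append(f"inadmissible sex code: {rec['sex']}")
    if rec.get("site") and rec["site"] not in ADMISSIBLE_SITE:
        errors.append(f"inadmissible site code: {rec['site']}")
    # consistency check: prostate site (assumed code 185) must be male
    if rec.get("site") == "185" and rec.get("sex") == "2":
        errors.append("inconsistent codes: female with prostate site")
    return errors

print(edit_record({"sex": "2", "site": "185", "year_dx": "1964"}))
# → ['inconsistent codes: female with prostate site']
```

A field audit then plays the complementary role: re-abstracting a sample of source charts catches errors that no mechanical edit of the card alone can detect.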


Author(s):  
Lior Shamir

Abstract Several recent observations using large data sets of galaxies showed non-random distribution of the spin directions of spiral galaxies, even when the galaxies are too far from each other to have gravitational interaction. Here, a data set of $\sim8.7\cdot10^3$ spiral galaxies imaged by the Hubble Space Telescope (HST) is used to test and profile a possible asymmetry between galaxy spin directions. The asymmetry between galaxies with opposite spin directions is compared to the asymmetry of galaxies from the Sloan Digital Sky Survey. The two data sets contain different galaxies at different redshift ranges, and each data set was annotated using a different annotation method. Both data sets show a similar asymmetry in the COSMOS field, which is covered by both telescopes. Fitting the asymmetry of the galaxies to cosine dependence shows a dipole axis with probabilities of $\sim2.8\sigma$ and $\sim7.38\sigma$ in HST and SDSS, respectively. The most likely dipole axis identified in the HST galaxies is at $(\alpha=78^{\circ},\delta=47^{\circ})$ and is well within the $1\sigma$ error range of the location of the most likely dipole axis in the SDSS galaxies with $z>0.15$, identified at $(\alpha=71^{\circ},\delta=61^{\circ})$.
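The "fitting to cosine dependence" step can be sketched as a grid search over candidate dipole axes: for each axis, each galaxy's spin sign is weighted by the cosine of its angular distance from the axis, and the axis with the strongest net signal is the most likely dipole. This is a simplified illustration under assumed inputs, not the paper's actual statistical procedure (which quantifies significance in units of $\sigma$).

```python
# Simplified sketch of a dipole-axis grid search: the spin sign of each
# galaxy is weighted by the cosine of its angular distance from a candidate
# axis; the axis maximizing the net signal is the most likely dipole.
# Inputs and the significance treatment are assumptions for illustration.

import numpy as np

def unit_vec(ra_deg, dec_deg):
    """Unit 3-vector(s) for equatorial coordinates in degrees."""
    ra, dec = np.radians(ra_deg), np.radians(dec_deg)
    return np.stack([np.cos(dec) * np.cos(ra),
                     np.cos(dec) * np.sin(ra),
                     np.sin(dec)], axis=-1)

def best_dipole_axis(ra, dec, spin, step=10):
    """spin: +1 or -1 per galaxy. Returns (ra, dec, dipole strength)."""
    gal = unit_vec(np.asarray(ra, float), np.asarray(dec, float))
    best = (None, None, -np.inf)
    for a in range(0, 360, step):
        for d in range(-90, 91, step):
            cosang = gal @ unit_vec(a, d)          # cos(angular distance)
            strength = abs(np.dot(spin, cosang)) / len(spin)
            if strength > best[2]:
                best = (a, d, strength)
    return best
```

With spins correlated with hemisphere (e.g. +1 above a plane, −1 below), the search recovers an axis near that plane's pole; the absolute value makes the axis and its antipode equivalent, as for any dipole.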


2021 ◽  
pp. 1-11
Author(s):  
Song Gang ◽  
Wang Xiaoming ◽  
Wu Junfeng ◽  
Li Shufang ◽  
Liu Zhuowen ◽  
...  

In view of the production quality management of filter rods in the manufacturing and execution processes of cigarette enterprises, this paper analyzes the necessity of implementing a manufacturing execution system (MES) in the filter rod production process. The MES-based filter rod quality system of a cigarette enterprise is studied in full, and the information management system demand analysis, cigarette quality control process, system function module design, implementation, and test results are given. The paper uses the fuzzy analytic hierarchy process to select the optimal system for cigarette manufacturing. The implementation of the MES-based filter rod quality information management system in a cigarette enterprise ensures quality control in the cigarette production process. Through visualized, real-time, and dynamic means, the information management of cigarette production is completed, which greatly improves the quality of the cigarette enterprise's manufacturing process.
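The analytic hierarchy process underlying the system selection derives criterion weights from a pairwise comparison matrix. The sketch below shows the standard geometric-mean weighting step on a crisp matrix; a fuzzy AHP would replace the crisp entries with fuzzy numbers and defuzzify. The criteria and matrix values are illustrative assumptions, not from the paper.

```python
# Hedged sketch of the AHP weighting step behind the fuzzy analytic
# hierarchy process mentioned above, using the geometric-mean method on a
# crisp reciprocal pairwise comparison matrix. Criteria and values are
# illustrative assumptions.

import math

def ahp_weights(matrix):
    """matrix[i][j]: how strongly criterion i outweighs criterion j
    (a reciprocal matrix: matrix[j][i] == 1 / matrix[i][j])."""
    n = len(matrix)
    geo = [math.prod(row) ** (1 / n) for row in matrix]   # row geometric means
    total = sum(geo)
    return [g / total for g in geo]                       # normalized weights

# e.g. assumed quality criteria: (appearance, physical specs, process stability)
pairwise = [[1,   3,   5],
            [1/3, 1,   2],
            [1/5, 1/2, 1]]
print([round(w, 3) for w in ahp_weights(pairwise)])  # → [0.648, 0.23, 0.122]
```

The resulting weights sum to 1 and rank the criteria; candidate systems are then scored against each criterion and the weighted scores are compared.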


2020 ◽  
Vol 7 (1) ◽  
Author(s):  
Harshi Weerakoon ◽  
Jeremy Potriquet ◽  
Alok K. Shah ◽  
Sarah Reed ◽  
Buddhika Jayakody ◽  
...  

Abstract Data-independent analysis (DIA), exemplified by sequential window acquisition of all theoretical mass spectra (SWATH-MS), provides robust quantitative proteomics data, but the lack of a public primary human T-cell spectral library is a current resource gap. Here, we report the generation of a high-quality spectral library containing data for 4,833 distinct proteins from human T-cells across genetically unrelated donors, covering ~24% of the proteins in the UniProt/SwissProt reviewed human proteome. SWATH-MS analysis of 18 primary T-cell samples using the new human T-cell spectral library reliably identified and quantified 2,850 proteins at a 1% false discovery rate (FDR). In comparison, the larger Pan-Human spectral library identified and quantified 2,794 T-cell proteins in the same dataset. As the libraries identified an overlapping set of proteins, combining the two libraries resulted in quantification of 4,078 human T-cell proteins. Collectively, this large data archive will be a useful public resource for human T-cell proteomic studies. The human T-cell library is available at SWATHAtlas, and the data are available via ProteomeXchange (PXD019446 and PXD019542) and PeptideAtlas (PASS01587).
