Data Processing in Pathology Laboratories: The Phoenix System

Author(s):  
J. Abson ◽  
A. Prall ◽  
I. D. P. Wootton

This paper completes the description of the Phoenix system by outlining the additional programs necessary to maintain the data files in a satisfactory condition and prevent them from becoming overfilled. The standards of training required of the operating staff are discussed, and an assessment is made of the system's performance in terms of cost/benefit. This was achieved by observing the time spent by staff during a period when the throughput of work was accurately measured. From these figures it is possible to estimate the needs of another laboratory. Finally, the continued extension of the computer facilities into other pathology disciplines and the provision of terminals in the hospital are described.
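The abstract does not describe the maintenance programs themselves. Purely as a hedged illustration, the Python sketch below shows the kind of housekeeping routine such a program might perform: purging records older than an assumed retention period so that a data file cannot become overfilled. The record layout, timestamp field and retention period are all assumptions made for the example, not details of the Phoenix system.

```python
import json
from datetime import datetime, timedelta
from pathlib import Path

# Hypothetical retention period; the actual Phoenix criteria are not given in the abstract.
RETENTION = timedelta(days=90)

def purge_old_records(data_file: Path, archive_file: Path, now: datetime) -> int:
    """Move records older than RETENTION from the live file to an archive file.

    Assumes one JSON record per line with a 'reported_at' ISO timestamp --
    an illustrative layout, not the Phoenix file format.
    """
    keep, archive = [], []
    for line in data_file.read_text().splitlines():
        record = json.loads(line)
        reported = datetime.fromisoformat(record["reported_at"])
        (archive if now - reported > RETENTION else keep).append(line)

    # Append expired records to the archive, then rewrite the live file smaller.
    with archive_file.open("a") as fh:
        fh.writelines(l + "\n" for l in archive)
    data_file.write_text("\n".join(keep) + ("\n" if keep else ""))
    return len(archive)
```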

2019 ◽  
Vol 214 ◽  
pp. 03019
Author(s):  
Catherine Biscarat ◽  
Tommaso Boccali ◽  
Daniele Bonacorsi ◽  
Concezio Bozzi ◽  
Davide Costanzo ◽  
...  

The increase in the scale of LHC computing expected for Run 3, and even more so for Run 4 (HL-LHC), over the next ten years will certainly require radical changes to the computing models and the data processing of the LHC experiments. Translating the requirements of the physics programmes into computing resource needs is a complicated process and subject to significant uncertainties. For this reason, WLCG has established a working group to develop methodologies and tools intended to characterise the LHC workloads, better understand their interaction with the computing infrastructure, calculate their cost in terms of resources and expenditure, and assist experiments, sites and the WLCG project in the evaluation of their future choices. This working group started in November 2017 and has about 30 active participants representing experiments and sites. In this contribution we present the activities, the results achieved and the future directions.
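The abstract does not spell out the cost model itself. The sketch below is only a toy illustration, in Python, of how workload requirements might be translated into resource costs; the unit prices, workload figures and field names are invented for the example and are not the working group's model.

```python
from dataclasses import dataclass

# Invented unit costs (cost units per CPU-hour and per TB-year) -- purely illustrative.
COST_PER_CPU_HOUR = 0.01
COST_PER_DISK_TB_YEAR = 20.0
COST_PER_TAPE_TB_YEAR = 5.0

@dataclass
class Workload:
    name: str
    cpu_hours: float      # total CPU needed
    disk_tb_years: float  # disk residency of inputs/outputs
    tape_tb_years: float  # archival storage

    def cost(self) -> float:
        return (self.cpu_hours * COST_PER_CPU_HOUR
                + self.disk_tb_years * COST_PER_DISK_TB_YEAR
                + self.tape_tb_years * COST_PER_TAPE_TB_YEAR)

# Example: compare two hypothetical workloads to see where the expenditure goes.
workloads = [
    Workload("reconstruction", cpu_hours=5e7, disk_tb_years=2e3, tape_tb_years=8e3),
    Workload("analysis", cpu_hours=1e7, disk_tb_years=4e3, tape_tb_years=5e2),
]
for w in workloads:
    print(f"{w.name}: {w.cost():,.0f} cost units")
```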


2017 ◽  
Vol 50 (3) ◽  
pp. 959-966 ◽  
Author(s):  
J. Filik ◽  
A. W. Ashton ◽  
P. C. Y. Chang ◽  
P. A. Chater ◽  
S. J. Day ◽  
...  

A software package for the calibration and processing of powder X-ray diffraction and small-angle X-ray scattering data is presented. It provides a multitude of data processing and visualization tools as well as a command-line scripting interface for on-the-fly processing and the incorporation of complex data treatment tasks. Customizable processing chains permit the execution of many data processing steps to convert a single image or a batch of raw two-dimensional data into meaningful data and one-dimensional diffractograms. The processed data files contain the full data provenance of each process applied to the data. The calibration routines can run automatically even for high energies and also for large detector tilt angles. Some of the functionalities are highlighted by specific use cases.
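The package's own scripting interface is not reproduced here. As a rough Python sketch of the idea of a customizable processing chain, the following reduces a synthetic two-dimensional detector image to a one-dimensional pattern by radial binning and records the provenance of every step applied to the data. The function names, chain format and provenance representation are assumptions for illustration only.

```python
import numpy as np

def radial_profile(image, centre, n_bins=200):
    """Azimuthally average a 2-D image about `centre` into a 1-D profile."""
    y, x = np.indices(image.shape)
    r = np.hypot(x - centre[0], y - centre[1])
    bins = np.linspace(0, r.max(), n_bins + 1)
    idx = np.clip(np.digitize(r.ravel(), bins) - 1, 0, n_bins - 1)
    sums = np.bincount(idx, weights=image.ravel(), minlength=n_bins)
    counts = np.bincount(idx, minlength=n_bins)
    return bins[:-1], sums / np.maximum(counts, 1)

def run_chain(image, steps):
    """Apply a configurable chain of (name, function) steps, keeping provenance."""
    provenance, data = [], image
    for name, step in steps:
        data = step(data)
        provenance.append(name)   # record each process applied to the data
    return data, provenance

# Example chain: crude background subtraction followed by 2-D -> 1-D reduction.
chain = [
    ("subtract_dark", lambda img: img - img.min()),
    ("radial_integration", lambda img: radial_profile(img, centre=(128, 128))),
]
frame = np.random.poisson(100, size=(256, 256)).astype(float)
(pattern_x, pattern_y), history = run_chain(frame, chain)
print(history)   # ['subtract_dark', 'radial_integration']
```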


1980 ◽  
Vol 2 (1) ◽  
pp. 63-72 ◽  
Author(s):  
A.A. CLARKE ◽  
S.J. COLEMAN ◽  
A. PRALL ◽  
I.D.P. WOOTTON

2019 ◽  
Vol 8 (1) ◽  
pp. 48-52
Author(s):  
ELVIANNA ◽  
Nurul Saepul ◽  
Doni Kristianto

This thesis report is based on an analysis of the existing data collection system and on the design of a new system at PT. SINAR MUSTIKA BINTAN SPBU Km 19 East Bintan. The analysis shows that manual data processing causes several problems that make the system less efficient. To remedy the deficiencies of the system currently in use, a new system was designed to increase work efficiency. The data processing application that was developed helps solve the problems that existed before; it was built using Borland Delphi 7.0. With this new system and application, the work process is expected to become more efficient.


Author(s):  
Anisa Tri Wahyuni ◽  
Sapri Sapri ◽  
Dimas Aulia Tringgana

PAUD Uswatun Hasanah in Bengkulu City is an educational institution that runs the TK, Kober, TPA and SPS programmes. Until now, data processing at the PAUD has used conventional methods: filling in registration forms for prospective students on paper, recording data on prospective students and teachers, and handling routine SPP tuition payments by hand. With this way of working, archived data files are frequently lost. An integrated client-server data processing system for PAUD Uswatun Hasanah Bengkulu was therefore built using the PHP programming language and a MySQL database. The application supports data processing and makes it easier for administrators to archive and retrieve data. Keywords: Client Server, PHP and MySQL
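The application itself is written in PHP with MySQL; as a language-neutral illustration of the archiving idea, the Python sketch below uses the built-in sqlite3 module as a stand-in for the MySQL database and shows how registration records can be stored and retrieved so they are not lost like paper files. The table layout and field names are assumptions, not the application's actual schema.

```python
import sqlite3

# sqlite3 stands in for the MySQL database of the real PHP application;
# the table layout below is an assumption made only for this illustration.
conn = sqlite3.connect("paud.db")
conn.execute("""CREATE TABLE IF NOT EXISTS registrations (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    student_name TEXT NOT NULL,
    programme TEXT NOT NULL            -- TK, Kober, TPA or SPS
)""")

def register(name: str, programme: str) -> int:
    """Archive a registration so it can always be retrieved later."""
    cur = conn.execute(
        "INSERT INTO registrations (student_name, programme) VALUES (?, ?)",
        (name, programme),
    )
    conn.commit()
    return cur.lastrowid

def find_by_programme(programme: str):
    """Retrieve archived registrations for one programme."""
    return conn.execute(
        "SELECT id, student_name FROM registrations WHERE programme = ?",
        (programme,),
    ).fetchall()

print(register("Siti", "TK"))
print(find_by_programme("TK"))
```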


Minerals ◽  
2018 ◽  
Vol 8 (7) ◽  
pp. 310
Author(s):  
Karin Engström ◽  
Kim Esbensen

Variographic characterisation has been shown to be a powerful tool for assessing the performance of process measurement systems using existing process data. Variogram interpretation enables decomposition of the variabilities stemming from the process and from the measurement system, respectively, making it possible to determine whether the measurements describe the true process variability with sufficient resolution. This study evaluated 14 critical sampling locations, covering a total of 34 separate measurement systems, along the full processing value chain at Luossavaara Kiirunavaara limited company (LKAB), Sweden. A majority of the variograms show low sill levels, indicating that many sub-processes are well controlled. Many also show a low nugget effect, indicating satisfactory measurement systems. However, some notable exceptions were observed, pointing to systems in need of improvement. Even though some of these were previously recognized internally at LKAB, variographic characterisation provides objective, numerical evidence of measurement system performance. The study also produced some unexpected results, for example that slurry shark-fin and spear sampling show acceptable variogram characteristics for the present materials, despite the associated incorrect sampling errors. On the other hand, the results support previous conclusions that manual sampling and cross-belt hammer samplers lead to unacceptably large sampling errors and should be abandoned. Such specific findings underline the strength of comprehensive empirical studies. Based on the present compilation of results, it is possible to conduct a rational enquiry into all evaluated measurement systems, enabling objective prioritization of where improvement efforts will have the largest cost–benefit effect.
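For readers unfamiliar with the sill and nugget terminology, the Python sketch below computes a standard experimental variogram from a one-dimensional process data series: the nugget effect is the value the variogram extrapolates to at lag zero (sampling plus measurement variance), while the sill is the level at which it flattens out (total process variance). The data series and lag range are invented for the example; this is not the study's dataset.

```python
import numpy as np

def experimental_variogram(series: np.ndarray, max_lag: int) -> np.ndarray:
    """Return V(j) = half the mean squared difference between points lag j apart."""
    series = np.asarray(series, dtype=float)
    v = np.empty(max_lag)
    for j in range(1, max_lag + 1):
        d = series[j:] - series[:-j]
        v[j - 1] = 0.5 * np.mean(d ** 2)
    return v

# Invented process data: a slowly drifting grade signal plus measurement noise.
rng = np.random.default_rng(0)
t = np.arange(500)
grade = 60 + 2 * np.sin(t / 40) + rng.normal(scale=0.3, size=t.size)

v = experimental_variogram(grade, max_lag=50)
print("V(1) (near the nugget):", round(v[0], 3))
print("V(50) (approaching the sill):", round(v[-1], 3))
```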


Proceedings ◽  
2019 ◽  
Vol 19 (1) ◽  
pp. 19
Author(s):  
Benjamin Arroquia-Cuadros ◽  
Ángel Marqués-Mateu ◽  
Laura Sebastiá ◽  
Pablo Fdez-Arroyabe

Biometeorology is the field that relates meteorological and climatic variables to humans, animals and the environment so that they can be studied jointly with their geographical distribution. The wide variety of data sources and the highly specialised data formats are fundamental issues for users in this area. This paper presents some preliminary results and several underlying technologies used to create a system to manage spatial data. Some sources of information are presented as a basis for biometeorological studies, together with a procedure for downloading and transforming the datasets. The resulting maps and data files derived from this study are useful for data analysis in other scientific fields.
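The paper's actual data sources and transformation pipeline are not reproduced here. As a minimal Python sketch of the download-and-transform idea, the following fetches a CSV of station observations from a placeholder URL and converts it to GeoJSON points suitable for mapping; the URL and the column names (lon, lat, temperature) are hypothetical.

```python
import csv
import io
import json
import urllib.request

# Hypothetical endpoint -- the paper's real data sources are not listed in the abstract.
SOURCE_URL = "https://example.org/biometeo/stations.csv"

def download_csv(url: str) -> list[dict]:
    """Fetch a CSV of station observations and parse it into dictionaries."""
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(text)))

def to_geojson(rows: list[dict]) -> dict:
    """Transform tabular observations into GeoJSON points for mapping."""
    features = [
        {
            "type": "Feature",
            "geometry": {"type": "Point",
                         "coordinates": [float(r["lon"]), float(r["lat"])]},
            "properties": {"temperature": float(r["temperature"])},
        }
        for r in rows
    ]
    return {"type": "FeatureCollection", "features": features}

if __name__ == "__main__":
    rows = download_csv(SOURCE_URL)   # assumes columns lon, lat, temperature
    with open("stations.geojson", "w") as fh:
        json.dump(to_geojson(rows), fh)
```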


2014 ◽  
Vol 519-520 ◽  
pp. 1159-1163
Author(s):  
Hui Lian Han

To address the lack of compatibility and interoperability in complex systems built from different buses and transducers, this paper introduces the IEEE 1451 standard protocol and analyses in detail the Transducer Electronic Data Sheet (TEDS) specified in the standard. On this basis, a smart sensor module was designed based on an FPGA; the TEDS specified in IEEE 1451.2 were designed, and the transducer data were filled in and downloaded through a graphical application programmed in VB. Adopting the intelligent TEDS of the IEEE 1451.2 standard improves system performance, cost-effectiveness and versatility, making complex systems easier to develop and data processing simpler.
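The exact IEEE 1451.2 TEDS field layout is not reproduced here. The Python sketch below only illustrates the general idea of a TEDS as a compact, self-describing binary block stored with the transducer and read back by the host; the fields and their encoding are simplified assumptions, not the standard's actual format.

```python
import struct

def pack_teds(manufacturer_id: int, model_number: int,
              min_value: float, max_value: float, unit_code: int) -> bytes:
    """Pack a simplified, TEDS-like descriptor into a fixed binary layout.

    Layout (an assumption for illustration, not the IEEE 1451.2 format):
    2-byte manufacturer ID, 2-byte model number, two 4-byte floats for the
    measurement range, 1-byte physical-unit code, little-endian.
    """
    return struct.pack("<HHffB", manufacturer_id, model_number,
                       min_value, max_value, unit_code)

def unpack_teds(blob: bytes) -> dict:
    """Read the descriptor back, as the host side would after download."""
    mid, model, lo, hi, unit = struct.unpack("<HHffB", blob)
    return {"manufacturer": mid, "model": model,
            "range": (lo, hi), "unit_code": unit}

teds = pack_teds(0x1234, 7, -40.0, 125.0, unit_code=1)   # e.g. a temperature sensor
print(len(teds), "bytes:", unpack_teds(teds))
```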

