Comparison of data extraction from standardized versus traditional narrative operative reports for database-related research and quality control

Surgery ◽  
2007 ◽  
Vol 142 (3) ◽  
pp. 420-421 ◽  
Author(s):  
Robin S. McLeod


Author(s):  
Eric S Kilpatrick

Background: Even when a laboratory analyte testing process is in control, routine quality control testing will fail with a frequency that can be predicted from the number of quality control levels used, the run frequency, and the control rule employed. We explored whether simply counting the number of assay quality control run failures during a rolling week, and then objectively determining whether there was an excess, could complement daily quality control processes in identifying an out-of-control assay.
Methods: Binomial statistics were used to determine the threshold number of quality control run failures in any rolling week that would statistically exceed the number expected for a particular test. Power function graphs were used to establish error detection rates (Ped) and false rejection rates compared with popular control rules.
Results: Identifying quality control failures exceeding the weekly limit (QC FEWL) is a more powerful means of detecting smaller systematic (bias) errors than traditional daily control rules (1₂ₛ, 1₃ₛ, or 1₃ₛ/2₂ₛ/R₄ₛ) and markedly superior in detecting smaller random (imprecision) errors, while maintaining false identification rates below 2%. Error detection rates also exceeded those of a combined within- and between-run Westgard multirule (1₃ₛ/2₂ₛ/4₁ₛ/10ₓ).
Conclusions: Daily review of tests shown to statistically exceed their rolling-week limit of expected quality control run failures is more powerful than traditional quality control tools at identifying potential systematic and random test errors, and so offers a supplement to daily quality control practices that requires no complex data extraction or manipulation.
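The binomial threshold described in the Methods section can be sketched as follows. This is a minimal illustration, not the authors' implementation: the run count, per-run false rejection probability, and significance level below are illustrative assumptions (the per-run probability would in practice be derived from the control rule and number of QC levels in use).

```python
from math import comb

def weekly_failure_limit(runs_per_week, p_fail, alpha=0.02):
    """Smallest number of QC run failures k in a rolling week such that
    P(X >= k) < alpha when the assay is in control, with X ~ Binomial(n, p).
    Observing k or more failures then flags a statistically unexpected excess."""
    def upper_tail(k):
        # P(X >= k) for X ~ Binomial(runs_per_week, p_fail)
        return sum(comb(runs_per_week, i) * p_fail**i * (1 - p_fail)**(runs_per_week - i)
                   for i in range(k, runs_per_week + 1))
    for k in range(runs_per_week + 1):
        if upper_tail(k) < alpha:
            return k
    return runs_per_week + 1  # limit never statistically exceeded

# Illustrative example: 21 QC runs per rolling week; with a 1-2s rule and two
# control levels the in-control per-run rejection rate is roughly
# 1 - 0.9545**2 ≈ 0.089 (an assumed figure, not taken from the study).
limit = weekly_failure_limit(21, 0.089)
```

A daily check then reduces to comparing the observed failure count over the trailing seven days against `limit`, with no further data manipulation required.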


2021 ◽  
Vol 12 ◽  
Author(s):  
Keke Wu ◽  
Shuangqi Fan ◽  
Linke Zou ◽  
Feifan Zhao ◽  
Shengming Ma ◽  
...  

Diseases caused by Flaviviridae have a wide global health and economic impact due to high morbidity and mortality. Flaviviridae infection usually leads to severe, acute or chronic diseases, such as liver injury and liver cancer resulting from hepatitis C virus (HCV) infection, or dengue hemorrhagic fever (DHF) and dengue shock syndrome (DSS) caused by dengue virus (DENV). The pathogenesis of Flaviviridae infections is highly complex and still not fully understood. Accumulating evidence suggests that these viruses disrupt host autophagy to regulate their life cycle. Organelle-specific autophagy selectively targets different organelles for quality control, which is essential for maintaining cellular homeostasis. As an important subprocess of autophagy, lipophagy regulates lipid metabolism by targeting lipid droplets (LDs) and is also closely related to infection by a variety of pathogenic microorganisms. In this review, we briefly summarize the interplay between LDs and Flaviviridae infection, outline the molecular events by which lipophagy occurs, and survey research progress on the regulatory mechanisms of lipophagy in Flaviviridae infection. Exploring the crosstalk between viral infection and lipophagy-induced molecular events may provide new avenues for antiviral therapy.


Author(s):  
Jun Zhang ◽  
Yong Zhang ◽  
Chang Liu ◽  
Tom Covey ◽  
Julia Nielsen ◽  
...  

High-throughput analysis of compounds dissolved in DMSO and arrayed in multiwell plates for quality control (QC) purposes has widespread utility in drug discovery, ranging from the QC of assay-ready plates dispatched by compound management, to compound integrity checks in the screening collection, to reaction monitoring of chemical syntheses in microtiter plates. Because of the large number of samples involved (thousands per batch), these workflows can place a significant burden on the liquid chromatography–mass spectrometry (LC-MS) platform typically used. To achieve the required speed of seconds per sample, several chromatography-free MS approaches have previously been tried, with mixed results. In this study, we demonstrated the feasibility of acoustic ejection–mass spectrometry (AE-MS) in full-scan mode for high-throughput compound QC in miniaturized formats, featuring direct, contactless liquid sampling, minimal sample consumption, and ultrafast analytical speed. Sample consumption and analysis time by AE-MS represent, respectively, a 1000-fold and a 30-fold reduction compared with LC-MS. In qualitative QC, AE-MS generated results comparable to conventional LC-MS in identifying the presence or absence of expected compounds. AE-MS also demonstrated its utility in relative quantification of the same compound across serial dilution plates, or of substrate in chemical synthesis. To facilitate processing of the large amount of data generated by AE-MS, we developed a data processing platform using commercially available tools. The platform demonstrated fast and straightforward data extraction, review, and reporting, eliminating the need to develop custom data processing tools. The overall AE-MS workflow has effectively eliminated the analytical bottleneck in the high-throughput compound QC work stream.
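The qualitative presence/absence check at the core of such a workflow can be sketched as follows. This is a hypothetical illustration, not the platform described in the study: the function name, adduct list, masses, and ppm tolerance are all assumptions chosen for the example.

```python
def qc_pass(expected_mass, observed_peaks, tol_ppm=10.0):
    """Return the matching adduct name if any observed m/z peak matches an
    expected adduct of the compound within tol_ppm, else None.
    expected_mass: neutral monoisotopic mass (Da); observed_peaks: list of m/z."""
    PROTON = 1.007276    # proton mass, Da
    SODIUM = 22.989218   # Na+ adduct mass shift, Da (approximate)
    adducts = {
        "[M+H]+": expected_mass + PROTON,
        "[M+Na]+": expected_mass + SODIUM,
    }
    for name, mz in adducts.items():
        for peak in observed_peaks:
            if abs(peak - mz) / mz * 1e6 <= tol_ppm:
                return name
    return None

# Example: caffeine (monoisotopic mass 194.0804 Da) with its protonated
# ion present in the full-scan spectrum.
hit = qc_pass(194.0804, [195.0877, 138.0662])
```

In a plate-level run, this check would be applied per well against the expected compound registered for that well, producing the pass/fail map used for QC reporting.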


Author(s):  
W.J. de Ruijter ◽  
M.R. McCartney ◽  
David J. Smith ◽  
J.K. Weiss

Further advances in resolution enhancement of transmission electron microscopes can be expected from digital processing of image data recorded with slow-scan CCD cameras. Image recording with these new cameras is essential because of their high sensitivity, extreme linearity, and negligible geometric distortion. Furthermore, digital image acquisition allows on-line processing, which yields virtually immediate reconstruction results. At present, the most promising techniques for exit-surface wave reconstruction are electron holography and the recently proposed focal variation method. The latter method is based on image processing applied to a series of images recorded at equally spaced defocus values.

Exit-surface wave reconstruction using the focal variation method as proposed by Van Dyck and Op de Beeck proceeds in two stages. First, the complex image wave is retrieved by data extraction from a parabola situated in three-dimensional Fourier space. Then the objective-lens spherical aberration, astigmatism, and defocus are corrected by simply dividing the image wave by the wave aberration function, calculated with the appropriate objective-lens aberration coefficients, which yields the exit-surface wave.
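The second-stage aberration correction can be sketched as follows. This is a minimal illustration under simplifying assumptions: astigmatism is omitted, the aberration phase is taken as the standard isotropic form χ(q) = πλΔf·q² + (π/2)·Cs·λ³·q⁴, and all parameter names and values are illustrative rather than taken from the paper.

```python
import numpy as np

def correct_aberrations(image_wave, pixel_size, wavelength, Cs, defocus):
    """Divide the retrieved complex image wave by the objective-lens phase
    factor exp(-i*chi(q)) in Fourier space to recover the exit-surface wave.
    Units must be consistent (e.g. all lengths in nm)."""
    n = image_wave.shape[0]
    q = np.fft.fftfreq(n, d=pixel_size)            # spatial frequencies, 1/length
    qx, qy = np.meshgrid(q, q, indexing="ij")
    q2 = qx**2 + qy**2
    # Isotropic wave aberration phase (defocus + spherical aberration terms)
    chi = np.pi * wavelength * defocus * q2 + 0.5 * np.pi * Cs * wavelength**3 * q2**2
    transfer = np.exp(-1j * chi)                    # lens phase factor
    wave_fft = np.fft.fft2(image_wave)
    return np.fft.ifft2(wave_fft / transfer)        # exit-surface wave
```

With `Cs = 0` and `defocus = 0` the phase factor is unity and the wave passes through unchanged, which provides a quick sanity check of the transform bookkeeping.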


2003 ◽  
Vol 118 (3) ◽  
pp. 193-196 ◽  
Author(s):  
Jeffrey W McKenna ◽  
Terry F Pechacek ◽  
Donna F Stroup

1971 ◽  
Vol 127 (1) ◽  
pp. 101-105 ◽  
Author(s):  
L. L. Weed
