test volume
Recently Published Documents


TOTAL DOCUMENTS

152
(FIVE YEARS 53)

H-INDEX

12
(FIVE YEARS 1)

2021 ◽  
Vol 2108 (1) ◽  
pp. 012078
Author(s):  
Yi Zhou ◽  
Dong Zhang ◽  
Xiaoguang Xu ◽  
Jun Li

Abstract Through a study of the latest radiated disturbance measurement standards, the difference between the test volume and the equipment-under-test (EUT) volume was compared and analyzed. Different types of antennas are used in radiated disturbance measurement depending on the frequency range, and each antenna type has its own directional characteristics. This article analyzes the influence of the antenna half-power beamwidth on the EUT volume, especially its height, across three frequency ranges. Furthermore, taking a typical horn antenna as the starting point and using the formula relating volume to antenna half-power beamwidth, the relationship between equipment volume and frequency is calculated. Finally, the influence of antenna half-power beamwidth on the equipment-under-test volume is obtained.
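The height constraint described above follows from simple beam geometry: the region illuminated within the antenna's half-power beamwidth grows with measurement distance. A minimal sketch (the function name and the 30° / 3 m example values are illustrative assumptions, not figures from the paper):

```python
import math

def eut_max_height(distance_m: float, hpbw_deg: float) -> float:
    """Maximum equipment-under-test height illuminated within the
    antenna's half-power (-3 dB) beamwidth, from simple beam geometry.

    distance_m: antenna-to-EUT measurement distance in metres
    hpbw_deg:   half-power beamwidth in degrees
    """
    half_angle = math.radians(hpbw_deg) / 2.0
    return 2.0 * distance_m * math.tan(half_angle)

# Illustrative: a horn antenna with a 30 deg beamwidth at a 3 m distance
h = eut_max_height(3.0, 30.0)
```

A narrower beamwidth at higher frequencies shrinks this height, which is why the EUT volume depends on the frequency range.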


2021 ◽  
pp. jrheum.210611
Author(s):  
Teresa Carbone ◽  
Valentina Picerno ◽  
Vito Pafundi ◽  
Ernesto Esposito ◽  
Pietro Leccese ◽  
...  

Objective Early diagnosis of autoimmune rheumatic diseases (ARD) is key to achieving effective treatment and improved prognosis. The coronavirus disease 2019 (COVID-19) pandemic has led to major changes in clinical practice on a global scale. We aimed to evaluate the impact of the COVID-19 pandemic on rheumatological clinical practice and autoimmunity testing demand. Methods Data on first rheumatological visits and new diagnoses, together with autoimmunity laboratory testing volumes during the COVID-19 pandemic phase (January-December 2020), were collected from the medical records and laboratory information system (LIS) of a regional reference hospital (Basilicata, Italy) and compared with those from the corresponding period in 2019. Results A significant decrease in the 2020 autoimmunity laboratory test volume was found compared with the same period in 2019 (9912 vs 14100, p<0.05). A significant decrease in first rheumatological visits and diagnoses (1272 vs 2336, p<0.05) was also observed. However, an equivalent or higher percentage of positive autoimmunity results from outpatient services was recorded during 2020 compared with the pre-pandemic state. Of note, the COVID-19-associated decline in new diagnoses mainly affected less severe diseases; in contrast, ARD with systemic involvement were diagnosed at the same levels as in the pre-pandemic period. Conclusion The COVID-19 pandemic has affected access to health services. However, our study highlights that during the outbreak, greater appropriateness of laboratory test requests and visits emerged, as shown by a greater percentage of positive test results and new diagnoses of more severe ARD compared with the pre-pandemic period.


2021 ◽  
Vol 156 (Supplement_1) ◽  
pp. S112-S112
Author(s):  
O Olayinka ◽  
O Odujoko ◽  
S Barasch ◽  
J Farley ◽  
C Woodruff ◽  
...  

Abstract Introduction/Objective We performed a retrospective analysis of clinical pathology test volumes before and during the COVID-19 pandemic to better understand the pandemic's impact on our laboratory utilization. Methods/Case Report The laboratory information system was queried for test order volumes in 2019 and 2020 using Discern Analytics 2.0. Representative tests were selected, including C-reactive protein (CRP), D-dimer, fibrinogen, ferritin, lactate dehydrogenase (LDH), procalcitonin, prothrombin time (PT), point-of-care iSTAT blood gas analysis, ABO and Rhesus typing (ABORh), antibody screening, flow cytometry, and serum protein electrophoresis (SPEP). Data were analyzed using Microsoft Excel 2013. Results (if a Case Study enter NA) The data showed an increase in the number of tests ordered and verified in the inpatient setting. The increase was most substantial for D-dimer, CRP, and LDH, each rising approximately 200% from 2019 to 2020. Increases of 73% and 57% were noted for ferritin and fibrinogen, respectively. A slight decrease in volume was noted for tests ordered in the outpatient setting, including SPEP, during the pandemic. There was no significant change in the number of orders verified for point-of-care iSTAT blood gas testing between 2019 and 2020. Procalcitonin test volume increased steadily from its implementation in May 2020, with a steep rise in November and December. A total of 75,295 SARS-CoV-2 molecular tests were ordered between March and December 2020, with approximately 80% of the orders performed as send-out tests. Conclusion The COVID-19 pandemic has had a substantial impact on laboratory utilization, with significant volume increases in tests that guide the management of hospitalized COVID-19 patients and a slight decrease in tests ordered mostly in the outpatient setting. These results may help guide current and future decisions on laboratory operations during pandemics.
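The year-over-year figures above reduce to a simple relative-change calculation. A minimal sketch, using hypothetical volumes rather than the study's raw counts:

```python
def percent_change(old: float, new: float) -> float:
    """Percentage change in test volume between two periods."""
    return (new - old) / old * 100.0

# Hypothetical example: a test whose orders rose from 1000 to 3000
# shows the kind of ~200% increase reported for D-dimer, CRP, and LDH.
rise = percent_change(1000, 3000)
```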


2021 ◽  
Vol 156 (Supplement_1) ◽  
pp. S130-S130
Author(s):  
A S Maris ◽  
L Tao ◽  
C W Stratton ◽  
R M Humphries ◽  
J E Schmitz

Abstract Introduction/Objective The COVID-19 pandemic exacerbated shortages of testing personnel, reagents, supplies and disposables, instruments, and automation in many clinical laboratories. Upon entering respiratory season, a strategy was warranted to optimize laboratory resources when supplies were already limited and the expected respiratory-season test volume was unknown. An algorithm was devised to prioritize test ordering and turnaround time (TAT) based on the patient's clinical scenario. Methods/Case Report The institutional respiratory-season SARS-CoV-2 algorithm was constructed by a multidisciplinary team including infectious disease, infection prevention, laboratory, and IT/LIS leadership. CDC guidance on influenza testing was incorporated. Antigen-based testing was discontinued; only molecular amplification-based platforms with FDA EUA were utilized. Platforms had a range of TAT (20 minutes to 8 hours) and included fully automated high-throughput, rapid random-access, point-of-care, and CDC SARS-CoV-2 assays. Test bundles included SARS-CoV-2 alone (monoplex), SARS-CoV-2 + flu A&B (triplex), or SARS-CoV-2 + respiratory pathogen panel (multiplex RPP; 22 targets, including flu A&B). Results (if a Case Study enter NA) Key factors in the algorithm included whether the patient was an outpatient or inpatient, a hospital employee or not, symptomatic or not, immunocompetent or immunocompromised, and whether a concurrent order for other respiratory pathogens was included. Clinician responses to these factors determined the type of swab collected (wet swab in VTM or dry swab) and the TAT indicated for a given patient, flagged with a colored-dot sticker system. Priority TAT in decreasing order was symptomatic inpatients, asymptomatic pre-procedure patients, asymptomatic admissions, symptomatic employees, and symptomatic outpatients.
Conclusion An algorithm for respiratory pathogen testing during an unprecedented respiratory season matches result TAT to an individual patient's clinical situation while maximizing laboratory stewardship by eliminating redundant influenza testing and requiring 'all upfront' orders to avoid add-on orders that require 'dumpster diving' for samples. Limitations include inherent differences in sensitivity, limit of detection (LOD), and specificity when multiple platforms are used to detect the same analytes.
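The priority ordering described in the Results can be sketched as a simple lookup. The category names and rank convention below are illustrative assumptions; the TAT hints in the comments reflect the 20-minute-to-8-hour platform range mentioned in the abstract:

```python
# Ranks follow the stated priority order: lower rank = faster required TAT.
TAT_PRIORITY = {
    "symptomatic_inpatient": 1,      # fastest platforms (~20 min)
    "asymptomatic_preprocedure": 2,
    "asymptomatic_admission": 3,
    "symptomatic_employee": 4,
    "symptomatic_outpatient": 5,     # slowest acceptable TAT (~8 h)
}

def triage(patient_categories):
    """Sort patient categories from fastest to slowest required TAT."""
    return sorted(patient_categories, key=TAT_PRIORITY.get)
```

In practice the colored-dot sticker on each specimen would encode this rank for the bench staff.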


2021 ◽  
Author(s):  
Emmanuel Udofia

Abstract Well testing is the process of measuring the volumes of oil, water, and gas produced from a well in order to establish the well's current state. Among other things, well testing provides information for effective Well, Reservoir and Facility Management. Normally, as a well-performance health check, a reconciliation factor (RF) is generated by comparing the fiscal production volume against the theoretical well test volume. Experience from the coronavirus pandemic has brought a new normal to well test execution. In the deepwater environment the process of well testing is more challenging, and this paper aims to address these challenges and propose an optimum well test frequency for deepwater operations. Routine well tests are usually required once every month on all flowing strings, for statutory compliance and well health-check purposes. However, in the deepwater environment it is difficult to comply with this periodic requirement, mainly due to production flowline slugging and plant process upsets and/or trips, which result in production deferment and operational risk exposure. Furthermore, carrying out a well test in deepwater operations requires a production cutback for flow-assurance purposes, which usually results in substantial production deferment. In the field of interest, this challenge has been managed by deploying a data-driven application that monitors production on individual flowing strings in real time, thereby optimizing the well test frequency for every flowing well. Varying-rate well test data are captured and used to calibrate the tool for subsequent real-time production monitoring. This initiative addresses the challenges above while optimizing test frequency, with the intelligent application serving as a 'virtual meter' that tests all producing wells in real time.
As noted, well testing in most deepwater assets remains a major challenge, but this project-based field experience has delivered effective well testing operations, reducing production deferment and safety exposure during plant trips while optimizing the frequency of testing the wells. Following the optimization of well testing to a quarterly frequency in this Nigerian deepwater field, the recommendations in this paper should assist other deepwater operators in managing routine well testing optimally.
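The reconciliation factor mentioned above is the ratio of fiscal (metered) production to the sum of theoretical per-well test volumes. A minimal sketch, with illustrative names and numbers (not field data):

```python
def reconciliation_factor(fiscal_volume: float, well_test_volumes: list) -> float:
    """Reconciliation factor (RF): fiscal production volume divided by
    the total theoretical volume implied by individual well tests.
    An RF near 1.0 indicates the well tests still describe the wells'
    actual performance."""
    theoretical_total = sum(well_test_volumes)
    return fiscal_volume / theoretical_total

# Illustrative: three wells tested at 500, 300, and 200 bbl/d against
# a fiscal meter reading of 950 bbl/d.
rf = reconciliation_factor(950.0, [500.0, 300.0, 200.0])
```

A drifting RF is the usual trigger for retesting a well; the 'virtual meter' approach described above instead flags drift per string in real time.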


Author(s):  
Eric Stephen Kilpatrick ◽  
Elicia Ginn ◽  
Ben Lee

Background: Repeated phlebotomy for laboratory diagnostic testing is a known cause of iatrogenic anaemia and, in critically ill neonates, often makes blood transfusion necessary. This study developed a spreadsheet clinical decision support (CDS) tool that allows neonatal staff to determine the true minimum blood volume (MBV) required to analyse groups of blood tests, and modelled its potential benefit against the existing system in use. Methods: The tool calculates the MBV accounting for novel factors including the patient's current haematocrit for plasma/serum samples, instrument minimum test and dead volumes (including those that are shared), and sharing of samples within and between laboratory departments. A year of neonatal unit laboratory requests was examined, comparing the blood volumes and containers recommended by the hospital information system (HIS) with both the amount actually collected by staff and that recommended by the tool. Results: 463 patients had 8,481 blood draws for 23,899 tests or test profiles over the year. The HIS recommended collecting 11,222 mL of blood into 18,509 containers, while 17,734 containers were actually received (10,717 mL if fully filled). The tool recommended collecting 4,915 mL of blood into 15,549 containers. Conclusions: This tool allows NICU staff to objectively determine the MBV required for a combination of tests and is generalisable between laboratory instruments. Compared with the HIS, use of the MBV-CDS tool could reduce the volume of blood collected from this neonatal unit by more than half. NICU staff had apparently already gone some way towards determining their own minimum required volumes.
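One of the novel factors above, the patient's haematocrit, matters because only the plasma fraction of a whole-blood draw is available to plasma/serum assays. A minimal sketch of that single adjustment (function name and values are illustrative; the actual tool also handles shared dead volumes and inter-departmental sample sharing):

```python
def min_whole_blood_volume(plasma_needed_ul: float,
                           dead_volume_ul: float,
                           haematocrit: float) -> float:
    """Whole blood required to yield a given plasma/serum volume plus the
    instrument dead volume, assuming the plasma fraction of whole blood
    is roughly (1 - haematocrit)."""
    return (plasma_needed_ul + dead_volume_ul) / (1.0 - haematocrit)

# Illustrative: 100 uL of plasma plus a 50 uL dead volume at a
# haematocrit of 0.5 requires 300 uL of whole blood.
vol = min_whole_blood_volume(100.0, 50.0, 0.5)
```

The high haematocrits typical of neonates make this correction much larger than for adult samples, which is why a fixed HIS recommendation over-collects.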


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Tobias M. Holden ◽  
Reese A. K. Richardson ◽  
Philip Arevalo ◽  
Wayne A. Duffus ◽  
Manuela Runge ◽  
...  

Abstract Background Availability of SARS-CoV-2 testing in the United States (U.S.) has fluctuated through the course of the COVID-19 pandemic, including in the U.S. state of Illinois. Despite a substantial ramp-up in test volume, access to SARS-CoV-2 testing remains limited, heterogeneous, and insufficient to control spread. Methods We compared SARS-CoV-2 testing rates across geographic regions, over time, and by demographic characteristics (i.e., age and racial/ethnic groups) in Illinois from March through December 2020. We compared age-matched case fatality ratios and infection fatality ratios through time to estimate the fraction of SARS-CoV-2 infections that was detected through diagnostic testing. Results By the end of 2020, initial geographic differences in testing rates had closed substantially. Case fatality ratios were higher in non-Hispanic Black and Hispanic/Latino populations in Illinois than in non-Hispanic White populations, suggesting that testing was insufficient to capture the true burden of COVID-19 disease in minority populations during the initial epidemic wave. While testing disparities decreased during 2020, Hispanic/Latino populations consistently remained the least tested, at 1.87 tests per 1000 population per day compared with 2.58 and 2.87 for non-Hispanic Black and non-Hispanic White populations, respectively, at the end of 2020. Despite a large expansion in testing since the beginning of the first epidemic wave, we estimated that over half (50–80%) of all SARS-CoV-2 infections were not detected by diagnostic testing and continued to evade surveillance.
Conclusions Systematic methods for identifying relatively under-tested geographic regions and demographic groups may enable policymakers to regularly monitor and evaluate the shifting landscape of diagnostic testing, allowing officials to prioritize allocation of testing resources to reduce disparities in COVID-19 burden and eventually reduce SARS-CoV-2 transmission.


Author(s):  
Michael G. Chapman ◽  
Megna N. Shah ◽  
Sean P. Donegan ◽  
J. Michael Scott ◽  
Paul A. Shade ◽  
...  

Abstract High-energy diffraction microscopy (HEDM) in-situ mechanical testing experiments offer unique insight into the evolving deformation state within polycrystalline materials. These experiments rely on a sophisticated analysis of the diffraction data to instantiate a 3D reconstruction of grains and other microstructural features within the test volume. For microstructures of engineering alloys that are highly twinned and contain numerous features near the estimated spatial resolution of HEDM reconstructions, the accuracy of the reconstructed microstructure is not known. In this study, we address this uncertainty by characterizing the same HEDM sample volume using destructive serial sectioning (SS), which has higher spatial resolution. The SS experiment was performed on an Inconel 625 alloy sample that had undergone HEDM in-situ mechanical testing to a small amount of plastic strain (~0.7%) as part of the Air Force Research Laboratory Additive Manufacturing (AM) Modeling Series. A custom-built automated multi-modal SS system was used to characterize the entire test volume at a spatial resolution of approximately 1 µm. Epi-illumination optical microscopy images, backscattered electron images, and electron backscattered diffraction maps were collected on every section. All three data modes were utilized, and custom data fusion protocols were developed for 3D reconstruction of the test volume. The grain data were homogenized and downsampled to 2 µm as input for Challenge 4 of the AM Modeling Series, which is available at the Materials Data Facility repository.
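The final homogenization step, downsampling the 1 µm grain map to 2 µm, can be sketched as nearest-neighbour decimation of a voxelised grain-ID array. This is a simplification under stated assumptions: the study's custom fusion protocols are more involved, and the array here is synthetic:

```python
import numpy as np

def downsample_grain_ids(grain_ids: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour downsampling of a 3D voxelised grain-ID map:
    keep every `factor`-th voxel along each axis (IDs are categorical,
    so averaging would be meaningless)."""
    return grain_ids[::factor, ::factor, ::factor]

# Synthetic 4x4x4 volume of voxel labels at 1 um; factor 2 gives 2 um voxels.
vol = np.arange(4 * 4 * 4).reshape(4, 4, 4)
small = downsample_grain_ids(vol, 2)   # shape (2, 2, 2)
```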

