performance figure — Recently Published Documents

Total documents: 11 (last five years: 4) · h-index: 3 (last five years: 1)

2021 ◽  
Vol 118 (37) ◽  
pp. e2022194118
Author(s):  
Abhishek Roy ◽  
Surendar R. Venna ◽  
Gerard Rogers ◽  
Li Tang ◽  
Thomas C. Fitzgibbons ◽  
...  

In the next decade, separation science will be an important research area for addressing complex challenges such as reducing carbon footprints, lowering energy costs, and simplifying industrial processes. In industrial chemical processes, particularly petrochemical operations, separation and product-refining steps account for up to 30% of energy use and 30% of capital cost. Membrane and adsorption technologies are being actively studied as alternatives to, or partial replacements for, state-of-the-art cryogenic distillation systems. This paper provides an industrial perspective on the application of membranes in industrial petrochemical cracker operations. A gas separation performance figure of merit for propylene/propane separation is also reported for different classes of materials, spanning inorganic, carbon, polymeric, and facilitated-transport membranes. An in-house–developed model provided insights into the influence of operational parameters on the overall membrane design.
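A common figure of merit for such membranes combines the permeability of the fast gas with the ideal selectivity, i.e., the ratio of pure-gas permeabilities. A minimal sketch of that calculation, using illustrative permeability values that are not taken from the paper:

```python
# Ideal propylene/propane selectivity for hypothetical membrane
# materials. The permeability values (in Barrer) are illustrative
# placeholders, not data from the paper.
materials = {
    "polymeric": {"P_C3H6": 10.0, "P_C3H8": 0.8},
    "carbon_molecular_sieve": {"P_C3H6": 50.0, "P_C3H8": 1.0},
    "facilitated_transport": {"P_C3H6": 200.0, "P_C3H8": 4.0},
}

def ideal_selectivity(p_fast, p_slow):
    """Ideal (pure-gas) selectivity: ratio of the permeability of the
    faster-permeating gas to that of the slower one."""
    return p_fast / p_slow

for name, p in materials.items():
    alpha = ideal_selectivity(p["P_C3H6"], p["P_C3H8"])
    print(f"{name}: alpha = {alpha:.1f}")
```

In practice the trade-off is read off a Robeson-style plot of selectivity versus fast-gas permeability, which is what a performance figure of merit summarizes.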


2021 ◽  
Author(s):  
Neil Jethani ◽  
Hao Zhang ◽  
Larry Chinitz ◽  
Yindalon Aphinyanaphongs ◽  
Rajesh Ranganath ◽  
...  

Background: Drug-induced QTc prolongation (diQTP) is frequent and associated with a risk of sudden cardiac death. Identifying patients at risk of diQTP can enhance monitoring and treatment plans. Objective: To develop a machine learning architecture for prediction of extreme diQTP (QTc > 500 ms or ΔQTc > 60 ms) at the onset of treatment with a QTc-prolonging drug. Methods: We included 4,628 adult patients who received a baseline ECG within 6 months prior to treatment onset with a QTc-prolonging drug and a follow-up ECG after the fifth dose. We collected known clinical QTc-prolongation risk factors (CF). We developed a novel neural network architecture (QTNet) to predict diQTP from both the CF and baseline ECG data (ECGD), composed of both the ECG waveform and measurements (i.e., QTc), by fusing a state-of-the-art convolutional neural network that processes raw ECG waveforms with the CF via a multi-layer perceptron. We fit a logistic regression model using the CF, replicating RISQ-PATH, as Baseline. We further compared the performance of QTNet (CF+ECGD) to neural network models trained using three variable subsets: a) baseline QTc (QTC-NN), b) CF-NN, and c) ECGD-NN. Results: diQTP was present in 1,030 patients (22.3%), of whom baseline QTc was normal (QTc < 450 ms for males / < 470 ms for females) in 405 patients (39.3%). QTNet achieved the best performance (Figure 1) (AUROC, 0.802 [95% CI, 0.782-0.820]), outperforming the Baseline (AUROC, 0.738 [95% CI, 0.716-0.757]), QTC-NN (AUROC, 0.735 [95% CI, 0.710-0.757]), CF-NN (AUROC, 0.778 [95% CI, 0.757-0.799]), and ECGD-NN (AUROC, 0.774 [95% CI, 0.750-0.794]). Conclusion: We developed QTNet, the first deep learning model for predicting extreme diQTP, outperforming models trained on known clinical risk factors.
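The AUROC metric used to compare these models can be computed directly from predictions via the rank-based (Mann-Whitney) formulation: the probability that a randomly chosen positive case is scored above a randomly chosen negative one, with ties counted as one half. A minimal pure-Python sketch (the labels and scores below are illustrative, not study data):

```python
def auroc(labels, scores):
    """AUROC via the Mann-Whitney formulation: fraction of
    (positive, negative) pairs where the positive is scored higher,
    counting ties as 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Illustrative example: 3 positives, 3 negatives.
labels = [1, 1, 0, 0, 1, 0]
scores = [0.9, 0.5, 0.6, 0.2, 0.8, 0.4]
print(auroc(labels, scores))
```

The quadratic pairwise loop is fine for illustration; production implementations use a sort-based O(n log n) computation instead.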


2019 ◽  
Vol 7 (7_suppl5) ◽  
pp. 2325967119S0039
Author(s):  
Ayoosh Pareek ◽  
Chad W. Parkes ◽  
Alexey A. Leontovich ◽  
Christopher D. Bernard ◽  
Aaron John Krych ◽  
...  

Objectives: Traditional pitching statistics (ERA, WHIP, etc.) have been used as surrogates for pitcher performance without being validated. Even among healthy pitchers, the normal variability of these parameters has not yet been established. The purpose of this study was to determine the normal variability of basic and advanced pitching statistics in non-injured Major League Baseball (MLB) pitchers. It is our hope that this work will serve as the foundation for the identification and implementation of validated, pitcher-dependent statistical measures that can be used to assess return-to-play performance following injury. Methods: Publicly available data from the MLB Statcast and Pitch/Fx databases were used to analyze all non-injured MLB pitchers who pitched more than 100 innings in each of the 2015 and 2016 seasons without injury. Traditional and advanced baseball pitching statistics were analyzed. The variability of each parameter was assessed by computing the coefficient of variation (CV) between individual pitchers and across all pitchers. A CV below 10 is typically indicative of a relatively constant parameter, and parameters with a CV > 10 are generally considered inconsistent and unreliable. Results: A total of 118 pitchers met all inclusion criteria. For each of these healthy pitchers, 38 basic/traditional parameters and 17 advanced parameters were analyzed. Of the traditional pitcher statistics, only 1 (3%) demonstrated a CV value < 10 (average fastball velocity [FBv]; CV 1.5) (Figure 1). Among advanced statistics, 9 of 17 (53%) variables demonstrated acceptable consistency as evidenced by a CV value < 10 (Figure 2). Release position relative to the plate (release_pos_y) and velocity toward the plate (vy0) were the two most constant advanced parameters. When separated by pitch type, these two parameters were the most constant (lowest CV) for every pitch type.
Conclusion: The validity and variability of baseball statistics as surrogate markers for performance after injury/surgery have not yet been evaluated. It is critical that baseball statistics undergo proper vetting prior to being used to assess recovery. This study reveals average fastball velocity and release position from the plate to be the least variable basic and advanced baseball statistics in MLB pitchers. In total, only 10 of the 55 statistics analyzed demonstrated acceptable consistency and reliability. This study can further be used to determine the minimum time that each of these variables needs to be followed to ensure an appropriate sample size is obtained to detect significant differences in pre- and post-injury performance.
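The coefficient of variation used throughout this study is the sample standard deviation divided by the mean, expressed as a percentage. A short sketch with made-up per-start fastball velocities (not study data):

```python
import statistics

def coefficient_of_variation(values):
    """CV as a percentage: (sample standard deviation / mean) * 100.
    The study treats CV < 10 as acceptably consistent."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Illustrative per-start average fastball velocities (mph) for one pitcher.
fbv = [94.1, 93.8, 94.5, 94.0, 93.9]
cv = coefficient_of_variation(fbv)
print(f"CV = {cv:.2f} -> {'consistent' if cv < 10 else 'inconsistent'}")
```

Note that CV is only meaningful for ratio-scale quantities with a nonzero mean, which holds for velocities and release positions but would not for statistics that can be zero or negative.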


Materials ◽  
2019 ◽  
Vol 12 (13) ◽  
pp. 2040 ◽  
Author(s):  
Muhammad Siyar ◽  
Jun-Young Cho ◽  
Woo-Chan Jin ◽  
Euy Heon Hwang ◽  
Miyoung Kim ◽  
...  

Heavily doped degenerate semiconductors such as Cu2SnSe3 (CTSe) have attracted attention in the thermoelectric (TE) and optoelectronic fields due to their high electrical conductivity and small band gap. The small Seebeck coefficient of undoped CTSe, however, is the major obstacle to achieving high TE performance (figure of merit, ZT). Here, we report that the Seebeck coefficient of CTSe can be controlled by adding SnS within a CTSe matrix. The CTSe-SnS composite not only has a high Seebeck coefficient in the range of 300–500 µV/K but also a lower thermal conductivity than pristine CTSe, owing to scattering at the interfaces between the matrix and the SnS particles. A reasonable ZT of 0.18 is achieved at 570 K by adding a small amount (3 wt.%) of SnS to the CTSe matrix.
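The thermoelectric figure of merit referenced here is ZT = S²σT/κ, where S is the Seebeck coefficient, σ the electrical conductivity, T the absolute temperature, and κ the thermal conductivity. A quick sketch: S and T come from the abstract, while σ and κ are illustrative placeholders chosen so the result lands near the reported value, not numbers from the paper.

```python
def figure_of_merit(seebeck_V_per_K, sigma_S_per_m, kappa_W_per_mK, T_K):
    """Dimensionless thermoelectric figure of merit: ZT = S^2 * sigma * T / kappa."""
    return seebeck_V_per_K ** 2 * sigma_S_per_m * T_K / kappa_W_per_mK

S = 400e-6      # V/K, within the 300-500 uV/K range in the abstract
T = 570.0       # K, temperature at which ZT = 0.18 is reported
sigma = 1000.0  # S/m, assumed (not reported in the abstract)
kappa = 0.5     # W/(m*K), assumed (not reported in the abstract)

print(f"ZT = {figure_of_merit(S, sigma, kappa, T):.2f}")
```

The formula makes the trade-off explicit: interface scattering that lowers κ raises ZT, provided σ and S are not degraded as much in the process.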


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Ki-Hyun Kim ◽  
Jan E. Szulejko ◽  
Yong-Hyun Kim ◽  
Min-Hee Lee

Relative performance figures of merit were investigated for the two most common analytical methods employed for carbonyl compounds (CC): high-performance liquid chromatography (HPLC) with UV detection (with 2,4-dinitrophenylhydrazine (DNPH) derivatization) and thermal desorption (TD)-gas chromatography (GC)-mass spectrometry (MS) (without derivatization). To this end, the suitability of each method is assessed by computing the relative recovery (RR) between gas- and liquid-phase standards containing a suite of CC, namely formaldehyde (FA), acetaldehyde (AA), propionaldehyde (PA), butyraldehyde (BA), isovaleraldehyde (IA), and valeraldehyde (VA), along with benzene (B) as a recovery reference for the GC method. The results confirm that TD-GC-MS is advantageous for attaining maximum recovery of the heavier CCs (i.e., those with molecular weight (MW) ≥ 74, butyraldehyde and heavier). On the other hand, HPLC-UV is favorable for the lighter CCs (such as FA and AA) with the least bias. Such compound-specific responses for each platform are validated by the relative ordering of CCs as a function of response factor (RF), method detection limit (MDL), and recovery pattern. It is thus desirable to understand the advantages and limitations of each method to attain CC data with the least experimental bias.
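The comparison hinges on relative recovery between gas- and liquid-phase standards. A minimal sketch, assuming RR is the plain percentage ratio of the two responses (the paper's exact normalization, e.g., against the benzene reference, may differ), with made-up detector responses:

```python
def relative_recovery(gas_response, liquid_response):
    """Relative recovery (RR, %): gas-phase standard response relative
    to the liquid-phase standard response. A plain ratio is assumed
    here; the paper's normalization may differ."""
    return gas_response / liquid_response * 100.0

# Illustrative (made-up) responses for a light and a heavy carbonyl.
responses = {
    "formaldehyde (FA)": {"gas": 0.62, "liquid": 1.00},
    "valeraldehyde (VA)": {"gas": 0.97, "liquid": 1.00},
}
for compound, r in responses.items():
    print(f"{compound}: RR = {relative_recovery(r['gas'], r['liquid']):.0f}%")
```

An RR near 100% indicates the method recovers the compound from the gas phase as efficiently as from the reference liquid standard; large deviations flag a compound-specific bias.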


Author(s):  
Bilal Succar

Building Information Modelling (BIM) is an expanding collection of concepts and tools which have been attributed with transformative capabilities within the Architecture, Engineering, Construction and Operations (AECO) industry. BIM discussions have grown to accommodate increasing software capabilities, widely varied deliverables, and competing standards emanating from an abundance of overlapping definitions attempting to delineate the BIM term. This chapter steers away from providing its own definition of BIM but concurs with those identifying it as a catalyst for change (Bernstein, 2005) poised to reduce the industry's fragmentation (CWIC, 2004), improve its efficiency (Hampson & Brandon, 2004) and lower its high costs of inadequate interoperability (NIST, 2004). In essence, BIM represents an array of possibilities and challenges which need to be understood and met, respectively, through a measurable and repeatable approach. This chapter briefly explores the multi-dimensional nature of the BIM domain and then introduces a knowledge tool to assist individuals, organisations and project teams to assess their BIM capability and maturity and to improve their performance (Figure 1). The first section introduces BIM Fields and Stages, which lay the foundations for measuring capability and maturity. Section 2 introduces BIM Competencies, which can be used as active implementation steps or as performance assessment areas. Section 3 introduces an Organisational Hierarchy/Scale suitable for tailoring capability and maturity assessments according to markets, industries, disciplines and organisational sizes. Section 4 explores the concepts behind 'capability maturity models' and then adopts a five-level BIM-specific Maturity Index (BIMMI). Section 5 introduces the BIM Maturity Matrix (BIm³), a performance measurement and improvement tool which identifies the correlation between BIM Stages, Competency Sets, Maturity Levels and Organisational Scales.
Finally, Section 6 introduces a Competency Granularity Filter which enables the tailoring of BIM tools, guides and reports according to four different levels of assessment granularity.


1997 ◽  
Vol 44 (11) ◽  
pp. 1888-1895 ◽  
Author(s):  
R.A. Chapman ◽  
T.C. Holloway ◽  
V.M. McNeil ◽  
A. Chatterjee ◽  
G.E. Stacey

Author(s):  
C.E. Fiori ◽  
C.R. Swyt

Recently, important improvements have been achieved in both the resolution of energy dispersive spectrometers (EDS) and their count rate performance. Figure 1 is a plot of count rate performance versus resolution for one EDS system. This particular system was chosen for our study because data were available for six pulse processing times. It can be seen that there is a direct trade-off between count rate performance and energy resolution. The purpose of this study is to examine the consequences of the various pulse processor conditions for quantitative analysis. A number of analytical problems were simulated, including various combinations of acquisition times, degrees of spectral overlap, and relative peak heights.
We provide here one example from the study, the analysis of PbS, which presents a difficult spectral overlap of the Pb M lines and the S K lines. We used the recently available computer program DTSA (Desk Top Spectrum Analyzer) to simulate 1,000 spectra for each of the pulse processor resolution conditions shown in Figure 1. A linear least squares procedure was used to determine the peak areas.
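The linear least squares step can be illustrated with a toy version of the PbS problem: when the reference line shapes are known, the spectrum is a linear combination of them, and the amplitudes follow from the normal equations. A self-contained sketch; the peak positions are the approximate Pb Mα (2.345 keV) and S Kα (2.307 keV) energies, while the detector width, amplitudes, and energy grid are illustrative assumptions:

```python
import math

def gaussian(x, mu, sigma):
    """Unit-height Gaussian line shape."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Energy axis (keV) and reference shapes at a deliberately coarse
# resolution so the two lines overlap strongly.
xs = [2.0 + i * 0.002 for i in range(350)]
pb_shape = [gaussian(x, 2.345, 0.060) for x in xs]  # Pb M-alpha
s_shape = [gaussian(x, 2.307, 0.060) for x in xs]   # S K-alpha

# Simulated spectrum: a known (noise-free) mixture of the two shapes.
a_pb_true, a_s_true = 3.0, 1.5
spectrum = [a_pb_true * p + a_s_true * s for p, s in zip(pb_shape, s_shape)]

# Linear least squares for the two amplitudes via the 2x2 normal equations.
spp = sum(p * p for p in pb_shape)
sps = sum(p * s for p, s in zip(pb_shape, s_shape))
sss = sum(s * s for s in s_shape)
spy = sum(p * y for p, y in zip(pb_shape, spectrum))
ssy = sum(s * y for s, y in zip(s_shape, spectrum))
det = spp * sss - sps * sps
a_pb = (sss * spy - sps * ssy) / det
a_s = (spp * ssy - sps * spy) / det
print(f"recovered amplitudes: Pb = {a_pb:.3f}, S = {a_s:.3f}")
```

With noise added (as in the 1,000 simulated spectra per condition), the same normal equations yield amplitude estimates whose scatter grows as the resolution worsens and the shapes become more collinear, which is exactly the trade-off the study quantifies.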

