Slow response times: Is it the pneumonia or the physician?

2005 ◽  
Vol 33 (6) ◽  
pp. 1429-1430 ◽  
Author(s):  
Richard G. Wunderink
2010 ◽  
Vol 29 (4) ◽  
pp. 214 ◽  
Author(s):  
Margaret Brown-Sica ◽  
Jeffrey Beall ◽  
Nina McHale

Response time, as defined for this study, is the time that it takes for all files that constitute a single webpage to travel across the Internet from a web server to the end user's browser. In this study, the authors tested response times on queries for identical items in five different library catalogs, one of them a next-generation (NextGen) catalog. The authors also discuss acceptable response time and how it may affect the discovery process. They suggest that librarians and vendors should develop standards for acceptable response time and use them in the product selection and development processes.
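As a rough illustration of the measurement the study describes, the sketch below times a single HTTP fetch in Python. The catalog URL is a hypothetical placeholder, and a faithful replication of the study's definition would also fetch every linked asset (CSS, JavaScript, images) that the page comprises.

```python
import time
import requests  # third-party HTTP client

def page_response_time(url):
    """Rough response-time probe: wall-clock time to fetch a page's HTML.

    Note: this times only the base document; the study's definition
    covers all files that constitute the page.
    """
    start = time.perf_counter()
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return time.perf_counter() - start, len(resp.content)

# Hypothetical catalog query URL, for illustration only.
elapsed, size = page_response_time("https://catalog.example.edu/search?q=hamlet")
print(f"{size} bytes in {elapsed:.2f} s")
```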


1971 ◽  
Vol 23 (1) ◽  
pp. 82-96
Author(s):  
Philip H. K. Seymour

Subjects gave grouped multiple reports of the congruence of each member of arrays of one, two, or three word-shape or shape-shape pairs, and a measure was taken of the time elapsing between onset of the array and initiation of a multiple yes/no report. Double and triple reports were initiated less rapidly than single reports. Replicated arrays, involving repetition of a display pair, gave similar response times for double and triple reports and were classified faster than non-replicated arrays requiring the same overt report. In the case of non-replicated arrays, triple reports were initiated less rapidly than double reports. Both classes of array showed substantial effects of congruence, giving slow response times where all pairs in the array were incongruent or where the left-hand or first-reported display pair was incongruent.


Author(s):  
B U Umar ◽  
O M Olaniyi ◽  
L A Ajao ◽  
D Maliki ◽  
I C Okeke

Democratic governments around the world today rely on electronic voting as the foremost means of providing credible, transparent, and fair elections for the electorate. Electronic voting systems must therefore be security-enhanced to ensure the authenticity of the votes they record. Traditional paper balloting systems suffer from vote tampering, multiple voting, and illegal voting by unregistered voters. They are also susceptible to underage voting because of the difficulty of authenticating the identity of prospective voters. Manual collation and publication of vote results also lead to slow response times and inaccuracies in published results. This paper proposes a system to combat these challenges through the development of a fingerprint biometric authentication system for secure electronic voting machines. It uses a fingerprint biometric sensor, integrated via Python, to verify users of the system; the inclusion of biometrics improves the security features of the system. The secure voting system is built using PHP, and an easy-to-use graphical user interface was designed using HTML and CSS. Users interact with the machine via a 7" touchscreen interface. The results show that the developed machine has a minimum response time of 0.6 seconds for specific operations, a false acceptance rate (FAR) of 2%, a false rejection rate (FRR) of 10%, and an overall system accuracy of 94%. The developed machine addresses the challenge of authenticating users, thereby helping to guarantee the transparency, credibility, integrity, and vote authenticity of elections.
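As a loose sketch of the authentication flow described above (not the authors' implementation), the following Python fragment gates vote casting on a fingerprint match and enforces one vote per voter. The enrolment store, template bytes, and hash-based matcher are hypothetical stand-ins: real fingerprint verification uses an SDK's fuzzy matcher, since two scans of the same finger never yield identical bytes.

```python
import hashlib

# Hypothetical enrolment store mapping voter IDs to hashed fingerprint
# templates, for illustration only.
enrolled = {"VOTER-0001": hashlib.sha256(b"enrolled-template").hexdigest()}
has_voted = set()
ballot = {}

def cast_vote(voter_id, template, candidate):
    """Accept one vote per verified, enrolled voter."""
    if voter_id in has_voted:
        return "rejected: voter has already voted"       # blocks multiple voting
    if enrolled.get(voter_id) != hashlib.sha256(template).hexdigest():
        return "rejected: fingerprint mismatch or unregistered voter"
    ballot[candidate] = ballot.get(candidate, 0) + 1     # automatic collation
    has_voted.add(voter_id)
    return "accepted"
```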


2020 ◽  
Vol 157 ◽  
pp. 103-114 ◽  
Author(s):  
Emir Efendić ◽  
Philippe P.F.M. Van de Calseyde ◽  
Anthony M. Evans

2019 ◽  
Author(s):  
Emir Efendic ◽  
Philippe van de Calseyde ◽  
Anthony M Evans

Algorithms consistently perform well on various prediction tasks, but people often mistrust their advice. Here, we demonstrate one component that affects people's trust in algorithmic predictions: response time. In seven studies (total N = 1928, with 14,184 observations), we find that people judge slowly generated predictions from algorithms as less accurate and are less willing to rely on them. This effect reverses for human predictions, where slowly generated predictions are judged to be more accurate. In explaining this asymmetry, we find that slower response times signal the exertion of effort for both humans and algorithms. However, the relationship between perceived effort and prediction quality differs for humans and algorithms. For humans, prediction tasks are seen as difficult, and effort is therefore positively correlated with the perceived quality of predictions. For algorithms, however, prediction tasks are seen as easy, and effort is therefore uncorrelated with the quality of algorithmic predictions. These results underscore the complex processes and dynamics underlying people's trust in algorithmic (and human) predictions and the cues that people use to evaluate their quality.


2021 ◽  
Author(s):  
Knut Ola Dølven ◽  
Juha Vierinen ◽  
Roberto Grilli ◽  
Jack Triest ◽  
Bénédicte Ferré

Accurate, high-resolution measurements are essential to improve our understanding of environmental processes. Several chemical sensors relying on membrane-separation extraction techniques have slow response times due to a dependence on equilibrium partitioning across the membrane separating the measured medium (i.e., a measuring chamber) and the medium of interest (i.e., a solvent). We present a new technique for deconvolving slow sensor-response signals using statistical inverse theory, applying a weighted linear least-squares estimator with the growth law as the measurement model. The solution is regularized using model sparsity, assuming that changes in the measured quantity occur with a certain time step, which can be selected based on domain-specific knowledge or L-curve analysis. The advantages of this method are that it 1) models error propagation, providing an explicit uncertainty estimate of the response-time-corrected signal, 2) enables evaluation of the solution's self-consistency, and 3) requires only instrument accuracy, response time, and data as input parameters. The functionality of the technique is demonstrated using simulated, laboratory, and field measurements. In the field experiment, the coefficient of determination (R²) of a slow-response methane sensor, compared with an alternative fast-response sensor, improved significantly from 0.18 to 0.91 after signal deconvolution. This shows how the proposed method can open up a considerably wider set of applications for sensors and methods suffering from slow response times due to a reliance on the efficacy of diffusion processes.
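To make the idea concrete, here is a minimal Python sketch of deconvolving a first-order (growth-law) sensor response by regularized linear least squares. It substitutes a simple L2 smoothness penalty for the paper's sparsity-based regularization and omits explicit error propagation, so it illustrates the structure of the inversion rather than the authors' full estimator; `dt`, `tau`, and `lam` are the sample interval, sensor response time, and regularization weight.

```python
import numpy as np

def deconvolve_first_order(y, dt, tau, lam=1.0):
    """Estimate the true signal u from a slow sensor reading y.

    Assumes a first-order (growth-law) sensor model
        y[k] = a * y[k-1] + (1 - a) * u[k],   a = exp(-dt / tau),
    with the sensor initially equilibrated (y[0] = u[0]), and solves
    a regularized linear least-squares problem.
    """
    n = len(y)
    a = np.exp(-dt / tau)

    # Forward model matrix A such that y = A @ u.
    A = np.zeros((n, n))
    A[0, 0] = 1.0
    for k in range(1, n):
        A[k] = a * A[k - 1]   # memory of past inputs decays by factor a
        A[k, k] = 1.0 - a     # contribution of the current input

    # First-difference operator penalizing rapid changes in u
    # (stand-in for the paper's sparsity regularization).
    D = np.diff(np.eye(n), axis=0)

    # Stack the data-fit and regularization blocks and solve.
    lhs = np.vstack([A, lam * D])
    rhs = np.concatenate([y, np.zeros(n - 1)])
    u_hat, *_ = np.linalg.lstsq(lhs, rhs, rcond=None)
    return u_hat
```

Increasing `lam` trades recovery speed for noise suppression; the L-curve analysis mentioned in the abstract is one principled way to choose that trade-off.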


2010 ◽  
Vol 46 (46) ◽  
pp. 8821 ◽  
Author(s):  
Ian Y. Goon ◽  
Leo M. H. Lai ◽  
May Lim ◽  
Rose Amal ◽  
J. Justin Gooding

2000 ◽  
Vol 89 (2) ◽  
pp. 581-589 ◽  
Author(s):  
A. D. Farmery ◽  
C. E. W. Hahn

Tidal ventilation gas-exchange models in respiratory physiology and medicine not only require breath-by-breath solution of mass-balance equations but may also require within-breath measurements, which are instantaneous functions of time. This demands a degree of temporal resolution and fidelity of integration of gas flow and concentration signals that most clinical gas analyzers cannot provide because of their slow response times. We characterized the step responses of the Datex Ultima (Datex Instrumentation, Helsinki, Finland) gas analyzer to oxygen, carbon dioxide, and nitrous oxide in terms of a four-parameter Gompertz sigmoidal function. By inverting this function, we were able to reduce the rise times for all these gases almost fivefold, and, by applying it to real on-line respiratory gas signals, it is possible to achieve performance comparable to the fastest mass spectrometers. With the use of this technique, measurements required for non-steady-state and tidal gas-exchange models can be made easily and reliably in the clinical setting.
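A minimal sketch of the characterization step: fitting a four-parameter Gompertz sigmoid to a recorded step response with SciPy. The parameterization below is one common form of the four-parameter Gompertz and may differ from the authors' exact formulation; the synthetic `t` and `y` arrays stand in for recorded sample times and analyzer readings.

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, y0, a, b, t0):
    """Four-parameter Gompertz sigmoid: baseline y0, amplitude a,
    time scale b, and inflection time t0."""
    return y0 + a * np.exp(-np.exp(-(t - t0) / b))

# Synthetic stand-in for a recorded analyzer step response.
t = np.linspace(0.0, 2.0, 200)                       # sample times (s)
y = gompertz(t, 0.0, 1.0, 0.15, 0.5)                 # "true" response
y += np.random.default_rng(0).normal(0.0, 0.01, t.size)  # measurement noise

# popt characterizes the instrument; the inversion step that sharpens
# the signal would use these fitted parameters.
popt, _ = curve_fit(gompertz, t, y, p0=[0.0, 1.0, 0.1, 0.4])
print(popt)  # fitted [y0, a, b, t0]
```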

