CONDITIONS OF ODOR THRESHOLD DETERMINATION

2.1 Requirements for the test area

Olfactometric measurement should be undertaken in a room or area which is kept free from odors. There should be an atmosphere of comfort and relaxation in the test chamber, which will encourage panel members to concentrate on the testing task and not to be distracted by external stimuli. The test should be carried out at room temperature and normal humidity.

2.2 General conditions for test procedure

Odor measurements must be carried out with the help of a team leader, who instructs the panelists and operates the measuring equipment. Communication between the team leader and the panel has to be kept to an absolute minimum. Because of fatigue, the duration of a test series as well as the time of the whole session should be limited. Breaks of at least the same duration as the preceding test period should be provided.

                          Germany     France         Netherlands     United Kingdom
Panel leader              yes         yes            yes             yes
duration of test series   15-30 min   20 min         15 min
duration of breaks        15-30 min   20 min         ? 5 min         30 min
time of a test period     2 hours     300 tests/day  2 test series
                                                     of 20 tests

Table 1: General conditions

3. DETECTION METHODS

3.1 Presentation of odor stimulus

3.1.1 Method of limits

The most widely used method for establishing an absolute threshold in environmental studies is the Method of Limits. In its classical form, the stimuli are presented in alternating ascending and descending series, starting at different points to avoid having the subject fall into a routine. During this procedure there is a chance that adaptation phenomena may develop. One way to minimize these effects is, for example, to use only an ascending series of stimuli. The threshold value for each separate test series is defined as a point between the last undetected and the first detected point in the stimulus continuum.

A modification of the method of limits is the "up and down" method.
A stimulus is presented: if the response is positive, the next lower stimulus is presented; if it is negative, the next higher is presented, and so on. The primary advantage is that the procedure automatically concentrates near the mean, so a considerable number of observations can be saved.
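As an illustration (not part of the source text), the up-and-down rule can be sketched in a few lines of Python; the `respond` callback, the stimulus `levels`, and the toy panelist are all invented for the example:

```python
def up_and_down(respond, levels, start_index, n_trials=20):
    """Estimate a threshold with the up-and-down (staircase) rule.

    respond(level) -> True if the panelist detects the stimulus.
    levels: stimulus concentrations in ascending order.
    """
    i = start_index
    visited = []
    for _ in range(n_trials):
        visited.append(levels[i])
        if respond(levels[i]):
            i = max(i - 1, 0)                # detected: step down
        else:
            i = min(i + 1, len(levels) - 1)  # missed: step up
    # the presented levels cluster around the threshold; average them
    return sum(visited) / len(visited)

# toy panelist who detects every concentration of 4 or more
levels = [1, 2, 4, 8, 16, 32]
est = up_and_down(lambda c: c >= 4, levels, start_index=5)
```

Because the staircase oscillates around the detection point, most trials are spent near the threshold, which is exactly the economy the text describes.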

2020
Vol 11 (1)
Author(s):
Jing Zhao
Bernd Hamm
Winfried Brenner
Marcus R. Makowski

Abstract Purpose This study aimed to establish an applicable relative ratio threshold value, instead of an absolute threshold value, for simultaneous 68Ga prostate-specific membrane antigen positron emission tomography ([68Ga]Ga-PSMA-11 PET) in patients with prostate cancer (PCa). Materials and methods Our study evaluated thirty-two patients and 170 focal prostate lesions. Lesions were classified into groups according to the Prostate Imaging Reporting and Data System (PI-RADS). Maximum standardized uptake values (SUVmax), the corresponding lesion-to-background ratios (LBRs) of SUVmax, and the LBR distribution of each group were measured based on regions of interest (ROIs). We examined LBR with receiver operating characteristic analysis to determine threshold values for differentiation between multiparametric magnetic resonance imaging (mpMRI)-positive and mpMRI-negative lesions. Results We analyzed a total of 170 focal prostate lesions. The numbers of lesions with PI-RADS scores 2 to 5 were 70, 16, 46, and 38, respectively. The LBRs of SUVmax for PI-RADS scores 2 to 5 were 1.5 (0.9, 2.4), 2.5 (1.6, 3.4), 3.7 (2.6, 4.8), and 6.7 (3.5, 12.7). Based on an optimal threshold ratio of 2.5 to be exceeded, lesions could be classified as MRI-positive on [68Ga]Ga-PSMA PET with a sensitivity of 85.2% and a specificity of 72.0%, with a corresponding area under the receiver operating characteristic curve (AUC) of 0.83, p < 0.001. This value matches the imaging findings better. Conclusion The ratio threshold value of SUVmax, the LBR, has improved clinical and research applicability compared with the absolute value of SUVmax. A threshold value higher than the background uptake matches the imaging findings on MRI better and reduces the bias introduced by using an absolute background uptake value as the threshold.
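The abstract reports a ROC-derived optimal cutoff (LBR = 2.5). One common way to pick such a cutoff is to maximize Youden's J statistic; the sketch below assumes that criterion and uses invented scores and labels, not the study's data:

```python
def youden_threshold(scores, labels):
    """Pick the cutoff maximizing Youden's J = sensitivity + specificity - 1.

    scores: continuous values, e.g. lesion-to-background ratios.
    labels: True for positives, e.g. mpMRI-positive lesions.
    """
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):          # candidate cutoffs
        sens = sum(s >= t for s in pos) / len(pos)
        spec = sum(s < t for s in neg) / len(neg)
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t

# invented ratios: positives tend to exceed negatives
scores = [1.2, 1.8, 2.1, 2.6, 3.4, 4.0, 5.1, 1.5]
labels = [False, False, False, True, True, True, True, False]
cut = youden_threshold(scores, labels)
```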


2010
Vol 19 (8)
pp. 996
Author(s):
Philip E. Higuera
Daniel G. Gavin
Patrick J. Bartlein
Douglas J. Hallett

Over the past several decades, high-resolution sediment–charcoal records have been increasingly used to reconstruct local fire history. Data analysis methods usually involve a decomposition that detrends a charcoal series and then applies a threshold value to isolate individual peaks, which are interpreted as fire episodes. Despite the proliferation of these studies, methods have evolved largely in the absence of a thorough statistical framework. We describe eight alternative decomposition models (four detrending methods used with two threshold-determination methods) and evaluate their sensitivity to a set of known parameters integrated into simulated charcoal records. Results indicate that the combination of a globally defined threshold with specific detrending methods can produce strongly biased results, depending on whether or not variance in a charcoal record is stationary through time. These biases are largely eliminated by using a locally defined threshold, which adapts to changes in variability throughout a charcoal record. Applying the alternative decomposition methods to three previously published charcoal records largely supports our conclusions from simulated records. We also present a minimum-count test for empirical records, which reduces the likelihood of false positives when charcoal counts are low. We conclude by discussing how to evaluate when peak detection methods are warranted with a given sediment–charcoal record.
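A locally defined threshold, as favored above, adapts to changes in variance along the record. As a minimal sketch (not the authors' exact decomposition), a moving-window mean-plus-k-standard-deviations rule over detrended residuals could look like this; the window width, k, and the residuals are all invented:

```python
from statistics import mean, stdev

def local_threshold_peaks(residuals, window=5, k=2.0):
    """Flag indices whose value exceeds a locally defined threshold.

    The threshold at each sample is mean + k*stdev of the other values
    in a centred window, so it adapts to non-stationary variance.
    """
    half = window // 2
    peaks = []
    for i, x in enumerate(residuals):
        lo, hi = max(0, i - half), min(len(residuals), i + half + 1)
        win = residuals[lo:i] + residuals[i + 1:hi]  # exclude the sample itself
        if len(win) < 2:
            continue
        if x > mean(win) + k * stdev(win):
            peaks.append(i)
    return peaks

# invented detrended charcoal residuals with one spike
residuals = [0, 0, 0, 10, 0, 0, 0, 0, 0, 0]
peaks = local_threshold_peaks(residuals)
```

A global threshold computed once over the whole series would use the same cutoff everywhere, which is what biases results when variance drifts over time.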


2015
Vol 28 (7)
pp. 2745-2763
Author(s):
Tosiyuki Nakaegawa
Osamu Arakawa
Kenji Kamiguchi

Abstract The present study investigated the onset and withdrawal dates of the rainy season in Panama by using newly developed, gridded, daily precipitation datasets with a high horizontal resolution of 0.05° based on ground precipitation observations. The onset and withdrawal dates showed very complicated geographical features, although the country of Panama is oriented parallel to the lines of latitude, so the geographical patterns of the onset and withdrawal dates might be expected simply to reflect the latitudinal migration of the intertropical convergence zone, as seen in other regions and countries. An absolute threshold value of 3 mm day−1 (pentad mean precipitation) was used to determine the onset and withdrawal dates. The onset and withdrawal dates obtained from the gridded daily precipitation dataset clearly depicted the migration of the rainy season. The rainy season starts suddenly in pentad 21 (11–15 April) in most of eastern Panama and in pentad 22 (16–20 April) in most of western Panama. The termination of the rainy season begins in Los Santos Province during pentad 67 (27 November–1 December) and expands to both the eastern and western surrounding areas. There is no dry season in the western part of the Caribbean coastal zone. Water vapor fluxes and topography suggest dynamical causes, such as a topographically induced upward mass flux accompanied by high humidity, for the complicated geographical features of the onset and withdrawal dates. An assessment was made of uncertainties in the timing of the onset and withdrawal associated with the definition of these terms.
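The onset criterion described (pentad-mean precipitation reaching 3 mm day−1) can be sketched directly; the helper functions and the toy rainfall record below are illustrative assumptions, not the study's dataset:

```python
def pentad_means(daily_mm):
    """Collapse daily precipitation (mm) into consecutive 5-day means."""
    return [sum(daily_mm[i:i + 5]) / 5.0
            for i in range(0, len(daily_mm) - 4, 5)]

def onset_pentad(daily_mm, threshold=3.0):
    """Return the first pentad (1-based) whose mean rainfall reaches
    the threshold (mm per day), or None if it never does."""
    for p, m in enumerate(pentad_means(daily_mm), start=1):
        if m >= threshold:
            return p
    return None

# invented record: two dry pentads followed by a wet one
rain = [0.2] * 10 + [6.0] * 5
onset = onset_pentad(rain)
```

Withdrawal would be found the same way, scanning backwards for the last pentad at or above the threshold.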


Geophysics
2012
Vol 77 (4)
pp. WB171-WB177
Author(s):
Anders Vest Christiansen
Esben Auken

We tested a new robust concept for calculating the depth of investigation (DOI) that is valid for any 1D electromagnetic (EM) geophysical model. A good estimate of the DOI is crucial when building geologic and hydrological models from EM data sets because the validity of the models varies strongly with data noise and with the resistivity of the layers themselves. For diffusive methods, such as ground-based and airborne electromagnetics, it is not possible to define an unambiguous depth below which there is no information on the resistivity structure; a measure of DOI therefore indicates the depth down to which the model can be considered reliable. The method we presented is based on the actual model output from the inversion process, and we used the actual system response rather than assuming, e.g., plane waves over a homogeneous half-space, as in the widely used skin-depth calculation. Equally important, the data noise and the number of data points are integrated into the calculation. Our methodology is based on a recalculated sensitivity (Jacobian) matrix of the final model, and thus it can be used with any model type for which a sensitivity matrix can be calculated. Unlike other sensitivity-matrix methods, we defined a global and absolute threshold value rather than a relative sensitivity limit (such as 5%). The threshold value applies to all 1D inverted data and thus produces comparable DOI values.
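One plausible reading of the idea, sketched under stated assumptions rather than as the authors' implementation: cumulate the noise-weighted absolute sensitivities of the final model from the bottom layer upward and place the DOI at the shallowest depth where the cumulated sensitivity first reaches the global absolute threshold. All names and numbers below are invented:

```python
def depth_of_investigation(depths, sensitivities, threshold):
    """Place the DOI where sensitivity cumulated from the model bottom
    first reaches a global absolute threshold.

    depths:        layer-top depths in m, ascending.
    sensitivities: noise-weighted absolute sensitivity per layer,
                   e.g. column sums of the final-model Jacobian.
    """
    cum = 0.0
    doi = depths[0]
    for d, s in zip(reversed(depths), reversed(sensitivities)):
        cum += s
        if cum >= threshold:
            doi = d
            break
    return doi

depths = [0, 10, 20, 40, 80, 160]        # m, invented layer tops
sens = [5.0, 3.0, 1.5, 0.6, 0.2, 0.05]   # invented per-layer sensitivities
doi = depth_of_investigation(depths, sens, threshold=0.5)
```

Because the threshold is absolute and the sensitivities are noise-weighted, noisier data or more resistive layers shrink the cumulated sums and pull the DOI upward, matching the behavior described in the abstract.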


2020
pp. 1-5
Author(s):
Anton Telishevskiy
G. V. Papayan
I. A. Chizh
I. A. Vinogradov
...  

Introduction: A sentinel lymph node biopsy (SLNB) is a standard procedure for surgical staging in early-stage breast cancer, avoiding excess lymphadenectomy in most patients. Among the new methods of SLNB is a prospective method based on the use of indocyanine green (ICG) as a contrast medium. Numerous studies assessing the possibility of routine use of the ICG fluorescence method have generally favored the use of ICG. The main issue discussed is whether it can be used alone or in combination with the RI method. In this work, using the specified device, we conducted a study aimed at comparing ICG fluorescence imaging with radioactivity detection methods in order to assess the prospects of the ICG imaging method for SLN detection in early-stage breast cancer. Material and Methods: A prospective, non-randomized, single-center study was conducted at the oncological breast unit of Pavlov State Medical University, Russia. The study included 32 patients aged from 34 to 78 years (median 55.2 years) with breast cancer (cTis-3, N0-2). Four patients underwent surgery after neoadjuvant systemic treatment. 30-45 minutes before the surgery, 2 ml of the ICG solution was additionally administered near the tumor margin via a single skin puncture. The solution was prepared by dissolving 25 mg of ICG in 20% human albumin. The injection site was then massaged for at least 5 minutes. SLN biopsy was performed according to two criteria: the presence of radioactivity in the axillary region, which was monitored using a handheld gamma detector, or ICG fluorescence, the location of which was visualized using the ICG-Scope system. A lymph node was recognized as sentinel if its intensity exceeded the background radioactivity level of 99mTc or exceeded the ICG fluorescence threshold value, which was 1% of the standard sample intensity.


Author(s):  
Stephan Mühlbacher-Karrer
Juliana Padilha Leitzke
Lisa-Marie Faller
Hubert Zangl

Purpose This paper aims to investigate the usability of the non-iterative monotonicity approach for electrical capacitance tomography (ECT)-based object detection. This is of particular importance for object detection in robotic applications. Design/methodology/approach With respect to the detection problem, the authors propose a precomputed threshold value for the exclusion test to speed up the algorithm. Furthermore, they show that an inhomogeneous split-up strategy for the region of interest (ROI) improves the performance of the object detection. Findings The proposed split-up strategy makes it possible to use the monotonicity approach in robotic applications, where the spatial placement of the electrodes is constrained to a planar geometry. Additionally, owing to the improvements in the exclusion tests, the selection of subregions in the ROI allows self-detection to be avoided. Furthermore, the computational costs of the algorithm are reduced by the use of a predefined threshold, while the detection capabilities are not significantly affected. Originality/value The presented simulation results show that the adapted split-up strategies for the ROI significantly improve the detection performance in comparison to the traditional ROI split-up strategy. The monotonicity approach thus becomes applicable to ECT-based object detection in applications where only a reduced number of electrodes with constrained spatial placement can be used, such as in robotics.


Author(s):  
Amro M. Zaki
Sayed A. Nassar
Xianjie Yang

This study develops an analytical formula for determining the minimum initial preload required to prevent the self-loosening of preloaded countersunk fasteners subjected to cyclic transverse loading. The formula is based on mathematical modeling of the self-loosening behavior of the fastener. Accurate prediction of the minimum bolt preload required to prevent loosening would reliably enable the use of that minimum threshold preload as a primary locking feature in critical bolted-joint applications. An experimental setup and test procedure are established to compare the model prediction with the experimental data. The focus of this paper is to investigate the effects of thread pitch, excitation amplitude, and bearing friction coefficient on the threshold value of the bolt preload that prevents loosening.


Talanta
2017
Vol 174
pp. 279-284
Author(s):
Mohamad Izzat Azmer
Fakhra Aziz
Zubair Ahmad
Ehsan Raza
Mansoor Ani Najeeb
...  

Author(s):  
Rebecca A. Embacher
Mark B. Snyder

The hydraulic fracture test was developed under the Strategic Highway Research Program to address the need for a more rapid, less expensive test for concrete aggregate freeze–thaw durability. Although the test concept appeared sound, the original test and analysis procedures were not sufficiently reliable and accurate to merit widespread adoption and implementation. Several follow-up research efforts have been performed, and each has resulted in improvements to the test. The results of the most recent study, which evaluated changes in both the test procedure (to include additional test sieves for better characterization of particle fractures) and the analysis procedures, are described. The “hydraulic fracture index” has been replaced by a model that predicts freeze–thaw test dilation as a function of the distribution of particle mass retained on the test sieves. This model was developed using data obtained from freeze–thaw and hydraulic fracture testing of 18 quarried carbonate and gravel aggregate sources; the resulting correlation is exceptional ( r2 = 0.98). An additional improvement is the development of a large test chamber capable of handling aggregate samples five times larger than the original small chamber, which thereby allows aggregate durability characterization with a single test run. It is believed that the hydraulic fracture test is now ready for more broad-based validation testing and eventual widespread acceptance and implementation as an accurate screening tool for concrete aggregate freeze–thaw durability.
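The replacement model regresses freeze–thaw dilation on the mass retained per test sieve. As a minimal stand-in (ordinary least squares on a single invented predictor, not the published multi-sieve model), a fit and its r^2 can be computed as:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x; returns (a, b, r_squared)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# invented data: dilation (y) versus mass retained on one sieve (x)
a, b, r2 = fit_line([1.0, 2.0, 3.0, 4.0], [2.1, 3.9, 6.0, 8.0])
```

The published model uses the full distribution of particle mass across several sieves as predictors; this sketch only shows the regression-plus-r^2 mechanics.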


2012
Vol 503-504
pp. 1593-1596
Author(s):
Yan Hong Li
Tian Dong Yu
Shu Liang Li

Harmonic pollution is one of the serious problems affecting power system security and stability, and with the development of power electronics it has become increasingly prominent. Harmonic detection is an important part of the harmonic problem and the basis for solving it. Power system harmonics are random and nonstationary and therefore not easy to detect, and in recent years a variety of detection methods have emerged. This paper introduces a technique for detecting harmonics in a power system using a fractal method: the grid fractal dimension of the dynamic waveform is calculated, a threshold value is set, and the time at which waveform distortion occurs is determined. Theoretical analysis and practical measurement show that this method can be used effectively for harmonic detection and meets the requirements for speed and accuracy.
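A minimal sketch of the grid (box-counting) fractal-dimension idea, assuming a simple two-scale count rather than the paper's exact procedure; the waveform and grid sizes are invented:

```python
import math

def box_count_dimension(samples, n_boxes=16):
    """Estimate the grid (box-counting) fractal dimension of a waveform window.

    The window is normalised to the unit square, occupied grid cells
    are counted at two box sizes, and the dimension is the slope of
    log(count) versus log(1/box size).
    """
    lo, hi = min(samples), max(samples)
    span = (hi - lo) or 1.0
    norm = [(s - lo) / span for s in samples]
    counts = []
    for n in (n_boxes, n_boxes * 2):
        cells = {(int(i * n / len(norm)), min(int(y * n), n - 1))
                 for i, y in enumerate(norm)}
        counts.append(len(cells))
    return (math.log(counts[1]) - math.log(counts[0])) / math.log(2)

# a smooth ramp should come out close to dimension 1
ramp = [i / 255 for i in range(256)]
d = box_count_dimension(ramp)
```

In a detection scheme of the kind described, this dimension would be computed over a sliding window and compared against a preset threshold: a distorted waveform fills more grid cells at the finer scale, raising the estimate above that of the undistorted signal.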

