Adaptive Sampling of the Electrocardiogram Based on Generalized Perceptual Features

Sensors, 2020, Vol 20 (2), pp. 373
Author(s): Piotr Augustyniak

A non-uniform distribution of diagnostic information in the electrocardiogram (ECG) has been commonly accepted and is the background to several compression, denoising and watermarking methods. Gaze tracking is a widely recognized method for identification of an observer’s preferences and interest areas. The statistics of experts’ scanpaths were found to be a convenient quantitative estimate of medical information density for each particular component (i.e., wave) of the ECG record. In this paper we propose the application of generalized perceptual features to control the adaptive sampling of a digital ECG. Firstly, based on the temporal distribution of information density, the local ECG bandwidth is estimated and projected onto the actual positions of components in the heartbeat representation. Next, the local sampling frequency is calculated pointwise and the ECG is adaptively low-pass filtered in all simultaneous channels. Finally, sample values are interpolated at new time positions, forming a non-uniform time series. In the evaluation of perceptual sampling, an inverse transform was used for the reconstruction of a regularly sampled ECG, with a percent root-mean-square difference (PRD) error of 3–5% (for compression ratios of 3.0–4.7, respectively). Nevertheless, tests performed with the CSE Database show good reproducibility of ECG diagnostic features, within the IEC 60601-2-25:2015 requirements, thanks to the distortions occurring in less relevant parts of the cardiac cycle.
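The pipeline described here — estimate a pointwise sampling frequency from the local bandwidth, low-pass filter, then interpolate at the new non-uniform time positions — can be sketched roughly as follows. The bandwidth profile, filter choice, and the use of a single global cutoff (rather than the paper's pointwise adaptive filtering) are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def adaptive_resample(ecg, fs, local_bw):
    """Non-uniformly resample one ECG channel (illustrative sketch).

    ecg      : uniformly sampled signal (1-D array)
    fs       : original sampling rate (Hz)
    local_bw : per-sample estimate of local bandwidth (Hz), e.g. high around QRS
    """
    t = np.arange(ecg.size) / fs
    # 1) local sampling frequency from the bandwidth estimate (Nyquist + margin)
    local_fs = np.clip(2.4 * local_bw, 50.0, fs)
    # 2) low-pass filter; for simplicity a single cutoff at the highest needed bandwidth
    b, a = butter(4, min(local_bw.max(), 0.45 * fs) / (0.5 * fs))
    ecg_lp = filtfilt(b, a, ecg)
    # 3) accumulate non-uniform sample times so that local spacing is 1 / local_fs
    new_t = [0.0]
    while new_t[-1] < t[-1]:
        idx = min(int(new_t[-1] * fs), ecg.size - 1)
        new_t.append(new_t[-1] + 1.0 / local_fs[idx])
    new_t = np.array(new_t[:-1])
    # 4) interpolate the filtered samples at the new time positions
    return new_t, np.interp(new_t, t, ecg_lp)
```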

2002, Vol 14 (2), pp. 187-198
Author(s): N. Sigala, F. Gabbiani, N. K. Logothetis

We investigated the influence of a categorization task on the extraction and representation of perceptual features in humans and monkeys. The use of parameterized stimuli (schematic faces and fish) with fixed diagnostic features, in combination with a similarity-rating task, allowed us to demonstrate perceptual sensitization to the diagnostic dimensions of the categorization task in the monkeys. Moreover, our results reveal important similarities between human and monkey visual subordinate categorization strategies. Neither the humans nor the monkeys compared the new stimuli to class prototypes or based their decisions on conditional probabilities along stimulus dimensions. Instead, they classified each object according to its similarity to familiar members of the alternative categories, or with respect to its position relative to a linear boundary between the learned categories.


2020
Author(s): Shuang Liang, Jiangyuan Zeng, Zhen Li

Evaluating the performance and consistency of passive microwave (PM) sea ice concentration (SIC) products derived from different algorithms is critical, since good knowledge of the quality of satellite SIC products is essential for their application and improvement. To comprehensively evaluate the performance of satellite SIC over a long time series and the whole polar regions (both Arctic and Antarctic), in this study we examined the spatial and temporal distribution of the discrepancy between four PM satellite SIC products and the ERA-Interim sea ice fraction dataset (ERA SIC) during the period 2015-2018. The four PM SIC products, all at a spatial resolution of 12.5 km, are the DMSP SSMIS with the Arctic Radiation and Turbulence Interaction Study Sea Ice (ASI) algorithm (SSMIS/ASI), the GCOM-W AMSR2 with the NASA Bootstrap (BT) algorithm (AMSR2/BT), the Chinese Feng Yun-3B with the enhanced NASA Team (NT2) sea ice algorithm (FY3B/NT2), and the Chinese Feng Yun-3C with NT2 (FY3C/NT2).

The results show that the spatial patterns of the PM SIC products are generally in good agreement with ERA SIC. The comparison of monthly and annual SIC shows that the largest bias and root mean square difference (RMSD) for the PM SIC products occur mainly in summer and in the marginal ice zone, indicating that considerable uncertainties remain in PM SIC products in these periods and regions. Meanwhile, the daily sea ice extent (SIE) and sea ice area (SIA) derived from the four PM SIC products generally reflect the variation trends of SIE and SIA in the Arctic and Antarctic well. The largest biases of SIE and SIA exceed 4×10⁶ km² when sea ice reaches its maximum and minimum values, and the daily biases of SIE and SIA vary seasonally and regionally, being mainly concentrated from June to October in the Arctic. In general, among the four PM SIC products, SSMIS/ASI performs best against ERA SIC, although it usually underestimates SIC with a negative bias. The FY3B/NT2 and FY3C/NT2 products show a larger discrepancy, with higher RMSD and bias in both the Arctic and Antarctic, than SSMIS/ASI and AMSR2/BT. The AMSR2/BT product performs much better in the Antarctic than in the Arctic and consistently overestimates ERA SIC with a positive bias. The consistency of the four PM products with ERA SIC is generally better in the Antarctic than in the Arctic.
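The monthly and annual comparisons rest on two simple grid-cell metrics, bias and RMSD. A minimal sketch of how such gridded comparisons are typically computed (array names, grid size, and the synthetic data are illustrative, not from the study):

```python
import numpy as np

def sic_bias_rmsd(sat_sic, ref_sic, valid_mask=None):
    """Bias and RMSD between a satellite SIC field and a reference SIC field.

    sat_sic, ref_sic : 2-D arrays of sea ice concentration (0-100 %)
    valid_mask       : boolean array selecting the ocean grid cells to compare
    """
    if valid_mask is None:
        valid_mask = np.isfinite(sat_sic) & np.isfinite(ref_sic)
    diff = sat_sic[valid_mask] - ref_sic[valid_mask]
    bias = diff.mean()                      # positive -> satellite overestimates
    rmsd = np.sqrt(np.mean(diff ** 2))      # root mean square difference
    return bias, rmsd

# Synthetic fields standing in for, e.g., AMSR2/BT versus ERA SIC
rng = np.random.default_rng(0)
ref = np.clip(rng.uniform(0, 100, size=(448, 304)), 0, 100)
sat = np.clip(ref + rng.normal(2.0, 5.0, size=ref.shape), 0, 100)  # slight positive bias
print(sic_bias_rmsd(sat, ref))
```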


Author(s): Ethan N. Evans, Patrick Meyer, Samuel Seifert, Dimitri N. Mavris, Evangelos A. Theodorou

Rapidly Exploring Random Trees (RRTs) have gained significant attention due to provable properties such as completeness and asymptotic optimality. However, offline methods are only useful when the entire problem landscape is known a priori. Furthermore, many real-world applications have problem scopes that are orders of magnitude larger than typical mazes and bug traps, and therefore require large numbers of samples to match typical sample densities, resulting in high computational effort for reasonably low-cost trajectories. In this paper we propose an online trajectory optimization algorithm for uncertain large environments using RRTs, which we call the Locally Adaptive Rapidly Exploring Random Tree (LARRT). This is achieved through two main contributions. First, we use an adaptive local sampling region and adaptive sampling scheme that depend on the state of the dynamic system and on observations of obstacles. Second, we propose a localized approach to planning and re-planning by fixing the root node to the current vehicle state and adding tree update functions. LARRT is designed to leverage local problem scope to reduce computational complexity and to obtain a lower total-cost solution compared to a classical RRT with a similar number of nodes. Using this technique, we can ensure that popular variants of RRT remain online even for prohibitively large planning problems, by transforming a large trajectory optimization problem into one that resembles receding horizon optimization. Finally, we demonstrate our approach in simulation and discuss various algorithmic trade-offs of the proposed approach.
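The core idea of drawing RRT samples from an adaptive local region around the current vehicle state can be sketched as follows. The window-sizing rule, obstacle representation, and helper names are illustrative assumptions, not the authors' LARRT implementation:

```python
import random

def sample_local(state, obstacles, base_radius=5.0, growth=1.5):
    """Draw one RRT sample from a local window centred on the current state.

    state     : (x, y) current vehicle position
    obstacles : list of circular obstacles as (ox, oy, radius)
    The window grows when nearby obstacles are dense, so the tree can still
    route around them; this sizing heuristic is illustrative only.
    """
    x, y = state
    # crude obstacle-density measure: obstacles intersecting the base window
    nearby = sum(1 for (ox, oy, r) in obstacles
                 if (ox - x) ** 2 + (oy - y) ** 2 <= (base_radius + r) ** 2)
    radius = base_radius * (1.0 + growth * min(nearby, 5) / 5.0)
    return (x + random.uniform(-radius, radius),
            y + random.uniform(-radius, radius))

# One sampling step around the vehicle state (2, 3) with two circular obstacles
print(sample_local((2.0, 3.0), [(4.0, 3.0, 1.0), (0.0, 5.0, 0.5)]))
```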


Author(s): Tiantian Xie, Marc Olano

Real-time adaptive sampling is a recently proposed technique for efficient importance sampling in real-time Monte Carlo rendering of subsurface scattering. It adaptively places samples based on variance tracking to help escape the uncanny valley of subsurface rendering. However, the occasional performance drop due to temporal lighting dynamics (e.g., guns or lights turning on and off) could hinder adoption in games or other applications where a smooth, high frame rate is preferred. In this paper we propose a novel use of Control Variates (CV) in the sample domain, instead of the shading domain, to maintain a consistently low pass time. Our algorithm seamlessly reduces to diffuse shading with zero scattering samples for sub-pixel scattering. We propose a novel joint-optimization algorithm for sample count and CV coefficient estimation. The main enabler is our novel time-variant covariance updating method, which helps remove the effect of recent temporal dynamics from variance tracking. Since bandwidth is critical in real-time rendering, a solution that does not add any extra textures is also provided.
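The control-variate idea underlying the CV coefficient estimation is standard: given a correlated quantity with a known expectation, variance is reduced by subtracting a scaled version of its deviation, with the scale derived from the covariance. A generic numerical sketch, unrelated to any particular shading code and using an analytic control whose mean is known:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
x = rng.uniform(0.0, 1.0, n)

f = np.exp(x)          # quantity to estimate: E[e^X], X ~ U(0, 1)
g = x                  # control variate with known mean E[X] = 0.5

# Near-optimal CV coefficient: c* = Cov(f, g) / Var(g)
c = np.cov(f, g)[0, 1] / np.var(g)

plain_estimate = f.mean()
cv_estimate = np.mean(f - c * (g - 0.5))

# Both are near e - 1 (about 1.718); the CV estimate has noticeably lower variance
print(plain_estimate, cv_estimate)
```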


PLoS ONE, 2021, Vol 16 (5), pp. e0251521
Author(s): Jun Ruan, Zhikui Zhu, Chenchen Wu, Guanglu Ye, Jingfan Zhou, ...

Pathologists generally pan, focus, zoom and scan tissue biopsies either under microscopes or on digital images for diagnosis. With the rapid development of whole-slide digital scanners for histopathology, computer-assisted digital pathology image analysis has attracted increasing clinical attention, and the working style of pathologists is beginning to change accordingly. Computer-assisted image analysis systems have been developed to help pathologists perform basic examinations. This paper presents a novel lightweight detection framework for automatic tumor detection in whole-slide histopathology images. We develop the Double Magnification Combination (DMC) classifier, a modified DenseNet-40 that makes patch-level predictions with only 0.3 million parameters. To improve detection performance across multiple instances, we propose an improved adaptive sampling method with superpixel segmentation and introduce a new heuristic factor, local sampling density, as the convergence condition of the iterations. In postprocessing, we use a CNN model with 4 convolutional layers to regulate the patch-level predictions based on the predictions of adjacent sampling points, and use linear interpolation to generate a tumor probability heatmap. The entire framework was trained and validated using datasets from the Camelyon16 Grand Challenge and Hubei Cancer Hospital. In our experiments, the average AUC in the test set was 0.95 for pixel-level detection.
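The final step — interpolating sparse patch-level probabilities at adaptively chosen sampling points onto a dense tumor probability heatmap — can be sketched as follows. The coordinates, grid size, and interpolation routine are illustrative assumptions, not the authors' code:

```python
import numpy as np
from scipy.interpolate import griddata

# Sparse sampling points (x, y) in slide coordinates and their tumor probabilities
points = np.array([[10, 12], [40, 18], [25, 60], [70, 75], [55, 30]], dtype=float)
probs = np.array([0.05, 0.90, 0.40, 0.85, 0.10])

# Dense grid over the region of interest
xs, ys = np.meshgrid(np.arange(0, 80), np.arange(0, 80))

# Linear interpolation between sampling points; outside the convex hull fall back to 0
heatmap = griddata(points, probs, (xs, ys), method="linear", fill_value=0.0)
print(heatmap.shape, heatmap.max())
```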


Author(s): D. E. Luzzi, L. D. Marks, M. I. Buckett

As the HREM becomes increasingly used for the study of dynamic localized phenomena, the development of techniques to recover the desired information from a real image is important. Often, the important features scatter only weakly in comparison with the matrix material and are further masked by statistical and amorphous noise. The desired information usually involves accurate knowledge of the position and intensity of the contrast. In order to decipher the desired information from a complex image, cross-correlation (xcf) techniques can be utilized. Unlike other image processing methods which rely on data massaging (e.g., high/low pass filtering or Fourier filtering), the cross-correlation method is a rigorous data reduction technique with no a priori assumptions. We have examined basic cross-correlation procedures using images of discrete Gaussian peaks and have developed an iterative procedure to greatly enhance the capabilities of these techniques when the contrast from the peaks overlaps.
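The basic cross-correlation procedure — locating a weak Gaussian peak in a noisy image by correlating with a peak template and taking the maximum of the xcf — can be sketched as follows. The image size, noise level, and peak position are made up for illustration:

```python
import numpy as np
from scipy.signal import correlate2d

def gaussian_peak(size, sigma):
    """Small 2-D Gaussian template centred in a size-by-size window."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    return np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))

# Noisy "image" containing one weak peak at row 40, column 25
rng = np.random.default_rng(2)
image = rng.normal(0.0, 0.2, size=(64, 64))
image[36:45, 21:30] += gaussian_peak(9, 1.5)

# Cross-correlate with a zero-mean template; the xcf maximum marks the peak position
template = gaussian_peak(9, 1.5)
xcf = correlate2d(image - image.mean(), template - template.mean(), mode="same")
print(np.unravel_index(np.argmax(xcf), xcf.shape))  # ~ (40, 25)
```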


2019, Vol 62 (5), pp. 1486-1505
Author(s): Joshua M. Alexander

Purpose Frequency lowering in hearing aids can cause listeners to perceive [s] as [ʃ]. The S-SH Confusion Test, which consists of 66 minimal word pairs spoken by 6 female talkers, was designed to help clinicians and researchers document these negative side effects. This study's purpose was to use this new test to evaluate the hypothesis that these confusions will increase to the extent that low frequencies are altered. Method Twenty-one listeners with normal hearing were each tested on 7 conditions. Three were control conditions that were low-pass filtered at 3.3, 5.0, and 9.1 kHz. Four conditions were processed with nonlinear frequency compression (NFC): 2 had a 3.3-kHz maximum audible output frequency (MAOF), with a start frequency (SF) of 1.6 or 2.2 kHz; 2 had a 5.0-kHz MAOF, with an SF of 1.6 or 4.0 kHz. Listeners' responses were analyzed using concepts from signal detection theory. Response times were also collected as a measure of cognitive processing. Results Overall, [s] for [ʃ] confusions were minimal. As predicted, [ʃ] for [s] confusions increased for NFC conditions with a lower versus higher MAOF and with a lower versus higher SF. Response times for trials with correct [s] responses were shortest for the 9.1-kHz control and increased for the 5.0- and 3.3-kHz controls. NFC response times were also significantly longer as MAOF and SF decreased. The NFC condition with the highest MAOF and SF had statistically shorter response times than its control condition, indicating that, under some circumstances, NFC may ease cognitive processing. Conclusions Large differences in the S-SH Confusion Test across frequency-lowering conditions show that it can be used to document a major negative side effect associated with frequency lowering. Smaller but significant differences in response times for correct [s] trials indicate that NFC can help or hinder cognitive processing, depending on its settings.
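The signal detection analysis mentioned above typically reduces to converting hit and false-alarm rates into a sensitivity index d′. A generic sketch of that conversion (the counts are made up for illustration and are not from the study):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' from a 2x2 table of response counts.

    Note: perfect hit or false-alarm rates (0 or 1) would need a correction
    before the z-transform; this sketch omits that.
    """
    z = NormalDist().inv_cdf
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return z(hit_rate) - z(fa_rate)

# Illustrative counts: 58/66 hits on [s] trials, 6/66 false alarms on [ʃ] trials
print(d_prime(hits=58, misses=8, false_alarms=6, correct_rejections=60))
```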


Author(s): Martin Chavant, Alexis Hervais-Adelman, Olivier Macherey

Purpose An increasing number of individuals with residual or even normal contralateral hearing are being considered for cochlear implantation. It remains unknown whether the presence of contralateral hearing is beneficial or detrimental to their perceptual learning of cochlear implant (CI)–processed speech. The aim of this experiment was to provide a first insight into this question using acoustic simulations of CI processing. Method Sixty normal-hearing listeners took part in an auditory perceptual learning experiment. Each subject was randomly assigned to one of three groups of 20 referred to as NORMAL, LOWPASS, and NOTHING. The experiment consisted of two test phases separated by a training phase. In the test phases, all subjects were tested on recognition of monosyllabic words passed through a six-channel “PSHC” vocoder presented to a single ear. In the training phase, which consisted of listening to a 25-min audio book, all subjects were also presented with the same vocoded speech in one ear but the signal they received in their other ear differed across groups. The NORMAL group was presented with the unprocessed speech signal, the LOWPASS group with a low-pass filtered version of the speech signal, and the NOTHING group with no sound at all. Results The improvement in speech scores following training was significantly smaller for the NORMAL than for the LOWPASS and NOTHING groups. Conclusions This study suggests that the presentation of normal speech in the contralateral ear reduces or slows down perceptual learning of vocoded speech but that an unintelligible low-pass filtered contralateral signal does not have this effect. Potential implications for the rehabilitation of CI patients with partial or full contralateral hearing are discussed.


1968, Vol 11 (1), pp. 63-76
Author(s): Donald C. Teas, Gretchen B. Henry

The distributions of instantaneous voltage amplitudes in the cochlear microphonic response recorded from a small segment along the basilar membrane are described by computing amplitude histograms. Comparisons are made between the distributions for noise alone and those obtained after adding successively stronger sinusoids to the noise. The amplitudes of the cochlear microphonic response to 5000 Hz low-pass noise are normally distributed in both Turn I and Turn III of the guinea pig’s cochlea. The spectral composition of the microphonic from Turn I and from Turn III resembles the output of band-pass filters set at about 4000 Hz and about 500 Hz, respectively. The normal distribution of cochlear microphonic amplitudes for noise is systematically altered by increasing the strength of the added sinusoid. A decrease of three percent in the number of small amplitude events (±1 standard deviation) in the cochlear microphonic from Turn III is seen when the rms voltage of a 500 Hz sinusoid is at −18 dB re the rms voltage of the noise (at the earphone). When the rms voltages of the sinusoid and noise are equal, the decrease in small voltages is about 25%, but there is also an increase in the number of large voltage amplitudes. Histograms were also computed for the output of an electronic filter with a pass-band similar to that of Turn III of the cochlea. Strong 500 Hz sinusoids showed a greater proportion of large amplitudes in the filter output than in CM III. The data are interpreted in terms of an anatomical substrate.
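The underlying measurement — an amplitude histogram of noise before and after adding a sinusoid, and the fraction of samples within ±1 standard deviation — is easy to sketch. The sampling rate, noise model, and sinusoid level below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(3)
fs, dur = 20_000, 2.0                      # sample rate (Hz) and duration (s)
t = np.arange(int(fs * dur)) / fs

noise = rng.normal(0.0, 1.0, t.size)       # stands in for band-limited noise
tone = 0.5 * np.sin(2 * np.pi * 500 * t)   # 500 Hz sinusoid, rms about -9 dB re the noise

# Amplitude histograms with common bin edges
bins = np.linspace(-4, 4, 81)
h_noise, _ = np.histogram(noise, bins=bins, density=True)
h_mix, _ = np.histogram(noise + tone, bins=bins, density=True)

def frac_within_1sd(x):
    """Fraction of samples within +/- 1 standard deviation of the noise."""
    return np.mean(np.abs(x) <= 1.0)

# Adding the tone reduces the proportion of small-amplitude events
print(frac_within_1sd(noise), frac_within_1sd(noise + tone))
```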


2000, Vol 5 (6), pp. 1-7
Author(s): Christopher R. Brigham, James B. Talmage, Leon H. Ensalada

Abstract The AMA Guides to the Evaluation of Permanent Impairment (AMA Guides), Fifth Edition, is available and includes numerous changes that will affect both the evaluators and the systems that use the AMA Guides. The Fifth Edition is nearly twice the size of its predecessor (613 pages vs 339 pages) and contains three additional chapters (the musculoskeletal system now is split into three chapters and the cardiovascular system into two). Table 1 shows how chapters in the Fifth Edition were reorganized from the Fourth Edition. In addition, each of the chapters is presented in a consistent format, as shown in Table 2. This article and subsequent issues of The Guides Newsletter will examine these changes; the present discussion focuses on major revisions, particularly those in the first two chapters. (See Table 3 for a summary of the revisions to the musculoskeletal and pain chapters.) Chapter 1, Philosophy, Purpose, and Appropriate Use of the AMA Guides, emphasizes objective assessment necessitating a medical evaluation. Most impairment percentages in the Fifth Edition are unchanged from the Fourth because the majority of ratings are currently accepted, there is limited scientific data to support changes, and ratings should not be changed arbitrarily. Chapter 2, Practical Application of the AMA Guides, describes how to use the AMA Guides for consistent and reliable acquisition, analysis, communication, and utilization of medical information through a single set of standards.

