unfiltered data
Recently Published Documents

TOTAL DOCUMENTS: 9 (FIVE YEARS: 4)
H-INDEX: 4 (FIVE YEARS: 0)

2021 ◽ Vol 8 (1) ◽ Author(s): Alina van de Burgt, Petra Dibbets-Schneider, Cornelis H. Slump, Arthur J. H. A. Scholte, Douwe E. Atsma, ...

Abstract
Background: Quantitative SPECT enables absolute quantification of uptake in perfusion defects. The aim of this experimental study is to assess the quantitative accuracy and precision of a novel iterative reconstruction technique (Evolution; GE Healthcare) for the potential application of response monitoring using 99mTc-tetrofosmin SPECT/CT in patients with coronary artery disease (CAD).
Methods: Acquisitions of an anthropomorphic torso phantom with a cardiac insert containing defects of varying sizes, filled with 99mTc-pertechnetate, were performed on a SPECT/CT system (Discovery 670 Pro, GE Healthcare). Subsequently, volumes of interest of the defects were manually drawn on CT to assess the recovery coefficient (RC). Bull’s eye plots were composed to evaluate the uptake per segment. Finally, 99mTc-tetrofosmin SPECT/CT scans of 10 CAD patients were used to illustrate the clinical application.
Results: The phantom study indicated that Evolution converged after 7 iterations and 10 subsets. The average repeatability deviation of all configurations was 2.91% and 3.15% (%SD mean) for filtered (Butterworth) and unfiltered data, respectively. The accuracy after post-filtering was lower than for the unfiltered data, with a mean (SD) RC of 0.63 (0.05) versus 0.70 (0.07), respectively (p < 0.05). More artificial defects were found on Bull’s eye plots created with the unfiltered data than with the filtered data. Eight out of ten patients showed significant changes in uptake before and after treatment (p < 0.05).
Conclusion: Quantification of 99mTc-tetrofosmin SPECT/CT seems feasible for CAD patients when 7 iterations (10 subsets), Butterworth post-filtering (cut-off frequency 0.52 cycles/cm, order 5) and manual CT delineation are applied. However, future prospective patient studies are required for clinical application.
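For readers who want to see what the recovery coefficient computation looks like in practice, here is a minimal Python/NumPy sketch (not the authors' code; the array names, the cubic VOI and the true concentration of 100 kBq/ml are illustrative assumptions): the RC is simply the mean measured activity concentration inside the CT-delineated volume of interest divided by the known true concentration in the phantom.

import numpy as np

def recovery_coefficient(spect_image, voi_mask, true_conc_bq_per_ml):
    """Mean measured activity concentration inside the CT-delineated VOI
    divided by the known true concentration in the phantom."""
    measured_conc = spect_image[voi_mask].mean()   # mean Bq/ml inside the VOI
    return measured_conc / true_conc_bq_per_ml

# Hypothetical phantom: true concentration 100 kBq/ml, reconstruction recovering ~70%
rng = np.random.default_rng(0)
phantom = rng.normal(70_000, 5_000, size=(64, 64, 64))   # reconstructed Bq/ml per voxel
defect_voi = np.zeros(phantom.shape, dtype=bool)
defect_voi[28:36, 28:36, 28:36] = True                   # manually drawn defect VOI (a cube here)
print(f"RC = {recovery_coefficient(phantom, defect_voi, 100_000):.2f}")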


2021 ◽ Vol 12 ◽ Author(s): Alice Risely, Mark A. F. Gillingham, Arnaud Béchet, Stefan Brändel, Alexander C. Heni, ...

The filtering of gut microbial datasets to retain high-prevalence taxa is often performed to identify a common core gut microbiome that may be important for host biological functions. However, the prevalence thresholds used to identify a common core are highly variable, and it remains unclear how they affect diversity estimates and whether insights stemming from core microbiomes are comparable across studies. We hypothesized that if macroecological patterns in gut microbiome prevalence and abundance are similar across host species, then increasing prevalence thresholds would yield similar changes to alpha diversity and beta dissimilarity scores across host species datasets. We analyzed eight gut microbiome datasets based on 16S rRNA gene amplicon sequencing and collected from different host species to (1) compare macroecological patterns across datasets, including amplicon sequence variant (ASV) detection rate with sequencing depth and sample size, occupancy-abundance curves, and rank-abundance curves; (2) test whether increasing prevalence thresholds generate universal or host species-specific effects on alpha and beta diversity scores; and (3) test whether diversity scores from prevalence-filtered core communities correlate with those from unfiltered data. We found that gut microbiomes collected from diverse hosts demonstrated similar ASV detection rates with sequencing depth, yet required different sample sizes to sufficiently capture rare ASVs across the host population. This suggests that sample size rather than sequencing depth tends to limit the ability of studies to detect rare ASVs across the host population. Despite differences in the distribution and detection of rare ASVs, microbiomes exhibited similar occupancy-abundance and rank-abundance curves. Consequently, increasing prevalence thresholds generated remarkably similar trends in standardized alpha diversity and beta dissimilarity across species datasets until thresholds above 70%, at which point diversity scores tended to become unpredictable for some diversity measures. Moreover, high prevalence thresholds tended to generate diversity scores that correlated poorly with the original unfiltered data. Overall, we recommend that prevalence thresholds above 70% be avoided, and we promote the use of diversity measures that account for phylogeny and abundance (balance-weighted phylogenetic diversity and weighted UniFrac for alpha and beta diversity, respectively), because we show that these measures are insensitive to prevalence filtering and therefore allow for the consistent comparison of core gut microbiomes across studies without the need for prevalence filtering.
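As a concrete illustration of the prevalence-filtering step discussed above, the following Python/pandas sketch (a minimal example under assumed conventions, not the authors' pipeline; the table layout and toy counts are made up) retains only those ASVs detected in at least a given fraction of samples.

import pandas as pd

def prevalence_filter(asv_table: pd.DataFrame, threshold: float) -> pd.DataFrame:
    """Keep ASVs detected (count > 0) in at least `threshold` fraction of samples.
    asv_table: samples as rows, ASVs as columns, values are read counts."""
    prevalence = (asv_table > 0).mean(axis=0)       # fraction of samples containing each ASV
    core_asvs = prevalence[prevalence >= threshold].index
    return asv_table[core_asvs]

# Toy table: 4 samples x 3 ASVs
counts = pd.DataFrame(
    {"ASV1": [10, 0, 5, 8], "ASV2": [0, 0, 2, 0], "ASV3": [3, 4, 1, 6]},
    index=["s1", "s2", "s3", "s4"],
)
core = prevalence_filter(counts, threshold=0.7)     # the study cautions against thresholds above 70%
print(core.columns.tolist())                        # ['ASV1', 'ASV3']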


Author(s): José Ferraz-Caetano

This chapter will focus on how scientific topics are perceived through digital dissemination and educational activities. By assessing the evolving impact of mass online information dissemination on scientific topics, the chapter addresses the quality and reliability of the scientific information that reaches university students. For this assessment, it asks whether we should be concerned about a new generation of students being misinformed, intentionally or not, about insights at the core of their development. This is done by outlining the major influences of digital and social media on science students' scholastic activities. By adapting scientific models of the spread of misinformation, the chapter argues that students can be led to gather, and to be exposed to, unfiltered data, which can be detrimental to their educational development.


2019 ◽ Vol 12 (2) ◽ Author(s): Pieter Blignaut

Prior to the delivery of data, eye tracker software may apply filtering to correct for noise. Although filtering greatly improves the precision of the data, the use of a sliding window may add to the time it takes for reported gaze data to stabilise after a saccade. The effect of various filters and parameter settings on accuracy, precision and filter-related latency is examined. A cost function can be used to obtain the optimal parameters (filter, length of window, metric and threshold for removal of samples, and removal percentage). It was found that, for any of the FIR filters, the standard deviation of samples can be used to remove 95% of the samples in the window so that an optimum combination of filter-related latency and precision is obtained. It was also confirmed that for unfiltered data the shape of the noise, signified by RMS/STD, is around √2 as expected for white noise, whereas lower RMS/STD values were observed for all filters.
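To illustrate the two quantities at the heart of this abstract, the sketch below (Python/NumPy; an assumed implementation, not the published one) computes the RMS-to-STD ratio of a gaze signal, which is about √2 for white noise, and a trimmed sliding-window estimate that removes the samples deviating most from the window mean before averaging.

import numpy as np

def rms_to_std(gaze: np.ndarray) -> float:
    """Ratio of RMS sample-to-sample deviation to standard deviation.
    For uncorrelated (white) noise this is approximately sqrt(2) ~ 1.41."""
    rms = np.sqrt(np.mean(np.diff(gaze) ** 2))
    return rms / np.std(gaze)

def trimmed_window_estimate(window: np.ndarray, removal_fraction: float = 0.95) -> float:
    """Gaze position reported for one sliding window: discard the samples that
    deviate most from the window mean (in standard deviations) and average the rest."""
    deviation = np.abs(window - window.mean()) / (window.std() + 1e-12)
    n_keep = max(1, int(round(len(window) * (1.0 - removal_fraction))))
    keep_idx = np.argsort(deviation)[:n_keep]
    return window[keep_idx].mean()

# Synthetic check on white noise
rng = np.random.default_rng(1)
noise = rng.normal(0.0, 0.5, size=2000)
print(f"RMS/STD of unfiltered white noise: {rms_to_std(noise):.2f}")   # ~1.41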


2009 ◽ Vol 137 (11) ◽ pp. 3874-3887 ◽ Author(s): Aaron Donohoe, David S. Battisti

Abstract The “background” state is commonly removed from synoptic fields by use of either a spatial or a temporal filter prior to the application of feature tracking. Commonly used spatial and temporal filters applied to sea level pressure data admit substantially different information into the synoptic fields. The spatial filter retains a time-mean field of magnitude comparable to a typical synoptic perturbation. In contrast, the temporal filter removes the entire time-mean field. The inclusion of the time-mean spatially filtered field biases the feature-tracking statistics toward large cyclone (anticyclone) magnitudes in regions of climatological lows (highs). The resulting cyclone/anticyclone magnitude asymmetries in each region are found to be inconsistent with the unfiltered data fields and merely result from the spurious inclusion of the time-mean fields in the spatially filtered data. The temporally filtered fields do not suffer from the same problem and produce modest cyclone/anticyclone magnitude asymmetries that are consistent with the unfiltered data. This analysis suggests that the weather forecaster’s assertion that cyclones have larger amplitudes than anticyclones is due to a composite of a small magnitude asymmetry in the synoptic waves and a large contribution from inhomogeneity in the background (stationary) field.
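The contrast between the two filtering approaches can be sketched in a few lines of Python/SciPy (an illustrative toy, not the authors' procedure; the filter forms, parameters and synthetic field are assumptions): a temporal high-pass removes a stationary low from synthetic sea level pressure data, while a per-time-step spatial band-pass leaves a filtered version of it in the time mean.

import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter1d

def temporal_highpass(slp, window_steps=20):
    """Subtract a running time mean; the stationary (time-mean) field is removed.
    slp has shape (time, lat, lon)."""
    background = uniform_filter1d(slp, size=window_steps, axis=0, mode="nearest")
    return slp - background

def spatial_bandpass(slp, sigma_large=8.0, sigma_small=2.0):
    """Difference-of-Gaussians spatial filter applied independently at each time step;
    time-mean structure on the retained spatial scales stays in the output."""
    broad = gaussian_filter(slp, sigma=(0, sigma_large, sigma_large))
    fine = gaussian_filter(slp, sigma=(0, sigma_small, sigma_small))
    return fine - broad

# Synthetic field: a stationary climatological low plus transient synoptic noise
rng = np.random.default_rng(2)
t, ny, nx = 200, 40, 80
stationary = -10.0 * np.exp(-((np.arange(nx) - 20.0) ** 2) / 50.0)   # hPa
slp = stationary[None, None, :] + rng.normal(0.0, 5.0, size=(t, ny, nx))

print("max |time mean| after temporal filter:",
      np.abs(temporal_highpass(slp).mean(axis=0)).max().round(2))    # small residual
print("max |time mean| after spatial filter: ",
      np.abs(spatial_bandpass(slp).mean(axis=0)).max().round(2))     # retains part of the stationary low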


1968 ◽ Vol 46 (10) ◽ pp. S985-S989 ◽ Author(s): Scott E. Forbush, S. P. Duggal, Martin A. Pomerantz

To test whether the diurnal variation is more reliably determined from filtered data, a daily harmonic analysis is made before and after filtering an adequate sequence of synthetic bihourly values containing only random noise and a 24-hour wave of constant phase and amplitude. For each of three filters it is shown, empirically, that the statistical uncertainty of the 24-hour wave from N days of such filtered data does not differ significantly from that from N days of unfiltered data. The filters were of different bandwidths and each was designed to pass the 24-hour wave.
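The daily harmonic analysis referred to above can be sketched as follows (an illustrative Python reconstruction under assumed conventions, not the 1968 computation; the amplitude, phase and noise level are made-up values): the first Fourier harmonic of the 12 bihourly values of a day gives the amplitude and phase of the 24-hour wave.

import numpy as np

def diurnal_harmonic(bihourly: np.ndarray):
    """Amplitude and phase (hour of maximum) of the 24-hour wave from one day
    of 12 bihourly values, via the first Fourier harmonic."""
    n = len(bihourly)                               # 12 bihourly values per day
    hours = np.arange(n) * 24.0 / n
    omega = 2.0 * np.pi / 24.0
    a = 2.0 / n * np.sum(bihourly * np.cos(omega * hours))
    b = 2.0 / n * np.sum(bihourly * np.sin(omega * hours))
    amplitude = np.hypot(a, b)
    phase_hours = (np.arctan2(b, a) / omega) % 24.0
    return amplitude, phase_hours

# Synthetic day: a 24-hour wave of amplitude 1.0 peaking at 06:00, plus random noise
rng = np.random.default_rng(3)
hours = np.arange(12) * 2.0
values = 1.0 * np.cos(2.0 * np.pi / 24.0 * (hours - 6.0)) + rng.normal(0.0, 0.3, 12)
amp, phase = diurnal_harmonic(values)
print(f"amplitude ~ {amp:.2f}, maximum near hour {phase:.1f}")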

