smoothing methods
Recently Published Documents


TOTAL DOCUMENTS: 330 (FIVE YEARS: 83)
H-INDEX: 28 (FIVE YEARS: 4)

2022 ◽  
Vol 54 (9) ◽  
pp. 1-33
Author(s):  
Josef Schmid ◽  
Alfred Höss ◽  
Björn W. Schuller

Network communication has become a part of everyday life, and the interconnection among devices and people will increase even more in the future. Nevertheless, prediction of Quality of Service parameters, particularly throughput, remains quite a challenging task. In this survey, we provide an extensive insight into the literature on Transmission Control Protocol (TCP) throughput prediction. The goal is to give an overview of the techniques in use and to elaborate on open aspects and blind spots in this area. We assessed more than 35 approaches, ranging from equation-based methods through various time-smoothing techniques to modern learning and location smoothing methods. In addition, we discuss the different error functions used to evaluate these approaches, as well as publicly available recording tools and datasets. To conclude, we point out open challenges, especially in the area of moving mobile network clients. Throughput prediction not only enables more efficient use of the available bandwidth; the techniques presented in this work also result in more robust and stable communication.
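As a concrete illustration of the time-smoothing family this survey covers, the following is a minimal sketch of an exponentially weighted moving average (EWMA) throughput predictor. The function name, sample values, and smoothing factor are our own illustrative choices, not taken from the surveyed approaches.

```python
# Minimal sketch of an EWMA-based throughput predictor, one of the
# classical time-smoothing approaches. `alpha` controls how strongly
# the estimate tracks the most recent measurement.

def ewma_predict(samples, alpha=0.3):
    """Predict the next throughput value from past samples via EWMA."""
    estimate = samples[0]
    for x in samples[1:]:
        estimate = alpha * x + (1 - alpha) * estimate
    return estimate

throughputs_mbps = [12.0, 10.5, 14.2, 13.8, 11.9]  # recent TCP measurements
print(round(ewma_predict(throughputs_mbps), 2))    # prints 12.52
```

A larger `alpha` reacts faster to sudden throughput changes but passes more measurement noise into the prediction, which is the basic trade-off the more elaborate learning-based methods try to escape.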


2021 ◽  
Author(s):  
Rebecca Younk ◽  
Alik S Widge

Background: Defensive and threat-related behaviors are common targets of investigation because they model aspects of human mental illness. These behaviors are typically quantified by video recording and post hoc analysis, which can be laborious and/or computationally intensive. Depending on the analysis method, the resulting measurements can be noisy or inaccurate. Other defensive behaviors, such as suppression of operant reward seeking, require extensive animal pre-training.

New Method: We demonstrate a method for quantifying defensive behavior (immobility or freezing) using 3-axis accelerometry integrated with an electrophysiology headstage. We tested multiple pre-processing and smoothing methods and correlated the results against two common quantification approaches: freezing as derived from standard video analysis, and suppression of operantly shaped bar pressing. We assessed these three methods' ability to track defensive behavior during a standard threat conditioning and extinction paradigm.

Results: The best approach to tracking defensive behavior from accelerometry was Gaussian filter smoothing of the first derivative (change score or jerk). Behavior scores from this method reproduced canonical conditioning and extinction curves at the group level. At the individual level, timepoint-to-timepoint correlations between accelerometry, video, and bar-press metrics were statistically significant but modest (largest r=0.53, between accelerometry and bar press).

Comparison with Existing Methods: The integration with standard electrophysiology systems and relatively lightweight signal processing may make accelerometry particularly well suited to detecting behavior in resource-constrained or real-time applications. At the same time, there were only modest cross-correlations among the three methods for quantifying defensive behavior.

Conclusions: Accelerometry analysis allows researchers already using electrophysiology to assess defensive behaviors without additional behavioral measures or video. The similarities in behavioral tracking and the modest correlations among the metrics suggest that each measures a distinct aspect of defensive behavior. Accelerometry is a viable alternative to current defensive measurements, and its non-overlap with other metrics may allow a more sophisticated dissection of threat responses in future experiments.
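The winning pipeline described above (first derivative of 3-axis acceleration, then Gaussian smoothing) can be sketched as follows. This is a simplified illustration, not the authors' code; the sampling assumptions, `sigma`, and the synthetic data are ours.

```python
import numpy as np

# Sketch of the pipeline: differentiate 3-axis acceleration to get jerk
# (the "change score"), take its magnitude, then Gaussian-smooth it.
# Kernel width and the synthetic signals are illustrative assumptions.

def freezing_score(accel, sigma=5):
    """accel: (n, 3) array of 3-axis accelerometer samples.
    Returns a smoothed jerk magnitude; low values suggest immobility."""
    jerk = np.diff(accel, axis=0)              # first derivative per axis
    magnitude = np.linalg.norm(jerk, axis=1)   # combine the three axes
    # Build a normalized Gaussian kernel and convolve.
    radius = 3 * sigma
    t = np.arange(-radius, radius + 1)
    kernel = np.exp(-t**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    return np.convolve(magnitude, kernel, mode="same")

rng = np.random.default_rng(0)
moving = rng.normal(0, 1.0, size=(200, 3))    # high-variance "movement"
frozen = rng.normal(0, 0.05, size=(200, 3))   # near-still "freezing"
score = freezing_score(np.vstack([moving, frozen]))
print(score[:150].mean() > score[250:].mean())  # movement scores higher: True
```

Thresholding the smoothed score (e.g., against a baseline epoch) would then yield a binary immobility trace comparable to video-derived freezing.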


2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Ben Duggan ◽  
Yusong Liu ◽  
Tongxin Wang ◽  
Jie Zhang ◽  
Travis S. Johnson ◽  
...  

Background: Cancer heterogeneity can impact diagnosis and therapeutics. It has been studied at the cellular level using single-cell RNA sequencing (scRNA-seq), but not spatially. Spatial transcriptomics (ST) has recently enabled measuring transcriptomes at hundreds of locations in a tissue, revolutionizing our understanding of tumor heterogeneity. Low RNA quantities in scRNA-seq and ST introduce missed reads and noise. Multiple methods have been developed to impute and smooth scRNA-seq data, including MAGIC and SAVER. To date, only our newly developed spatial and pattern combined smoothing (SPCS) method has been designed specifically for ST. We compared the biological interpretability of these smoothing methods to determine which best informs tumor heterogeneity.

Methods: Ten ST slides from six patients with pancreatic ductal adenocarcinoma (PDAC) were computationally smoothed using SAVER, MAGIC, and SPCS. ST spots from unsmoothed and smoothed slides were split into TM4SF1-high and TM4SF1-low expression groups, and differentially expressed genes (DEGs) were identified. Significant up- and down-regulated DEGs were used for functional enrichment analysis (EA) to compare the biological insights each method provides.

Results: DEGs were found in eight samples, with SPCS finding the most DEGs in six of the eight. The number of EA terms generally correlated with the number of DEGs. SPCS had the most up-regulated enriched terms for every ST slide. The top ten up- and down-regulated EA terms from one ST slide are presented to demonstrate that SPCS gives more biologically interpretable results. Lastly, we present terms found by SPCS that have previously been reported in PDAC or are involved in the TM4SF1 cancer pathway.

Conclusion: Smoothing ST data using SPCS provided more DEGs and more useful EA terms. The EA terms found using SPCS are more consistent with our current understanding of PDAC. No smoothing, MAGIC, and SAVER all missed many important terms that SPCS found. Compared to the other methods, SPCS-smoothed data is more interpretable.
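The spot-splitting step in the Methods can be sketched as follows: spots are divided into marker-high and marker-low groups at the median, and each gene is tested for differential expression. This is an illustrative outline with synthetic data, not the study's pipeline; the significance cutoff and the use of a median split with Welch's t-test are our assumptions.

```python
import numpy as np
from scipy import stats

# Illustrative sketch of splitting ST spots by a marker gene (here we
# pretend gene 0 is TM4SF1) and testing each gene for differential
# expression between the high and low groups.

rng = np.random.default_rng(1)
n_spots, n_genes = 100, 5
expr = rng.lognormal(0, 1, size=(n_spots, n_genes))  # spots x genes
tm4sf1 = expr[:, 0]
high = tm4sf1 >= np.median(tm4sf1)                   # TM4SF1-high group

deg_pvals = []
for g in range(n_genes):
    # Welch's t-test (unequal variances) per gene between the groups
    _, p = stats.ttest_ind(expr[high, g], expr[~high, g], equal_var=False)
    deg_pvals.append(p)

print(deg_pvals[0] < 0.05)  # the splitting gene itself is trivially "DE"
```

A real analysis would additionally correct the p-values for multiple testing (e.g., Benjamini-Hochberg) before passing significant DEGs to enrichment analysis.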


Forecasting ◽  
2021 ◽  
Vol 3 (4) ◽  
pp. 839-850
Author(s):  
Eren Bas ◽  
Erol Egrioglu ◽  
Ufuk Yolcu

Exponential smoothing methods are among the classical time series forecasting methods, and they are well known to be powerful forecasters. In these methods, the exponential smoothing parameters are fixed over time and should be estimated with efficient optimization algorithms. A suitable exponential smoothing method should be chosen according to the components of the time series; the Holt method, for instance, can produce successful forecasts for time series with a trend. In this study, the Holt method is modified to use time-varying smoothing parameters instead of fixed ones. A smoothing parameter is obtained for each observation from a first-order autoregressive model. The parameters of the autoregressive models are estimated using a harmony search algorithm, and forecasts are obtained with a subsampling bootstrap approach. The main contribution of the paper is to model the time-varying smoothing parameters with autoregressive equations and to use the bootstrap method within an exponential smoothing framework. Real-world time series are used to demonstrate the forecasting performance of the proposed method.
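For reference, the classical Holt method that this paper modifies can be sketched as below. The fixed `alpha` (level) and `beta` (trend) parameters are exactly what the proposed approach replaces with values generated per observation by first-order autoregressive models; the parameter values and series here are illustrative.

```python
# Minimal sketch of Holt's linear-trend exponential smoothing with
# fixed parameters. The paper's contribution is to make `alpha` and
# `beta` time-varying via AR(1) equations; values here are illustrative.

def holt_forecast(y, alpha=0.5, beta=0.3, horizon=1):
    level, trend = y[0], y[1] - y[0]       # initialize from first two points
    for x in y[1:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + horizon * trend          # extrapolate the trend

series = [10.0, 12.0, 13.5, 15.1, 17.0, 18.4]
print(round(holt_forecast(series), 2))
```

With `alpha = beta = 1` the recursion degenerates to pure linear extrapolation of the last step, which is a handy sanity check on any implementation.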


2021 ◽  
Author(s):  
Yusong Liu ◽  
Tongxin Wang ◽  
Ben Duggan ◽  
Kun Huang ◽  
Jie Zhang ◽  
...  

The recently developed spatial transcriptomics (ST) technique has made it possible to view spatial transcriptional heterogeneity in a high-throughput manner. It is based on highly multiplexed sequence analysis and uses barcodes to assign the sequenced reads to their respective tissue locations. However, this type of sequencing suffers from high noise and drop-out events, which makes smoothing a necessary step before downstream analysis. Traditional smoothing methods used for the similar single-cell RNA sequencing (scRNA-seq) data are one-factor methods that can only exploit associations in transcriptome space. Because they do not account for associations in Euclidean space, i.e., tissue location distances on the ST slide, these one-factor methods cannot take full advantage of all the knowledge in ST data. In this study, we present a novel two-factor smoothing technique, Spatial and Pattern Combined Smoothing (SPCS), that employs the k-nearest neighbors technique to utilize associations from both transcriptome and Euclidean space in the ST data. When SPCS was applied to 10 ST slides from pancreatic ductal adenocarcinoma (PDAC), the smoothed slides showed better separability, partition accuracy, and biological interpretability than those smoothed by pre-existing one-factor methods. The source code of SPCS is available on GitHub (https://github.com/Usos/SPCS).
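The two-factor idea can be sketched in a simplified form: each spot is smoothed toward both its spatial k-nearest neighbors (slide coordinates) and its pattern k-nearest neighbors (expression similarity). This is our own toy illustration of the concept, not the published SPCS implementation (see the GitHub link above); `k`, `tau`, and the mixing weights are assumptions.

```python
import numpy as np

# Toy sketch of two-factor smoothing: combine neighbors found in
# Euclidean (spatial) space with neighbors found in transcriptome
# (pattern) space. Parameters are illustrative, not SPCS's.

def two_factor_smooth(expr, coords, k=3, tau=0.5):
    """expr: (n_spots, n_genes); coords: (n_spots, 2) slide positions."""
    n = expr.shape[0]
    smoothed = np.empty_like(expr, dtype=float)
    spatial_d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    pattern_d = np.linalg.norm(expr[:, None] - expr[None, :], axis=-1)
    for i in range(n):
        s_nbrs = np.argsort(spatial_d[i])[1:k + 1]   # skip the spot itself
        p_nbrs = np.argsort(pattern_d[i])[1:k + 1]
        contribution = (tau * expr[s_nbrs].mean(axis=0)
                        + (1 - tau) * expr[p_nbrs].mean(axis=0))
        smoothed[i] = 0.5 * expr[i] + 0.5 * contribution
    return smoothed

rng = np.random.default_rng(2)
coords = rng.uniform(0, 10, size=(30, 2))
expr = rng.poisson(5, size=(30, 8)).astype(float)
out = two_factor_smooth(expr, coords)
print(out.std() < expr.std())  # smoothing reduces variance: True
```

The `tau` weight trades off the two neighborhoods: `tau = 1` reduces to purely spatial smoothing, `tau = 0` to a purely pattern-based (one-factor) method like those used for scRNA-seq.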


BMJ Open ◽  
2021 ◽  
Vol 11 (11) ◽  
pp. e047806
Author(s):  
Valerie Andrees ◽  
Sandra Wolf ◽  
Matthias Augustin ◽  
Nicole Mohr ◽  
Jobst Augustin

Objectives: Global prevalence rates of psoriasis differ significantly, with the lowest rates in the equatorial region and increasing tendencies towards the north, but also with within-country differences. Information on regional variations in Germany is missing. This study aims to analyse the change of psoriasis prevalence in Germany over time and to detect regional variations.

Design: Cross-sectional, spatio-epidemiological study on regional psoriasis prevalence in Germany.

Setting: Claims data study based on nationwide outpatient billing data at the county level.

Methods: Analyses were based on outpatient billing data for 2010–2017 derived from all people insured in statutory health insurances (about 72.8 million). We performed descriptive spatio-temporal analyses of prevalence rates using probability mapping and statistical smoothing methods, identified spatial clusters, and examined a north-south gradient using spatial statistics.

Results: The prevalence increased from 147.4 per 10 000 in 2010 to 173.5 in 2017. In 2017, counties' prevalence rates ranged between 93.8 and 340.9. Decreased rates occurred mainly in southern counties, increased rates in northern and eastern counties. Clusters of low rates occur in southern and south-western Germany, clusters of high rates in the north and north-east. The correlation between counties' latitudes and their prevalence rates was high, with Pearson's r=0.65 (p<0.05).

Conclusion: Increased prevalence of psoriasis over time and marked regional variations in Germany were observed, which need further investigation.


2021 ◽  
Vol 934 (1) ◽  
pp. 012016
Author(s):  
A Pamungkas ◽  
R Puspasari ◽  
A Nurfiarini ◽  
R Zulkarnain ◽  
W Waryanto

Abstract: Pekalongan waters, part of the Java Sea, have the potential to develop the marine fisheries sector and thereby increase regional income and community livelihoods. The year-to-year fluctuation of marine fish production requires serious attention in planning and policy strategies for the utilization of fishery resources. Time series fish production data can be used to predict production in the following years through forecasting. The data used in this study are fish production records from Pekalongan Fishing Port, Central Java, from January 2011 to December 2020. The approach compares three exponential smoothing methods: single/simple exponential smoothing, double exponential smoothing, and Holt-Winters' exponential smoothing. The criterion used to measure forecasting performance is the mean absolute percentage error (MAPE); a smaller MAPE value indicates a better forecast. The smallest MAPE value is obtained by finding the optimal smoothing constant, which is usually determined by trial and error. In this study, however, the constants were calculated using the Solver add-in in Microsoft Excel. The forecasting results show that the Holt-Winters' exponential smoothing method is reasonable, with a MAPE value of 37.878.
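The study's model-selection loop (fit a smoothing model, score it by MAPE, optimize the constant) can be sketched in Python in place of the Excel Solver. The series below is synthetic and the grid search is our own simplification; it only illustrates the simple-exponential-smoothing case.

```python
# Sketch of the MAPE-minimizing selection loop: fit simple exponential
# smoothing for a grid of alpha values and keep the best. The monthly
# production figures are synthetic, for illustration only.

def ses_fitted(y, alpha):
    """One-step-ahead simple-exponential-smoothing fits for y[1:]."""
    level, fits = y[0], []
    for x in y[1:]:
        fits.append(level)                    # forecast made before seeing x
        level = alpha * x + (1 - alpha) * level
    return fits

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

production = [120, 135, 128, 150, 142, 160, 155, 170]  # synthetic monthly catch
best = min(((mape(production[1:], ses_fitted(production, a / 100)), a / 100)
            for a in range(1, 100)))
print("alpha =", best[1], "MAPE =", round(best[0], 2))
```

A solver-style continuous optimizer (as used in the study via Excel) would refine the constant between grid points, but a 0.01-step grid is already close for a one-parameter model.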

