filtering procedure
Recently Published Documents

TOTAL DOCUMENTS: 109 (FIVE YEARS: 12)
H-INDEX: 15 (FIVE YEARS: 0)

Author(s): П.В. Полухин

Filtering algorithms are used to assess the state of dynamic systems when solving various practical problems, such as voice synthesis, geo-positioning, and monitoring the movement of objects. In the case of complex hierarchical dynamic systems with a large number of time slices, computing the probabilistic characteristics becomes very time-consuming because a large number of samples must be generated. The essence of the optimization is to reduce the number of samples generated by the filter, increase their consistency, and speed up the computational operations. The paper offers mathematical tools based on sufficient statistics and sample decomposition, combined with distributed computing algorithms, that can significantly improve the efficiency of the filtering procedure.
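
As a rough illustration of the kind of filter being optimized, the sketch below implements a plain bootstrap particle filter for a one-dimensional state-space model, resampling only when the effective sample size drops. The model, parameter names and the resampling rule are illustrative assumptions; they do not reproduce the paper's sufficient-statistic decomposition or its distributed computing scheme.

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=1000,
                              phi=0.9, q=0.5, r=1.0, rng=None):
    """Minimal bootstrap particle filter for a 1-D AR(1) state-space model:
       x_t = phi * x_{t-1} + N(0, q^2),  y_t = x_t + N(0, r^2).
    Resampling is triggered only when the effective sample size drops,
    which is one simple way to limit the amount of resampling work per step."""
    rng = np.random.default_rng() if rng is None else rng
    particles = rng.normal(0.0, 1.0, n_particles)
    weights = np.full(n_particles, 1.0 / n_particles)
    estimates = []
    for y in observations:
        # Propagate particles through the transition model.
        particles = phi * particles + rng.normal(0.0, q, n_particles)
        # Reweight by the observation likelihood and normalize.
        weights *= np.exp(-0.5 * ((y - particles) / r) ** 2)
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))
        # Resample only when the effective sample size is low.
        ess = 1.0 / np.sum(weights ** 2)
        if ess < n_particles / 2:
            idx = rng.choice(n_particles, n_particles, p=weights)
            particles = particles[idx]
            weights = np.full(n_particles, 1.0 / n_particles)
    return np.array(estimates)
```

In this toy setting, monitoring the effective sample size plays the role of the "consistency" check: it avoids regenerating the whole particle set at every time slice when the current samples are still informative.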



2021, Vol 2131 (2), pp. 022098
Author(s): E Yu Bursian, A M Demin

Abstract The paper proposes an improved skeleton method for handwritten character recognition, based on a filtering procedure and the principle of alternating shading schemes of the skeletonized area on 4- and 8-connected rasters. The high-frequency filtering procedure, based on the discrete real cosine transform or the discrete complex Fourier transform with automatic selection of the filtering parameters, significantly improves the image quality of handwritten symbols; in particular, it eliminates in many cases the thin bridges between the areas representing symbol elements. Alternating the shading schemes between the 4- and 8-connected rasters keeps the wave front of the skeletonized area close to a circle, so that the broken lines representing the branches of the skeleton graphs retain the shapes of the symbols. Numerical experiments on the construction of skeleton sets and skeleton graphs were performed for recognizable handwritten symbols located in the cells of tables of logistic transport problems. A software implementation of the method is proposed.
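
For the high-frequency filtering step, a minimal sketch of one plausible variant is shown below: the character image is transformed with a 2-D discrete cosine transform, high-frequency coefficients are discarded, and the result is re-binarized. The fixed cutoff (`keep_fraction`) and threshold are assumptions standing in for the automatic parameter selection described in the abstract.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_lowpass_binarize(image, keep_fraction=0.25, threshold=0.5):
    """Smooth a binary character image by discarding high-frequency DCT
    coefficients, then re-binarize. The cutoff is fixed here, not chosen
    automatically as in the paper."""
    coeffs = dctn(image.astype(float), norm="ortho")
    h, w = coeffs.shape
    mask = np.zeros_like(coeffs)
    mask[: int(h * keep_fraction), : int(w * keep_fraction)] = 1.0
    smoothed = idctn(coeffs * mask, norm="ortho")
    return (smoothed > threshold).astype(np.uint8)
```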



2021
Author(s): Inna Skarga-Bandurova, Tetiana Biloborodova, Illia Skarha-Bandurov, Yehor Boltov, Maryna Derkach

The paper introduces a multilayer long short-term memory (LSTM) based auto-encoder network to spot abnormalities in fetal ECG. The LSTM network was used to detect patterns in the time series, compute reconstruction errors, and classify a given segment as anomalous or not. The proposed anomaly detection method provides a filtering procedure able to reproduce ECG variability based on a semi-supervised paradigm. Experiments show that the proposed method can learn better features than the traditional approach without any prior knowledge and, subject to proper signal identification, can facilitate the analysis of fetal ECG signals in daily life.
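
A minimal sketch of an LSTM auto-encoder of this general kind is given below, using Keras. The layer sizes, the single encoder and decoder layers, and the simple mean-squared-error threshold are assumptions for illustration; they do not reproduce the paper's multilayer architecture or its semi-supervised training protocol.

```python
import numpy as np
import tensorflow as tf

def build_lstm_autoencoder(seq_len, n_features, latent_dim=32):
    """Minimal LSTM auto-encoder: each segment is encoded, repeated, and
    decoded back to its original length; training minimizes the
    reconstruction error."""
    inputs = tf.keras.Input(shape=(seq_len, n_features))
    encoded = tf.keras.layers.LSTM(latent_dim)(inputs)
    repeated = tf.keras.layers.RepeatVector(seq_len)(encoded)
    decoded = tf.keras.layers.LSTM(latent_dim, return_sequences=True)(repeated)
    outputs = tf.keras.layers.TimeDistributed(
        tf.keras.layers.Dense(n_features))(decoded)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

def flag_anomalies(model, segments, threshold):
    """Label a segment anomalous when its mean reconstruction error exceeds
    a threshold estimated from normal (non-anomalous) training data."""
    recon = model.predict(segments, verbose=0)
    errors = np.mean((segments - recon) ** 2, axis=(1, 2))
    return errors > threshold
```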



2021, Vol 22 (1)
Author(s): Mael Conan, Nathalie Théret, Sophie Langouet, Anne Siegel

Abstract Background The liver plays a major role in the metabolic activation of xenobiotics (drugs, chemicals such as pollutants, pesticides, food additives...). Among environmental contaminants of concern, heterocyclic aromatic amines (HAA) are xenobiotics classified by IARC as possible or probable carcinogens (2A or 2B). Little information exists about the effects of these HAA in humans. While the HAA form a family of more than thirty identified chemicals, metabolic activation and possible DNA adduct formation have been fully characterized in human liver for only a few of them (MeIQx, PhIP, AαC). Results We have developed a modeling approach to predict all the possible metabolites of a xenobiotic and the enzymatic profiles that are linked to the production of metabolites able to bind DNA. Our metabolite prediction approach relies on the construction of an enriched and annotated map of metabolites from an input metabolite. The pipeline assembles reaction prediction tools (SyGMa), site-of-metabolism prediction tools (Way2Drug SOMP and FAME 3), a tool to estimate the ability of a xenobiotic to form DNA adducts (XenoSite Reactivity V1), and a filtering procedure based on a Bayesian framework. This prediction pipeline was evaluated using caffeine and then applied to HAA. The method was applied to determine the enzyme profiles associated with maximizing the metabolites derived from each HAA that are able to bind DNA. The classification of HAA according to enzymatic profiles was consistent with their chemical structures. Conclusions Overall, a predictive toxicological model based on an in silico systems biology approach opens perspectives for estimating the genotoxicity of various chemical classes of environmental contaminants. Moreover, our approach based on enzyme profile determination opens the possibility of predicting various xenobiotic metabolites susceptible to binding DNA in both normal and physiopathological situations.
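
The sketch below illustrates only the final Bayesian-style filtering step, in generic form: per-candidate scores are treated as independent likelihood ratios and combined into a posterior used to keep or discard a candidate metabolite. The score fields, prior, cutoff, independence assumption and placeholder SMILES are all illustrative; the actual pipeline relies on the named tools (SyGMa, SOMP, FAME 3, XenoSite), whose APIs are not reproduced here.

```python
import math

def bayesian_filter(candidates, prior=0.05, cutoff=0.5):
    """Naive-Bayes style combination of per-tool scores (treated as
    independent likelihood ratios in (0, 1)) into a posterior probability
    that a candidate metabolite is relevant; keep candidates above a cutoff."""
    kept = []
    for cand in candidates:
        log_odds = math.log(prior / (1.0 - prior))
        for score in cand["scores"]:
            score = min(max(score, 1e-6), 1.0 - 1e-6)   # avoid log(0)
            log_odds += math.log(score / (1.0 - score))
        posterior = 1.0 / (1.0 + math.exp(-log_odds))
        if posterior >= cutoff:
            kept.append((cand["smiles"], posterior))
    return kept

# Hypothetical candidate with placeholder SMILES and scores:
print(bayesian_filter([{"smiles": "CCO", "scores": [0.9, 0.8, 0.7]}]))
```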



The use of online social networks (OSNs) is growing at an unexpected rate. Through these services, users can communicate and exchange any kind of data. A key drawback of OSN services is the lack of privacy controls over a user's personal space. We use pattern matching and a text classification algorithm to obtain accurate filtering results. We propose a system allowing OSN users to have direct control over the messages posted on their walls. It is a flexible, rule-based system that lets users customize the filtering procedure applied to their profiles. A machine learning approach automatically labels messages in support of content-based filtering. Index Terms: content-based filtering, filtering rule, filtering system, machine learning, online social networks
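
A minimal sketch of combining user-defined rules with a content classifier might look as follows; the toy training data, keyword rules and scikit-learn classifier are assumptions for illustration, not the system's actual rule language or learning method.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: messages labeled 1 if they should be blocked.
train_msgs = ["great photo, congrats!", "you are an idiot",
              "see you at the meeting", "I will hurt you"]
train_labels = [0, 1, 0, 1]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(train_msgs, train_labels)

def filter_wall_message(message, user_rules):
    """Apply the user's own filtering rules first (simple keyword blocking),
    then fall back to the trained content classifier."""
    lowered = message.lower()
    if any(word in lowered for word in user_rules.get("blocked_words", [])):
        return "blocked by rule"
    if classifier.predict([message])[0] == 1:
        return "blocked by classifier"
    return "published"

print(filter_wall_message("you are an idiot", {"blocked_words": ["spam"]}))
```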



2021, Vol 11 (1)
Author(s): Céline Charon, Rodrigue Allodji, Vincent Meyer, Jean-François Deleuze

Abstract Quality control (QC) methods for genome-wide association studies and fine mapping are commonly used before imputation; however, they result in the loss of many single nucleotide polymorphisms (SNPs). To investigate the consequences of filtration on imputation, we studied its direct effects on the number of markers, their allele frequencies, imputation quality scores and post-filtration events. We pre-phased 1031 genotyped individuals from diverse ethnicities and compared the imputed variants to 1089 NCBI-recorded individuals for additional validation. Without QC-based variant pre-filtration, we observed no impairment in the imputation of SNPs that failed QC, whereas with pre-filtration there was an overall loss of information. Significant differences between frequencies with and without pre-filtration were found only in the range of very rare (5E−04–1E−03) and rare variants (1E−03–5E−03) (p < 1E−04). Increasing the post-filtration imputation quality score from 0.3 to 0.8 reduced the number of single nucleotide variants (SNVs) with frequency < 0.001 by 2.5-fold, with or without QC pre-filtration, and halved the number of very rare variants (5E−04). Thus, to maintain confidence and retain enough SNVs, we propose a two-step filtering procedure that allows less stringent filtering prior to imputation and after imputation, in order to increase the number of very rare and rare variants compared to conservative filtration methods.
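
A minimal sketch of such a two-step procedure is shown below with pandas; the column names, the lenient pre-imputation step and the post-imputation quality cutoff of 0.3 are illustrative assumptions based on the thresholds discussed in the abstract.

```python
import pandas as pd

def two_step_filter(variants, pre_maf=0.0, post_info=0.3):
    """Sketch of a lenient two-step filter: keep nearly all variants before
    imputation (no hard QC-based exclusion), then drop imputed variants whose
    imputation quality score falls below a moderate post-imputation cutoff.
    The columns 'maf' and 'info_score' are illustrative names."""
    pre = variants[variants["maf"] >= pre_maf]        # lenient pre-imputation step
    post = pre[pre["info_score"] >= post_info]        # post-imputation quality filter
    return post

df = pd.DataFrame({"snp": ["rs1", "rs2", "rs3"],
                   "maf": [0.0004, 0.002, 0.05],
                   "info_score": [0.35, 0.25, 0.9]})
print(two_step_filter(df))
```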



2021, Vol 11 (1)
Author(s): Chong-Chih Tsai, Wei-Kuang Liang

Abstract The detection of event-related potentials (ERPs) through electroencephalogram (EEG) analysis is a well-established method for understanding brain functions during a cognitive process. To increase the signal-to-noise ratio (SNR) and stationarity of the data, ERPs are often filtered to a wideband frequency range, such as 0.05–30 Hz. Alternatively, a natural-filtering procedure can be performed through empirical mode decomposition (EMD), which yields intrinsic mode functions (IMFs) for each trial of the EEG data, followed by averaging over trials to generate the event-related modes. However, although the EMD-based filtering procedure has advantages such as a high SNR, suitable waveform shape, and high statistical power, one fundamental drawback is that it requires the selection of an IMF (or a partial sum of a range of IMFs) to determine an ERP component effectively. Therefore, in this study, we propose an intrinsic ERP (iERP) method to overcome these drawbacks and retain the advantages of event-related mode analysis for investigating ERP components. The iERP method can reveal multiple ERP components at their characteristic time scales and suitably cluster statistical effects among modes by using a tailored definition of each mode's neighbors. We validated the iERP method using realistic EEG data sets acquired from a face perception task and a visual working memory task. With these two data sets, we demonstrated how to apply the iERP method to a cognitive task and incorporate existing cluster-based tests into iERP analysis. Moreover, iERP analysis revealed the statistical effects between (or among) experimental conditions more effectively than the conventional ERP method did.
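
A minimal sketch of the event-related-mode step (EMD per trial, then averaging IMFs across trials) is shown below using the PyEMD package; the truncation to a common number of IMFs is a simplifying assumption and is not the iERP method's own mode-matching or clustering procedure.

```python
import numpy as np
from PyEMD import EMD  # pip install EMD-signal

def event_related_modes(trials):
    """Decompose each EEG trial into IMFs with EMD and average each IMF
    across trials to obtain event-related modes. Trials are truncated to
    the smallest common number of IMFs for this sketch."""
    emd = EMD()
    per_trial_imfs = [emd(trial) for trial in trials]   # each: (n_imfs, n_samples)
    n_common = min(imfs.shape[0] for imfs in per_trial_imfs)
    stacked = np.stack([imfs[:n_common] for imfs in per_trial_imfs])
    return stacked.mean(axis=0)                         # (n_common, n_samples)

# Example with synthetic trials: 20 trials of 500 samples each.
rng = np.random.default_rng(0)
trials = rng.standard_normal((20, 500))
print(event_related_modes(trials).shape)
```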



2021
Author(s): Antoni Grau Ferrer, Mª Antònia Jiménez Cortés, Daniel Martínez Villagrasa, Joan Cuxart Rodamilans

The Eastern Ebro basin is composed of an extensive lower irrigated area, surrounded by rain-fed slopes and wooded mountain ranges to the North, East and South, while to the West it is open to the agricultural Western Ebro basin. Previous studies, based on research data or on statistics for one station, indicate that these features determine the local circulations in the area. A network of stations is used here to analyze a period of 15 years, taking representative data for the different units of the landscape. A filtering procedure is developed that selects the events with a predominance of local circulations, based on detecting stably stratified nights.

The analysis of the filtered data indicates the presence of a valley circulation between the lower plain and the slopes and mountains that reverses its sense of circulation between day and night, and whose intensity varies in summer due to an increasing thermal contrast between irrigated and rain-fed areas. The presence of a sea breeze in the late afternoon during the warm months is common, bringing cooler and wetter marine air to the area after crossing the mountain range to the South. At night in the centre of the basin, cold-air pools form, which evolve into persistent fog events in winter, causing the statistics to be very different in that season compared to the rest of the year.
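
As an illustration of the kind of night-selection filter described, the sketch below keeps nights with a positive near-surface vertical temperature difference and weak wind; the thresholds and column names are assumptions, not the criteria actually used in the study.

```python
import pandas as pd

def select_stable_nights(nightly, dT_min=0.5, wind_max=3.0):
    """Keep nights whose mean vertical temperature difference indicates
    stable stratification and whose mean wind speed is weak. The columns
    'dT_2m_10m' and 'wind_speed' and both thresholds are illustrative."""
    stable = (nightly["dT_2m_10m"] >= dT_min) & (nightly["wind_speed"] <= wind_max)
    return nightly[stable]

nights = pd.DataFrame({"night": ["2020-07-01", "2020-07-02"],
                       "dT_2m_10m": [1.2, -0.3],
                       "wind_speed": [1.5, 6.0]})
print(select_stable_nights(nights))
```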



2021, Vol 87 (1)
Author(s): Jan Nordström, Andrew R. Winters

Abstract We prove that the most common filtering procedure for nodal discontinuous Galerkin (DG) methods is stable. The proof exploits that the DG approximation is constructed from polynomial basis functions and that integrals are approximated with high-order accurate Legendre–Gauss–Lobatto quadrature. The theoretical discussion re-contextualizes stable filtering results for finite difference methods into the DG setting. Numerical tests verify and validate the underlying theoretical results.
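
A minimal sketch of the filtering procedure in question, a modal exponential filter applied to nodal values at Legendre–Gauss–Lobatto points, is given below; the filter parameters are illustrative defaults and the sketch does not reproduce the paper's stability analysis.

```python
import numpy as np
from numpy.polynomial import legendre as L

def lgl_nodes(n):
    """Legendre-Gauss-Lobatto nodes: the endpoints plus the roots of P_n'(x)."""
    interior = np.real(L.Legendre.basis(n).deriv().roots())
    return np.concatenate(([-1.0], np.sort(interior), [1.0]))

def modal_exponential_filter(u, alpha=36.0, s=8):
    """Filter nodal values u given at the N+1 LGL points: transform to
    Legendre modal coefficients, damp mode k by sigma_k = exp(-alpha*(k/N)^(2s)),
    and transform back. alpha and s are illustrative defaults."""
    n = len(u) - 1
    nodes = lgl_nodes(n)
    # Vandermonde matrix V[i, k] = P_k(nodes[i]).
    V = np.stack([L.Legendre.basis(k)(nodes) for k in range(n + 1)], axis=1)
    modal = np.linalg.solve(V, u)
    sigma = np.exp(-alpha * (np.arange(n + 1) / n) ** (2 * s))
    return V @ (sigma * modal)

# Example: filter noisy data sampled at 9 LGL points.
x = lgl_nodes(8)
u = np.sin(np.pi * x) + 0.05 * np.random.default_rng(1).standard_normal(x.size)
print(modal_exponential_filter(u))
```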


