Gaussian Models
Recently Published Documents

TOTAL DOCUMENTS: 351 (five years: 55)
H-INDEX: 34 (five years: 3)

2021 ◽ Vol 2096 (1) ◽ pp. 012137
Author(s): V. M. Artyushenko ◽ V. I. Volovach

Abstract: An analysis is performed of the transformation of random signals and noise in linear and nonlinear systems, based on poly-Gaussian models and the multidimensional PDF of the output paths of information-measuring and radio systems. A classification of the elements of these systems is given, together with expressions describing the system's input action and output response. It is shown that the analysis of information-measuring and radio systems can be carried out using poly-Gaussian models. The analysis covers a series connection of a linear system and a nonlinear element, a series connection of a nonlinear element and a linear system, and a parallel connection of these links. In all cases the output response is a poly-Gaussian mixture distribution with a finite number of components. An example is given of the analysis of signal transmission through an intermediate-frequency amplifier and a linear detector against a background of non-Gaussian noise. The resulting probability density of the sum of the signal and non-Gaussian noise at the detector output is poly-Rician. The multidimensional probability density of the output processes of the nonlinear signal envelope detector is also obtained. Modeling results for the derived distribution densities are presented. It is shown that the poly-Gaussian representation of signals, noise, and the system's impulse response makes it possible to analyze inertial systems effectively in the time domain.
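A minimal sketch of the closure property this abstract relies on (illustrative values, not taken from the paper): passing a poly-Gaussian (Gaussian-mixture) input through a linear system y = a*x + b yields another poly-Gaussian output, with each component's mean and standard deviation transformed component-wise.

```python
import random
import statistics

# Illustrative sketch (not the paper's model): a two-component
# poly-Gaussian input through a linear system y = a*x + b.
random.seed(0)

weights = [0.7, 0.3]   # mixture weights (assumed values)
means   = [0.0, 2.0]   # component means
sigmas  = [1.0, 3.0]   # component standard deviations
a, b    = 2.0, 1.0     # linear system: y = a*x + b

def sample_mixture(n):
    """Draw n samples from the poly-Gaussian input process."""
    out = []
    for _ in range(n):
        k = 0 if random.random() < weights[0] else 1
        out.append(random.gauss(means[k], sigmas[k]))
    return out

y = [a * x + b for x in sample_mixture(100_000)]

# The output is a mixture with component means a*m_i + b and
# component sigmas |a|*s_i, so its overall mean is sum(w_i*(a*m_i + b)).
theory_mean = sum(w * (a * m + b) for w, m in zip(weights, means))
print(round(statistics.fmean(y), 2), theory_mean)  # empirical mean close to 2.2
```

The same component-wise bookkeeping is what makes the time-domain analysis tractable: a linear stage maps each Gaussian component to a Gaussian component, so only the nonlinear stages change the component shapes.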


2021 ◽ Vol 150 (4) ◽ pp. 2492-2502
Author(s): Kenneth E. Hancock ◽ Bennett O'Brien ◽ Rosamaria Santarelli ◽ M. Charles Liberman ◽ Stéphane F. Maison

2021 ◽ Author(s): Vitalii Dementiev ◽ Alexander Tashlinskii

Sensors ◽ 2021 ◽ Vol 21 (14) ◽ pp. 4652
Author(s): Sergio Márquez-Sánchez ◽ Israel Campero-Jurado ◽ Jorge Herrera-Santos ◽ Sara Rodríguez ◽ Juan M. Corchado

It is estimated that we spend one-third of our lives at work. It is therefore vital to adapt the traditional equipment and systems used in the working environment to the new technological paradigm, so that industry is connected and, at the same time, workers are as safe and protected as possible. Thanks to Smart Personal Protective Equipment (PPE) and wearable technologies, information about workers and their environment can be extracted to significantly reduce the rate of accidents and occupational illness. This article proposes an architecture that employs three pieces of PPE: a helmet, a bracelet and a belt, which process the collected information with artificial intelligence (AI) techniques through edge computing. The proposed system safeguards workers' safety and integrity through the early prediction and notification of anomalies detected in their environment. Convolutional neural networks, long short-term memory networks and Gaussian models were combined by interpreting the information as a graph, with different heuristics weighting the outputs as a whole; finally, a support vector machine weighted the models' votes, achieving an area under the curve of 0.81.
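A rough sketch of the fusion step this abstract describes (all model names, scores and weights below are illustrative assumptions, not the authors' implementation): per-model anomaly scores are combined with heuristic weights into a single alarm decision.

```python
# Illustrative sketch (assumed, not the authors' code): weighted fusion
# of base-model anomaly scores, mimicking the heuristic-weighted vote
# that a meta-classifier such as an SVM would learn in the paper.

def fuse_scores(scores, weights):
    """Weighted average of base-model anomaly scores in [0, 1]."""
    assert len(scores) == len(weights)
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# Hypothetical outputs of the three base models for one sensor window:
cnn_score, lstm_score, gmm_score = 0.9, 0.7, 0.2
weights = [0.5, 0.3, 0.2]   # heuristic weights (assumption)

fused = fuse_scores([cnn_score, lstm_score, gmm_score], weights)
alarm = fused >= 0.5        # notify the worker when the fused score crosses a threshold
print(round(fused, 2), alarm)  # → 0.7 True
```

In the article's architecture the weighting is learned (the SVM takes the models' votes as features) rather than hand-set as here, but the data flow is the same: several edge models score the window, one meta-stage issues the final decision.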


2021 ◽ Vol 15 (2)
Author(s): Paul Bastide ◽ Lam Si Tung Ho ◽ Guy Baele ◽ Philippe Lemey ◽ Marc A. Suchard

Author(s): Ulrich Knief ◽ Wolfgang Forstmeier

Abstract: When data are not normally distributed, researchers are often uncertain whether it is legitimate to use tests that assume Gaussian errors, or whether one has to either model a more specific error structure or use randomization techniques. Here we use Monte Carlo simulations to explore the pros and cons of fitting Gaussian models to non-normal data in terms of risk of type I error, power and utility for parameter estimation. We find that Gaussian models are robust to non-normality over a wide range of conditions, meaning that p values remain fairly reliable except for data with influential outliers judged at strict alpha levels. Gaussian models also performed well in terms of power across all simulated scenarios. Parameter estimates were mostly unbiased and precise except if sample sizes were small or the distribution of the predictor was highly skewed. Transformation of data before analysis is often advisable and visual inspection for outliers and heteroscedasticity is important for assessment. In strong contrast, some non-Gaussian models and randomization techniques bear a range of risks that are often insufficiently known. High rates of false-positive conclusions can arise for instance when overdispersion in count data is not controlled appropriately or when randomization procedures ignore existing non-independencies in the data. Hence, newly developed statistical methods not only bring new opportunities, but they can also pose new threats to reliability. We argue that violating the normality assumption bears risks that are limited and manageable, while several more sophisticated approaches are relatively error prone and particularly difficult to check during peer review. Scientists and reviewers who are not fully aware of the risks might benefit from preferentially trusting Gaussian mixed models in which random effects account for non-independencies in the data.
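The kind of Monte Carlo check this abstract describes can be sketched as follows (an illustrative setup, not the authors' simulation code): apply a Gaussian two-sample comparison (a Welch t statistic with a normal approximation for the p value, adequate at n ≈ 50) to skewed exponential data with equal true means, and count how often it falsely rejects at alpha = 0.05.

```python
import math
import random

# Illustrative robustness check (assumed setup, not the authors' code):
# type I error of a Gaussian test on non-normal, skewed data.
random.seed(42)

def welch_t_pvalue(x, y):
    """Two-sided p value via a normal approximation (fine for n ~ 50)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    t = (mx - my) / math.sqrt(vx / nx + vy / ny)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))

n, reps, alpha = 50, 2000, 0.05
false_pos = 0
for _ in range(reps):
    x = [random.expovariate(1.0) for _ in range(n)]   # skewed, non-normal
    y = [random.expovariate(1.0) for _ in range(n)]   # same true mean
    if welch_t_pvalue(x, y) < alpha:
        false_pos += 1

# The rejection rate should stay near the nominal alpha = 0.05,
# illustrating the robustness to non-normality discussed above.
print(false_pos / reps)
```

Rerunning this with heavy-tailed or outlier-contaminated generators, smaller n, or stricter alpha reproduces the conditions under which the abstract reports that reliability degrades.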

