Nonparametric kernel methods for curve estimation and measurement errors

2014 ◽  
Vol 10 (S306) ◽  
pp. 28-39
Author(s):  
Aurore Delaigle

Abstract. We consider the problem of estimating an unknown density or regression curve from data. In the parametric setting, the curve to estimate is modelled by a function which is known up to the value of a finite number of parameters. We consider the nonparametric setting, where the curve is not modelled a priori. We focus on kernel methods, which are popular nonparametric techniques that can be used for both density and regression estimation. While these methods are appropriate when the data are observed accurately, they cannot be directly applied to astronomical data, which are often measured with a certain degree of error. It is well known in the statistics literature that when the observations are measured with errors, nonparametric procedures become biased, and need to be adjusted for the errors. Correction techniques have been developed, and are often referred to as deconvolution methods. We introduce those methods, in both the homoscedastic and heteroscedastic error cases, and discuss their practical implementation.
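A minimal sketch of the deconvolution idea in the homoscedastic case (all sample sizes, bandwidths, and error scales below are invented for illustration): for data W = X + U with U ~ Laplace(0, b) and a Gaussian kernel K, the deconvolution kernel has a closed form, K_U(u) = K(u)(1 - (b/h)^2 (u^2 - 1)), and the estimator is f_hat(x) = (nh)^-1 sum_j K_U((x - W_j)/h).

```python
import numpy as np

def deconvolution_kde(x_grid, w, h, b):
    """Deconvolution kernel density estimate for data w = X + U,
    where U ~ Laplace(0, b). With a Gaussian kernel K the deconvolution
    kernel is available in closed form:
        K_U(u) = K(u) * (1 - (b/h)^2 * (u^2 - 1))."""
    u = (x_grid[:, None] - w[None, :]) / h
    K = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    K_U = K * (1.0 - (b / h) ** 2 * (u**2 - 1.0))
    return K_U.mean(axis=1) / h

rng = np.random.default_rng(0)
n = 2000
x_true = rng.normal(0.0, 1.0, n)          # latent variable of interest
w = x_true + rng.laplace(0.0, 0.2, n)     # contaminated observations
grid = np.linspace(-4.0, 4.0, 201)
f_hat = deconvolution_kde(grid, w, h=0.4, b=0.2)
dx = grid[1] - grid[0]
print(float(np.sum(f_hat) * dx))          # integrates to ~1, like a density
```

The correction term subtracts the inflation that the Laplace error adds to the naive kernel estimate; unlike an ordinary KDE, the corrected estimate can dip slightly below zero in the tails.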

2020 ◽  
pp. 65-72
Author(s):  
V. V. Savchenko ◽  
A. V. Savchenko

This paper is devoted to the distortions present in a speech signal transmitted over a communication channel to a biometric system during voice-based remote identification. We propose a preliminary correction of the frequency spectrum of the received signal based on the pre-distortion principle. Taking a priori uncertainty into account, a new information indicator of speech signal distortions and a method for measuring it from small samples of observations are proposed. An example of a fast practical implementation of the method based on a parametric spectral analysis algorithm is considered. Experimental results of our approach are provided for three different versions of the communication channel. It is shown that the proposed method makes it possible to bring the initially distorted speech signal into compliance with the registered voice template according to an acceptable information discrimination criterion. It is demonstrated that our approach may be used in existing biometric systems and speaker identification technologies.
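The ingredients can be sketched as follows. This is not the authors' method: the autoregressive (Yule-Walker) spectrum is one standard parametric spectral analysis algorithm, the Itakura-Saito divergence stands in for the paper's information indicator, and the first-order channel coloration is an invented toy.

```python
import numpy as np

def ar_spectrum(x, order=12, n_freq=256):
    """Parametric (autoregressive) power spectrum via the Yule-Walker equations."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order] / len(x)
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])     # AR coefficients
    sigma2 = r[0] - a @ r[1:order + 1]         # prediction-error variance
    w = np.linspace(0.0, np.pi, n_freq)
    denom = np.abs(1.0 - np.exp(-1j * np.outer(w, np.arange(1, order + 1))) @ a) ** 2
    return sigma2 / denom

def distortion_indicator(P_ref, P_rec):
    """Itakura-Saito-style divergence between two spectra; zero iff identical."""
    ratio = P_rec / P_ref
    return float(np.mean(ratio - np.log(ratio) - 1.0))

rng = np.random.default_rng(1)
clean = rng.normal(size=4000)
# toy "communication channel": a first-order low-pass coloration
received = np.convolve(clean, [1.0, 0.6], mode="same")
d = distortion_indicator(ar_spectrum(clean), ar_spectrum(received))
print(d > 0.0)   # positive divergence signals spectral distortion
```

In a pre-distortion scheme, the correction filter would be chosen so that this divergence between the received spectrum and the enrolled voice template is driven toward zero.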


2018 ◽  
Vol 294 (5) ◽  
pp. 1753-1762 ◽  
Author(s):  
Jacques-Alexandre Sepulchre ◽  
Sylvie Reverchon ◽  
Jean-Luc Gouzé ◽  
William Nasser

In the quest for a sustainable economy of the Earth's resources and for renewable sources of energy, a promising avenue is to exploit the vast quantity of polysaccharide molecules contained in green wastes. To that end, the decomposition of pectin appears to be an interesting target because this polymeric carbohydrate is abundant in many fruit pulps and soft vegetables. To quantitatively study this degradation process, here we designed a bioreactor that is continuously fed with de-esterified pectin (PGA). Thanks to the pectate lyases produced by bacteria cultivated in the vessel, the PGA is depolymerized into oligogalacturonates (UGA), which are continuously extracted from the tank. A mathematical model of our system predicted that the conversion efficiency of PGA into UGA increases with the dilution coefficient until reaching an upper limit at which the fraction of UGA extracted from the bioreactor is maximized. Results from experiments with a continuous reactor hosting a strain of the plant-pathogenic bacterium Dickeya dadantii, in which the dilution coefficients were varied, quantitatively validated the predictions of our model. A further theoretical analysis of the system enabled an a priori comparison of the efficiency of eight other pectate lyase–producing microorganisms with that of D. dadantii. Our findings suggest that D. dadantii is the most efficient microorganism and therefore the best candidate for a practical implementation of our scheme for the bioproduction of UGA from PGA.
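The qualitative prediction (the extracted UGA fraction rises with the dilution rate, up to the washout limit) can be illustrated with a generic chemostat-style sketch. This is not the authors' model; the equations and every parameter value below are invented for illustration only.

```python
import numpy as np
from scipy.integrate import solve_ivp

# States: X = biomass, S = PGA (fed substrate), P = UGA (product, which the
# bacteria also consume as a carbon source). D is the dilution rate.
mu_max, K, Y, k, S_in = 1.0, 0.5, 0.5, 2.0, 10.0

def rhs(t, y, D):
    X, S, P = y
    mu = mu_max * P / (K + P)            # growth on UGA
    dX = mu * X - D * X
    dS = D * (S_in - S) - k * X * S      # pectate-lyase depolymerization ~ k*X*S
    dP = k * X * S - mu * X / Y - D * P
    return [dX, dS, dP]

def extracted_fraction(D):
    """Steady-state fraction of produced UGA that leaves the tank."""
    sol = solve_ivp(rhs, (0.0, 500.0), [0.1, S_in, 0.1], args=(D,), rtol=1e-8)
    X, S, P = sol.y[:, -1]
    return D * P / (k * X * S)           # UGA outflow / UGA production

lo_D, hi_D = extracted_fraction(0.1), extracted_fraction(0.6)
print(lo_D, hi_D)   # a larger dilution rate flushes out a larger UGA fraction
```

At a low dilution rate most UGA is consumed by the bacteria before extraction; raising D flushes more of it out, until D approaches the maximum growth rate and the culture washes out.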


2014 ◽  
Vol 14 (23) ◽  
pp. 12897-12914 ◽  
Author(s):  
J. S. Wang ◽  
S. R. Kawa ◽  
J. Eluszkiewicz ◽  
D. F. Baker ◽  
M. Mountain ◽  
...  

Abstract. Top–down estimates of the spatiotemporal variations in emissions and uptake of CO2 will benefit from the increasing measurement density brought by recent and future additions to the suite of in situ and remote CO2 measurement platforms. In particular, the planned NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) satellite mission will provide greater coverage in cloudy regions, at high latitudes, and at night than passive satellite systems, as well as high precision and accuracy. In a novel approach to quantifying the ability of satellite column measurements to constrain CO2 fluxes, we use a portable library of footprints (surface influence functions) generated by the Stochastic Time-Inverted Lagrangian Transport (STILT) model in combination with the Weather Research and Forecasting (WRF) model in a regional Bayesian synthesis inversion. The regional Lagrangian particle dispersion model framework is well suited to make use of ASCENDS observations to constrain weekly fluxes in North America at a high resolution, in this case at 1° latitude × 1° longitude. We consider random measurement errors only, modeled as a function of the mission and instrument design specifications along with realistic atmospheric and surface conditions. We find that the ASCENDS observations could potentially reduce flux uncertainties substantially at biome and finer scales. At the grid scale and weekly resolution, the largest uncertainty reductions, on the order of 50%, occur where and when there is good coverage by observations with low measurement errors and the a priori uncertainties are large. Uncertainty reductions are smaller for a 1.57 μm candidate wavelength than for a 2.05 μm wavelength, and are smaller for the higher of the two measurement error levels that we consider (1.0 ppm vs. 0.5 ppm clear-sky error at Railroad Valley, Nevada). 
Uncertainty reductions at the annual biome scale range from ~40% to ~75% across our four instrument design cases and from ~65% to ~85% for the continent as a whole. Tests suggest that the quantitative results are moderately sensitive to assumptions regarding a priori uncertainties and boundary conditions. The a posteriori flux uncertainties we obtain, ranging from 0.01 to 0.06 Pg C yr−1 across the biomes, would meet requirements for improved understanding of long-term carbon sinks suggested by a previous study.
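The structure of such an uncertainty-reduction calculation can be sketched with a toy Bayesian synthesis inversion. The dimensions, footprint values, and prior below are invented; only the 0.5 vs. 1.0 ppm error contrast mirrors the text. With observation operator H (the footprints), prior flux covariance B, and measurement error covariance R, the posterior covariance is A = (B^-1 + H^T R^-1 H)^-1, and the fractional uncertainty reduction per flux element is 1 - sqrt(diag(A)/diag(B)).

```python
import numpy as np

rng = np.random.default_rng(0)
n_flux, n_obs = 20, 100
H = 0.1 * rng.random((n_obs, n_flux))   # toy footprint (Jacobian) matrix
B = np.eye(n_flux)                      # prior flux error covariance
mean_reduction = {}
for err in (0.5, 1.0):                  # clear-sky measurement error levels (ppm)
    R = err**2 * np.eye(n_obs)          # measurement error covariance
    A = np.linalg.inv(np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H)
    mean_reduction[err] = float(np.mean(1.0 - np.sqrt(np.diag(A) / np.diag(B))))
print(mean_reduction)   # the 0.5 ppm case gives the larger mean reduction
```

As in the text, lower measurement errors and larger prior uncertainties both increase the achievable reduction; real inversions differ mainly in the size and structure of H, B, and R.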


Author(s):  
Oleh Turenko

This article examines Foucault’s interpretation of the police: its theoretical substantiation and the range of powers and managerial tasks assigned to it in modernist discourses. The French philosopher emphasized that the modern concept of “police” does not coincide with its original theorization in early modernity. The doctrines of early modern political theorists idealized the vocation of the police and identified it with government as a whole, endowing it with universal means of implementing the state interest. Considering the police from the perspective of the “history of thought,” Foucault notes that it was precisely the unlimited nature of police functions that allowed modern government to establish a disciplinary society and a new form of government, bio-power. This form of power totally controlled the individual, “took care of him” at all levels of biological life and, above all, in the depths of consciousness: it artificially created his authenticity. At the same time, in the theories of these political writers the police received the status of a self-regulatory body whose activities were not strictly controlled by state laws. In this sense the police, in the imaginary of the period, is the living embodiment of state interest, morality and integrity, the formative and corrective organ of state power. In order to form a disciplined and productive life, the police must direct individuals toward regulation, toward temporal and hierarchical repetition. The a priori qualities of the police and its all-encompassing powers form the basis for the assertion of the idea of a “police state” and of its radical form, the panopticon. It is thanks to the idea of the panopticon and its practical implementation by the police that modern society formed the disciplinary practice of continuous control in its social institutions.


Author(s):  
A.A. Kuzmitsky ◽  
M.I. Truphanov ◽  
O.B. Tarasova ◽  
D.V. Fedosenko

One of the key tasks in the fast identification of powerful tropical hurricanes and the assessment of their growing power is the formation of an input dataset built from data that can be recorded and computed easily and accurately from existing, publicly available sources. The presented work is based on the analysis of satellite images as the main data source and on weather data as an auxiliary one. An obvious advantage of satellite images over other sources of data on weather conditions is their high spatial resolution, as well as the ability to obtain data from several satellites, which increases the timeliness and accuracy of retrieving primary information. The developed approach consists in performing the following interconnected, iteratively executed groups of subtasks: calculation of feature points describing the location of individual cloud areas at different points in time using different descriptors; matching of the same cloud areas at specified times to analyze the local directions of cloud movement; tracking of cloudiness over specified time intervals; calculation of local features at selected cloudiness points to recognize the origin of turbulence and analyze it; formation of the dynamics of changes in the local area near a point's trajectory; recognition of the primary features characterizing the transformation of local turbulence into a stable vortex formation; identification of the signs of a growing hurricane and assessment of the primary dynamics of the increase in its power; and generalization and refinement of a priori given features by analyzing similar features of known cyclones. To detect the points, a modified search algorithm has been introduced.
To describe the points, additional descriptors are introduced based on the normalized gradient measured over the neighborhood of adjacent points and varying cyclically in the polar coordinate system. A comparative analysis of the created method and algorithm against known similar solutions revealed the following distinctive features: the introduction of additional invariant feature orientations when describing characteristic points and greater stability in detecting those points during cloudiness analysis; the identification of cloudiness turbulence and analysis of changes in its local characteristics and movement parameters; and the formation of a set of generalizing distributions over a set of moving points for the subsequent recognition of the signs of a hurricane at the initial stages of its formation. The developed approach was tested experimentally on video recordings of hurricanes and their movement in the Atlantic region for the period from 2010 to 2020. The developed general approach and a specific algorithm for estimating hurricane parameters based on cloud analysis are presented. The approach is suitable for practical implementation and allows data to be accumulated for detecting hurricanes in real time from publicly available data, supporting the development of a physical and mathematical model.
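The core loop of feature-point detection, descriptor matching, and local motion estimation can be sketched minimally as follows. The gradient-magnitude detector and mean-normalized patch descriptor below are simplified stand-ins for the modified detector and cyclic polar-gradient descriptors the text describes; the synthetic cloud field and its (2, 3) pixel shift are invented test data.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def feature_points(img, k=20, margin=8):
    """Pick the k strongest gradient-magnitude points away from the border."""
    gy, gx = np.gradient(img)
    mag = np.hypot(gx, gy)
    mag[:margin, :] = 0; mag[-margin:, :] = 0
    mag[:, :margin] = 0; mag[:, -margin:] = 0
    idx = np.argsort(mag.ravel())[-k:]
    return np.column_stack(np.unravel_index(idx, img.shape))

def descriptor(img, p, r=4):
    """Mean-normalized patch around point p (a stand-in for richer descriptors)."""
    patch = img[p[0] - r:p[0] + r, p[1] - r:p[1] + r]
    return (patch - patch.mean()).ravel()

def median_displacement(img0, img1):
    """Nearest-descriptor matching between frames; the median match offset
    approximates the dominant local cloud motion."""
    pts0, pts1 = feature_points(img0), feature_points(img1)
    shifts = []
    for p in pts0:
        d0 = descriptor(img0, p)
        costs = [np.sum((d0 - descriptor(img1, q)) ** 2) for q in pts1]
        shifts.append(pts1[int(np.argmin(costs))] - p)
    return np.median(np.asarray(shifts), axis=0)

rng = np.random.default_rng(0)
frame0 = gaussian_filter(rng.random((64, 64)), sigma=2.0)  # synthetic cloud field
frame1 = np.roll(frame0, (2, 3), axis=(0, 1))              # "advected" by (2, 3) px
print(median_displacement(frame0, frame1))                  # recovers roughly (2, 3)
```

Accumulating such displacement fields over time, rather than a single global median, is what would reveal rotational (vortex-forming) motion patterns.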


Author(s):  
Rinat Galiautdinov

In this article, the author considers the possibility of applying modern information technologies to implement information processing algorithms in a UAV motion control system. Filtering of the coordinates and motion parameters of objects under a priori uncertainty is carried out using nonlinear adaptive filters: the Kalman and Bayesian filters. The author considers numerical methods for the digital implementation of nonlinear filters based on the convolution of functions, the potential of neural networks and fuzzy logic for solving the problems of tracking UAV objects (or missiles), the mathematical model of the dynamics, and the features of the practical implementation of state estimation algorithms in the presence of additional degrees of freedom. The considered algorithms are oriented toward solving these problems in real time using parallel and cloud computing.
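As a baseline for the filters discussed, here is a minimal linear Kalman filter tracking a constant-velocity target from noisy position measurements. The motion model, noise levels, and trajectory are invented for illustration; the adaptive and nonlinear variants in the article build on this same predict-update cycle.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict-update cycle of a linear Kalman filter."""
    x = F @ x                          # predict state
    P = F @ P @ F.T + Q                # predict covariance
    y = z - H @ x                      # innovation (measurement residual)
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity motion model
H = np.array([[1.0, 0.0]])             # position-only measurements
Q = 1e-4 * np.eye(2)                   # process noise covariance
R = np.array([[0.25]])                 # measurement noise covariance
rng = np.random.default_rng(0)
x, P = np.zeros(2), np.eye(2)
true_pos, true_vel = 0.0, 1.0
for _ in range(200):
    true_pos += true_vel * dt
    z = np.array([true_pos + rng.normal(0.0, 0.5)])
    x, P = kalman_step(x, P, z, F, H, Q, R)
print(x)   # estimated [position, velocity], close to [20, 1]
```

The filter recovers the unobserved velocity from position measurements alone; Bayesian and convolution-based nonlinear filters generalize this recursion to non-Gaussian uncertainty.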


2015 ◽  
Vol 42 (7) ◽  
pp. 490-502 ◽  
Author(s):  
Hediye Tuydes-Yaman ◽  
Oruc Altintasi ◽  
Nuri Sendil

Intersection movements carry more disaggregate information about origin–destination (O–D) flows than link counts in a traffic network. In this paper, a mathematical formulation is presented for O–D matrix estimation using intersection counts, based on an existing linear programming model employing link counts. The proposed model estimates static O–D flows for uncongested networks assuming no a priori information on the O–D matrix. Both models were tested on two hypothetical networks previously used in O–D matrix studies to monitor their performance under various numbers of count locations and measurement errors. Two new measures were proposed to evaluate the characteristics of O–D flow estimation using traffic counts. While both the link count based and intersection count based models performed equally well under the complete data collection assumption, the intersection count based formulation estimated the O–D flows more successfully as the number of observation locations decreased. The results of the 30 measurement error scenarios also revealed that it performs more robustly than the link count based model and thus better estimates the O–D flows.
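The family of linear programming formulations referred to here can be illustrated with a toy instance (the network, incidence matrix, and flows below are invented, and the exact objective of the paper's model may differ): maximize total O–D flow subject to the assigned flows not exceeding the observed counts.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: two O-D pairs, three counted intersection movements.
# A[i, j] = 1 if O-D flow j passes through counted movement i.
A = np.array([[1, 0],
              [1, 1],
              [0, 1]], dtype=float)
q_true = np.array([100.0, 50.0])
counts = A @ q_true                 # error-free movement counts

# LP: maximize sum(q) subject to A q <= counts, q >= 0
# (linprog minimizes, so negate the objective)
res = linprog(c=[-1.0, -1.0], A_ub=A, b_ub=counts, bounds=(0, None))
print(res.x)   # recovers the true O-D flows [100., 50.]
```

With error-free counts and enough independent counting locations the LP recovers the flows exactly; measurement errors and fewer locations enlarge the feasible region, which is where the intersection count formulation's extra disaggregation pays off.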


Author(s):  
Rahmat Hidayat ◽  
I. Nyoman Budiantara ◽  
Bambang W. Otok ◽  
Vita Ratnasari

2020 ◽  
Vol 2020 ◽  
pp. 1-10
Author(s):  
Wengui Mao ◽  
Chaoliang Hu ◽  
Jianhua Li ◽  
Zhonghua Huang ◽  
Guiping Liu

As a kind of rotor system, the electric spindle system is the core component of the precision grinding machine. Vibration caused by mass imbalance is the main source of grinding machine vibration, and identifying the eccentricity parameters of the electric spindle system is a key step in eliminating it. It is difficult for engineers to estimate the approximate range of the eccentricity from experience; that is, a priori information about the eccentricity is hard to obtain. At the same time, uncertain factors arising from the geometric characteristics of the electric spindle system, material properties, and the randomness of the measured response are likely, even when small, to cause large deviations in the eccentricity identification results. The search algorithm used in the maximum likelihood method to identify the eccentricity parameters is computationally intensive, and the sensitivity of the iterative process raises numerical problems. This paper introduces an Advance-Retreat Method (ARM) for the search interval into the maximum likelihood method: the unknown-parameter increment obtained by the maximum likelihood method is used as the step size in each iteration, and the Advance-Retreat Method adjusts the next design point so that the objective function value decreases monotonically. Identification results under three kinds of measurement errors show that the improved method strengthens the identification performance of the maximum likelihood method, reduces the influence of uncertainty on the results, and achieves satisfactory robustness.
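The Advance-Retreat Method itself is a classical one-dimensional bracketing procedure: step in one direction, reverse if the objective rises, and expand the step until the function turns upward, which brackets a minimum. A generic sketch (independent of the paper's likelihood objective; the quadratic test function is invented):

```python
def advance_retreat(f, x0, h0=0.1, expand=2.0, max_iter=50):
    """Advance-Retreat Method: bracket a local minimum of a unimodal f.
    Returns an interval (a, b) containing the minimizer."""
    x1, f1 = x0, f(x0)
    x2, f2 = x0 + h0, f(x0 + h0)
    h = h0
    if f2 > f1:                      # going uphill: retreat (reverse direction)
        x1, x2 = x2, x1
        f1, f2 = f2, f1
        h = -h0
    for _ in range(max_iter):
        h *= expand                  # advance with an expanding step
        x3, f3 = x2 + h, f(x2 + h)
        if f3 > f2:                  # function turned upward: minimum bracketed
            return (min(x1, x3), max(x1, x3))
        x1, f1, x2, f2 = x2, f2, x3, f3
    raise RuntimeError("no bracket found")

a, b = advance_retreat(lambda x: (x - 3.0) ** 2, x0=0.0)
print(a < 3.0 < b)   # True: the returned interval brackets the minimizer at 3
```

In the paper's scheme, the maximum-likelihood parameter increment supplies the initial step h0, and the bracketing keeps each iteration on a descending stretch of the objective.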

