AUTOMATIC ESTIMATION OF ADDITIVE NOISE VARIANCE FOR NOISE-CORRUPTED SIGNALS IN THE DCT DOMAIN

2018 ◽  
pp. 44-53
Author(s):  
Artem Yurievich Kharkov ◽  
Vladimir Vasilyevich Lukin

In many practical situations, signals are corrupted by noise, and filtering is applied to remove it. Most modern filters require the noise variance to be known a priori or pre-estimated in a blind manner in the presence of a signal component. Requirements for blind estimation methods are formulated, and it is shown that they are difficult to satisfy. Based on the adopted additive white Gaussian noise model, a method operating in the DCT domain is considered and thoroughly studied. The choice of test signals is motivated. Local estimates obtained on blocks of different sizes are studied, and it is demonstrated that these local estimates, although based on robust estimates of data scale, can be considerably influenced by the signal component, which produces a certain percentage of large-amplitude DCT coefficients in the data sample. It is then shown that such abnormal local estimates have to be rejected (or their influence on the final estimate minimized). This is done by robust processing of the local estimates. It is established that block size considerably influences accuracy, characterized by the bias of the estimates and their variance. The role of bias is dominant (the noise standard deviation is overestimated), and the main task is to decrease it. According to experiments carried out for ten variants (parameter sets) of the estimation method, the best results are, on average, obtained if the block size equals 32 and the local estimates are processed using the sample median. Computational efficiency is analyzed, and it is shown that processing can be done quite quickly. This allows expecting real-time implementation for such applications as electrocardiogram and speech processing.
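The block-wise scheme described above can be sketched as follows. This is a minimal illustration rather than the authors' exact algorithm: it assumes a robust local scale estimate (median absolute deviation) computed on the higher-frequency DCT coefficients of each block, with the sample median over blocks suppressing abnormal local estimates; the function name and the fraction of coefficients skipped are illustrative choices.

```python
import numpy as np
from scipy.fft import dct

def estimate_noise_std(signal, block_size=32):
    """Blind estimate of additive-noise standard deviation in the DCT domain.

    Within each non-overlapping block, higher-frequency DCT coefficients are
    assumed to be noise-dominated; a robust scale estimate (MAD) gives a local
    sigma, and the sample median over blocks rejects abnormal local estimates
    inflated by the signal component.
    """
    n_blocks = len(signal) // block_size
    local = []
    for k in range(n_blocks):
        block = signal[k * block_size:(k + 1) * block_size]
        coeffs = dct(block, type=2, norm='ortho')  # orthonormal: preserves variance
        # Skip the lowest-frequency quarter, where signal energy concentrates.
        hf = coeffs[block_size // 4:]
        # MAD scaled for Gaussian consistency (0.6745 = Phi^{-1}(0.75)).
        local.append(np.median(np.abs(hf)) / 0.6745)
    # Sample median over blocks minimizes the influence of abnormal estimates.
    return float(np.median(local))
```

With a low-frequency test signal plus Gaussian noise of standard deviation 0.1, the estimate lands close to 0.1, since the signal contributes little energy to the retained high-frequency coefficients.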

2015 ◽  
Vol 2015 ◽  
pp. 1-13
Author(s):  
Jianwei Ding ◽  
Yingbo Liu ◽  
Li Zhang ◽  
Jianmin Wang

Condition monitoring systems are widely used to monitor the working condition of equipment, generating a vast amount and variety of telemetry data in the process. The main task of surveillance is to analyze these routinely collected telemetry data to help assess the working condition of the equipment. However, with the rapid increase in the volume of telemetry data, it is a nontrivial task to analyze all of them to understand the working condition of the equipment without any a priori knowledge. In this paper, we propose a probabilistic generative model called the working condition model (WCM), which is capable of simulating the process by which event sequence data are generated and of depicting the working condition of equipment at runtime. With the help of WCM, we are able to analyze how event sequence data behave in different working modes and, at the same time, to detect the working mode of an event sequence (working condition diagnosis). Furthermore, we have applied WCM to illustrative applications such as automated detection of anomalous event sequences during equipment runtime. Our experimental results on real data sets demonstrate the effectiveness of the model.
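The abstract does not spell out the WCM's internals, so the following sketch uses a deliberately simple stand-in: each working mode is modeled as a first-order Markov chain over event types, and diagnosis picks the mode with the highest sequence log-likelihood. All function names are hypothetical; the real WCM is a richer generative model.

```python
import numpy as np

def fit_markov(sequences, n_events, alpha=1.0):
    """Estimate a Laplace-smoothed transition matrix from event sequences."""
    counts = np.full((n_events, n_events), alpha)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a, b] += 1.0
    return counts / counts.sum(axis=1, keepdims=True)

def log_likelihood(seq, trans):
    """Log-probability of a sequence's transitions under one mode model."""
    return float(sum(np.log(trans[a, b]) for a, b in zip(seq, seq[1:])))

def diagnose(seq, mode_models):
    """Working-condition diagnosis: pick the mode that best explains seq."""
    return int(np.argmax([log_likelihood(seq, m) for m in mode_models]))
```

A low log-likelihood under every fitted mode can likewise serve as a simple anomaly score for an event sequence, in the spirit of the anomalous-sequence detection application mentioned above.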


Geophysics ◽  
1987 ◽  
Vol 52 (9) ◽  
pp. 1211-1228 ◽  
Author(s):  
Peter Mora

The treatment of multioffset seismic data as an acoustic wave field is becoming increasingly disturbing to many geophysicists who see a multitude of wave phenomena, such as amplitude-offset variations and shear-wave events, which can only be explained by using the more correct elastic wave equation. Not only are such phenomena ignored by acoustic theory, but they are also treated as undesirable noise when they should be used to provide extra information, such as S-wave velocity, about the subsurface. The problems of using the conventional acoustic wave equation approach can be eliminated via an elastic approach. In this paper, equations have been derived to perform an inversion for P-wave velocity, S-wave velocity, and density as well as the P-wave impedance, S-wave impedance, and density. These are better resolved than the Lamé parameters. The inversion is based on nonlinear least squares and proceeds by iteratively updating the earth parameters until a good fit is achieved between the observed data and the modeled data corresponding to these earth parameters. The iterations are based on the preconditioned conjugate gradient algorithm. The fundamental requirement of such a least-squares algorithm is the gradient direction which tells how to update the model parameters. The gradient direction can be derived directly from the wave equation and it may be computed by several wave propagations. Although in principle any scheme could be chosen to perform the wave propagations, the elastic finite-difference method is used because it directly simulates the elastic wave equation and can handle complex, and thus realistic, distributions of elastic parameters. This method of inversion is costly since it is similar to an iterative prestack shot-profile migration. However, it has greater power than any migration since it solves for the P-wave velocity, S-wave velocity, and density and can handle very general situations including transmission problems.
Three main weaknesses of this technique are that it requires fairly accurate a priori knowledge of the low-wavenumber velocity model, it assumes Gaussian model statistics, and it is very computer-intensive. All these problems seem surmountable. The low-wavenumber information can be obtained either by a prior tomographic step, by the conventional normal-moveout method, by a priori knowledge and empirical relationships, or by adding an additional inversion step for low wavenumbers to each iteration. The Gaussian statistics can be altered by preconditioning the gradient direction, perhaps to make the solution blocky in appearance like well logs, or by using large model variances in the inversion to reduce the effect of the Gaussian model constraints. Moreover, with some improvements to the algorithm and more parallel computers, it is hoped the technique will soon become routinely feasible.
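The iterative least-squares structure described above (forward-model the data, form the residual, compute the gradient, update the earth parameters) can be sketched in a toy form. Here a linear operator stands in for elastic finite-difference wave propagation, and plain gradient descent stands in for the preconditioned conjugate-gradient step; the names and the step size are illustrative.

```python
import numpy as np

def invert(d_obs, forward, grad, m0, n_iter=200, step=0.1):
    """Iteratively update model parameters m to fit observed data d_obs.

    forward(m) plays the role of wave-equation modeling; grad(m, r) returns
    the gradient direction derived from the data residual r.
    """
    m = np.array(m0, dtype=float)
    for _ in range(n_iter):
        r = forward(m) - d_obs   # misfit between modeled and observed data
        m -= step * grad(m, r)   # steepest-descent model update
    return m
```

In the full elastic problem the gradient is obtained by correlating forward- and back-propagated wavefields, which is why each iteration costs several wave propagations, as the abstract notes.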


Author(s):  
V. N. Evdokimenkov ◽  
R. V. Kim ◽  
M. N. Krasilshchikov ◽  
N. I. Selvesyuk

In this article, we analyze the modern concepts in the field of aeronautical equipment integrated logistical support (ILS). The key element of the traditional logistical support system under consideration is the data on detected failures and malfunctions, recorded in the air flight and maintenance log (AFML), chart-orders, and non-routine write-ups, and accumulated within the structure of the logistic support analysis database. We propose a method for expanding the ILS capabilities by including an additional element, called the flight information database, in the logistics center structure, along with the traditional database for analyzing the logistical support. This database grows continuously during aircraft operation. It also contains the values of the parameters recorded by the standard onboard flight data recorder, which reflect the state of the onboard systems. The inclusion of a flight information database in the structure of the logistical support center makes it possible to implement the probability-guaranteeing estimation method with respect to the risks associated with the aircraft technical condition, for the benefit of the integrated logistical support. The proposed method uses an inverse probabilistic criterion (quantile) as an integral characteristic of the technical condition of the aircraft systems. This is fully consistent with modern approaches to organizing condition-based maintenance. Among these approaches, the data-driven methodology (DDM) has the greatest potential and practical efficiency. The practical value of the described method lies in the fact that its implementation needs neither a priori information about the principles of operation of the maintained equipment, nor information about the functioning principles of the onboard controller network used to control the equipment's physical parameters.
In this article, we also present the accuracy estimates of forecasting the residual life of an aircraft gas turbine engine, using the proposed method. These estimates are based on the actual flight data presented in the National Aeronautics and Space Administration (NASA) repository.
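The inverse probabilistic (quantile) criterion mentioned above can be illustrated with a minimal sketch: given a sample of residual-life predictions, the gamma-guaranteed residual life is the value that the true residual life exceeds with probability gamma. The function name and the empirical-quantile formulation are assumptions for illustration, not the authors' exact estimator.

```python
import numpy as np

def guaranteed_residual_life(samples, gamma=0.95):
    """Probability-guaranteeing (quantile) estimate of residual life.

    Returns the lower (1 - gamma)-quantile of the prediction sample, i.e.
    the residual life exceeded with probability gamma.
    """
    return float(np.quantile(np.asarray(samples, dtype=float), 1.0 - gamma))
```

For example, with predictions distributed around 100 flight hours with a spread of 10 hours, the 0.95-guaranteed residual life falls noticeably below the mean, reflecting the risk-averse nature of the criterion.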


2019 ◽  
Vol 141 (11) ◽  
Author(s):  
Yijun Li ◽  
Taehyun Shim ◽  
Dexin Wang ◽  
Timothy Offerle

The rack force is valuable information for a vehicle dynamics control system, as it relates closely to the road conditions and steering feel. Since there is no direct measurement of rack force in current steering systems, various rack force estimation methods have been proposed to obtain the rack force information. In order to get an accurate rack force estimate, it is important to have knowledge of the steering system friction. However, it is hard to have an accurate value of friction, as it is subject to variation due to operating conditions and material wear. Especially for the widely used column-assisted electric power steering (C-EPAS) system, the load-dependent characteristic of its worm gear friction has a significant effect on rack force estimation. In this paper, a rack force estimation method using a Kalman filter and a load-dependent friction estimation algorithm is introduced, and the effect of C-EPAS friction on rack force estimator performance is investigated. Unlike other rack force estimation methods, which assume that friction is known a priori, the proposed system uses a load-dependent friction estimation algorithm to determine accurate friction information in the steering system, and then the rack force is estimated using the relationship between steering torque and angle. The effectiveness of the proposed method is verified by CarSim/Simulink co-simulation.
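The general pattern of estimating an unmeasured force with a Kalman filter can be sketched as follows. This is a simplified stand-in for the paper's estimator: a 1-DOF model in which the unknown force enters as an input, is modeled as a random walk, and is appended to the state; the model matrices in the usage below are illustrative, not the paper's steering model.

```python
import numpy as np

def kalman_step(x, P, u, z, A, B, H, Q, R):
    """One predict/update cycle of a linear Kalman filter."""
    # Predict state and covariance through the model dynamics.
    x = A @ x + B @ u
    P = A @ P @ A.T + Q
    # Update with the measurement z.
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

Running the filter on a toy plant with state [velocity, unknown force], where only velocity is measured, the augmented force state converges to the true disturbance after a few hundred steps.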


2019 ◽  
Vol 12 (7) ◽  
pp. 3943-3961 ◽  
Author(s):  
Ali Jalali ◽  
Shannon Hicks-Jalali ◽  
Robert J. Sica ◽  
Alexander Haefele ◽  
Thomas von Clarmann

Abstract. Lidar retrievals of atmospheric temperature and water vapor mixing ratio profiles using the optimal estimation method (OEM) typically use a retrieval grid with a number of points larger than the number of pieces of independent information obtainable from the measurements. Consequently, retrieved geophysical quantities contain some information from their respective a priori values or profiles, which can affect the results in the higher altitudes of the temperature and water vapor profiles due to decreasing signal-to-noise ratios. The extent of this influence can be estimated using the retrieval's averaging kernels. The removal of formal a priori information from the retrieved profiles in the regions of prevailing a priori effects is desirable, particularly when these greatest heights are of interest for scientific studies. We demonstrate here that removal of a priori information from OEM retrievals is possible by repeating the retrieval on a coarser grid where the retrieval is stable even without the use of formal prior information. The averaging kernels of the fine-grid OEM retrieval are used to optimize the coarse retrieval grid. We demonstrate the adequacy of this method for the case of a large power-aperture Rayleigh scatter lidar nighttime temperature retrieval and for a Raman scatter lidar water vapor mixing ratio retrieval during both day and night.


2005 ◽  
Vol 5 (6) ◽  
pp. 1665-1677 ◽  
Author(s):  
A. von Engeln ◽  
G. Nedoluha

Abstract. The Optimal Estimation Method is used to retrieve temperature and water vapor profiles from simulated radio occultation measurements in order to assess how different retrieval schemes may affect the assimilation of these data. High-resolution ECMWF global fields are used by a state-of-the-art radio occultation simulator to provide quasi-realistic bending angle and refractivity profiles. Both types of profiles are used in the retrieval process to assess their advantages and disadvantages. The impact of the GPS measurement is expressed as an improvement over the a priori knowledge (taken from a 24-hour-old analysis). Large improvements are found for temperature in the upper troposphere and lower stratosphere. Only very small improvements are found in the lower troposphere, where water vapor is present. Water vapor improvements are only significant between about 1 and 7 km. No pronounced difference is found between retrievals based upon bending angles or refractivity. Results are compared to idealized retrievals, where the atmosphere is spherically symmetric and instrument noise is not included. Comparing idealized to quasi-realistic calculations shows that the main impact of a ray tracing algorithm can be expected for low-latitude water vapor, where the horizontal variability is high. We also address the effect of altitude correlations in the temperature and water vapor. Overall, we find that water vapor and temperature retrievals using bending angle profiles are more CPU-intensive than refractivity profiles, but that they do not provide significantly better results.
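The "improvement over the a priori" reported above can be quantified in a minimal linear-OEM sketch: the posterior covariance follows from the Jacobian and the noise and prior covariances, and the per-level improvement compares posterior to prior error. The exact improvement metric used by the authors is not given in the abstract, so this ratio-based definition is an illustrative assumption.

```python
import numpy as np

def improvement_over_prior(K, Se, Sa):
    """Per-level reduction of error relative to the a priori.

    S_hat = (K^T Se^-1 K + Sa^-1)^-1 is the OEM posterior covariance;
    the result is 1 - sigma_posterior / sigma_prior for each level.
    """
    S_hat = np.linalg.inv(K.T @ np.linalg.inv(Se) @ K + np.linalg.inv(Sa))
    return 1.0 - np.sqrt(np.diag(S_hat) / np.diag(Sa))
```

Levels where the measurement constrains the state well (small effective noise) show improvements near one, while levels the measurement barely senses stay near zero, mirroring the altitude-dependent behavior described for temperature and water vapor.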

