Signal Processing and the Design of Microarray Time-Series Experiments

2007 ◽  
pp. 75-94
Author(s):  
Robert R. Klevecz ◽  
Caroline M. Li ◽  
James L. Bolen

Mathematics ◽  
2018 ◽  
Vol 6 (7) ◽  
pp. 124 ◽  
Author(s):  
Elena Barton ◽  
Basad Al-Sarray ◽  
Stéphane Chrétien ◽  
Kavya Jagan

In this note, we present a component-wise algorithm combining several recent ideas from signal processing for the simultaneous decomposition of dynamical time series into piecewise constant trend, seasonality, outliers, and noise. Our approach is based entirely on convex optimisation, and the resulting decomposition is guaranteed to be a global optimum. We demonstrate the efficiency of the approach via simulation results and real data analysis.
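
A minimal sketch of the kind of convex decomposition described above (not the authors' exact algorithm), assuming the cvxpy package is available; the variable names, penalty weights, and periodicity constraints are illustrative choices:

```python
import numpy as np
import cvxpy as cp

def decompose(y, period, lam_trend=10.0, lam_outlier=1.0):
    """Split y into piecewise constant trend + periodic seasonal + sparse outliers + noise."""
    n = len(y)
    trend = cp.Variable(n)      # piecewise constant trend
    seasonal = cp.Variable(n)   # periodic (seasonal) component
    outlier = cp.Variable(n)    # sparse spikes

    residual = y - trend - seasonal - outlier
    objective = cp.Minimize(
        cp.sum_squares(residual)                 # Gaussian noise term
        + lam_trend * cp.norm1(cp.diff(trend))   # total variation -> piecewise constant trend
        + lam_outlier * cp.norm1(outlier)        # L1 penalty -> sparse outliers
    )
    constraints = [
        seasonal[period:] == seasonal[:-period],  # strict periodicity
        cp.sum(seasonal[:period]) == 0,           # identifiability of trend vs. seasonal
    ]
    cp.Problem(objective, constraints).solve()
    return trend.value, seasonal.value, outlier.value
```

Because the objective and constraints are convex, any solution returned by the solver is a global optimum, which is the guarantee the abstract refers to.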


Author(s):  
Knox T. Millsaps ◽  
Gustave C. Dahl ◽  
Daniel E. Caguiat ◽  
Jeffrey S. Patterson

This paper presents an analysis of data taken from several stall initiation events on a GE LM-2500 gas turbine engine. Specifically, the time series of three separate pressure signals located at compressor stages 3, 6, and 15 were analyzed using various signal processing methods to determine the most reliable indicator of incipient stall for this engine. The spectral analyses showed that rotating precursor waves traveling around the annulus at approximately half the rotor speed were the best indicators. Nonlinear chaotic time-series analyses were also used to predict stall, but they were not as reliable. Several algorithms were evaluated, and it was determined that stall wave perturbations can be reliably identified about 900 revolutions prior to the stall. This work indicates that a single pressure signal located at stage 3 on an LM-2500 gas turbine is sufficient to provide an advance warning of more than 2 seconds before the fully developed stall event.
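
As a hedged illustration of the spectral approach described above (not the paper's own algorithm), the sketch below tracks power in a narrow band around half the rotor frequency over successive windows of a pressure signal, assuming scipy is available; the window length, band width, and function names are made up for the example:

```python
import numpy as np
from scipy.signal import welch

def precursor_band_power(pressure, fs, rotor_hz, window_s=0.1, rel_band=0.1):
    """Per analysis window, return the spectral power near 0.5 x rotor speed."""
    win = int(window_s * fs)
    target = 0.5 * rotor_hz                      # precursor waves rotate at ~half rotor speed
    powers = []
    for start in range(0, len(pressure) - win, win):
        f, pxx = welch(pressure[start:start + win], fs=fs, nperseg=win // 4)
        mask = (f > target * (1 - rel_band)) & (f < target * (1 + rel_band))
        powers.append(pxx[mask].sum())           # energy in the precursor band
    return np.array(powers)
```

A sustained rise in this band power ahead of stall would correspond to the rotating precursor waves reported as the most reliable indicator.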


Author(s):  
Jesús Bernardino Alonso Hernández ◽  
Patricia Henríquez Rodríguez

The field of nonlinear signal characterization and nonlinear signal processing has attracted a growing number of researchers over the past three decades. This stems from the fact that linear techniques have limitations in certain areas of signal processing. Numerous nonlinear techniques have been introduced to complement the classical linear methods and to serve as an alternative when the assumption of linearity is inappropriate. Two of these techniques are higher order statistics (HOS) and nonlinear dynamics theory (chaos). They have been widely applied to time-series characterization and analysis in several fields, especially biomedical signals.

Both HOS and chaos techniques have followed a similar evolution. They were first studied around the turn of the twentieth century: the method of moments (related to HOS) was developed by Pearson, and in 1890 Henri Poincaré found sensitive dependence on initial conditions (a symptom of chaos) in a particular case of the three-body problem. Both approaches were then overshadowed by linear techniques until around 1960, when Lorenz rediscovered a chaotic system by chance while studying the behaviour of air masses. Meanwhile, a group of statisticians at the University of California began to explore the use of HOS techniques again. However, these techniques remained largely ignored until the 1980s, when Mendel (Mendel, 1991) developed system identification techniques based on HOS, and Ruelle (Ruelle, 1979), Packard (Packard, 1980), Takens (Takens, 1981) and Casdagli (Casdagli, 1989) established methods for modelling nonlinear time series through chaos theory. Only recently has the application of HOS and chaos to time series become feasible, thanks to the greater computational capacity of computers and to Digital Signal Processing (DSP) technology.

The present article presents the state of the art of two nonlinear techniques applied to time-series analysis: higher order statistics and chaos theory. Some measurements based on HOS and chaos techniques are described, and the way in which these measurements characterize different behaviours of a signal is analyzed. The application of nonlinear measurements permits a more realistic characterization of signals and therefore represents an advance in the development of automatic systems.
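
A minimal sketch of the two families of measures discussed above, assuming numpy and scipy: simple higher-order statistics (skewness and kurtosis, which capture non-Gaussian structure invisible to second-order methods) and a time-delay embedding of the kind used in chaos-based analysis (Packard, Takens). The delay and embedding dimension values are illustrative, and the function names are not from the article:

```python
import numpy as np
from scipy.stats import skew, kurtosis

def hos_features(x):
    # Third- and fourth-order statistics: simple HOS-based descriptors of a signal.
    return {"skewness": skew(x), "kurtosis": kurtosis(x)}

def delay_embed(x, dim=3, tau=5):
    # Reconstruct a state space from a scalar series: each row is a point
    # [x(t), x(t+tau), ..., x(t+(dim-1)*tau)], the starting point for
    # chaos-based measures such as correlation dimension or Lyapunov exponents.
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
```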


Author(s):  
José Luis Rojo-Álvarez ◽  
Manel Martínez-Ramón ◽  
Gustavo Camps-Valls ◽  
Carlos E. Martínez-Cruz ◽  
Carlos Figuera

Digital signal processing (DSP) of time series using SVMs has been addressed in the literature as a straightforward application of SVM kernel regression; however, the assumption of independently distributed samples in regression models is not fulfilled in time-series problems. Therefore, a new branch of SVM algorithms has to be developed for the advantageous application of SVM concepts when processing data with an underlying time-series structure. In this chapter, we summarize our past, present, and future proposals for the SVM-DSP framework, which consists of several principles for creating linear and nonlinear SVM algorithms devoted to DSP problems. First, the statement of linear signal models in the primal problem (primal signal models) allows us to obtain robust estimators of the model coefficients in classical DSP problems. Next, nonlinear SVM-DSP algorithms can be addressed from two different approaches: (a) reproducing kernel Hilbert space (RKHS) signal models, which state the signal model equation in the feature space, and (b) dual signal models, which are based on the nonlinear regression of the time instants with appropriate Mercer kernels. In this way, concepts such as filtering, time interpolation, and convolution are considered and analyzed, and they open the field for future development of signal processing algorithms following this SVM-DSP framework.
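
A hedged sketch of the "dual signal model" idea mentioned above, namely nonlinear kernel regression of the signal on its time instants, used here for time interpolation. It relies on scikit-learn's SVR as an off-the-shelf stand-in rather than the chapter's own algorithms, and the kernel width, regularization, and function name are illustrative assumptions:

```python
import numpy as np
from sklearn.svm import SVR

def svm_interpolate(t_obs, y_obs, t_query, gamma=1.0, C=10.0, eps=0.01):
    """Fit y(t) on observed (possibly nonuniform) time instants, then resample."""
    # Regress the signal values on the time instants with an RBF (Mercer) kernel.
    model = SVR(kernel="rbf", gamma=gamma, C=C, epsilon=eps)
    model.fit(np.asarray(t_obs).reshape(-1, 1), np.asarray(y_obs))
    # Evaluate the learned kernel expansion at new time instants
    # (time interpolation of the series).
    return model.predict(np.asarray(t_query).reshape(-1, 1))
```

The epsilon-insensitive loss gives the robustness to outliers that motivates using SVM estimators in place of least squares for these DSP tasks.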

