Application of L1 Trend Filtering Technology on the Current Time Domain Spectroscopy of Dielectrics

Electronics ◽  
2019 ◽  
Vol 8 (9) ◽  
pp. 1046
Author(s):  
Changyou Suo ◽  
Zhonghua Li ◽  
Yunlong Sun ◽  
Yongsen Han

The current time domain spectroscopy of dielectrics provides important information for analyzing dielectric properties and mechanisms. However, interference during the testing process seriously affects the analysis of the test results, so effective filtering of the current time domain spectroscopy is particularly necessary. L1 trend filtering can accurately estimate the trend component of a time series and has been widely used in economics and sociology; this paper therefore applies it to the current time domain spectroscopy. Firstly, polarization and depolarization currents are measured in the laboratory. Then the test results are filtered by L1 trend filtering, and the filtering effects are compared with several common filtering algorithms, such as the sliding-mean filter and the Savitzky–Golay smoothing filter. Finally, the robustness and time complexity of L1 trend filtering are analyzed. The filtering results show that, because the polarization currents vary over a wide range in the time domain (about 2–3 orders of magnitude), smooth and undistorted curves over the whole test time range can hardly be obtained with common filtering algorithms, while they can be obtained by L1 trend filtering. The robustness and time-complexity analyses show that L1 trend filtering extracts the trend accurately at the different noise levels considered, and its execution time stays below 176.67 s when the number of tested points is no more than 20,000. These results show that L1 trend filtering can be applied to the current time domain spectroscopy of dielectrics.
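The filter at the heart of this abstract can be sketched in a few lines. The standard L1 trend filter minimises 0.5·||y − x||² + λ·||Dx||₁, where D is the second-difference operator, which drives the estimate toward a piecewise-linear trend. The following is a minimal illustration, not the authors' implementation: a basic ADMM loop on a synthetic piecewise-linear series, with λ, ρ, and the test signal all arbitrary choices.

```python
import numpy as np

def l1_trend_filter(y, lam=20.0, rho=1.0, n_iter=500):
    """Estimate a piecewise-linear trend x by minimising
    0.5*||y - x||^2 + lam*||D x||_1, with D the second-difference
    operator. Solved with a basic ADMM loop (a sketch only)."""
    n = len(y)
    # Second-difference matrix D, shape (n-2, n).
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    A = np.eye(n) + rho * D.T @ D          # x-update system matrix
    z = np.zeros(n - 2)
    u = np.zeros(n - 2)
    x = y.copy()
    for _ in range(n_iter):
        # x-update: solve (I + rho*D'D) x = y + rho*D'(z - u).
        x = np.linalg.solve(A, y + rho * D.T @ (z - u))
        Dx = D @ x
        # z-update: soft-thresholding (proximal step for the l1 term).
        v = Dx + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # Dual update.
        u += Dx - z
    return x

# Noisy piecewise-linear "tent" series, a crude stand-in for a trend
# buried in measurement interference.
rng = np.random.default_rng(0)
t = np.arange(200)
trend = np.where(t < 100, 0.05 * t, 5.0 - 0.05 * (t - 100))
y = trend + rng.normal(0.0, 0.3, size=t.size)
x = l1_trend_filter(y)
```

The l1 penalty on second differences is what lets the filter track a trend that spans orders of magnitude without the edge distortion of a fixed-window smoother.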

2019 ◽  
pp. 77-113
Author(s):  
Chris Chatfield ◽  
Haipeng Xing

Geophysics ◽  
1992 ◽  
Vol 57 (8) ◽  
pp. 994-1003 ◽  
Author(s):  
Michael Leppin

A numerical method is presented by which the transient electromagnetic response of a two‐dimensional (2-D) conductor, embedded in a conductive host rock and excited by a rectangular current loop, can be modeled. This 2.5-D modeling problem has been formulated in the time domain in terms of a vector diffusion equation for the scattered magnetic induction, which is Fourier transformed into the spatial wavenumber domain in the strike direction of the conductor. To confine the region of solution of the diffusion equation to the conductive earth, boundary values for the components of the magnetic induction on the ground surface have been calculated by means of an integral transform of the vertical component of the magnetic induction at the air‐earth interface. The system of parabolic differential equations for the three magnetic components has been integrated for 9 to 15 discrete spatial wavenumbers ranging from [Formula: see text] to [Formula: see text] using an implicit homogeneous finite‐difference scheme. The discretization of the differential equations on a grid representing a cross‐section of the conductive earth results in a large, sparse system of linear equations, which is solved by the successive overrelaxation method. The three‐dimensional (3-D) response has been computed by an inverse Fourier transformation of the cubic spline interpolated scattered magnetic induction in the wavenumber domain using a digital filtering technique. To test the algorithm, responses have been computed for a two‐layered half‐space and a vertical prism embedded in a conductive host rock. These examples were then compared with results obtained analytically or numerically using frequency‐domain finite‐element and time‐domain integral equation methods. The new numerical procedure gives satisfactory results for a wide range of 2-D conductivity distributions with conductivity ratios exceeding 1:100, provided the grid is sufficiently refined at the corners of the conductivity anomalies.
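The large sparse systems described above are solved by successive overrelaxation (SOR). As a self-contained illustration of that solver, here is a generic SOR iteration applied to a toy tridiagonal system of the kind an implicit 1-D diffusion step produces; this is not the paper's 2.5-D grid or equations.

```python
import numpy as np

def sor_solve(A, b, omega=1.5, tol=1e-10, max_iter=10_000):
    """Successive overrelaxation for A x = b. Assumes A is, e.g.,
    diagonally dominant, as implicit finite-difference diffusion
    matrices typically are."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # Gauss-Seidel sweep: use already-updated entries x[:i].
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            gs = (b[i] - sigma) / A[i, i]
            # Overrelax between the old value and the Gauss-Seidel value.
            x[i] = (1 - omega) * x_old[i] + omega * gs
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

# Toy implicit-diffusion-style tridiagonal system.
n = 50
A = (np.diag(np.full(n, 4.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1))
b = np.ones(n)
x = sor_solve(A, b)
```

In practice the relaxation factor omega is tuned to the spectrum of the grid operator; 1.5 here is an arbitrary choice that still converges for this diagonally dominant system.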


2008 ◽  
Vol 25 (4) ◽  
pp. 534-546 ◽  
Author(s):  
Anthony Arguez ◽  
Peng Yu ◽  
James J. O’Brien

Abstract Time series filtering (e.g., smoothing) can be done in the spectral domain without loss of endpoints. However, filtering is commonly performed in the time domain using convolutions, resulting in lost points near the series endpoints. Multiple incarnations of a least squares minimization approach are developed that retain the endpoint intervals normally discarded when filtering with convolutions in the time domain. The techniques minimize the error between the predetermined frequency response function (FRF, a fundamental property of all filters) of the interior points and the FRFs to be determined for each position in the endpoint zone. The least squares techniques are differentiated by their constraints: 1) unconstrained, 2) an equal-mean constraint, and 3) an equal-variance constraint. The equal-mean constraint forces the new weights to sum to the same value as the predetermined weights. The equal-variance constraint forces the new weights to be such that, after being convolved with the input values, the expected time series variance is preserved. The three least squares methods are each tested under three separate filtering scenarios [involving Arctic Oscillation (AO), Madden–Julian oscillation (MJO), and El Niño–Southern Oscillation (ENSO) time series] and compared to each other as well as to the spectral filtering method, the standard of comparison. The results indicate that all four methods (including the spectral method) possess skill at determining suitable endpoint estimates. However, both the unconstrained and equal-mean schemes exhibit a bias toward zero near the terminal ends due to problems with apportioning variance. The equal-variance method shows no evidence of this bias and was never the worst performer. It showed particular promise in the ENSO scenario involving a 5-month running mean filter, and performed at least on par with the other realistic methods at almost all time series positions in all three filtering scenarios.
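The equal-mean construction can be sketched directly: choose endpoint weights that minimise the squared mismatch with the interior filter's FRF, subject to the new weights summing to the same value. The sketch below assumes a simple truncated-window setup near the series end and a dense frequency grid; it is illustrative only, not the authors' exact formulation.

```python
import numpy as np

def endpoint_weights(interior_w, n_avail, n_freq=256):
    """Least-squares endpoint weights with an equal-mean constraint
    (a sketch). interior_w is the symmetric interior filter of length
    2K+1; n_avail is how many of its lags are still available at the
    current endpoint position."""
    K = (len(interior_w) - 1) // 2
    lags = np.arange(-K, K + 1)
    f = np.linspace(0.0, 0.5, n_freq)
    # Target FRF of the interior filter: H(f) = sum_k w_k e^{-i2pi f k}.
    H = np.exp(-2j * np.pi * np.outer(f, lags)) @ interior_w
    # Design matrix for the truncated window (first n_avail lags).
    E = np.exp(-2j * np.pi * np.outer(f, lags[:n_avail]))
    # Stack real and imaginary parts to work with a real least squares.
    Er = np.vstack([E.real, E.imag])
    hr = np.concatenate([H.real, H.imag])
    # KKT system for: min ||Er w - hr||^2  s.t.  sum(w) = sum(interior_w).
    m = n_avail
    KKT = np.zeros((m + 1, m + 1))
    KKT[:m, :m] = 2.0 * Er.T @ Er
    KKT[:m, m] = 1.0
    KKT[m, :m] = 1.0
    rhs = np.concatenate([2.0 * Er.T @ hr, [np.sum(interior_w)]])
    return np.linalg.solve(KKT, rhs)[:m]

# Example: a 5-month running mean (as in the ENSO scenario), with only
# 4 of the 5 window points available near the series end.
w5 = np.full(5, 0.2)
w_end = endpoint_weights(w5, n_avail=4)
```

The equal-variance variant would replace the linear sum constraint with a quadratic one on the weights, which is why it needs a different solution strategy than this simple KKT system.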


Author(s):  
Withawat Withayachumnankul ◽  
Bernd M Fischer ◽  
Derek Abbott

The use of T-rays, or terahertz radiation, to identify substances by their spectroscopic fingerprints is a rapidly moving field. The dominant approach is presently terahertz time-domain spectroscopy. However, a key problem is that ambient water vapour is ubiquitous and the consequent water absorption distorts the T-ray pulses. Water molecules in the gas phase selectively absorb incident T-rays at discrete frequencies corresponding to their molecular rotational transitions. When T-rays propagate through an atmosphere, this results in prominent resonances spread over the T-ray spectrum; furthermore, in the time domain, fluctuations after the main pulse are observed in the T-ray signal. These effects are generally undesired, since they may mask critical spectroscopic data. So, ambient water vapour is commonly removed from the T-ray path by using a closed chamber during the measurement. Yet, in some applications, a closed chamber is not always feasible. This situation, therefore, motivates the need for an optional alternative method for reducing these unwanted artefacts. This paper presents a study on a computational means that is a step towards addressing the problem arising from water vapour absorption over a moderate propagation distance. Initially, the complex frequency response of water vapour is modelled from a spectroscopic catalogue. Using a deconvolution technique, together with fine tuning of the strength of each resonance, parts of the water vapour response are removed from a measured T-ray signal, with minimal signal distortion, thus providing experimental validation of the technique.
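The deconvolution step can be illustrated with a toy model. Here the water-vapour frequency response is a single made-up resonance line rather than the catalogue-based model used in the paper, and the regularisation constant is arbitrary; the point is only the mechanics of dividing out a modelled complex response with a floor that prevents noise amplification.

```python
import numpy as np

n = 1024
t = np.arange(n)

# Toy "clean" T-ray pulse: a modulated Gaussian.
pulse = np.exp(-0.5 * ((t - 200) / 10.0) ** 2) * np.cos(0.3 * t)

# Toy water-vapour response: one Lorentzian resonance with a small
# phase shift (an illustrative stand-in for the catalogue model).
f = np.fft.rfftfreq(n)
f0, width, strength = 0.12, 0.004, 0.05
Hv = 1.0 - strength * width**2 / ((f - f0) ** 2 + width**2) * np.exp(0.5j)

# "Measured" signal: the pulse filtered by the vapour response.
measured = np.fft.irfft(np.fft.rfft(pulse) * Hv, n)

# Regularised (Wiener-style) deconvolution: divide out Hv, with a
# small floor eps so near-zeros of |Hv| do not blow up noise.
eps = 1e-3
Hc = np.conj(Hv) / (np.abs(Hv) ** 2 + eps)
recovered = np.fft.irfft(np.fft.rfft(measured) * Hc, n)
```

In the paper's setting each catalogue resonance also gets its strength fine-tuned before removal; this sketch skips that step and removes the modelled response in one pass.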


1999 ◽  
Vol 3 (1) ◽  
pp. 69-83 ◽  
Author(s):  
Hui Boon Tan ◽  
Richard Ashley

A simple technique for directly testing the parameters of a time-series regression model for instability across frequencies is presented. The method can be implemented easily in the time domain, so that parameter instability across frequency bands can be conveniently detected and modeled in conjunction with other econometric features of the problem at hand, such as simultaneity, cointegration, missing observations, and cross-equation restrictions. The usefulness of the new technique is illustrated with an application to a cointegrated consumption-income regression model, yielding a straightforward test of the permanent income hypothesis.
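A much-simplified illustration of the idea, not Tan and Ashley's exact procedure: split a regressor into low- and high-frequency components that sum to the original series, then estimate a separate OLS coefficient for each band; a marked difference between the band coefficients signals parameter instability across frequencies. All signals and parameters below are made up.

```python
import numpy as np

def band_split(x, cutoff=0.1):
    """Split x into low- and high-frequency components that sum to x."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x))
    low = np.fft.irfft(X * (f <= cutoff), len(x))
    high = np.fft.irfft(X * (f > cutoff), len(x))
    return low, high

rng = np.random.default_rng(2)
n = 400
x = rng.normal(size=n).cumsum()      # a persistent (trending) regressor
x_lo, x_hi = band_split(x)

# Simulate instability across frequencies: the low-frequency component
# of x has a much larger effect on y than the high-frequency component.
y = 2.0 * x_lo + 0.5 * x_hi + rng.normal(size=n)

# OLS with one coefficient per frequency band. Because the bands have
# disjoint frequency support, the two regressors are orthogonal.
Z = np.column_stack([x_lo, x_hi])
beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
```

A formal test would compare the restricted (equal-coefficient) and unrestricted fits with an F statistic; the decomposition itself is what makes such a test implementable entirely in the time domain.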


Author(s):  
Simon Vaughan

Progress in astronomy comes from interpreting the signals encoded in the light received from distant objects: the distribution of light over the sky (images), over photon wavelength (spectrum), over polarization angle and over time (usually called light curves by astronomers). In the time domain, we see transient events such as supernovae, gamma-ray bursts and other powerful explosions; we see periodic phenomena such as the orbits of planets around nearby stars, radio pulsars and pulsations of stars in nearby galaxies; and we see persistent aperiodic variations (‘noise’) from powerful systems such as accreting black holes. I review just a few of the recent and future challenges in the burgeoning area of time domain astrophysics, with particular attention to persistently variable sources, the recovery of reliable noise power spectra from sparsely sampled time series, higher order properties of accreting black holes, and time delays and correlations in multi-variate time series.
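For the sparsely sampled time series mentioned above, the Lomb-Scargle periodogram is the standard tool, since it handles the uneven sampling that defeats an FFT-based periodogram. A minimal example using SciPy, with a made-up irregularly sampled "light curve" (all frequencies and noise levels are arbitrary):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(3)

# Irregularly (sparsely) sampled observations of a periodic source.
t = np.sort(rng.uniform(0.0, 100.0, size=300))
true_freq = 0.7                      # angular frequency, rad per time unit
y = np.sin(true_freq * t) + 0.5 * rng.normal(size=t.size)
y -= y.mean()                        # lombscargle expects a centred signal

# Evaluate the periodogram on a grid of candidate angular frequencies.
freqs = np.linspace(0.05, 2.0, 1000)
power = lombscargle(t, y, freqs)
peak = freqs[np.argmax(power)]
```

Note that `scipy.signal.lombscargle` works in angular frequency; recovering reliable *noise* power spectra from such sampling, as the review discusses, additionally requires careful treatment of the window function and of aliasing.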


2020 ◽  
Vol 24 (11) ◽  
pp. 5473-5489 ◽  
Author(s):  
Justin Schulte ◽  
Frederick Policelli ◽  
Benjamin Zaitchik

Abstract. Wavelet coherence is a method that is commonly used in hydrology to extract scale-dependent, nonstationary relationships between time series. However, we show that the method cannot always determine why the time-domain correlation between two time series changes in time. We show that, even for stationary coherence, the time-domain correlation between two time series weakens if at least one of the time series has changing skewness. To overcome this drawback, a nonlinear coherence method is proposed to quantify the cross-correlation between nonlinear modes embedded in the time series. It is shown that nonlinear coherence and auto-bicoherence spectra can provide additional insight into changing time-domain correlations. The new method is applied to the El Niño–Southern Oscillation (ENSO) and all-India rainfall (AIR), which is intricately linked to hydrological processes across the Indian subcontinent. The nonlinear coherence analysis showed that the skewness of AIR is weakly correlated with that of two ENSO time series after the 1970s, indicating that increases in ENSO skewness after the 1970s at least partially contributed to the weakening ENSO–AIR relationship in recent decades. The implication of this result is that the intensity of skewed El Niño events is likely to overestimate India's drought severity, which was the case in the 1997 monsoon season, a time point when the nonlinear wavelet coherence between AIR and ENSO reached its lowest value in the 1871–2016 period. We determined that the association between the weakening ENSO–AIR relationship and ENSO nonlinearity could reflect the contribution of different nonlinear ENSO modes to ENSO diversity.
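The changing skewness that drives the effect described above can be tracked with a simple rolling-window diagnostic. This is only a sketch of the skewness bookkeeping, not the nonlinear wavelet coherence method itself, and the toy series (a symmetric segment followed by a right-skewed one) is illustrative only.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(4)

# Toy series whose skewness changes halfway through (cf. the increase
# in ENSO skewness after the 1970s discussed in the abstract).
n = 1000
sym = rng.normal(size=n // 2)                    # symmetric segment
skw = rng.gamma(shape=2.0, size=n // 2) - 2.0    # right-skewed, zero-mean
x = np.concatenate([sym, skw])

def rolling_skew(x, win=200):
    """Sample skewness in a sliding window: a basic diagnostic for
    time-varying skewness in a series."""
    return np.array([skew(x[i:i + win]) for i in range(len(x) - win + 1)])

s = rolling_skew(x)
```

When one series develops skewness like this while its partner does not, their linear time-domain correlation weakens even if the underlying coupling is unchanged, which is the situation the nonlinear coherence method is designed to diagnose.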


Author(s):  
Michael Bentham ◽  
Gerard Stansby ◽  
John Allen

Photoplethysmography (PPG) is a simple-to-perform vascular optics measurement technique that can detect changes in blood volume in the microvascular tissue bed. Beat-to-beat analysis of the PPG waveform enables the study of the variability of pulse features such as amplitude and pulse arrival time (PAT), and, when quantified in the time and frequency domains, has considerable potential to shed light on perfusion changes associated with peripheral arterial disease (PAD). In this pilot study, innovative multi-site bilateral finger and toe PPG recordings from 43 healthy control subjects and 31 PAD subjects were compared (recordings each at least 5 minutes, collected in a warm temperature-controlled room). Beat-to-beat normalized amplitude and PAT variability were then quantified in the time domain using SD and IQR measures, and in the frequency domain bilaterally using magnitude squared coherence (MSC). Significantly reduced normalized amplitude variability (healthy control 0.0384 (IQR 0.0217–0.0744) vs. PAD 0.0160 (0.0080–0.0338); p<0.001) and significantly increased PAT variability (healthy control 0.0063 (0.0052–0.0086) vs. PAD 0.0093 (0.0078–0.0144); p<0.001) were demonstrated in PAD using the time-domain analysis. Frequency-domain analysis demonstrated significantly lower MSC values across a range of frequency bands for PAD patients. These changes suggest a loss of right-to-left body side coherence and cardiovascular control in PAD. This study has also demonstrated the feasibility of using these measurement and analysis methods in studies investigating multi-site PPG variability for a wide range of cardiac and vascular patient groups.
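Magnitude squared coherence between two signals is available directly in SciPy. A sketch on synthetic "bilateral" signals (a shared cardiac-like oscillation plus independent noise on each side; all parameters are made up, and this is not real PPG data):

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(5)

fs = 100.0                           # sampling rate, Hz
t = np.arange(0, 300.0, 1 / fs)      # a 5-minute recording

# Shared component at ~72 beats/min, plus independent noise per side.
shared = np.sin(2 * np.pi * 1.2 * t)
left = shared + 0.5 * rng.normal(size=t.size)
right = shared + 0.5 * rng.normal(size=t.size)

# Welch-based magnitude squared coherence between the two sides:
# near 1 where the sides share a component, near 0 where they do not.
f, Cxy = coherence(left, right, fs=fs, nperseg=1024)
c_at_hr = Cxy[np.argmin(np.abs(f - 1.2))]
```

In this toy setup coherence is high at the shared heart-rate frequency and low elsewhere; the PAD finding above corresponds to that shared-frequency coherence dropping between the two body sides.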


2009 ◽  
Vol 6 (2) ◽  
pp. 2451-2498 ◽  
Author(s):  
B. Schaefli ◽  
E. Zehe

Abstract. This paper proposes a method for rainfall-runoff model calibration and performance analysis in the wavelet domain, by fitting the estimated wavelet power spectrum (a representation of the time-varying frequency content of a time series) of a simulated discharge series to that of the corresponding observed time series. As discussed in this paper, calibrating hydrological models so as to reproduce the time-varying frequency content of the observed signal can lead to different results than parameter estimation in the time domain. Wavelet-domain parameter estimation therefore has the potential to give new insights into model performance and to reveal model structural deficiencies. We apply the proposed method to synthetic case studies and a real-world discharge modeling case study, and discuss how model diagnosis can benefit from an analysis in the wavelet domain. The results show that, for the real-world case study of precipitation-runoff modeling for a high alpine catchment, the calibrated discharge simulation captures the dynamics of the observed time series better than the results obtained through calibration in the time domain. In addition, the wavelet-domain performance assessment of this case study highlights which frequencies are not well reproduced by the model, giving specific indications about how to improve the model structure.
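The wavelet power spectrum underlying such a calibration objective can be computed with a plain Morlet continuous wavelet transform. A minimal NumPy sketch (Torrence-and-Compo-style frequency-domain Morlet; not the authors' code, and the "discharge" series is a bare sinusoid chosen so the expected peak scale is known):

```python
import numpy as np

def morlet_power(x, scales, w0=6.0):
    """Wavelet power |CWT|^2 via FFT convolution with a Morlet wavelet.
    A minimal sketch of the time-varying frequency representation the
    wavelet-domain objective is built on."""
    n = len(x)
    X = np.fft.fft(x)
    omega = 2 * np.pi * np.fft.fftfreq(n)
    power = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # Frequency-domain Morlet (analytic: positive frequencies only),
        # peaked where s*omega ~ w0.
        psi_hat = np.pi ** -0.25 * np.exp(-0.5 * (s * omega - w0) ** 2)
        psi_hat *= (omega > 0) * np.sqrt(2 * np.pi * s)
        W = np.fft.ifft(X * np.conj(psi_hat))
        power[i] = np.abs(W) ** 2
    return power

# Sinusoidal toy series: power should peak near the scale matching the
# oscillation (scale ~ w0 / angular frequency, i.e. about 31 here).
n = 512
t = np.arange(n)
period = 32.0
x = np.sin(2 * np.pi * t / period)
scales = np.arange(4, 64, dtype=float)
P = morlet_power(x, scales)
best = scales[np.argmax(P.mean(axis=1))]
```

A wavelet-domain calibration would compare such a power array for the simulated discharge against the one for the observed discharge, rather than comparing the series point by point in time.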

