Double smoothing of images using median and Wiener filters

1989 ◽  
Vol 37 (6) ◽  
pp. 943-946 ◽  
Author(s):  
S.Y. Park ◽  
Y.H. Lee


Author(s):  
K.-H. Herrmann ◽  
E. Reuber ◽  
P. Schiske

A posteriori deblurring of high resolution electron micrographs of weak phase objects can be performed by holographic filters [1,2] which are arranged in the Fourier domain of a light-optical reconstruction set-up. According to the diffraction efficiency and the lateral position of the grating structure, the filters permit adjustment of the amplitudes and phases of the spatial frequencies in the image which is obtained in the first diffraction order. In the case of bright field imaging with axial illumination, the Contrast Transfer Functions (CTF) are oscillating, but real. For different image-forming conditions and several signal-to-noise ratios an extensive set of Wiener filters should be available. A simple method of producing such filters by only photographic and mechanical means is described here. A transparent master grating with 6.25 lines/mm and 160 mm diameter was produced by a high-precision computer plotter. It is photographed through a rotating mask, plotted by a standard plotter.
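
For illustration, a minimal digital counterpart of such a Wiener filter, assuming a known real, oscillating CTF and a scalar signal-to-noise ratio; the function and argument names are illustrative, and the photographic/light-optical realization described above is not modeled:

```python
import numpy as np

def wiener_deblur(image, ctf, snr):
    """Wiener-filter restoration of a bright-field micrograph in the Fourier domain.

    image : 2-D array, recorded image of the weak phase object
    ctf   : 2-D array, real oscillating contrast transfer function sampled on the
            same (unshifted) spatial-frequency grid as np.fft.fft2(image)
    snr   : scalar or 2-D array, signal-to-noise power ratio
    """
    G = np.fft.fft2(image)
    W = ctf / (ctf**2 + 1.0 / snr)      # Wiener weights: CTF / (CTF^2 + 1/SNR)
    return np.real(np.fft.ifft2(W * G))
```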


2008 ◽  
Vol 47 (02) ◽  
pp. 167-173 ◽  
Author(s):  
A. Pfahlberg ◽  
O. Gefeller ◽  
R. Weißbach

Summary Objectives: In oncological studies, the hazard rate can be used to differentiate subgroups of the study population according to their patterns of survival risk over time. Nonparametric curve estimation has been suggested as an exploratory means of revealing such patterns. The decision about the type of smoothing parameter is critical for performance in practice. In this paper, we study data-adaptive smoothing. Methods: A decade ago, the nearest-neighbor bandwidth was introduced for censored data in survival analysis. It is specified by one parameter, namely the number of nearest neighbors. Bandwidth selection in this setting has rarely been investigated, although the heuristic advantages over the frequently studied fixed bandwidth are quite obvious. The asymptotic relationship between the fixed and the nearest-neighbor bandwidth can be used to generate novel approaches. Results: We develop a new selection algorithm, termed double-smoothing, for the nearest-neighbor bandwidth in hazard rate estimation. Our approach uses a finite-sample approximation of the asymptotic relationship between the fixed and nearest-neighbor bandwidth. In doing so, we identify the nearest-neighbor bandwidth as an additional smoothing step and achieve further data adaptation after fixed bandwidth smoothing. We illustrate the application of the new algorithm in a clinical study and compare the outcome to the traditional fixed bandwidth result, thus demonstrating the practical performance of the technique. Conclusion: The double-smoothing approach enlarges the methodological repertoire for selecting smoothing parameters in nonparametric hazard rate estimation. The slight increase in computational effort is rewarded with a substantial gain in estimation stability, demonstrating the benefit of the technique for biostatistical applications.
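
For illustration, a minimal sketch of a kernel hazard rate estimator driven by a nearest-neighbor bandwidth, i.e. the quantity the selection algorithm tunes; this is not the authors' double-smoothing selector, and the function names, the Epanechnikov kernel, and the no-ties assumption are illustrative choices:

```python
import numpy as np

def nn_bandwidth(t, event_times, k):
    """Distance from t to its k-th nearest uncensored event time."""
    d = np.sort(np.abs(event_times - t))
    return max(d[min(k, len(d)) - 1], 1e-12)        # guard against a zero bandwidth

def hazard_nn(t_grid, times, events, k):
    """Kernel hazard rate estimate with a nearest-neighbor bandwidth.

    times  : observed follow-up times (possibly censored)
    events : 1 = event observed, 0 = censored
    k      : number of nearest neighbors (the smoothing parameter)
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    at_risk = n - np.arange(n)                      # subjects still at risk at each time
    ev_times = times[events == 1]
    lam = np.empty(len(t_grid), dtype=float)
    for j, t in enumerate(t_grid):
        b = nn_bandwidth(t, ev_times, k)            # data-adaptive local bandwidth
        u = (t - times) / b
        w = 0.75 * (1.0 - u**2) * (np.abs(u) <= 1)  # Epanechnikov kernel
        lam[j] = np.sum(w * events / at_risk) / b   # Ramlau-Hansen-type estimator
    return lam
```

Larger k widens the local window and smooths more strongly; choosing k is exactly the bandwidth-selection problem the abstract addresses.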


Geophysics ◽  
1973 ◽  
Vol 38 (2) ◽  
pp. 310-326 ◽  
Author(s):  
R. J. Wang ◽  
S. Treitel

The normal equations for the discrete Wiener filter are conventionally solved with Levinson’s algorithm. The resultant solutions are exact except for numerical roundoff. In many instances, approximate rather than exact solutions satisfy seismologists’ requirements. The so‐called “gradient” or “steepest descent” iteration techniques can be used to produce approximate filters at computing speeds significantly higher than those achievable with Levinson’s method. Moreover, gradient schemes are well suited for implementation on a digital computer provided with a floating‐point array processor (i.e., a high‐speed peripheral device designed to carry out a specific set of multiply‐and‐add operations). Levinson’s method (1947) cannot be programmed efficiently for such special‐purpose hardware, and this consideration renders the use of gradient schemes even more attractive. It is, of course, advisable to utilize a gradient algorithm which generally provides rapid convergence to the true solution. The “conjugate‐gradient” method of Hestenes (1956) is one of a family of algorithms having this property. Experimental calculations performed with real seismic data indicate that adequate filter approximations are obtainable at a fraction of the computer cost required for use of Levinson’s algorithm.
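For illustration, a minimal conjugate-gradient sketch for the Wiener normal equations R f = g, where R is the (Toeplitz) autocorrelation matrix and g the cross-correlation vector; the matrix is built densely with scipy.linalg.toeplitz for clarity, so this plain NumPy version does not attempt to show the array-processor mapping the abstract emphasizes:

```python
import numpy as np
from scipy.linalg import toeplitz

def wiener_filter_cg(autocorr, crosscorr, n_iter=100, tol=1e-8):
    """Approximate Wiener filter coefficients by conjugate gradients.

    autocorr  : first column of the Toeplitz autocorrelation matrix R
    crosscorr : cross-correlation vector g; the exact filter solves R f = g
    """
    R = toeplitz(autocorr)              # dense for clarity; R is symmetric
    g = np.asarray(crosscorr, dtype=float)
    f = np.zeros_like(g)
    r = g - R @ f                       # initial residual
    p = r.copy()
    rs = r @ r
    for _ in range(n_iter):
        Rp = R @ p
        alpha = rs / (p @ Rp)           # step length along the search direction
        f = f + alpha * p
        r = r - alpha * Rp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:       # stop once the residual is small enough
            break
        p = r + (rs_new / rs) * p       # new conjugate search direction
        rs = rs_new
    return f
```

The loop consists only of matrix-vector products and vector updates, i.e. the repeated multiply-and-add pattern that the abstract associates with floating-point array processors, and it can be truncated early once the approximate filter is adequate.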


2008 ◽  
Vol 56 (10) ◽  
pp. 5013-5019 ◽  
Author(s):  
L.L. Scharf ◽  
E.K.P. Chong ◽  
M.D. Zoltowski ◽  
J.S. Goldstein ◽  
I.S. Reed

1996 ◽  
Vol 39 (1) ◽  
Author(s):  
C. Del Negro

Frequency-domain Wiener filtering was applied to magnetic anomalies in the volcanic area of Mt. Etna. Under suitable conditions (additive noise, linear processing and a mean-square error criterion), this filter provides an effective tool for discriminating the geologic feature of interest (the signal) from the noise. The filter was first tested with synthetic data. Afterwards it was applied to a magnetic profile carried out across the principal fault system of the Mt. Etna volcano, which hosted the dykes feeding both the 1989 and the 1991-93 eruptions. The magnetic anomalies linked to the volcanic section and those linked to the contact between the clay basement and the lava coverage show significant spectral overlap. Thus, by estimating the power spectrum of the signal, obtained by solving the forward problem, a least-squares Wiener filter was designed. In this context, it was possible to verify the effectiveness of Wiener filters, whereas traditional band-pass filtering proved inadequate. In fact, analysis of the noise showed that all the meaningful components of the observed magnetic field were resolved. The results put further constraints on the location and geometry of the shallow plumbing system of Mt. Etna.
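
A minimal sketch of such a frequency-domain least-squares (Wiener) filter, assuming the signal and noise power spectra have already been estimated (e.g., the signal spectrum from the forward model); the names and sampling details are illustrative:

```python
import numpy as np

def wiener_profile_filter(profile, p_signal, p_noise):
    """Least-squares (Wiener) filtering of a 1-D magnetic anomaly profile.

    profile  : observed total-field anomaly sampled along the profile
    p_signal : estimated power spectrum of the wanted signal (e.g. from the
               forward model of the dyke), sampled like np.fft.fft(profile)
    p_noise  : estimated power spectrum of the noise on the same frequency grid
    """
    D = np.fft.fft(profile)
    W = p_signal / (p_signal + p_noise)   # minimum mean-square-error weights
    return np.real(np.fft.ifft(W * D))
```

Where the two spectra overlap, the weights fall between 0 and 1 rather than cutting off sharply, which is why this approach can succeed where a simple band-pass filter cannot.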

