The Impact of Range-Oversampling Processing on Tornado Velocity Signatures Obtained from WSR-88D Superresolution Data

2015 ◽  
Vol 32 (9) ◽  
pp. 1581-1592 ◽  
Author(s):  
Sebastián M. Torres ◽  
Christopher D. Curtis

Abstract WSR-88D superresolution data are produced with finer range and azimuth sampling and improved azimuthal resolution as a result of a narrower effective antenna beamwidth. These characteristics afford improved detectability of weaker and more distant tornadoes by providing an enhancement of the tornadic vortex signature, which is characterized by a large low-level azimuthal Doppler velocity difference. The effective-beamwidth reduction in superresolution data is achieved by applying a tapered data window to the samples in the dwell time; thus, it comes at the expense of increased variances for all radar-variable estimates. One way to overcome this detrimental effect is through the use of range-oversampling processing, which has the potential to reduce the variance of superresolution data to match that of legacy-resolution data without increasing the acquisition time. However, range-oversampling processing typically broadens the radar range weighting function and thus degrades the range resolution. In this work, simulated Doppler velocities for vortexlike fields are used to quantify the effects of range-oversampling processing on the velocity signature of tornadoes when using WSR-88D superresolution data. The analysis shows that the benefits of range-oversampling processing in terms of improved data quality should outweigh the relatively small degradation to the range resolution and thus contribute to the tornado warning decision process by improving forecaster confidence in the radar data.
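The tornadic vortex signature discussed above can be illustrated with a simple numerical sketch (not the authors' simulation framework): a Rankine combined vortex is scanned with Gaussian-weighted effective beams, and the gate-to-gate velocity difference of the resulting signature is compared for legacy-style and superresolution-style azimuthal sampling. All parameter values below (core radius, peak wind, range, effective beamwidths) are illustrative assumptions.

```python
# Minimal, illustrative sketch: gate-to-gate Delta-V of a Rankine vortex sampled
# with Gaussian-weighted effective beams at two azimuthal sampling configurations.
# All parameters are assumptions, not the paper's simulation settings.
import numpy as np

def rankine_tangential(r, r_core=250.0, v_max=60.0):
    """Tangential wind (m/s) of a Rankine combined vortex."""
    r = np.asarray(r, dtype=float)
    return np.where(r <= r_core,
                    v_max * r / r_core,
                    v_max * r_core / np.maximum(r, 1e-6))

def beam_weighted_vr(az_center_deg, range_m, beamwidth_deg):
    """Gaussian-beam-weighted Doppler velocity for a beam centered at the given
    azimuth, using a simplified closest-approach geometry for the vortex."""
    az = np.radians(np.linspace(az_center_deg - 2.0, az_center_deg + 2.0, 801))
    x = range_m * np.sin(az)                               # cross-beam offset from vortex axis
    vr = np.sign(x) * rankine_tangential(np.abs(x))        # toy radial-velocity model
    sigma = np.radians(beamwidth_deg) / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    w = np.exp(-0.5 * ((az - np.radians(az_center_deg)) / sigma) ** 2)
    return np.sum(w * vr) / np.sum(w)

range_m = 60_000.0                                         # distant tornado: beam wider than core
for label, spacing_deg, eff_bw_deg in [("legacy   ", 1.0, 1.4),
                                       ("superres ", 0.5, 1.0)]:
    az_centers = np.arange(-3.0, 3.0 + 1e-9, spacing_deg)
    v = np.array([beam_weighted_vr(a, range_m, eff_bw_deg) for a in az_centers])
    delta_v = np.max(np.abs(np.diff(v)))                   # gate-to-gate Delta-V
    print(f"{label} spacing {spacing_deg:.1f} deg, beamwidth {eff_bw_deg:.1f} deg: "
          f"Delta-V = {delta_v:.1f} m/s")
```

In this toy setup, the narrower effective beam and finer azimuthal spacing smear the vortex less, which is the mechanism behind the improved detectability the abstract describes.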

2017 ◽  
Author(s):  
Shih-Chiao Tsai ◽  
Jenn-Shyong Chen ◽  
Yen-Hsyang Chu ◽  
Ching-Lun Su ◽  
Jui-Hsiang Chen

Abstract. Multi-frequency range imaging (RIM) has been implemented in the Chung-Li very-high-frequency (VHF) radar, located on the campus of National Central University, Taiwan, since 2008. RIM processes the echo signals with a group of closely spaced transmitting frequencies through appropriate inversion methods to obtain a high-resolution distribution of echo power in the range direction. This is beneficial to the investigation of the small-scale structure embedded in the dynamic atmosphere. Five transmitting frequencies were employed in the radar experiment for observation of the precipitating atmosphere during the period between 21 and 23 August 2013. Using the Capon and Fourier methods, the radar echoes were synthesized to retrieve the temporal signals at a smaller range step than the original range resolution defined by the pulse width, and such retrieved temporal signals were then processed in the Doppler frequency domain to identify the atmosphere and precipitation echoes. An analysis called conditional averaging was further executed for echo power, Doppler velocity, and spectral width to verify the potential capabilities of the retrieval processing in resolving small-scale precipitation and atmosphere structures. Point-by-point correction of range delay combined with compensation of the range-weighting function effect was performed during the retrieval of temporal signals to improve the continuity of power spectra at gate boundaries, making the small-scale structures in the power spectra more natural and reasonable. We examined stratiform and convective precipitation and demonstrated their different structural characteristics by means of the Capon-processed results.


2018 ◽  
Vol 11 (1) ◽  
pp. 581-592
Author(s):  
Shih-Chiao Tsai ◽  
Jenn-Shyong Chen ◽  
Yen-Hsyang Chu ◽  
Ching-Lun Su ◽  
Jui-Hsiang Chen

Abstract. Multi-frequency range imaging (RIM) has been in operation on the Chung-Li very-high-frequency (VHF) radar, located on the campus of National Central University, Taiwan, since 2008. RIM processes the echo signals with a group of closely spaced transmitting frequencies through appropriate inversion methods to obtain a high-resolution distribution of echo power in the range direction. This is beneficial to the investigation of the small-scale structure embedded in the dynamic atmosphere. Five transmitting frequencies were employed in the radar experiment for observation of the precipitating atmosphere during the period between 21 and 23 August 2013. Using the Capon and Fourier methods, the radar echoes were synthesized to retrieve the temporal signals at a smaller range step than the original range resolution defined by the pulse width, and such retrieved temporal signals were then processed in the Doppler frequency domain to identify the atmosphere and precipitation echoes. An analysis called conditional averaging was further executed for echo power, Doppler velocity, and spectral width to verify the potential capabilities of the retrieval processing in resolving small-scale precipitation and atmosphere structures. Point-by-point correction of range delay combined with compensation of the range-weighting function effect was performed during the retrieval of temporal signals to improve the continuity of power spectra at gate boundaries, making the small-scale structures in the power spectra more natural and reasonable. We examined stratiform and convective precipitation and demonstrated their different structural characteristics by means of the Capon-processed results. The new element in this study is the application of RIM to spectral analysis, especially for precipitation echoes.
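For readers unfamiliar with the inversion step mentioned in the abstract, the sketch below shows the textbook Capon (minimum-variance) range-imaging estimator: the brightness at a candidate range follows from the inverse of the frequency-domain covariance matrix of the multi-frequency samples. This is a generic illustration, not the Chung-Li processing chain; the carrier offsets, diagonal loading, and sub-gate range grid are assumptions.

```python
# Hedged sketch of Capon range imaging (RIM) for one range gate.
# Textbook estimator: B(r) = 1 / (e(r)^H R^{-1} e(r)), with e(r) the steering
# vector over the transmitted frequencies. Frequencies, loading, and the
# sub-gate range grid are illustrative assumptions, not the Chung-Li settings.
import numpy as np

c = 3.0e8                                                       # speed of light (m/s)
freqs_hz = 52.0e6 + np.array([-2.0, -1.0, 0.0, 1.0, 2.0]) * 0.125e6   # 5 closely spaced carriers
k = 4.0 * np.pi * freqs_hz / c                                  # two-way wavenumbers

def capon_brightness(samples, ranges_m, diagonal_loading=1e-3):
    """samples: complex array (n_freq, n_pulses) for one range gate.
    Returns the Capon brightness evaluated at the candidate ranges."""
    R = samples @ samples.conj().T / samples.shape[1]           # frequency-domain covariance
    R += diagonal_loading * np.trace(R).real / R.shape[0] * np.eye(R.shape[0])
    R_inv = np.linalg.inv(R)
    bright = np.empty(len(ranges_m))
    for i, r in enumerate(ranges_m):
        e = np.exp(-1j * k * r)                                 # steering vector at range offset r
        bright[i] = 1.0 / np.real(e.conj() @ R_inv @ e)
    return bright

# Toy usage: one bright layer 40 m above the gate center, plus weak noise.
rng = np.random.default_rng(0)
n_pulses, layer_offset = 256, 40.0
phase = np.exp(-1j * k[:, None] * layer_offset)
samples = phase * (rng.standard_normal(n_pulses) + 1j * rng.standard_normal(n_pulses))
samples += 0.1 * (rng.standard_normal((5, n_pulses)) + 1j * rng.standard_normal((5, n_pulses)))
sub_ranges = np.linspace(-150.0, 150.0, 61)                     # sub-gate grid relative to gate center
b = capon_brightness(samples, sub_ranges)
print("brightness peak at %.0f m (true offset %.0f m)" % (sub_ranges[np.argmax(b)], layer_offset))
```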


2012 ◽  
Vol 29 (10) ◽  
pp. 1417-1427 ◽  
Author(s):  
Therese E. Thompson ◽  
Louis J. Wicker ◽  
Xuguang Wang

Abstract Maximizing the accuracy of ensemble Kalman filtering (EnKF) radar data assimilation requires that the observation operator sample the model state in the same manner that the radar sampled the atmosphere. It may therefore be desirable to include volume averaging and power weighting in the observation operator. This study examines the impact of including radar-sampling effects in the Doppler velocity observation operator on EnKF analyses and forecasts. Locally substantial differences are found between a simple point operator and a realistic radar-sampling operator when they are applied to the model state at a single time. However, assimilation results indicate that the radar-sampling operator does not substantially improve the EnKF analyses or forecasts, and it greatly increases the computational cost of the data assimilation.
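As a concrete illustration of the two operator classes compared in this abstract, the sketch below contrasts a simple point operator (nearest grid value projected onto the radar beam) with a power-weighted volume average over neighboring grid points using Gaussian beam and gate weighting. The grid, geometry, and weighting widths are illustrative assumptions, not the authors' exact formulation, and hydrometeor fall speed is ignored.

```python
# Hedged sketch: two Doppler-velocity observation operators for EnKF radar DA.
# (1) a nearest-grid-point operator; (2) a power-weighted volume average with
# Gaussian weights along and across the beam. Grid, beamwidth, and gate length
# are illustrative assumptions.
import numpy as np

def radial_velocity(u, v, w, az_rad, el_rad):
    """Project a (u, v, w) wind onto the radar beam (azimuth from north)."""
    return (u * np.sin(az_rad) * np.cos(el_rad)
            + v * np.cos(az_rad) * np.cos(el_rad)
            + w * np.sin(el_rad))

def point_operator(winds, grid_xyz, obs_xyz, az, el):
    """Nearest-grid-point operator (no volume averaging)."""
    i = np.argmin(np.sum((grid_xyz - obs_xyz) ** 2, axis=1))
    return radial_velocity(*winds[i], az, el)

def volume_operator(winds, grid_xyz, obs_xyz, az, el,
                    beam_sigma_m=250.0, gate_sigma_m=125.0):
    """Power-weighted operator: Gaussian weights along and across the beam."""
    beam_dir = np.array([np.sin(az) * np.cos(el), np.cos(az) * np.cos(el), np.sin(el)])
    d = grid_xyz - obs_xyz
    along = d @ beam_dir                                        # distance along the beam
    across = np.linalg.norm(d - np.outer(along, beam_dir), axis=1)
    wgt = np.exp(-0.5 * (along / gate_sigma_m) ** 2 - 0.5 * (across / beam_sigma_m) ** 2)
    vr = radial_velocity(winds[:, 0], winds[:, 1], winds[:, 2], az, el)
    return np.sum(wgt * vr) / np.sum(wgt)

# Toy usage on a small 3-D grid with a vertically sheared wind field.
x, y, z = np.meshgrid(np.arange(0, 4000, 500.0),
                      np.arange(0, 4000, 500.0),
                      np.arange(0, 4000, 500.0), indexing="ij")
grid_xyz = np.column_stack([x.ravel(), y.ravel(), z.ravel()])
winds = np.column_stack([10.0 + 0.005 * grid_xyz[:, 2],        # u increases with height
                         5.0 * np.ones(len(grid_xyz)),          # constant v
                         np.zeros(len(grid_xyz))])              # no vertical motion
obs_xyz = np.array([2100.0, 1900.0, 1600.0])
az, el = np.radians(45.0), np.radians(2.0)
print("point operator :", round(point_operator(winds, grid_xyz, obs_xyz, az, el), 2))
print("volume operator:", round(volume_operator(winds, grid_xyz, obs_xyz, az, el), 2))
```

The volume operator touches many grid points per observation, which is one source of the extra computational cost noted in the abstract.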


2012 ◽  
Vol 29 (6) ◽  
pp. 796-806 ◽  
Author(s):  
Sebastián M. Torres ◽  
Christopher D. Curtis

Abstract The range-weighting function (RWF) determines how individual scatterer contributions are weighted as a function of range to produce the meteorological data associated with a single resolution volume. The RWF is commonly defined in terms of the transmitter pulse envelope and the receiver filter impulse response, and it determines the radar range resolution. However, the effective RWF also depends on the range-time processing involved in producing estimates of meteorological variables. This is a third contributor to the RWF that has become more significant in recent years as advanced range-time processing techniques have become feasible for real-time implementation on modern radar systems. In this work, a new formulation of the RWF for weather radars that incorporates the impact of signal processing is proposed. Following the derivation based on a general signal processing model, typical scenarios are used to illustrate the variety of RWFs that can result from different range-time signal processing techniques. Finally, the RWF is used to measure range resolution and the range correlation of meteorological data.
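The conventional two-component RWF described in this abstract (transmitter pulse envelope convolved with the receiver filter impulse response) can be sketched numerically as below; a third, processing-dependent stage, represented here by a simple running average in range standing in for range-time processing, broadens the result. The pulse length, filter shape, and averaging length are illustrative assumptions, not the paper's general signal processing model.

```python
# Hedged sketch of a range-weighting function (RWF) built from (1) the transmit
# pulse envelope, (2) the receiver filter impulse response, and (3) an extra
# range-time processing step (a running average as a stand-in). Parameters are
# illustrative assumptions.
import numpy as np

c = 3.0e8
dt = 25.0e-9                                   # sample spacing in time (s)
t = np.arange(0.0, 4.0e-6, dt)

tau = 1.57e-6                                  # transmit pulse width (s), assumed
pulse = (t <= tau).astype(float)               # rectangular pulse envelope

B = 0.8e6                                      # receiver filter bandwidth (Hz), assumed
h_rx = np.sinc(2.0 * B * (t - t.mean()))       # toy low-pass impulse response
h_rx /= np.sum(np.abs(h_rx))

composite = np.convolve(pulse, h_rx)           # pulse convolved with receiver filter
rwf_conventional = np.abs(composite) ** 2      # conventional RWF (power weighting)

avg_len = 4                                    # range-time processing: average 4 oversampled gates
rwf_effective = np.convolve(rwf_conventional, np.ones(avg_len) / avg_len)

def six_db_width_m(rwf):
    """Range extent (m) over which the RWF stays within 6 dB of its peak."""
    above = rwf >= rwf.max() * 10 ** (-6.0 / 10.0)
    return np.count_nonzero(above) * dt * c / 2.0

print("6-dB range resolution, conventional RWF:     %.0f m" % six_db_width_m(rwf_conventional))
print("6-dB range resolution, with range averaging: %.0f m" % six_db_width_m(rwf_effective))
```

The broadened 6-dB width of the processed RWF illustrates why range-time processing must be counted as a third contributor to range resolution.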


2010 ◽  
Vol 4 (1) ◽  
pp. 57-63 ◽  
Author(s):  
T. C. Cheung ◽  
P. W. Chan

More accurate prediction of the strong winds and heavy rain associated with tropical cyclones using numerical weather prediction (NWP) models would be helpful in the provision of weather services for the public. In this paper, the impact of assimilating radar data in simulations of Typhoon Neoguri and Severe Tropical Storm Kammuri in 2008 is studied using the Weather Research and Forecasting (WRF) model version 2.2 and WRF-Var version 2.1. Only data from the radar at Tate's Cairn in Hong Kong are considered. Four experiments are conducted, namely, (a) simulation without radar data, (b) simulation with radar data assimilated at the initial time only, (c) cycling simulation with radar data (Doppler velocity and reflectivity) directly assimilated, and (d) cycling simulation with assimilation of the 2D wind field retrieved from the radar's Doppler velocity data. Comparison with actual observations of the surface wind distribution in Hong Kong and with the observed radar reflectivity shows that both (c) and (d) outperform (a) and (b), and that (c) and (d) exhibit comparable skill. As a result, cycling simulation with the assimilation of weather radar data (even from a single radar) could improve the prediction of the winds and rainbands associated with tropical cyclones.


2016 ◽  
pp. 55-94
Author(s):  
Pier Luigi Marchini ◽  
Carlotta D'Este

The reporting of comprehensive income is becoming increasingly important. After the introduction of Other Comprehensive Income (OCI) reporting, as required by the 2007 revision of IAS 1, the IASB is currently seeking input from investors on the usefulness of unrealized gains and losses and on the role of comprehensive income. This circumstance is of particular relevance in code-law countries, as local pre-IFRS accounting models influence financial statement preparers and users. This study aims at investigating the role played by the reporting of unrealized gains and losses in users' decision processes, by examining the impact of OCI on the RoE ratio of Italian listed companies and by surveying a sample of financial analysts, whose formal reports are also content-analysed. The results show that the reporting of comprehensive income does not affect financial statement users' decision processes, although it has a statistically significant effect on Italian listed entities' reported performance.
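To make the RoE comparison concrete, the snippet below contrasts a return-on-equity ratio computed on net income with one computed on comprehensive income (net income plus OCI). The figures are invented purely for illustration and do not come from the study's sample.

```python
# Illustrative arithmetic only (figures invented): how including Other
# Comprehensive Income (OCI) changes a return-on-equity style ratio.
net_income = 120.0           # EUR million, assumed
oci = -35.0                  # unrealized losses routed through OCI, assumed
equity = 1_500.0             # average shareholders' equity, assumed

roe_traditional = net_income / equity
roe_comprehensive = (net_income + oci) / equity
print(f"RoE on net income:           {roe_traditional:.1%}")
print(f"RoE on comprehensive income: {roe_comprehensive:.1%}")
```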


Author(s):  
Yoav Weizman ◽  
Ezra Baruch

Abstract In recent years, two new techniques were introduced for flip-chip debug: the Laser Voltage Probing (LVP) technique and Time Resolved Light Emission Microscopy (TRLEM). Both techniques utilize silicon's relative transparency to wavelengths longer than the band gap. This inherent wavelength limitation, together with the shrinking dimensions of modern CMOS devices, limits the capabilities of these tools. It is known that the optical resolution limits of the LVP and TRLEM techniques are bounded by the diffraction limit, which is ~1 µm for both tools using standard optics. This limitation was reduced with the addition of immersion lens optics. Nevertheless, even with this improvement, shrinking transistor geometry is leading to increased acquisition time, and the overlapping effect between adjacent nodes remains a critical issue. The resolution limit is an order of magnitude above the device feature densities in the sub-90 nm era. The scaling down of transistor geometry leads to the inevitable consequence that more than 50% of the transistors in a 90 nm process have widths smaller than 0.4 µm. The acquisition time for such nodes becomes unreasonably long. In order to examine nodes in a dense logic circuit, crosstalk and convolution effects between neighboring signals also need to be considered. In this paper we will demonstrate the impact that these effects may have on modern designs. In order to maintain debug capability with the currently available analytical tools for future technologies, a conceptual modification of the FA process is required. This process should start at the IC design stage, where the VLSI designer should be familiar with FA constraints and thus apply features that enable enhanced FA capabilities in the circuit at hand during the electrical or physical design stages. The necessity for reliable failure analysis in real time should dictate that the designer of advanced VLSI blocks incorporate failure analysis constraints among other design rules. The purpose of this research is to supply the scientific basis for the optimal incorporation of design rules for optical probing in the sub-90 nm gate era. Circuit designers are usually familiar with the nodes in the design that are critical for debug, and with the type of measurement (logic or DC level) they require. The designer should enable the measurement of these signals by applying certain circuit and physical constraints. The implementation of these constraints may be done at the cell level, at the block level, or during integration. We will discuss the solutions that should be considered in order to mitigate tool limitations and to enable their use for next-generation processes.
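The resolution argument above can be made concrete with the standard diffraction-limit estimate d ≈ λ / (2·NA). The wavelength and numerical-aperture values in the sketch below are typical assumed figures for through-silicon probing, not numbers taken from the paper.

```python
# Back-of-the-envelope diffraction-limit arithmetic for through-silicon optical
# probing (LVP / TRLEM). Wavelength and numerical-aperture values are typical
# assumptions, not the paper's measurements.
wavelength_um = 1.3          # probing wavelength must exceed silicon's ~1.1 um band-gap cutoff

for label, na in [("standard objective (NA ~ 0.7)", 0.7),
                  ("solid immersion lens (NA ~ 3)", 3.0)]:
    spot_um = wavelength_um / (2.0 * na)      # classical lateral-resolution estimate
    print(f"{label}: ~{spot_um:.2f} um resolvable spot")

# With sub-90 nm transistors often narrower than 0.4 um, even the immersion-lens
# spot (~0.2 um) spans neighboring devices, so adjacent nodes overlap in the
# measurement -- the crosstalk/convolution concern raised in the text.
print("typical transistor width in a 90 nm process: < 0.4 um")
```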


2017 ◽  
Vol 7 (2) ◽  
pp. 7-25
Author(s):  
Karolina Diallo

Pupil with Obsessive-Compulsive Disorder. Over the past twenty years, childhood OCD has received more attention than any other anxiety disorder that occurs in childhood. The increasing interest and research in this area have led to an increasing number of diagnoses of OCD in children and adolescents, which affects both specialists and teachers. Depending on the severity of symptoms, OCD has a detrimental effect upon the child's school performance and can make it nearly impossible to concentrate on school and associated duties. This article is devoted to obsessive-compulsive disorder and its specifics in children, focusing on the impact of this disorder on the behaviour, experience, and performance of the child in the school environment. It stresses how important the role of the teacher is when a pupil with this diagnosis is in their class, and it points out that it is necessary to increase teachers' competence to identify children with OCD symptoms, to take the disorder into account, to adapt the course of teaching, and to introduce measures that could help children reduce anxiety and maintain (or improve) their school performance within, and in accordance with, the school regulations and curriculum.


Sensors ◽  
2021 ◽  
Vol 21 (10) ◽  
pp. 3480
Author(s):  
Walter Takashi Nakamura ◽  
Iftekhar Ahmed ◽  
David Redmiles ◽  
Edson Oliveira ◽  
David Fernandes ◽  
...  

The success of a software application is related to users’ willingness to keep using it. In this sense, evaluating User eXperience (UX) has become an important part of the software development process. Researchers have been carrying out studies by employing various methods to evaluate the UX of software products. Some studies reported varied and even contradictory results when applying different UX evaluation methods, making it difficult for practitioners to identify which results to rely upon. However, these works did not evaluate the developers’ perspectives and their impacts on the decision process. Moreover, such studies focused on one-shot evaluations, which cannot assess whether the methods provide the same big picture of the experience (i.e., deteriorating, improving, or stable). This paper presents a longitudinal study in which 68 students evaluated the UX of an online judge system by employing the AttrakDiff, UEQ, and Sentence Completion methods at three moments along a semester. This study reveals contrasting results between the methods, which affected developers’ decisions and interpretations. With this work, we intend to draw the HCI community’s attention to the contrast between different UX evaluation methods and the impact of their outcomes on the software development process.
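As an illustration of the kind of longitudinal comparison described above, the sketch below averages questionnaire items into scale scores at three measurement moments and labels the trend. The item coding (a 7-point semantic differential mapped to -3..+3) follows the usual UEQ-style convention, but the scale grouping, trend threshold, and data are invented, not the study's instruments or results.

```python
# Hedged sketch of a longitudinal UX-questionnaire comparison. Items are coded
# on a 7-point semantic differential mapped to -3..+3 (a common UEQ-style
# convention); the scale grouping, threshold, and data below are invented.
import numpy as np

def scale_means(responses_1to7, item_groups):
    """responses_1to7: (participants, items) raw answers in 1..7.
    Returns the mean score per named scale, recoded to the -3..+3 range."""
    coded = np.asarray(responses_1to7, dtype=float) - 4.0
    return {name: coded[:, items].mean() for name, items in item_groups.items()}

# Invented example: 3 moments x 5 participants x 4 items.
rng = np.random.default_rng(1)
item_groups = {"pragmatic": [0, 1], "hedonic": [2, 3]}      # assumed grouping
moments = [rng.integers(2, 7, size=(5, 4)) for _ in range(3)]

history = [scale_means(m, item_groups) for m in moments]
for scale in item_groups:
    series = [h[scale] for h in history]
    if series[-1] - series[0] > 0.25:
        trend = "improving"
    elif series[0] - series[-1] > 0.25:
        trend = "deteriorating"
    else:
        trend = "stable"
    print(scale, [round(s, 2) for s in series], "->", trend)
```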


2019 ◽  
Vol 148 (1) ◽  
pp. 63-81 ◽  
Author(s):  
Kevin Bachmann ◽  
Christian Keil ◽  
George C. Craig ◽  
Martin Weissmann ◽  
Christian A. Welzbacher

Abstract We investigate the practical predictability limits of deep convection in a state-of-the-art, high-resolution, limited-area ensemble prediction system. A combination of sophisticated predictability measures, namely, believable and decorrelation scale, are applied to determine the predictable scales of short-term forecasts in a hierarchy of model configurations. First, we consider an idealized perfect model setup that includes both small-scale and synoptic-scale perturbations. We find increased predictability in the presence of orography and a strongly beneficial impact of radar data assimilation, which extends the forecast horizon by up to 6 h. Second, we examine realistic COSMO-KENDA simulations, including assimilation of radar and conventional data and a representation of model errors, for a convectively active two-week summer period over Germany. The results confirm increased predictability in orographic regions. We find that both latent heat nudging and ensemble Kalman filter assimilation of radar data lead to increased forecast skill, but the impact is smaller than in the idealized experiments. This highlights the need to assimilate spatially and temporally dense data, but also indicates room for further improvement. Finally, the examination of operational COSMO-DE-EPS ensemble forecasts for three summer periods confirms the beneficial impact of orography in a statistical sense and also reveals increased predictability in weather regimes controlled by synoptic forcing, as defined by the convective adjustment time scale.
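One common way to construct a scale-dependent predictability measure of the kind used above is via the fractions skill score (FSS) computed between pairs of ensemble members, taking the smallest neighborhood at which the FSS exceeds the usual 0.5 + f/2 target as a "believable scale". The sketch below follows that generic construction; the paper's exact definitions of the believable and decorrelation scales may differ, and the fields, threshold, and grid spacing here are invented.

```python
# Hedged sketch: an FSS-based "believable scale" between two precipitation
# fields, one common neighborhood-based predictability construction (the
# paper's exact definitions may differ). Fields and parameters are invented.
import numpy as np
from scipy.ndimage import uniform_filter

def fss(field_a, field_b, threshold, window):
    """Fractions skill score between two fields for one neighborhood size."""
    fa = uniform_filter((field_a >= threshold).astype(float), size=window, mode="constant")
    fb = uniform_filter((field_b >= threshold).astype(float), size=window, mode="constant")
    mse = np.mean((fa - fb) ** 2)
    mse_ref = np.mean(fa ** 2) + np.mean(fb ** 2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

def believable_scale_km(field_a, field_b, threshold, dx_km=2.8, windows=range(1, 121, 2)):
    """Smallest neighborhood (km) at which FSS exceeds the common 0.5 + f/2 target."""
    f_base = np.mean(field_b >= threshold)                  # reference exceedance frequency
    target = 0.5 + f_base / 2.0
    for w in windows:
        if fss(field_a, field_b, threshold, w) >= target:
            return w * dx_km
    return np.inf                                           # never believable on this domain

# Invented fields: two "ensemble members" with mutually displaced rain cells.
rng = np.random.default_rng(2)
base = rng.gamma(0.3, 2.0, size=(200, 200))
member_a, member_b = base, np.roll(base, shift=(8, 5), axis=(0, 1))
print("believable scale ~", believable_scale_km(member_a, member_b, threshold=1.0), "km")
```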

