New Perspectives on Ensemble Sensitivity Analysis with Applications to a Climatology of Severe Convection

Author(s): Brian C. Ancell, Austin A. Coleman

Abstract. Ensemble sensitivity analysis (ESA) is a statistical technique applied within an ensemble to reveal the atmospheric flow features that relate to a chosen aspect of the flow. Given its ease of use (it is simply a linear regression between a chosen function of the forecast variables and the entire atmospheric state earlier or simultaneously in time), ensemble sensitivity has been the focus of several studies over roughly the last ten years. Such studies have primarily tried to understand the relevant dynamics and/or key precursors of high-impact weather events. Other applications of ESA have been more operationally oriented, including observation targeting within data assimilation systems and real-time adjustment techniques that attempt to utilize both sensitivity information and observations to improve forecasts. While ESA has gained popularity, its fundamental properties remain a substantially underutilized basis for realizing the technique's full scientific potential. For example, the relationship between ensemble sensitivity and the pure dynamics of the system can teach us how to appropriately apply various sensitivity-based applications, and combining sensitivity with other ensemble properties such as spread can distinguish between a fluid dynamics problem and a predictability one. This work aims to present new perspectives on ensemble sensitivity, and clarify its fundamentals, with the hope of making it a more accessible, attractive, and useful tool in the atmospheric sciences. These new perspectives are applied in part to a short climatology of severe convection forecasts to demonstrate the unique knowledge that can be gained through broadened use of ESA.
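To make the regression at the heart of ESA concrete, the sketch below computes the sensitivity of a scalar forecast metric to every element of an earlier ensemble state as the covariance divided by the variance, i.e. the slope of a simple linear regression across members. It is an illustrative NumPy implementation, not code from the study; the array shapes and the toy example are assumptions.

```python
import numpy as np

def ensemble_sensitivity(J, X):
    """Ensemble sensitivity of a scalar response function J to each state variable.

    J : (n_members,) value of the forecast metric for each ensemble member
    X : (n_members, n_state) earlier (or simultaneous) state variables

    Returns dJ/dx estimated as cov(J, x) / var(x) for every state variable,
    i.e. the slope of a simple linear regression of J on each x.
    """
    J = np.asarray(J, dtype=float)
    X = np.asarray(X, dtype=float)
    J_anom = J - J.mean()
    X_anom = X - X.mean(axis=0)
    cov = X_anom.T @ J_anom / (len(J) - 1)   # covariance of J with each state variable
    var = X_anom.var(axis=0, ddof=1)         # ensemble variance of each state variable
    return cov / var

# Toy example: 50 members, 1000 grid points, J driven mostly by grid point 10
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 1000))
J = 2.0 * X[:, 10] + rng.normal(scale=0.5, size=50)
sens = ensemble_sensitivity(J, X)
print(sens[10])   # close to 2.0
```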

2017, Vol 2 (9)
Author(s): Tobias Günther, Alexander Kuhn, Hans-Christian Hege, Markus Gross, Holger Theisel

2020, Vol 20 (5), pp. 1513-1531
Author(s): Oriol Rodríguez, Joan Bech, Juan de Dios Soriano, Delia Gutiérrez, Salvador Castán

Abstract. Post-event damage assessments are of paramount importance to document the effects of high-impact weather-related events such as floods or strong wind events. Moreover, evaluating the damage and characterizing its extent and intensity can be essential for further analysis such as completing a diagnostic meteorological case study. This paper presents a methodology to perform field surveys of damage caused by strong winds of convective origin (i.e. tornado, downburst and straight-line winds). It is based on previous studies and also on 136 field studies performed by the authors in Spain between 2004 and 2018. The methodology includes the collection of pictures and records of damage to human-made structures and on vegetation during the in situ visit to the affected area, as well as of available automatic weather station data, witness reports and images of the phenomenon, such as funnel cloud pictures, taken by casual observers. To synthesize the gathered data, three final deliverables are proposed: (i) a standardized text report of the analysed event, (ii) a table consisting of detailed geolocated information about each damage point and other relevant data and (iii) a map or a KML (Keyhole Markup Language) file containing the previous information ready for graphical display and further analysis. This methodology has been applied by the authors in the past, sometimes only a few hours after the event occurrence and, on many occasions, when the type of convective phenomenon was uncertain. In those uncertain cases, the information resulting from this methodology contributed effectively to discern the phenomenon type thanks to the damage pattern analysis, particularly if no witness reports were available. The application of methodologies such as the one presented here is necessary in order to build homogeneous and robust databases of severe weather cases and high-impact weather events.
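Deliverable (iii) above is a KML file of geolocated damage points ready for graphical display. The sketch below, using only the Python standard library, shows one way such a file could be generated from the damage-point table; the column names (id, latitude, longitude, damage_description, ef_rating) are hypothetical and not taken from the paper.

```python
import csv
from xml.sax.saxutils import escape

def damage_points_to_kml(csv_path, kml_path):
    """Write a simple KML file with one Placemark per surveyed damage point."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    placemarks = []
    for r in rows:
        name = escape(f"{r['id']} ({r.get('ef_rating', 'n/a')})")
        desc = escape(r.get("damage_description", ""))
        placemarks.append(
            "  <Placemark>\n"
            f"    <name>{name}</name>\n"
            f"    <description>{desc}</description>\n"
            f"    <Point><coordinates>{r['longitude']},{r['latitude']},0</coordinates></Point>\n"
            "  </Placemark>"
        )

    kml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
        + "\n".join(placemarks)
        + "\n</Document>\n</kml>\n"
    )
    with open(kml_path, "w", encoding="utf-8") as f:
        f.write(kml)
```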


2006, Vol 16 (3), pp. 167-180
Author(s): Kate M. Thomas, Dominique F. Charron, David Waltner-Toews, Corinne Schuster, Abdel R. Maarouf, ...

2020
Author(s): Marvin Kähnert, Teresa M. Valkonen, Harald Sodemann

Numerical weather prediction (NWP) models generally display comparatively low predictive skill in the Arctic. In particular, the large impact of sub-grid-scale, parameterised processes such as surface fluxes, radiation, or cloud microphysics during high-latitude weather events poses a substantial challenge for numerical modelling. Such processes are most influential during mesoscale weather events, such as polar lows, often embedded in cold air outbreaks (CAOs), some of which cause high-impact weather. Uncertainty in Arctic weather forecasts is thus critically dependent on parameterised processes. The strong influence of several parameterised processes also makes model forecasts particularly susceptible to compensating errors between different parameterisations, which potentially limits model improvement.

Here we analyse model output of the individual parameterised tendencies of wind, temperature, and humidity during Arctic high-impact weather in AROME-Arctic, the operational NWP model used by the Norwegian Meteorological Institute for the European Arctic. Individual tendencies describe the contribution of each physical parameterisation to a given variable per model time step. We study a CAO event taking place during 24-27 December 2015. This intense and widespread CAO event, reaching from the Fram Strait to Norway and affecting a particularly large portion of the Nordic Seas at once, was characterised by strong heat fluxes along the sea ice edge.

Model-internal definitions of boundary-layer type emerge as a decisive factor in the tendency contributions; the interplay between the dual-mass-flux and turbulence schemes is especially important here. Furthermore, sensitivity experiments, featuring a run without shallow convection and a run with a new statistical cloud scheme, show how a physically similar result can be obtained from substantially different tendencies in the model.
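As an illustration of the tendency diagnostics described above, the sketch below sums per-timestep temperature tendencies from individual parameterisations and compares them with the total physics tendency. The NetCDF variable names are hypothetical placeholders, not the actual AROME-Arctic output names.

```python
import xarray as xr

# Hypothetical variable names for per-timestep temperature tendencies written out
# by each parameterisation scheme (the AROME-Arctic output names will differ).
TENDENCY_VARS = ["dtdt_radiation", "dtdt_turbulence", "dtdt_shallow_conv", "dtdt_microphysics"]

def tendency_budget(path):
    """Sum the parameterised temperature tendencies and compare them with the model's
    total physics tendency to see how much each scheme contributes."""
    ds = xr.open_dataset(path)
    total_param = sum(ds[v] for v in TENDENCY_VARS)        # K per time step
    residual = ds["dtdt_total_physics"] - total_param      # near zero if the budget closes
    contributions = {v: float(abs(ds[v]).mean()) for v in TENDENCY_VARS}
    return total_param, residual, contributions
```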


2019, Vol 147 (11), pp. 4071-4089
Author(s): Jeremy D. Berman, Ryan D. Torn

Abstract. Perturbations to the potential vorticity (PV) waveguide, which can result from latent heat release within the warm conveyor belt (WCB) of midlatitude cyclones, can lead to the downstream radiation of Rossby waves, and in turn to high-impact weather events. Previous studies have hypothesized that forecast uncertainty associated with diabatic heating in WCBs can result in large downstream forecast variability; however, these studies have not established a direct connection between the two. This study evaluates the potential impact of latent heating variability in the WCB on subsequent downstream forecasts by applying the ensemble-based sensitivity method to European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble forecasts of a cyclogenesis event over the North Atlantic. For this case, ensemble members with a more amplified ridge are associated with greater negative PV advection by the irrotational wind, which is associated with stronger lower-tropospheric southerly moisture transport east of the upstream cyclone in the WCB. This transport is sensitive to the pressure trough to the south of the cyclone along the cold front, which in turn is modulated by earlier differences in the motion of the air masses on either side of the front. The position of the cold air behind the front is modulated by upstream tropopause-based PV anomalies, such that a deeper pressure trough is associated with a more progressive flow pattern, originating from Rossby wave breaking over the North Pacific. Overall, these results suggest that more accurate forecasts of upstream PV anomalies and WCBs may reduce forecast uncertainty in the downstream waveguide.
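The ensemble-based sensitivity method used here amounts to a member-by-member regression of a downstream forecast metric (for example, ridge amplitude) onto an upstream field, combined with some measure of statistical confidence. The sketch below is a generic NumPy/SciPy illustration of that calculation under assumed array shapes, not the authors' code.

```python
import numpy as np
from scipy import stats

def sensitivity_with_confidence(metric, field, alpha=0.05):
    """Regress a downstream forecast metric (one value per ensemble member) onto an
    upstream field (n_members x n_points) and flag statistically significant points.

    metric and field are NumPy arrays. Returns the regression slope (sensitivity)
    and a boolean mask from a two-sided t-test on the member-wise correlation.
    """
    n = len(metric)
    m_anom = metric - metric.mean()
    f_anom = field - field.mean(axis=0)
    cov = f_anom.T @ m_anom / (n - 1)
    slope = cov / f_anom.var(axis=0, ddof=1)
    corr = cov / (f_anom.std(axis=0, ddof=1) * m_anom.std(ddof=1))
    t = corr * np.sqrt((n - 2) / (1.0 - corr**2))
    p = 2.0 * stats.t.sf(np.abs(t), df=n - 2)
    return slope, p < alpha
```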


2021
Author(s): Santiago Gaztelumendi

Although the social media industry is now a very congested marketplace, Twitter continues to maintain its status as a popular social media platform. There are 330 million monthly active users and 145 million daily active users on Twitter, sending more than 6,000 tweets every second worldwide. In Spain, 85% of the population are social media users, with around 5 million Twitter profiles for a population of around 47 million. In the autonomous community of the Basque Country (2.17 million inhabitants), around 20% of citizens use Twitter.

Twitter is a social tool that enables users to post messages (tweets) of up to 280 characters, supporting a wide variety of social communication practices including photo and video attachments. The Basque Meteorology Agency account @Euskalmet, with more than 115.3 K followers, is one of the most popular accounts in the Basque Country. Twitter is not only an opportunity to spread messages instantaneously to people without intermediaries, but also a potential platform for valuable data acquisition using Twitter API capabilities. In this contribution, we present a study of different aspects related to the operational use of Twitter data in the context of high-impact weather scenarios at the local level.

The most important activities at Euskalmet are the actions taken around severe weather events: before the event, mainly forecasting and communication; during the event, nowcasting, surveillance, and impact monitoring; and after the event, post-event analysis. Throughout these complex processes, real-time tweets posted by local users offer a huge amount of data that, conveniently processed, could be useful for different purposes. For operational staff working at the office during severe weather episodes, it is critical to understand the local effects that an adverse phenomenon is causing and to correctly perceive the extent of impact and social alarm. For these purposes, among others, different pieces of information associated with posted tweets can be extracted and exploited. In this work, we present results that demonstrate how different data mining and advanced analytics techniques can be used to incorporate social media information into different tasks, particularly during high-impact weather events.

In this paper we summarize our experience during a proof-of-concept project for automatic real-time Twitter analysis and the development of an operational tool for Twitter API data exploitation in the Basque Country. We present the main challenges and problems we have had to face, including how to deal with the lack of geolocation information, since in the case of the Basque Country, as in other parts of the world, tweets containing geotags are the exception, not the rule.
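Because geotagged tweets are the exception, location often has to be inferred from the message or the user profile. The sketch below illustrates one simple approach: keyword filtering plus gazetteer matching of place names. The keyword list, the gazetteer, and the tweet dictionary fields are hypothetical and far simpler than the operational tool described in the abstract.

```python
import re

# Hypothetical keyword list and gazetteer; an operational tool would use richer
# dictionaries (Basque and Spanish) and data retrieved through the Twitter API.
STORM_KEYWORDS = {"granizo", "inundación", "viento", "nieve", "ekaitza", "euria"}
GAZETTEER = {
    "Bilbao": (43.263, -2.935),
    "Donostia": (43.318, -1.981),
    "Vitoria-Gasteiz": (42.846, -2.672),
}

def classify_tweet(tweet):
    """Return (is_weather_related, inferred_coordinates) for a tweet dict with
    'text', 'user_location' and optional 'geo' fields."""
    text = tweet["text"].lower()
    relevant = any(k in text for k in STORM_KEYWORDS)

    if tweet.get("geo"):   # native geotag: the exception, not the rule
        return relevant, tuple(tweet["geo"])

    # Fall back to gazetteer matching in the text, then in the free-text profile location
    for source in (tweet["text"], tweet.get("user_location", "")):
        for place, coords in GAZETTEER.items():
            if re.search(rf"\b{re.escape(place)}\b", source, re.IGNORECASE):
                return relevant, coords
    return relevant, None
```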


Author(s): Katsiaryna M. Sumak, Inna G. Semenova

In recent decades, worldwide and in the Republic of Belarus in particular, the question of the impact of weather conditions on economic sectors and the life of the population has become acute. Sudden changes in weather conditions can lead to adverse and dangerous weather phenomena that cause significant damage to the country's economy. This paper examines the frequency of dangerous weather phenomena in cyclones of different trajectories that moved through the territory of the Republic of Belarus during the period 1995–2015. It is identified that southern and western cyclones caused the dangerous weather events over the territory of Belarus. The interannual and seasonal frequency of cyclones causing dangerous weather phenomena in Belarus was analyzed. It is shown that the largest number of southern and western cyclones occurred mainly in the summer period and in the transitional seasons of the year; therefore, the dangerous weather phenomena were associated mainly with the development of severe convection on atmospheric fronts. Phenomena such as very heavy rain, snowfall, and wind had the highest frequency in cyclones of both southern and western trajectories. Strong wet-snow accretion and large hail were isolated cases, and these phenomena were recorded only locally over the territory of the country.
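The kind of frequency analysis described above can be illustrated with a small cross-tabulation of an event catalogue by trajectory type, season, and phenomenon. The catalogue below is a tiny hypothetical stand-in, not the authors' 1995–2015 dataset.

```python
import pandas as pd

# Hypothetical catalogue of dangerous-weather events associated with cyclones.
events = pd.DataFrame({
    "year":       [1997, 2003, 2010, 2014],
    "trajectory": ["southern", "western", "southern", "western"],
    "season":     ["summer", "spring", "summer", "autumn"],
    "phenomenon": ["very heavy rain", "strong wind", "large hail", "heavy snowfall"],
})

# Seasonal frequency of cyclones with dangerous weather, split by trajectory
seasonal = pd.crosstab(events["season"], events["trajectory"])

# Which phenomena dominate each trajectory type
by_phenomenon = pd.crosstab(events["phenomenon"], events["trajectory"])
print(seasonal, by_phenomenon, sep="\n\n")
```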


2020, Vol 12 (14), pp. 2305
Author(s): Taylor Zimmerman, Karine Jansen, Jon Miller

Measuring beach topography accurately and with high spatial resolution is an important aspect of coastal management and is crucial for understanding changes in beach morphology, especially along complex, three-dimensional shorelines. Traditional methods of beach surveying, even at high resolution, are insufficient to measure the complex, dynamic behavior along these coasts. This study investigates the optimization of Unmanned Aerial Systems Structure from Motion (UAS-SfM) data acquisition methodology with regard to flight altitude and the configuration and number of ground control points (GCPs). A sensitivity analysis was performed to determine the UAS and GCP characteristics that produce the most accurate digital elevation model (DEM). First, an evaluation of the UAS-SfM technique was performed and proved advantageous over traditional surveying techniques with regard to efficiency, automation, ease of use, and repeatability. The results of the sensitivity analysis showed that the highest flight altitude evaluated (116 m) produced the most accurate DEM and required the least survey and processing time. The optimal configuration of GCPs was determined to be (1) in the corners of the study site, (2) at high and low elevations within the study site, and (3) with sufficient cross-shore and alongshore coverage. Finally, it was found that 15 GCPs produced the best results, but that as few as 11 GCPs could be used without any significant loss in accuracy. It was also observed that fewer (≈7–9) well-placed GCPs in the optimal configuration produced the same magnitude of error as more (15) poorly placed GCPs. Based on these results, a set of recommendations for conducting UAS-SfM surveys along complex, three-dimensional, developed coastlines is presented.
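The accuracy comparison behind the sensitivity analysis reduces to computing the vertical error of each DEM at independent checkpoints. The sketch below shows such an RMSE check for two GCP configurations; all elevation values are made up for illustration.

```python
import numpy as np

def vertical_rmse(dem_elevations, checkpoint_elevations):
    """Root-mean-square vertical error of a DEM at independent checkpoints."""
    diff = np.asarray(dem_elevations) - np.asarray(checkpoint_elevations)
    return float(np.sqrt(np.mean(diff**2)))

# Hypothetical comparison of two GCP configurations (values in metres)
checkpoints = np.array([1.82, 2.10, 0.95, 3.40, 2.75])
dem_15_gcps = np.array([1.85, 2.07, 0.99, 3.35, 2.78])   # DEM built with 15 GCPs
dem_5_gcps  = np.array([1.95, 2.25, 0.80, 3.60, 2.60])   # DEM built with 5 poorly placed GCPs

for label, dem in [("15 GCPs", dem_15_gcps), ("5 GCPs", dem_5_gcps)]:
    print(label, f"RMSE = {vertical_rmse(dem, checkpoints):.3f} m")
```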

