A Spatiotemporal Water Vapor–Deep Convection Correlation Metric Derived from the Amazon Dense GNSS Meteorological Network

2017 ◽  
Vol 145 (1) ◽  
pp. 279-288 ◽  
Author(s):  
David K. Adams ◽  
Henrique M. J. Barbosa ◽  
Karen Patricia Gaitán De Los Ríos

Abstract. Deep atmospheric convection, which covers a large range of spatial scales during its evolution, continues to be a challenge for models to replicate, particularly over land in the tropics. Specifically, the shallow-to-deep convective transition and organization on the mesoscale are often not properly represented in coarse-resolution models. High-resolution models offer insights on physical mechanisms responsible for the shallow-to-deep transition. Model verification, however, at both coarse and high resolution requires validation and, hence, observational metrics, which are lacking in the tropics. Here a straightforward metric derived from the Amazon Dense GNSS Meteorological Network (~100 km × 100 km) is presented, based on a spatial correlation decay time scale during convective evolution on the mesoscale. For the shallow-to-deep transition, the correlation decay time scale is shown to be around 3.5 h. This novel result provides a much-needed metric from the deep tropics for numerical models to replicate.
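A correlation decay time scale of this kind can be illustrated with a short sketch. This is not the authors' algorithm; it is a minimal stand-in that estimates the lag at which the station-mean lagged autocorrelation of 5-min PWV series first falls below 1/e. The station layout, window handling, and the 1/e criterion are illustrative assumptions.

```python
import numpy as np

def correlation_decay_timescale(pwv, dt_minutes=5.0):
    """Lag (in hours) at which the station-mean lagged autocorrelation
    of precipitable water vapor (PWV) first drops below 1/e.

    pwv : (n_stations, n_times) array of PWV values.
    Returns None if the correlation never falls below 1/e.
    """
    anom = pwv - pwv.mean(axis=1, keepdims=True)
    n_sta, n_t = anom.shape
    for lag in range(1, n_t // 2):
        # Mean lagged correlation across stations at this lag.
        r = np.mean([np.corrcoef(anom[s, :-lag], anom[s, lag:])[0, 1]
                     for s in range(n_sta)])
        if r < 1.0 / np.e:
            return lag * dt_minutes / 60.0
    return None
```

Applied to PWV observations spanning a shallow-to-deep transition, a metric of this family is what yields the ~3.5 h value quoted in the abstract.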

2008 ◽  
Vol 21 (10) ◽  
pp. 2187-2203 ◽  
Author(s):  
Benjamin R. Lintner ◽  
J. David Neelin

Abstract. The decay characteristics of a mixed layer ocean passively coupled to an atmospheric model are important to the response of the climate system to stochastic or external forcing. Two salient features of such decay—the spatial-scale dependence of sea surface temperature anomaly (SSTA) decay time scales and the spatial inhomogeneities of SSTA decay modes—are addressed using intermediate-level complexity and simple analytic models of the tropical atmosphere. As expected, decay time scales increase with the spatial extent of the SSTA. Most modes decay rapidly—with characteristic decay times of 50–100 days for a 50-m mixed layer—with the decay determined by local surface flux adjustment. Only those modes with spatial scales approaching or larger than the tropical basin scale exhibit decay time scales distinctively longer than the local decay, with the decay time scale of the most slowly decaying mode of the order of 250–300 days in the tropics (500 days globally). Simple analytic prototypes of the spatial-scale dependence and the effect of basic-state inhomogeneities, especially the impact of nonconvecting regions, elucidate these results. Horizontal energy transport sets the transition between fast, essentially local, decay time scales and the slower decay at larger spatial scales; within the tropics, efficient wave dynamics accounts for the small number of slowly decaying modes. Inhomogeneities in the basic-state climate, such as the presence or absence of mean tropical deep convection, strongly impact large-scale SSTA decay characteristics. For nonconvecting regions, SSTA decay is slow because evaporation is limited by relatively slow moisture divergence. The separation of convecting- and nonconvecting-region decay times and the closeness of the slower nonconvecting-region decay time scale to the most slowly decaying modes cause a blending between local nonconvecting modes and the large-scale modes, resulting in pronounced spatial inhomogeneity in the slow decay modes.
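The e-folding decay times quoted above (50–100 days for fast local modes, 250–300 days for the gravest tropical modes) can be estimated from an anomaly series with a log-linear fit. The sketch below is illustrative only, not the study's method, and assumes a single, positive, exponentially decaying anomaly:

```python
import numpy as np

def efolding_timescale(ssta, dt_days=1.0):
    """Estimate the e-folding decay time tau (days) of a positive
    SST anomaly series by fitting log(ssta) = log(A) - t / tau."""
    t = np.arange(len(ssta)) * dt_days
    slope, _ = np.polyfit(t, np.log(ssta), 1)
    return -1.0 / slope
```

For a clean exponential the fit recovers tau exactly; for real SSTA, one would first project onto a single decay mode.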


2019 ◽  
Vol 147 (11) ◽  
pp. 4127-4149 ◽  
Author(s):  
Ron McTaggart-Cowan ◽  
Paul A. Vaillancourt ◽  
Ayrton Zadra ◽  
Leo Separovic ◽  
Shawn Corvec ◽  
...  

Abstract. The parameterization of deep moist convection as a subgrid-scale process in numerical models of the atmosphere is required at resolutions that extend well into the convective “gray zone,” the range of grid spacings over which such convection is partially resolved. However, as model resolution approaches the gray zone, the assumptions upon which most existing convective parameterizations are based begin to break down. We focus here on one aspect of this problem that emerges as the temporal and spatial scales of the model become similar to those of deep convection itself. The common practice of static tendency application over a prescribed adjustment period leads to logical inconsistencies at resolutions approaching the gray zone, while more frequent refreshment of the convective calculations can lead to undesirable intermittent behavior. A proposed parcel-based treatment of convective initiation introduces memory into the system in a manner that is consistent with the underlying physical principles of convective triggering, thus reducing the prevalence of unrealistic gradients in convective activity in an operational model running with a 10 km grid spacing. The subsequent introduction of a framework that considers convective clouds as persistent objects, each possessing unique attributes that describe physically relevant cloud properties, appears to improve convective precipitation patterns by depicting realistic cloud memory, movement, and decay. Combined, this Lagrangian view of convection addresses one aspect of the convective gray zone problem and lays a foundation for more realistic treatments of the convective life cycle in parameterization schemes.


2015 ◽  
Vol 96 (12) ◽  
pp. 2151-2165 ◽  
Author(s):  
David K. Adams ◽  
Rui M. S. Fernandes ◽  
Kirk L. Holub ◽  
Seth I. Gutman ◽  
Henrique M. J. Barbosa ◽  
...  

Abstract. The complex interactions between water vapor fields and deep atmospheric convection remain one of the outstanding problems in tropical meteorology. The lack of high spatial–temporal resolution, all-weather observations in the tropics has hampered progress. Numerical models have difficulties, for example, in representing the shallow-to-deep convective transition and the diurnal cycle of precipitation. Global Navigation Satellite System (GNSS) meteorology, which provides all-weather, high-frequency (5 min), precipitable water vapor estimates, can help. The Amazon Dense GNSS Meteorological Network experiment, the first of its kind in the tropics, was created with the aim of examining water vapor and deep convection relationships at the mesoscale. This innovative, Brazilian-led international experiment consisted of two mesoscale (100 km × 100 km) networks: 1) a 1-yr (April 2011–April 2012) campaign (20 GNSS meteorological sites) in and around Manaus and 2) a 6-week (June 2011) intensive campaign (15 GNSS meteorological sites) in and around Belém, the latter in collaboration with the Cloud Processes of the Main Precipitation Systems in Brazil: A Contribution to Cloud-Resolving Modeling and to the Global Precipitation Measurement (CHUVA) Project in Brazil. Results presented here from both networks focus on the diurnal cycle of precipitable water vapor associated with sea-breeze convection in Belém and seasonal and topographic influences in and around Manaus. Ultimately, these unique observations may serve to initialize, constrain, or validate precipitable water vapor in high-resolution models. These experiments also demonstrate that GNSS meteorology can expand into logistically difficult regions such as the Amazon. Other GNSS meteorology networks presently being constructed in the tropics are summarized.


2010 ◽  
Vol 10 (2) ◽  
pp. 2357-2395 ◽  
Author(s):  
N. C. Dickson ◽  
K. M. Gierens ◽  
H. L. Rogers ◽  
R. L. Jones

Abstract. The global observation, assimilation and prediction in numerical models of ice super-saturated (ISS) regions (ISSR) are crucial if the climate impact of aircraft condensation trails (contrails) is to be fully understood, and if, for example, contrail formation is to be avoided through aircraft operational measures. A robust assessment of the global distribution of ISSR will further this debate, and ISS event occurrence, frequency and spatial scales have recently attracted significant attention. The mean horizontal path length through ISSR as observed by MOZAIC aircraft is 150 km (±250 km). The average vertical thickness of ISS layers is 600–800 m (±575 m), but layers ranging from 25 m to 3000 m have been observed, with up to one third of ISS layers thought to be less than 100 m deep. Given their small scales compared to typical atmospheric model grid sizes, statistical representations of the spatial scales of ISSR are required, in both horizontal and vertical dimensions, if the global occurrence of ISSR is to be adequately represented in climate models. This paper uses radiosonde launches made by the UK Meteorological Office from the British Isles, Gibraltar, St. Helena and the Falkland Islands between January 2002 and December 2006 to investigate the probabilistic occurrence of ISSR. Specifically, each radiosonde profile is divided into 50- and 100-hPa pressure layers, to emulate the coarse vertical resolution of some atmospheric models. The high-resolution observations contained within each thick pressure layer are then used to calculate an average relative humidity and an ISS fraction for each individual thick pressure layer. These relative-humidity pressure-layer descriptions are then linked through a probability function to produce an s-shaped curve describing the ISS fraction in any average relative humidity pressure layer. An empirical investigation has shown that this one curve is statistically valid for mid-latitude locations, irrespective of season and altitude; pressure layer depth, however, is an important variable. Using this empirical understanding of the s-shaped relationship, a mathematical model was developed to represent the ISS fraction within any arbitrary thick pressure layer. Here the statistical distributions of actual high-resolution RHi observations in any thick pressure layer, along with an error function, are used to describe the s-shape mathematically. Two models were developed to represent the 50- and 100-hPa pressure layers, each reconstructing its respective s-shape within 8–10% of the empirical curve. These new models can be used to represent the small-scale structure of ISS events in modelled data where only low vertical resolution is available. This will be useful in understanding and improving both the observed and forecast global distributions of ice super-saturation.
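An error-function s-curve of the kind described above can be sketched as follows. This is a generic erf-based form, not the paper's fitted model; `mu` and `sigma` are placeholder parameters that would in practice be fitted per pressure-layer depth.

```python
from math import erf, sqrt

def iss_fraction(rhi_mean, mu=100.0, sigma=10.0):
    """S-shaped (error-function) model of the ice-supersaturated
    fraction of a thick pressure layer, as a function of the
    layer-mean relative humidity over ice (RHi, %).

    mu, sigma are illustrative placeholders, not fitted values."""
    return 0.5 * (1.0 + erf((rhi_mean - mu) / (sigma * sqrt(2.0))))
```

The curve rises monotonically from 0 to 1, reaching 0.5 at `mu`, which is the qualitative behavior the paper's probability function captures.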


Author(s):  
Russ S. Schumacher

Heavy precipitation, which in many contexts is welcomed because it provides the water necessary for agriculture and human use, in other situations is responsible for deadly and destructive flash flooding. Over the 30-year period from 1986 to 2015, floods were responsible for more fatalities in the United States than any other convective weather hazard (www.nws.noaa.gov/om/hazstats.shtml), and similar findings are true in other regions of the world. Although scientific understanding of the processes responsible for heavy rainfall continues to advance, there are still many challenges associated with predicting where, when, and how much precipitation will occur. Common ingredients are required for heavy rainfall to occur, but there are vastly different ways in which the atmosphere brings the ingredients together in different parts of the world. Heavy precipitation often occurs on very small spatial scales in association with deep convection (thunderstorms), factors that limit the ability of numerical models to represent or predict the location and intensity of rainfall. Furthermore, because flash floods are dependent not only on precipitation but also on the characteristics of the underlying land surface, there are fundamental difficulties in accurately representing these coupled processes. Areas of active current research on heavy rainfall and flash flooding include investigating the storm-scale atmospheric processes that promote extreme precipitation, analyzing the reasons that some rainfall predictions are very accurate while others fail, improving the understanding and prediction of the flooding response to heavy precipitation, and determining how heavy rainfall and floods have changed and may continue to change in a changing climate.


2011 ◽  
Vol 139 (9) ◽  
pp. 3016-3035 ◽  
Author(s):  
Russ S. Schumacher

This study makes use of operational global ensemble forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF) to examine the factors contributing to, or inhibiting, the development of a long-lived continental vortex and its associated rainfall. From 25 to 30 June 2007, a vortex developed and grew upscale over the southern plains of the United States. It was associated with persistent heavy rainfall, with over 100 mm of rain falling in much of Texas, Oklahoma, Kansas, and Missouri, and amounts exceeding 300 mm in southeastern Kansas. Previous research has shown that, in comparison with other rainfall events of similar temporal and spatial scales, this event was particularly difficult for numerical models to predict. Considering the ensemble members as different possible realizations of the evolution of the event, several methods are used to examine the processes that led to the development and maintenance of the long-lived vortex and its associated rainfall, and to its apparently limited predictability. Linear statistics are calculated to identify synoptic-scale flow features that were correlated to area-averaged precipitation, and differences between composites of “dry” and “wet” ensemble members are used to pinpoint the processes that were favorable or detrimental to the system’s development. The maintenance of the vortex, and its slow movement in the southern plains, are found to be closely related to the strength of a closed midlevel anticyclone in the southwestern United States and the strength of a midlevel ridge in the northern plains. In particular, with a weaker upstream anticyclone, the shear and flow over the incipient vortex are relatively weak, which allows for slow movement and persistent heavy rains. On the other hand, when the upstream anticyclone is stronger, there is stronger northerly shear and flow, which causes the incipient vortex to move southwestward into the high terrain of Mexico and dissipate. These relatively small differences in the wind and mass fields early in the ensemble forecast, in conjunction with modifications of the synoptic and mesoscale flow by deep convection, lead to very large spread in the resulting precipitation forecasts.
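The dry/wet compositing technique mentioned above can be sketched generically: rank members by area-averaged precipitation, average a diagnostic field over the wettest and driest subsets, and difference the composites. This is a minimal stand-in, not the study's code; the member count and quartile fraction are arbitrary choices.

```python
import numpy as np

def wet_dry_composite_diff(field, precip, frac=0.25):
    """Wet-minus-dry composite of a 2-D diagnostic field over
    ensemble members ranked by area-averaged precipitation.

    field  : (n_members, ny, nx) array, e.g. midlevel height anomalies
    precip : (n_members,) area-averaged precipitation per member
    frac   : fraction of members in each composite (quartiles here)
    Returns an array of shape (ny, nx).
    """
    n = len(precip)
    k = max(1, int(round(frac * n)))
    order = np.argsort(precip)            # driest first
    dry = field[order[:k]].mean(axis=0)
    wet = field[order[-k:]].mean(axis=0)
    return wet - dry
```

Features with large wet-minus-dry differences are candidates for the flow elements (here, the upstream anticyclone and northern-plains ridge) that discriminate heavy-rain outcomes.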


2021 ◽  
Author(s):  
Michele Salmi ◽  
Chiara Marsigli ◽  
Manfred Dorninger

During the last decade, constant improvements in computational capacity led to the development of the first limited-area, kilometer-scale ensemble prediction systems (L-EPS). The COSMO-D2 EPS (now ICON-D2) is the operational L-EPS at the German weather service (DWD) and has a spatial resolution of around 2 km. This grid resolution allows deep convective processes such as thunderstorms or heavy showers to be handled explicitly, without any physical parametrization. Special parameters involving both cloud (micro-)physics and large-scale lifting, such as the Lightning Potential Index (LPI), have also been developed to bring the forecasting of deep convection, and therefore also of lightning activity, to a new level of spatial accuracy. Such high-precision forecasts, however, also carry a much higher error potential, at least under gridpoint verification. Using this high-resolution setup in an ensemble prediction system may nevertheless bring large benefits in terms of skill and predictability. This work is a preliminary attempt to apply innovative verification approaches such as the dispersion Fractions Skill Score (dFSS) and the ensemble-SAL (eSAL) to the LPI in the COSMO-D2 EPS. The aim of this work is to assess the relationship between the ensemble error and the ensemble dispersion at different spatial scales. For the summer months of 2019, the COSMO-D2 EPS shows a general tendency to underestimate the unpredictability of lightning events, though the spread-error relationship varies greatly with forecast lead time. With the help of the dFSS, this relationship can also be expressed in terms of skillful scales: on average, the system produces a useful forecast during the afternoon hours for horizontal scales of around 200 km, while the ensemble members show an average horizontal dispersion of about half that value, roughly 100 km.
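The standard Fractions Skill Score underlying the dFSS compares neighborhood exceedance fractions of forecast and observed fields. A minimal numpy sketch of the plain FSS (not the dispersion variant used in the study), assuming square neighborhoods and keeping only fully interior windows:

```python
import numpy as np

def _box_fractions(binary, n):
    """Mean of a binary field over every interior n-by-n window,
    computed with an integral image (summed-area table)."""
    c = np.cumsum(np.cumsum(binary, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))
    s = c[n:, n:] - c[:-n, n:] - c[n:, :-n] + c[:-n, :-n]
    return s / float(n * n)

def fss(fcst, obs, thresh, n):
    """Fractions Skill Score of a forecast field against observations,
    at exceedance threshold `thresh` and neighborhood size n."""
    pf = _box_fractions(fcst >= thresh, n)
    po = _box_fractions(obs >= thresh, n)
    mse = np.mean((pf - po) ** 2)
    ref = np.mean(pf ** 2) + np.mean(po ** 2)
    return 1.0 - mse / ref if ref > 0 else np.nan
```

The "skillful scale" is conventionally the smallest n at which FSS exceeds a useful-skill threshold; the dFSS applies the same idea to member-vs-member dispersion.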


2021 ◽  
Vol 13 (13) ◽  
pp. 2508
Author(s):  
Loredana Oreti ◽  
Diego Giuliarelli ◽  
Antonio Tomao ◽  
Anna Barbati

The importance of mixed forests is increasingly recognized on a scientific level, due to their greater productivity and efficiency in resource use compared to pure stands. However, a reliable quantification of the actual spatial extent of mixed stands on a fine spatial scale is still lacking. Indeed, classification and mapping of mixed populations, especially with semi-automatic procedures, has been a challenging issue to date. The main objective of this study is to evaluate the potential of Object-Based Image Analysis (OBIA) and Very-High-Resolution (VHR) imagery to detect and map mixed forests of broadleaved and coniferous trees with a Minimum Mapping Unit (MMU) of 500 m². This study evaluates segmentation-based classification paired with the non-parametric K-nearest-neighbors (K-NN) method, trained with a dataset independent from the validation one. The forest area mapped as mixed forest canopies in the study area amounts to 11%, with an overall accuracy of 85% and a K of 0.78. Better levels of user and producer accuracy (85–93%) are reached in conifer- and broadleaved-dominated stands. The study findings demonstrate that very-high-resolution images (0.20 m spatial resolution) can be reliably used to detect the fine-grained pattern of rare mixed forests, thus supporting the monitoring and management of forest resources also on fine spatial scales.
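The accuracy figures quoted (overall accuracy of 85%, K of 0.78) are standard confusion-matrix statistics. As a sketch, with a made-up confusion matrix rather than the study's data:

```python
import numpy as np

def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa (K) from a square
    confusion matrix (rows = reference class, columns = mapped class)."""
    cm = np.asarray(confusion, dtype=float)
    total = cm.sum()
    po = np.trace(cm) / total                                  # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2  # chance agreement
    return po, (po - pe) / (1.0 - pe)
```

User and producer accuracies are the per-class analogues: row-wise and column-wise ratios of the diagonal to the class totals.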


2010 ◽  
Vol 6 (S272) ◽  
pp. 398-399 ◽  
Author(s):  
Carol E. Jones ◽  
Christopher Tycner ◽  
Jessie Silaj ◽  
Ashly Smith ◽  
T. A. Aaron Sigut

Abstract. Hα high-resolution spectroscopy combined with detailed numerical models is used to probe the physical conditions, such as density, temperature, and velocity, of Be star disks. Models have been constructed for Be stars over a range of spectral types and inclination angles. We find that a variety of line shapes can be obtained by keeping the inclination fixed and changing the density alone. This is because our models account for disk temperature distributions self-consistently from the requirement of radiative equilibrium. A new analytical tool, called the variability ratio, was developed to identify emission-line stars at particular stages of variability. It is used in this work to quantify changes in the Hα equivalent widths of our observed spectra.
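The Hα equivalent widths mentioned above follow the usual definition EW = ∫(1 − F/Fc) dλ. A minimal sketch using trapezoidal integration (the variability ratio itself is the authors' tool and is not reproduced here):

```python
import numpy as np

def equivalent_width(wavelength, flux, continuum):
    """Equivalent width EW = integral of (1 - F/Fc) d(lambda),
    by the trapezoidal rule. With this sign convention, net
    emission (F > Fc), as in Be star disks, gives EW < 0."""
    y = 1.0 - flux / continuum
    d = np.diff(wavelength)
    return np.sum(d * (y[:-1] + y[1:]) / 2.0)
```

Tracking EW over repeated observations is one simple way to quantify the emission variability the abstract describes.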

