Downscaling of ERA-Interim Temperature in the Contiguous United States and Its Implications for Rain–Snow Partitioning

2018 ◽  
Vol 19 (7) ◽  
pp. 1215-1233 ◽  
Author(s):  
Guoqiang Tang ◽  
Ali Behrangi ◽  
Ziqiang Ma ◽  
Di Long ◽  
Yang Hong

Abstract Precipitation phase has an important influence on hydrological processes. The Integrated Multisatellite Retrievals for Global Precipitation Measurement (IMERG) uses temperature data from reanalysis products to implement rain–snow classification. However, the coarse resolution of reanalysis data may not capture the spatiotemporal variability of temperature, necessitating appropriate downscaling methods. This study compares the performance of eight air temperature (Ta) downscaling methods in the contiguous United States and six mountain ranges, using temperature from the Parameter-Elevation Regressions on Independent Slopes Model (PRISM) as the benchmark. ERA-Interim Ta is downscaled from the original 0.75° to 0.1°. The results suggest that the two purely statistical downscaling methods [nearest neighbor (NN) and bilinear interpolation (BI)] perform similarly. The five downscaling methods based on the free-air temperature lapse rate (TLR), calculated from temperature and geopotential heights at different pressure levels, notably improve the accuracy of Ta, particularly in mountainous regions. We further calculated wet-bulb temperature (Tw) for rain–snow classification using Ta and dewpoint temperature from ERA-Interim and PRISM. TLR-based downscaling methods yield more accurate Tw than NN and BI in the western United States, whereas the improvement is limited in the eastern United States. Rain–snow partitioning is conducted using a critical threshold of Tw, with Snow Data Assimilation System (SNODAS) snowfall data serving as the benchmark. ERA-Interim-based Tw using TLR downscaling methods outperforms both NN/BI-based Tw and the IMERG precipitation phase. In conclusion, TLR-based downscaling methods show promise for acquiring high-quality, high-resolution Ta and Tw and for improving rain–snow partitioning, particularly in mountainous regions.
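The lapse-rate adjustment and wet-bulb threshold described above can be sketched as follows. The lapse-rate value, the 1 °C threshold, and the Stull (2011) wet-bulb approximation are illustrative assumptions, not the paper's exact configuration, and the function names are hypothetical.

```python
import math

def downscale_temperature(t_coarse_c, z_coarse_m, z_fine_m, lapse_c_per_km=-6.5):
    """Shift a coarse-grid air temperature to fine-grid elevation with a lapse rate."""
    return t_coarse_c + lapse_c_per_km * (z_fine_m - z_coarse_m) / 1000.0

def wet_bulb_stull(t_c, rh_pct):
    """Stull (2011) empirical wet-bulb approximation (nominally fitted for
    moderate temperatures at sea-level pressure; used here only for illustration)."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct) - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

def classify_phase(tw_c, threshold_c=1.0):
    """Rain-snow split at a critical wet-bulb threshold (threshold value assumed)."""
    return "snow" if tw_c <= threshold_c else "rain"

# A coarse cell centered at 1200 m downscaled to a fine pixel at 2400 m:
t_fine = downscale_temperature(2.0, 1200.0, 2400.0)  # 2.0 - 6.5*1.2 = -5.8 C
tw = wet_bulb_stull(t_fine, 80.0)
print(classify_phase(tw))  # -> snow
```

The key point is that a fine pixel 1200 m above the coarse-cell mean elevation flips from rain to snow once the lapse-rate correction is applied; NN or BI interpolation alone would leave the temperature near 2 °C.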

2015 ◽  
Vol 16 (4) ◽  
pp. 1466-1477 ◽  
Author(s):  
Elizabeth M. Sims ◽  
Guosheng Liu

Abstract When estimating precipitation using remotely sensed observations, it is important to correctly classify the phase of precipitation. A misclassification can result in order-of-magnitude errors in the estimated precipitation rate. Using global ground-based observations over multiple years, the influence of different geophysical parameters on precipitation phase is investigated, with the goal of obtaining an improved method for determining precipitation phase. The parameters studied are near-surface air temperature, atmospheric moisture, low-level vertical temperature lapse rate, surface skin temperature, surface pressure, and land cover type. To combine the effects of temperature and moisture, wet-bulb temperature, instead of air temperature, is used as a key parameter for separating solid and liquid precipitation. Results show that in addition to wet-bulb temperature, vertical temperature lapse rate affects the precipitation phase. For example, at a near-surface wet-bulb temperature of 0°C, a lapse rate of 6°C km−1 results in an 86% conditional probability of solid precipitation, while a lapse rate of −2°C km−1 results in a 45% probability. For near-surface wet-bulb temperatures less than 0°C, skin temperature affects precipitation phase, although the effect appears to be minor. Results also show that surface pressure appears to influence precipitation phase in some cases; however, this dependence is not clear on a global scale. Land cover type does not appear to affect precipitation phase. Based on these findings, a parameterization scheme has been developed that accepts available meteorological data as input and returns the conditional probability of solid precipitation.
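A minimal sketch of such a parameterization, assuming a logistic dependence on wet-bulb temperature with the 50% crossover shifted linearly by the lapse rate. The coefficients below are fitted only to the two probabilities quoted above (86% at 6 °C km−1 and 45% at −2 °C km−1, both at Tw = 0 °C) and are not the paper's actual scheme.

```python
import math

def p_solid(tw_c, lapse_c_per_km, a=0.2426, b=0.2016, s=0.8):
    """Conditional probability of solid precipitation: logistic in wet-bulb
    temperature, with the 50% crossover t50 shifted linearly by the low-level
    lapse rate (a steeper lapse rate lets snow survive to warmer surfaces)."""
    t50 = a + b * lapse_c_per_km
    return 1.0 / (1.0 + math.exp((tw_c - t50) / s))

# Reproduces the two conditional probabilities quoted in the abstract:
print(round(p_solid(0.0, 6.0), 2), round(p_solid(0.0, -2.0), 2))  # -> 0.86 0.45
```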


Forests ◽  
2020 ◽  
Vol 11 (1) ◽  
pp. 74 ◽  
Author(s):  
Steve D. Kruger ◽  
John F. Munsell ◽  
James L. Chamberlain ◽  
Jeanine M. Davis ◽  
Ryan D. Huish

The volume, value and distribution of the nontimber forest product (NTFP) trade in the United States are largely unknown. This is due to the lack of systematic, periodic and comprehensive market tracking programs. Trade measurement and mapping would allow market actors and stakeholders to improve market conditions, manage NTFP resources, and increase the sustainable production of raw material. This is especially true in the heavily forested and mountainous regions of the eastern United States. This study hypothesized that the tendency to purchase medicinal NTFPs in this region can be predicted using socioeconomic and environmental variables associated with habitat and trade, and that those same variables can be used to build more robust estimates of trade volume. American ginseng (Panax quinquefolius L.) dealers were surveyed (n = 700) because, by law, they must acquire a license to legally trade in this species and therefore report a business address. They also record purchase data. Similar data are not reported for other medicinal species sold to the same buyers, known colloquially as ‘off-roots’. Ginseng buyers were queried about trade activity in eleven commonly harvested and previously untracked medicinal NTFP species in 15 states. Multinomial logistic regression comprising socioeconomic and environmental predictors tied to business location was used to determine the probability that a respondent purchased off-roots. Significant predictors included location in a particular subregion, population, and percentage of employment in related industries. These variables were used in a two-step cluster analysis to group respondents and nonrespondents. Modeled probabilities for off-root purchasing among respondents in each cluster were used to impute average off-root volumes for a proportion of nonrespondents in the same cluster. Respondent observations and nonrespondent estimations were summed and used to map off-root trade volume and value.
Model functionality and estimates of the total volume, value and spatial distribution are discussed. The total value of the species surveyed to harvesters was 4.3 million USD. We also find that 77 percent of the trade value and 73 percent of the trade volume were represented by two species: black cohosh (Actaea racemosa L.) and goldenseal (Hydrastis canadensis L.).
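The imputation step can be sketched as follows, under the simplifying assumption that each nonrespondent in a cluster is imputed as the cluster's modeled purchase probability times the respondents' mean reported volume. The function name and signature are hypothetical, not the study's code.

```python
def estimate_offroot_volume(respondent_volumes, n_nonrespondents, p_purchase):
    """Sum respondent-reported volumes for one cluster, then impute each
    nonrespondent as (modeled purchase probability) x (respondent mean volume)."""
    reported = sum(respondent_volumes)
    mean_vol = reported / len(respondent_volumes)
    return reported + n_nonrespondents * p_purchase * mean_vol

# 3 respondents reporting 100/200/300 lbs, 4 nonrespondents, 50% modeled
# purchase probability -> 600 + 4 * 0.5 * 200 = 1000 lbs
print(estimate_offroot_volume([100.0, 200.0, 300.0], 4, 0.5))
```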


2013 ◽  
Vol 54 (63) ◽  
pp. 120-130 ◽  
Author(s):  
Lene Petersen ◽  
Francesca Pellicciotti ◽  
Inge Juszak ◽  
Marco Carenzo ◽  
Ben Brock

Abstract Near-surface air temperature, typically measured at a height of 2 m, is the most important control on the energy exchange and the melt rate at a snow or ice surface. It is distributed in a simplistic manner in most glacier melt models by using constant linear lapse rates, which poorly represent the actual spatial and temporal variability of air temperature. In this paper, we test a simple thermodynamic model proposed by Greuell and Böhm in 1998 as an alternative, using a new dataset of air temperature measurements from along the flowline of Haut Glacier d’Arolla, Switzerland. The unmodified model performs little better than assuming a constant linear lapse rate. When modified to allow the ratio of the boundary layer height to the bulk heat transfer coefficient to vary along the flowline, the model matches measured air temperatures better, and a further reduction of the root-mean-square error is obtained, although there is still considerable scope for improvement. The modified model is shown to perform best under conditions favourable to the development of katabatic winds – few clouds, positive ambient air temperature, limited influence of synoptic or valley winds and a long fetch – but its performance is poor under cloudy conditions.
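The Greuell–Böhm model has an exponential-relaxation form that can be sketched as below: temperature decays from its upstream value toward a katabatic equilibrium over a characteristic length set by the ratio of boundary-layer height to the bulk heat-transfer coefficient. The equilibrium temperature and length scale are left as free parameters here, and the names are illustrative.

```python
import math

def greuell_bohm_t(x_m, t0_c, t_eq_c, length_m):
    """Along-flowline 2 m air temperature relaxing exponentially from its
    upstream value t0_c toward a katabatic equilibrium t_eq_c; length_m plays
    the role of boundary-layer height / bulk heat-transfer coefficient."""
    return t_eq_c + (t0_c - t_eq_c) * math.exp(-x_m / length_m)

# Warm ambient air (5 C) entering the glacier boundary layer cools toward a
# sub-zero equilibrium with distance down the flowline:
for x in (0.0, 1000.0, 2000.0, 8000.0):
    print(x, round(greuell_bohm_t(x, 5.0, -1.0, 2000.0), 2))
```

Allowing `length_m` to vary along the flowline, as the abstract describes, amounts to making the decay scale a function of `x_m` rather than a constant.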


Water ◽  
2019 ◽  
Vol 11 (8) ◽  
pp. 1561 ◽  
Author(s):  
Bhanu Pratap ◽  
Parmanand Sharma ◽  
Lavkush Patel ◽  
Ajit T. Singh ◽  
Vinay Kumar Gaddam ◽  
...  

In the Himalaya, temperature plays a key role in snow and ice melting and, importantly, in precipitation phase changes (i.e., snow or rain). Consequently, over longer periods, melting and the temperature gradient determine the state of the Himalayan glaciers. This necessitates continuous monitoring of glacier surface melting and a well-established meteorological network in the Himalaya. An attempt has been made to study the seasonal and annual (October 2015 to September 2017) characteristics of air temperature, near-surface temperature lapse rate (tlr), in-situ glacier surface melting, and surface melt simulated by temperature-index (T-index) models for the Sutri Dhaka Glacier catchment, Lahaul-Spiti region, Western Himalaya. The tlr of the catchment ranges from 0.3 to 6.5 °C km−1, varying on monthly and seasonal timescales, which argues against using the standard environmental lapse rate (SELR, ~6.5 °C km−1). The measured and extrapolated average air temperature (tavg) was found to be positive on the glacier surface (4500 to 5500 m asl) between June and September (summer). Ablation data for the balance years 2015–16 and 2016–17 show average melting of −4.20 ± 0.84 and −3.09 ± 0.62 m w.e., respectively. Consistent with positive summer air temperatures, ablation also peaked in summer, accounting for ~88% of total yearly ice melt. Comparing the observed and modelled ablation data with air temperature, we show that the high summer glacier melt was caused by warmer summer air temperatures and few spells of summer precipitation in the catchment.
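A minimal sketch of the two steps described above: extrapolating a station temperature to glacier elevation with a measured (rather than standard) lapse rate, then summing degree-day melt with a T-index model. The degree-day factor of 7 mm w.e. °C−1 day−1 and the function names are illustrative assumptions, not the paper's calibrated values.

```python
def extrapolate_t(t_station_c, z_station_m, z_glacier_m, tlr_c_per_km):
    """Extrapolate a station air temperature to glacier elevation using a
    monthly/seasonal lapse rate (tlr given as a positive cooling rate)."""
    return t_station_c - tlr_c_per_km * (z_glacier_m - z_station_m) / 1000.0

def t_index_melt_mm(daily_t_c, ddf=7.0):
    """Temperature-index (degree-day) melt in mm w.e.: proportional to the
    sum of positive daily mean temperatures; negative days contribute zero."""
    return sum(ddf * max(t, 0.0) for t in daily_t_c)

# Station at 4000 m, glacier pixel at 5000 m, measured tlr of 6.5 C/km:
t_glacier = extrapolate_t(10.0, 4000.0, 5000.0, 6.5)  # -> 3.5 C
print(t_index_melt_mm([3.5, -1.0, 4.0]))  # 7 * (3.5 + 0 + 4.0) = 52.5 mm w.e.
```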


MAUSAM ◽  
2021 ◽  
Vol 68 (3) ◽  
pp. 417-428
Author(s):  
JANAK LAL NAYAVA ◽  
SUNIL ADHIKARY ◽  
OM RATNA BAJRACHARYA

This paper investigates long-term (30 yr) altitudinal variations of surface air temperature based on data from 22 stations (15 synoptic and 7 climate stations) scattered across Nepal. Several researchers have reported that the rate of air temperature rise (the long-term trend of atmospheric warming) in Nepal is highest in the Himalayan region (~3500 m asl or higher) compared with the Hills and Terai regions. Contrary to these previous results, however, this study found that the increase in annual mean temperature is much higher in the Hills (1000 to 2000 m asl) than in the Terai and Mountain regions. The temperature lapse rate over the wide altitudinal range of Nepal (70 to 5050 m asl) is -5.65 °C km-1. Warming rates in the Terai and the Trans-Himalaya (Jomsom) are 0.024 and 0.029 °C/year, respectively.
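A lapse rate such as the −5.65 °C km−1 quoted above is typically obtained as an ordinary-least-squares slope of station temperature against station elevation. A minimal sketch with hypothetical function names:

```python
def lapse_rate_c_per_km(elev_m, temp_c):
    """OLS slope of station temperature against elevation, in C per km
    (negative means cooling with height)."""
    n = len(elev_m)
    mz = sum(elev_m) / n
    mt = sum(temp_c) / n
    num = sum((z - mz) * (t - mt) for z, t in zip(elev_m, temp_c))
    den = sum((z - mz) ** 2 for z in elev_m)
    return 1000.0 * num / den

# Synthetic stations spanning Nepal's altitudinal range (70 to 5050 m asl),
# constructed to cool at exactly 5.65 C/km; the fit recovers the slope:
elevs = [70.0, 1000.0, 2000.0, 3500.0, 5050.0]
temps = [25.0 - 5.65 * z / 1000.0 for z in elevs]
print(round(lapse_rate_c_per_km(elevs, temps), 2))  # -> -5.65
```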


2012 ◽  
Vol 25 (12) ◽  
pp. 4185-4203 ◽  
Author(s):  
Samuel S. P. Shen ◽  
Christine K. Lee ◽  
Jay Lawrimore

Abstract This paper estimates the sampling error variances of gridded monthly U.S. Historical Climatology Network, version 2 (USHCN V2), time-of-observation-bias (TOB)-adjusted data. The analysis of mean surface air temperature (SAT) assesses uncertainties, trends, and the rankings of the hottest and coldest years for the contiguous United States in the period 1895–2008. Data from the USHCN stations are aggregated onto a 2.5° × 3.5° latitude–longitude grid by an arithmetic mean of the stations inside each grid box. The sampling error variances of the gridded monthly data are estimated for every month and every grid box with data. The gridded data and their sampling error variances are used to calculate the contiguous U.S. averages and their trends and associated uncertainties. The sampling error variances are smaller (mostly less than 0.2°C²) over the eastern United States, where the station density is greater, and larger (with values of 1.3°C² for some grid boxes in the earlier period) over mountain and coastal areas. In the period 1895–2008, every month from January to December has a positive linear trend. February has the largest trend, 0.162°C (10 yr)−1, and September the smallest, 0.020°C (10 yr)−1. The three hottest (coldest) years measured by the mean SAT over the United States were 1998, 2006, and 1934 (1917, 1895, and 1912).
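The monthly trends quoted above are linear trends expressed per decade. A minimal sketch of that calculation on an annual series, assuming ordinary least squares; the function name is hypothetical and the uncertainty weighting the paper applies is omitted.

```python
def trend_per_decade(years, values):
    """OLS trend of an annual series, expressed in units per 10 years."""
    n = len(years)
    my = sum(years) / n
    mv = sum(values) / n
    slope = (sum((y - my) * (v - mv) for y, v in zip(years, values))
             / sum((y - my) ** 2 for y in years))
    return 10.0 * slope

# A synthetic 1895-2008 series warming at 0.0162 C/yr gives the same per-decade
# number reported for February:
years = list(range(1895, 2009))
vals = [0.0162 * (y - 1895) for y in years]
print(round(trend_per_decade(years, vals), 3))  # -> 0.162
```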


2015 ◽  
Vol 31 (1_suppl) ◽  
pp. S109-S130 ◽  
Author(s):  
Oliver Boyd ◽  
Kathleen Haller ◽  
Nico Luco ◽  
Morgan Moschetti ◽  
Charles Mueller ◽  
...  

The USGS National Seismic Hazard Maps were updated in 2014 and included several important changes for the central United States (CUS). Background seismicity sources were improved using a new moment-magnitude-based catalog; a new adaptive, nearest-neighbor smoothing kernel was implemented; and maximum magnitudes for background sources were updated. Areal source zones developed by the Central and Eastern United States Seismic Source Characterization for Nuclear Facilities project were simplified and adopted. The weighting scheme for ground motion models was updated, giving more weight to models with faster attenuation with distance than in the previous maps. Overall, hazard changes (at 2% probability of exceedance in 50 years, across a range of ground-motion frequencies) were smaller than 10% in most of the CUS relative to the 2008 USGS maps, despite new ground motion models and assigned logic tree weights that reduced the probabilistic ground motions by 5–20%.
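At its simplest, the logic-tree weighting of ground motion models amounts to a weighted mean over the models' estimates. A minimal sketch with hypothetical names; the actual USGS implementation combines full hazard curves across branches, not single values.

```python
def combine_logic_tree(gm_by_model, weights):
    """Weighted mean of ground-motion estimates over logic-tree branches;
    branch weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[m] * gm for m, gm in gm_by_model.items())

# Two hypothetical models, with the faster-attenuating one up-weighted:
print(combine_logic_tree({"fast_atten": 0.3, "slow_atten": 0.5},
                         {"fast_atten": 0.75, "slow_atten": 0.25}))  # -> 0.35
```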

