From NEON Field Sites to Data Portal: A Community Resource for Surface–Atmosphere Research Comes Online

2019
Vol 100 (11)
pp. 2305-2325
Author(s):
Stefan Metzger
Edward Ayres
David Durden
Christopher Florian
Robert Lee
...  

Abstract. The National Ecological Observatory Network (NEON) is a multidecadal, continental-scale observatory with sites across the United States. NEON entered its operational phase in 2018, and its data products, software, and services are now available to facilitate research on the impacts of climate change, land-use change, and invasive species. An essential component of NEON is its network of 47 tower sites, where eddy-covariance (EC) sensors are operated to determine the surface–atmosphere exchange of momentum, heat, water, and CO2. EC tower networks such as AmeriFlux, the Integrated Carbon Observation System (ICOS), and NEON are vital for providing the distributed observations needed to address interactions at the soil–vegetation–atmosphere interface. NEON represents the largest single-provider EC network globally, with standardized observations and data processing explicitly designed for intersite comparability and analysis of feedbacks across multiple spatial and temporal scales. Furthermore, EC is tightly integrated with soil, meteorology, atmospheric chemistry, isotope, phenology, and rich contextual observations such as airborne remote sensing and in situ sampling bouts. Here, we present an overview of NEON’s observational design, field operation, and data processing that yield community resources for the study of surface–atmosphere interactions. Near-real-time data products are available from the NEON Data Portal, and EC and meteorological data are ingested into the globally harmonized AmeriFlux and FLUXNET data releases. Open-source software for reproducible, extensible, and portable data analysis includes the eddy4R family of R packages underlying the EC data product generation. These resources strive to integrate with existing infrastructures and networks, to suggest novel systemic solutions, and to synergize ongoing research efforts across science communities.
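At its core, the EC technique these tower sites implement estimates a turbulent flux as the covariance between fluctuations of vertical wind speed and a scalar of interest; NEON's production pipeline (the R-based eddy4R packages) layers despiking, coordinate rotation, and density corrections on top of this central statistic. A minimal illustrative Python sketch of just that covariance, using hypothetical synthetic data in place of real 20 Hz sonic-anemometer and gas-analyzer records:

```python
import numpy as np

def ec_flux(w, c):
    """Kinematic eddy flux: covariance of vertical-wind fluctuations
    and scalar fluctuations over one averaging period (Reynolds averaging)."""
    w_prime = w - w.mean()   # deviation of vertical wind from the period mean
    c_prime = c - c.mean()   # deviation of the scalar from the period mean
    return (w_prime * c_prime).mean()

# Hypothetical 30 min of 20 Hz data: 36,000 samples
rng = np.random.default_rng(42)
w = rng.normal(0.0, 0.3, 36_000)                    # vertical wind (m s-1)
c = 400.0 + 2.0 * w + rng.normal(0.0, 0.5, 36_000)  # scalar correlated with w
print(f"kinematic flux: {ec_flux(w, c):.3f}")
```

This sketch omits the quality control and unit conversions a production pipeline performs; it only shows the covariance that defines the EC flux estimate.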

2013
Vol 13 (3)
pp. 6247-6294
Author(s):
J.-F. Lamarque
F. Dentener
J. McConnell
C.-U. Ro
M. Shaw
...  

Abstract. We present multi-model global datasets of nitrogen and sulfate deposition covering the period from 1850 to 2100, calculated within the Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP). The computed deposition fluxes are compared to surface wet-deposition and ice-core measurements. We use a new dataset of wet deposition for 2000–2002 based on a critical assessment of the quality of existing regional network data. We show that for the present day (year 2000 ACCMIP time slice), the ACCMIP results perform similarly to previously published multi-model assessments. For this time slice, we find a multi-model mean deposition of 50 Tg(N) yr−1 from nitrogen oxide emissions, 60 Tg(N) yr−1 from ammonia emissions, and 83 Tg(S) yr−1 from sulfur emissions. The analysis of changes between 1980 and 2000 indicates significant differences between models and measurements over the United States but less so over Europe; this difference points towards a misrepresentation of 1980 NH3 emissions over North America. Based on ice-core records, the 1850 deposition fluxes agree well with Greenland ice cores, but the change between 1850 and 2000 seems to be overestimated in the Northern Hemisphere for both nitrogen and sulfur species. Using the Representative Concentration Pathways to define the projected climate and the atmospheric chemistry related emissions and concentrations, we find large regional nitrogen deposition increases in 2100 in Latin America, Africa, and parts of Asia under some of the scenarios considered. Increases in South Asia are especially large and are seen in all scenarios, with 2100 values more than double those of 2000 in some scenarios and reaching > 1300 mg(N) m−2 yr−1 averaged over regional- to continental-scale regions in RCP 2.6 and 8.5, ~30–50 % larger than the values in any region for 2000. The new ACCMIP deposition dataset provides novel, consistent, and evaluated global gridded deposition fields for use in a wide range of climate and ecological studies.
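The multi-model means quoted above are, in essence, area-weighted global integrals of each model's deposition flux field, averaged across the ensemble. A hedged numpy sketch of that bookkeeping, with hypothetical gridded fields (the grid, units, and variable names are illustrative, not the ACCMIP file layout):

```python
import numpy as np

def global_total_Tg(flux, lat, lon):
    """Integrate a deposition flux field (kg m-2 yr-1) on a regular
    lat-lon grid into a global total in Tg yr-1."""
    R = 6.371e6                                 # Earth radius (m)
    dlat = np.deg2rad(abs(lat[1] - lat[0]))
    dlon = np.deg2rad(abs(lon[1] - lon[0]))
    # Cell area R^2 cos(lat) dlat dlon, broadcast across longitudes
    area = (R**2 * np.cos(np.deg2rad(lat)) * dlat * dlon)[:, None]
    return (flux * area).sum() * 1e-9           # kg -> Tg

# Hypothetical 3-model ensemble on a 2x2 degree grid
lat = np.arange(-89.0, 90.0, 2.0)
lon = np.arange(0.0, 360.0, 2.0)
models = [np.full((lat.size, lon.size), 1e-4) for _ in range(3)]  # kg m-2 yr-1
totals = [global_total_Tg(f, lat, lon) for f in models]
print(f"multi-model mean: {np.mean(totals):.1f} Tg yr-1")
```

With the uniform 1e-4 kg m-2 yr-1 placeholder field this returns roughly 51 Tg yr-1, the same order as the nitrogen totals reported in the abstract.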


2018
Vol 11 (7)
pp. 2941-2953
Author(s):
Sebastian D. Eastham
Michael S. Long
Christoph A. Keller
Elizabeth Lundgren
Robert M. Yantosca
...  

Abstract. Global modeling of atmospheric chemistry is a grand computational challenge because of the need to simulate large coupled systems of ∼100–1000 chemical species interacting with transport on all scales. Offline chemical transport models (CTMs), in which the chemical continuity equations are solved using meteorological data as input, have usability advantages and are important vehicles for developing atmospheric chemistry knowledge that can then be transferred to Earth system models. However, they have generally not been designed to take advantage of massively parallel computing architectures. Here, we develop such a high-performance capability for GEOS-Chem (GCHP), a CTM driven by meteorological data from the NASA Goddard Earth Observing System (GEOS) and used by hundreds of research groups worldwide. GCHP is a grid-independent implementation of GEOS-Chem using the Earth System Modeling Framework (ESMF) that permits the same standard model to operate in a distributed-memory framework for massive parallelization. GCHP also allows GEOS-Chem to take advantage of the native GEOS cubed-sphere grid for greater accuracy and computational efficiency in simulating transport. GCHP enables GEOS-Chem simulations to be conducted with high computational scalability up to at least 500 cores, so that global simulations of stratosphere–troposphere oxidant–aerosol chemistry at C180 spatial resolution (∼0.5° × 0.625°) or finer become routinely feasible.
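The "C180" notation denotes a cubed-sphere grid with 180 × 180 cells on each of the six cube faces; since each face spans a 90° arc, the nominal cell width is 90°/N, which is where the ∼0.5° figure comes from. A small illustrative calculation (the helper function is hypothetical, not part of GCHP):

```python
def cubed_sphere_resolution(n):
    """Nominal cell width (degrees) and total column count of a CN
    cubed-sphere grid: 6 faces, each a 90-degree arc split into n cells."""
    return 90.0 / n, 6 * n * n

for n in (24, 90, 180, 360):
    deg, cols = cubed_sphere_resolution(n)
    print(f"C{n}: ~{deg:.2f} deg, {cols:,} grid columns")
```

For C180 this gives ~0.50° and 194,400 columns, consistent with the resolution quoted in the abstract.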


2021
pp. 107755872110008
Author(s):
Edward R. Berchick
Heide Jackson

Estimates of health insurance coverage in the United States rely on household-based surveys, and these surveys seek to improve data quality amid a changing health insurance landscape. We examine postcollection processing improvements to health insurance data in the Current Population Survey Annual Social and Economic Supplement (CPS ASEC), one of the leading sources of coverage estimates. The implementation of updated data extraction and imputation procedures in the CPS ASEC marks the second stage of a two-stage improvement and the beginning of a new time series for health insurance estimates. To evaluate these changes, we compared estimates from two files that introduce the updated processing system with two files that use the legacy system. We find that the updates resulted in higher estimated rates of health insurance coverage and lower rates of dual coverage, among other differences. These results indicate that the updated data processing improves coverage estimates and addresses previously noted limitations of the CPS ASEC.
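The comparison described here amounts to computing weighted coverage rates from files produced under each processing system and differencing them. A hypothetical pandas sketch of that comparison (the column names `covered` and `weight` and the toy records are illustrative, not the actual CPS ASEC layout):

```python
import pandas as pd

def coverage_rate(df):
    """Weighted share of persons with any health insurance coverage."""
    return (df["covered"] * df["weight"]).sum() / df["weight"].sum()

# Hypothetical person-level extracts, one per processing system
legacy = pd.DataFrame({"covered": [1, 0, 1, 1],
                       "weight": [1500, 1200, 900, 1100]})
updated = pd.DataFrame({"covered": [1, 1, 1, 0],
                        "weight": [1500, 1200, 900, 1100]})

diff = coverage_rate(updated) - coverage_rate(legacy)
print(f"coverage rate change: {diff:+.1%}")
```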


2017
Vol 26 (7)
pp. 551
Author(s):
Christopher J. Dunn
David E. Calkin
Matthew P. Thompson

Wildfire’s economic, ecological and social impacts are on the rise, fostering the realisation that business-as-usual fire management in the United States is not sustainable. Current response strategies may be inefficient and may contribute to unnecessary responder exposure to hazardous conditions, but significant knowledge gaps constrain clear and comprehensive descriptions of how changes in response strategies and tactics may improve outcomes. As such, we convened a special session at an international wildfire conference to synthesise ongoing research focused on obtaining a better understanding of wildfire response decisions and actions. This special issue provides a collection of research that builds on those discussions. Four papers focus on strategic planning and decision making, three on the use and effectiveness of suppression resources, and two on the allocation and movement of suppression resources. Here we summarise some of the key findings from these papers in the context of risk-informed decision making. This collection illustrates the value of a risk management framework for improving wildfire response safety and effectiveness, for enhancing fire management decision making and for ushering in a new fire management paradigm.


2020
Author(s):
Vicki Ferrini
John Morton
Lindsay Gee
Erin Heffron
Hayley Drennon
...  

2021
Author(s):
Martin Gauch
Frederik Kratzert
Grey Nearing
Jimmy Lin
Sepp Hochreiter
...  

Rainfall–runoff predictions are generally evaluated on reanalysis datasets such as the DayMet, Maurer, or NLDAS forcings in the CAMELS dataset. While useful for benchmarking, this does not fully reflect real-world applications, where meteorological information is much coarser and fine-grained data are at best available up to the present. For any prediction of future discharge, we must rely on forecasts, which introduce an additional layer of uncertainty. Thus, the model inputs need to switch from past data to forecast data at some point, which raises several questions: How can we design models that support this transition? How can we design tests that evaluate the performance of the model? Aggravating the challenge, the past and future data products may include different variables or have different temporal resolutions.

We demonstrate how to seamlessly integrate past and future meteorological data in one deep learning model, using the recently proposed Multi-Timescale LSTM (MTS-LSTM, [1]). MTS-LSTMs are based on LSTMs but can generate rainfall–runoff predictions at multiple timescales more efficiently. One MTS-LSTM consists of several LSTMs organized in a branched structure: each LSTM branch processes a part of the input time series at a certain temporal resolution, then passes its states to the next branch, thus sharing information across branches. We generalize this layout to handovers across data products (rather than just timescales) through an additional branch. This way, we can integrate past and future data in one prediction pipeline, yielding more accurate predictions.

[1] M. Gauch, F. Kratzert, D. Klotz, G. Nearing, J. Lin, and S. Hochreiter. “Rainfall–Runoff Prediction at Multiple Timescales with a Single Long Short-Term Memory Network.” Hydrology and Earth System Sciences Discussions, in review, 2020.
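The state handover described above can be sketched with two stacked `torch.nn.LSTM` modules: a hindcast branch consumes the past inputs, and its final hidden and cell states initialize a forecast branch that consumes the future inputs. A minimal PyTorch sketch under simplifying assumptions (shared hidden size, single layer per branch, all dimensions hypothetical); the actual MTS-LSTM additionally routes states through learned transfer layers and runs branches at different timescales:

```python
import torch
import torch.nn as nn

class HandoverLSTM(nn.Module):
    """Two LSTM branches: a hindcast branch over past data and a
    forecast branch initialized from the hindcast branch's states."""
    def __init__(self, past_dim, future_dim, hidden=64):
        super().__init__()
        self.past_lstm = nn.LSTM(past_dim, hidden, batch_first=True)
        self.future_lstm = nn.LSTM(future_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)    # maps hidden state to discharge

    def forward(self, x_past, x_future):
        _, (h, c) = self.past_lstm(x_past)           # summarize the past
        out, _ = self.future_lstm(x_future, (h, c))  # hand states over
        return self.head(out)                        # one value per forecast step

# Hypothetical batch: 8 basins, 365 past days (5 vars), 7 forecast days (3 vars)
model = HandoverLSTM(past_dim=5, future_dim=3)
y = model(torch.randn(8, 365, 5), torch.randn(8, 7, 3))
print(y.shape)  # torch.Size([8, 7, 1])
```

Note how the handover also absorbs the problem of differing variable sets: the two branches have independent input dimensions and only share information through their states.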


2016
Author(s):
Francesca Sprovieri
Nicola Pirrone
Mariantonia Bencardino
Francesco D’Amore
Francesco Carbone
...  

Abstract. Long-term monitoring data on ambient mercury (Hg) at the global scale are vital for assessing its emission, transport, atmospheric chemistry, and deposition processes and for understanding the impact of Hg pollution on the environment. The Global Mercury Observation System (GMOS) project, funded by the European Commission (www.gmos.eu), started in November 2010 with the overall goal of developing a coordinated global observing system to monitor Hg, including a large network of ground-based monitoring stations, ad hoc periodic oceanographic cruises, and measurement flights in the lower and upper troposphere as well as in the lower stratosphere. To date, more than 40 ground-based monitoring sites constitute the global network, covering many regions where little to no observational data were available before GMOS. This work presents atmospheric Hg concentrations recorded worldwide in the framework of the GMOS project (2010–2015), analyzing the Hg measurement results in terms of temporal trends, seasonality, and comparability within the network. Major findings highlighted in this paper include a clear gradient of Hg concentrations between the Northern and Southern Hemispheres, confirming that the observed gradient is mostly driven by local and regional sources, which can be anthropogenic, natural, or a combination of both.


2018
Author(s):  
SeaPlan

As more ocean plans are developed and adopted around the world, the importance of accessible, up-to-date spatial data in the planning process has become increasingly apparent. Many ocean planning efforts in the United States and Canada rely on a companion data portal: a curated catalog of spatial datasets characterizing the ocean uses and natural resources considered as part of ocean planning and management decision-making.

Data portals designed to meet ocean planning needs tend to share three basic characteristics: they are ocean-focused, map-based, and publicly accessible. This enables planners, managers, and stakeholders to access common sets of sector-specific, place-based information that help to visualize spatial relationships (e.g., overlap) among various uses and the marine environment and analyze potential interactions (e.g., synergies or conflicts) among those uses and natural resources. This data accessibility also enhances the transparency of the planning process, arguably an essential factor for its overall success.

This paper explores key challenges, considerations, and best practices for developing and maintaining a data portal. By observing the relationship between data portals and key principles of ocean planning, we posit three overarching themes for data portal best practices: accommodation of diverse users, data vetting and review by stakeholders, and integration with the planning process. The discussion draws primarily from the use of the Northeast Ocean Data Portal to support development of the Northeast Ocean Management Plan, with additional examples from other portals in the U.S. and Canada.


2021
Vol 28 (2)
pp. 247-256
Author(s):
Siming He
Jian Guan
Xiu Ji
Hang Xu
Yi Wang

Abstract. In spread spectrum induced polarization (SSIP) data processing, attenuating background noise in the observed data is the essential step for improving the signal-to-noise ratio (SNR) of SSIP data. The time-domain spectral induced polarization based on pseudorandom sequence (TSIP) algorithm has been proposed to improve the SNR of these data, but signal processing in the presence of background noise remains a challenging problem. We propose an enhanced correlation identification (ECI) algorithm to attenuate the background noise. In this algorithm, cross-correlation matching extracts the useful components of the raw SSIP data and suppresses background noise; the frequency-domain IP (FDIP) method is then used to extract the frequency response of the observation system. Experiments on both synthetic and real SSIP data show that the ECI algorithm not only suppresses background noise but also better preserves the valid information in the raw SSIP data, displaying the actual location and shape of adjacent high-resistivity anomalies, which improves subsequent steps in SSIP data processing and imaging.
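The cross-correlation matching at the heart of the ECI idea can be illustrated in a few lines of numpy: correlating the noisy record against the known pseudorandom source sequence concentrates signal energy at the matching lag, while uncorrelated background noise averages out. A minimal sketch under those assumptions (the sequence, delay, and noise model are hypothetical, not the authors' field configuration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical pseudorandom (+/-1) source sequence, as used by SSIP transmitters
m = rng.choice([-1.0, 1.0], size=512)

# Received record: delayed, attenuated copy of the sequence plus strong noise
delay, gain = 100, 0.5
received = np.zeros(1024)
received[delay:delay + m.size] = gain * m
received += rng.normal(0.0, 1.0, received.size)   # background noise

# Cross-correlation matching: the peak lag recovers the delay despite the noise
corr = np.correlate(received, m, mode="valid")
lag = int(np.argmax(corr))
print(f"estimated delay: {lag} samples (true: {delay})")
print(f"peak-to-noise ratio: {corr[lag] / np.std(corr):.1f}")
```

The correlation gain grows with sequence length, which is why pseudorandom-sequence methods can pull coherent IP responses out of noise that would swamp a single-pulse measurement.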

