CO2 Transport, Variability, and Budget over the Southern California Air Basin Using the High-Resolution WRF-VPRM Model during the CalNex 2010 Campaign

2018 ◽  
Vol 57 (6) ◽  
pp. 1337-1352 ◽  
Author(s):  
Changhyoun Park ◽  
Christoph Gerbig ◽  
Sally Newman ◽  
Ravan Ahmadov ◽  
Sha Feng ◽  
...  

Abstract To study regional-scale carbon dioxide (CO2) transport, temporal variability, and budget over the Southern California Air Basin (SoCAB) during the California Research at the Nexus of Air Quality and Climate Change (CalNex) 2010 campaign period, a model that couples the Weather Research and Forecasting (WRF) Model with the Vegetation Photosynthesis and Respiration Model (VPRM) has been used. Our numerical simulations use anthropogenic CO2 emissions from the Hestia Project 2010 fossil-fuel CO2 emissions data products, along with VPRM parameters optimized at “FLUXNET” sites, for biospheric CO2 fluxes over SoCAB. The simulated meteorological conditions have been validated against ground and aircraft observations, as well as against background CO2 concentrations from the coastal Palos Verdes site. The model captures the temporal pattern of CO2 concentrations at the ground site at the California Institute of Technology in Pasadena, but it overestimates the magnitude in the early daytime. Analysis of CO2 by wind direction reveals that the overestimate is due to advection from the south and southwest, where downtown Los Angeles is located. The model also captures the vertical profile of CO2 concentrations along the flight tracks. The optimized VPRM parameters significantly improve simulated net ecosystem exchange at each vegetation-class site and thus the regional CO2 budget. The total biospheric contribution ranges from approximately −24% to −20% (daytime) of the total anthropogenic CO2 emissions during the study period.
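The wind-direction analysis described above amounts to conditioning CO2 concentrations on the sector the air arrives from. A minimal sketch of that idea, not the authors' code: the array names, sector width, and synthetic data below are assumptions, with the southerly sector standing in for the downtown Los Angeles plume.

```python
# Illustrative sketch: bin CO2 observations by wind-direction sector and
# compare sector means, as in a wind-direction analysis of enhancements.
import numpy as np

def co2_by_wind_sector(co2_ppm, wind_dir_deg, sector_width=45.0):
    """Mean CO2 per wind-direction sector (0 deg = north)."""
    edges = np.arange(0.0, 360.0 + sector_width, sector_width)
    sectors = {}
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (wind_dir_deg >= lo) & (wind_dir_deg < hi)
        if mask.any():
            sectors[(lo, hi)] = co2_ppm[mask].mean()
    return sectors

# Synthetic example: southerly flow (around 180 deg) carries an urban
# plume and therefore shows the largest CO2 enhancement.
rng = np.random.default_rng(0)
wd = rng.uniform(0.0, 360.0, 1000)
co2 = 400.0 + 10.0 * np.exp(-((wd - 180.0) / 40.0) ** 2) + rng.normal(0, 1, 1000)
for (lo, hi), mean in sorted(co2_by_wind_sector(co2, wd).items()):
    print(f"{lo:5.0f}-{hi:3.0f} deg: {mean:6.1f} ppm")
```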

1994 ◽  
Vol 84 (1) ◽  
pp. 47-61 ◽  
Author(s):  
Chandan K. Saikia ◽  
Douglas S. Dreger ◽  
Donald V. Helmberger

Abstract We have investigated energy amplification observed within the Greater Los Angeles basin by analyzing regional waveforms recorded from several Nevada Test Site (NTS) nuclear explosions. Although the stations are located at nearly the same azimuth (distances ranging from 350 to 400 km), the seismograms recorded in Compton (the central part of the basin), Long Beach (the southern edge of the basin), and downtown Los Angeles are remarkably different, even for a common explosion. Following the onset of Lg waves, the Long Beach sites have recorded surface waves for more than 100 sec. From one explosion, the sites within downtown Los Angeles have recorded seismograms with strong 3-sec surface waves. These waves are not observed on the seismograms recorded at the neighboring hard-rock California Institute of Technology (CIT) station; thus, they must have been generated by local waveguides. Numerically, we modeled these 3-sec waves by convolving the CIT seismogram with the response of sedimentary strata dipping gently (about 6°) from CIT toward downtown. We also examined the irregular basin effect by analyzing the variation of cumulative temporal energy across the basin relative to the energy recorded at CIT from the same explosion. Variation up to a factor of 30 was observed. To model the energy variation caused by extended surface waves in the Long Beach area, we used numerically simulated site transfer functions (STFs) from a NNE-SSW-oriented two-dimensional basin structure extending from Montebello to Palos Verdes that included low-velocity sedimentary material in the uppermost layers. These STFs were convolved with the CIT seismogram recorded from the MAST explosion. To simulate the elongated duration of surface waves, we introduced into the upper sedimentary structure some discontinuous microbasin structures of varying size, each microbasin delaying the seismic waves propagating through it. Consequently, the surface-reflected phases through these structures are delayed and reflected into the upper medium by the underlying interfaces. This mechanism helps delayed energy appear at a later time and results in a longer duration at sites located at the southern edge of the basin.
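The core numerical steps here are a convolution of a reference seismogram with a site response and a cumulative-energy comparison against the reference station. A minimal sketch under stated assumptions, not the authors' code: the damped 3-s-resonance transfer function and the synthetic reference pulse below are stand-ins.

```python
# Illustrative sketch: convolve a hard-rock reference seismogram with a
# site transfer function (STF), then compare cumulative temporal energy
# with the reference, mirroring the CIT-relative analysis in the paper.
import numpy as np

dt = 0.05                      # sample interval (s), assumed
t = np.arange(0, 200, dt)

# Stand-in for the CIT reference seismogram: a short band-limited pulse.
cit = np.exp(-((t - 20.0) / 5.0) ** 2) * np.sin(2 * np.pi * 1.0 * t)

# Toy STF: damped 3-s resonance representing the sedimentary response.
stf = np.exp(-t / 30.0) * np.sin(2 * np.pi * t / 3.0)

# Basin seismogram = reference convolved with the site response.
basin = np.convolve(cit, stf, mode="full")[: t.size] * dt

# Cumulative temporal energy of the basin record relative to the reference.
energy_ratio = np.sum(basin ** 2) / np.sum(cit ** 2)
print(f"basin/reference cumulative energy ratio: {energy_ratio:.1f}")
```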


2019 ◽  
Vol 12 (3) ◽  
pp. 1029-1066 ◽  
Author(s):  
Lluís Fita ◽  
Jan Polcher ◽  
Theodore M. Giannaros ◽  
Torge Lorenz ◽  
Josipa Milovac ◽  
...  

Abstract. The Coordinated Regional Climate Downscaling Experiment (CORDEX) is a scientific effort of the World Climate Research Programme (WCRP) for the coordination of regional climate initiatives. In order to accept an experiment, CORDEX provides experiment guidelines, specifications of regional domains, and data access and archiving requirements. CORDEX experiments are important for studying climate at the regional scale, and at the same time they have a very prominent role in providing high-quality regional climate data. The data requirements are intended to cover all the possible needs of stakeholders and scientists working on climate change mitigation and adaptation policies in various scientific communities. The required data and diagnostics are grouped into different levels of frequency and priority, and some of them even have to be provided as statistics (minimum, maximum, mean) over different time periods. Most commonly, scientists need to post-process the raw output of regional climate models, since the latter was not originally designed to meet the specific CORDEX data requirements. This post-processing procedure includes the computation of diagnostics, statistics, and final homogenization of the data, which is often computationally costly and time-consuming. Therefore, the development of specialized software and/or code is required. The current paper presents the development of a specialized module (version 1.3) for the Weather Research and Forecasting (WRF) model capable of outputting the required CORDEX variables. Additional diagnostic variables not required by CORDEX, but of potential interest to the regional climate modeling community, are also included in the module. “Generic” definitions of variables are adopted in order to overcome the model and/or physics parameterization dependence of certain diagnostics and variables, thus facilitating a robust comparison among simulations. The module is computationally optimized, and the output is divided into different priority levels following CORDEX specifications (Core, Tier 1, and additional) by selecting pre-compilation flags. This implementation of the module does not add a significant extra cost when running the model; for example, the addition of the Core variables slows the model time step by less than 5%. The use of the module reduces disk storage requirements by about 50%. The module performs neither additional statistics over different periods of time nor homogenization of the output data.
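Since the module deliberately leaves time statistics and homogenization to post-processing, that step still falls to the user. A minimal post-processing sketch, not part of the WRF module: the file name, the variable name "T2", and a CF-compliant "time" coordinate (which raw WRF output does not provide directly) are all assumptions.

```python
# Illustrative sketch: derive CORDEX-style daily minimum, maximum, and
# mean of 2-m temperature from hourly model output with xarray.
import xarray as xr

ds = xr.open_dataset("wrfout_hourly.nc")   # hypothetical hourly output file
t2 = ds["T2"]                              # 2-m temperature (K), assumed name

daily = xr.Dataset({
    "tasmin": t2.resample(time="1D").min(),   # daily minimum
    "tasmax": t2.resample(time="1D").max(),   # daily maximum
    "tas":    t2.resample(time="1D").mean(),  # daily mean
})
daily.to_netcdf("t2_daily_stats.nc")
```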


2019 ◽  
Vol 109 (4) ◽  
pp. 1563-1570 ◽  
Author(s):  
Zefeng Li ◽  
Egill Hauksson ◽  
Jennifer Andrews

Abstract Modern seismic networks commonly equip a station with multiple sensors to extend the frequency band and the dynamic range of data recorded at the station. In addition, our recent study showed that comparison of data from co-located seismometers and accelerometers is useful for detecting instrument malfunctions and monitoring data quality. In this study, we extend the comparison of data from different co-located sensors to two other applications: (1) amplitude calibration for data from vertical short-period sensors, with strong-motion sensors as the baseline, and (2) measurement of the orientation discrepancy between strong-motion and broadband sensors. We perform systematic analyses of data recorded by the California Institute of Technology/U.S. Geological Survey Southern California Seismic Network. In the first application, we compare the amplitude of data from vertical short-period sensors to that of data from co-located strong-motion sensors and measure amplitude calibration factors for 93 short-period sensors. Among them, 49 stations have factors of ∼1.0 and 42 of ∼0.6, with two outliers: GFF at 0.3 and CHI at 1.3. These values are found to be related to the sensors’ sensitivity values. In the second application, we measure the orientation discrepancy between 222 co-located broadband and strong-motion sensors. All the vertical orientation differences are found to be within 5°. However, the horizontal orientation differences of 22 stations are greater than 6°, among which four stations are reversed, i.e., rotated 180° from the expected orientation. These measurements have been communicated to network operators, and fixes are being applied. This study, together with our previously developed data monitoring framework, demonstrates that comparison of different co-located sensors is a simple and effective tool for a broad range of seismic data assessment and instrument calibration tasks.
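The horizontal-orientation measurement reduces to finding the rotation that best aligns one sensor's horizontal components with the other's. A minimal least-squares sketch of that idea, not the network's production code: the closed-form angle and the synthetic records below are simplifying assumptions.

```python
# Illustrative sketch: estimate the horizontal orientation discrepancy
# between two co-located sensors from their north/east components.
import numpy as np

def orientation_discrepancy(n1, e1, n2, e2):
    """Angle (deg) that rotates sensor 1's horizontals onto sensor 2's.

    Finds the rotation maximizing the summed dot product of the two
    horizontal vector records (2D orthogonal Procrustes solution).
    """
    num = np.sum(n1 * e2 - e1 * n2)   # cross terms of the two vector fields
    den = np.sum(n1 * n2 + e1 * e2)   # aligned terms
    return np.degrees(np.arctan2(num, den))

# Synthetic check: sensor 2 is sensor 1 rotated by 8 degrees.
rng = np.random.default_rng(2)
n1, e1 = rng.normal(size=(2, 5000))
theta = np.radians(8.0)
n2 = n1 * np.cos(theta) - e1 * np.sin(theta)
e2 = n1 * np.sin(theta) + e1 * np.cos(theta)
print(f"estimated discrepancy: {orientation_discrepancy(n1, e1, n2, e2):.1f} deg")
```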


2020 ◽  
Author(s):  
Chunyu Dong ◽  
Glen MacDonald ◽  
Gregory Okin ◽  
Thomas Gillespie

California's climate is projected to have more droughts and heatwaves in the future. A combination of heat and drought stress may affect the vegetation health of Mediterranean ecosystems significantly more than drought stress alone. Based on multi-source remote sensing and surface data, we investigated the impacts of drought and climate change on the Mediterranean-climate vegetation of California at different scales, i.e., the entire state, southern California, and the Los Angeles urban area. For the entire state, we find that a hydroclimatic dipole regulated by the El Niño-Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO) intensifies aridity in southern California relative to the north. At the regional scale of southern California, we utilized a bootstrapping regression model to analyze the geographical influences on the relationships between vegetation and drought. The results suggest that a warmer climate can significantly increase vegetation sensitivity to drought. In addition, soil texture and elevation also appear to play an important role in modulating wildland vegetation's susceptibility to drought. In the Los Angeles urban area, we find that socioeconomic conditions are the decisive influence in intensifying or mitigating the vegetation response to water-scarce seasons and years. The projected hotter climate in the 21st century may reshape the future landscapes of the coupled human-natural system in California by exacerbating drought severity and duration, differentiating mortality, and increasing wildfires.
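The bootstrapping regression mentioned above resamples the data with replacement and refits the regression each time, yielding a distribution of slopes rather than a single point estimate. A minimal sketch, not the authors' code: the variable names and the synthetic NDVI/SPEI data are assumptions standing in for the remote sensing records.

```python
# Illustrative sketch: bootstrapped linear regression of a vegetation
# index anomaly on a drought index, with a 95% confidence interval on
# the slope (the vegetation-drought sensitivity).
import numpy as np

def bootstrap_slope(drought_index, veg_anomaly, n_boot=1000, seed=0):
    """Distribution of regression slopes under resampling with replacement."""
    rng = np.random.default_rng(seed)
    n = drought_index.size
    slopes = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)                        # resample indices
        slopes[b] = np.polyfit(drought_index[idx], veg_anomaly[idx], 1)[0]
    return slopes

# Synthetic example: NDVI anomaly responds to a drought index (e.g., SPEI).
rng = np.random.default_rng(3)
spei = rng.normal(0, 1, 300)
ndvi = 0.05 * spei + rng.normal(0, 0.02, 300)
slopes = bootstrap_slope(spei, ndvi)
lo, hi = np.percentile(slopes, [2.5, 97.5])
print(f"slope = {slopes.mean():.3f} [{lo:.3f}, {hi:.3f}] (95% CI)")
```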


2020 ◽  
Vol 110 (1) ◽  
pp. 213-225
Author(s):  
Walker S. Evans ◽  
Andreas Plesch ◽  
John H. Shaw ◽  
Natesh L. Pillai ◽  
Ellen Yu ◽  
...  

ABSTRACT We present a new statistical method for associating earthquakes with their source faults in the Southern California Earthquake Center’s 3D Community Fault Models (CFMs; Plesch et al., 2007), both in near-real time and for historical earthquakes. The method uses the hypocenter location, focal mechanism orientation, and earthquake sequencing to produce the probabilities of association between a given earthquake and each fault in the CFM, as well as the probability that the event occurred on a fault not represented in the CFM. We used a set of known likely associations (the Known Likely Sets) as training or testing data and demonstrated that our models perform effectively on these examples and should be expected to perform well on other earthquakes with similar characteristics, including the full catalog of southern California earthquakes (Hauksson et al., 2012). To produce near-real-time associations for future earthquakes, the models have been implemented as an R script and connected to the Southern California Seismic Network data processing system operated by the California Institute of Technology and the U.S. Geological Survey to automatically produce fault associations for earthquakes of M≥3.0 as they occur. To produce historical associations, we apply the method to the most recent CFM version (v.5.2), yielding modeled historical associations for all events of M≥3.0 in the catalog of southern California earthquakes from 1981 to 2016. More than 80% of these events, and 99% of the seismic moment, within the geography covered by the CFM had a primary association with a CFM fault. The models can help identify clusters of small earthquakes that indicate the onset of activity associated with major faults. The method will also assist in communicating objective information about the faults that source earthquakes to the scientific community and general public. In the event of a damaging southern California earthquake, the near-real-time association will provide valuable information regarding the similarity of the current event to forecast scenarios, potentially aiding in earthquake response.
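A much-reduced sketch of the association idea, not the paper's statistical model (which also uses focal mechanism orientation and sequencing): here only hypocenter-to-fault distance is scored with a Gaussian kernel, and the kernel width, background weight, and toy fault geometry are all assumptions.

```python
# Illustrative sketch: convert hypocenter-to-fault distances into
# normalized association probabilities, with a floor term representing
# "occurred on a fault not in the model".
import numpy as np

def associate(hypocenter, fault_points, sigma_km=3.0, background=0.05):
    """Probability of association with each fault, plus 'no modeled fault'.

    fault_points: dict mapping fault name -> (N, 3) array of points (km)
    sampling that fault's surface in the 3D fault model.
    """
    scores = {}
    for name, pts in fault_points.items():
        d = np.min(np.linalg.norm(pts - hypocenter, axis=1))  # nearest point
        scores[name] = np.exp(-0.5 * (d / sigma_km) ** 2)
    scores["(no modeled fault)"] = background
    total = sum(scores.values())
    return {k: v / total for k, v in scores.items()}

# Toy example: an event 2 km from fault A and 10 km from fault B.
ev = np.array([0.0, 0.0, 8.0])
faults = {
    "Fault A": np.array([[0.0, 2.0, 8.0], [1.0, 2.5, 9.0]]),
    "Fault B": np.array([[10.0, 0.0, 8.0]]),
}
for name, p in associate(ev, faults).items():
    print(f"{name}: {p:.2f}")
```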


2015 ◽  
Vol 143 (11) ◽  
pp. 4514-4532 ◽  
Author(s):  
Erin B. Munsell ◽  
Jason A. Sippel ◽  
Scott A. Braun ◽  
Yonghui Weng ◽  
Fuqing Zhang

Abstract The governing dynamics and uncertainties of an ensemble simulation of Hurricane Nadine (2012) are assessed through the use of a regional-scale convection-permitting analysis and forecast system based on the Weather Research and Forecasting (WRF) Model and an ensemble Kalman filter (EnKF). For this case, the data utilized were collected during the 2012 phase of the National Aeronautics and Space Administration’s (NASA) Hurricane and Severe Storm Sentinel (HS3) experiment. The majority of the ensemble's track forecasts were successful, correctly predicting Nadine’s turn toward the southwest ahead of an approaching midlatitude trough, though 10 members forecast Nadine to be carried eastward by the trough. Ensemble composite and sensitivity analyses reveal the track divergence to be caused by differences in the environmental steering flow that resulted from uncertainties in the position and subsequent strength of the midlatitude trough. Despite the general success of the ensemble track forecasts, the intensity forecasts indicated that Nadine would strengthen, which did not happen. A sensitivity experiment performed with the inclusion of sea surface temperature (SST) updates significantly reduced the intensity errors associated with the simulation. This weakening occurred as a result of cooling of the SST field in the vicinity of Nadine, which led to weaker surface sensible and latent heat fluxes at the air–sea interface. A comparison of environmental variables, including relative humidity, temperature, and shear, yielded no obvious differences between the WRF-EnKF simulations and the HS3 observations. However, an initial intensity bias, in which the WRF-EnKF vortices are stronger than the observed vortex, appears to be the most likely cause of the final intensity errors.
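Ensemble sensitivity analysis, as invoked above, is a linear regression of a scalar forecast metric on the initial-condition field across members. A minimal sketch, not the authors' code: the metric, field, and synthetic ensemble below are assumptions; the study applied this to WRF-EnKF forecasts of Nadine.

```python
# Illustrative sketch: ensemble sensitivity dJ/dx = cov(J, x) / var(x)
# at every grid point, computed across ensemble members.
import numpy as np

def ensemble_sensitivity(J, X):
    """Sensitivity of a forecast metric to an initial-condition field.

    J: (n_members,) forecast metric (e.g., 72-h track longitude)
    X: (n_members, n_points) initial field (e.g., 500-hPa geopotential height)
    """
    Ja = J - J.mean()
    Xa = X - X.mean(axis=0)
    cov = Ja @ Xa / (J.size - 1)                 # covariance per grid point
    var = (Xa ** 2).sum(axis=0) / (J.size - 1)   # ensemble variance per point
    return cov / np.where(var > 0, var, np.nan)

# Synthetic ensemble: the metric depends on the field at one "trough" point.
rng = np.random.default_rng(4)
X = rng.normal(size=(60, 100))
J = 2.0 * X[:, 42] + rng.normal(0, 0.1, 60)
s = ensemble_sensitivity(J, X)
print(f"max |sensitivity| at point {np.nanargmax(np.abs(s))}: {np.nanmax(np.abs(s)):.2f}")
```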


2017 ◽  
Vol 74 (11) ◽  
pp. 3771-3785 ◽  
Author(s):  
Yue Ying ◽  
Fuqing Zhang

Abstract Through a series of convection-permitting regional-scale ensembles based on the Weather Research and Forecasting (WRF) Model, this study investigates the predictability of multiscale weather and convectively coupled equatorial waves during the active phase of a Madden–Julian oscillation (MJO) event over the Indian Ocean from 12 October to 12 November 2011. It is found that the practical predictability limit, estimated from the spread of an ensemble perturbed with realistic initial and boundary uncertainties, is as much as 8 days for horizontal winds, temperature, and humidity at scales larger than 2000 km, which include equatorial Rossby, Kelvin, inertia–gravity, and mixed Rossby–gravity waves. The practical predictability limit decreases rapidly with decreasing scale, resulting in a predictable time scale of less than 1 day for scales smaller than 200 km. Through further experiments using minute initial and boundary perturbations an order of magnitude smaller than the current realistic uncertainties, the intrinsic predictability limit for tropical weather at the larger scales (>2000 km) is estimated to extend beyond 2 weeks, but the limit is likely still less than 3 days for the small scales (<200 km).
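Scale-dependent predictability estimates of this kind come from measuring ensemble spread separately in different wavelength bands. A minimal sketch, not the authors' code: the periodic 1D domain, grid spacing, and band edges are simplifying assumptions (the study works with full 2D model fields).

```python
# Illustrative sketch: RMS ensemble spread per wavelength band, obtained
# by FFT bandpass-filtering member deviations from the ensemble mean.
import numpy as np

def spread_by_scale(members, dx_km, bands_km):
    """RMS ensemble spread in each wavelength band.

    members: (n_members, n_points) field on a periodic 1D domain
    bands_km: list of (min_wavelength, max_wavelength) tuples
    """
    pert = members - members.mean(axis=0)          # deviations from mean
    spec = np.fft.rfft(pert, axis=1)
    wavelength = np.full(spec.shape[1], np.inf)    # mode 0 -> infinite scale
    k = np.arange(1, spec.shape[1])
    wavelength[1:] = members.shape[1] * dx_km / k  # wavelength of each mode
    out = {}
    for lo, hi in bands_km:
        filt = np.where((wavelength >= lo) & (wavelength < hi), spec, 0)
        band = np.fft.irfft(filt, n=members.shape[1], axis=1)
        out[(lo, hi)] = np.sqrt((band ** 2).mean())
    return out

rng = np.random.default_rng(5)
members = rng.normal(size=(30, 512))               # synthetic 30-member ensemble
for band, rms in spread_by_scale(members, 9.0, [(200, 2000), (2000, np.inf)]).items():
    print(band, f"{rms:.3f}")
```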


2020 ◽  
Author(s):  
Lambert Caron ◽  
Erik Ivins

Within the past decade, newly collected GPS data and geochronological constraints have resulted in the refinement of glacial isostatic adjustment (GIA) models for Antarctica. These are critical to understanding present-day ice mass changes. A correction must be made when using space gravimetry for ice mass balance assessments, as any vertical movement of the solid Earth masquerades as a change in ice mass and must be carefully removed. The main upshot of the new Antarctic GIA models is a downward revision of the negative ice mass trends deduced from the Gravity Recovery and Climate Experiment (GRACE), resulting from a reduced GIA correction. This revision places the GRACE-inferred mass balance trend within the 1-σ uncertainty of the mass balance deduced from altimetry. Because uncertainties in the Holocene ice history and the low-viscosity rheology beneath the West Antarctic Ice Sheet (WAIS) continue to vex further improvement in predictions of the present-day GIA gravity rate, more emphasis has been given to regional-scale GIA models. Here we use a Bayesian method to explore the gravimetric GIA trend over Antarctica, both with and without the impact of late Pleistocene Antarctic ice loads, along with the contribution of oceanic loads. We call the model without Antarctic-proximal loads a baseline for regional GIA models to build upon. We consider variations of the radial mantle viscosity profile and of the volume of continental-scale ice sheets during the last glacial cycle. The modeled baseline GIA is mainly controlled by the lower-mantle viscosity and by continental levering caused by ocean loading. We find that the predicted baseline GIA correction depends only weakly on the ice history. This correction averages to +28.4 [16.5–41.9, 95% confidence] Gt/yr. In contrast, with Pleistocene Antarctic-proximal ice included, the total modeled mass trend due to GIA is +73.7 [30.1–114.7] Gt/yr. A baseline GIA correction of 28.4 Gt/yr is of order 50% of the mean net mass trend measured during the period 1992-2017. The statistical analysis provides tools for synthesizing any regional Antarctic GIA model with a self-consistent far-field component. This may prove important for accounting for both global and regional 3-D variations in mantle viscosity.

© 2020 California Institute of Technology. Government sponsorship acknowledged. This work was performed at the California Institute of Technology's Jet Propulsion Laboratory under a contract with the National Aeronautics and Space Administration's Cryosphere Science Program.
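The quoted "+28.4 [16.5–41.9, 95% confidence] Gt/yr" is the kind of summary a Bayesian ensemble exploration produces: each candidate model is scored against data, and the weighted ensemble yields a mean and a credible interval. A minimal sketch of that final summarization step, not the authors' code: the sample trends and toy likelihood weights below are assumptions.

```python
# Illustrative sketch: weighted mean and 95% credible interval from an
# ensemble of GIA mass-trend predictions with Bayesian weights.
import numpy as np

def weighted_summary(trends, weights, ci=0.95):
    """Weighted mean and central credible interval of ensemble predictions."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()                                  # normalize to probabilities
    order = np.argsort(trends)
    t, w = np.asarray(trends)[order], w[order]
    cdf = np.cumsum(w)                            # weighted empirical CDF
    lo = t[np.searchsorted(cdf, (1 - ci) / 2)]
    hi = t[np.searchsorted(cdf, 1 - (1 - ci) / 2)]
    return (t * w).sum(), lo, hi

# Synthetic ensemble: trends (Gt/yr) scored against data (higher = better fit).
rng = np.random.default_rng(6)
trends = rng.normal(28.0, 8.0, 2000)
weights = np.exp(-0.5 * ((trends - 29.0) / 6.0) ** 2)  # toy likelihood
mean, lo, hi = weighted_summary(trends, weights)
print(f"GIA correction: {mean:+.1f} [{lo:.1f}-{hi:.1f}] Gt/yr")
```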


2014 ◽  
Vol 96 (4) ◽  
pp. 373-404
Author(s):  
Hunter Hollins

While spectator interest got aircraft off the ground, scientific inquiry initially fueled advances in design. But from a very early date, military application was a driving force. The histories of aircraft, the California Institute of Technology (Caltech), and Jet Propulsion Laboratory (JPL) bring to light the relative roles of science and military in the development of aerospace in Southern California.

