A Survey of X2 Isohaline Empirical Models for the San Francisco Estuary

Author(s):
John Rath
Paul Hutton
Eli Ateljevich
Sujoy Roy
...

This work surveys the performance of several empirical models, all recalibrated to a common data set, that were developed over the past 25 years to relate freshwater flow and salinity in the San Francisco Estuary (estuary). The estuary’s salinity regime—broadly regulated to meet urban, agricultural, and ecosystem beneficial uses—is managed in spring and certain fall months to meet ecosystem objectives by controlling the position of the 2 parts per thousand bottom salinity isohaline (referred to as X2). We tested five empirical models for accuracy and for their mean and transient behavior. We included a sixth model, employing a machine learning framework and variables other than outflow, in this survey to compare fitting skill, but did not subject it to the full suite of tests applied to the other five models. Model performance was observed to vary with hydrology, year, and season, and in some cases exhibited unique limitations arising from mathematical formulation. However, no single model formulation was found to be consistently superior across a wide range of tests and applications. One test revealed that the models performed equally well when recalibrated to a uniformly perturbed input time series. Thus, while the models may be used to identify anomalies or seasonal biases (the latter being the subject of a companion paper), their use as inverse models to infer freshwater outflow to the estuary from salinity observations is not expected to improve upon the absolute accuracy of existing outflow estimates. This survey suggests that, for analyses that span a long hydrologic record, an ensemble approach—rather than the use of any individual model on its own—may be preferable to exploit the strengths of individual models.
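To make the class of models concrete, the sketch below implements a generic autoregressive X2–outflow relationship of the kind surveyed here, in which today's X2 depends on yesterday's X2 and the logarithm of current outflow. The coefficients, units, and function names are illustrative placeholders, not the recalibrated values reported in this work.

```python
import numpy as np

def x2_autoregressive(outflow, x2_init, a=10.2, b=0.945, c=-1.49):
    """Generic autoregressive X2 model: X2(t) = a + b * X2(t-1) + c * log10(Q(t)).
    Coefficients are illustrative placeholders; units follow the calibration data."""
    x2 = np.empty(len(outflow))
    prev = x2_init
    for t, q in enumerate(outflow):
        prev = a + b * prev + c * np.log10(q)
        x2[t] = prev
    return x2

# Example: X2 (km upstream of the Golden Gate) relaxing toward a new equilibrium
# after a step decrease in daily outflow (hypothetical values, cfs)
daily_outflow = np.concatenate([np.full(30, 30000.0), np.full(30, 10000.0)])
print(x2_autoregressive(daily_outflow, x2_init=65.0)[[0, 29, 59]])
```

The lagged X2 term is what gives these models their transient behavior: a step change in outflow moves X2 only gradually toward its new equilibrium, which is one of the properties the survey's tests probe.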

Author(s):  
Michael J. Wagner
Guangdong Zhu

This paper presents the technical formulation and demonstrated model performance results of a new direct-steam-generation (DSG) model in NREL’s System Advisor Model (SAM). The model predicts the annual electricity production of a wide range of system configurations within the DSG Linear Fresnel technology by modeling hourly performance of the plant in detail. The quasi-steady-state formulation allows users to investigate energy and mass flows, operating temperatures, and pressure drops for geometries and solar field configurations of interest. The model includes tools for heat loss calculation using either empirical polynomial heat loss curves as a function of steam temperature, ambient temperature, and wind velocity, or a detailed evacuated tube receiver heat loss model. Thermal losses are evaluated using a computationally efficient nodal approach, in which the solar field and headers are discretized into multiple nodes whose heat losses, thermal inertia, and steam conditions (pressure, temperature, enthalpy, etc.) are evaluated individually during each time step of the simulation. This paper discusses the mathematical formulation for the solar field model and describes how the solar field is integrated with the other subsystem models, including the power cycle and optional auxiliary fossil system. Model results are also presented to demonstrate plant behavior in the various operating modes.
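As an illustration of the nodal, polynomial heat-loss option described above, the sketch below evaluates a loss curve at each node of a discretized solar field and sums the result. The function names and coefficients are my own placeholders for illustration, not SAM's implementation or fitted values.

```python
import numpy as np

def node_heat_loss_w_per_m(t_steam_c, t_amb_c, wind_m_s, coeffs=(0.0, 0.15, 0.002, 0.6)):
    """Illustrative polynomial receiver heat-loss curve (W per m of receiver length)
    as a function of steam temperature, ambient temperature, and wind velocity.
    The coefficients are placeholders only."""
    c0, c1, c2, c3 = coeffs
    dt = t_steam_c - t_amb_c
    return c0 + c1 * dt + c2 * dt ** 2 + c3 * wind_m_s

def field_heat_loss_kw(node_temps_c, node_lengths_m, t_amb_c, wind_m_s):
    """Sum nodal losses over a discretized solar field and header layout."""
    losses = [node_heat_loss_w_per_m(t, t_amb_c, wind_m_s) * length
              for t, length in zip(node_temps_c, node_lengths_m)]
    return sum(losses) / 1e3

# Example: 10 equal-length nodes spanning boiler-to-superheater steam temperatures
temps = np.linspace(250.0, 500.0, 10)   # node steam temperatures, deg C
lengths = np.full(10, 40.0)             # node lengths, m
print(f"{field_heat_loss_kw(temps, lengths, t_amb_c=25.0, wind_m_s=3.0):.1f} kW")
```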


2015
Vol 19 (1)
pp. 209-223
Author(s):
A. J. Newman
M. P. Clark
K. Sampson
A. Wood
L. E. Hay
...

Abstract. We present a community data set of daily forcing and hydrologic response data for 671 small- to medium-sized basins across the contiguous United States (median basin size of 336 km²) that spans a very wide range of hydroclimatic conditions. Area-averaged forcing data for the period 1980–2010 were generated for three basin spatial configurations – basin mean, hydrologic response units (HRUs) and elevation bands – by mapping daily, gridded meteorological data sets to the subbasin (Daymet) and basin polygons (Daymet, Maurer and NLDAS). Daily streamflow data were compiled from the United States Geological Survey National Water Information System. The focus of this paper is to (1) present the data set for community use and (2) provide a model performance benchmark using the coupled Snow-17 snow model and the Sacramento Soil Moisture Accounting Model, calibrated using the shuffled complex evolution global optimization routine. After optimization minimizing daily root mean squared error, 90% of the basins have Nash–Sutcliffe efficiency scores ≥ 0.55 for the calibration period and 34% have scores ≥ 0.8. This benchmark provides a reference level of hydrologic model performance for a commonly used model and calibration system, and highlights some regional variations in model performance. For example, basins with a more pronounced seasonal cycle generally have a negative low flow bias, while basins with a smaller seasonal cycle have a positive low flow bias. Finally, we find that data points with extreme error (defined as individual days with a high fraction of total error) are more common in arid basins with limited snow and, for a given aridity, fewer extreme error days are present as the basin snow water equivalent increases.
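For readers reproducing the benchmark statistics, the sketch below shows the two metrics referenced above: the daily root mean squared error used as the calibration objective and the Nash–Sutcliffe efficiency used to summarize performance. The helper names and synthetic flows are mine, not part of the published data set.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash–Sutcliffe efficiency: 1 - SSE / variance of observations about their mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def daily_rmse(obs, sim):
    """Root mean squared error of daily flows (the calibration objective minimized here)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

# Example with synthetic daily flows (m^3/s)
rng = np.random.default_rng(0)
obs = 10 + 5 * np.sin(np.linspace(0, 6 * np.pi, 365)) + rng.normal(0, 1, 365)
sim = obs + rng.normal(0, 1.5, 365)
print(round(nash_sutcliffe(obs, sim), 2), round(daily_rmse(obs, sim), 2))
```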


2018
Vol 44 (2)
pp. 144-179
Author(s):
Eric Parsons
Cory Koedel
Li Tan

We study the relative performance of two policy-relevant value-added models—a one-step fixed effect model and a two-step aggregated residuals model—using a simulated data set well grounded in the value-added literature. A key feature of our data generating process is that student achievement depends on a continuous measure of economic disadvantage. This is a realistic condition that has implications for model performance because researchers typically have access to only a noisy, binary measure of disadvantage. We find that one- and two-step value-added models perform similarly across a wide range of student and teacher sorting conditions, with the two-step model modestly outperforming the one-step model in conditions that best match observed sorting in real data. A reason for the generally superior performance of the two-step model is that it better handles the use of an error-prone, dichotomous proxy for student disadvantage.
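The sketch below contrasts the two estimators on a toy panel: a one-step regression with teacher fixed effects, and a two-step approach that fits a student-level model first and then averages residuals within teacher. It is a simplified stand-in for the paper's models; the variables (including the noisy binary "frl" proxy for continuous disadvantage) are synthetic and the names are mine.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy data: students nested in teachers; "frl" is a noisy binary proxy for a
# continuous disadvantage measure, mirroring the paper's setup in spirit.
rng = np.random.default_rng(1)
n_teachers, n_per = 50, 40
teacher = np.repeat(np.arange(n_teachers), n_per)
teacher_effect = rng.normal(0, 0.15, n_teachers)[teacher]
disadvantage = rng.normal(0, 1, n_teachers * n_per)          # continuous, unobserved
frl = (disadvantage + rng.normal(0, 0.7, disadvantage.size) > 0).astype(int)
prior = rng.normal(0, 1, disadvantage.size)
score = 0.6 * prior - 0.3 * disadvantage + teacher_effect + rng.normal(0, 0.5, disadvantage.size)
df = pd.DataFrame({"score": score, "prior": prior, "frl": frl, "teacher": teacher})

# One-step: teacher fixed effects estimated jointly with the student covariates.
one_step = smf.ols("score ~ prior + frl + C(teacher)", data=df).fit()

# Two-step: fit the student-level model without teacher terms, then average
# residuals within teacher to form the value-added estimate.
step1 = smf.ols("score ~ prior + frl", data=df).fit()
df["resid"] = step1.resid
two_step_va = df.groupby("teacher")["resid"].mean()
print(two_step_va.head())
```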


2014
Vol 4 (3)
pp. 4-27
Author(s):
Erin Beller
Ruth Askevold
Robin Grossinger

These maps, based on research by the San Francisco Estuary Institute’s Center for Resilient Landscapes, reconstruct California ecosystems as they were in the late 18th and early 19th centuries, and compare them to present-day landscapes. They are designed to provide an understanding of the complexity and diversity of California ecosystems, to help explain how landscapes worked, to track persistence and change, and to identify potential future scenarios. The changes made evident when the maps are compared remind us of the enormous power we have to shape the landscapes we inhabit, and of the wide range of potential options available—options to create diverse, functional, and beautiful landscapes, inspired by the past and grounded in local potential—as we imagine and then create the future.


2018
Vol 61 (5)
pp. 1713-1727
Author(s):
Ruilan Dong
Hongmin Dong
Karen A. Beauchemin
Hongwei Xin

Abstract. Manure nitrogen (N) output from dairy cattle is a major environmental concern in China. Various empirical models are available to predict manure N output from dairy cattle, but the accuracy and precision of these models have not been assessed for Chinese conditions. The objective of this study was to evaluate the performance of extant models that predict different forms of manure N output for lactating dairy cows in China, with the aim of identifying the best-fit and most suitable prediction models. A total of 35 empirical models were evaluated for their ability to predict N excretion of dairy cows in China fed a wide range of diets. The data set consisted of 99 treatment means from 32 publications with information on animal and dietary characteristics and N output flows. Performance of the models was evaluated using root mean square prediction error (RMSPE) and concordance correlation coefficient (CCC) analysis. A model (eq. 19) based on N intake (NI) was selected as best for predicting fecal N excretion (RMSPE = 15.8% and CCC = 0.75). Models that also used NI as an input variable were most suitable for predicting urinary N (RMSPE = 26.0% and CCC = 0.63, eq. 14) and total N (RMSPE = 15.8% and CCC = 0.81, eq. 31). Models predicting urinary urea N (UUN) and urinary N / total N performed poorly. Overall, the deviation of the regression line from the equality line (y = x line) for even the best-fit urinary, fecal, and total N excretion models demonstrated the need to develop improved models for use under Chinese conditions. Using N output data from dairy cows in China to develop manure N output models may help improve environmental stewardship of the dairy industry in China.
Keywords: Dairy cows, Evaluation, Manures, Model performance, Nitrogen excretion.
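The two evaluation statistics named above are straightforward to compute; the sketch below gives one implementation of RMSPE (expressed as a percentage of the observed mean) and Lin's concordance correlation coefficient, applied to a hypothetical NI-based prediction. The data and regression coefficients are synthetic placeholders, not the surveyed equations.

```python
import numpy as np

def rmspe_percent(obs, pred):
    """Root mean square prediction error as a percentage of the observed mean."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return 100.0 * np.sqrt(np.mean((obs - pred) ** 2)) / obs.mean()

def concordance_cc(obs, pred):
    """Lin's concordance correlation coefficient (agreement with the y = x line)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    cov = np.mean((obs - obs.mean()) * (pred - pred.mean()))
    return 2 * cov / (obs.var() + pred.var() + (obs.mean() - pred.mean()) ** 2)

# Example: an NI-based prediction of fecal N of the form fecal_N = a + b * NI
# (coefficients below are placeholders, not the surveyed equations)
ni = np.linspace(300, 700, 25)                      # N intake, g/d
obs_fecal_n = 0.30 * ni + np.random.default_rng(2).normal(0, 15, ni.size)
pred_fecal_n = 20 + 0.27 * ni
print(round(rmspe_percent(obs_fecal_n, pred_fecal_n), 1),
      round(concordance_cc(obs_fecal_n, pred_fecal_n), 2))
```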


2017
Author(s):
Ana Clara Santos
Maria Manuela Portela
Andrea Rinaldo
Bettina Schaefli

Abstract. This paper assesses the performance of an analytical modeling framework for streamflow probability distributions for summer streamflow of 26 Swiss catchments characterized by negligible anthropic influence. These catchments show a wide range of hydroclimatic regimes, including snowmelt- and icemelt-influenced streamflows. The model parameters are estimated from a gridded daily precipitation data set and observed daily discharge time series. The performance of the linear and nonlinear model versions is assessed in terms of reproducing observed flow duration curves and their natural variability. The results show that the model performs well for summer discharges under all analyzed regimes and that model performance clearly increases with mean catchment elevation (i.e., with the transition from rainfall-dominated to snow-influenced regimes). The nonlinear model version outperforms the linear model for all regimes, but the performance difference also decreases with mean catchment elevation. Future work will focus on the extension of the modeling framework, addressing snowmelt and snowfall onset.
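Since the assessment hinges on comparing modeled and observed flow duration curves, the sketch below builds an empirical flow duration curve from a season of daily discharge. It is a generic helper with synthetic data, not part of the authors' analytical framework.

```python
import numpy as np

def flow_duration_curve(daily_q):
    """Empirical flow duration curve: flows sorted in descending order paired with
    their exceedance probabilities (Weibull plotting positions)."""
    q = np.sort(np.asarray(daily_q, float))[::-1]
    exceedance = np.arange(1, q.size + 1) / (q.size + 1)
    return exceedance, q

# Example: one summer of daily discharges for a single catchment (synthetic, m^3/s)
rng = np.random.default_rng(3)
summer_q = rng.gamma(shape=2.0, scale=3.0, size=92)
p, q = flow_duration_curve(summer_q)
print(f"Q50 ~ {np.interp(0.5, p, q):.2f} m^3/s, Q95 ~ {np.interp(0.95, p, q):.2f} m^3/s")
```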


Author(s):  
Dylan Stompe
Peter Moyle
Avery Kruger
John Durand

Many fishes in the San Francisco Estuary have suffered declines in recent decades, as shown by numerous long-term monitoring programs. A long-term monitoring program, such as the Interagency Ecological Program, comprises a suite of surveys, each conducted by a state or federal agency or academic institution. These types of programs have produced rich data sets that are useful for tracking species trends over time. Problems arise from drawing conclusions based on one or a few surveys, because each survey samples a different subset of species or reflects different spatial or temporal trends in abundance. The challenges in using data sets from these surveys for comparative purposes stem from methodological differences, the sheer volume of data, incompatible data formats, and end-user preference for familiar surveys. To improve the utility of these data sets and encourage multi-survey analyses, we quantitatively rate these surveys based on their ability to represent species trends, present a methodology for integrating long-term data sets, and provide examples that highlight the importance of expanded analyses. We identify areas and species that are under-sampled, and compare fish salvage data from large water export facilities with survey data. Our analysis indicates that while surveys are redundant for some species, no two surveys are completely duplicative. Differing trends become evident when considering individual and aggregate survey data because they reflect spatial, seasonal, or gear-dependent differences in catch. Our quantitative ratings and integrated data set allow for improved and better-informed comparisons of species trends across surveys, while highlighting the importance of the current array of sampling methodologies.
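One generic way to compare species trends across surveys with different gears and units, in the spirit of the multi-survey analyses described above (though not necessarily the authors' rating or integration method), is to standardize each survey's annual index before combining them, as in the sketch below. The survey names and values are hypothetical.

```python
import numpy as np
import pandas as pd

def standardized_trend(annual_indices: pd.DataFrame) -> pd.Series:
    """annual_indices: rows = years, columns = surveys (e.g., catch-per-unit-effort).
    Each survey is z-scored so different gears and units share a common scale,
    then averaged across surveys into a composite trend."""
    z = (annual_indices - annual_indices.mean()) / annual_indices.std(ddof=1)
    return z.mean(axis=1)

# Example with two hypothetical surveys sampling the same declining species
years = range(2000, 2010)
df = pd.DataFrame(
    {"survey_A": np.linspace(50, 10, 10),
     "survey_B": np.linspace(400, 120, 10) + np.random.default_rng(4).normal(0, 20, 10)},
    index=years)
print(standardized_trend(df))
```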


Author(s):  
Vanessa Tobias
Randall Baxter

Abundance of estuarine fish species has declined globally. In the San Francisco Estuary (SFE), long-term monitoring has documented declines of many species, including the anadromous Longfin Smelt (Spirinchus thaleichthys). To improve management and recovery planning, we identified patterns in the timing, seasonal occupancy, and distribution of Longfin Smelt in a monitoring study (San Francisco Bay Study) for five regions of the SFE using a generalized additive model. We then investigated the year-to-year variability in the shape of the seasonal relationships using functional data analysis (FDA). FDA separated the variability due to population size from variability due to differences in occupancy timing. We found that Longfin Smelt have a consistent seasonal distribution pattern, that two trawl types were needed to accurately describe the pattern, and that the pattern is largely consistent with the hypothesized conceptual model. After accounting for variability in occupancy due to year-class strength, the timing of occupancy has shifted in three regions. The most variable period for the upstream regions (Suisun Bay and the Confluence) was age-0 summer; for the downstream region (Central Bay), it was age-0 late fall. This manifested as a recent delay in the typical fall re-occupation of upstream regions, reducing Longfin Smelt abundance as calculated by another monitoring study (the Fall Midwater Trawl); thus, a portion of the recent reduction in Fall Midwater Trawl abundance of Longfin Smelt results from changes in behavior rather than a decline in abundance. The presence of multiple monitoring surveys allowed analysis of distribution from one data set to interpret patterns in abundance of another. Future investigations will examine environmental conditions as covariates during these periods and could improve our understanding of what conditions contribute to the shifting occupancy timing of Longfin Smelt, and possibly provide insight into the long-term quality of the San Francisco Estuary as habitat.
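To illustrate the kind of seasonal occupancy curve described above, the sketch below fits a spline-based binomial GLM to synthetic monthly presence/absence data. It is a simplified stand-in for the paper's generalized additive model; the data, region, and smoothing choices are mine.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic monthly presence/absence of age-0 fish in one region (illustrative data only)
rng = np.random.default_rng(5)
month = np.tile(np.arange(1, 13), 20)   # 20 years of monthly samples
p_true = 1.0 / (1.0 + np.exp(-2.0 * np.cos((month - 7) / 12.0 * 2 * np.pi)))  # summer peak
df = pd.DataFrame({"month": month, "present": rng.binomial(1, p_true)})

# Smooth seasonal occupancy curve: a B-spline term for month inside a binomial GLM,
# a simplified stand-in for a GAM seasonal smooth.
fit = smf.glm("present ~ bs(month, df=5)", data=df,
              family=sm.families.Binomial()).fit()
seasonal = fit.predict(pd.DataFrame({"month": np.arange(1, 13)}))
print(seasonal.round(2))                # predicted occupancy probability by month
```

Refitting such a curve year by year and comparing the fitted shapes is one simple way to see shifts in occupancy timing of the sort the FDA analysis quantifies.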


Author(s):  
John-Carlos Perea
Jacob E. Perea

The concepts of expectation, anomaly, and unexpectedness that Philip J. Deloria developed in Indians in Unexpected Places (2004) have shaped a wide range of interdisciplinary research projects. In the process, those terms have changed the ways it is possible to think about American Indian representation, cosmopolitanism, and agency. This article revisits my own work in this area and provides a short survey of related scholarship in order to reassess the concept of unexpectedness in the present moment and to consider the ways my deployment of it might change in order to better meet the needs of my students. To begin a process of engaging intergenerational perspectives on this subject, the article concludes with an interview with Dr. Jacob E. Perea, dean emeritus of the Graduate College of Education at San Francisco State University and a veteran of the 1969 student strikes that founded the College of Ethnic Studies at San Francisco State University.


2019
Vol 16 (7)
pp. 808-817
Author(s):
Laxmi Banjare
Sant Kumar Verma
Akhlesh Kumar Jain
Suresh Thareja

Background: In spite of the availability of various treatment approaches, including surgery, radiotherapy, and hormonal therapy, steroidal aromatase inhibitors (SAIs) play a significant role as chemotherapeutic agents for the treatment of estrogen-dependent breast cancer, with the benefit of a reduced risk of recurrence. However, owing to the greater toxicity and side effects associated with currently available anti-breast cancer agents, there is an urgent need to develop target-specific AIs with a safer anti-breast cancer profile.
Methods: Designing target-specific and less toxic SAIs is a challenging task, although molecular modeling tools, viz. molecular docking simulations and QSAR, have been used for more than two decades for the fast and efficient design of novel, selective, potent, and safe molecules against various biological targets to fight a number of dreaded diseases/disorders. In order to design novel and selective SAIs, structure-guided, molecular docking assisted, alignment-dependent 3D-QSAR studies were performed on a data set comprising 22 molecules bearing a steroidal scaffold with a wide range of aromatase inhibitory activity.
Results: The 3D-QSAR model developed using the molecular weighted (MW) extent alignment approach showed good statistical quality and predictive ability when compared to the model developed using the moments of inertia (MI) alignment approach.
Conclusion: The explored binding interactions and generated pharmacophoric features (steric and electrostatic) of the steroidal molecules could be exploited for the further design, direct synthesis, and development of new, potentially safer SAIs that can be effective in reducing the mortality and morbidity associated with breast cancer.
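Model quality in 3D-QSAR work of this kind is commonly summarized with a fitted r² and a leave-one-out cross-validated q²; the sketch below computes both for a PLS regression on synthetic field descriptors. It is a generic illustration of the validation statistics only, not the authors' modeling workflow, alignment procedure, or data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Synthetic stand-in for aligned 3D-QSAR descriptors: 22 molecules, each described
# by steric/electrostatic field values at grid points (illustrative numbers only).
rng = np.random.default_rng(6)
X = rng.normal(size=(22, 150))
true_w = rng.normal(size=150) * (rng.random(150) < 0.1)
y = X @ true_w + rng.normal(0, 0.3, 22)             # pIC50-like activities (synthetic)

pls = PLSRegression(n_components=3)
pls.fit(X, y)
r2 = pls.score(X, y)                                 # fitted r^2
y_loo = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()
q2 = 1 - np.sum((y - y_loo) ** 2) / np.sum((y - y.mean()) ** 2)   # LOO q^2
print(f"r2 = {r2:.2f}, q2 = {q2:.2f}")
```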

