Perennial Crop Dynamics May Affect Long-Run Groundwater Levels

Land, 2021, Vol 10 (9), pp. 971
Author(s): Bradley Franklin, Kurt Schwabe, Lucia Levers

During California’s severe drought from 2011 to 2017, a significant shift in irrigated area from annual to perennial crops occurred. Due to the time required to bring perennial crops to maturity, more perennial acreage likely increases the opportunity cost of fallowing, a common drought mitigation strategy. Higher fallowing costs may put additional pressure on another common “go-to” drought mitigation strategy: groundwater pumping. Yet overdrafted groundwater systems are increasingly the norm worldwide. In response to aquifer depletion, as evidenced in California, sustainable groundwater management policies are being implemented. There has been little modeling of the potential effect of increased perennial crop production on groundwater use and the implications for public policy. A dynamic, integrated, deterministic model of agricultural production in Kern County, CA, is developed here, with both groundwater and perennial area by vintage treated as stock variables. Model scenarios investigate the impacts of surface water reductions and perennial prices on land and groundwater use. The results generally indicate that perennial production may lead to slower aquifer draw-down than deterministic models lacking perennial crop dynamics suggest, highlighting the importance of accounting for the dynamic nature of perennial crops in understanding the co-evolution of agricultural and groundwater systems under climate change.
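The core mechanism described here, a groundwater stock and perennial acreage tracked by vintage, with mature acreage locking in irrigation demand, can be sketched in a few lines. This is not the paper's model; the state layout, parameter values, and units are all illustrative assumptions.

```python
# Minimal sketch of a dynamic model with two stock variables: an
# aquifer and perennial acreage by vintage. All numbers are invented.
RECHARGE = 0.2           # natural recharge per year (illustrative units)
WATER_PER_ACRE = 0.004   # applied water per standing acre

def step(aquifer, vintages, surface_water, new_plantings):
    """Advance one year. vintages[i] = acres of age i; the last slot
    is a 'mature' bucket that accumulates all older vintages."""
    # Age the vintage structure and add this year's plantings.
    vintages = [new_plantings] + vintages[:-2] + [vintages[-2] + vintages[-1]]
    # All standing perennial acres must be irrigated: fallowing them
    # forfeits the sunk establishment cost, which is the opportunity
    # cost the abstract describes.
    demand = sum(vintages) * WATER_PER_ACRE
    pumping = max(0.0, demand - surface_water)
    aquifer = aquifer + RECHARGE - pumping
    return aquifer, vintages, pumping

aquifer, vintages = 1000.0, [0.0, 0.0, 0.0, 100.0]
for year in range(10):
    aquifer, vintages, pumped = step(aquifer, vintages,
                                     surface_water=0.3, new_plantings=10.0)
```

Running two scenarios that differ only in surface-water allocation shows the mechanism: a surface-water cut is absorbed by pumping, drawing the aquifer stock down faster.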

2019, Vol 12 (1), pp. 96
Author(s): James Brinkhoff, Justin Vardanega, Andrew J. Robson

Land cover mapping of intensive cropping areas facilitates an enhanced regional response to biosecurity threats and to natural disasters such as drought and flooding. Such maps also provide information for natural resource planning and analysis of the temporal and spatial trends in crop distribution and gross production. In this work, 10 meter resolution land cover maps were generated over a 6200 km2 area of the Riverina region in New South Wales (NSW), Australia, with a focus on locating the most important perennial crops in the region. The maps discriminated between 12 classes, including nine perennial crop classes. A satellite image time series (SITS) of freely available Sentinel-1 synthetic aperture radar (SAR) and Sentinel-2 multispectral imagery was used. A segmentation technique grouped spectrally similar adjacent pixels together, to enable object-based image analysis (OBIA). K-means unsupervised clustering was used to filter training points and classify some map areas, which improved supervised classification of the remaining areas. The support vector machine (SVM) supervised classifier with radial basis function (RBF) kernel gave the best results among several algorithms trialled. The accuracies of maps generated using several combinations of the multispectral and radar bands were compared to assess the relative value of each combination. An object-based post classification refinement step was developed, enabling optimization of the tradeoff between producers’ accuracy and users’ accuracy. Accuracy was assessed against randomly sampled segments, and the final map achieved an overall count-based accuracy of 84.8% and area-weighted accuracy of 90.9%. Producers’ accuracies for the perennial crop classes ranged from 78 to 100%, and users’ accuracies ranged from 63 to 100%. This work develops methods to generate detailed and large-scale maps that accurately discriminate between many perennial crops and can be updated frequently.
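The count-based accuracy measures quoted above (overall, producers', and users' accuracy) all come from a confusion matrix of reference versus mapped classes. A small sketch, using a 2-class matrix invented purely for illustration:

```python
# Accuracy measures from a confusion matrix where rows are the
# reference (ground-truth) class and columns the mapped class.
def accuracy_metrics(cm):
    n = sum(sum(row) for row in cm)
    overall = sum(cm[i][i] for i in range(len(cm))) / n
    # Producers' accuracy: of the reference samples of class i, the
    # fraction mapped to class i (sensitive to omission errors).
    producers = [cm[i][i] / sum(cm[i]) for i in range(len(cm))]
    # Users' accuracy: of the samples mapped to class j, the fraction
    # truly of class j (sensitive to commission errors).
    col_sum = lambda j: sum(cm[i][j] for i in range(len(cm)))
    users = [cm[j][j] / col_sum(j) for j in range(len(cm))]
    return overall, producers, users

cm = [[50, 10],   # invented counts: 60 reference samples of class 0
      [5, 35]]    # and 40 of class 1
overall, prod, users = accuracy_metrics(cm)
# overall = 0.85; producers = [50/60, 35/40]; users = [50/55, 35/45]
```

The post-classification refinement step the authors describe trades these two per-class measures off against each other: relabeling doubtful segments raises users' accuracy at the cost of producers' accuracy, or vice versa.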


2021, Vol 20 (5), pp. 1-34
Author(s): Edward A. Lee

This article is about deterministic models, what they are, why they are useful, and what their limitations are. First, the article emphasizes that determinism is a property of models, not of physical systems. Whether a model is deterministic or not depends on how one defines the inputs and behavior of the model. To define behavior, one has to define an observer. The article compares and contrasts two classes of ways to define an observer, one based on the notion of “state” and another that more flexibly defines the observables. The notion of “state” is shown to be problematic and to lead to nondeterminism that is avoided when the observables are defined differently. The article examines determinism in models of the physical world. In what may surprise many readers, it shows that Newtonian physics admits nondeterminism and that quantum physics may be interpreted as a deterministic model. Moreover, it shows that both relativity and quantum physics undermine the notion of “state” and therefore require more flexible ways of defining observables. Finally, the article reviews results showing that sufficiently rich sets of deterministic models are incomplete. Specifically, nondeterminism is inescapable in any system of models rich enough to encompass Newton’s laws.


2016, Vol 16 (24), pp. 15629-15652
Author(s): Ioannis Kioutsioukis, Ulas Im, Efisio Solazzo, Roberto Bianconi, Alba Badia, et al.

Abstract. Simulations from chemical weather models are subject to uncertainties in the input data (e.g. emission inventory, initial and boundary conditions) as well as those intrinsic to the model (e.g. physical parameterization, chemical mechanism). Multi-model ensembles can improve the forecast skill, provided that certain mathematical conditions are fulfilled. In this work, four ensemble methods were applied to two different datasets, and their performance was compared for ozone (O3), nitrogen dioxide (NO2) and particulate matter (PM10). Apart from the unconditional ensemble average, the approach behind the other three methods relies on adding optimum weights to members or constraining the ensemble to those members that meet certain conditions in the time or frequency domain. The two different datasets were created for the first and second phase of the Air Quality Model Evaluation International Initiative (AQMEII). The methods are evaluated against ground level observations collected from the EMEP (European Monitoring and Evaluation Programme) and AirBase databases. The goal of the study is to quantify to what extent we can extract predictable signals from an ensemble with superior skill over the single models and the ensemble mean. Verification statistics show that the deterministic models simulate O3 better than NO2 and PM10, linked to different levels of complexity in the represented processes. The unconditional ensemble mean achieves higher skill compared to each station's best deterministic model at no more than 60 % of the sites, indicating a combination of members with unbalanced skill difference and error dependence for the rest. The promotion of the right amount of accuracy and diversity within the ensemble results in an average additional skill of up to 31 % compared to using the full ensemble in an unconditional way. The skill improvements were higher for O3 and lower for PM10, associated with the extent of potential changes in the joint distribution of accuracy and diversity in the ensembles. The skill enhancement was superior using the weighting scheme, but the training period required to acquire representative weights was longer compared to the sub-selecting schemes. Further development of the method is discussed in the conclusion.
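The weighting idea can be sketched concretely: fit member weights against observations over a training period, then compare the weighted combination with the equal-weight ensemble mean out of sample. This is only an illustration of the principle; the synthetic "members" and "observations" below are invented, and the paper's actual weighting scheme may differ.

```python
# Least-squares ensemble weights vs. the unconditional ensemble mean.
import numpy as np

rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, 6, 200))
# Three synthetic members: accurate, biased, and noisy.
members = np.stack([truth + 0.05 * rng.standard_normal(200),
                    truth + 0.5,
                    truth + 0.3 * rng.standard_normal(200)], axis=1)
obs = truth

train, test = slice(0, 100), slice(100, 200)
# Weights minimizing squared error over the training period.
w, *_ = np.linalg.lstsq(members[train], obs[train], rcond=None)

def rmse(pred, o):
    return float(np.sqrt(np.mean((pred - o) ** 2)))

rmse_mean = rmse(members[test].mean(axis=1), obs[test])       # equal weights
rmse_weighted = rmse(members[test] @ w, obs[test])            # fitted weights
```

Here the fitted weights discount the biased member, so the weighted combination beats the unconditional mean out of sample, mirroring the conditional-skill gain the abstract quantifies.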


2018
Author(s): Ming Yang, Louis Z. Yang

Abstract. It is unclear what values of relative numerical tolerance should be chosen when simulating a deterministic model of a biochemical reaction, which impairs the modeling effort, since the simulation outcomes of a model may depend on the relative numerical tolerance values. In an attempt to provide a guideline for selecting appropriate numerical tolerance values in simulations of in vivo biochemical reactions, reasonable numerical tolerance values were estimated based on the uncertainty principle and assumptions about related cellular parameters. The calculations indicate that relative numerical tolerance values can reasonably be set at or around 10⁻⁴ for concentrations expressed in ng/L. This work also suggests that further reducing the relative numerical tolerance values may result in erroneous simulation results.
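Where the relative tolerance enters in practice is as a solver setting. A minimal sketch, using SciPy's `solve_ivp` on a first-order decay reaction with `rtol=1e-4`; the reaction, rate constant, and initial concentration are illustrative and not from the paper.

```python
# Integrate dA/dt = -k*A at rtol = 1e-4 and compare with the analytic
# solution A(t) = A0 * exp(-k*t). Parameters are invented.
import numpy as np
from scipy.integrate import solve_ivp

k = 0.1  # decay rate (1/h, illustrative)

def rhs(t, y):
    return [-k * y[0]]

sol = solve_ivp(rhs, (0.0, 10.0), [100.0],   # concentration in ng/L
                rtol=1e-4, atol=1e-8, dense_output=True)
numeric = sol.sol(10.0)[0]
analytic = 100.0 * np.exp(-k * 10.0)
rel_err = abs(numeric - analytic) / analytic
```

For a problem this simple the achieved relative error tracks the requested tolerance; the paper's point is that for in vivo concentrations, pushing `rtol` far below ~10⁻⁴ asks for precision beyond what the physical quantities can meaningfully carry.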


2017
Author(s): Nuno R. Nené, Alistair S. Dunham, Christopher J. R. Illingworth

Abstract. A common challenge arising from the observation of an evolutionary system over time is to infer the magnitude of selection acting upon a specific genetic variant, or variants, within the population. The inference of selection may be confounded by the effects of genetic drift in a system, leading to the development of inference procedures to account for these effects. However, recent work has suggested that deterministic models of evolution may be effective in capturing the effects of selection even under complex models of demography, suggesting the more general application of deterministic approaches to inference. Responding to this literature, we here note a case in which a deterministic model of evolution may give highly misleading inferences, resulting from the non-deterministic properties of mutation in a finite population. We propose an alternative approach which corrects for this error, which we denote the delay-deterministic model. Applying our model to a simple evolutionary system, we demonstrate its performance in quantifying the extent of selection acting within that system. We further consider the application of our model to sequence data from an evolutionary experiment. We outline scenarios in which our model may produce improved results for the inference of selection, noting that such situations can be easily identified via the use of a regular deterministic model.
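The "regular deterministic model" in this setting is a frequency recursion with no drift term. A minimal sketch of the standard haploid selection recursion p' = p(1+s)/(1+ps), with illustrative values of s and the starting frequency (this is the generic textbook model, not the paper's delay-deterministic variant):

```python
# Deterministic trajectory of a variant's frequency under haploid
# selection with coefficient s; no genetic drift is modeled.
def trajectory(p0, s, generations):
    freqs = [p0]
    for _ in range(generations):
        p = freqs[-1]
        freqs.append(p * (1 + s) / (1 + p * s))  # p' = p*w_A / w_bar
    return freqs

traj = trajectory(p0=0.01, s=0.1, generations=100)
# With s > 0 the frequency rises monotonically toward fixation.
```

The failure mode the abstract flags is visible in the structure of this model: it propagates a variant deterministically from its nominal origin, whereas in a finite population the establishment of a new mutant is stochastic and effectively delayed, which is what the delay-deterministic correction addresses.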


2019, Vol 11 (2), pp. 229-245
Author(s): Fatemeh Delkhosh, Seyed Jafar Sadjadi

Abstract. The growing demand for fuels, combined with the fact that fossil fuel resources are limited, has led the world to seek renewable energy resources such as biofuels. Micro-algae can be an efficient source of biofuel energy, since it significantly reduces air pollution. In this paper, we develop a micro-algae biofuel supply chain through a two-stage approach. This study aims to commercialize micro-algae as a new source of energy. In the first stage, we utilize the Best-Worst Method (BWM) to determine the best cultivation system, and in the second stage, a bi-objective mathematical model is presented which simultaneously optimizes the economic and environmental objectives. We also propose a robust optimization model to deal with the uncertain nature of the biofuel supply chain. Our analysis of the trade-off between the supply chain’s total cost and unfulfilled demand yields interesting managerial insights. Furthermore, to show the effectiveness of the robust optimization model, we compare the performance of the robust and deterministic models, and the results show that the robust model dominates the deterministic model in all scenarios. Finally, sensitivity analysis on critical parameters is conducted to help decision-makers find the optimal trade-off between investment and its benefits.
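The robust-vs-deterministic comparison can be illustrated on a toy problem far simpler than the paper's supply chain: a single product whose demand is only known to lie in an interval. The deterministic plan covers nominal demand; the robust plan covers the worst case in the uncertainty set. All numbers and the one-variable LP are invented for illustration.

```python
# Toy box-uncertainty comparison via a linear program.
from scipy.optimize import linprog

cost = [2.0]                 # production cost per unit (illustrative)
d_nom, delta = 100.0, 15.0   # nominal demand and uncertainty half-width

def plan(demand):
    # minimize cost*x  s.t.  x >= demand  (written as -x <= -demand)
    res = linprog(cost, A_ub=[[-1.0]], b_ub=[-demand], bounds=[(0, None)])
    return res.x[0], res.fun

x_det, c_det = plan(d_nom)            # deterministic: covers 100 units
x_rob, c_rob = plan(d_nom + delta)    # robust: covers the worst case, 115
```

The robust plan costs more at the nominal scenario but remains feasible, with no unfulfilled demand, for every realization in the uncertainty set, which is the sense in which a robust solution can dominate once uncertainty is accounted for.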


2013, Vol 103 (2), pp. 117-128
Author(s): Mark S. Sisterson, Drake C. Stenger

Replacement of diseased plants with healthy plants is commonly used to manage spread of plant pathogens in perennial cropping systems. This strategy has two potential benefits. First, removing infected plants may slow pathogen spread by eliminating inoculum sources. Second, replacing infected plants with uninfected plants may offset yield losses due to disease. The extent to which these benefits are realized depends on multiple factors. In this study, sensitivity analyses of two spatially explicit simulation models were used to evaluate how assumptions concerning implementation of a plant replacement program and pathogen spread interact to affect disease suppression. In conjunction, effects of assumptions concerning yield loss associated with disease and rates of plant maturity on yields were simultaneously evaluated. The first model was used to evaluate effects of plant replacement on pathogen spread and yield on a single farm, consisting of a perennial crop monoculture. The second model evaluated effects of plant replacement on pathogen spread and yield in a 100-farm crop-growing region, with all farms maintaining a monoculture of the same perennial crop. Results indicated that efficient replacement of infected plants combined with a high degree of compliance among farms effectively slowed pathogen spread, resulting in replacement of few plants and high yields. In contrast, inefficient replacement of infected plants or limited compliance among farms failed to slow pathogen spread, resulting in replacement of large numbers of plants (on farms practicing replacement) with little yield benefit. Replacement of infected plants always increased yields relative to simulations without plant replacement, provided that infected plants produced no usable yield. However, if infected plants produced usable yields, inefficient removal of infected plants resulted in lower yields relative to simulations without plant replacement for perennial crops with long maturation periods in some cases.
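The efficient-vs-inefficient replacement contrast can be captured by a deterministic sketch far simpler than the spatially explicit models described above: infected plants spread disease at some per-season rate, and a fraction r of infected plants is found and replaced each season. All parameters are illustrative, not from the paper.

```python
# Single-farm replacement dynamics (deterministic, non-spatial sketch).
def simulate(r, n=1000, i0=10, beta=0.3, seasons=50):
    """Return the infected-plant count after `seasons`, where beta is
    the per-season spread rate and r the replacement (removal) rate."""
    infected = float(i0)
    for _ in range(seasons):
        new_infections = beta * infected * (n - infected) / n
        replaced = r * infected
        infected = max(0.0, min(float(n), infected + new_infections - replaced))
    return infected

low  = simulate(r=0.05)   # inefficient replacement: r << beta
high = simulate(r=0.35)   # efficient replacement: r > beta
```

With the removal rate above the spread rate the infection is driven toward zero (few plants ever replaced, high yields); below it, the disease saturates anyway and replacement effort is largely wasted, which is the qualitative result the abstract reports.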


1971, Vol 93 (3), pp. 814-817
Author(s): Richard H. Lyon

The interaction of structures and sound fields frequently involves many degrees of freedom of each participant in a complicated interaction process. In one way or another, statistical or deterministic models of the systems involved are applied in such problems and the resulting vibration (or radiation) is calculated, frequently to a satisfactory degree of accuracy. At other times, however, the results of these calculations suggest that a statistical rather than deterministic model might have been more satisfactory (or vice versa). What is clearly lacking is an a priori criterion for deciding whether a statistical or deterministic model of a system/response situation is more appropriate. The purpose of this paper is to discuss the manner in which a measure of disorder similar to those employed in other areas of technology might be calculated for problems in sound and vibration.


Entropy, 2018, Vol 20 (9), pp. 678
Author(s): Michail Vlysidis, Yiannis Kaznessis

Deterministic and stochastic models of chemical reaction kinetics can give starkly different results when the deterministic model exhibits more than one stable solution. For example, in the stochastic Schlögl model, the bimodal stationary probability distribution collapses to a unimodal distribution when the system size increases, even for kinetic constant values that result in two distinct stable solutions in the deterministic Schlögl model. Using the zero-information (ZI) closure scheme, an algorithm for solving chemical master equations, we compute stationary probability distributions for varying system sizes of the Schlögl model. With ZI closure, system sizes can be studied that were previously unattainable with stochastic simulation algorithms. We observe and quantify paradoxical discrepancies between stochastic and deterministic models and explain this behavior by postulating that the entropy of non-equilibrium steady states (NESS) is maximum.
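The deterministic side of this comparison reduces to the fixed points of a cubic rate equation. A sketch that locates the steady states of a Schlögl-type equation dx/dt = -x³ + bx² - cx + d and classifies their stability by the sign of the derivative; the coefficients below are not physical rate constants but are chosen purely so the cubic has three positive roots:

```python
# Steady states and stability of a bistable cubic rate equation.
import numpy as np

coeffs = [-1.0, 6.0, -11.0, 6.0]   # dx/dt = -x^3 + 6x^2 - 11x + 6
roots = np.sort(np.roots(coeffs).real)     # fixed points: dx/dt = 0

deriv = np.polyder(np.poly1d(coeffs))      # d(dx/dt)/dx
stable = [deriv(x) < 0 for x in roots]     # negative slope => stable
# roots ~ [1, 2, 3]; stability pattern [stable, unstable, stable]
```

The two outer (stable) fixed points are the deterministic counterparts of the two modes of the stochastic model's stationary distribution; the discrepancy the abstract describes is that the stochastic distribution can become unimodal even while both deterministic fixed points persist.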

