Modeling N Concentration and Uptake for Maize Hybrids under Growth Stage-Based Deficit Irrigations

2017 ◽  
Vol 60 (6) ◽  
pp. 2067-2081 ◽  
Author(s):  
Quanxiao Fang ◽  
Liwang Ma ◽  
Thomas J. Trout ◽  
Louise H. Comas ◽  
Kendall C. DeJonge ◽  
...  

Abstract. Current maize hybrids have lower critical aboveground biomass nitrogen (N) concentration (TCNP) and grain N concentration (GNC) compared to older hybrids, but few crop models have incorporated this trend. The objective of this study was to evaluate alternative algorithms for calculating TCNP (biomass-based method) and GNC (grain N demand based on N dilution curve) for predicting crop N concentration and N uptake for a current maize hybrid in the CERES-Maize model as implemented in the Root Zone Water Quality Model (RZWQM). Experimental data were obtained from a field study on maize irrigated to meet various percentages (40% to 100%) of evapotranspiration demand at both vegetative and reproductive stages in 2012 and 2013 in Greeley, Colorado. The original RZWQM showed little response of aboveground N concentration (AGBNC) to the irrigation treatments and overpredicted GNC in both years. As a result, crop N uptake was generally overpredicted, with root mean square error (RMSE) values of 28 to 60 kg N ha-1 for the two years. Adjusting the coefficients of the original TCNP and GNC algorithms (RZWQM_ADJ) effectively reduced the overprediction of GNC, but with less improvement in the response to the irrigation treatments in 2013 compared with the original RZWQM simulations. The RZWQM with modified TCNP and GNC algorithms simulated lower GNC and AGBNC than the original version, significantly improved the responses to the irrigation treatments, and captured the variations in measured GNC among seasons. The corresponding crop N uptake simulations improved more in 2012 than in 2013, with RMSE values of 16 to 32 kg N ha-1, lower than those of the original and RZWQM_ADJ versions. The better prediction of grain N uptake by the alternative algorithms could help improve crop N management decisions under different deficit irrigation conditions. Keywords: CERES-Maize, Crop N concentration, Crop N demand, Crop N uptake, Deficit irrigation, Maize hybrid, RZWQM.
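The modified TCNP algorithm described above is biomass-based, i.e., the critical N concentration declines as aboveground biomass accumulates (an N dilution curve). Below is a minimal sketch of that general idea, using the classic power-law form Nc = a·W^(−b) with the widely cited Plénet and Lemaire coefficients for maize; these are illustrative placeholders, not the coefficients of the modified RZWQM algorithms, and the N-demand calculation is purely for demonstration.

```python
# Sketch of a biomass-based critical N dilution curve, Nc = a * W^(-b).
# Coefficients a and b are the Plenet & Lemaire values often used for maize;
# they are NOT the values of the modified RZWQM TCNP algorithm in the abstract.

def critical_n_concentration(biomass_t_ha, a=3.4, b=0.37, nc_max=3.4):
    """Critical shoot N concentration (% of dry matter) for a given
    aboveground biomass (t DM/ha); capped at nc_max below ~1 t/ha."""
    if biomass_t_ha <= 1.0:
        return nc_max
    return a * biomass_t_ha ** (-b)

def crop_n_demand(biomass_t_ha, current_n_pct):
    """Illustrative N demand (kg N/ha) to bring the crop up to the critical curve."""
    nc = critical_n_concentration(biomass_t_ha)
    deficit_pct = max(nc - current_n_pct, 0.0)
    return deficit_pct / 100.0 * biomass_t_ha * 1000.0  # t/ha -> kg/ha

if __name__ == "__main__":
    for w in (0.5, 2.0, 5.0, 10.0, 15.0):   # hypothetical biomass levels, t DM/ha
        print(f"W = {w:5.1f} t/ha  Nc = {critical_n_concentration(w):.2f} %")
```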

HortScience ◽  
2012 ◽  
Vol 47 (12) ◽  
pp. 1768-1774 ◽  
Author(s):  
Thomas G. Bottoms ◽  
Richard F. Smith ◽  
Michael D. Cahn ◽  
Timothy K. Hartz

As concern over NO3-N pollution of groundwater increases, California lettuce growers are under pressure to improve nitrogen (N) fertilizer efficiency. Crop growth, N uptake, and the value of soil and plant N diagnostic measures were evaluated in 24 iceberg and romaine lettuce (Lactuca sativa L. var. capitata L., and longifolia Lam., respectively) field trials from 2007 to 2010. The reliability of pre-sidedress soil nitrate testing (PSNT) to identify fields in which N application could be reduced or eliminated was evaluated in 16 non-replicated strip trials and five replicated trials on commercial farms. All commercial field sites had greater than 20 mg·kg−1 residual soil NO3-N at the time of the first in-season N application. In the strip trials, plots in which the cooperating growers’ initial sidedress N application was eliminated or reduced were compared with the growers’ standard N fertilization program. In the replicated trials, the growers’ N regime was compared with treatments in which one or more N fertigations through drip irrigation were eliminated. Additionally, seasonal N rates from 11 to 336 kg·ha−1 were compared in three replicated drip-irrigated research farm trials. Seasonal N application in the strip trials was reduced by an average of 77 kg·ha−1 (73 kg·ha−1 vs. 150 kg·ha−1 for the grower N regime) with no reduction in fresh biomass produced and only a slight reduction in crop N uptake (151 kg·ha−1 vs. 156 kg·ha−1 for the grower N regime). Similarly, an average seasonal N rate reduction of 88 kg·ha−1 (96 kg·ha−1 vs. 184 kg·ha−1) was achieved in the replicated commercial trials with no biomass reduction. Seasonal N rates between 111 and 192 kg·ha−1 maximized fresh biomass in the research farm trials, which were conducted in fields with lower residual soil NO3-N than the commercial trials. Across fields, lettuce N uptake was slow in the first 4 weeks after planting, averaging less than 0.5 kg·ha−1·d−1. N uptake then increased linearly until harvest (≈9 weeks after planting), averaging ≈4 kg·ha−1·d−1 over that period. Whole plant critical N concentration (Nc, the minimum whole plant N concentration required to maximize growth) was estimated by the equation Nc (g·kg−1) = 42 − 2.8 × dry mass (DM, Mg·ha−1); on that basis, critical N uptake (crop N uptake required to maintain whole plant N above Nc) in the commercial fields averaged 116 kg·ha−1 compared with the mean uptake of 145 kg·ha−1 with the grower N regime. Soil NO3-N greater than 20 mg·kg−1 was a reliable indicator that N application could be reduced or delayed. Neither leaf N nor midrib NO3-N was correlated with concurrently measured soil NO3-N, and both were therefore of limited value in directing in-season N fertilization.
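The critical N relationship reported above is linear in dry mass. The short worked example below assumes only the equation given in the abstract, Nc (g·kg−1) = 42 − 2.8·DM, plus a hypothetical dry mass value; critical N uptake is taken as Nc × DM, as the abstract defines it.

```python
# Sketch of the whole-plant critical N relationship reported above:
# Nc (g N per kg DM) = 42 - 2.8 * DM, with DM in Mg/ha.
# Critical N uptake is then Nc * DM (g/kg times Mg/ha gives kg N/ha).

def critical_n_concentration(dm_mg_ha):
    """Minimum whole-plant N concentration (g/kg) required to maximize growth."""
    return 42.0 - 2.8 * dm_mg_ha

def critical_n_uptake(dm_mg_ha):
    """Crop N uptake (kg N/ha) needed to stay at the critical concentration."""
    return critical_n_concentration(dm_mg_ha) * dm_mg_ha  # g/kg * Mg/ha = kg/ha

if __name__ == "__main__":
    dm = 4.0   # hypothetical lettuce dry mass at harvest, Mg/ha (for illustration only)
    print(f"Nc = {critical_n_concentration(dm):.1f} g/kg, "
          f"critical uptake = {critical_n_uptake(dm):.0f} kg N/ha")
```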


2020 ◽  
Vol 63 (6) ◽  
pp. 2003-2020
Author(s):  
Maria I. Zamora Re ◽  
Sagarika Rath ◽  
Michael D. Dukes ◽  
Wendy Graham

Highlights:
DSSAT simulations of final N uptake, biomass, and yield for a maize-peanut rotational field experiment with three irrigation treatments and three N fertilizer rates had good performance for the irrigated treatments (average nRMSE of 9%) but greater error for the rainfed treatments (average nRMSE of 15%).
Experiments and DSSAT simulations demonstrated that N fertilizer and irrigation applications were reduced by 26% and 60%, respectively, when using a 247 kg N ha-1 fertilizer rate and a sensor-based irrigation schedule rather than conventional practices of 336 kg N ha-1 and a calendar-based irrigation method, with no impact on yield.
Simulations demonstrated that N leaching during the crop rotation was reduced by 37% when an N fertilizer rate of 247 kg N ha-1 and sensor-based irrigation scheduling were used versus conventional practices.
Soil N increased (≈15 mg kg-1) when maize and peanut residues decayed and then leached during the fallow season. Cover or cash crops planted immediately after the maize and peanut harvests have potential to take up this N and reduce leaching.

Abstract. Nitrogen (N) is an essential element for crop growth and yield; however, excessive N applications not taken up by crops can result in N leaching from the root zone, increasing N loads to waterbodies and leading to a host of environmental problems. The main objective of this study was to simulate water and N balances for a maize-peanut (Zea mays L. and Arachis hypogaea L.) rotational field experiment with three irrigation treatments and three N fertilizer rates. The irrigation treatments consisted of mimicking grower irrigation practices in the region (GROW), using soil moisture sensors to schedule irrigation (SMS), and non-irrigated (NON). The N fertilizer rates were low, medium, and high (157, 247, and 336 kg N ha-1, respectively) for maize with a constant 17 kg ha-1 for all peanut treatments. DSSAT maize genetic coefficients were calibrated using the SMS-high treatment combination under the assumption of no water or N stress. The other eight treatment combinations were used as independent data for model validation of the crop coefficients. All soil hydrologic parameters were specified based on measured values, and default DSSAT peanut genetic coefficients were used with no calibration. For the irrigated treatments, DSSAT models had good performance for N uptake, biomass, and yield (average nRMSE of 8%) and moderate performance for soil water content (average nRMSE of 18%). Soil nitrate RMSE was 21% lower than the standard deviation of the observed data (5.8 vs. 7.2 mg kg-1). For the rainfed treatments, DSSAT had greater error (average nRMSE of 15% for N uptake, biomass, and yield, and average nRMSE of 31% for soil water). Soil nitrate RMSE was 11% greater than the standard deviation of the observed data (8.0 vs. 7.2 mg kg-1), and nRMSE was >30% during the crop rotation. Simulations estimated that N leaching over the crop rotation was reduced by 24% on average when using the 247 kg N ha-1 fertilizer rate compared to 336 kg N ha-1 across the irrigation treatments. Furthermore, N leaching was reduced by 37% when using SMS to schedule irrigation and the 247 kg N ha-1 fertilizer rate for maize and 17 kg N ha-1 for peanut compared to conventional practices (GROW and 336 kg N ha-1 for maize and 17 kg N ha-1 for peanut). Moreover, this management practice reduced N fertilizer use by 26% and irrigation water use by up to 60% without negative impacts on yield. 
Observed and simulated soil N increased during maize and peanut residue decay, with simulations estimating that this soil N would leach below the root zone during the fallow season. This leaching could potentially be reduced if a cover crop or cash crop were planted between the maize and peanut crops to take up the mineralized N. Keywords: Agricultural best management practices, Bare fallow, BMPs, Maize-peanut rotation, N balance, N fertilization, N leaching, Sandy soils, Sensor-based irrigation scheduling, Water balance.
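Model performance above is summarized with nRMSE. The sketch below shows that statistic under the common convention of normalizing RMSE by the mean of the observed data (the abstract does not state which normalization the authors used); the observed and simulated values are hypothetical.

```python
import math

def rmse(observed, simulated):
    """Root mean square error between paired observed and simulated values."""
    n = len(observed)
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, simulated)) / n)

def nrmse_percent(observed, simulated):
    """RMSE normalized by the observed mean, in percent. Normalization by the
    mean is a common convention; the study may have used a different one."""
    return 100.0 * rmse(observed, simulated) / (sum(observed) / len(observed))

if __name__ == "__main__":
    obs = [210.0, 180.0, 240.0, 200.0]   # hypothetical final N uptake, kg N/ha
    sim = [200.0, 195.0, 225.0, 210.0]
    print(f"nRMSE = {nrmse_percent(obs, sim):.1f}%")
```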


Agronomy ◽  
2019 ◽  
Vol 9 (5) ◽  
pp. 241 ◽  
Author(s):  
Quanxiao Fang ◽  
L. Ma ◽  
R. D. Harmel ◽  
Q. Yu ◽  
M. W. Sima ◽  
...  

An important but rarely studied aspect of crop modeling is the uncertainty associated with model calibration and its effect on model prediction. Biomass and grain yield data from a four-year maize experiment (2008–2011) with six irrigation treatments were divided into subsets by either treatment (Calibration-by-Treatment) or year (Calibration-by-Year). These subsets were then used to calibrate crop cultivar parameters in CERES (Crop Environment Resource Synthesis)-Maize implemented within RZWQM2 (Root Zone Water Quality Model 2) using the automatic Parameter ESTimation (PEST) algorithm to explore model calibration uncertainties. After calibration for each subset, PEST also generated 300 cultivar parameter sets by assuming a normal distribution for each parameter within its reported range in the literature, using the Latin hypercube sampling (LHS) method. The parameter sets that produced similar goodness of fit (11 to 164, depending on the subset used for calibration) were then used to predict all the treatments and years of the entire dataset. Our results showed that the selection of calibration datasets greatly affected the calibrated crop parameters and their uncertainty, as well as the prediction uncertainty of grain yield and biomass. The high variability in model prediction of grain yield and biomass among the six (Calibration-by-Treatment) or four (Calibration-by-Year) scenarios indicated that parameter uncertainty should be considered when calibrating CERES-Maize with grain yield and biomass data from different irrigation treatments, and model predictions should be provided with confidence intervals.
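The 300 candidate cultivar parameter sets described above were generated by Latin hypercube sampling from assumed normal distributions. The sketch below illustrates that sampling step with SciPy; the six coefficient names are the conventional CERES-Maize cultivar coefficients, but the means and standard deviations are illustrative placeholders, not the distributions used in the study.

```python
import numpy as np
from scipy.stats import norm, qmc

# Latin hypercube sampling of cultivar parameter sets from assumed normal
# distributions. Means and standard deviations below are illustrative only.
params = {            # (mean, std) -- placeholder values, not the study's
    "P1":    (230.0, 30.0),   # thermal time, emergence to end of juvenile phase
    "P2":    (0.5,   0.2),    # photoperiod sensitivity
    "P5":    (800.0, 80.0),   # thermal time, silking to maturity
    "G2":    (750.0, 100.0),  # potential kernel number per plant
    "G3":    (8.0,   1.0),    # potential kernel growth rate (mg/day)
    "PHINT": (45.0,  4.0),    # phyllochron interval (degree days)
}

sampler = qmc.LatinHypercube(d=len(params), seed=42)
u = sampler.random(n=300)                      # uniform LHS samples in (0, 1)

means = np.array([m for m, _ in params.values()])
stds = np.array([s for _, s in params.values()])
samples = norm.ppf(u) * stds + means           # map to the assumed normals

print(samples.shape)   # (300, 6): one row per candidate cultivar parameter set
```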


2010 ◽  
Vol 148 (5) ◽  
pp. 593-602 ◽  
Author(s):  
J. C. MELGAR ◽  
J. M. DUNLOP ◽  
J. P. SYVERTSEN

SUMMARY. The effects of deficit irrigation (DI) and partial rootzone drying (PRD) on the growth and mineral nutrition of citrus rootstock seedlings in the glasshouse were determined, as well as the potential of DI and PRD to trigger root-to-shoot signalling of abscisic acid (ABA) and increase growth per unit of water used (water use efficiency, WUE). In the DI study, 3-month-old seedlings of the important citrus rootstock Swingle citrumelo with intact roots received three irrigation treatments: control (1·00 evapotranspiration (ET)), 0·75 ET and 0·50 ET. DI clearly decreased growth, the net assimilation of CO2 (ACO2), WUE and the total content of N and K in leaves, even though concentrations of leaf N and K were increased in the drought-stressed smaller plants. Root K was not affected by DI treatments. Leaf ABA concentration increased linearly with DI. For the PRD study, root systems of 6-month-old Swingle citrumelo were split in half and allowed to become established in adjacent pots. There were three irrigation treatments: control (1·00 of the total crop ET, 0·50 in each pot), PRD 50-0 (0·50 ET by weight applied to only one half of the root zone) and DI 25-25 (0·50 ET in total, with 0·25 ET applied to each root half). Although the total root length was decreased by the DI 25-25 treatment, PRD 50-0 did not affect any growth characteristics compared to control plants. The dry root zone of the PRD 50-0 treatment had a higher specific root length (greater root length per unit dry weight) than the wet root zone. Leaf ACO2 and WUE of the DI 25-25 treatment were significantly lower than those of control plants after 11 weeks. Although the total contents of N and K in leaves were not affected by either PRD treatment, the concentrations of N and K in leaves were increased by DI 25-25. Root K was decreased by PRD treatments. Leaf ABA concentration was increased by PRD 50-0 but not by DI 25-25. Although all drought stress treatments increased the levels of ABA in leaves, DI and PRD treatments did not affect the whole-plant WUE. Compared to well-irrigated control plants, DI reduced growth, whereas PRD 50-0 did not.


Soil Research ◽  
1999 ◽  
Vol 37 (3) ◽  
pp. 575 ◽  
Author(s):  
C. A. Russell ◽  
I. R. P. Fillery

The rate of decomposition of 15N-labelled lupin (Lupinus angustifolius) stubble and the use of mineralised 15N by wheat were determined in field experiments on a deep loamy sand previously cropped to lupin. In one experiment, leaf, stem, and pod (pod-valve) components were applied separately to mini-plots that were either left unplanted or subsequently planted to wheat. In the second experiment, leaf and stem components, each of either low or high N concentration, were applied separately to mini-plots which were subsequently planted to wheat. Soil was recovered in layers to a maximum depth of 1 m and subsequently analysed for 15N in NH4+, NO3-, and total N. The net mineralisation of stubble 15N was estimated from the decrease in soil organic 15N (total 15N – inorganic 15N), and the uptake of 15N by wheat was measured periodically. All treatments were characterised by the high retention of lupin stubble 15N in the soil organic matter. Between 9 and 34% of stem and pod 15N, and 19–49% of leaf 15N, was mineralised within a 10-month period. From these data the annual net mineralisation of a typical lupin stubble was estimated at 25–42 kg N/ha, an N benefit similar to that estimated from agronomic trials. Wheat uptake of lupin-stubble 15N ranged from 9 to 27%. Of the stubble components, only the leaf contained sufficient quantities of mineralisable N to be an important source of N for wheat. At wheat maturity in the first experiment, losses of stubble 15N ranged from 13% (leaf) to 7% (stem). In the second experiment, losses of 15N were only observed from the high N treatments (leaf 8%, stem 15·5%). Stubble component chemistry appeared to affect net mineralisation and plant uptake differently. Across both experiments, annual net mineralisation best correlated (R = 0·69) with the N concentration of the stubble components. Wheat N uptake was strongly positively correlated with polysaccharide content (R = 0·89) but negatively correlated with lignin content (R = – 0·79). Although large quantities (58 and 98 kg N/ha) of soil-derived inorganic N were found in the root-zone (–1·0 m) of wheat sown after lupins, and attributed to the decomposition of lupin root systems and surface residues prior to the establishment of each experiment, it is concluded that the short-term decomposition of lupin stubble 15N results in a modest release of inorganic N. Consequently, the primary value of lupin stubble in the N economy of lupin : cereal rotations is to replenish the soil organic N reserve.
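Net mineralisation above is inferred from a 15N mass balance: organic 15N is total 15N minus inorganic 15N, and the decrease in organic 15N between samplings is attributed to mineralisation of the labelled stubble. Below is a minimal sketch of that bookkeeping with hypothetical pool sizes.

```python
# Sketch of the 15N bookkeeping described above. All pool sizes are
# hypothetical, for illustration only (kg 15N/ha).

def organic_15n(total_15n, nh4_15n, no3_15n):
    """Soil organic 15N = total 15N minus inorganic 15N (NH4+ plus NO3-)."""
    return total_15n - (nh4_15n + no3_15n)

def net_mineralisation(org_15n_initial, org_15n_final):
    """Net mineralisation of stubble 15N = decrease in soil organic 15N."""
    return org_15n_initial - org_15n_final

initial = organic_15n(total_15n=5.0, nh4_15n=0.1, no3_15n=0.2)   # at stubble application
final = organic_15n(total_15n=4.6, nh4_15n=0.3, no3_15n=0.9)     # 10 months later
print(f"Net mineralisation: {net_mineralisation(initial, final):.2f} kg 15N/ha")
```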


2015 ◽  
Vol 66 (10) ◽  
pp. 993 ◽  
Author(s):  
Attila Yazar ◽  
Çigdem Incekaya ◽  
S. Metin Sezen ◽  
Sven-Erik Jacobsen

Field experiments were set up in order to evaluate the yield response of quinoa (Chenopodium quinoa Willd. cv. Titicaca) to irrigation with saline and fresh water under Mediterranean climate from 2010 to 2012 in Adana, Turkey. Irrigation treatments in 2010 and 2011 comprised full irrigation with fresh water, full irrigation with saline water of different salt concentrations (40, 30, 20, 10 dS m–1), deficit irrigations with fresh water (50%, 75% of full irrigation), partial root-zone drying, and deficit irrigation with saline water of 40 dS m–1 (50%). In 2012, in addition to the full irrigation treatments, two deficit irrigation levels of 67% and 33% of full irrigation with fresh or saline (30, 20, 10 dS m–1) water were considered. The results indicated that grain yields were slightly reduced by irrigation water salinity up to 30 dS m–1 compared with fresh water irrigation. Salinity and drought stress together interfered considerably with crop grain and biomass yields. However, salinity stress alone did not interfere with grain and biomass yield significantly; therefore, quinoa may be defined as a crop tolerant to salinity. Yield parameters such as aboveground biomass, seed yield and harvest index suggested a good adaptation of quinoa cv. Titicaca to Mediterranean environments.


2019 ◽  
Vol 70 (2) ◽  
pp. 301 ◽  
Author(s):  
H. Kirnak ◽  
H. A. Irik ◽  
O. Sipahioglu ◽  
A. Unlukara

In the present study, pumpkin (Cucurbita pepo L.) was grown under water stress to determine its effects on the chemical composition of the seeds (i.e., oil, protein, fatty acids and vitamin E), in Kayseri, Turkey. Irrigation treatments were designed to supply different portions of depleted moisture within the effective root zone of the plants (60 cm). The treatments were arranged as supplying 100% (I100), 80% (I80), 60% (I60), 40% (I40), 20% (I20) and 0% (I0) of depleted moisture through a drip irrigation system. The effects of irrigation levels on the oil content of pumpkin seeds were found to be significant (p < 0.01). The oil contents across irrigation treatments varied between 26% (I0, dry) and 64% (I100, full irrigation). However, the effects of deficit irrigation on protein, fatty acid and vitamin E contents were not found to be significant. The vitamin E contents varied from 41.6 to 55.3 mg/100 g, while the protein contents varied from 28.5% to 37.7%. Six different fatty acids (linolenic, linoleic, oleic, stearic, palmitic and myristic acid) were examined. The average concentrations of palmitic, stearic, oleic and linoleic acids ranged from 10.7–12.6%, 6.4–10.4%, 39.6–48.9% and 32.4–35%, respectively. Myristic and linolenic acids were not detected in the pumpkin seeds.


Agronomy ◽  
2020 ◽  
Vol 10 (9) ◽  
pp. 1297 ◽  
Author(s):  
Branimir Urlić ◽  
Marko Runjić ◽  
Marija Mandušić ◽  
Katja Žanić ◽  
Gabriela Vuletin Selak ◽  
...  

The tomato is an important horticultural crop, the cultivation of which is often under the influence of abiotic and biotic stressors. Grafting is a technique used to alleviate these problems. Shortage of water has stimulated the introduction of new irrigation methods: deficit irrigation (DI) and partial root-zone drying (PRD). This study was conducted in two spring–summer season experiments to evaluate the effects of three irrigation regimes (full irrigation (FI), PRD and DI) on vegetative growth, leaf gas-exchange parameters, yield, water-use efficiency (WUE), nutrient profile and fruit quality of grafted tomatoes. In both years, the commercial rootstocks Emperador and Maxifort were used. In the first year, the scion cultivar Clarabella was grown on one stem and in the second year the cultivar Attiya was grown on two stems. Self-grafted cultivars were grown as a control. In both experiments, higher values of vegetative traits (leaf area and number, height, shoot biomass) were recorded in the plants grafted on commercial rootstocks. The stomatal conductance and transpiration rate were higher under FI. Under DI, transpiration was lowest and photosynthetic WUE was highest. Photosynthetic rate changed between irrigation treatments depending on plant type. In both years, the total yield was highest in grafted plants as a result of more and bigger fruits per plant. In the 2nd year, grafted plants had higher yield under FI than under PRD, but not DI, while self-grafted plants did not differ between irrigation treatments. WUE was highest in DI and PRD treatments and in grafted plants. Leaf N, P, K and Ca were highest in the plants grafted on Emperador and Maxifort, while more Mg was measured in self-grafted plants. More Ca and Mg were recorded in the plants under DI and PRD. Fruit mineral concentrations were higher in the plants grafted on commercial rootstocks. Total soluble solids differed between irrigation treatments depending on plant type, while fruit total acidity was higher in Emperador and Maxifort. In conclusion, our study showed that grafted plants could be grown under DI with minor yield reduction and 30–40% less water used for irrigation. Moderate DI could be preferred over PRD for cultivation of grafted tomato, and double-stemmed plants did not show a negative effect on tomato yield, so they can be used as standard under reduced irrigation.


2002 ◽  
Vol 12 (2) ◽  
pp. 250-256 ◽  
Author(s):  
Hudson Minshew ◽  
John Selker ◽  
Delbert Hemphill ◽  
Richard P. Dick

Predicting leaching of residual soil nitrate-nitrogen (NO3-N) in wet climates is important for reducing risks of groundwater contamination and conserving soil N. The goal of this research was to determine the potential to use easily measurable or readily available soil, climate, and plant data that could be put into simple computer models and used to predict NO3-N leaching under various management systems. Two computer programs were compared for their potential to predict monthly NO3-N leaching losses in western Oregon vegetable systems with or without cover crops. The models were a statistical multiple linear regression (MLR) model and the commercially available Nitrate Leaching and Economic Analysis Package model (NLEAP 1.13). The best MLR model found using stepwise regression to predict annual leachate NO3-N had four independent variables (log-transformed fall soil NO3-N, leachate volume, summer crop N uptake, and N fertilizer rate) (P < 0.001, R2 = 0.57). Comparisons were made between NLEAP and field data for mass of NO3-N leached between the months of September and May from 1992 to 1997. Predictions with NLEAP showed greater correlation to observed data during high-rainfall years than during dry or average-rainfall years. The model was found to be sensitive to yield estimates, but the available vegetation management choices were limiting for vegetable crops and for systems that included a cover crop.
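Below is a minimal sketch of the MLR approach described above: ordinary least squares with the four predictors named in the abstract (log-transformed fall soil NO3-N, leachate volume, summer crop N uptake, and N fertilizer rate). The data rows are synthetic placeholders, not the Oregon field observations.

```python
import numpy as np

# columns: fall soil NO3-N (mg/kg), leachate volume (mm),
#          summer crop N uptake (kg/ha), N fertilizer rate (kg/ha)
X_raw = np.array([
    [12.0, 450.0, 120.0, 150.0],
    [25.0, 600.0, 180.0, 220.0],
    [ 8.0, 380.0,  90.0, 100.0],
    [30.0, 700.0, 200.0, 250.0],
    [18.0, 520.0, 150.0, 180.0],
])
y = np.array([35.0, 60.0, 20.0, 75.0, 45.0])   # leached NO3-N, kg/ha (synthetic)

X = X_raw.copy()
X[:, 0] = np.log10(X[:, 0])                    # log-transform fall soil NO3-N
X = np.column_stack([np.ones(len(X)), X])      # add intercept column

coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # ordinary least squares fit
pred = X @ coef
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(coef, 3), " R^2 =", round(r2, 3))
```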


2021 ◽  
Vol 13 (5) ◽  
pp. 954
Author(s):  
Abhilash K. Chandel ◽  
Lav R. Khot ◽  
Behnaz Molaei ◽  
R. Troy Peters ◽  
Claudio O. Stöckle ◽  
...  

Site-specific irrigation management for perennial crops such as grape requires water use assessments at high spatiotemporal resolution. In this study, small unmanned-aerial-system (UAS)-based imaging was used with a modified mapping evapotranspiration at high resolution with internalized calibration (METRIC) energy balance model to map water use (UASM-ET approach) of a commercial, surface, and direct-root-zone (DRZ) drip-irrigated vineyard. Four irrigation treatments, 100%, 80%, 60%, and 40%, of commercial rate (CR) were also applied, with the CR estimated using soil moisture data and a non-stressed average crop coefficient of 0.5. Fourteen campaigns were conducted in the 2018 and 2019 seasons to collect multispectral (ground sampling distance (GSD): 7 cm/pixel) and thermal imaging (GSD: 13 cm/pixel) data. Six of those campaigns were near Landsat 7/8 satellite overpass of the field site. Weather inputs were obtained from a nearby WSU-AgWeatherNet station (1 km). First, UASM-ET estimates were compared to those derived from soil water balance (SWB) and conventional Landsat-METRIC (LM) approaches. Overall, UASM-ET (2.70 ± 1.03 mm day−1 [mean ± std. dev.]) was higher than SWB-ET (1.80 ± 0.98 mm day−1). However, both estimates had a significant linear correlation (r = 0.64–0.81, p < 0.01). For the days of satellite overpass, UASM-ET was statistically similar to LM-ET, with mean absolute normalized ET departures (ETd,MAN) of 4.30% and a mean r of 0.83 (p < 0.01). The study also extracted spatial canopy transpiration (UASM-T) maps by segmenting the soil background from the UASM-ET, which had strong correlation with the estimates derived by the standard basal crop coefficient approach (Td,MAN = 14%, r = 0.95, p < 0.01). The UASM-T maps were then used to quantify water use differences in the DRZ-irrigated grapevines. Canopy transpiration (T) was statistically significant among the irrigation treatments and was highest for grapevines irrigated at 100% or 80% of the CR, followed by 60% and 40% of the CR (p < 0.01). Reference T fraction (TrF) curves established from the UASM-T maps showed a notable effect of irrigation treatment rates. The total water use of grapevines estimated using interpolated TrF curves was highest for treatments of 100% (425 and 320 mm for the 2018 and 2019 seasons, respectively), followed by 80% (420 and 317 mm), 60% (391 and 318 mm), and 40% (370 and 304 mm) of the CR. Such estimates were within 5% to 11% of the SWB-based water use calculations. The UASM-T-estimated water use was not the same as the actual amount of water applied in the two seasons, probably because DRZ-irrigated vines might have developed deeper or lateral roots to fulfill water requirements outside the irrigated soil volume. Overall, results highlight the usefulness of high-resolution imagery toward site-specific water use management of grapevines.
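The commercial irrigation rate (CR) above was set from soil moisture data and a non-stressed average crop coefficient of 0.5. The sketch below shows the standard crop-coefficient calculation (ETc = Kc × reference ET) that such a rate implies; the daily reference ET values are hypothetical, for illustration only.

```python
# Standard crop-coefficient approach: crop ET (ETc) = Kc * reference ET (ETr).
# Kc = 0.5 is the non-stressed average crop coefficient stated in the abstract;
# the daily reference ET values below are hypothetical.

KC = 0.5
etr_daily_mm = [6.8, 7.1, 6.5, 7.4, 6.9]     # hypothetical reference ET (mm/day)

etc_daily_mm = [KC * etr for etr in etr_daily_mm]
print("daily ETc (mm):", [round(v, 2) for v in etc_daily_mm])
print("5-day crop water use (mm):", round(sum(etc_daily_mm), 1))
```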

