Can Subsurface Drip Irrigation (SDI) be a Competitive Irrigation System in the Great Plains Region for Commodity Crops?

2010 ◽  
Author(s):  
Freddie R Lamm ◽  
Paul D Colaizzi ◽  
James P Bordovsky ◽  
Todd P Trooien ◽  
Juan Enciso-Medina ◽  
...  
2017 ◽  
Vol 60 (3) ◽  
pp. 931-939 ◽  
Author(s):  
Freddie R. Lamm ◽  
Danny H. Rogers

Abstract. System longevity is an important economic factor in minimizing amortized investment costs for subsurface drip irrigation (SDI), especially when growing lower-value commodity crops such as field corn. Kansas State University established a research site in 1989 to study SDI. One research study area was used for continuous production of SDI corn for 27 seasons without dripline replacement. Normalized plot flowrates for 23 separate plots after 27 seasons were within ±5% of their first annually measured value. Hydraulic performance of the driplines and emitters was measured in the field and in the laboratory for excavated dripline samples after the SDI system was decommissioned in the fall of 2015. Results from the field and laboratory tests of the used driplines were similar, with excellent coefficients of variation (CV) of approximately 3%, lower quartile distribution uniformities (DUlq) of 96% to 97%, and Christiansen uniformity coefficients (UC) of approximately 98%. The performance of the excavated driplines was as good as or better than that of some unused driplines that had been in storage since 1990. Long SDI system life appears possible in the U.S. Central Great Plains when systems are properly designed, installed, and maintained. The long system life (27 seasons and 26.5 years) improves the economic competitiveness of SDI relative to alternative irrigation systems such as center-pivot sprinklers, which are currently the predominant irrigation system in the region. The SDI system was decommissioned at the end of the 2015 crop growing season because of leaks arising from breakdown of the plastic material, rather than because of clogging and a resulting loss of application uniformity. Keywords: Distribution uniformity, Drip irrigation, Flow variation, Microirrigation, Subsurface drip irrigation.
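The uniformity statistics cited above (CV, DUlq, and UC) can be computed directly from a set of measured emitter or plot flowrates. The following Python sketch uses their conventional definitions; the function name and the 23-plot flowrate data are hypothetical illustrations, not values from the study.

```python
import numpy as np

def uniformity_stats(flows):
    """Common microirrigation uniformity statistics from measured flowrates.

    flows : array-like of flowrates in any consistent unit (e.g., L/h).
    Returns (CV, DUlq, UC), each expressed as a percentage.
    """
    q = np.asarray(flows, dtype=float)
    mean = q.mean()

    # Coefficient of variation: relative spread of the flowrates.
    cv = 100.0 * q.std(ddof=1) / mean

    # Lower-quartile distribution uniformity: mean of the lowest 25% of
    # flowrates relative to the overall mean.
    n_lq = max(1, len(q) // 4)
    du_lq = 100.0 * np.sort(q)[:n_lq].mean() / mean

    # Christiansen uniformity coefficient: based on mean absolute deviation.
    uc = 100.0 * (1.0 - np.abs(q - mean).sum() / (len(q) * mean))

    return cv, du_lq, uc

# Hypothetical example: 23 plot flowrates scattered a few percent around 1.0 L/h.
rng = np.random.default_rng(0)
flows = 1.0 + rng.normal(0.0, 0.03, size=23)
cv, du_lq, uc = uniformity_stats(flows)
print(f"CV = {cv:.1f}%, DUlq = {du_lq:.1f}%, UC = {uc:.1f}%")
```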


2019 ◽  
Vol 25 (9) ◽  
pp. 41-53
Author(s):  
Heba Najem Abid ◽  
Maysoon Basheer Abid

The soil wetted pattern from a subsurface drip emitter is of great importance in the design of a subsurface drip irrigation (SDI) system for delivering the required water directly to the plant roots. Equations to estimate the dimensions of the wetted area in soil, taking into account water uptake by roots, were derived from numerical simulations using the HYDRUS (2D/3D) software. In this paper, three soil textures (loamy sand, sandy loam, and loam) were used with three crops (tomato, pepper, and cucumber, respectively), and different values of drip discharge, drip depth, and initial soil moisture content were considered. The soil wetting patterns were obtained every thirty minutes over a total irrigation time of three hours. Equations for wetted width and depth were fitted and evaluated using the statistical parameters of model efficiency (EF) and root mean square error (RMSE). The model efficiency was more than 95%, and the RMSE did not exceed 0.64 cm for the three soils. This shows that the developed equations can be used to describe the soil wetting pattern from an SDI system with good accuracy.
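For reference, the two goodness-of-fit statistics mentioned above have simple closed forms (assuming EF here is the usual Nash-Sutcliffe model efficiency): EF = 1 − Σ(Oi − Pi)² / Σ(Oi − Ō)², and RMSE is the square root of the mean squared difference between observed and predicted values. A minimal Python sketch with hypothetical wetted-width values (illustrative only, not data from the study):

```python
import numpy as np

def model_efficiency(observed, predicted):
    """Nash-Sutcliffe model efficiency (EF); 1.0 indicates a perfect fit."""
    obs = np.asarray(observed, dtype=float)
    pred = np.asarray(predicted, dtype=float)
    return 1.0 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(observed, predicted):
    """Root mean square error, in the same units as the data (here cm)."""
    obs = np.asarray(observed, dtype=float)
    pred = np.asarray(predicted, dtype=float)
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

# Hypothetical wetted-width values (cm): HYDRUS-simulated vs. fitted-equation
# estimates at half-hour intervals over a three-hour irrigation.
simulated = np.array([12.0, 16.5, 19.8, 22.4, 24.6, 26.5])
estimated = np.array([12.4, 16.1, 20.2, 22.0, 25.1, 26.9])
print(f"EF = {model_efficiency(simulated, estimated):.3f}, "
      f"RMSE = {rmse(simulated, estimated):.2f} cm")
```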


2004 ◽  
Vol 50 (2) ◽  
pp. 61-68 ◽  
Author(s):  
C. Choi ◽  
I. Song ◽  
S. Stine ◽  
J. Pimentel ◽  
C. Gerba

Two irrigation systems, subsurface drip irrigation and furrow irrigation, were tested to investigate the level of viral contamination and survival when tertiary effluent is used in arid and semi-arid regions. The effluent was injected with the bacteriophages PRD1 and MS2. Greater numbers of PRD1 and MS2 were recovered from lettuce in the subsurface drip-irrigated plots than in the furrow-irrigated plots. Shallow drip tape installation and preferential water paths through cracks in the soil surface appeared to be the main causes of the high viral contamination in the subsurface drip irrigation plots, as they allowed direct contact between the lettuce stems and irrigation water that penetrated to the soil surface. The water use efficiency of the subsurface drip irrigation system was higher than that of the furrow irrigation system. Thus, subsurface drip irrigation is an efficient irrigation method for vegetable crops in arid and semi-arid regions, provided viral contamination can be reduced. Deeper installation of drip tapes, frequent irrigations, and timely harvests based on cumulative heat units may further reduce health risks by ensuring viral die-off under various field conditions.


2018 ◽  
Vol 34 (1) ◽  
pp. 213-221 ◽  
Author(s):  
Steven R. Evett ◽  
Gary W. Marek ◽  
Paul D. Colaizzi ◽  
Brice B. Ruthardt ◽  
Karen S. Copeland

Abstract. Large, precision weighing lysimeters can have accuracies as good as 0.04 mm equivalent depth of water, adequate for hourly and even half-hourly determinations of crop evapotranspiration (ET) rate. Such data are important for testing and improving simulation models of the complex interactions of surface water and energy balances, soil physics, plant growth, and biophysics that determine crop ET in response to rapid microclimate dynamics. When crops are irrigated with sprinkler systems or other rapid applications of water, the irrigation event is typically short enough that little ET data are compromised by the lysimeter mass change due to irrigation. In contrast, subsurface drip irrigation (SDI) systems may take many hours to apply an irrigation, during which time the lysimeter mass change is affected by both the ET rate and the irrigation application rate. Because the irrigation application rate can be affected by pressure dynamics of the irrigation system, emitter clogging, and water viscosity changes with temperature over several-hour periods, it can be difficult or even impossible to separate the ET signal from the interference of the irrigation application. The resulting inaccuracies can be important, particularly for comparisons of sprinkler and SDI systems, since they are on the order of 8% to 10% of daily ET. We developed an SDI system to apply irrigations of up to 50 mm to large weighing lysimeters while limiting the period of lysimeter mass change due to irrigation delivery to approximately ten minutes, by storing the water needed for irrigation in tanks suspended from the lysimeter weighing system. The system applied water at the same rate as the SDI system in the surrounding field and allowed irrigations of any duration, often exceeding 12 h and including overnight applications, without directly affecting lysimeter mass change or the accuracy of ET rate determinations. Errors in lysimeter ET measurements using the previous SDI system, which was directly connected to the field irrigation system, were up to 10% of daily ET, compared with negligible error using the new system. Errors with the previous, directly connected SDI system varied over time due to variable system pressure, and possibly due to water temperature (viscosity) changes and emitter clogging. With the new system, all of the water transferred to the lysimeter weighing system was eventually applied by the SDI system regardless of temperature, pressure, or emitter clogging. Differences between planned and applied irrigation depths were less than 2% over the irrigation season. Keywords: Evapotranspiration, ET, Subsurface drip irrigation, SDI, Weighing lysimeter.
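As a rough illustration of the mass-to-depth conversion underlying these measurements: dividing a lysimeter mass change by the product of water density and lysimeter surface area gives the equivalent depth of water. The sketch below assumes a hypothetical 9 m² lysimeter surface area (not stated in the abstract) to show how the quoted 0.04 mm resolution maps onto a measurable mass change.

```python
# Minimal sketch with assumed values: converting a weighing-lysimeter mass change
# to an equivalent depth of water. With irrigation water pre-stored in tanks
# suspended from the weighing system, delivery through the SDI driplines does not
# change the total weighed mass, so mass loss over an interval reflects ET alone.

WATER_DENSITY = 1000.0   # kg/m^3
LYSIMETER_AREA = 9.0     # m^2, hypothetical lysimeter surface area (assumed)

def mass_to_depth_mm(delta_mass_kg, area_m2=LYSIMETER_AREA):
    """Convert a lysimeter mass change (kg) to equivalent water depth (mm)."""
    depth_m = delta_mass_kg / (WATER_DENSITY * area_m2)
    return depth_m * 1000.0

# Example: a 0.36 kg mass loss on a 9 m^2 lysimeter corresponds to 0.04 mm of
# water, matching the measurement resolution quoted in the abstract.
print(f"Equivalent depth: {mass_to_depth_mm(0.36):.2f} mm")
```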

