Thursday, 13 July 2006
70-3

Water and Energy Balance of Drip Irrigated Cotton: Measurements and Simulations.

Robert J. Lascano1, Bobbie McMichael2, Dennis Gitz2, and Jill Booker1. (1) Texas A&M Univ. Res. and Ext. Center, 3810 4th St., Lubbock, TX 79415, (2) USDA-ARS, 3810 4th Street, Lubbock, TX 79415

In recent years cotton (Gossypium hirsutum L.) production with sub-surface drip irrigation (SDI) has gained popularity in the Texas High Plains. The SDI system consists of drip tape in alternate 1-m rows, with the emitters 0.2–0.4 m below the bottom of the furrows. The region has a semiarid climate, and declining irrigation well capacities are reducing the water available from the Ogallala Aquifer. More efficient irrigation systems such as SDI may extend the sustainability of irrigation in the Texas High Plains, but frequent droughts combined with reduced well capacities mean that the daily water requirements of crops often cannot be met, so deficit irrigation and water stress are common. Average annual rainfall is 460 mm; many rain events are of high intensity and short duration, and most of the rain falls during the growing season (Apr.–Sept.). During the growing season, and in periods without rain, we speculate that cotton roots concentrate preferentially in the wet furrow near the drip emitter rather than in the dry furrow. Furthermore, lower SDI application rates should produce plants with a root system that uses rainwater more effectively than do higher application rates. We therefore hypothesize that after a dry period the portion of water from a rainfall event lost to soil water evaporation (E), for lack of roots to take up and transpire (T) this water, will increase with the SDI application rate. To test this hypothesis we conducted a field experiment in 2005 at the USDA-ARS Plant Stress and Water Conservation Laboratory in Lubbock, TX. The soil at the site is classified in the Amarillo series (fine-loamy, mixed, thermic Aridic Paleustalf). Cotton, variety FiberMax 958, was planted 18 May at a density of 100,000 plants/ha.
Three irrigation treatments plus a dryland control were selected based on the range of well capacities in the Texas High Plains: 2.5 mm/d (low), 5.0 mm/d (medium), and 7.5 mm/d (high). The crop was irrigated with a 3-yr-old SDI system whose drip lines were buried, on average, 0.37 m below the bottom of the furrow. Row spacing was 1.0 m, with drip tape placed in alternate furrows (2.0 m between tapes). Irrigations were scheduled with the BIOTIC system, which triggers irrigation from canopy-temperature changes measured with infrared thermometers; the time threshold used to trigger irrigation was 5.5 h. The experimental design was a split-plot, with plots 4 m wide by 30 m long and three replicates per treatment; for this study we considered only the low and high irrigation treatments. The objective was to measure E and T independently for 30 days starting on 23 July, and to calculate the energy- and water-balance components with the mechanistic simulation model ENWATBAL. Plant T was measured with stem-flow gauges, a steady-state heat balance method that applies heat to the stem and measures the resulting water flow through the plant. Soil E was measured gravimetrically with micro-lysimeters placed in both the wet and the dry furrow. Measurements of both E and T were replicated at least 4 times. Additional measurements included leaf area, obtained by destructive and non-destructive methods, and root distribution in the soil profile, obtained with mini-rhizotron tubes installed at a 45-degree angle, parallel to the plant row, to a depth of 1 m. Root images were acquired with a camera system and processed with commercial software; the percentage of image frames containing roots along each mini-rhizotron tube described root proliferation within the soil profile.
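The steady-state heat balance behind the stem-flow gauges can be illustrated with a short computation. In this method, the heat carried away by the sap is the residual of the heater power after vertical and radial losses, and dividing that residual by the heat capacity of water times the sap temperature rise gives the mass flow rate. The sketch below uses illustrative gauge readings, not measurements from this study:

```python
# Steady-state stem heat balance sap-flow estimate (the method the stem-flow
# gauges implement). All input values are illustrative, not data from this study.

CW = 4.186  # specific heat of water, J g^-1 C^-1


def sap_flow(p_in, q_vertical, q_radial, dT):
    """Return sap mass flow (g s^-1) from the stem heat balance.

    p_in       : heater power input (W)
    q_vertical : conductive heat loss up and down the stem (W)
    q_radial   : radial heat loss through the gauge insulation (W)
    dT         : sap temperature rise across the heated section (C)
    """
    residual = p_in - q_vertical - q_radial  # heat carried by the sap, W
    return residual / (CW * dT)


# Hypothetical gauge: 0.10 W heater, 0.02 W vertical + 0.01 W radial losses,
# 1.5 C sap temperature rise.
flow_g_per_s = sap_flow(0.10, 0.02, 0.01, 1.5)
daily_T_kg = flow_g_per_s * 86400 / 1000  # kg of water per plant per day
print(flow_g_per_s, daily_T_kg)
```

Summed over all daylight hours and scaled by plant density, this per-plant flow is what yields the treatment-level T compared against the micro-lysimeter E.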
Inputs to ENWATBAL included soil hydraulic properties and hourly weather variables (air and dew-point temperature, wind speed, short-wave irradiance, rain, and irrigation), all measured at a screen height of 2.0 m above the soil surface at a weather station near the experimental field. Plant input state variables were the measured leaf area and root density. Over the 30-d span, which included two rain events of 20 and 22 mm, the measured values of E and T showed that the low irrigation treatment used rainwater more effectively than the high treatment, confirming our hypothesis. The low irrigation treatment had a root system near the soil surface better suited to take up, and thus transpire, a larger portion of the water received as rain than did the high irrigation treatment. This finding was confirmed by the values of E and T calculated with ENWATBAL. These results suggest that for the conditions of the Texas High Plains, and depending on well capacity, SDI may not be the best choice of irrigation system for effective use of rainwater. In contrast, above-ground sprinkler systems, common in this area, should in principle promote a better root distribution near the soil surface and thus better use of rain during the growing season.
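The comparison above rests on partitioning each rain event between E and T for the two treatments. A minimal sketch of that arithmetic, using hypothetical depths chosen only to show the calculation (not the measured values from this experiment):

```python
# Partitioning a rain event between soil evaporation (E) and transpiration (T).
# All depths in mm. Numbers are hypothetical, for illustration only; drainage
# and runoff are neglected, as for the shallow post-rain wetting considered here.


def rain_partition(rain_mm, e_mm, t_mm):
    """Return the fractions of event evapotranspiration that went to T and E,
    plus any event water not yet accounted for (still stored in the soil)."""
    et = e_mm + t_mm
    return {
        "T_fraction": t_mm / et,
        "E_fraction": e_mm / et,
        "unaccounted_mm": rain_mm - et,
    }


# Hypothetical 20-mm event: the low-SDI plot, with more roots near the surface,
# routes a larger share of the event water through transpiration.
low = rain_partition(rain_mm=20.0, e_mm=6.0, t_mm=12.0)
high = rain_partition(rain_mm=20.0, e_mm=11.0, t_mm=7.0)
print(low["T_fraction"] > high["T_fraction"])  # larger T share under low SDI
```

With measured E (micro-lysimeters) and T (stem-flow gauges) substituted for the hypothetical depths, this is the quantity the hypothesis predicts will shift toward E as the SDI application rate increases.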

Session 1.0PW: Synthesis, Modeling, and Applications of Disciplinary Soil Science Knowledge for Soil-Water-Plant-Environment Systems - Theater II

The 18th World Congress of Soil Science (July 9-15, 2006)