Managing precipitation use in sustainable dryland agroecosystems


In the Great Plains of North America, potential evaporation exceeds precipitation during most months of the year. About 75% of the annual precipitation is received from April through September and is accompanied by high temperatures and low relative humidity. Dryland agriculture in the Great Plains has depended on wheat production in a wheat-fallow agroecosystem (one crop year followed by a fallow year). Historically, this system has relied on mechanical weed control during the fallow period, which leaves essentially no crop residue cover to protect against soil erosion and greatly accelerates the oxidation of soil organic carbon. This paper reviews the progress made in precipitation management in the North American Great Plains and synthesises data from an existing long-term experiment to demonstrate the management principles involved.

The long-term experiment was established in 1985 to identify dryland crop and soil management systems that would maximise precipitation use efficiency (biomass production per unit of precipitation received), improve soil productivity, and increase economic returns to farmers in the West Central portion of the Great Plains. Embedded within this primary objective are sub-objectives that focus on reducing the amount of summer fallow time and reversing the soil degradation that has occurred under the wheat-fallow cropping system.

The experiment consists of four variables: 1) climate regime; 2) soils; 3) management systems; and 4) time. The climate variable is based on three levels of potential evapotranspiration (ET), represented by three sites in eastern Colorado. All sites have long-term annual precipitation averages of approximately 400–450 mm but vary in growing-season open-pan evaporation, from 1600 mm in the north to 1975 mm in the south. The soil variable is represented by a catenary sequence of soils at each site. Management systems, the third variable, differ in the amount of summer fallow time and emphasise increased crop diversity; all systems are managed with no-till techniques. The fourth variable is time, and the results presented in this paper are for the first 12 yr (three cycles of the 4-yr system). Yields of cropping systems that differ in cycle length, including systems that contain fallow periods when no crop is produced, are compared using a technique called "annualisation": yields of all crops in the system cycle are summed and divided by the total number of years in the cycle. For example, in a wheat-fallow system the wheat yield is divided by two because it takes 2 yr to produce one crop.
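The annualisation rule can be sketched as a short calculation; the yield figures below are hypothetical illustrations, not data from the experiment.

```python
def annualised_yield(crop_yields, cycle_years):
    """Annualised yield: total yield of all crops harvested in one system
    cycle divided by the number of years in that cycle, so that fallow
    years (no crop) dilute the average."""
    return sum(crop_yields) / cycle_years

# Wheat-fallow: one wheat crop over a 2-yr cycle (hypothetical 2400 kg/ha)
wf = annualised_yield([2400], 2)                     # -> 1200.0 kg/ha per yr

# A 4-yr rotation with three crops and one fallow year (hypothetical yields)
rot = annualised_yield([2400, 3000, 1500], 4)        # -> 1725.0 kg/ha per yr
```

Dividing by total cycle years, rather than crop years, is what makes systems with different amounts of fallow directly comparable.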

Cropping system intensification increased annualised grain and crop residue yields by 75% to 100% compared with wheat-fallow. Net return to farmers increased by 25% to 45% compared with wheat-fallow.

The two intensified cropping systems increased soil organic C content by 875 and 1400 kg ha−1, respectively, after 12 yr compared with the wheat-fallow system. All cropping system effects were independent of the climate and soil gradients, meaning that the potential for C sequestration exists across all combinations of climate and soil. Soil C gains were directly correlated with the amount of crop residue C returned to the soil. Improved macroaggregation was also associated with increases in the C content of the aggregates. Soil bulk density was reduced by 0.01 g cm−3 for each 1000 kg ha−1 of residue added over the 12-yr period, and each 1000 kg ha−1 of residue addition increased effective porosity by 0.3%.
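The two linear residue responses reported above can be written as a simple calculation; the residue amount used in the example is hypothetical, and only the two coefficients come from the experiment.

```python
def soil_property_change(residue_kg_ha):
    """Linear soil responses over the 12-yr period: bulk density falls
    0.01 g cm^-3 and effective porosity rises 0.3 percentage points
    per 1000 kg ha^-1 of crop residue added."""
    units = residue_kg_ha / 1000.0
    delta_bulk_density = -0.01 * units   # g cm^-3
    delta_porosity = 0.3 * units         # percentage points
    return delta_bulk_density, delta_porosity

# e.g. 4000 kg/ha of cumulative residue addition (hypothetical amount)
d_bd, d_phi = soil_property_change(4000)  # -> (-0.04, 1.2)
```

Because both responses are expressed per 1000 kg ha−1 of residue, the larger residue returns of the intensified systems translate directly into larger soil physical improvements.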

No-till practices have made it possible to intensify cropping beyond the traditional wheat-fallow system, and in turn water-use efficiency has increased by 30% in West Central Great Plains agroecosystems. Cropping intensification has also provided positive feedbacks to soil productivity via the increased amounts of crop residue returned to the soil.