Irrigation Fundamentals

Part 3, Hydraulic Concepts of Irrigation

By Ed Butts, PE, CPI

In the last two Engineering Your Business columns, we have discussed basic irrigation terms and definitions, decision-making criteria, and soil-water-plant relationships, including water-quality considerations and consumptive water use of plants.

This month in the third part of our six-part series, we will expand with an overview of a few of the more important elements of an irrigation system: hydraulic considerations including allowable soil loading (i.e., water application rate and depth per soil type); water losses and application efficiency; distribution uniformity; irrigation system efficiencies; and seasonal water volume and storage requirements.

Water Application Depth

There are two distinct types of applications that must be considered for an irrigation system. The first includes the instantaneous and average application rates in inches per hour, and the second is the total application over the irrigated area in inches.

By applying the consumptive use with a further understanding of the correlation between the volume of water in 1 acre-inch or foot versus the elapsed time of application, a designer can then determine the amount of water required to irrigate a specific crop over any size of field.

One acre-foot is defined as the volume of water that would cover one acre of land to a uniform depth of 1 foot. One acre of land in the United States is defined as 43,560 square feet, equivalent to a square approximately 208.71 feet on a side.

With the knowledge there are 7.48 gallons of water per cubic foot, one acre-foot of water would therefore equal 43,560 ft2 × 7.48 gals./ft3 = 325,828.8 gallons. For convenience, this is often rounded to 325,800 gallons of water per acre-foot.

Conversely, if working in units of acre-inches, the unit commonly used for determining application depth equals 27,150 gallons of water per acre-inch (325,800 gallons/12 inches per foot).

With the prior knowledge of the crop’s evapotranspiration rate, these values can be used to determine the daily, weekly, or seasonal volume of water needing to be applied to a given crop.

For example, if a crop has an evapotranspiration (uptake) rate of 0.25 inches per day, the net volume of water required per acre equals 27,150 gallons/acre-inch × 0.25 = 6,787.5 gallons per day.
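The unit conversions and the daily-volume example above can be sketched in Python. The constants (7.48 gallons per cubic foot, 43,560 square feet per acre) are standard U.S. values; the function names are mine, chosen for illustration:

```python
# Acre-inch arithmetic for irrigation volumes.
GALLONS_PER_CUBIC_FOOT = 7.48
SQFT_PER_ACRE = 43_560

def gallons_per_acre_inch() -> float:
    """Volume of water covering one acre to a depth of one inch (~27,150 gal)."""
    acre_foot = SQFT_PER_ACRE * GALLONS_PER_CUBIC_FOOT  # ~325,800 gal per acre-foot
    return acre_foot / 12

def daily_net_gallons(et_inches_per_day: float, acres: float = 1.0) -> float:
    """Net daily volume needed to replace evapotranspiration (no efficiency losses)."""
    return gallons_per_acre_inch() * et_inches_per_day * acres
```

For the 0.25-inch-per-day example, `daily_net_gallons(0.25)` returns roughly 6,788 gallons per acre per day, matching the hand calculation above within rounding.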

Always remember this represents the net volume of water. Losses due to percolation, wind drift, solar radiation, and nonuniform sprinkler coverage are reflected in the gross amount, which must be determined by dividing the net volume by the irrigation or application efficiency.

Many federal and state regulatory agencies and design firms apply this relationship to a generic value of the flow rate per acre. This is often expressed as gallons per minute per acre (GPM/acre). This value varies significantly with the agency, locality, crop, and specific local factors, but usually ranges from 3-4 GPM/acre to 10-15 GPM/acre. The following formulas are handy for computing the approximate depth of water applied to a field.

(Cubic feet per second × Hours of operation) ÷ Applied acres

= Acre-inches per acre or average water depth in inches

(Gallons per minute × Hours of operation) ÷ (450 × Applied acres)

= Acre-inches per acre or average water depth in inches

Example: What would the average depth of water applied to a 160-acre field be if a farmer pumped 1200 GPM over a 10-day rotation at 16 hours of pumping per day?

Solution: Use:

(Gallons per minute × Hours of operation) ÷ (450 × Applied acres)

= Average water depth in inches


(1200 GPM × 10 days × 16 hours/day) ÷ (450 × 160 acres)

= 2.67 inches of average water depth
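The two applied-depth formulas can be written as short Python functions (a sketch; the 450 GPM ≈ 1 cfs and 1 cfs-hour ≈ 1 acre-inch approximations come from the formulas above, and the function names are mine):

```python
def depth_from_cfs(cfs: float, hours: float, acres: float) -> float:
    """Average applied depth in acre-inches per acre, using 1 cfs-hour ≈ 1 acre-inch."""
    return cfs * hours / acres

def depth_from_gpm(gpm: float, hours: float, acres: float) -> float:
    """Average applied depth in inches, using 450 GPM ≈ 1 cfs."""
    return gpm * hours / (450 * acres)
```

Running the worked example, `depth_from_gpm(1200, 10 * 16, 160)` gives about 2.67 inches of average water depth.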

Another formula used to determine the average precipitation rate from a single sprinkler, showing average application rate in inches per hour, is:

(96.3 × GPM of sprinkler) ÷ (Sprinkler spacing on laterals (ft) × Lateral spacing on mainline (ft))

Example: (1) Find the average application rate for an 8 GPM sprinkler with 40 feet spacing on the laterals by 50 feet on the mainline, and (2) is this typical rate excessive for a 10% slope on a 5-foot layer of light, sandy loams?


(1) (96.3 × 8 GPM) ÷ (40 feet × 50 feet)



= 0.385 inches per hour

(2) Refer to Table 1: According to the data in this table, a maximum application rate of 0.60 inches per hour is acceptable for a 10% (8%-12% range) slope on light, sandy loams up to 6 feet in depth. Since the average rate of 0.385 inches per hour is just 64% of the maximum allowable rate, this application should be acceptable.
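The single-sprinkler precipitation-rate formula is a one-liner in Python (the 96.3 constant converts GPM over a spacing area in square feet to inches per hour, per the formula above; the function name is mine):

```python
def sprinkler_precip_rate(gpm: float, head_spacing_ft: float,
                          lateral_spacing_ft: float) -> float:
    """Average application rate in inches per hour for one sprinkler."""
    return 96.3 * gpm / (head_spacing_ft * lateral_spacing_ft)
```

For the example, `sprinkler_precip_rate(8, 40, 50)` returns approximately 0.385 inches per hour, which can then be compared against the soil's maximum allowable rate.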

Irrigation Water Losses

The excessive application of water during irrigation is generally the most significant cause of water loss in any man-made irrigation system.

Regardless of how well the system is designed, if more water is applied than can be beneficially consumed and used by the crop—resulting in runoff, wind drift and loss, deep percolation losses, or evaporation—the overall efficiency of the process will suffer.

Thus, proper irrigation design, layout, operation, scheduling, and duration is critically important if high overall efficiencies are to be achieved.

Other types of possible water losses are specific to the type of irrigation system used. These losses may be caused by wind drift for sprinkler irrigation, furrow or ditch overflow or bypass in surface irrigation methods, or improperly spaced or sized emitters for drip irrigation. Taken together, these losses determine the application or irrigation efficiency; the lost volume, combined with the crop's uptake, constitutes the total or gross application of water.

Each method and type of irrigation has its own unique and specific types of losses. Aside from excessive water, the major losses associated with typical surface irrigation systems are direct evaporation occurring from the wet soil surface; runoff losses from excessive or uneven terrain slopes, furrow, or ditch overflow; and seepage losses from water distribution ditches.

Direct evaporation and runoff losses can be important when irrigating young orchard crops. However, much of the runoff loss can be virtually eliminated with return flow systems capturing the runoff water and directing it back to the originating field or to other fields for repumping and reuse, which is known as tailwater recovery.

Underground drainage tiles can be used to divert excessive lateral water flow, particularly in tighter clay and loam soils, to a tailwater recovery facility as well. The amount of seepage loss from unlined ditches will depend on the specific soil characteristics and depth and the extent of the ditch network, but can easily range from 10% to 15% of the gross water volume. Seepage losses are eliminated by using lined canals or pipe (closed conduit) distribution systems.

The primary losses associated with sprinkler irrigation methods (other than those due to intentional or accidental excessive watering) are:

  • Direct evaporation from wet soil surfaces
  • Deep percolation losses
  • Wind drift and evaporation losses from the sprinkler spray or mist
  • Runoff from slopes
  • Missed or inadequate irrigation application
  • Normal and abnormal system drainage
  • Surface or underground pipe or gasket leaks.

The evaporation that occurs from the soil surface will depend upon the irrigation frequency and the extent of the bare and unplanted soil between the plants to be irrigated. These losses can be high in young orchards, vineyards, or other crops with wide spacings between plants or root zones.

Some of the water lost to wind drift and evaporation from sprinkler spray may not be actually lost, since it readily substitutes for crop transpiration in some cases. Net losses in this case may be as low as 2% to 3% or as high as 15% to 20% under extreme adverse conditions.

Well-maintained sprinkler systems should have leak and drainage losses below 2%, but poorly managed or porous systems have shown losses at or near 10%. If not over-irrigated, trickle system losses should be generally low. Although a relatively small portion of the soil surface is wetted, the irrigation frequency is high, so there will be some loss due to evaporation from the wet soil.

With good management and proper service intervals, losses due to leaks, system drainage, and routine flushing of filters and lateral lines in drip and trickle systems should not exceed 1%. The typical application efficiencies are shown in Table 2.

Example: What is the minimum required daily irrigation period for a crop with a consumptive use (CU) of 0.28 inches/day using solid-set lines with 6 GPM sprinklers set at 40 feet (sprinkler spacing) × 60 feet (lateral spacing)?

Solution: (1) Find required daily application: 0.28 inches per day of CU/0.75 (for 75% efficiency) = 0.373 inches per day

(2) Avg. application rate =

(6 GPM per head × 96.3) ÷ (40 feet × 60 feet)

= 0.24 inches/hour

Thus, 0.373 inches/0.24 inches = 1.55 hours per day
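This run-time calculation chains the gross-depth and application-rate formulas already introduced. A minimal sketch (function name is mine):

```python
def daily_run_hours(cu_in_per_day: float, efficiency: float, gpm: float,
                    head_spacing_ft: float, lateral_spacing_ft: float) -> float:
    """Minimum daily irrigation period in hours for a solid-set sprinkler system."""
    gross_depth = cu_in_per_day / efficiency                          # inches/day to apply
    rate = 96.3 * gpm / (head_spacing_ft * lateral_spacing_ft)        # inches/hour applied
    return gross_depth / rate
```

For the example, `daily_run_hours(0.28, 0.75, 6, 40, 60)` returns about 1.55 hours per day.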

Distribution Uniformity

The distribution uniformity (DU) is a measurement of how evenly and well distributed water is applied across a field and to a crop during irrigation. It applies to every method of irrigation including sprinkler, flood/furrow, and drip/trickle.

For example, assuming all other factors to be equal, if 1 inch of water is applied over one part of a field and only 1/2-inch is applied to another part of the same field, this would generally be defined as a poor DU.

As with application efficiency, DU is expressed as a percentage between 0% and 100%, although it is virtually impossible to attain 100% in actual practice. As a generalization, for sprinkler and drip irrigation, DU ratings of less than 50% are considered poor and unacceptable, 50% to 60% is fair, 60% to 70% is good, 70% to 80% is very good, and 80% to 85% or higher is regarded as excellent.

In short, a low DU percentage means either too much water is applied over the irrigable area, resulting in an unnecessary expense and waste of water, or too little water is being applied over certain areas, causing undue stress and potential failure to thrive for many crops.

It is important to note that irrigation or application efficiency is not the same as the DU. The application efficiency refers to how well the irrigator matches each water application to the water needs of the crop and is generally used to determine how much water to apply and for how long and how often. This is also referred to as irrigation scheduling.

For instance, if a crop needs 30 inches of water per year and the irrigator ends up applying the same or close to that amount (less any distribution inefficiencies), then the net irrigation efficiency would be high. In contrast, if the irrigator applied 60 inches of water per year over the same acreage using a system with a high DU value of 90%, but the crop needed only 30 inches of water, the resulting application efficiency would be low, roughly 50%.

Obviously, there must be an acceptable DU value before there can be good irrigation efficiency if the crop is to be sufficiently watered. Distribution uniformity can vary with the method of irrigation and variables within each method. For example, DU with sprinkler irrigation can vary with different discharge rates from worn sprinkler nozzles or from wind distortion, while in drip irrigation DU can easily drop due to improper emitter spacings, undulating slopes, undersized or excessive lengths of laterals, varying system pressures (i.e., lack of pressure-compensating devices), or clogged emitters that may impact an otherwise uniform emitter discharge rate.

Although a high DU alone is no guarantee of acceptable irrigation efficiency, it is nonetheless a good place to start. DU for new designs may be determined theoretically by using uniformity software or it can be determined with existing systems by taking actual irrigation application measurements in the field.

In the case of field measurements, a series of catch cans are set up at various spacings and carefully positioned based on the crop’s and sprinkler spacings to catch the precipitation discharged from sprinkler heads or emitters over a specific time frame, usually a typical irrigation setting.

The variation of collected water volume in each can is then used to determine the system’s DU. DU can often be optimized or improved by using pressure-compensating sprinklers, valves, or emitters, good hydraulic design of mains and laterals, and proper sprinkler overlap for sprinkler systems and emitter spacings for drip systems.

A DU examination is conducted in the following manner (for overhead and drip irrigation):

  1. Uniformly place identical catch cans throughout a grid or field that will receive water during an irrigation set (for drip irrigation, uniformly place the cans at grade under selected emitters).
  2. Following a normal irrigation set, measure and record the volume (depth) of water collected in each can.
  3. Map and rank the volume (depth) collected in all cans from lowest to highest.
  4. Calculate the average volume (depth) in the lowest 25% of collected samples.
  5. Divide this number by the average volume of all collected samples. This is converted to the DU percentage.
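The five steps above reduce to the standard low-quarter DU calculation, sketched here in Python (the function name is mine):

```python
def distribution_uniformity(catch_depths: list[float]) -> float:
    """Low-quarter DU: mean of the lowest 25% of catch-can depths divided by
    the mean of all depths, expressed as a percentage."""
    depths = sorted(catch_depths)
    low_quarter = depths[: max(1, len(depths) // 4)]   # lowest 25% of samples
    low_avg = sum(low_quarter) / len(low_quarter)
    overall_avg = sum(depths) / len(depths)
    return 100.0 * low_avg / overall_avg
```

A perfectly uniform set of catches yields 100%; a set where a quarter of the cans caught half the average depth yields a DU in the "fair" range.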

Example: Based on a square grid with a 30-foot × 30-foot impact sprinkler spacing and 16 catch cans (refer to Figure 1):

  1. Following a normal irrigation set, map the 30-foot × 30-foot grid and all 16 data points (catch cans)
  2. The average depth in the lowest 25% of the 16 collected samples = 0.667 inches (2/3-inch)
  3. The average depth from all 16 collected samples = 0.875 inches (7/8-inch)
  4. DU = 0.667 inches/0.875 inches = 0.762 × 100 = 76.2% distribution uniformity (very good).

Deficit Irrigation Practices

Although most irrigation systems are designed and operated on a routine schedule to replace the moisture consumed by the plant (uptake) plus the water lost to evaporation and percolation losses (evapotranspiration), many applications in water-starved or limited regions may employ a technique known as deficit irrigation.

Deficit irrigation is a specific operational strategy that combines a working knowledge and balance of the soil's water-holding capacity, tolerance of the crop's stresses during water shortages, and offsetting evaporation from rainfall events with the timing and net application of water from irrigation.

In some cases, the level of soil moisture is allowed to decline up to 25% below the crop’s evapotranspiration requirement. It is often used to maximize crop production in regions with water shortages and can allow more acres to be planted with a given volume of water than typically permitted.

It requires, though, careful management and monitoring of the soil moisture available and accessible to the plant's root zones, including any carryover moisture from the previous season or winter/spring precipitation, along with knowing when supplemental water is required to avoid undue stress on the plant from a severe lack of moisture. Improper management can lead to a loss of production, wilting, and ultimately plant death.

Deficit irrigation is most effective when applied to soils with a high water-holding capacity and crops with deep root zones. It should only be applied by growers with crops that are drought-resistant or moisture stress-resistant and with knowledge of the plant’s specific water needs throughout the growth cycle.

It is often used to apply more water to the crop during periods of rapid development and less water when the growth rate has declined or the plant has matured.

For certain crops, experiments have confirmed deficit irrigation can increase water use efficiency without a severe reduction of crop yield.

The application of less water also reduces the leaching effects of nutrients from the root zone and applied fertilizers into underlying groundwater. Furthermore, it reduces the risk of developing certain crop diseases linked with high humidity (fungus) that are common in irrigation systems with higher water applications.

Although deficit irrigation has been effectively used with various crops without any apparent loss of production— including peaches, soybeans, cotton, and wheat—there are other crops that cannot endure this degree of moisture stress and will fail if exposed to it.

A variation of deficit irrigation, referred to as regulated deficit irrigation (RDI), is a scheduling technique that purposely stresses a plant by requiring half of a root zone to be dewatered while the other half remains fully watered. This was developed mainly for crops with deep root zones, such as fruit orchards and vineyards, with the cycle generally alternating to shift watered/dewatered root zones every 10 to 14 days.

Most of the difficulties associated with the use of RDI occur where the soil is difficult to dry out, either because it is of a type which readily retains moisture (clays or loams) or is in a region where late spring rainfall keeps it moist.

To be effective, the soil must dry out enough to induce a deficit response at the required growth stage and take in adequate levels of moisture at an adequate rate to relieve the deficit when irrigation is resumed. These are influenced by the rates of infiltration and evapotranspiration and the total water available to the roots.

A related strategy, partial root-zone drying (PRD), is a fairly recent deficit irrigation technique offering the potential to use deficit irrigation on crops where other deficit strategies can lead to negative outcomes, such as winery grapevines. PRD irrigation also allows water to be withheld from part of a plant's root zone while the remaining part is kept well-watered.

In order to maintain the viability of the plant, irrigation is alternately applied to each side of the root zone, allowing the wet side to dry while the dry side is rewetted. Such application of PRD to grapevines has resulted in water savings of up to 50% with significant improvements in fruit quality but without a noticeable loss of yield.

Because of their shallow root zones, deficit irrigation should not be used for most turf and other grasses. Another severe caveat applies to waters with a high salt content, since there may not be adequate water applied to leach the salts away from the root zone, ultimately limiting the crop's access to moisture.

Careful and precise monitoring of the water contained within the crop’s root zones that is available for uptake must be employed on a daily basis during the most aggressive stages of growth. This often requires the use of devices implanted in the soil and plant’s root zone, such as gypsum blocks or tensiometers, to provide the information needed as to when and how much irrigation water is required.

Most drip irrigation systems operate on the fundamental basis of deficit irrigation by applying only the water needed to foster and sustain plant growth. Although potentially effective in many water-limited regions, deficit irrigation should only be used with caution and scheduled using either automated methods of data collection or through manual methods of careful monitoring of crop stress and soil moisture conditions.

In summary, the use of deficit irrigation can be an effective way to stretch a limited water supply to optimize crop production or increase acreage, but it must be carefully planned, scheduled, and implemented to be fully realized.

Instantaneous and Daily Flow Rate

In most areas of the country, water uptake or use by established turf grass averages between 0.20 and 0.30 inches per day, or roughly 5000 to 8000 gallons per acre per day (1 acre = 43,560 ft2; approximately 325,800 gallons per acre-foot or 27,150 gallons per acre-inch). The variance depends on the local climate, rainfall, water rates, and use patterns.

Water uptake for gardens and many vegetables is typically a little higher at 7000 to 10,000 gallons per day per acre (0.30 inches to 0.40 inches per day).

When calculating the irrigation demand on a water system, don't forget the above figures are averages based on a full day of application and do not include the effect of the efficiency of the irrigation method used, or what is often termed in the industry as the application efficiency. This can vary from a high of 90% for drip irrigation to 60% to 75% for most types of sprinkler irrigation systems and 50% or less for open flood or furrow irrigation.

A nationwide survey of regions around the United States indicates an average of 0.26 inches per day (7059 gallons per acre/day) of evapotranspiration (crop uptake) generally applies to water usage. This means you would need to supplement 0.26 inches or roughly 7100 gallons of net water per day per acre to replace the moisture lost to evaporation and crop transpiration.

This average roughly translates to a gross application of 0.26 inches/0.75 (application efficiency) = 0.35 inches per day (9500 gallons per acre/day). However, this higher average applies to warmer and more humid climates and is considerably greater than the average for the cooler and temperate climate found in the Willamette Valley of Oregon, where the comparable evapotranspiration rate is closer to 0.22 inches per day, which after factoring for an average application efficiency of 75% equals approximately 0.30 inches per day.

A more effective and precise method of determining the daily irrigation demand on a water system is found by using the following formula:

Q (in GPM) = 453 × [Irrigable area (in acres) × Gross depth of water application (in inches)] ÷ [Irrigation rotation frequency (in days) × Total operating hours of irrigation per day]

Example: Determine the flow rate in GPM and GPM per acre required to replace 0.20 inches of net crop water uptake per day over 40 acres, covering and returning to the initial acreage over a 10-day rotation at 21 hours per day of operation using wheel-lines. (Assumed application efficiency = 70%)

Q (GPM) = 453 × [40 acres × (0.20 inches/day × 10 days ÷ 0.70)] ÷ (10 days rotation × 21 hours/day)

= 453 × (114.3 acre-inches gross ÷ 210 hours)

= 453 × 0.544

= 246.5 GPM, or 246.5 GPM/40 acres ≈ 6.2 GPM per acre
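The daily-demand formula translates directly into code. In this sketch the gross depth per rotation (net uptake × rotation days ÷ efficiency) is computed by the caller, mirroring the worked example; the function name is mine:

```python
def irrigation_demand_gpm(acres: float, gross_depth_in: float,
                          rotation_days: float, hours_per_day: float) -> float:
    """Q (GPM) = 453 × acres × gross depth (in) ÷ (rotation days × hours/day)."""
    return 453 * acres * gross_depth_in / (rotation_days * hours_per_day)

# Worked example: 0.20 in/day net uptake, 10-day rotation, 70% efficiency.
gross_depth = 0.20 * 10 / 0.70          # ~2.86 inches gross per rotation
q = irrigation_demand_gpm(40, gross_depth, 10, 21)
```

Here `q` works out to roughly 246.5 GPM, or about 6.2 GPM per acre over the 40 acres.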

Seasonal and Required Storage Volume

In most cases, the yearly volume is directly satisfied by the source throughout a normal irrigation season. However, there are circumstances where supplemental water storage, in the form of a water storage basin, pond, or lagoon, must be provided for the total year’s volume differential.

The seasonal net volume required to adequately irrigate a specific number of acres depends on various factors. Chief among these are: incoming safe source capacity, type of irrigation system and application efficiency (from Table 2), length of the growing season, number of acres, available offsetting precipitation during the irrigation season, net evapotranspiration rate (uptake) of the crop, and two additional factors comprised of miscellaneous losses and a storage contingency.

Most of the factors are obvious from the previous discussion, but the available precipitation during the irrigation season, miscellaneous losses, and need for a storage contingency may be confusing.

Any available offsetting precipitation during the irrigation season is an unknown variable where any measurable rainfall that may occur during the irrigation season is used to offset the source inflow or storage volume. In many locales throughout the United States, particularly in the West, predicting a period of any significant rainfall during a specific growing season is problematic and unreliable. Therefore, except in
those regions with adequate historical records of offsetting precipitation along with measurable safeguards to protect from a miscalculation, this factor is generally disregarded.

Basically, miscellaneous losses are intended to provide a storage safety factor from any deep percolation losses in the storage basin along with the evaporation and wind drift losses from the top water surface.

These losses must be evaluated on a case-by-case and locale-by-locale basis, and are highly dependent on the specific region (the prevailing wind and sun exposure and local environmental factors such as humidity, mean air and water temperature, and energy stored within the storage water), the three dimensions (net volume) and the construction method of the storage basin (wall slopes, top and bottom surface area, and depth), soil type and structure, compaction level, and the addition of bentonite to soil or presence of a shotcrete or geomembrane pond liner.

The degree of evaporation and wind drift varies greatly between regions and even sites within the same region. However, for rough estimation purposes, an average total surface loss of 40 inches between May and October, or approximately 0.22 inches per day (approximately 6000 gallons of loss per exposed surface acre per day, or roughly 5 GPM per acre on a 20-hour per day basis), is generally a viable design value for most areas of the country.

Note this is the estimated daily evaporation that will need to be compensated from the source and will occur from the basin’s top water surface. Therefore, the actual volume will be a direct function related to the top exposed surface area of the basin, whereas the depth of the basin will have more of a direct impact on the head, and thus, the percolation losses.
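The rough surface-loss estimate above can be parameterized for a given pond (a sketch only; the 0.22 in/day default is the generic design value cited above, and the function name is mine):

```python
def pond_surface_loss_gpm(surface_acres: float,
                          loss_in_per_day: float = 0.22,
                          pump_hours_per_day: float = 20) -> float:
    """Approximate make-up flow (GPM) for evaporation and wind drift
    from a storage basin's top water surface."""
    gallons_per_day = loss_in_per_day * 27_150 * surface_acres  # ~6,000 gal/acre/day
    return gallons_per_day / (pump_hours_per_day * 60)
```

For one exposed surface acre, `pond_surface_loss_gpm(1.0)` returns roughly 5 GPM, matching the rule of thumb above.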

The storage contingency is also an individual design factor that must be determined based on the separate impacts due to the reliability and seasonal availability of electric power or engine fuel and the source/pump; likelihood, frequency, and expected duration of source downtime and the potential for a short-term higher irrigation capacity needed to catch up if falling behind on irrigation; and configuration, shape, and maximum depth of the planned storage basin (for suction lift applications).

For most applications, a minimum storage contingency of 20% is recommended, although contingency volumes of 50% or more have been used.

Example: Assuming a crop CU of 0.20 inches per day over 40 acres, the available source capacity is limited to 125 GPM, application efficiency is 75%, and projected miscellaneous losses for percolation and wind drift are estimated at 15% of the seasonal volume and the crop’s growing season is 120 days:

  1. Is the source capacity fundamentally adequate for the yearly irrigation demands?
  2. How much in supplemental water storage volume in gallons and acre-feet is required?

Solution: This problem relies on a water balance determination. A water balance depends on the net extracted volume versus the incoming volume or the flow rate multiplied by the operational time to refill the basin plus the allowable number of days to operate. The seasonal volume required for the irrigation demands is:

Irrigation volume/day: 0.20 inches/day ÷ 0.75 (irrigation efficiency) × 40 acres × 27,150 gallons/acre-inch = 289,600 GPD

Irrigation volume/season: 289,600 GPD × 120 days growing season ÷ 0.85 (miscellaneous losses) = 40,884,706 gallons/year

The required total yearly (or seasonal) volume would now depend on the allowed days of irrigation system operation per irrigation year. This is often a state or local regulated parameter as some states do not specifically restrict an irrigation season, while others limit the number of days to 150, 180, or 240 days, among others. In this case assuming an allowable irrigation season of 240 days (eight months) yields a required yearly raw source volume in our example of:

Replenishment volume (gallons) = 125 GPM source capacity × 240 days × 1440 minutes/day = 43,200,000 gallons

(1). Influent source capacity: 43,200,000 gallons/year > 40,884,706 gallons/year required. Source is adequate.

The above volume, while important for permit or regulatory compliance and verification of adequate source capacity over the full season, does not provide the actual volume of water storage that is needed to accommodate the irrigation season’s daily requirements. This would be determined by evaluating the difference between the daily water consumption (outflow) versus the daily source capacity (inflow). This differential will provide an estimate of the net required storage volume:

Outflow − inflow: 289,600 GPD − 180,000 GPD inflow (125 GPM × 1440 minutes/day, continuous flow) = ΔV of 109,600 GPD

Required water storage = 109,600 GPD × 120 days = 13,152,000 gallons ÷ 0.80 (contingency) = 16,440,000 gallons

(2). Final solution: Approximately 16,500,000 gallons or 50.65 acre-feet of minimum water storage is needed.
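The full water-balance procedure from the example can be sketched as one Python function (names and the return structure are mine; the 27,150 gal/acre-inch and 1440 min/day constants are from the calculations above):

```python
def storage_requirement(cu_in_per_day: float, acres: float, efficiency: float,
                        source_gpm: float, season_days: int,
                        misc_loss_frac: float, contingency_frac: float):
    """Seasonal water balance: returns (daily demand, seasonal demand,
    required storage), all in gallons."""
    daily_demand = cu_in_per_day / efficiency * acres * 27_150       # outflow, GPD
    seasonal_demand = daily_demand * season_days / (1 - misc_loss_frac)
    daily_inflow = source_gpm * 1440                                 # continuous pumping
    deficit = max(0.0, daily_demand - daily_inflow)                  # daily shortfall
    storage = deficit * season_days / (1 - contingency_frac)
    return daily_demand, seasonal_demand, storage
```

With the example inputs (0.20 in/day CU, 40 acres, 75% efficiency, 125 GPM source, 120-day season, 15% miscellaneous losses, 20% contingency), this reproduces the 289,600 GPD demand, the roughly 40.9 million gallon seasonal volume, and the 16.44 million gallon storage requirement.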

The need for a 20% contingency factor is emphasized for three basic reasons:

  1. There was no indication of the expected evaporation from nor offsetting rainfall to the pond surface during the irrigation season. This can vary widely with the region, hours of sunlight exposure, humidity, and temperature, along with the exposed area of the top surface
    of the storage pond. Local sources can usually provide the information necessary to estimate this value.
  2. Many crops require a greater volume of water at different stages of growth, or changes in the operational hours per day and days per week. In order to accommodate these variances, some degree of reserve volume must be included for these higher peak demands. This
    will also be a function of the irrigation system pump and pipeline sizing.
  3. As previously indicated, all water storage basins constructed in natural soils will leak to some degree. This will be most influenced by the pond’s soil type and consistency, type and degree of compaction (percentage), top and bottom cross-sectional area of pond, presence or use of a liner or barrier, and the greatest variable, the depth of the water in the pond, as more water head will usually increase the overall leakage rate.


This concludes the third part of this six-part series. Next month, we will take a break from the irrigation fundamentals topic because June marks National Safety Month and we'll focus on arc-flash and electrical safety. We'll return to our series with sprinkler and flood irrigation in July and August and wrap things up in September with a discussion on drip and trickle irrigation.

Until then, work safe and smart.

Ed Butts, PE, CPI, is the chief engineer at 4B Engineering & Consulting, Salem, Oregon. He has more than 40 years of experience in the water well business, specializing in engineering and business management. He can be reached at