Testing for water quality can tell a great deal about a water system.
By Michael Schnieders, PG, PH-GW
The U.S. Environmental Protection Agency does not mandate testing of private well systems, but many states do.
As directed by Congress under the Safe Drinking Water Act, the EPA establishes and evaluates “health-based standards” for drinking water supplies.
These standards are generally adopted by states and Indian tribes for implementation and enforcement. This is why new well testing and required periodic testing are conducted through and reported to your state.
Water quality is an ambiguous term that has developed into a widely used but poorly understood rationale for frequent—and sometimes expensive—testing. At its essence, water quality testing was designed to identify the presence of contaminants or pathogens in the water that can impact human health.
The default definition of water quality requirements is the National Drinking Water Regulations (NDWR), which are broken down into primary and secondary standards. The listing provides a maximum contaminant level (MCL) and goals specific to each contaminant and to the ability to test for it. The primary standards comprise 88 contaminants, including inorganic chemicals, radionuclides, organic chemicals, disinfectants, disinfection by-products, and microorganisms.
An example of a NDWR primary standard would be toluene, an aromatic hydrocarbon widely used in solvents. Toluene, which impacts the nervous system and has been linked to problems in the kidneys or liver, has an MCL of 1 mg/L.
The NDWR secondary standards are non-enforceable guidelines of 15 constituents that may cause cosmetic or aesthetic effects in drinking water. An example of a secondary contaminant is manganese, a free element in nature with a secondary maximum contaminant level (SMCL) of 0.05 mg/L. Manganese in concentrations higher than the SMCL can impact the color, odor, or taste of water.
It is important to note that in the 40-plus years of the modern Safe Drinking Water Act, many occurrences of contamination have been identified, markedly improving human health as a direct result of the improved testing. Millions of people have benefited from regular testing conducted to identify contamination in our water.
However, aside from a specific concern or identifying a viable water source, what does this type of testing tell us about the well’s health?
With regard to the day-to-day operation of the well, testing should focus on the potential for chemical corrosion, the accumulation of solids, the resident microbial population, and the presence of nuisance or offending organisms.
A few of the common water chemistry parameters useful in evaluating well health include pH, total dissolved solids, conductivity, oxidation-reduction potential, hardness, iron, manganese, and a corrosion index.
While neither an exhaustive list nor a substitute for a full chemical workup, these parameters are helpful in evaluating problems impacting the operation of the well before they become significant.
pH
pH is an expression of the hydrogen (acid) ions in solution. It is a logarithmic scale running from 0 to 14: 7 is neutral, values above 7 are basic (alkaline), and values below 7 are acidic.
Natural groundwater pH varies greatly across North America, with wells registering values as low as 4.5 and others as high as 9. Groundwater pH is typically stable, with only minor fluctuations. A change of 1 pH unit or more may indicate an outside influence on the well.
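Because the scale is logarithmic, each pH unit corresponds to a tenfold change in hydrogen ion activity. A minimal sketch (the function name is illustrative):

```python
import math

def ph_from_hydrogen_ion(h_molar):
    """pH is the negative base-10 logarithm of hydrogen ion activity (mol/L)."""
    return -math.log10(h_molar)

# Neutral water carries about 1e-7 mol/L of hydrogen ions (pH 7);
# a tenfold increase in hydrogen ions lowers the pH by one full unit.
```

This is why a shift of a full pH unit in a normally stable well represents a tenfold change in water chemistry and is worth investigating.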
Total Dissolved Solids and Conductivity
Total dissolved solids (TDS) is a measure of the combined inorganic and organic substances present in a liquid in molecular or ionized form. TDS is not considered a pollutant, but is used as a cursory indication of drinking water quality. The two accepted methods of measuring TDS are gravimetric and electrical conductance.
The gravimetric method is the most accurate, but is also the most expensive and time-consuming.
Electrical conductance (conductivity) is the measure of the ability of water to pass an electrical current. Conductivity in water is affected by the presence of inorganic dissolved solids such as chlorides, nitrates, sulfates, and phosphate anions (ions that carry a negative charge) or sodium, magnesium, calcium, iron, and aluminum cations (ions that carry a positive charge).
Conductivity is also affected by temperature; the warmer the water, the higher the conductivity. For this reason, conductivity is reported at 25 degrees Celsius.
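Field meters typically apply a linear temperature-compensation model to normalize readings to 25 degrees Celsius. The roughly 2%-per-degree coefficient below is a common default assumption, not a universal constant:

```python
def conductivity_at_25c(ec_measured, temp_c, alpha=0.02):
    """Normalize a raw conductivity reading to the 25 C reference
    temperature using linear compensation; alpha is the assumed
    fractional change in conductivity per degree Celsius."""
    return ec_measured / (1.0 + alpha * (temp_c - 25.0))

# A 500 uS/cm reading taken at 30 C normalizes to roughly 455 uS/cm at 25 C.
```

Normalizing to a common reference temperature is what makes conductivity readings taken in different seasons comparable.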
TDS and conductivity are relatively stable parameters in hard rock aquifers. In alluvial aquifers, these parameters generally increase during periods of heavy recharge. Regardless of the aquifer, fluctuations of more than 200 units (mg/L for TDS, µS/cm for conductivity) would be considered abnormal and may indicate an influence on the well or aquifer. A flooded well, for example, may exhibit a TDS or conductivity reading two to three times the normal value.
Oxidation Reduction Potential
The oxidation-reduction potential (ORP) is a measure of the tendency of a solution to gain or lose electrons when it is subjected to change—be it chemical, biological, or mechanical. Positive readings indicate more oxidizing conditions, while negative readings indicate more reducing conditions.
Water with a higher (more positive) ORP has a greater tendency to take electrons from a newly introduced species, thereby oxidizing it. An example is the oxidation of iron when it enters water with a high ORP.
Although an ORP reading is relatively easy to take, many factors limit its interpretation, and field measurements seldom correlate with calculated values. Nevertheless, ORP has proven useful as an indicator of change in a system rather than as an absolute value. ORP results are reported in millivolts (mV).
ORP levels fluctuate with well activity, so the well’s use or inactivity should be considered. Changes in ORP can indicate corrosion is occurring, microbial populations are increasing, or chemical influence is occurring. For example, when chlorine is present, ORP levels generally increase significantly. As such, abnormal spikes in ORP can indicate backflow of treated water into the well is occurring.
Hardness
Hardness is a term that includes all or most of the multivalent ions that can pair with carbonate or non-carbonate ions to form mineral precipitates. Hardness most often consists of calcium (Ca), magnesium (Mg), and iron (Fe), three of the more common ions in groundwater; calcium and magnesium typically account for 99% of the total.
Hardness levels of 0-60 mg/L are classified as soft; levels from 61 to 120 mg/L are moderately hard; levels of 121 to 180 mg/L are hard; and water with a hardness level greater than 180 mg/L is classified as very hard.
Hardness is generally reflective of the supporting aquifer. For example, the carbonate-rich Ogallala Aquifer exhibits very high levels of carbonate hardness, often reaching 300 to 400 mg/L. Increases in hardness, either steady or sudden, may reflect the concentration of mineral-forming ions within the well.
Iron
Iron is a common parameter of concern for groundwater and is evaluated by several methods. The dissolved iron test analyzes for ferrous iron (Fe+2), which is iron in solution in the first stage of oxidation, usually as ferrous oxide (FeO) or ferrous sulfide (FeS).
Ferrous oxide can represent iron just released from a surface, such as a well casing sidewall, as a result of oxidation. Oxidation can indicate corrosion. Iron in this ferrous state may also be reflective of native background iron within the aquifer, but it is usually further oxidized immediately as it enters the borehole to ferric iron (Fe+3). Therefore, determining the ferrous iron (Fe+2) presence in a water sample can indicate active corrosion occurring within a system.
The level of dissolved iron can indicate the severity of the corrosion. Typically, groundwater wells exhibit dissolved iron (Fe+2) levels below the detection level of 0.02 mg/L.
The total iron test is reported as iron (Fe) and accounts for both ferrous iron (Fe+2) and ferric iron (Fe+3), giving a measure of the total iron in a sample. Ferric iron (Fe+3) is iron that has been further oxidized from the ferrous (Fe+2) state. Total iron levels above 1 mg/L in groundwater wells can be an indicator that iron precipitation is occurring.
Increases in total iron can indicate problems are occurring within the well and aquifer. Iron bacteria, corrosion, aeration, and declining water levels can all cause increases in the iron level. Like several of these parameters, iron levels can fluctuate with well activity, and as such, the well’s use or inactivity should be considered when evaluating results.
Corrosion Indices
The most common parameter used to evaluate corrosion potential is the Langelier Saturation Index (LSI). The LSI is a calculation that takes into account pH, conductivity, water temperature, calcium content, and alkalinity. For most groundwater sources, the calculation yields a value in the range of –3 to +3.
Positive values indicate water supersaturated with respect to calcium carbonate and therefore scale-forming. Negative values indicate undersaturation with respect to calcium carbonate, with limited potential for carbonate scale and thus a more aggressive tendency toward corrosion.
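The LSI is the difference between the measured pH and a computed saturation pH (pHs). A minimal sketch using the common Langelier approximation; the constants are standard values and the example inputs are illustrative, not from any particular well:

```python
import math

def langelier_saturation_index(ph, tds_mg_l, temp_c, ca_hardness, alkalinity):
    """Approximate LSI: positive values suggest scale-forming water,
    negative values suggest corrosive (undersaturated) water.
    ca_hardness and alkalinity are expressed as mg/L CaCO3."""
    a = (math.log10(tds_mg_l) - 1.0) / 10.0          # TDS term
    b = -13.12 * math.log10(temp_c + 273.0) + 34.55  # temperature term
    c = math.log10(ca_hardness) - 0.4                # calcium term
    d = math.log10(alkalinity)                       # alkalinity term
    ph_s = (9.3 + a + b) - (c + d)                   # saturation pH
    return ph - ph_s

# Example: moderately hard groundwater at 15 C with pH 7.5 yields an LSI
# near zero, i.e., close to calcium carbonate equilibrium.
lsi = langelier_saturation_index(7.5, 400, 15, 240, 200)
```

Because the index depends on pH and temperature, the same water can shift from mildly scale-forming to mildly corrosive as conditions change in the well.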
A second, lesser-known calculation is the Larson-Skold Index used to assess the corrosivity of a water source toward mild steel. The Larson-Skold Index builds on the LSI by accounting for the impacts of sulfates and chlorides. In areas where brackish water is a concern, use of the Larson-Skold Index can be helpful, especially when designing new well systems.
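The Larson-Skold calculation is a ratio of aggressive anions to buffering anions, each converted to milliequivalents per liter. A sketch, where the equivalent weights and interpretation bands are standard published values rather than figures from this discussion:

```python
# Equivalent weights (g/eq) used to convert mg/L to meq/L.
EQ_WT = {"Cl": 35.45, "SO4": 48.03, "HCO3": 61.02, "CO3": 30.00}

def larson_skold(cl_mg_l, so4_mg_l, hco3_mg_l, co3_mg_l=0.0):
    """Ratio of aggressive anions (chloride, sulfate) to buffering
    anions (bicarbonate, carbonate); higher values indicate water
    more corrosive toward mild steel. Inputs are mg/L of each ion."""
    aggressive = cl_mg_l / EQ_WT["Cl"] + so4_mg_l / EQ_WT["SO4"]
    buffering = hco3_mg_l / EQ_WT["HCO3"] + co3_mg_l / EQ_WT["CO3"]
    return aggressive / buffering
```

Values below about 0.8 are generally read as little interference from chlorides and sulfates, while values above about 1.2 suggest elevated corrosion rates toward mild steel; brackish water pushes the index upward, which is why the index is useful in those settings.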
Microbial Evaluation
Regardless of our acceptance or comfort level, bacteria are universally present in our aquifer systems. In the water industry, the total coliform test has become the default method—right or wrong—of assessing microbial water quality.
Unfortunately, the total coliform test tells us little about the resident bacterial population and the issues that can impact well operation. Population size and type are generally used to evaluate bacteria-related problems such as biofouling, foul odors, and the presence of pathogens.
Important microbiological parameters for well evaluation include an assessment of the total population, the presence of specific troublesome organisms, and the presence of larger biological components.
Adenosine triphosphate (ATP) testing is a more accurate and advanced means of quantifying the total bacterial population within a water sample. Heterotrophic plate counts (HPC), while helpful, are better suited for more investigative testing.
Fluctuations in microbial population levels can reflect the influence of oxygen, nutrients, or changes in flow, indicative of well fouling. Anaerobic growth is a good indicator of microbial maturity and may reflect impacts or other influences on the well and the near-well aquifer interface.
If you are aware of particular problems occurring, testing for specific organisms is recommended. For example, if a strong rotten egg odor has been reported, you should test for the presence of sulfate-reducing bacteria.
A simple microscopic evaluation can identify the presence of larger microorganisms such as protozoa, or biological components such as algae, which can reflect external influence on the well or aquifer system. Microscopic evaluation can also assist in identifying sediment within the well such as sand, clay, or silt.
Local or Regional Concerns
In addition to the parameters outlined here, there are often constituents, either chemical or biological, of local concern. Where these issues may be more frequent, they should be incorporated into regular testing to identify issues before they become a problem.
Arsenic is an example of this. While arsenic is a primary drinking water standard constituent, where the local or regional geology contains arsenic, regular monitoring can identify rising concentrations before they reach a problematic level. Check with your local health or environmental department to find out whether any specific contaminants are a concern in your region.
While by no means a substitute for a thorough chemical and microbiological examination, tracking changes in these parameters can help identify problems before they become more substantial. We cannot move away from more advanced and complete testing, whether from a regulatory standpoint or in assessing all aspects of contamination.
However, unless a significant event such as a flood occurs, the exhaustive list of constituents is unnecessary and does little to assess the day-to-day condition of the well. Tailoring your testing to specific or known challenges in a wellfield can reduce both the cost and the burden of testing. Streamlining your evaluation can help identify problems earlier, in advance of an impact on production or water quality.
When coupled with regular pump tests, site visits, and periodic evaluation of the produced water, water quality testing can help extend the operational life of a well and reduce the need for more costly and invasive treatment efforts. In a sense, this moves the groundwater professional to a more proactive approach.
Michael Schnieders, PG, PH-GW, is the president and principal hydrogeologist at Water Systems Engineering Inc. in Ottawa, Kansas. Schnieders was the 2017 NGWA McEllhiney Lecturer in Water Well Technology. He has an extensive background in groundwater geochemistry, geomicrobiology, and water resource investigation and management. He can be reached at email@example.com.