Why You Need to Know About Municipal Water Chemistry

By: Rob Janowiak on December 4th, 2019


For 45 years, federal law has required potable water suppliers to protect their customers from waterborne pathogens while minimizing health risks from disinfection and its byproducts. The Safe Drinking Water Act of 1974 authorizes the U.S. Environmental Protection Agency to establish minimum standards to protect tap water, and requires all owners and operators of public water systems to comply with these health-related standards.

A majority of municipal systems use chlorine and chlorine-based disinfectants, and for a simple reason: after their initial application to destroy existing waterborne pathogens, these disinfectants leave behind a residual level of protection that guards against microbial recontamination after initial treatment. The EPA requires all water treatment facilities to keep the Maximum Residual Disinfectant Level (MRDL), calculated as a running annual average, at no more than 4.0 mg/L.
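Because the MRDL is an annual average rather than an instantaneous cap, a single monthly reading above 4.0 mg/L does not by itself constitute an exceedance. A minimal sketch of that averaging logic (the function name and sample readings are illustrative, not from any utility's data):

```python
# Hypothetical compliance check: the EPA MRDL for chlorine is expressed
# as a running annual average, so we average the most recent 12 monthly
# residual readings and compare against 4.0 mg/L.

MRDL_MG_PER_L = 4.0

def running_annual_average(monthly_residuals_mg_l):
    """Average the most recent 12 monthly residual readings (mg/L)."""
    recent = monthly_residuals_mg_l[-12:]
    return sum(recent) / len(recent)

# Illustrative readings: one month (4.1) exceeds 4.0, but the average does not.
readings = [3.2, 3.5, 3.8, 4.1, 3.9, 3.6, 3.4, 3.3, 3.7, 3.8, 3.5, 3.4]
avg = running_annual_average(readings)
print(f"Running annual average: {avg:.2f} mg/L "
      f"({'within' if avg <= MRDL_MG_PER_L else 'exceeds'} MRDL)")
```

Note how the April-style spike to 4.1 mg/L is absorbed by the lower readings around it; only a sustained elevation would push the annual average past the limit.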

According to the Centers for Disease Control and Prevention, when chlorine is added to water, some of it will react with organic materials and metals in the water and will never be available for disinfection. This is called the “chlorine demand” of the water. What remains is referred to as “total chlorine,” which is then further divided into (1) the amount of chlorine that has reacted with nitrates and is unavailable for disinfection, or “combined chlorine,” and (2) the residual chlorine available to inactivate disease-causing organisms and assure the potability of water, known as “free chlorine.”
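The relationships above are a simple mass balance, which a short worked example makes concrete. The numbers below are purely illustrative, not measured data:

```python
# Hypothetical worked example of the CDC terminology above.
# All figures are illustrative mg/L values, not measurements.

applied_dose = 2.0       # chlorine added at the plant
chlorine_demand = 0.5    # consumed by organics and metals; never available
combined_chlorine = 0.4  # bound up after reacting with nitrogen compounds

# Total chlorine is what survives the initial chlorine demand;
# free chlorine is the portion of that still available for disinfection.
total_chlorine = applied_dose - chlorine_demand     # 1.5 mg/L
free_chlorine = total_chlorine - combined_chlorine  # 1.1 mg/L residual

print(f"total chlorine: {total_chlorine:.1f} mg/L")
print(f"free chlorine:  {free_chlorine:.1f} mg/L")
```

It is this last quantity, the free chlorine residual, that the rest of the discussion turns on.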


The Impact of Complex and Changing Water Chemistries

Why is this important to understand? It is the free chlorine residual that may be subject to ongoing treatment with secondary chemicals by the municipal water utility to maintain microbiologically clean water according to accepted standards. Every supplier has a program to do this, based on cost, efficacy, stability, ease of application, taste and odor considerations, and formation of byproducts. The chemistries employed can change at any time as circumstances dictate. 

A survey conducted by the American Water Works Association found that an increasing share of water distribution systems, especially large ones, use chloramine rather than free chlorine as their residual disinfectant. In fact, more than one in five Americans uses drinking water treated with chloramine. There are many reasons why chloramine might be chosen, but a major one is control of disinfection byproducts (DBPs), which form when free chlorine reacts with naturally occurring organic matter in the water supply.

This is where drinking water chemistry comes to a crossroads. Using chloramine as the residual disinfectant can also give rise to a microbial process called nitrification, in which bacteria convert ammonia and other nitrogen compounds into nitrite and nitrate. This in turn undermines the efficacy of the water treatment protocol in ways that must be addressed with further treatment. The problem tends to flourish when water temperatures are warm and water usage is abnormally low. In addition, corroding pipes and equipment provide refuges where nitrifying bacteria can elude the residual disinfectant. Thus, water plants must work diligently and continuously to prevent and control nitrification before water is distributed – not only because it affects water quality, but also because it affects delivery infrastructure when those pipes are metallic.


Why Engineers Should Be Concerned

Traditionally, when plumbing and mechanical engineers prepare specifications for the hot and cold water piping in a new building, the material selection rarely includes an assessment of how the water supply will be disinfected, or even an awareness that disinfection protocols change. This is a problem, even without accounting for municipal disinfection variables, because certain benign bacteria naturally exist in the water supply in the form of biofilm. While not a concern in itself, biofilm that builds up, as it can in metallic systems, provides the habitat and nutrients that make it an ideal breeding ground for scale formation and microbiologically induced corrosion.

Yet after the building is commissioned and in service, oversight of biofilm buildup is typically left to facility managers and maintenance professionals. The surface roughness endemic to aging metallic systems makes awareness of water supply management that much more of an issue for engineers, because the guiding principles for combating waterborne health problems through water management, as defined by the CDC, aren’t just a concern of the utility:

  1. Maintaining water temperatures outside the ideal range for Legionella growth
  2. Preventing water stagnation
  3. Ensuring adequate disinfection
  4. Maintaining devices (such as piping) to prevent scale, corrosion and biofilm growth.
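The first principle above can be sketched as a simple threshold check. The 77–113°F band used here is the Legionella growth range the CDC commonly cites; the sensor names, sample temperatures, and exact thresholds are illustrative assumptions, and a real water management program would set its own limits:

```python
# Minimal sketch of the CDC's first principle: flag temperature readings
# that fall inside the band where Legionella grows readily.
# The 77-113 F (roughly 25-45 C) range is CDC's commonly cited growth
# band; thresholds and sensor names here are illustrative only.

LEGIONELLA_GROWTH_F = (77.0, 113.0)

def in_growth_range(temp_f):
    """Return True if a temperature (deg F) falls in the growth band."""
    lo, hi = LEGIONELLA_GROWTH_F
    return lo <= temp_f <= hi

# Illustrative building readings: a cold riser, a hot-water return loop,
# and a water heater outlet.
for sensor, temp in [("cold riser", 68.0),
                     ("hot return", 105.0),
                     ("heater outlet", 140.0)]:
    status = "RISK: within growth range" if in_growth_range(temp) else "ok"
    print(f"{sensor}: {temp:.0f} F -> {status}")
```

Note that the flagged reading is the hot-water return at 105°F: hot systems allowed to drift below their setpoint, not just tepid cold lines, are where the growth band is most often entered.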

Instead, with the advent of ASHRAE 188, “Legionellosis: Risk Management for Building Water Systems,” we have entered a phase in the evolution of building science and design where engineers will increasingly be asked for guidance on optimizing piping system integrity in the face of routine disinfection changes at the utility and continuing outbreaks of Legionnaires’ disease. CDC investigations show that almost all Legionella-driven events stem from problems that could have been prevented with more effective water management. Complicating the situation, and keeping Legionella bacteria in the foreground of concern, are several other trends, including:

  • Water conservation strategies that often involve low-flow plumbing fixtures
  • Energy conservation strategies that may keep operating temperatures lower than they should be
  • Longer periods of dormant water before system startup, allowing bacteria to grow.

When it comes to the dizzying complexities of water quality management, not all piping materials are the same. This is where selecting CPVC for the plumbing design becomes a strategic choice, because CPVC is unaffected by all forms of chlorine-based disinfection. Maintaining a smooth, consistent surface throughout its service life, CPVC carries an inherent safeguard against pitting corrosion, scale and biofilm proliferation, and it can withstand ASHRAE 188 decontamination procedures that call for very high temperatures and/or highly chlorinated water. In short, across nearly 60 years of handling hot chlorinated drinking water, the benefits of CPVC have been proven, and its water quality-protective properties should endure for the design life of the system – good news for engineers facing new demands in risk assessments of the drinking water systems they design.



This blog post originally appeared on pmengineer.com.