THIS IS HOW WE DO IT

Posted September 24, 2014 by waterbloguer
Categories: Uncategorized

The last post asked whether Dripping Springs and the developers there could bust out of their 19th century approach to water resources management. In this post, we look at how they can do that, reviewing a plan for a decentralized concept wastewater system with the reclaimed water used for irrigation. And in the next post we’ll look at a stormwater management plan based on distributed Low-Impact Development (LID) techniques, integrated with rainwater harvesting to also supply irrigation water. This integrated water management strategy is a 21st century approach.

The over-arching goal of employing these strategies to address our 21st century water challenges is to take irrigation almost entirely off the potable water system. Averaged over the year, about 40% of projected water demands would typically be for irrigation. So if we attain that aim, the water actually required for new development could be only 60% of the currently projected demand. That would drastically reduce the strain on existing supplies from local groundwater and the Highland Lakes, and it would blunt the need to develop new supplies, like schemes to import water to this area from the Simsboro Aquifer – schemes that would be very expensive and likely unsustainable over the long term, as reviewed in “One More Generation”. Decentralized concept wastewater systems and LID stormwater management can therefore help move us toward sustainable water in this region.

A neighborhood in the proposed Headwaters project is used to offer an example of these techniques. This is one of the developments in the hinterlands around Dripping Springs to which the city is proposing to extend an interceptor and take their wastewater “away”. An overall draft plan for Headwaters is shown below. The neighborhood we’ll focus on is along the first side street you come to as you run on down the main road entering the project off U.S. Hwy. 290. This street and one cul-de-sac off of it are fronted by 29 houses.

[Image: Headwaters conceptual yield study – full plan, 2014]

 

DECENTRALIZED CONCEPT WASTEWATER SYSTEM

Before proceeding to review the details within this neighborhood, let’s look at how the decentralized concept strategy works with the “time value of money”. The development-scale conventional centralized wastewater system currently permitted for this project would have the treatment plant located in the lowest area down toward Barton Creek. So if the development were to begin with lots on the “higher” end, closer to Hwy. 290 to minimize the length of the main road and waterline to be initially installed, then they’d have to build a long run of wastewater interceptor main down to the treatment plant site, all of which would have to be sized to carry the flow from all of the development that would eventually occur along its route. This would impart a large “carrying charge” since much of that development wouldn’t be built for many years, thus much of that investment would lie “idle” for a long time. If, on the other hand, the first neighborhoods to be developed were down close to the treatment plant site, then a long run of road and waterline would have to be built, also imparting a “carrying charge”. As we will see, under the decentralized concept strategy, the whole wastewater system for each neighborhood is self-contained, built on a “just in time” basis, so one could start development anywhere desired without incurring “carrying charges” for that function.
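To make that “time value of money” point concrete, below is a minimal sketch of the present-value arithmetic. Every figure in it – the discount rate, the costs, the build-out pace – is a hypothetical placeholder chosen for illustration, not an estimate from any actual plan.

```python
# A hypothetical illustration of the "carrying charge" effect: the present
# value of capital spent up front vs. "just in time". All figures are
# placeholders, not estimates from any actual plan.

def present_value(cash_flows, rate):
    """Discount a list of (year, cost) pairs back to year 0."""
    return sum(cost / (1 + rate) ** year for year, cost in cash_flows)

rate = 0.05  # assumed annual cost of capital

# Centralized: an interceptor sized for buildout, all spent in year 0.
centralized = [(0, 5_000_000)]

# Decentralized: ten self-contained neighborhood systems of equal total
# cost, one built every other year as each neighborhood is developed.
decentralized = [(2 * i, 500_000) for i in range(10)]

print(f"Up-front PV:      ${present_value(centralized, rate):,.0f}")
print(f"Just-in-time PV:  ${present_value(decentralized, rate):,.0f}")
# Same nominal spend, but deferring most of it substantially reduces the
# present value -- money not yet sunk in the ground isn't lying "idle".
```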

Back to that example neighborhood, a plan approximating its layout is shown below, along with a decentralized concept wastewater system to serve those 29 houses. The plan utilizes the three essential tools of the decentralized concept to simplify the system, to make it more robust, to reduce its costs, and to maximize water utilization efficiency:

  • Effluent sewerage.
  • “Fail-safe” treatment.
  • Beneficial reuse of the water resource.

[Image: Headwaters neighborhood wastewater sketch plan]

Effluent Sewers – Simpler and Less Costly

In an effluent sewer system, wastewater from a house runs through the house drain into an interceptor tank (primary septic tank). These interceptor tanks hold and digest the settleable solids, so only a liquefied effluent runs on to the treatment unit. This allows use of small-diameter effluent sewer lines, which can run at very small and variable grades, typically with the lay of the land, in shallow, narrow trenches.

The interested reader can find a thorough review of the effluent sewer concept and its advantages here. But basically we use it because it is less expensive than conventional large-pipe sewers that have to run on larger and uniform grades, and it creates a simple, easy sludge management system – just pumping the interceptor tanks at multi-year intervals. Because the whole system is contained within the neighborhood, the overall cost of the collection system would be significantly less than a conventional collection system. That system would include not only the more costly lines within this neighborhood but also the large interceptor mains outside the neighborhood leading to the centralized treatment plant, the lines that would incur those “carrying charges”.

Ideally the area tributary to a treatment unit would be those houses that could drain by gravity through an effluent sewer system to the plant location. But where the topography dictates, one or more interceptor tanks could drain to a pump station, as shown in the drawing, to be pumped from there to the treatment unit, or to a point in the pipe system where gravity flow could take over. Such pump lines would also be small-diameter pipes installed in shallow, narrow trenches. Where they run parallel to the effluent gravity sewer, they would be in the same trench, so the cost of the pressure sewer would essentially be just the cost of the second pipe in that trench. Also an effluent pump station would be simpler and less problematic, and significantly less costly, than a conventional lift station, which would be needed at that point in a conventional centralized system.

“Fail-Safe” Treatment is Essential

The concept of “fail-safe” treatment bears a bit of explanation. I always use quotes in setting forth this tool, as nothing is ever completely fail-safe. Every sort of treatment unit needs proper operations and maintenance (O&M) in order to continue to function over time. However, there are some treatment technologies which, by their very nature, are resilient and robust, whose failure modes are slow and gradual, so they can consistently and reliably produce a high quality effluent even in the face of temporarily poor operating conditions. One such technology is a variant of recirculating sand filter technology that I have labeled the high performance biofiltration concept. The interested reader can go here to get a thorough rundown of how this concept works and why it is highly robust, able to run with minimal routine oversight.

That’s a sharp contrast to the inherently unstable activated sludge process that is almost exclusively used by the mainstream in their centralized plants. And it is essential to a strategy entailing many small, distributed treatment units. Using activated sludge plants as distributed treatment units would be a disaster, as the O&M liabilities would be untenable. The high performance biofiltration concept, however, can run with little active oversight over long periods, so policing multiple plant sites would not create a great burden. This simple operating concept also uses far less energy than an activated sludge plant.

This treatment system will consistently and reliably produce an effluent quality better than that produced by most municipal treatment plants, including removing well more than half of the nitrogen from the wastewater. Recall from the last post that nitrogen was identified as a problematic pollutant, so care must be taken to remove it before any of the reclaimed water might seep into a creek.

Maximizing Irrigation Reuse, Minimizing Pollution

The reclaimed water coming out of the treatment unit would be routed into subsurface drip irrigation fields, arrayed as much as possible to irrigate areas that would be irrigated in any case, so maximizing the reuse value of this water resource. As the plan shows, much of the reclaimed water feed pipe could run in a common trench with the effluent sewer pipes. So the cost of much of this distribution system would basically be just the cost of the second pipe in that trench.

As noted, this decentralized concept plan aims to take irrigation demands off the potable water system. The plan shows dispersal of the reclaimed water is focused on front and side yards and parkways, in the “public” spaces, leaving the back yards – the “private” spaces – unencumbered by the drip fields, allowing the owners to install patios, pools, etc., there. (As will be reviewed in the next post, those private areas could be irrigated with harvested rainwater, integrating that into the stormwater management scheme, so taking that irrigation off the potable water system as well.)

The area of drip irrigation field is based on a design hydraulic application rate onto the drip fields of 0.1 gallon per square foot per day, about the average year-round evapotranspiration rate in this area. This plan provides more than enough space in areas that could be beneficially irrigated to meet that criterion. Note however this rate dictates that, on average, the field would be under-loaded through the heat of the summer and over-loaded through the winter. Thus, the drip field would act like a “drainfield” through part of the year, and indeed on any days at any time of year when a significant amount of rain fell. This may raise a concern about public health and environmental protection.
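Before addressing that concern, here is how the 0.1 gal/sq ft/day criterion translates into drip field area – a minimal sketch; the per-house wastewater flow is an assumed placeholder, not a design figure from the actual plan:

```python
# Rough drip field sizing for the 29-house example neighborhood under the
# stated 0.1 gal/sq ft/day hydraulic application rate. The per-house flow
# is an assumed placeholder; an actual design flow would come from the
# engineering plan and TCEQ criteria.

houses = 29
flow_per_house_gpd = 200       # assumed average wastewater flow, gal/day
application_rate = 0.1         # design rate from the text, gal/sq ft/day

total_flow_gpd = houses * flow_per_house_gpd
field_area_sqft = total_flow_gpd / application_rate

print(f"Design flow:       {total_flow_gpd:,} gal/day")
print(f"Drip field needed: {field_area_sqft:,.0f} sq ft "
      f"({field_area_sqft / 43_560:.2f} acres)")
print(f"Per house:         {field_area_sqft / houses:,.0f} sq ft")
```

Under these assumptions, each house needs on the order of 2,000 sq. ft. of drip field – an area readily found in front and side yards and parkways.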

With the high quality effluent produced by the high performance biofiltration concept, and adding on UV (ultraviolet) disinfection as a safety factor, subsurface dispersal would pose no public health hazard. The water would be sequestered below the surface so contact potential would be extremely low. And any water that percolates through the soil, perhaps eventually to emerge at seeps as is likely in this topography, would have been completely “renovated” by passage through the improved soil that would be installed over the drip field areas.

Regarding environmental protection, the nitrogen concentration of the wastewater will be knocked down in the treatment unit, so it would be loaded at a rate more closely matching the uptake of nitrogen by plants covering the drip fields. Over the annual cycle the vast majority of the reclaimed water entering the drip irrigation fields would exit by way of evapotranspiration into the air instead of percolation down into the soil, where it could perhaps migrate to seeps and on into streams. So the potential mass loading of nitrogen into streams would be inherently very low. In any case, it can be expected that most of the nitrogen in the reclaimed water that is not taken up by plants would be eliminated by in-soil denitrification, gassing off the nitrogen into the atmosphere in the same manner it is eliminated in the treatment process. The soil is also by far the best medium for eliminating/assimilating the contaminants of emerging concern, such as pharmaceuticals, which would be so very problematic if the effluent were discharged to a stream.

Indeed, critics of dispersing the reclaimed water in uncontrolled access areas as shown in the plan – front and side yards, parkways, a park – would be hard pressed to show why this would be unsound practice in regard to either public health or environmental protection. As reviewed in “Slashing pollution, saving water – the classic win-win (but ignored by society)”, our controlling institutions allow – indeed they support – the spewing of water over the surface that has been questionably treated in home-sized activated sludge units subjected to a meaningless level of oversight, and then is rather questionably disinfected in a drop-feed tablet chlorinator, all over Hill Country watersheds. And many other houses have subsurface drainfields dispersing water into the soil, with no organized oversight at all. Contrast that with what is posed here – a professionally managed, highly robust and resilient treatment unit with subsurface dispersal at irrigation rates after highly effective UV disinfection – inherently far less problematic.

By the same token, concerns about marketability of a development with this sort of wastewater management system ring hollow. Again, there are all those houses being sold with that (smelly) activated sludge unit sitting right next to the house, with the poorly treated water sprayed around the lot. Whole large subdivisions, including some within Dripping Springs’ jurisdiction, employ that as the wastewater management strategy, and the builders don’t seem to be batting an eye. Indeed, it is the builders who insist upon installing the relatively cheap activated sludge and spray dispersal system, who refuse to consider using a “fail-safe” treatment unit and drip irrigation. So to suggest that the decentralized concept scheme would negatively impact marketability is disingenuous, to say the least.

Back to real issues, irrigation of front and side yards with the reclaimed water would relieve the homeowners of water bills to irrigate these spaces. But, as noted, the amount of water each house would produce as wastewater would leave these areas under-loaded through the peak irrigation season IF conventional turf and/or high-water-demand “exotic” plants were used to create those front yard landscapes. This suggests another strategy to match the needs of the landscape to the water made available through the wastewater system – a regionally appropriate landscaping aesthetic/ethic. A front yard landscape employing native and native-adapted plants, such as shown in the picture below, could thrive on the amount of water the wastewater system could provide, so no draw on the potable water system for additional irrigation would be needed through the peak irrigation season.

[Image: front yard landscape of native and native-adapted plants]

This sort of landscape might be institutionalized as the “face” of this development, displacing the sterile patch of water-guzzling turf that is the “stock” aesthetic in such places. The developer might deliver the home to the buyer with a basic native plant palette in place over the mulched and improved soil bed, as required to support drip dispersal of reclaimed water. The homeowner might be given an account at a participating native plant nursery and some assistance/instruction in native plants so that he/she can enhance the landscape as desired, creating buy-in to this aesthetic.

What Does It Cost?

Rough cost estimates were made for the effluent sewer system, the treatment unit, and the reclaimed water feed system shown in this neighborhood plan, yielding an estimate of about $8,000/house. To complete the entire system, the cost of the drip irrigation fields would have to be added. It is open to question, however, whether those are not, at least in part, costs that would be borne in any case, given that much of the area shown on the plan as irrigation field might be irrigated anyway. It can be argued that, in this terrain, amended soil would cover the front yards and parkways to support improved landscaping without regard to whether it would be needed to support environmentally sound drip dispersal of the reclaimed water. Indeed, minimum soil depth on lots for landscaping is required under the Fish & Wildlife MOU that would allow this development to use water delivered by the Hwy. 290 pipeline from Lake Travis. And installing drip lines in this amended soil would not be significantly more costly than a spray irrigation system, which the drip fields displace.

While hard to compare without more details than I currently have, it is expected that these costs compare well with what would be needed to implement the conventional centralized system. The conventional collection system within this neighborhood would incur a similar cost to the effluent sewer system within it, with the interceptor tanks included, and then to that you’d have to add a share of the cost of the interceptors and lift stations needed to get wastewater from this neighborhood down to the centralized treatment plant. That plant would no doubt cost less per gallon per day of capacity than the small decentralized plant, but here again the centralized plant would be sized for the flow at buildout. So the total cost would be much greater, with the capacity that would not be fully utilized for many years imparting a “carrying charge”.

Also, under the centralized plan, the dispersal of the treated water would be a “land dumping” operation, with the cost of the dispersal system not providing any benefit other than making that water go “away”. So the entire reclaimed water distribution system and the entire dispersal system would all be extra costs, instead of displacing irrigation systems that would otherwise be installed anyway. And all that water would be wasted, while the homeowners purchase potable water to run their irrigation systems.

If, instead of implementing their development-scale conventional centralized system, the developers connected to the City of Dripping Springs system, it does not appear that their cost situation would be much, if any, better. The estimated cost of the “east” interceptor that would receive wastewater from Headwaters is $7.78 million (per the Dripping Springs PERP dated July 2013). While of course that interceptor would eventually serve other development, its major reason for being in the Dripping Springs plan is to incorporate Headwaters into the city’s proposed conventional centralized system. Dividing that cost by the planned 1,000 lots in Headwaters, the cost per lot is $7,780, by itself almost as much as the rough estimate for collection and treatment of the wastewater and redistribution of reclaimed water under the decentralized plan. This would be in addition to the internal sewer network, including several lift stations, within Headwaters. There’d also be a charge for buy-in to the city’s treatment capacity.

Of course, this all presumes that the city could have that interceptor and the lift station(s) associated with it on line, as well as its treatment plant expanded, before the first house in Headwaters becomes occupied. As Headwaters has filed a preliminary plan for its first phase of 208 lots, that is an open question. Dripping Springs has not yet even released its revised PERP, thus has not even begun the permitting process at TCEQ, which can be expected to run about a year – if all goes well, that is; it could take longer.

Also note that it would be only those 208 lots, not 1,000 lots, that the developer could spread the buy-in costs over. But the entire $7.78 million must be put in the ground up front, along with the $8+ million for the treatment plant expansion. And the estimated cost for permitting is another $1 million in up front money. It will not be the developer who will be prevailed upon to cover all these costs, rather they will be covered by bonds, the payments for which will doubtless be spread over the entire city’s ratepayer base. So there is an aspect of social equity to be considered here too, as existing ratepayers will be required to help pay the costs incurred due to growth. Which, it has been asserted, will never pay back through tax revenues what it costs to install and maintain the infrastructure needed to actuate it, at least if that continues to be the conventional infrastructure.
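The per-lot arithmetic behind those figures is worth laying out. A minimal sketch, using only the interceptor cost cited above; note that the plant expansion and permitting serve the wider system, so only some share of those would land on Headwaters:

```python
# Per-lot share of the "east" interceptor cost (Dripping Springs PERP,
# July 2013), spread over planned buildout vs. the first phase only.

east_interceptor = 7_780_000   # interceptor that would serve Headwaters

for lots in (1_000, 208):      # planned buildout vs. 208-lot first phase
    print(f"Spread over {lots:>5,} lots: ${east_interceptor / lots:,.0f} per lot")

# Over 1,000 lots that's $7,780/lot -- by itself nearly the ~$8,000/house
# rough estimate for the complete decentralized system. Over only the 208
# first-phase lots it balloons to about $37,400/lot.
```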

A note about operations and maintenance. Understand that all of the decentralized concept systems would be under unified management, either by the MUD organized by the developer or as an integral part of the Dripping Springs wastewater system. While it won’t be belabored here, it is expected that the O&M costs of the decentralized concept systems would be less, perhaps significantly so, than for the conventional centralized system.

All things considered, the price of the decentralized concept system appears likely to be a sweet deal for the developer, even without taking into account the “time value of money” benefits of installing the wastewater system infrastructure on a “just in time” basis, to serve only the neighborhoods slated for imminent development. And because of the social equity issues, it would be a sweet deal for the existing citizens of Dripping Springs as well.

A WIN-WIN-WIN

Then add on the reuse benefits, displacing irrigation demand from the potable water system, which will further benefit all the citizens of the area by delaying, perhaps obviating, the need to implement a very costly long-distance “California-style” water transfer scheme that would greatly increase water rates. Altogether this is a win-win-win. So clearly it would be in the interests of Dripping Springs and the developers there to give meaningful consideration to a decentralized concept wastewater management strategy.

 


Can Dripping Springs, and developers there, bust out of the 19th century?

Posted September 8, 2014 by waterbloguer
Categories: Uncategorized

Or will they choose to remain stuck there? Because, you know, that is a choice they are free to make.

It’s a simple proposition, really. If your aim is to maximize use of the water resource we mistakenly call “wastewater” to defray demands on the area’s water supplies, then it just makes sense to design the “waste” water system around that principle. It doesn’t make sense to instead use a large majority of the money dedicated to this function to build a large-scale system of pipes and pump stations focused on making what’s misperceived as a nuisance go “away”, then to spend even more money on another large-scale system of pipes and pumps to run the reclaimed water back to where it came from in the first place!

That’s the standard MO of our mainstream institutions, like the City of Dripping Springs and the engineers who advise it and developers whose projects would feed into the city’s centralized wastewater system. This centralized management concept was a response to the conditions considered paramount in the 19th century. The industrial revolution was in full force, city populations were exploding, the stuff was littering the streets, creating a stench and a serious threat of epidemic disease. The response was to pipe it “away”, to be deposited in the most conveniently available water body. Later, as it was realized those water bodies were being turned into foul open sewers, creating a threat of disease in downstream cities that withdrew their water supplies from them, treatment at the end of the pipe was considered, and eventually adopted as the standard.

The intellectual leadership of the centralized pipe-it-away strategy was centered in well-watered areas like northern Europe and the northeastern and midwestern areas of the US. So the resource value of that “waste” water was never part of the equation. This water, and the nutrients it contains, was viewed solely and exclusively as a nuisance, to be made to go to that magical place we call “away” – the working definition of which is apparently “no longer noticeable by me.” This centralized pipe-it-away strategy became institutionalized as the manner in which cities manage wastewater.

Of course, that strategy flies in the face of the circumstances confronting us here in Central Texas in the 21st century – that water, all water, is a valuable resource which we can no longer afford to so cavalierly waste by addressing it solely and exclusively as if it were just a nuisance, simply because that is what the prevailing mental model dictates. Rather, it’s imperative we practically maximize the resource value of that water, using it to defray demands on the area’s water supplies, which are being stressed by both chronic drought and population growth.

In the Texas Hill Country, we also have an issue with surface discharge of wastewater, even when treated to the highest standards that the Texas Commission on Environmental Quality (TCEQ) has so far formulated. And before proceeding I’d note that this issue would remain even if the whole system were to operate perfectly all the time. But of course, it will not; there will inevitably be “incidents”. Which brings up the issue of the vulnerability created by centralization. I’ve often said, not entirely tongue-in-cheek, that the real point of regionalization – TCEQ-speak for centralizing flow from as far and wide as can be attained – is to gather all this stuff together at one point where it can really do some damage. Indeed, the whole organizational strategy is a “vulnerability magnet”. Large flows being run through one treatment plant or one lift station or one transmission main means that any mishap may create large impacts.

Back to the issue with discharge in the Hill Country, the major problem is those nutrients in the wastewater, in particular nitrogen. A discharge of the magnitude that an expanded Dripping Springs system would create, centralizing wastewater flow from developments for miles around the city in every direction, would make the receiving stream effluent-dominated. This would be partly an artifact of the drawdown of local aquifers drying up springs and thus reducing natural streamflow – again highlighting how critical it is to defray demands on these local water resources – but in larger part due simply to the magnitude of the wastewater flow. This highlights the problematic nature of “permitted pollution” once the flow has been centralized: even with low concentration limits, the mass loadings may still be “large”. The nitrogen would cause chronic algal blooms in the creeks, making them very green most of the time, and then depleting oxygen in the water when the algae die off, degrading the riparian environment.
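The standard mass-loading arithmetic shows why centralizing the flow matters even when concentration limits are low: load (lbs/day) = flow (MGD) × concentration (mg/L) × 8.34. A minimal sketch, with hypothetical flow and nitrogen figures for illustration only:

```python
# Mass loading (lbs/day) = flow (MGD) x concentration (mg/L) x 8.34.
# The flows and nitrogen level below are hypothetical illustrations,
# not values from any permit.

def mass_load_lbs_per_day(flow_mgd, conc_mg_per_l):
    return flow_mgd * conc_mg_per_l * 8.34

total_n = 6.0  # mg/L effluent total nitrogen -- a fairly stringent limit

for flow_mgd in (0.1, 0.5, 1.0):  # one small plant vs. centralized flows
    load = mass_load_lbs_per_day(flow_mgd, total_n)
    print(f"{flow_mgd:>4} MGD at {total_n} mg/L N -> {load:5.1f} lbs/day "
          f"({load * 365 / 2000:.1f} tons/year)")
# The concentration limit is identical in every case; centralizing the flow
# is what multiplies the mass of nitrogen delivered to a single creek.
```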

Downstream landowners deem those chronic algal blooms an aesthetic affront. But even more critical, the stream that would receive Dripping Springs’ discharge is Onion Creek, a major source of recharge to the Edwards Aquifer. That’s a sole source aquifer supplying water to about 60,000 people and is the source of Barton Springs, which is home to endangered species. So there’s great antipathy to any plan by Dripping Springs to discharge.

The “standard” option is to continue to “land apply” the effluent from its wastewater treatment plant – “irrigating” land for the sole purpose of making the water go “away” rather than to enhance the landscape or grow a cash crop – which the city does under its current permit. This practice is more accurately termed “land dumping”, and in this region, in this time, it is an unconscionable waste of this water resource.

At least discharge would have some utility, providing more constant flow in the creek, enhancing the riparian environment, and a more constant recharge of the Edwards Aquifer. That is, it would have utility if the water were to be treated to a standard that would preclude the “insults” noted above.

In regard to nutrients, that is technically possible – albeit unlikely to be required by TCEQ – but it would be quite expensive. Burnet discovered that treating to a higher standard to allow them to discharge into Hamilton Creek, which eventually flows into the Highland Lakes, would add about $10 million to the cost of their treatment plant. But that still won’t attain the high removal rate demanded for discharge into Hill Country creeks that recharge the Edwards Aquifer.

But nutrients aren’t all there is to be concerned about. There are also “contaminants of emerging concern” – pharmaceuticals, in particular endocrine disruptors. What it would cost to make discharge “safe” in this regard is an open question – another subject for another time. Suffice it to note here that TCEQ has no standards addressing these pollutants, thus there is no requirement to even consider what might be “safe”.

The latest word is that the overwhelming dissatisfaction with a discharge scheme has pushed Dripping Springs to drop its plans to seek a discharge permit – for the present. It’s unclear if that means it would just expand its “land dumping” system (a rather costly proposition, due to the land requirements, so Dripping Springs might soon decide that’s just too expensive and would request a permit to discharge). Or would the city pursue any and all opportunities to route the treated effluent to beneficial reuse? That reuse would likely be mainly within the developments generating the flow, as few other opportunities have been identified – the 8-acre city park being the only one mentioned in the version of the Preliminary Engineering Planning Report (PERP) the city released last summer.

Which brings us to how the city would create a system plan predicated on beneficial reuse of this water resource to defray demand on other water supplies. The city appears to be leaning toward simply appending onto the already costly 19th century conventional centralized wastewater system another whole set of costly infrastructure to redistribute the water, once treated, back to the development that generated it. Note, however, that as TCEQ presently interprets its rules, the city will still be required to have a full-blown “disposal” system in place regardless of how much of that water they expect to route to beneficial reuse, making that whole concept somewhat problematic if indeed no discharge option would be sought. This focus of TCEQ rules, as currently applied, on “disposal” of a perceived nuisance, to the exclusion of focusing on integrated management of water resources, is an issue for any sort of plan the city may consider, highlighting the need to press TCEQ to reconsider that focus.

Indeed the city’s centralized plan would be costly. Dripping Springs is keeping its present engineering analyses close to the vest, but according to the version of the PERP released last summer, the three interceptor mains in that plan – denoted “east”, “west” and “south” (leaving us to wonder what will be done with development that may occur to the north) – and their associated lift stations would have a total cost of about $17.5 million. These are costs, along with the estimated $8.1 million for treatment plant expansion and an estimated $1 million for permitting, that must be sunk into the system prior to being able to provide service to the first house in the developments this system would cover. Then there is the cost of centralized collection infrastructure within the developments, to get their wastewater to those interceptors, no doubt running into the tens of millions at complete buildout.

And for this, all they get is “disposal” of a perceived nuisance!

And, as noted, the issue of how the water would be “disposed of”, if it is not discharged, is still to be resolved – and paid for. If it is to be redistributed back to the far flung developments generating the flow, the facilities to do that will add many more millions to the overall cost of the complete system.

Far less costly, in both up-front and long-term costs, would be the creation of a 21st century system that would be designed around reuse, rather than “disposal”, of this water resource right from its point of generation. The city could pursue a decentralized concept strategy, focused on treatment and reuse of this water as close to where it is generated as practical, obviating the high cost of both the conventional centralized collection system and the reclaimed water distribution system.

Because it entails a number of small-scale systems designed into rather than appended onto development, it is highly doubtful that the city could unilaterally impose that sort of system. The large developments around Dripping Springs are all planning – indeed they have obtained TCEQ permits for – smaller conventional centralized systems within each of them, featuring “land dumping” as the intended fate of the water. In fact, Dripping Springs has “sponsored” the permit for one of those developments, so is actively promoting this strategy. The development agreement with another large project specifies that the wastewater generated in that development must be run into the city interceptor whenever it is built, despite the development-scale system being in place. So if the city does develop interceptors that would drain wastewater from those developments to an expanded centralized plant, then these development-scale systems would be stranded assets, sunk costs incurred simply to allow development to begin prior to completion of the city interceptor, then to be abandoned, basically wasting the fiscal resources required to install them.

It’s clear then that Dripping Springs could pursue a decentralized concept strategy to expand service capacity to encompass those developments only if each of them were to cooperate in planning, designing, permitting and implementing the decentralized system, instead of those development-scale centralized systems they’re presently planning to build. But of course, unless Dripping Springs presumes a leadership role, the developers have no impetus to consider that. They must presume they’d have to abandon any sort of development-scale system and run their wastewater “away” into the city’s centralized system whenever interceptors were extended to their properties.

To pursue a decentralized concept strategy it must be determined how such a system would be organized and how it could be permitted, given the “disposal”-centric focus of how TCEQ wields its rule system. This is a complex subject that does not lend itself well to this medium. It is complicated by the decentralized concept remaining “non-mainstream” despite having been out there for quite a long time – I defined the decentralized concept in 1986, and it was “ratified” as a fully legitimate strategy in a 1996 report to Congress, among other milestones – so its means and methods remain largely unfamiliar to regulators, engineers and operating authorities. Further, being designed into rather than appended onto development, the details would be sensitive to context; while there are recognized organizing principles, there is no “one size fits all” formula.

For the interested reader, a broad overview is “The Decentralized Concept of Wastewater Management” (in the “Decentralized Concept” menu at www.venhuizen-ww.com), and a basic review of those organizing principles is set forth in this document, reviewing wastewater management options in the nearby community of Wimberley. But a review of exactly how to design a decentralized concept system for any given project in and around Dripping Springs is properly the subject of a PERP for each project, not something that can be credibly described here, absent any context. The means and methods are, however, all well understood technologies that can readily be implemented to cost efficiently maximize reuse of this water resource. [Note that a stab at detailing exactly how to do a decentralized concept system in the context of one of the developments in Dripping Springs’ hinterlands is offered in the next post.]

It bears highlighting that the most salient feature of a decentralized concept strategy in the context of this region is the “short-stopping” of the long water loops characteristic of the conventional centralized strategy, so that reuse of the water resource would be maximized at the least cost. It is this 21st century imperative that should motivate Dripping Springs and the developers working in that area to explore the decentralized concept. A necessary part of that exploration is to press TCEQ to consider how it interprets and applies its present rules, and perhaps to consider the need for “better” rules that recognize our current water realities. None of this can be served up for the city or the developers as a fait accompli in this medium; it is a job they have to undertake. One which we all need them to undertake, for the benefit of this region’s citizens, current and future.

But from all indications to date, it does not appear they will even try – they just can’t seem to expand their mental model of wastewater management to encompass it. The result of which is that most of this wastewater will live down to its name for a long time to come, driving us ever further away from sustainable water. So the question is posed: Can Dripping Springs, and the developers there, bust out of the 19th century – or will they choose to remain stuck there?

 

 

Water for DFW – Building-scale rainwater harvesting vs. Marvin Nichols

Posted August 7, 2014 by waterbloguer
Categories: Uncategorized

In the last post we reviewed the potential of building-scale rainwater harvesting (RWH) as a water supply strategy in the high-growth area around Austin, in Central Texas. Here, we examine its potential in another high-growth area of Texas, the Dallas-Fort Worth area, commonly called the Metroplex. And then we will contrast that strategy with doubling down on the watershed-scale rainwater harvesting strategy, as may be represented by the proposed Marvin Nichols Reservoir.

To gain an appreciation for the potential of building-scale RWH in and around the Metroplex, modeling was executed for the following locations: Athens and Terrell to the east-southeast, Ferris closer in to the south, Cleburne to the southwest, Weatherford to the west, Bowie to the northwest, Sherman to the north-northeast, and Denton closer in to the north-northwest. Ringing the Metroplex, these locations offer an overview of conditions all around it.

As was the case for the modeling results of the Central Texas locations, it was seen that “right-sized” building-scale RWH systems around the Metroplex would have provided 97-99% of total interior supply through the recent drought period for houses modeled with a presumed average water usage rate of 45 gallons/person/day. But around the Metroplex, the “right-sized” systems would be somewhat smaller than would be required around Austin. Recall that the “right-sized” system there to serve a 4-person household would require a roofprint of 4,500 sq. ft. and a cistern volume of 35,000 gallons. In Bowie, Weatherford and Cleburne, the “right-sized” system for a 4-person household would require only 3,750 sq. ft. of roofprint, paired with a 25,000-gallon cistern in Cleburne and Weatherford and a 27,500-gallon cistern in Bowie. All other locations would require 3,250-3,500 sq. ft. of roofprint and 20,000-25,000 gallons of cistern capacity. It is expected that a one-story house plan with a 2-car garage plus a “typical” area of covered patios/porches could provide a roofprint of 3,000-3,500 sq. ft., so these modeling results indicate many houses in/around the Metroplex would not require any “extra” roofprint to be added on.
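The feasibility arithmetic behind those roofprint figures is easy to check: one inch of rain on one square foot of roof yields about 0.623 gallons. A minimal sketch, with round assumed values for rainfall and collection efficiency rather than the model’s actual inputs:

```python
# Average-year yield check for a Metroplex roofprint. One inch of rain on
# one sq ft of roof yields 0.623 gallons; the rainfall and collection
# efficiency here are round assumptions, not the model's inputs.

GAL_PER_SQFT_INCH = 0.623

roof_sqft = 3_500              # a roofprint from the results above
annual_rain_in = 36            # rough Metroplex-area average, inches/year
efficiency = 0.85              # assumed losses (first-flush, splash, etc.)

annual_yield = roof_sqft * annual_rain_in * GAL_PER_SQFT_INCH * efficiency
annual_demand = 4 * 45 * 365   # 4 people at 45 gal/person/day

print(f"Average-year yield:  {annual_yield:,.0f} gal")
print(f"Annual interior use: {annual_demand:,.0f} gal")
# Yield only modestly exceeds demand in an average year -- which is exactly
# why sizing must be tested against actual drought-period rainfall records,
# with the cistern carrying the household across the dry stretches.
```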

As reviewed in the last post, a usage rate of 45 gallons/person/day should be readily attainable by most people, given a house fitted with the current stock of water fixtures, but a lower rate could be routinely attained by people even moderately attentive to conserving water. If a usage rate of 40 gallons/person/day were routinely attained around the Metroplex, the “right-sized” systems that would have provided 97-100% of total interior supply for a 4-person household through the recent drought period would require 3,000-3,500 sq. ft. of roofprint and 17,500-20,000 gallons of cistern capacity.

Just as in Central Texas, with the baby boomers reaching retirement age and demographics tending toward more one and two-person households in all age groups, a significant part of the market might be made up of houses that could be “right-sized” for a 2-person occupancy. Modeling this occupancy around the Metroplex, at a water usage rate of 45 gallons/person/day the “right-sized” system that would have covered 97-99% of total interior demand would require 1,750-2,000 sq. ft. of roofprint and a cistern capacity of 10,000-15,000 gallons. At a water usage rate of 40 gallons/person/day, a “right-sized” system covering 97-100% of interior demand would require a roofprint of 1,750 sq. ft. and a cistern capacity of only 10,000 gallons, except for Bowie where a 12,500-gallon cistern would have been required. Since it is expected that a one-story house plan plus garage or carport and a modest area of covered patios/porches would provide about 2,000 sq. ft. of roofprint, this market could use building-scale RWH without requiring any “extra” roofprint, and would incur relatively modest cistern costs.

So the water supply potential of building-scale RWH around the Metroplex is pretty clear. Yet there is not a mention of this strategy in the planning documents of state planning Region C, the area around the Metroplex. Actually there is no respect shown for this strategy in any of the regional plans, and the state water plan explicitly dismisses it, stating, “While it is often a component of municipal water conservation programs, rainwater harvesting was not recommended as a water management strategy to meet needs since … the volume of water may not be available during drought conditions.” Which is to say that because a “right-sized” system may need 1-3% of the total supply from alternative sources during severe drought periods, this strategy is deemed not to exist at all!

This is likely due to the water planners being guided by a mental model that does not comprehend building-scale RWH as a consciously chosen broadscale strategy, as perhaps the water supply strategy in whole developments. This was the subject of an investigation, funded by the Texas Water Development Board, that I ran a couple of years ago, which brought out that this strategy confers a number of advantages relative to conventional – or watershed-scale RWH – water supply systems. One of the issues considered was provision of backup supply, but only on the basis of the “mechanics” of delivering it. Not being fettered by the mainstream’s mental model, I had never thought to question the whole strategy simply because some small amount of backup supply would no doubt be needed – indeed, the whole idea of “right-sizing” was to cover water demands in all but the worst drought periods and plan on providing a backup supply, presuming that the relieved capacity offered by building-scale RWH would make such a supply available from the sources so relieved.

Still, this raises the question of exactly where that backup supply would come from. As noted in the last post, the building-scale RWH strategy should be considered in the context of “conjunctive management”. Building-scale RWH would divert the vast majority of the demand off of the conventional sources, so decreasing the routine drawdown of those supplies, thus leaving in them the capacity to provide the small amount of backup supply. Of course, if it is presumed that any development on building-scale RWH is in addition to rather than in place of development drawing from those conventional supplies, and that this other development would be of such extent that it would tax the available supply sources during those drought periods, then there may indeed be a question of whether the capacity to provide backup supply for building-scale RWH systems would be available. It will require another whole study to examine how a conjunctive management concept could work in practice. Until the mainstream water planners can get around their mental model and recognize the inherent potential of building-scale RWH, however, it is unlikely that any such study would get funded.

Around the Metroplex, however, modeling shows that, unless the drought gets more severe than has been experienced since 2007, essentially 100% of interior demands could be provided by upsizing the roofprint and/or cistern volume only a modest amount above what is reported above. The worst case would be in Bowie, where a roofprint of 4,000 sq. ft. and a cistern capacity of 30,000 gallons would be required for a 4-person household using water at a rate of 45 gallons/person/day.

So building-scale RWH can provide interior water supply – but why should we pursue it, rather than continuing to expand and perpetuate the watershed-scale RWH strategy? Consideration of the problems and hazards of building Marvin Nichols Reservoir offers some insights into that.

Marvin Nichols Reservoir would be located in northeast Texas, about 115 miles east-northeast of the Metroplex. The Region C report offers this about that project:

“As a major reservoir project, Marvin Nichols Reservoir will have significant environmental impacts. The reservoir would inundate about 68,000 acres. The 1984 U.S. Fish and Wildlife Service Bottomland Hardwood Preservation Program classified some of the land that would be flooded as a Priority 1 bottomland hardwood site, which is “excellent quality bottomlands of high value to key waterfowl species.” … Permitting the project and developing appropriate mitigation for the unavoidable impacts will require years, and it is important that water suppliers start that process well in advance of the need for water from the project. Development of the Marvin Nichols Reservoir will require an interbasin transfer permit to bring the water from the Sulphur River Basin to the Trinity River Basin. The project will include a major water transmission system to bring the new supply to the Metroplex.”

Unstated is that many people in the area that would be impacted are highly opposed to this project, due in large part to those “unavoidable impacts.” This is a battle of economic interests – those in the Metroplex that purport a need for this water vs. those, such as the timber producers, that would be eliminated by the reservoir. Indeed, the official position of the planning process in planning Region D, where the reservoir would be located, is in opposition to the project, and it is not included in their plan. This contrasts with deriving “new” water supply from building-scale RWH, which would have positive economic impacts in Region C – benefiting businesses that would design, install and maintain the building-scale RWH systems – and no negative impacts in Region D.

As noted, utilizing in the Metroplex any of the water collected in this reservoir would require a huge investment in transmission facilities – pipelines and pump stations – and on-going operating costs to maintain them and for energy to run the pumps. Of course the water would need to be treated, also entailing considerable energy requirements. Since it takes water to make energy, this would cut into the water use efficiency from this source. And making that energy would also generate greenhouse gases, which would exacerbate the already problematic impacts of climate change on regional water resources. This contrasts with the building-scale RWH strategy, which would not require any transmission facilities and would require far less energy to treat and pressurize the water for use within the building.

As the Region C report states, it will take a long time to permit and build this reservoir and the transmission facilities, meaning delivery of the first drop of water is decades away. In contrast, the building-scale RWH strategy could begin delivering water supply immediately, and grow in lockstep with demand, one building at a time.

The passage from the Region C report refers only peripherally to the ecosystem services that flooding the land would eliminate or damage, noting only loss of habitat for “key waterfowl species”, without quantifying how critical to the well-being or survival of those species that loss may be. That of course would be sorted out in the process of preparing the environmental impact analysis that will be required as part of the permitting process, another expense that would be obviated by the building-scale RWH strategy. But those ecosystem services go well beyond their impact on birds. Eliminating the timberlands loses the oxygen production and carbon sequestration they provide, along with habitat for many other plants and animals. Forests are also important to maintaining water quality and to the storage and release of water for environmental flows, which would instead need to be provided “artificially”, with water from the reservoir of degraded quality, including thermal impacts. None of these “externalities” figure into the cost of water projected for this strategy, significantly “warping” the analysis.

There would also be significant losses from the watershed-scale rainwater harvesting system this reservoir would create. Huge evaporation losses from the reservoir would be incurred, and there would be significant losses in the transmission system. In contrast, the building-scale RWH strategy would suffer no such losses.

The Region C report also states, “… the unit cost [of the water supply the reservoir would provide] is less than that of most other major water management strategies.” At the end of the day, the overall direct cost of Marvin Nichols Reservoir and its required infrastructure might indeed be less than the aggregate direct cost of the number of building-scale RWH systems that would provide equivalent supply – a comparison which, it should be noted, the Region C report does not develop. But much of the cost of the former would need to be expended well up front of delivering the first drop of water to the Metroplex, and all that investment would be at risk. The costs of the building-scale RWH strategy, on the other hand, would be incurred incrementally, one building supply system at a time, so the delivery of supply would pretty directly track the capital requirements. This works with the “time value of money” to defray the global long-term cost of the building-scale RWH strategy. So it is not at all clear that the global cost of the Marvin Nichols option, even neglecting the externalities which the Region C report ignores, would be less.

In summary, broadscale implementation of building-scale rainwater harvesting may provide sufficient supply so that the conventional sources would be sufficiently “relieved”, allowing growth to be sustained without requiring new reservoirs. And it may do so at a cost that would be competitive with the global costs of continuing to extend and perpetuate the watershed-scale rainwater harvesting strategy, which would require going far afield to obtain additional new supply. Yet this is, quite consciously, the road not taken by the water planners in Region C. Or, as noted, anywhere else in the state where building new reservoirs, raiding remote aquifers, and other conventional supply strategies are purported to be needed to support projected growth. Time to re-evaluate?

 

Rainwater Harvesting for Water Supply – By The Numbers

Posted July 3, 2014 by waterbloguer
Categories: Uncategorized

In “Zero Net Water” the case was made for centering water supply on building-scale rainwater harvesting (RWH). Here we look in more detail into the potential of that strategy to provide water supply in Central Texas, parts of which are forecast to have considerable population growth over the next few decades. Since it is in new development where the Zero Net Water concept would be best applied, this area is a prime target for that strategy.

As reviewed in “Zero Net Water”, a modeling process was used to determine the “right-size” of a rainwater harvesting system to supply interior usage in houses. Modeling was executed presuming a 4-person occupancy in “standard” subdivisions and a 2-person occupancy in subdivisions targeted at seniors. A “right-sized” system is one that has a roofprint and cistern volume relative to the expected water demand profile such that backup supply would only be required in the worst drought years, and even then would be rather limited. This is specified so that the demand for backup supply in these houses from our “normal” supply sources would be minimized, and in recognition that a trucked-in backup supply – expected to be the dominant mode of providing that supply for a number of reasons that are not belabored here – would be stressed if backup supply requirements were not so limited.
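For readers curious about the mechanics, below is a minimal sketch of the kind of daily water-balance simulation described – not the actual model used to generate these results. A real run would load a historical daily rainfall record for the location; the collection efficiency is an assumption.

```python
# A minimal daily water-balance sketch of the kind of simulation described
# above -- not the actual model used for these results. A real run would
# load a historical daily rainfall record; efficiency is an assumption.

GAL_PER_SQFT_INCH = 0.623  # gallons per sq ft of roof per inch of rain

def simulate(daily_rain_inches, roof_sqft, cistern_gal,
             persons, gpd_per_person, efficiency=0.85):
    """Return the fraction of interior demand met from the cistern."""
    demand = persons * gpd_per_person
    stored = cistern_gal           # assume the cistern starts full
    supplied = 0.0
    for rain in daily_rain_inches:
        inflow = rain * roof_sqft * GAL_PER_SQFT_INCH * efficiency
        stored = min(stored + inflow, cistern_gal)  # overflow is lost
        draw = min(demand, stored)                  # any shortfall = backup
        stored -= draw
        supplied += draw
    return supplied / (demand * len(daily_rain_inches))

# Toy rainfall series: one inch every 30 days. A real analysis would use
# a daily record for the location spanning the drought period.
rain = ([0.0] * 29 + [1.0]) * 12
coverage = simulate(rain, roof_sqft=4_500, cistern_gal=35_000,
                    persons=4, gpd_per_person=45)
print(f"Fraction of interior demand met: {coverage:.1%}")
# "Right-sizing" = searching roofprint/cistern combinations until coverage
# over the historical drought record lands in the target range (97-98%).
```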

First we examine locations tributary to the Highland Lakes, which currently provide the water supply for Austin and much of the area around the lakes, including such fast-growing places as Bee Cave and Dripping Springs. The inherently greater efficiency of building-scale RWH vs. watershed-scale RWH noted in “Zero Net Water” is illustrated by modeling these locations in that tributary area:  Brownwood, Burnet, Fredericksburg, Llano, Menard, San Saba and Spicewood. Only in Brownwood and Menard, located further to the north and west in this area, does the modeling indicate that any backup supply would have been required after the extreme drought year of 2011, while the “right-sized” RWH systems would have provided all the interior water supply since then in all the other locations. This contrasts to how the lakes have “performed” as the watershed-scale “cistern” over that period, as they remain chronically low, not “recovering” after 2011 in the way the “right-sized” building-scale RWH systems would have.

The “right-sized” building-scale RWH systems would have provided 95-98% of the interior demands over the recent drought period at these locations. Using building-scale RWH for interior water supply would have relieved the lakes of having to provide that supply, thus they would have been drawn down more slowly if that had been a broadscale practice. So even though backup supplies to provide the 2-5% deficit may have been drawn out of the lakes – or withdrawn from streams flowing into them – the overall result would have been to significantly conserve region-wide water supply over the modeling period.

Now looking at Austin proper, and at Dripping Springs, as representative of the high-growth areas in this region, we see that a “right-sized” building-scale RWH system would have provided 96-98% of interior demands in the recent drought period through 2013. Indeed, even with 2014 having been very dry well into May, the models show that no backup supply would have been required to date in 2014 as well.

Based on a modeled demand rate of 45 gallons/person/day and an occupancy of 4 persons, “right-sized” systems for single-family homes around Austin and Dripping Springs require 4,500 sq. ft. of roofprint and a 35,000-gallon cistern to have provided 97-98% of interior demand through the current drought period. These are fairly large, and would impose significant costs, so the impact of better demand control – water conservation – was also examined.

A demand rate of 45 gallons/person/day is reported by the American Water Works Association to be routinely expected for a residence equipped with state-of-the-art fixtures in which the users give “reasonably” conscientious attention to demand control – e.g., it presumes minimal leakage losses, “reasonable” showering time, etc. It is understood, however, that better demand control is readily attainable. My personal experience is a case in point. According to our winter water bills my wife and I have an average interior demand rate of 37 gallons/person/day for our two-person household. As we are served by the watershed-scale Austin Water RWH system, not a building-scale RWH system, we have no particular impetus to “highly” conserve, as would a rainwater harvester who could see the cistern volume dwindling when rain is scarce. The only “highly” efficient appliance in our house is a front-loading washing machine; all the rest are 1990s-era fixtures. One can conclude, therefore, that something in the range of 35-40 gallons/person/day is a demand rate that is readily attainable without any “crimping” of lifestyle.

Indeed, a lower demand rate is typically presumed by those who design and install building-scale RWH systems, with 35 gallons/person/day being routinely presumed. So the models were also run using demand rates of 40 and 35 gallons/person/day. At 40, a “right-sized” system that would have attained that same 97-98% coverage of interior water demand requires 4,000 sq. ft. of roofprint and a 30,000-gallon cistern. At 35 gallons/person/day, 96-97% of interior demand would have been covered with a 3,500 sq. ft. roofprint and a 25,000-gallon cistern. All these results presume 4-person occupancy in the house, which is above what demographics indicate is the average household size in most single-family residential developments around Austin and in the Hill Country, so it is expected that this sizing criterion would adequately supply the demands in most new houses.

These findings indicate that attaining very good demand control can significantly decrease the scale of facilities needed to “right-size” the building-scale RWH system, which would significantly reduce their costs. A single-story house plus garage and a “normal” area of covered porches/patios might provide 3,500 sq. ft. of roofprint, so an RWH house “right-sized” for a demand rate of 35 gallons would not require “extra” roofprint to be fit into the plan, and thus would not entail a cost increase to provide the required roofprint. And with the cistern being the costliest component of a building-scale RWH system, reducing its size contributes significantly to rendering the overall system more cost efficient.

With the baby boomers coming to retirement age, and single people and “DINKS” (dual income, no kids) being significant demographics, many building-scale RWH systems may be sized to serve 2-person households, for which the “right-sized” systems would be much smaller. Modeling in Austin and Dripping Springs shows that, with a demand rate of 45 gallons/person/day, a roofprint of 2,500 sq. ft. and a cistern volume of 17,500 gallons would have covered 97-98% of interior demands through the recent drought period. At a demand rate of 40 gallons/person/day, this result would have been attained with a roofprint of only 2,000 sq. ft. along with that 17,500-gallon cistern. If demand rate averaged 35 gallons/person/day, then a roofprint of 2,000 sq. ft. along with a 12,500-gallon cistern would have covered 97-98% of total interior demand. A small single-story house plus garage or carport and a “reasonable” area of covered porch/patio would provide that 2,000 sq. ft. roofprint, thus requiring no “extra” roofprint to be paid for. So, with significantly smaller cisterns being required, this market could more cost efficiently employ a building-scale RWH water supply strategy.
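As a rough plausibility check on all the sizings reported above, one can compare each scenario’s annual demand against what its roof would capture in an average rainfall year – the cistern’s job is then to bridge the dry spells. The 34 inches/year rainfall and 85% capture efficiency below are my assumed values for the Austin area, not figures from the modeling.

```python
# Annual demand vs. average-year roof capture for the scenarios above.
# Rainfall and capture efficiency are assumed illustrative values.

AVG_RAIN_IN = 34   # assumed average annual rainfall, inches
EFF = 0.85         # assumed capture efficiency

scenarios = [  # (persons, gal/person/day, roofprint sq. ft., cistern gal)
    (4, 45, 4500, 35_000), (4, 40, 4000, 30_000), (4, 35, 3500, 25_000),
    (2, 45, 2500, 17_500), (2, 40, 2000, 17_500), (2, 35, 2000, 12_500),
]
for p, gpcd, roof, cistern in scenarios:
    demand = p * gpcd * 365                      # gallons/year
    capture = roof * AVG_RAIN_IN * 0.623 * EFF   # gallons/year, average year
    print(f"{p}p @ {gpcd} gpcd: demand {demand:,.0f}, "
          f"avg-year capture {capture:,.0f}, cistern {cistern:,}")
```

In every scenario the average-year capture comfortably exceeds the annual demand, which is why “right-sizing” comes down largely to cistern volume – carrying the supply through drought years when rainfall runs well below average.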

A model was also run covering the drought of record period from the late 1940s to the mid-late 1950s. The worst portion of that drought was from 1950 to 1956. Model results show that for all the scenarios reported above, a “right-sized” building-scale RWH system would have covered 92-95% of the interior water demands through that period. Comparing the rainfall deficits relative to long-term averages, it is seen that the 1950-1956 period was somewhat more “intense” than the recent drought period; while 2011 was the worst year on record, overall the current drought has not (yet) approached the severity of the drought of record. Even under the drought of record condition, however, it is seen that a “right-sized” building-scale RWH system would have provided the vast majority of interior water demands.

Many commercial and institutional buildings would also have a roofprint to water demand ratio that would be favorable to building-scale RWH. For example, a system for a two-story office building in which water usage rate is 5 gallons/person/day (typical toilet and lavatory use by an office employee) might have provided ~99% of water demand through the recent drought period. Whole campuses of such buildings might be built without having to install any conventional water and wastewater infrastructure, using wastewater treated at the building scale, perhaps supplemented by condensate capture, to supply toilets and all irrigation of the grounds, so allowing a smaller cistern to be installed, or allowing a higher water usage rate – e.g., to also cover food service – while still providing essentially all the demand. Capturing roof runoff in the RWH system would also reduce the stormwater management problem in such a development, enhancing the benefit of this strategy.

We can see therefore that building-scale RWH has great potential for relieving stress on the watershed-scale RWH systems that compose our “normal” water supply strategies, and could blunt the need for such high-cost options as desalination, direct potable reuse, or long-distance transfers from remote water sources. So even though building-scale RWH is relatively expensive in capital costs, it may be cost efficient relative to other options, while also offering low long-term operating costs.

One of those costs is for energy to pump and treat water. Building-scale RWH is a strategy that would entail relatively low energy use. Since the water loop is “tight”, water would be pumped only very short distances with little elevation head to overcome. This would save even more water, since it takes water to produce electricity to drive pumps – the so-called “water-energy nexus”.

On the basis of water usage efficiency, then, the building-scale rainwater harvesting strategy is well worth serious consideration as a major means of serving the increasing demands which would be imparted by the projected growth in Central Texas. The same can be demonstrated for other high-growth regions in Texas, such as the Dallas-Fort Worth area.

Yet the present State Water Plan utterly rejects building-scale RWH as having any merit as a water supply strategy. I am told the reason for this is that the mental model of our controlling institutions sees building-scale RWH as “unreliable” because the cisterns may run dry during severe drought and require those minor fractions of total supply to be added to them from other sources. The counter to this is to think of it as “conjunctive management” of the total water resource, with the RWH systems diverting demand from other sources, decreasing their routine drawdown so that they have the capacity to provide the backup supply.

This highlights that, as noted in “Zero Net Water”, there are challenges to be addressed, but those challenges may be less problematic than those posed by desalination, direct potable reuse or long-distance transfer schemes. So water policy makers should be called upon to recognize this clear potential and to incorporate this strategy into their water planning going forward.

It is noted in closing that the analyses reported in this post addressed only interior water usage. As reviewed in “Zero Net Water”, that concept envisions exterior usage – irrigation – to be largely supplied by localized reclamation and reuse of the “waste” water produced in the buildings being supplied by building-scale rainwater harvesting. In itself that tight-looped “decentralized concept” of wastewater management is a more highly efficient strategy – in regard to both money and water – than the conventional long-looped “regional” system, as was generally reviewed in “It’s the infrastructure, stupid”. That aspect of the Zero Net Water concept will be further considered in a future post.

 

Stormwater Management Can Be “Green” Too

Posted March 16, 2014 by waterbloguer
Categories: Uncategorized


Meet Dr. Katherine Lieberknecht. She is a professor in the University of Texas School of Architecture who proposes the revolutionary idea that stormwater runoff can – and should – be managed as a water resource, rather than as a nuisance to be drained “away” as “efficiently” as practical. This is “revolutionary”, of course, only to the conventional mindset, whose nuisance-centric mental model of stormwater management unfortunately continues to hold sway over much of the regulatory machinery and design community. Coming to understand the management strategy that Katherine advocates and to move that to the fore of mainstream practice is a deep conservation strategy; it is how we can make stormwater management “green” too, helping to move us toward sustainable water.

Katherine suggests capturing the natural cycles of water movement through the landscape at the neighborhood scale, asserting that this would lead to cost savings – including, notably, energy savings by minimizing irrigation needs – and would enhance the habitat value of the landscaping. Medians, rights-of-way, parks – just about any greenspace can be made multi-functional, designed to hold onto stormwater instead of to shed it, serving as beautification while accomplishing water quality and quantity management goals. In this vein, Katherine suggests fostering a community attitude toward stormwater. Shared, collaborative and cooperative “storage” of stormwater is urged, utilizing otherwise relatively hydrologically functionless areas – for example church grounds – to hold more water on the land. Throw in green roofs and the areas covered by buildings can also be made more hydrologically functional, holding water on the site instead of making it flow “away”. Or harvest the roof runoff and store it in cisterns to provide irrigation supply, which also effectively holds more water on the land.

The landscape-based strategy can be manifested in two basic ways. One is to design the greenspace as “formal” water quality management devices, such as bioretention beds, that infiltrate the water that flows into them. The other is to use a type of landscaping, such as wildflower meadows or restored native prairie, that naturally holds more water on the land, and perhaps to enhance that by utilizing permaculture techniques that create micro-ponding areas to further increase the amount of runoff that gets infiltrated. Consulting a table of curve numbers (CN) – a parameter that determines the propensity to shed or infiltrate runoff; the higher the number, the more runoff – shows that for a Group D soil (the class that produces the most runoff), a conventional turf landscape would have a CN of about 84 while a wildflower meadow or native prairie would have a CN of about 73. This would create a very significant reduction in runoff, and it would result in a whole lot more water held on the land, contributing to deep soil moisture and so maintaining the landscape better through drought periods.
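For those who want to see the arithmetic behind that comparison, the curve number feeds the NRCS (SCS) runoff equation. The sketch below runs it for an illustrative 3-inch storm – the storm depth is my assumption for illustration, not a design standard.

```python
# The NRCS (SCS) curve number runoff equation, applied to the two
# landscapes compared above. Depths are in inches.

def scs_runoff(p_in, cn):
    """Runoff depth for rainfall p_in on land with curve number cn."""
    s = 1000.0 / cn - 10.0   # potential maximum retention
    ia = 0.2 * s             # initial abstraction
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in - ia + s)

p = 3.0  # an illustrative 3-inch storm on Group D soil
turf, meadow = scs_runoff(p, 84), scs_runoff(p, 73)
print(f"turf (CN 84): {turf:.2f} in, meadow (CN 73): {meadow:.2f} in, "
      f"reduction: {1 - meadow/turf:.0%}")
# -> roughly a 40%+ cut in runoff from that one landscaping choice
```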

The native landscape would also demand far less routine maintenance – very little mowing, no fertilization, little or no irrigation. A critical aspect of this, Katherine stresses, is follow-through with appropriate O&M, to assure that the hydrologic function of this distributed system is maintained. Indeed, when this overall concept has been set forth to the local regulatory system, its major objection to “mainstreaming” the concept has been concern over the O&M costs of many small installations. This highlights that we must use lower-O&M strategies, like “passive” infiltration rain gardens and low-maintenance native plantings, as the mode of implementing this distributed concept.

But considering this concern for O&M also highlights the inherent resilience of this decentralized concept of stormwater management. The “failure” – however one might define that – of any one distributed component – e.g., “clogging” of a rain garden – impacts only a very small part of the overall system. So that system would continue to provide good overall performance, assuming sufficient O&M is provided so that occasional failures are detected and corrected before they become widespread enough to meaningfully degrade overall performance. Again, by choosing practices which would entail low O&M to begin with, such as appropriate landscaping and passive infiltration devices, that “sufficient O&M” would entail a fairly low liability.

Another component of keeping O&M manageable would be education, so that property owners would understand what that landscaped depression in the corner of their lot is and what function it provides, making it less likely that they might do something “stupid”, like fill it in or radically alter the landscaping. The literature on Low-Impact Development (LID) – of which this distributed “green” approach is an exemplar – universally notes that education is a fundamental component of that strategy. The concerns of the regulatory system could be blunted by requiring that the educational component be part and parcel of implementing that strategy.

This whole idea of rendering the landscape more hydrologically functional and lower maintenance, along with allied practices, can yield a number of benefits. To quantify some of those benefits, let me introduce Tom Hegemier. Tom, now working in the private sector, produced some estimates of the water savings potential of various distributed strategies when he worked on water supply issues at the Lower Colorado River Authority, the agency that manages the lower part of the Colorado River in Texas. Tom’s calculations are not belabored here – I can provide the methodology to anyone who is interested – but they indicate there is huge potential for water savings.

Basically, Tom asked what if half of all new housing built in Travis County – where Austin, the largest city in the lower Colorado basin, is located – between now and 2040 utilized one or more of a suite of water management options. These include:

  • The “Hill Country Landscape Option”, which minimizes turf in favor of native plants and emphasizes improving the soil so that its water holding capacity is enhanced. This would result in significantly decreased demand for landscape irrigation water.
  • Building-scale rainwater harvesting to capture water to provide landscape irrigation water.
  • Wastewater reclamation and reuse to defray landscape irrigation water demands.

Tom’s estimates of water savings were as follows:

  • Application of the Hill Country Landscape Option – water demand reduction ranging from 12,000 to 15,000 acre-feet per year.
  • Application of the Hill Country Landscape Option plus rainwater harvesting – water demand reduction ranging from 17,000 to 19,000 acre-feet per year.
  • Application of the Hill Country Landscape Option plus wastewater reuse to defray landscape irrigation demands – water demand reduction ranging from 20,500 to 24,000 acre-feet per year.
  • A combination of all three strategies – water demand reduction ranging from 25,000 to 28,500 acre-feet per year.

As a point of comparison, the total water use in Austin was about 170,000 acre-feet in 2011, and it is projected to be about 300,000 acre-feet in 2040. So a 50% penetration of just these site-based strategies would accomplish almost a 10% reduction in demand by 2040. Tom went on to ask what if, in addition to these strategies, LID practices like those advocated by Katherine plus rainwater harvesting were more universally employed as the manner in which stormwater is managed as sites are developed, opining that this would move development a long way toward being “water neutral” – or what I set forth in the previous post to this blog as Zero Net Water.
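Checking that arithmetic is straightforward – the snippet below just divides Tom’s savings ranges by the projected 2040 demand:

```python
# Tom's savings ranges (acre-feet/year) as fractions of projected
# 2040 Austin water use.
projected_2040_af = 300_000
for low, high in [(12_000, 15_000), (17_000, 19_000),
                  (20_500, 24_000), (25_000, 28_500)]:
    print(f"{low:,}-{high:,} af/yr -> "
          f"{low/projected_2040_af:.1%}-{high/projected_2040_af:.1%}")
# the all-three-strategies case works out to 8.3%-9.5%, i.e. "almost 10%"
```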

An underappreciated facet of the savings potential is the reduction in peak water demands these strategies can offer. In the climate of Central Texas, annual peaking is driven by irrigation demands. So when irrigation demand is reduced, peaking is reduced. And when peaking is reduced, the sizes of all manner of water supply infrastructure can be reduced, or their implementation can be put off further into the future.

None of this is inherently difficult to accomplish. It hinges on the choice to view rain falling on the site as a water resource to be husbanded to the maximum practical extent, instead of as a nuisance to be shed and made to go “away”. The means to do this, in the physical sense, are readily available and are largely cost efficient. Institutionally, it is “merely” a matter of making that choice and setting regulations and accepted best practices to husband that water resource. Particularly in areas like Central Texas, where water supplies are becoming stressed by growth and that stress is being exacerbated by chronic drought, it is high time that our controlling institutions make that choice.

 

Zero Net Water

Posted January 21, 2014 by waterbloguer
Categories: Uncategorized

A sustainable water development concept for the Texas Hill Country – and beyond

Imagine a water management strategy that would accommodate growth and development without unsustainably pumping down aquifers or incurring the huge expense and societal disruption to build reservoirs or transport water from remote supplies to developing areas. Welcome to the concept of Zero Net Water.

As the name implies, Zero Net Water is a water management strategy that results in zero demand on our conventional water supplies – rivers, reservoirs and aquifers. Under the Zero Net Water development concept, water supply is centered on building-scale rainwater harvesting, “waste” water management centers on project-scale reclamation and reuse, and stormwater management employs distributed green infrastructure to maintain the hydrologic integrity of the site. Together these result in minimal disruption of flows through a watershed even as water is harvested at the site scale and used – and reused – to support development.

The key is taking advantage of the difference in capture and distribution efficiency between a building-scale rainwater harvesting system and the watershed-scale rainwater harvesting systems that compose all of our conventional water supplies. The basis for this is illustrated in the schematics below.

WATERSHED_SCALE3 copy

[click on image to enlarge]

The prevailing conventional water supply strategy – again, this is watershed-scale rainwater harvesting – is illustrated in this schematic. Typically only a very minor fraction of the total rain falling onto the watershed makes it into the “cisterns” of that rainwater harvesting system – the aquifers and reservoirs. The rest is lost to evapotranspiration, a “loss” which maintains the ecology of the watershed. Water that does make it into reservoirs is subject to high losses to evaporation (see a discussion of the severity of that here). So the inherent capture efficiency of this system is quite low.

Water supply produced from these watershed-scale “cisterns” is distributed to points of use – where some of the rain fell to begin with – a process which also suffers significant losses. Water industry standards recognize a 15% water loss in the distribution system as “good” performance, and many water systems have much greater losses. So here too we suffer an inherent inefficiency in turning rainfall into water supply that is available for human use.

BLDG_SCALE_RWH_Water_System

[click on image to enlarge]

The building-scale rainwater harvesting concept is illustrated in this schematic. Close to 100% of the rain falling onto a rooftop can be captured and converted into usable water supply. There will be some losses to cistern overflows in large storms or when there is an extended period of wet weather, so the actual efficiency will vary with weather patterns, but in a system properly sized relative to the water usage pattern, it will be consistently very high. The building-scale distribution system is typically – and can practically be – maintained “tight” so there would be negligible distribution losses.

This high capture and distribution efficiency allows the water supply to be “grown” in fairly direct proportion to water demand, one building at a time, thus rendering this a more sustainable water supply strategy. And because the water supply would be provided, and paid for, only to serve imminent development, this strategy is also economically efficient, and thus more fiscally sustainable.

An immediate, practically knee-jerk, objection to this water supply strategy is that harvesting rainwater off rooftops would “rob” the watershed of streamflow and/or recharge, and would thus produce no net gain in the available, usable water supply. As can be inferred from the illustrations above, this is not the case. When not directly harvested, a large majority of roof runoff would be abstracted in the watershed. In any case, when land is developed, the amount of rainfall that becomes quickflow – water that runs directly off the land – increases, and the amount that infiltrates is reduced. Because of the other impervious surfaces besides the rooftops that development adds, the volume of runoff would typically increase even if building-scale rainwater harvesting were to be implemented on all the buildings in the development, as noted in the illustration below.

BLDG_SCALE_WATER_CYCLE

[click on image to enlarge]

Indeed, because development increases runoff, development regulations generally require that steps be taken to treat and detain this excess runoff. Broadscale practice of building-scale rainwater harvesting can actually reduce the magnitude of this problem. The net result in any case is that the post-development runoff volume is typically greater than the predevelopment runoff volume, thus there would be no “robbing” of flows into the watershed-scale water supply system, relative to the pre-development flow regime.

In any case, the water sequestered in the building-scale cisterns is not removed from the watershed. Its release back into the hydrologic cycle is simply delayed. Most of this water, once used in the building, appears as wastewater flow. As reviewed below, and illustrated in the schematic above, under the Zero Net Water concept this flow would preferably be used to defray irrigation demands, so doing a better, more targeted job of maintaining some of the plant life in the watershed. If instead the wastewater were discharged into streams (after treatment of course), the result would be to create a more steady flow of this water over time, as opposed to the “flash” hydrology imparted by direct runoff from the rooftop.

A simple way to encapsulate all this is that we capture and utilize on site much of the additional runoff imparted by placing impervious surfaces over the land. We do this instead of allowing this additional runoff to become an increased quickflow that, if not mitigated in some other way, creates water quality, channel erosion, and flooding problems. So bottom line, broadscale practice of rainwater harvesting off all the buildings in a watershed would actually improve the overall yield from the watershed of water that would be directly usable by humans, without any significant impact on the rest of the ecology, in particular on “environmental flows” in our rivers.

“Right-Sizing”

There is a caveat on the “zero” in Zero Net Water. The cistern in a building-scale rainwater harvesting system operates in the same manner as a reservoir in a conventional surface water supply system – it stores the water for future use. Just like a reservoir, a building-scale cistern has a “firm yield” that will cover a given water demand profile. The building-scale cistern is typically sized to cover most conditions, with imported backup supply added to get through the worst drought periods. Considerations of cost efficiency and the sustainability of the backup supply system lead to the concept of “right-sizing” of the building-scale rainwater harvesting system. This is the combination of roofprint and cistern volume, relative to the expected water usage profile, that would limit backup supply requirements to the worst drought periods.

The backup supply would of course be drawn from the conventional water supply systems, from aquifers and/or reservoirs. So there would be some small draw of water from the watershed-scale system to get the building-scale rainwater harvesting systems through the droughts. The magnitude would depend on how well the building-scale systems were “right-sized” and on whether the users of those systems practiced “sufficient” conservation, and also of course on the happenstance of the rainfall patterns over the area. Still, modeling indicates that the vast majority of the water supply for these buildings would be provided by direct capture of the rainfall onto the building’s roofprint.
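In practice, “right-sizing” amounts to a search: step through candidate roofprint/cistern combinations against the historical rainfall record and keep the smallest combination that meets the coverage target. Here is a minimal sketch, assuming a coverage(roof, cistern) function built on a daily cistern water balance run against that record; the cost stand-in is a crude placeholder, not a real cost model.

```python
# A sketch of a "right-sizing" search. coverage(roof_sqft, cistern_gal)
# is assumed to return the fraction of demand met over the historical
# rainfall record (e.g., from a daily water-balance simulation).

def right_size(coverage, roofs, cisterns, target=0.97):
    """Return the cheapest (roof, cistern) pair meeting the coverage target.

    roofs and cisterns are candidate sizes in ascending order.
    """
    best = None
    for roof in roofs:
        for cistern in cisterns:
            if coverage(roof, cistern) >= target:
                cost = roof + cistern / 10  # crude stand-in for relative cost
                if best is None or cost < best[0]:
                    best = (cost, roof, cistern)
                break  # larger cisterns at this roofprint only cost more
    return best and best[1:]

# e.g., right_size(cov, range(2000, 5001, 500),
#                  range(10_000, 40_001, 2_500))
```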

The “right-sized” facilities vary around the state, depending on the area’s climate. In the Texas Hill Country, a 4-person household which is “reasonably” conservative with its water use typically requires a roofprint of 4,500 sq. ft. and a cistern volume of 35,000 gallons. These sizes could be decreased if the users practice very good water conservation. How to most cost-efficiently incorporate “extra” roofprint, and perhaps integrate the cistern into the building envelope, is the province of building design. It is suggested that efforts be made to formulate a “Hill Country rainwater harvesting vernacular” house design concept to address those matters. This needs to be taken up by architecture schools, working architects and homebuilders.

Wastewater Reuse

That “right-sized” system noted above covers only interior water use. To supply landscape irrigation directly from the cistern would require either a significantly larger system or significantly greater backup supplies. However, there is a flow of water right there, water that has already been provided for use in the house – the wastewater flow out of the house. This flow can be treated and dispersed in a subsurface drip irrigation field to defray landscape irrigation demands. Modeling shows that, by doing this, a sizable area of irrigated landscaping can be maintained without having to either upsize the cistern and roofprint or incur much greater backup supplies.
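The back-of-envelope sizing works out as sketched below. The 0.1 gallon/sq. ft./day loading rate is an assumed illustrative value – actual rates are set by soil class under the governing on-site wastewater rules.

```python
# Rough drip irrigation field sizing for a single house (illustrative).
occupants, gpcd = 4, 45
wastewater_gpd = occupants * gpcd     # interior use returns as wastewater
loading_gpd_per_sqft = 0.1            # assumed rate for a tight Hill Country soil
field_sqft = wastewater_gpd / loading_gpd_per_sqft
print(f"{wastewater_gpd} gal/day over {field_sqft:,.0f} sq. ft. of drip field")
# -> about 180 gal/day steadily irrigating ~1,800 sq. ft. of landscaping
```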

This strategy was reviewed in “Slashing pollution, saving water – the classic win-win (but ignored by society)”. As set forth there, this sort of reuse system has been implemented on the site scale for over two decades, and doing so will provide superior environmental protection, particularly in sensitive watersheds. It is a small step to do this same process on a project scale, if the nature of the development requires that it employ a collective wastewater system, rather than an individual on-site system for each house. That project-scale reclamation and reuse concept must be part and parcel of the Zero Net Water concept if irrigated landscaping is to be supported.

Stormwater Management

As noted previously, development causes an increase in quickflow runoff at the expense of infiltration due to some of the ground area having been covered with impervious surfaces. These impervious surfaces also increase levels of pollution entrained in the runoff. So development regulations typically require that methods be implemented to blunt both the pollution and the impacts of the additional runoff on downstream flooding and on channel erosion. The building-scale rainwater harvesting systems can help to blunt all these impacts by sequestering roof runoff in the cisterns.

Runoff from the rest of the development and any cistern overflows can, and should, be addressed using distributed low-impact development (LID) practices, focusing on intercepting and infiltrating an initial depth of runoff deemed to have entrained most of the pollution. The aim of the LID strategy is to restore the rainfall-runoff response of the developed site as close as practical to that of the pre-development site. This matching of runoff to pre-development conditions would maintain the hydrologic integrity of the site, and by a multiplicity of sites so treated, would maintain the hydrologic integrity of the watershed. This whole area of “green” stormwater management is the subject of a future entry on this blog. Suffice it here to note that it is an important element of Zero Net Water, as it holds more water on the land and thus blunts the “desertification” of the site that development typically imparts.
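One common way that “initial depth” sizing is expressed is with a simple runoff coefficient. The sketch below uses Schueler’s “Simple Method” coefficient (Rv = 0.05 + 0.9 × impervious fraction); the half-inch capture depth and the one-acre, 40%-impervious site are my illustrative assumptions, not values from any particular ordinance.

```python
# Water quality capture volume via the Simple Method runoff coefficient.
SQFT_PER_ACRE = 43_560
GAL_PER_CUFT = 7.48

def capture_volume_gal(area_acres, impervious_frac, depth_in=0.5):
    rv = 0.05 + 0.9 * impervious_frac   # fraction of rainfall that runs off
    cuft = (depth_in / 12) * rv * area_acres * SQFT_PER_ACRE
    return cuft * GAL_PER_CUFT

# an illustrative 1-acre site at 40% impervious cover:
print(f"{capture_volume_gal(1.0, 0.40):,.0f} gallons to intercept")
# -> on the order of 5,600 gallons per half-inch of rain
```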

Confirmation – and Challenges

Modeling indicates that for all locations in and around the Hill Country, “right-sized” rainwater harvesting systems would not have required any backup supplies after the severe drought of 2010-2011 broke in late 2011, even though the general impression is that drought has persisted in this region. One indicator of this is that water levels in Lake Travis and Lake Buchanan remain very low. Indeed, it is reported that inflow to the lakes in 2012 was the 6th lowest year on record and in 2013 it was the 2nd lowest, above only the extreme drought year of 2011. It is noted that this occurred despite total annual rainfalls over the drainage basins flowing into the lakes having been generally around the long-term average rainfalls there over those two years.

This is simply a confirmation that the capture efficiency of the building-scale rainwater harvesting system is inherently much higher than that of the watershed-scale system. The low inflows to the lakes are a happenstance of rainfall patterns, failing to create the large runoff events needed to significantly raise lake levels. But those same rainfall patterns would result in high capture efficiency off of a rooftop, and so the building-scale systems would not be under the same stress that persists in the watershed-scale system.

Despite the overall efficiency of the building-scale rainwater harvesting system, the Zero Net Water development concept faces challenges to becoming commonly practiced. The building design issues were noted above. The large roofprint required to “right-size” systems in Central Texas would require “right-sized” lots to accommodate it. And two-story houses would clearly be problematic under this concept. Multi-family housing, as presently configured, would also be hard pressed to provide roofprint commensurate with water demand. Then too storage cisterns would take up space, unless they were integrated into the building envelope. All that would have implications for development style, and so would require some tinkering with prevailing development models.

On the other hand, with a typical occupancy of only 2 persons, water demand in seniors-oriented developments – which may be a considerable portion of new development in Central Texas – would be supported by the roofprint typically provided by a one-story house plus garage. Many commercial and institutional buildings would also have a favorable relationship of roofprint to water use in the building. Indeed, as asserted in “First ‘Logue in the Water”, these would be prime candidates for a Zero Net Water strategy. Employing some combination of building-scale rainwater harvesting, condensate capture, and project-scale wastewater reclamation and reuse, those types of buildings would draw no water from the watershed-scale systems. That would relieve a significant portion of demand due to growth, and as a bonus would also blunt stormwater impacts in those sorts of projects, which typically entail high impervious cover, the roofprint being a significant portion of it.

Cost is, of course, a primary consideration, for society at large as well as development principals. Where there is already a conventional water system nearby which has capacity to provide service to the development, the cost of the building-scale systems could not be justified relative to installing conventional distribution infrastructure. Wherever capacity is a problem, however, the real cost of increasing capacity has to be figured in. Where those costs are very large – e.g., building a new reservoir or tapping a remote aquifer and building pipelines to deliver water from those to growth areas – then building-scale facility costs may be globally competitive. And as noted previously, building-scale facilities require money to be spent only to support buildings as they are built, while those area-wide strategies require huge investments up front, before the first lot can be sold, so the Zero Net Water concept is inherently economically efficient.

We must indeed consider costs globally, not just the immediately apparent costs of continuing with “business as usual”. This comes starkly into play in the Hill Country, where aquifers are under stress even at current usage rates, and serving considerable new development out of them will only “mine” them further. The drawdown created is drying up springs that historically flowed all throughout the Hill Country. That creates a cost to the local ecology, and will fundamentally alter the character of the region. The impact of this degradation was encapsulated in the title of an article appearing in the Texas Observer a couple years ago, “The End of the Hill Country”. This is not to mention reducing water availability from the rivers flowing out of the Hill Country, water which is depended upon for both water supply and ecological services all the way to the Gulf of Mexico. Thus the Zero Net Water development concept may be particularly valuable in the Hill Country.

While the Zero Net Water development concept faces fiscal and institutional challenges, the prospects for sustainably accommodating growth in a globally more cost efficient manner urge its consideration as a water management strategy for new development over much of Texas – particularly in areas like the Hill Country, where aquifers are under stress and the only other option is a long-distance “California style” water transfer from remote aquifers or reservoirs, which may entail both fiscal and ecological sustainability issues. The Zero Net Water concept offers a pathway toward sustainable water even where high growth rates are forecast. It remains only to address the challenges and to put it into practice.

One More Generation

Posted October 29, 2013 by waterbloguer
Categories: Uncategorized

By the rude bridge that arched the flood,

Their flag to April’s breeze unfurled,

Here once the embattled farmers stood

And fired the shot heard round the world.

– Ralph Waldo Emerson

The “rude bridge” around here is in Lee County. Rather than a store of weapons, the “embattled farmers” there, and in neighboring Bastrop County, are defending the long-term sustainability of their water supply, and thus their own economic future. The British Redcoats in this analogy are the “water hustlers” and their allies who are attempting to gain the ability to pump the aquifer storing that water supply at unsustainable rates, which will result in large drawdown, and eventual depletion, of that aquifer as a usable water supply. In short, Bastrop and Lee counties are seen as water “colonies”.

One ally in particular, Hays County, has entered into an agreement with one of those water hustlers to create an unsustainable draw on that aquifer. Purportedly this is to meet the future water needs in Hays County, on the presumption its recent growth trend will continue for the next few decades. This is simply a taking from the future of the “colonies” to secure their own. That’s why King George was taxing the American colonies, right? You might say, therefore, that a shot has been fired in the Great Texas Water War. How far and wide that shot is heard remains to be seen.

According to the groundwater model accepted by the Texas Water Development Board as an accurate picture of impacts on this aquifer, the drawdown that the currently demanded pumpage would create would leave the Simsboro Aquifer in Lee County significantly dewatered, headed toward depletion, in one more generation. And that is just considering the water to be exported, leaving little to support economic development in “the colonies”. Indeed, it is projected that, if the current drought in this region endures, Bastrop and Lee counties will have a water supply deficit for their own municipal needs even without that water being exported.

Coincidentally, one more generation is how far my “horizon of explicit concern” for the well being of regional society has just been extended, with the recent birth of my first grandchild. Which raises the question, what would I have actors like Hays County do to assure there is water available to sustain a healthy society in this region through that child’s lifetime? Isn’t it a function of government to take actions to assure a secure water supply for its citizens on into the future?

Sure, but does that really need to be done at the expense of the economic future of neighboring counties, and in a manner that is not sustainable? Understand that Hays County feels it needs water from Lee County in order to serve an expanding population over the next 50 years, but that population will not – everyone hopes at least – just go away then. Rather it will still be there, continuing to need water for the 50 years beyond that, and so on into the future. So when they suck Lee County dry, then what? A pipeline from the Great Lakes?

It should be quite clear, therefore, that the extractive once-through 19th century water infrastructure model that Hays County – and all the rest of regional society – seems intent on perpetuating simply cannot be sustained in this region, certainly not by actions like “mining” the Simsboro Aquifer. This hits on the continuing theme of this blog, that we need to transcend the mental model that (mis)informs that course of action and transition to a water infrastructure model that will lead us toward, rather than ever further away from, sustainable water.

As reviewed in previous posts, a sustainable water infrastructure model will impart deep conservation – durable increases in water use efficiency that are inherent in the water management methods being employed. Hays County, and similarly situated entities concerned about future water supply, could be leaders in moving society toward sustainable water, rather than ever further away with the sort of essentially “stopgap” (from a long-term perspective) projects like the water grab they are pursuing – really a slow-motion rearranging of the deck chairs as the ship goes down.

What we are seeing here is a classic “tragedy of the commons”. Indeed it is an enduring tragedy of the human condition that what is perceived to be needed for short-term well being, particularly of the large fiscal interests that exert great influence on our controlling institutions, is rather blindly pursued without much regard to the long-term well being of society. So understanding that short-term interests will invariably prevail over the long-term implications, the task at hand is to show how those short-term interests can be adequately served in a manner that would not run us into a box canyon – having grown, in effect, a population in this region by drawing down its water supply to the point of effective depletion, leaving that population high and dry.

It should be understood that, besides its lack of sustainability, there are fiscal reasons to question that course of action. Consider the pitfalls of pursuing a long-distance “California-style” water transfer scheme in a place like Hays County. It seems to be just presumed that the water price this would induce would be deemed “affordable”, perhaps simply because it is presumed that no other options exist, thus since people need water they will pay whatever that price turns out to be. But is that growth actually “manifest destiny”, whatever it will cost?

The purported need for this additional water supply is to support a projected growth in population, in the case of Hays County a projected 4-fold growth by 2040. It should be clear that the upward-trending “J-curve” growth that Hays County has experienced over the last few decades is not any more sustainable than is the over-pumping of the Simsboro Aquifer. Indeed, that growth is predicated on simple extrapolation of historical trends which were based on societal conditions over the period during which those trends were observed. The State Water Plan, which projects the populations upon which future water demand is predicated, states, “… the county’s population is projected one year at a time by applying historical growth rates, survival rates, and net migration rates to individual cohorts ….” [emphasis added]

Much of the projected growth would be due to net migration to/from the area. That decision to come, or to stay or go, would be predicated on a number of factors, the cost of living being among them, and a hugely inflated cost of water would impact upon that. But perhaps more basic is the prospect of a job in that area, and the cost of water will be a factor in a job creator choosing to establish operations in an area. Thus the future cost of water imposed by pursuing that long-distance water transfer scheme would seem to be a very important consideration, one which may significantly alter the conditions on which the growth projections are based.

In the particular case of Hays County, it does not appear that much has been done to evaluate what impact this would have on the cost of water there. Hays County Judge Bert Cobb has stated that no plans have been evaluated to even pipe the water to Hays County, much less to treat and distribute this water within the county. Indeed, the $5 million that Hays County agreed to pay their water hustler ally reportedly is just a “reservation fee” and the price of water at the wellhead is not even clear. So at present it appears that Hays County does not know the cost of water this supply scheme would create.

I was involved in a study of water supply alternatives conducted by Hays County in the late 1980s which projected water prices due to importation schemes from sources much closer than Lee County of up to $20/thousand gallons. In a study conducted by Hays County a couple years ago exploring water supply options for the western portion of the county, the price to import water from a relatively nearby source into the Wimberley Valley ran above $10/thousand gallons, even though the water treatment costs were not accounted for in that analysis. These prices compare with those typically charged at present by water supply entities in Hays County, in the range of $3/thousand gallons. So it can be anticipated that the scheme to raid the Simsboro Aquifer would lead to severe rate shock in Hays County.
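To see what such prices would mean at the tap, consider an illustrative household using 8,000 gallons a month – the usage figure is my assumption for illustration:

```python
# Monthly bills at the cited price points, for an assumed 8,000 gal/month.
monthly_kgal = 8
for label, price in [("current typical", 3), ("Wimberley import study", 10),
                     ("late-1980s import study", 20)]:
    print(f"{label}: ${price}/kgal -> ${monthly_kgal * price}/month")
# -> $24/month today vs. $80-$160/month under the importation schemes
```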

An issue is that such price increases would not kick in until the investments have been made and the water starts flowing into the county. Again, Hays County’s presumption is that this water is “needed” to enable the projected growth, but those investments – pretty much an “all or none” proposition – would have been incurred prior to that growth being in place, on speculation so to speak. The price signal that would urge pursuit of alternate strategies would lag the decision to impose the higher prices. Thus, a possibility would be that they built it but no one came – the very high cost of water having deflected growth elsewhere.

This brings us back to the argument that we should first pursue those alternate strategies – a sustainable water infrastructure model, a more resilient, more decentralized infrastructure, imparting deep conservation. A model that can be implemented only as required to serve imminent development, thus matching required investments with actual growth. The central questions about any such strategy are of course, where will the water come from, and what will that water cost?

As reviewed in previous posts, a fundamental transformation of the form and function of the water infrastructure system can “wring” considerably more function out of existing supplies. And it is argued that this “new water” could be made available while saving money, because the sustainable water infrastructure model would be more cost efficient – as reviewed, for example, in “It’s the infrastructure, stupid”, in “A $13 million failure of imagination in Center Point” and in “Motherless in Bee Cave.” Imagine indeed if growth in irrigation demand were – very cost efficiently – supplied by distributed wastewater reuse instead of imported water. And as pointed out in “Irrigation efficiency – a new ‘reservoir’ for your city”, there is huge potential for relieving current demand for irrigation water, freeing up that amount of existing water supply for growth. And then there is a move to a more regionally appropriate landscaping ethic, centered on native plants that do not require much irrigation, even in severe drought, which could also free up a considerable amount of the existing water supply.

All this may pale, however, in comparison with a move to a “zero net water” development model. Under this concept, water supply would be centered on building-scale rainwater harvesting, rather than on the watershed-scale rainwater harvesting model which composes all of our conventional water supply systems. The inherent efficiency at which rainfall collected directly off a roof can be converted to a water supply usable by humans is close to 100%, and there is no transmission loss in a building-scale system. In the watershed-scale system, only 10-15% of rainfall onto a watershed typically makes it into an aquifer or a reservoir, then there is very high evaporation loss from reservoirs – up to 50% is reported – and considerable transmission loss in distribution of the water to points of use – 15% loss is considered excellent by the water industry, and much higher losses are commonly experienced. So due to that large increase in capture and delivery efficiency, the building-scale rainwater harvesting system can essentially “grow” the water supply in proportion to water demand, one building at a time.
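Chaining those cited loss figures together shows the size of the gap – the split of losses in this sketch is a simplification, and the building-scale figure is my assumed value:

```python
# End-to-end delivery efficiency: watershed-scale vs. building-scale RWH.
watershed_capture = 0.125   # 10-15% of rainfall reaches aquifer/reservoir
reservoir_evap_kept = 0.5   # up to 50% reported lost to reservoir evaporation
distribution_kept = 0.85    # a 15% distribution loss is considered excellent

watershed_delivered = watershed_capture * reservoir_evap_kept * distribution_kept
building_delivered = 0.90   # assumed: roof capture net of overflow/first-flush

print(f"watershed-scale: {watershed_delivered:.1%} of rainfall delivered")
print(f"building-scale:  {building_delivered:.0%} of roof rainfall delivered")
# -> roughly 5% vs. ~90%, an order-of-magnitude difference
```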

Of course, the building-scale rainwater harvesting system will have implications for building design. In Hays County, for example, modeling indicates that a roofprint of about 4,500 sq. ft. would be required for a typical 3-bedroom home to be essentially water-independent; that is, needing a very limited amount of backup supply from the watershed-scale system only during the most severe drought years. The house plus a garage and covered patios/porches in “standard” one-story house plans would typically provide 3,000-3,500 sq. ft. of roofprint, so additional roofprint would have to be designed in. And typically having an even smaller roofprint, two-story houses would clearly be problematic under this concept. Multi-family housing, as presently configured, would also be hard pressed to provide roofprint commensurate with water demand. Then too, storage cisterns impose a considerable cost, and take up space. All that would have implications for development style, and so would require some tinkering with prevailing development models. How cost efficient this water supply strategy may be would depend on the setting – sustainability of groundwater in the face of continuing development, distance of the development from an existing waterline, etc. – and of course on the price of using instead a piped-in water supply, which as noted may become far higher than currently prevailing prices.

On the other hand, with a typical occupancy of only 2 persons, water demand in seniors-oriented developments – which may be a considerable portion of new development in Hays County – would be supported by the roofprint typically provided by a one-story house plus garage. The same would be so for many commercial and institutional buildings. Indeed, as asserted in “First ‘Logue in the Water”, employing some combination of building-scale rainwater harvesting, condensate capture, and project-scale wastewater reclamation and reuse, those types of buildings would draw no water from the watershed-scale systems. That would relieve a significant portion of demand due to growth, and as a bonus would also blunt stormwater impacts in those sorts of projects, which typically entail high impervious cover, the roofprint being a significant portion of it.

Finally, as also noted in “First ‘Logue in the Water”, the energy demands of all this decentralized infrastructure would be significantly lower than required to run the prevailing infrastructure model. The low lift out of a cistern and the very short distance to the point of use impart drastically lower energy requirements for building-scale rainwater harvesting systems. Likewise, the greatly shortened water loops in a decentralized concept wastewater reclamation and reuse system would greatly reduce energy demanded by those systems. Since it takes water to make energy – this is the so-called “water-energy nexus” – moving to the sustainable water infrastructure model would save a lot of that water too.

As noted, places like Hays County could be the leaders in moving society toward a more sustainable water future in this region. Given the projected 4-fold population increase, a lot of the growth would be on presently vacant land, to which no services have been extended, thus there is no sunk cost in conventional water and wastewater systems that needs to be respected. In large part, then, they have a “blank canvas”. They can choose to “paint by numbers” and repeat in rote manner the prevailing extractive once-through 19th century infrastructure model, adding on the long-distance transfer of water from a source that would not be sustainable over the long term in a desperate attempt to extend that model’s usefulness one more generation. Or they can choose to move boldly into the 21st century and create a sustainable water future.

Motherless in Bee Cave

Posted September 23, 2013 by waterbloguer
Categories: Uncategorized

“I am the Lorax.  I speak for the trees.”

 So spoke the hero of the famous Dr. Seuss tale about the wanton destruction of his land’s forest resources by the Once-lers, intent on their profit and convenience of the moment.  Around here, we have the “Once-through-lers”, who seem intent on the wanton destruction of our land’s water resources, for their profit and convenience of the moment.  And just like the trees in the Dr. Seuss tale, no one speaks for the water.

A most excellent example of this is being played out around Bee Cave, a fast-developing community in the Texas Hill Country, just west of Austin. An article in the Austin American-Statesman reviewed how the West Travis County Public Utility Agency (WTCPUA) – the entity that recently took over the wastewater system there from the Lower Colorado River Authority – has come to the realization that, with development activity in the area picking up steam, they do not have enough capacity to accommodate all the developments that have requested service or are expected to request service in the near future. The situation was posed as a “crisis”, that development will go begging for service until capacity can be increased.

If, as the adage goes, necessity is the mother of invention, then it would appear they are motherless in Bee Cave. That is because, as the article relates it, the WTCPUA’s mental model can only accommodate a conventional centralized sewer system to provide wastewater management for these developments. Without feeling any necessity to evaluate any other options, they are making plans to extend and expand the capacity of the existing conventional centralized sewer system. As a result, the fate of the wastewater, once treated at the centralized plant, would be to be spread upon land set aside for that purpose, to make it go “away”. A water resource, addressed solely and exclusively as if it were a nuisance, used “once-through” and then thrown away, truly wasting this water. Wanton destruction of water resources!

 It just so happens that I have a perhaps unique perspective on this matter.  I was contacted by a real estate broker, who is marketing some properties around Bee Cave, to talk about creating stand-alone wastewater systems on those properties.  He contacted me because he knew that I advocate a “decentralized concept” wastewater management strategy.  The basic idea of that concept is to address this water as a resource right from its point of generation, and to maximize the beneficial reuse of this water to defray non-potable water demands on or near the project generating the wastewater.  Which is to say, centering “waste” water management on water management, not on making a resource misperceived as a nuisance to go “away”.

 This broker knew of the difficulties being faced by proposed developments in the Bee Cave area which were queuing up for service.  Since delaying development would cost development interests (including him) money, he wanted to know if decentralized concept systems might be used to allow development to proceed without having to await the sewer system expansion.  This broker also asserted that Bee Cave is facing a water supply crisis, that it did not have sure access to supplies that would support continuing the rapid pace of development in this area.  All the more necessity to manage all water as a resource there, rather than to so gratuitously throw it away.

 One of the broker’s agents, who is friends with the Bee Cave mayor, arranged an audience with the mayor, also attended by the city administrator and the city planner.  The concept of managing wastewater as a water resource rather than as a nuisance was presented, showing how to practically do this with point-of-use treatment and reuse.  They were shown how this would focus a majority of the fiscal resources on utilization of this water resource to defray non-potable demands, rather than on running pipes all over the countryside to make a perceived nuisance to go “away”, and how this sort of strategy would be less costly, both to the developer and to society at large.  It was a cordial audience, they listened, asked relevant questions.  They suggested that some of the other developers with wastewater needs be contacted, but indicated no interest on the city’s part to even discuss the matter with WTCPUA.

A few inquiries were made. No response, so I let it lie. It was a long shot as I saw it, lacking any “enthusiasm” on the city’s part, to get a developer to “bite”, so dogged persistence did not seem merited. Then that article came out in the Statesman. It was suggested to some of our local environmental activists who are concerned about water in the Hill Country, and to a few “water friendly” politicians, that their “crisis” could be our opportunity to press for Bee Cave to at least consider water management strategies that focus on the resource value of the water. The “carrot” being that a decentralized concept wastewater system could grow “organically” with the development, rather than having to all be installed – and paid for – up front, before the first house is put on the ground. This would relieve their “crisis” – since capacity could be installed on an as-needed, or “just in time”, basis – while saving money for both the developer and the general public.

 That communication was then copied to the mayor, city administrator and city planner.  The mayor responded, saying she thought the idea had merit.  But rather than asking the WTCPUA to consider it, she suggested again that I, unilaterally, go to the developers and try to get them to – on their own, without “sponsorship” by WTCPUA – consider a wastewater system concept not recognized as an option by the controlling institutions, despite its fiscal and water resources benefits.  Because of that, again such a unilateral outreach to a developer was considered to be a long shot.  But I attempted to contact the developers the mayor identified anyway.  Again, no response.  The activists and politicians were advised that this may be an opening, and were asked for assistance in making the contacts.  No response. It appeared no one wanted to speak for the water.

 But I did dig up some details on the developments.  These indicate that every development being planned out around Bee Cave has its plan rooted in the presumption that its wastewater would indeed go “away”.  It’s like they see that as an entitlement!  That the WTCPUA simply had to extend a line to their property.  (And, one expects, the raw land prices these developers incurred were predicated on a unit yield that presumed this sewer service.)  It appears this is a deeply rooted mental model, the water must go “away” – to be wasted – or it will hurt “the deal”.

 However, a crude evaluation – not knowing the explicit character of each property – indicates that a decentralized concept strategy could be used on at least some of those developments without reducing the planned number of units.  For example, in the largest development investigated, only 23% of land area would be required to house drip irrigation fields, out of the 60% minimum total pervious area required by the Bee Cave ordinance.  With appropriate design, utilizing front yards, parkways, medians, greenbelts and common areas, it can be reasonably expected that a workable system could be installed.
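The land budget works out as sketched below, for a hypothetical 100-acre tract – the tract size is assumed purely to make the percentages concrete:

```python
# Drip field area vs. required pervious area, Bee Cave example.
tract_acres = 100                         # hypothetical tract size
pervious_required = 0.60 * tract_acres    # ordinance minimum pervious area
drip_needed = 0.23 * tract_acres          # from the crude evaluation above
print(f"{drip_needed:.0f} ac of drip fields fit within the "
      f"{pervious_required:.0f} ac of required pervious area, leaving "
      f"{pervious_required - drip_needed:.0f} ac untouched")
```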

 So indeed it appears the opportunity may be sitting right there to save water while also saving money.  Again, the monetary savings would be attained by: (1) eliminating all those pipes, and lift stations too, that would do nothing but move the stuff around, (2) allowing the wastewater system to be built only as required to serve imminent development, and (3) saving the money not spent to produce and deliver water to make up for what would otherwise be wasted.

However, with the WTCPUA appearing ready and willing to take the wastewater “away”, the developers would not likely be too keen on “encumbering” their projects with reclaimed water irrigation systems, even if the green space were there, even if it will be irrigated in any case. Without WTCPUA taking over long-term operations, the developers would be very loath to consider a decentralized concept strategy, regardless of whatever savings in up-front costs might be realized.

 We can all fully understand how the developers, and their agents like that broker I dealt with, would take the view that it is their deal which is of paramount importance.  If that entails the wanton destruction of water resources, that’s not their immediate concern.  They generally want the least hassle service plan that they deem affordable.

The catch, of course, is that the conventional centralized plan may only be “affordable” if these developers are allowed to externalize some of their costs to society. Society will pay in the long run for the value of the water wasted, day after day, year after year, by the conventional centralized, make-it-go-“away” wastewater service plan. Anyone who doubts that should consider that this region is predicted to be importing water in the not too distant future, which will be very expensive – and paid for by the public. Also, WTCPUA would quite likely raise rates on all its customers to cover the bonded indebtedness it would incur to install the system expansion needed to serve these new developments, and the increased ongoing operational costs, so local society would also pay directly in the short term.

 What is harder to understand is why agents of the public interest would take the view that wasting water is just fine.  Shouldn’t the public expect that those agents would be open to even extraordinary efforts to avoid that, given the water realities of this region?  But they show no indication of interest. Again, it appears that no one will speak for the water.

 First there is WTCPUA, and all its participating entities.  One suspects that they perceive they can only get the revenues from these developments if they provide conventional centralized sewer service.  That’s their “deal of the moment”.  They also understand they’d have to deal with regulatory issues if they were to “sponsor” a decentralized concept management strategy.  It can be reasonably argued that those issues can be favorably resolved, but this effort is no doubt seen as “inconvenient”.

The regulatory issues exist because another one of our institutions, the Texas Commission on Environmental Quality (TCEQ) – which should be an agent of the public interest – also focuses on its “deal of the moment”, defending its rules rather than rationalizing them. It runs a rule system that sees wastewater management as being ALL about “disposal” of a perceived nuisance. These rules do not allow reuse to be contemplated until a fully developed “disposal” system is in place. TCEQ would most definitely have a great deal of heartache about the very idea of a distributed, rather than a centralized, wastewater system, as it goes against its “regionalization” policy. It appears it would be “inconvenient” for the agency to wrap its head around the fact that water is fundamentally a resource, and to regulate on that basis.

Then there are the engineers who work for WTCPUA, who – if that entity were ever to consider its full range of options for the form and function of a wastewater system – would be called upon to inform it of those options and advise it on the merit of each. No doubt they have sizable contracts on the line to plan, design and permit the sewer system expansion. This “deal of the moment” would seem to make moving to a decentralized concept strategy rather “inconvenient” for them.

Just like the developers, then, all these people too appear to be focused on their deals of the moment. So it is that an opportunity to change the paradigm, to begin managing water resources as if water and the environment matter, is quite certain to slide on by, because all these “Once-through-lers” are focused on their short-term profit and/or convenience.

 And no one speaks for the water.

 

A $13 million failure of imagination in Center Point

Posted April 20, 2013 by waterbloguer
Categories: Uncategorized

Center Point is a small unincorporated town in Kerr County, Texas, lying along the Guadalupe River, about midway between the larger towns of Kerrville and Comfort. The sewer plan proposed there is an excellent example of a mental model, when left unexamined, costing society way too much money to solve a fairly simple problem. At the same time this deprives society of an opportunity to implement deep conservation, and so move society toward, rather than ever further away from, sustainable water.

Researching this matter, I found that the idea of sewering up Center Point has been cast about for many years. It was asserted that existing on-site wastewater systems were polluting the Guadalupe River, and relieving that condition creates the “need” for the sewer. In the recent reports, it is also asserted that the sewer is “needed” to accommodate growth, even though population growth projected through 2040 over a several-square-mile service area is fairly low, from a current estimate of 2,090 persons to a projected population of 2,519.

The facility plan I found in the Texas Water Development Board (TWDB) file was prepared by a large nationwide engineering firm, out of its San Antonio office. This plan offered as options only the continued use of conventional on-site systems – termed in the plan “septic tanks” – or piping the stuff “away” in a conventional large-pipe sewer system. The recommended option would pipe it many miles away to the treatment plant in Comfort. The price tag for the collection system to get the wastewater there is listed at $13 million and change. Total system cost would also include the price of providing treatment capacity for this flow, a cost which was not evaluated in the planning documents I found in the TWDB files. Spread over the 900+ connections in the service area, the cost is over $14,000 per connection for just the collection system.

This is a very impoverished range of options, ignoring everything between those two extremes, reflecting what I call the “dichotomy view”. That is a mental model which holds that wastewater management can be done in only two ways. One, totally within the confines of one’s lot, with the owner being unilaterally responsible for planning, design, permitting, funding, installing, and operating and maintaining that system. Or two, by dumping it in a pipe leading to the centralized treatment plant, with the user paying a fee and the sewer system operating authority doing all that. One or the other, no options in between. As observed in the case of Center Point, that is the commonly held “understanding” of our controlling institutions, including the large nationwide engineering firms.

Should the people in those institutions know better? In this case at least, ABSOLUTELY! Among the ranks of the engineering firm which prepared the Center Point facility plan are two employees with whom I am acquainted, both long-time proponents of what I labeled in 1986 the “decentralized concept” of wastewater management. This concept adopts a “continuum view”, considering a range of options lying between the two extremes that compose the “dichotomy view”. The basic idea is that you treat – and beneficially reuse to the maximum extent feasible – the wastewater as close to where it is generated as is practical in the context at hand.

In many circumstances – as reviewed below, Center Point almost certainly being one of them – this strategy can deliver wastewater systems that are more fiscally reasonable, more societally responsible and more environmentally benign than conventional centralized systems. This is the so-called “triple bottom line” of sustainability. Those two employees that I know have been advancing that message for many years, including obtaining for their firm several contracts for various studies about decentralized concept strategies. So clearly, the knowledge exists within that firm to have known that a whole range of options was being ignored in Center Point.

It may be, however, that “mainstream” engineering firms avoid all that for “business reasons”. It is impossible to overestimate the level of resistance that the controlling institutions pose to the decentralized concept. As noted, the long-standing presumption has been that Center Point “needed” to be sewered up, so the local political leaders advanced that as the explicit goal, touting it in every interview I’ve read. These politicians no doubt want to be connected to a “grand project”, to deliver a large grant, and so are uninterested in advancing a “smaller” solution, no matter the relative merits. Then too, an engineering firm would be assured of a hefty design contract to implement the centralized system – indeed, for the Center Point project, TWDB has provided a $1.8 million grant for planning and design – while for a decentralized concept strategy, the available design fees are unknown. It’s also well understood that all the regulations, funding programs, etc., are heavily biased toward conventional projects. Then there are the legal firms, the financial advisers, etc., who stand to get a cut of the pie, all lined up cheerleading the centralized system. In the face of all that “weight of expectation”, it’s easy to understand why a firm which wants to keep on doing such projects might hesitate to suggest any “outside the box” options, perceiving a risk of being branded as “not a team player”, of being “blacklisted” on future projects.

The result is that decentralized concept solutions, which could deliver superior service at less cost – including putting that water resource to work in Center Point instead of shipping it “away” – will never even be put on the table for consideration. I have observed this same pattern of behavior repeated by firm after firm, in community after community. In effect, our controlling institutions are operating a conspiracy to constrain the options that get considered to those which match their current mental model of how one manages wastewater.

Options which could have been put on the table – should have been, routinely so, in a context like Center Point – include improved on-lot systems and “cluster” systems, at various scales. These systems would feature high quality treatment – using technologies that are robust and resilient, thus are manageable in distributed systems – and subsurface drip irrigation dispersal. Those practices would preclude whatever pollution issues are purported to be caused by existing “septic tanks”, and they would provide whatever level of service is needed for further development in and around the town.

I would expect the preferred configuration to be on-lot or small-scale cluster systems, so the reclaimed water could be most cost efficiently routed to the most beneficial irrigation usage. This perhaps could be in the pecan orchard or commercial nursery, each close to Center Point’s town center. Or it could provide much of the landscape irrigation water demand at houses, businesses, parks, etc.

That sort of strategy would save money in several ways. First, not all of the 900+ connections have failing “septic tanks”, so the initial installation could be limited to properties where the existing on-lot systems are failing, and to commercial sites on which conventional “septic tank” systems are inappropriate. Thus, the cost of fixing the actual immediate problems may be drastically lower. Long term, some properties may never need anything but their existing conventional “septic tank” system, so the ultimate cost of the whole system would most likely be lower, even if the cost per house needing new service under the decentralized concept approach were to be greater than it is with the centralized system.

An evaluation of a decentralized concept system in a similar community, however, indicates that small-scale collective systems could be installed for less than $14,000 per connection. And in contrast to the centralized system, where that money buys a collection system only, this buys collection, treatment and reuse. Therefore, even if every connection did require an upgraded system, the total cost of the decentralized concept strategy is almost certain to be way less than the centralized option, on an “apples to apples” basis.
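A minimal sketch of that “apples to apples” comparison, using the per-connection figures above; the share of connections actually needing upgraded service is purely an assumption here:

    # Crude total-cost comparison, centralized vs. decentralized concept.
    # Per-connection costs are from the discussion above; the upgrade
    # fraction is an illustrative assumption.
    CONNECTIONS = 900
    CENTRALIZED_PER_CONN = 14_000  # buys the collection system ONLY
    DECENTRAL_PER_CONN = 14_000    # buys collection, treatment AND reuse
    NEEDS_UPGRADE = 0.40           # assumed share of lots with failing systems

    centralized = CONNECTIONS * CENTRALIZED_PER_CONN
    decentralized = CONNECTIONS * NEEDS_UPGRADE * DECENTRAL_PER_CONN
    print(f"Centralized (collection only): ${centralized:,.0f}")
    print(f"Decentralized (complete):      ${decentralized:,.0f}")
    # -> $12,600,000 vs. $5,040,000, before even adding treatment
    #    costs to the centralized side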

Second, since facilities need be built only to serve existing development and imminent new development, this strategy does not speculate on the scale of future development. Whatever new development does occur would not incur costs until it actually hits the ground, so those costs would be delayed until actually needed. This works with the “time value of money” – a dollar you can put off paying until later is worth more than a dollar you have to spend now. This frees money for other investments in the meantime.
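To make the “time value of money” point concrete, here is a standard present-value calculation; the discount rate and deferral period are illustrative assumptions:

    # Present value of a deferred outlay: PV = C / (1 + r)**t
    COST = 1_000_000  # capacity increment built later (assumed)
    RATE = 0.04       # annual discount rate (assumed)
    YEARS = 10        # years until that development actually arrives (assumed)

    pv = COST / (1 + RATE) ** YEARS
    print(f"$1,000,000 spent in {YEARS} years costs ${pv:,.0f} in today's dollars")
    # -> about $675,564 - roughly a third of the outlay avoided, in
    #    present-value terms, just by building when actually needed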

This also has a “social justice” aspect. Under a decentralized concept strategy, the costs of developing new capacity, whenever they occur, would be borne directly by the development generating the need for that capacity. This relieves the existing population from having to participate in financing of facilities to serve activities which may not benefit them in any way, as they will be forced to do under the centralized system. Indeed, Kerr County has asserted that it has not been requiring failing “septic tanks” to be upgraded in Center Point because the residents cannot afford it. Yet, the proposed plan would surely impose high monthly fees on these people, likely higher than the amortized cost of an upgraded on-lot system.

Third, regarding those monthly charges, the overall operations and maintenance costs of the decentralized concept system are likely to be lower. This is difficult to evaluate right now because the planning documents for the centralized system provide O&M costs only for maintaining the collection system. Those costs work out to about $40/connection/month, already a pretty high sewer rate. Decentralized concept systems in a somewhat similar community were projected to incur a considerably lower monthly charge than this for all the O&M. In Center Point, however, the users of the proposed system would also have to pay sewer fees to the treatment plant operator, increasing their total monthly payout by an undetermined amount. As noted above, however, under a decentralized concept option, not all the connections are likely to require upgraded service, so the total O&M cost would very likely be much lower.

Fourth, the water has value, so reusing it to defray irrigation that would be done in any case saves water. Whether or not this yields direct savings on an explicit water bill, there is also another planning process in play to augment water supply in Center Point, entailing another multi-million dollar project. Defraying water demands in this community would limit the need for new water supply. In particular, irrigation supply creates demand peaking, so shaving that peak with reuse could reduce the scale of facilities, saving some of that money.
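A small sketch of that peak-shaving effect; the per-capita demand figures and the reuse offset are assumptions for illustration, not Center Point data:

    # Effect of irrigation reuse on the peak-day supply capacity needed.
    POPULATION = 2_519     # projected 2040 population, from the plan
    INDOOR_GPCD = 80       # indoor use, gal/person/day (assumed)
    PEAK_IRRIG_GPCD = 60   # peak-season irrigation, gal/person/day (assumed)
    REUSE_OFFSET = 2 / 3   # share of the irrigation peak met by reclaimed water (assumed)

    peak_no_reuse = POPULATION * (INDOOR_GPCD + PEAK_IRRIG_GPCD)
    peak_with_reuse = POPULATION * (INDOOR_GPCD + PEAK_IRRIG_GPCD * (1 - REUSE_OFFSET))
    print(f"Peak-day demand without reuse: {peak_no_reuse:,.0f} gpd")
    print(f"Peak-day demand with reuse:    {peak_with_reuse:,.0f} gpd")
    # -> 352,660 vs. 251,900 gpd, a ~29% cut in the capacity
    #    the supply facilities must be sized to deliver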

Fifth, optimizing the beneficial use of the reclaimed water benefits the regional water economy, so would likely put off or decrease the cost of future water supply projects generally. Other, less readily apparent benefits may also decrease global long-term costs to society. For example, salinity of the estuary at the end of the Guadalupe River is of concern, and lessening water demands anywhere in the river basin frees up water for environmental flows. This simply highlights the multi-faceted water challenges facing this region. Without this sort of deep conservation being built into the water management system at every opportunity, costs would be incurred to free up water from other sources to provide the environmental flows, or the estuary will suffer, damaging the economy that depends on its productivity.

It is understood that, given the high institutional bias for conventional projects, getting a decentralized concept system approved and funded might entail a “hassle factor”. But this is no reason to ignore it. Really this is just another cost factor to be evaluated, as the “hassle” translates into hours expended to work through the barriers. So it would seem that the rational course is to evaluate the relative costs and benefits of various options, and then consider whether any “hassle” really offsets the benefits that would be delivered by a “non-standard” project – not the least of which is moving us toward sustainable water. It seems that all concerned choose instead to simply presume, without analyzing it, that the conventional strategy is the only one that could be funded and approved. This avoids ever exposing the barriers and working toward their resolution. So the pattern repeats, with the engineer on the next project again fearing to venture “outside the box”.

The bottom line is that the work has not been done to know if the proposed centralized system is the most fiscally efficient, societally responsible or environmentally benign option available. Indeed, the necessary work to expose all the costs of that option has not even been done. Yet the controlling institutions are all conspiring to move this option forward, apparently unconcerned that other viable – and perhaps significantly superior – options have not been considered at all. In Center Point, this apparent compulsion to cater to a prevailing mental model is a $13 million – at least – failure of imagination.

In terms of both money and water, society cannot afford to continue to suffer such failures.

 

Slashing pollution, saving water – the classic win-win (but ignored by society)

Posted March 24, 2013 by waterbloguer
Categories: Uncategorized

In this entry, we’re going small-bore, looking at a rather localized and somewhat parochial issue. But one that highlights some of the challenges we face in moving society toward sustainable water, in stimulating deep conservation.

Barton Springs is the natural discharge point of the Barton Springs segment of the Edwards Aquifer, which lies to the south/southwest of Austin, Texas. Nitrate levels in Barton Springs have been increasing in recent years. And, according to a USGS report, a good deal of it is from wastewater sources. Who could have guessed? I mean, besides anyone who gave this matter a moment’s thought.

Nitrate in BSZ-USGS

The graphic above shows the growth of those wastewater sources from 1990 to 2010. The top line of graphics shows the growth in number of OSSFs. That stands for on-site sewage facility, the Texas rules-speak name for what are popularly known as “septic” systems. The bottom line shows TLAP systems. That stands for Texas land application permit. In this type of wastewater system, the effluent is spewed out over an area that is “irrigated” mainly just to make the water go “away” rather than for an actual irrigation benefit, such as an improved landscape or growing a marketable crop. In the most used type of “septic” system – consisting of an “aerobic treatment unit” (ATU) and a couple of spray rotors – the wastewater is also spewed out over the ground, with little regard for the value of the water, or of the environmental impacts. Such as increasing the nitrate levels in Barton Springs.

As these graphics show, the density of the “septic” systems increased very dramatically over the two decades they cover. Growth in the number of TLAP systems, while not so dramatic, was also considerable. Given the nature of those systems – spewing the water over land surfaces that, in the case of OSSFs, are not “qualified” at all and, in the case of TLAPs, are addressed in a rather cursory manner – it should not have been the least bit surprising that nitrate levels would rise in the waters that drain out of this watershed. Indeed, particularly when combined with the increases in pollution that were known to accompany the development occurring there, this outcome should have been readily anticipated.

The level of nitrate in Barton Springs is approaching 2 mg/L. The often quoted limit for nitrate in drinking water is 10 mg/L. This is what is termed the “enforcement limit”, the level at which definitive action would be required to reduce nitrate loadings into the groundwater. But there is another limit in the rules, 2 mg/L, which is termed the “preventative action limit”. That is the level at which actions to stem the shedding of nitrate into the groundwater – preventative action – are to be considered. We are there!  So it’s time to start taking preventative action, no?

The tragedy here is that this did not have to happen. Preventative action has been available all throughout those 2 decades. Wastewater could have been managed by means which would have greatly blunted, perhaps essentially eliminated, the shedding of nitrates from these wastewater sources. AND this could have been done at very low overall cost, perhaps at NO cost – or even at a savings – in terms of global life-cycle costs of this water management function, while at the same time conserving water. In any case, the tide can certainly be turned going forward by moving practice to those methods.

First, here is what is wrong with the currently prevailing methods. The ATU employs a technology, activated sludge, which is inherently unstable, and so typically suffers “excursions” in its treatment quality, particularly when used in the essentially unsupervised on-lot environment. As my realtor cousin once said of them, “They puke solids.” In any case, the ATU does not remove nitrogen from the wastewater. Spraying this effluent over the ground surface also limits the amount of denitrification – the biologically-mediated conversion of nitrate to nitrogen gas – attained in the soil. These on-lot systems spew the effluent onto the ground without regard to whether it’s raining or how wet the ground is. All this results in a nitrate-rich effluent being dispersed in a manner that heightens the likelihood a good bit of it would be shed, rather than assimilated in the plant/soil ecology, and so would appear in the waters that drain from this watershed.

That shedding of nitrate can be greatly blunted, perhaps even essentially eliminated, by a shift in the type of OSSF used. First, a treatment unit employing recirculating gravel filter (RGF) technology can be designed to remove a majority of the nitrogen from the wastewater prior to dispersal. This is a very robust, inherently stable treatment process, so it can consistently produce this high-quality, denitrified effluent in the lightly supervised on-lot operating environment. The major proof-of-concept field study of this technology was a project I ran on Washington Island, Wisconsin, in which nitrogen reduction of over 60%, and in some cases approaching 90%, was consistently achieved by systems subject to all the vagaries of operating in the on-lot environment. So using the RGF instead of the ATU for treatment will eliminate over half the nitrogen loadings prior to dispersal, consistently and reliably.
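In mass-loading terms the difference looks like this – the household flow and raw-wastewater nitrogen strength below are typical textbook assumptions, not Washington Island data:

    # Annual nitrogen dispersed per home, ATU vs. denitrifying RGF.
    FLOW_GPD = 250      # household wastewater flow (assumed)
    TN_MG_PER_L = 60.0  # total nitrogen in raw wastewater, mg/L (assumed)
    L_PER_GAL = 3.785
    MG_PER_LB = 453_592

    def annual_lb_n(removal_fraction):
        daily_mg = FLOW_GPD * L_PER_GAL * TN_MG_PER_L * (1 - removal_fraction)
        return daily_mg * 365 / MG_PER_LB

    print(f"ATU, no N removal: {annual_lb_n(0.0):.0f} lb N/yr")
    print(f"RGF at 60%:        {annual_lb_n(0.6):.0f} lb N/yr")
    print(f"RGF at 90%:        {annual_lb_n(0.9):.0f} lb N/yr")
    # -> roughly 46, 18 and 5 lb of nitrogen per home per year,
    #    all of it dispersed into this watershed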

Then, instead of spewing it into the air, this effluent can be dispersed in a subsurface drip irrigation field. With the level of nitrogen in the effluent reduced, it is much more evenly matched to the uptake rate by plants. This dispersal method will also enhance in-soil denitrification. Together, these assure consistently more complete assimilation of the nitrogen that is dispersed into the soil. And subsurface dispersal eliminates runoff of effluent during rainy weather. The result is that very little nitrate will leach or flow “away” to appear in the waters that drain from the watershed.

The RGF/drip strategy is also a deep conservation measure that can move us toward sustainable water. Drip rather than spray dispersal can greatly serve the water economy by displacing potable water with this effluent to defray irrigation demands. Because spray dispersal entails a potential for contact with this partly treated water (which is also questionably disinfected, for reasons we won’t get into here), the spray heads are set away from the house, off somewhere on the lot where they won’t be “obtrusive”. But the improved landscaping, the plants that might be irrigated in any case, are typically up around the house, so these spray systems are hardly ever arrayed to serve that landscaping. Because the drip lines are subsurface, there is very low contact hazard, so the water can be dispersed anywhere on the lot where the owner chooses to install irrigated landscaping, and the effluent routed to that drip field would defray irrigation usage, pretty much gallon for gallon through the peak irrigation season.

Also, irrigation efficiency of drip is inherently much greater than for spray. In any case, the rules require the design dispersal rate for spray systems to be very low, so not much irrigation benefit could be derived even if it did operate at higher efficiency. The rules allow the application rate for drip to be significantly higher, much more in line with irrigation rates through the peak irrigation season. So, in combination with the high irrigation efficiency of drip, a much higher irrigation benefit can be derived from drip dispersal.
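Simple arithmetic shows what is at stake per home – the flow, season length and drip efficiency here are assumptions for illustration:

    # Potable irrigation water a subsurface drip field could defray per home.
    FLOW_GPD = 250          # household wastewater flow (assumed)
    SEASON_DAYS = 210       # rough Hill Country irrigation season (assumed)
    DRIP_EFFICIENCY = 0.90  # share usefully delivered to the root zone (assumed)

    offset = FLOW_GPD * SEASON_DAYS * DRIP_EFFICIENCY
    print(f"Irrigation water defrayed: ~{offset:,.0f} gallons per home per season")
    # -> ~47,250 gallons of potable supply per home per season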

Further, the rules do not require the area over which effluent is sprayed to be “qualified” in any meaningful way, in regard to soil depths and plant cover. In contrast, drip fields must have at least 6 inches of soil beneath the drip lines and 6 inches of cover over them. In the Hill Country terrain of the Barton Springs watershed, this often requires importing soil to attain these depths. Soil is often “enhanced” to create improved landscaping in any case, so with drip the OSSF dispersal field is typically placed in the best soils available on the lot. Better soil increases irrigation efficiency by providing more soil moisture storage capacity, and with there being more soil volume to “absorb” the water even when the soil is already wet from rainfall, it provides for better assimilation of nutrients.

The bottom line is that with higher quality pretreatment, including significant nitrogen reduction, and drip dispersal, the shedding of nitrate would be greatly blunted, if not essentially eliminated, and a very high percentage of the annual effluent flow could contribute to defraying water used for irrigation. The first benefit would halt whatever portion of the nitrate increases in Barton Springs that have been due to OSSFs. The second benefit is a bonus, one that is very valuable to this water-challenged region. I’ve been designing this type of OSSF for over 20 years, and it has been approved by all the local jurisdictions. Therefore, it is clear that these are benefits which can be readily realized, which could have been attained all along.

So why weren’t they? We won’t belabor the details here, but the installed cost of the RGF/drip system would be somewhat higher than an ATU/spray system. And that’s why the latter are so ubiquitous, because first cost typically rules the day. However, the life-cycle costs would be similar, at least if the cost of the water saved is taken into account. (Whether the cost of that water shows up on a monthly bill would depend on if the home were served by a well or by a piped water system.) Other savings derive from much lower power costs (also a benefit in regard to energy sustainability), from lower equipment replacement costs, and from not requiring chlorine for disinfection (another insult to the environment that is avoided by subsurface drip systems). So nitrate reduction could be realized at very low, or no, cost on a global, life-cycle basis.  Again, the barrier is first cost.
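Here is a minimal life-cycle cost sketch of that trade-off. Every number below is an illustrative assumption, not a quote for any actual system, and discounting is ignored to keep it simple:

    # 20-year life-cycle cost, ATU/spray vs. RGF/drip (all inputs assumed).
    YEARS = 20
    WATER_PRICE = 0.004  # $/gallon of potable water defrayed (assumed)

    def life_cycle_cost(first_cost, annual_power, annual_parts, water_saved_gal_yr):
        annual_net = annual_power + annual_parts - water_saved_gal_yr * WATER_PRICE
        return first_cost + annual_net * YEARS  # no discounting, for simplicity

    atu_spray = life_cycle_cost(7_000, annual_power=250, annual_parts=300,
                                water_saved_gal_yr=0)
    rgf_drip = life_cycle_cost(10_000, annual_power=75, annual_parts=100,
                               water_saved_gal_yr=45_000)
    print(f"ATU/spray: ${atu_spray:,.0f}   RGF/drip: ${rgf_drip:,.0f}")
    # -> first cost favors the ATU/spray; life-cycle cost favors RGF/drip

The exact crossover depends on local power, parts and water prices, but the pattern is the point: the system that costs more on day one can cost less over its life.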

These same technologies could be just as readily used in those TLAP systems. In those systems, land application is, in theory, operated so that nitrogen loadings match plant uptake and in-soil denitrification rates. That could be much more readily, and cost efficiently, attained using the denitrifying RGF system for treatment and subsurface drip irrigation for dispersal. The shedding of nitrate could be further attenuated by placing the drip fields in areas that would be irrigated in any case. This improved landscaping would have better soils than the rangeland and cedar breaks typically constituting the dispersal fields in TLAP systems, to which the water is routed simply to make it go “away”, with no intent of defraying irrigation water usage in the development the TLAP system serves.

Again, the RGF/drip strategy is an exemplar of deep conservation – integrating water-efficient practices, instead of water-wasting practices, into the very fabric of development. Indeed, it calls into question why any responsible entity in this increasingly water-challenged region would allow water to be so gratuitously wasted, when there are readily available – and globally cost-efficient – methods that can blunt that waste and realize the resource value of what is now being managed solely and exclusively as if it were a nuisance. That both state and local regulatory systems embrace and support those wasteful methods is testament to the institutional resistance to deep conservation.

Going forward, however, a win-win situation is there for the taking. At the same time that water use efficiency could be greatly enhanced, further increases in nitrate being shed into this watershed can be essentially eliminated by shifting to the appropriate technologies. Over time, the existing sources could also be phased out. As people come to value the water being thrown away in their sprayfields, the spray systems may be replaced with drip irrigation fields, arrayed to irrigate their highest value landscaping.

This could be spurred on if there continue to be water curtailments due to drought, since the drip field would “drought-proof” the landscaping it serves. That’s because water curtailments in all the local drought contingency plans impact only exterior water use. The wastewater dispersed in the drip field would derive from interior water use, which is not curtailed, so the landscaping over the drip field could continue to be irrigated through the drought.

Then too, as the ATUs wear out, or the owners get tired of the frequent replacement costs (or the stench they often produce), they could be retrofitted to an RGF, obtaining the nitrogen reductions in the treatment system as well. Together with the drip field replacing the spray system, again this would greatly blunt, if not essentially eliminate, the nitrate being shed by the existing OSSFs.

This is a fairly impressive list of benefits, for both water quality and water quantity, from simply plugging in the appropriate technologies for the circumstances at hand. As noted, the barrier is the first cost of those appropriate technologies, along with the inertia of the wastewater management field of practice, and the sad fact that ATU/spray systems are accorded what amounts to a “most favored status” in the OSSF rules system in Texas.

The latter two factors are matters of reforming the “culture” of the field, but the first cost issue is a ubiquitous problem in regard to all manner of efforts to enhance water sustainability. Society has not figured out how to send to those who incur the first costs the signal sent by the global life-cycle costs. The result is that choices are made which may well serve the short-term interests of those who bear those first costs but poorly serve the long-term best interests of society.

A solution to that conundrum could be provided by appropriate regulation to attain ends which do serve the long-term best interests of society. Like requiring OSSFs in nitrogen-sensitive watersheds to meet nitrogen reduction standards, while simultaneously significantly defraying irrigation demands on “original” water supplies. Here in Texas, society has not yet gotten around to considering its long-term best interests in these regards. So we’ve seen, and no doubt will continue to see, increases in the level of nitrate measured in Barton Springs. And all that water running through those wastewater systems will continue to be indeed wasted.