Let’s Compare

Posted September 26, 2016 by waterbloguer
Categories: Uncategorized

A water management tragedy is playing out around the Hill Country community of Dripping Springs.

As reviewed in a previous post, the city has predetermined, apparently without any meaningful analysis of options, that it will extend wastewater service to large developments being planned around the city by doubling down on the prevailing 19th century infrastructure model. The plan is to increase capacity at its existing centralized treatment plant and to extend sewer trunk mains to major developments to the east, west and south of the city. The city believes this “requires” it to apply for a permit to discharge effluent from the centralized plant into a branch of Onion Creek, along which lie downstream sites where a goodly portion of the Edwards Aquifer’s recharge occurs, and along which live a goodly number of people concerned about the impacts of this discharge on the creek. Recent information indicates that discharges into the creek would also recharge the Trinity Aquifer, a source of the city’s water supply, right about where the wells serving the city are located.

The level of treatment which the Texas Commission on Environmental Quality (TCEQ) appears likely to permit for Dripping Springs will not require reduction of nitrogen. Nitrogen in the discharge can lead to algal blooms in the creek, degrading water quality and the visual quality of the riparian environment. Recharge of nitrogen-laden water would also be a problem for drinking water withdrawn from wells. It appears the permit will also not require consideration of contaminants of emerging concern (CECs), such as pharmaceuticals, which are likewise problematic in drinking water. They may also impact life in the stream; cases of sex changes in fish have been observed in waters receiving discharges containing CECs.

In response to widespread criticism of its discharge permit, the city asserts that “most” of the effluent would not be discharged but would instead be routed to irrigation reuse. Some would be used on city-owned facilities, but since those are quite limited, most of it would apparently be routed back to the developments generating much of the increased flow to the city’s plant, to be used for irrigation there. However, no reuse lines running to those developments are shown in the city’s Preliminary Engineering Planning Report (PERP), dated July 2013, which a city spokesperson stated is still their “official plan”. Since it’s clear this long-looped, far-flung centralized reuse system would be quite costly, it appears the city has not yet calculated the full cost of its “disposal” focused conventional centralized strategy with reuse just appended on at the end.

A 21st century alternative to this very roundabout method was reviewed in “This is How We Do It”. That decentralized concept would obviate the long-looped, high-cost system of pipes and pump stations needed to first make this water supply “go away” and then to run it back to where it was generated in the first place. Rather, a tight-looped reuse system would be integrated into the development – irrigating the neighborhood where the wastewater is generated – as if reuse were a basic principle of water management, instead of just an afterthought to a disposal-centric system down there at the end of the pipe. Reuse would be maximized cost efficiently, and because it would be designed into the development it would not be optional, so it would most definitely cut the use of potable water for irrigation. This is a model which the city and the developers of the large projects around it have so far refused to consider.

Dripping Springs also contends it must centralize all wastewater flows because it aims to implement a direct potable reuse (DPR) scheme, providing additional treatment to bring this wastewater to potable quality and introducing it into the city’s water supply. The city asserts this is the “ultimate” scheme for reuse of this water. DPR, however, is a rather problematic strategy. The costs would be prodigious, and the city does not own, and thus does not control, the water system that supplies the city; an independent water supply corporation does.

There are also social equity issues with this scheme. Even if that water supply corporation were to integrate its system into a DPR scheme, it does not, and will not, provide water to some of the outlying development. So in the process of serving that growth, the existing citizens would be expected to drink reclaimed water produced from wastewater generated by residents who would not have to drink it themselves.

It’s an open question whether the water supply situation in and around Dripping Springs is, or will become, so dire that the city would ever seriously consider the extreme costs, and the other complications, of going to DPR. Arguments can be made that increased water supply could be more cost efficiently, and safely, provided locally by building-scale rainwater harvesting, so the actual utility of DPR is questionable. From all indications so far, a DPR system in Dripping Springs is just theoretical – hardly the best argument for ignoring anything but a conventional centralized wastewater system, without any regard for the consequences.

So let’s compare the two approaches to expanding the Dripping Springs wastewater system to serve those large outlying developments, to see what those consequences may be. They can be compared on fiscal, societal and environmental grounds.

More Fiscally Reasonable

As reviewed in “This is How We Do It”, the decentralized concept appears to be far more fiscally and economically efficient than the conventional centralized system is projected to be. That analysis was admittedly limited, meant only to be illustrative, and more work is needed to put some meat on that skeleton, but the comparison was stark.

To review, a sketch plan was created – see the figure below – and a cost estimate derived for this decentralized concept strategy in a neighborhood in the Headwaters project, to the east of Dripping Springs. The estimate was $8,000 per house for collection, treatment and redistribution of the reclaimed water throughout the neighborhood, to provide irrigation of front yards, parks/common areas, and parkways. Since those areas would be irrigated in any case, an irrigation system would need to be installed regardless, so the drip irrigation fields in those areas would not entail much additional cost. The estimated global capital cost of the decentralized concept strategy was therefore $8,000 per house, or about $8 million total for the 1,000 houses planned in Headwaters.

Headwaters neighborhood ww sketch plan

[click on image to enlarge]

From the city’s PERP, the estimated cost in 2013 of the “east interceptor”, to convey wastewater from Headwaters to the city’s centralized plant, was $7.78 million. Spread over those 1,000 houses, that yields a cost of $7,780 per house. This one line by itself costs almost as much as was estimated for the complete decentralized concept wastewater system, yet all it does is move the stuff around.

To complete the centralized collection system requires installation of all of the local collector lines, manholes, and lift stations within Headwaters, an additional cost likely north of $10 million. Then there would also be a buy-in cost for a share of the treatment capacity at the centralized plant, likely a few million more. So it seems pretty clear that the basic conventional centralized strategy would be more than double the cost of the decentralized concept strategy, which is again a complete system, including reuse. Yet for that much greater cost, they get only collection and treatment, still having to pay for any reuse of this water.
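To make the arithmetic behind this comparison explicit, here is a minimal sketch in Python. The interceptor and per-house figures are the rough numbers cited above; the local collection and plant buy-in figures are just the order-of-magnitude guesses stated in this post, not engineering estimates.

```python
# Rough per-house cost comparison for the Headwaters example (illustrative only).
# Figures are the approximate values discussed in this post, not engineering estimates.

HOUSES = 1000

# Decentralized concept: complete system (collection, treatment, reuse distribution).
decentralized_per_house = 8_000
decentralized_total = decentralized_per_house * HOUSES          # ~$8 million

# Conventional centralized: "east interceptor" plus rough guesses for local
# collection and treatment plant buy-in, per the text above.
east_interceptor = 7_780_000      # 2013 PERP estimate
local_collection = 10_000_000     # "likely north of $10 million"
plant_buy_in = 3_000_000          # "a few million more" (assumed)
centralized_total = east_interceptor + local_collection + plant_buy_in

print(f"Decentralized: ${decentralized_total:,} (${decentralized_per_house:,}/house)")
print(f"Centralized:   ${centralized_total:,} (${centralized_total // HOUSES:,}/house)")
print(f"Cost ratio: {centralized_total / decentralized_total:.1f}x, before any reuse piping")
```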

One wonders, in what other arena would the prospect of getting more function for less than half the cost not be compelling? But in this arena, that prospect does not seem to even be noticed!

Dripping Springs insists that little, if any, of the water would be discharged; instead facilities would be installed to route it to irrigation reuse. That would entail pump stations, transmission mains, storage facilities, distribution lines within the areas where the water would be irrigated, and the irrigation systems. What all this would cost has apparently not been addressed; as noted, the city’s latest PERP is utterly silent on this. To just get the water back to Headwaters, for example, would likely entail a cost similar to the “east interceptor”. Clearly, the cost of enabling reuse under the disposal-centric centralized infrastructure model would be much greater than the cost to integrate reuse into the very fabric of development, as the decentralized concept does.

Then too, the energy demands of the decentralized concept system would be much lower than for the centralized system. The multiple distributed treatment units would use less energy in total than would be required to run the centralized activated sludge treatment plant. Little if any energy would be required to pump wastewater to those distributed units. In contrast, wastewater would run through multiple lift stations to get to the centralized plant. The tight-looped distributed reuse system would require little energy, as the water would only be pumped short distances. In the long-looped centralized strategy, much more energy would be expended to get water from the centralized plant to far-flung points of reuse. These energy savings also impart a fiscal advantage to the decentralized concept.

As it is turning out, however, that analysis of Headwaters is “theoretical”, because it appears that Dripping Springs is no longer planning to install the “east interceptor” and run the wastewater from Headwaters to its centralized treatment plant. It appears the city will also not build the “west interceptor” to run wastewater from Scenic Greens, another major development in the city’s hinterlands, to that plant. Instead these developments will be left to implement and independently run stand-alone wastewater systems.

Still, the analysis of that Headwaters neighborhood is indicative of what may be generally expected in any of the outlying developments that Dripping Springs does include in its centralized system. So it remains a general indication of how the decentralized concept would be more fiscally reasonable, likely far more so.

The wastewater systems within Headwaters and Scenic Greens are presently planned to be, in their own right, smaller-scale, disposal-centric conventional centralized systems. Effluent will likely be run to “waste areas” within each development rather than to areas that would be irrigated in any case – “land dumping” this water resource. These satellite treatment plants are exactly what Dripping Springs has asserted it is centralizing to avoid, so the decision not to include Headwaters has societal dimensions to it. Leading us to …

More Societally Responsible

Dripping Springs will face a clear temptation to “cut corners” on the centralized reuse program that’s just appended on to an otherwise “disposal” focused system exactly because it will cost them so much. But under the decentralized concept, reuse will be practically maximized, most cost efficiently, because it’s designed into the development, serving the local and regional water economy well just as a matter of course, no further effort or expense required.

A decentralized concept system would be inherently simpler to plan and finance. Each distributed system would serve a small area, a neighborhood, to be built out in short order. Contrast this with planning large-scale facilities over an area-wide system, with much less definite growth projections.

And because investments are so focused, the costs of planning, designing and implementing the wastewater infrastructure could be readily “assigned” to those who directly benefit from the development – the developer would directly fund the building of those distributed systems. This is unlike the conventional centralized system, which is typically financed by loans and bonds, spreading the costs among the whole of the city’s citizenry and/or ratepayer base. So the decentralized concept could be more equitably financed; existing residents would not be compelled to be the “bank” for development.

Then there’s the “time value of money”. With distributed systems, only the infrastructure needed to serve imminent development would be installed, neighborhood by neighborhood, so cost would closely track actual service needs. In the conventional centralized system, on the other hand, facilities that will not be fully utilized for years to come are routinely installed; dollars are paid today for something not needed for years, foregoing all other investments that money could fund in the meantime. In Headwaters, for example, buildout is expected to take years, but to centralize it, the “east interceptor” and associated lift stations, sized for that ultimate flow, would have to be installed up front, before the first house could be served.

And this is all money “at risk”. If, for example, we were to experience another “crash” such as occurred in 2008, the pace of development might slow down, even stop altogether for a time. But once the money is borrowed and the system built, the payments would be due whether development came on line to fund those payments or not. So whoever financed that infrastructure would be “on the hook” to make those payments. If these facilities were publicly financed, it would be all of the ratepayers, and/or taxpayers, who would be called upon to pony up. This could balloon their wastewater rates and/or tax bills. All that would be avoided under a decentralized concept strategy, which assigns that risk to the developer, who would be putting relatively small amounts at risk at a time.

If the management needs of each area were considered independently, there would be no need for a “one size fits all” approach. But the conventional centralized system is a one-trick pony; either an area is sewered and the “waste” water is piped “away”, or – sorry, that’s the one trick – it’s left unmanaged. Under a decentralized concept strategy, the needs of each area can be considered independently. Some areas might be connected to an existing centralized system, some areas may have distributed systems, some areas may use individual on-lot systems, with all of those systems under unified area-wide management. So one management entity could accommodate each development in the most cost efficient manner, with systems best suited to the characteristics of the area and the type of development planned for it.

This would eliminate the “balkanization” of wastewater management Dripping Springs said it’s centralizing to avoid, not wanting a bunch of independent operators running systems around it – or installing unmanaged on-lot systems. On its present course, however, balkanization is just what will happen. That the city has had to abandon centralizing Headwaters and Scenic Greens, each relatively close in to Dripping Springs proper, highlights that it’s clearly a pipe dream to centralize the whole of Dripping Springs’ far-flung extraterritorial jurisdiction. They will continue to accept a proliferation of independent operators and unmanaged wastewater systems. Under the decentralized concept, they wouldn’t have to; they could manage it all, effectively and cost efficiently.

Independent systems could also be required for any “industrial” wastewater generators that might locate within the service area. Each such generator could be required to tailor its treatment to the characteristics of its wastewater flow. And also to the reuse opportunities inherent in the operation at hand, or that may be offered by co-located activities.

The decentralized concept is inherently growth-neutral. Each distributed system serves only a limited area of known imminent development. The centralized system, however, creates large-scale infrastructure covering an area that would grow over time. Since this infrastructure needs to be installed, and financed, up front of any development over that larger area, it creates an impetus for growth, indeed for higher intensity growth, to pay for those large-scale facilities. The infrastructure funding “tail” is allowed to wag the development “dog”, dictating the pace and nature of growth.

The “out of sight, out of mind” nature of the conventional centralized system, taking the water far, far away from the neighborhoods where it’s generated, has at times resulted in wastewater management failing to get adequate funding to do the job well. Many cities, MUDs, etc., have a story or two about that. But with the system right there in the neighborhood, there would be constant vigilance to assure that proper management effort is always applied, that adequate funding to maintain the system is always provided. Of course, some may question if keeping the wastewater in the neighborhood is an undue “hazard”. But as reviewed in “This is How We Do It”, these distributed systems would be rather less likely to create any problems than the wastewater systems now routinely used in hinterlands developments, which do not seem to be causing much alarm, so that objection is rather disingenuous.

A little noticed feature of the decentralized concept is that the system could readily accommodate any level of water conservation found to be desirable, or necessary, in the future. Employing the effluent sewer concept, the “big chunks” are retained in the interceptor tanks, and only liquid effluent is conveyed to the treatment centers. So cutting the flow, no matter how drastically, would not cause any problems in these collection lines. In conventional sewers, on the other hand, if “too much” water conservation were practiced, the sewers would be “starved” of the liquid flow needed to move solids through the lines. During severe droughts, some utilities have had to haul in water to flush sewer lines because the wastewater generators were “too good” at cutting their water use. Stagnation of sewer flows can also cause a buildup of hydrogen sulfide in the sewers, a potentially deadly hazard to sewer workers that can also degrade sewer system components. That is another whole field of risk that would be completely avoided under the decentralized concept.

Another societal issue is vulnerability to pollution, an inherent quality of the type of wastewater system being used. This leads us to a consideration of the differences in environmental impacts between the two infrastructure models.

More Environmentally Benign

Scale is a major driver of environmental vulnerability. In the conventional centralized system, large flows run through one pipe or one lift station or one treatment plant, so the consequences of any mishap – like a line break, power outage, flow surge, flood damage – are potentially “large”. With distributed systems, flows at any point in the system remain “small”, thus the potential consequences of any mishap remain “small”. Eliminating all of the large-scale collection lines outside the neighborhoods, the decentralized system has shorter runs of smaller pipes, minimizing vulnerability. Then too, with a distributed system, any mishap that may occur would only affect a small part of the overall system. All the other independent distributed systems would not be impacted at all.

In any case, decentralized concept infrastructure would be much less likely to experience problems to begin with. Effluent sewers are built “tight”, with no manholes, and there are only short runs of small pipes, so infiltration/inflow and exfiltration/overflows would be somewhere between minimal and non-existent, while conventional sewers are inherently leak-prone, typically leaking more as they age. Decentralization also minimizes, and perhaps can eliminate, pump stations in the collection system, removing a major source of (sometimes major) bypasses that plague centralized systems.

The distributed treatment unit employs a highly stable, very robust technology – the high performance biofiltration concept – which is highly resistant to upsets and, by the very nature of how it’s built and operates, does not allow bypassing of untreated wastewater. The conventional centralized plant, employing the inherently unstable activated sludge technology, is a point of high vulnerability where any sort of mishap, flow surge, etc., could lead to a bypass or poorly treated water running freely on through the treatment plant.

In a conventional centralized system, the larger sewers typically have to run in the lowest topography, the riparian zones. Thus these areas are torn up to install the sewers, and often to repair or upgrade them, creating environmental vulnerability. In the decentralized system, since flows are not highly aggregated, riparian areas can typically be avoided, eliminating this vulnerability.

Also, with the collection lines being small and shallowly buried, far less disruption is entailed when installing the sewer lines, wherever they are located, and reclaimed water distribution lines can typically be laid in the same trench, making their installation non-disruptive. Since the system would be expanded by adding new distributed systems rather than by routing ever more flow to existing treatment centers, there would never be a need to upgrade collection lines, eliminating that on-going disruption.

All these factors impart a far lower vulnerability to environmental degradation with a decentralized concept system. Indeed, centralization is a “vulnerability magnet”, gathering the stuff from far and wide to one point, where again any mishap can cause “large” impacts.

Backward – or Forward?

Carrying the promise of being (far) more fiscally reasonable, more societally responsible, and more environmentally benign than the “disposal” focused 19th century conventional centralized infrastructure model, it is nothing less than a water management tragedy that Dripping Springs, and the developers of the large projects around the city, will not consider the 21st century decentralized concept infrastructure model. Preferring the “comfort” of the familiar, they will extend and perpetuate that 19th century model, incurring the high costs of implementing it and the even higher costs of appending on at the end of the big pipe a far-flung reuse system, along with all the societal and environmental ills it entails. Since this infrastructure has a service life of several decades, this retreat into the past will cement in place an infrastructure model that may hamstring progressive water management in this area for generations to come.

This is a tragedy that local society does not have to endure. It merely requires the boldness and wisdom to move forward, instead of backward. To explore the full range of options, of infrastructure models, that the city and the surrounding developers have at their disposal. Their refusal to do so is a free choice. There are no imperatives “forcing” them to forego such an examination, not fiscally, nor societally, nor environmentally – as just reviewed, all those factors highly favor the decentralized concept. And not regulatorily. TCEQ has confirmed the decentralized concept can be readily permitted.

There is nothing really new here. This is just a re-framing, in the current context, of the forward-looking ideas, ideals, concepts and principles that have been set out for society’s consideration for decades. Perhaps here and now, with the “urgency” of an Onion Creek discharge in the mix, society will choose to act on them.

Indeed the question is, will we continue to fall backward, or move forward?


A Rain Garden’s Adventure – UPDATE

Posted September 7, 2015 by waterbloguer
Categories: Uncategorized

… from concept to the future

The Problem – our back yard floods when we get large, sustained rains.


Drainage from the lot across our back fence pools in our back yard because our house blocks drainage from our back yard toward the street. To blunt the breadth of this ponding – particularly its tendency to lap up onto the back patio – we decided to install a rain garden that would pond and infiltrate much of the flow within it, instead of letting it spread over the yard.


The logical location for that rain garden is, of course, the low point, where ponding first appears. So that set the location: the ponded area in the picture above.

What is a rain garden?

Informally, a “rain garden” is any vegetated low spot where runoff gathers and infiltrates into the ground. So in that sense, much of our back yard is a rain garden. As a formal term of art, a rain garden is a bioretention bed. That stormwater management tool is an excavation that is filled with an “engineered” media into which plants are installed. Water gathers in the excavation, filling the pores in that media and ponding up over it, to the overflow depth of the bed. Below is a generalized schematic of a bioretention bed.


Of course, since water ponds here instead of running off in any case, in terms of hydrology installing a formal rain garden – a bioretention bed – in this yard is rather gratuitous. But as noted, the aim was to have more of the water pond in the rain garden excavation, with less spreading across the yard and onto the patio. And also, as I am an avid advocate of Low-Impact Development – for which the bioretention bed is a prime tool – I wanted to create an example that practices what I preach.

Creating Our Rain Garden

Having determined the best location, the low spot where ponding in the back yard begins, I cut the edge and removed the turf. This created an excavation about 4 inches deep, which I then lined with 4-inch cut stone blocks.


The next step was to dig out the rain garden excavation which would be filled with the engineered media. I decided a media depth of about 8 inches would be a good compromise between providing storage volume, limiting the depth plant roots would have to extend to get into the native soil, and “preserving” my back. I wheelbarrowed all the excavated material, including the turf dug off the top, out to the front yard, where I had long planned on installing a raised bed. Below we see the excavation, ready to have the media installed. Note the tree roots I cut out when digging out the bed. I laid those in the bottom of the excavation before installing the media, to create a sort of hugelkultur bed for the rain garden plants.


I decided on a media obtained from a local yard named JV Dirt, a specialty mix they created that includes expanded shale. The expanded shale increases the water holding capacity of the media. Since the media must be coarse/sandy so that water will readily infiltrate into it when runoff starts to gather in the bed, the ability of the media to hold water in the plant root zone over the media depth will be limited. The expanded shale “absorbs” water, which it then releases as the soil around it dries out, so that more water would be available to the plants through extended periods of no rain.

It took two runs in our pickup truck to get the media here. We wheelbarrowed the media into the back yard, dumped it into the rain garden excavation, and leveled it out. The finished product is seen below:


I decided to add some compost and mix it into the media, to provide a “better” planting bed, that would support the plants better than the “bare” media, which contains limited organics and nutrients. This is by design, actually, because often a bioretention bed is installed with an underdrain and acts as a biofilter, so that “excess” nutrients in the media may flow out of the bed, a sort of “compost tea”. As those biofiltration beds are installed to treat the runoff to protect water quality, regulations for those installations limit the nutrient content of the media. In our case, though, the water infiltrates into a deep soil in a “non-sensitive” watershed, so leaching of nutrients from the media here is not an environmental hazard. And besides, this is a rain garden installed “outside” of the regulatory system, so adding compost to our media was “okay” in that sense too. The bed with the compost mixed in, ready for planting, is shown below:


The next step was to decide what plants to install. We wanted the bed to be an attractive landscape amenity, so we chose a variety of plants that would provide various colors and plant shapes. We also chose plants that were going to be available at the Wildflower Center’s spring plant sale. I consulted lists of plants recommended for rain gardens put out by the Wildflower Center, Texas A&M, and the City of Austin, and came up with this planting plan:


The general plan is to have a ring of smaller plants around the edge and some central plants that would grow taller and spread out some. I was told by an expert landscaper friend that the flame acanthus would “overwhelm” a space this small, but I decided to give it a try anyway.

So when the Wildflower Sale came along, we got the plants. I wanted to install all 1-gallon plants so that they’d have a more well-developed root system, but had to settle for 4-inch plants for the red columbine and the plains coreopsis.

The planting begins:


Here we see me – and our cat Joey – at the “Bon Jovi point” (“Oh, we’re halfway there …” 😉)


And finally I get to the last plant – yea!!


I then covered the surface of the media with a thin layer of mulch, to both hold down “weed” growth and to blunt drying out of the media over extended periods with no rain.


With that, the rain garden is finished!! And I cracked open a Real Ale Fireman’s 4.

Into Operation

Lacking rain, I spot watered each plant every other day or so to get them established. Particularly important with the coarse/sandy media all around the plants. Below we see the bed about a week after planting. All plants are doing well.


It was a few weeks before we got a “significant” rainfall. When we finally did, we saw the rain garden begin to pond – with no ponding over the rest of the yard. It works!!


A few days later, we got a big enough rain that the bed completely filled up to the top of the rock border. As we see below, again with minimal ponding outside the bed.


This was the beginning of a series of larger rainstorms through May and June, including the big Memorial Day floods. So the system was really put to a test.

When we got a much larger rainstorm, there was still some ponding over the yard outside the bed too, as we see below, but clearly more of the water was contained within the rain garden.


And each time the rain garden filled up, it would drain down in less than a day, so the plants were not standing in water for too long. Here we see the rain garden in “mid-drain”:


Then we got a week of intermittently heavy rainfalls. The rain garden filled up and the ponding spread over the yard 3 times that week:


And then the rain garden filled up a 4th time. With the ground so saturated, the rain garden drained more slowly this time. All told, this spate of rainy weather resulted in the plants being in standing water for several days. The “too small” red columbines got completely covered – for too long – and they were clearly “toast”. The Texas betony also failed to keep stems above water after a few days of ponding.


In the picture below, we see the impacts on the plants of having been in standing water for several days. Being young plants, not having developed significant growth and an extensive root structure, they were perhaps not “prepared” for that. Or it could be that some of these plants are simply not really very good choices for a rain garden.


One of the purple coneflowers appears to be dying, and the yellow columbine and flame acanthus appear to be struggling mightily. The bunch grasses appear to have survived, the skullcaps look a bit worse for wear but are still standing tall, one of the coreopsis has lodged, the other looks fine. The gulf coast penstemon and – especially – the inland sea oats seem not to have been bothered at all.

Below is a more closeup view of the flame acanthus, yellow columbine, purple coneflower and the now “toasted” red columbine. And just at the edge of the view, the now “flattened” Texas betony. The flame acanthus appears highly compromised, but the two bunch grasses appeared to be surviving, for the present.


The rains were relentless, however, and the bed flooded and flooded again. As noted, with the soil – not only below the bed but all around it – so saturated, the bed was draining more slowly, leaving the plants in ponded water for days on end. One by one, the plants began to fail. Until only the inland sea oats and the gulf coast penstemon survived. I’m guessing the bunch grasses would have made it if they had been big enough to have foliage above the water, but they weren’t, and so they succumbed. Leaving us with a very impoverished plant palette in a mostly bare rain garden bed. We filled it in with potted plants, “to keep up appearances”😉


It was a case of very bad timing, of a spate of heavy rains before the plants could become well established. Still, as we saw, the rain garden basically “worked” as it was expected to, containing the runoff up to the point that it was overwhelmed, and infiltrating it into the soil, holding water on the site and bolstering deep soil moisture.

Then the spigot shut off, and we did not get any rain for about 2 months. Note all the leaf fall at the end of August, showing how drought-stressed the trees are. And still the inland sea oats and gulf coast penstemon hung on. So we know those, at least, are very robust rain garden plants.


We’ll put in more plants in the fall when the temperatures moderate, about the time of the Wildflower Center fall plant sale. We’ll keep on watching and tracking the plants, and updating this document. Please follow along with us and watch to see what the future holds for this rain garden. Which plants will survive, and thrive? Which will have to be replaced? How will it look, and perform, over time?

On to the future …


The fall Wildflower Sale is here, so yesterday we got more plants for the rain garden and planted them. The rain garden now looks like this:


The added plants include more inland sea oats and another gulf coast penstemon – the survivors from the original planting – and switchgrass, yellow Indian grass, American beautyberry and Turk’s cap. If the promised El Nino ever brings us any rain, we’ll watch and see how these do.

When installing these new plants, I encountered a very “hard” media and needed to use my geology pick to create the planting holes. I attribute this to the rain garden media being “settled” by the repeated and prolonged ponding in May and June, then being “baked” for a couple months with no rain. I will observe how readily the water infiltrates – if we ever get any big rains that fill it up – to see if the media may need to be “aerated” to restore its permeability. That may be a maintenance program that needs to be considered for rain gardens in this climate. We’ll see.

Continuing to watch …


… and Stormwater Too

Posted October 14, 2014 by waterbloguer
Categories: Uncategorized

In the last post, we reviewed a decentralized concept “waste” water management strategy which can render the “waste” water system more fiscally and economically efficient for both the developer and society while also focusing on beneficial utilization of that water resource, rather than on making a perceived nuisance go “away”. In this post, we look at specifically how stormwater can be managed to attain those same ends, as was generally reviewed in “Stormwater Management Can Be ‘Green’ Too”.

As we did when reviewing “waste” water management, we’ll use that neighborhood in the proposed Headwaters project on the outskirts of Dripping Springs as an example of how we might husband the stormwater resource. A Low-Impact Development (LID) scheme for a stormwater management system is sketched onto the neighborhood plan in the figure below. Two strategies are employed in that scheme, with the aims being to defray demands on the potable water system and to hold at least as much rainwater on the land as would have infiltrated on this site in its “natural” condition:

  1. Integrate water quality management of runoff from the rooftops with rainwater harvesting to obtain direct use of that runoff to meet irrigation demands over the back yards of each lot.
  2. Provide water quality management of runoff from the ground level surfaces using bioretention beds, which would retain and infiltrate runoff from the areas tributary to each bed.

Headwaters neighborhood WQ sketch plan

[click on image to enlarge]

Recall from the last post that reclaimed “waste” water would be used to irrigate “public” spaces – front yards, parkways, a neighborhood park – leaving irrigation of back yards, the “private” spaces, to other means. Under the stormwater management scheme shown above, runoff from rooftops would be captured in a water quality tank integrated into a rainwater harvesting system so that this water could be used to irrigate these back yards. The system integrating these functions is illustrated in the figure below.

Rooftop WQRWH scheme_CROP

[click on image to enlarge]

The rules governing water quality management on this project would require a “water quality volume” (WQV) to be sequestered and treated, while runoff volumes in excess of that amount could overflow and leave the site. The water quality (WQ) tanks shown in the figure above are sized to hold the WQV calculated for the rooftop runoff. The rules require that the WQV must be evacuated within 48 hours, so that this tank capacity is again available to capture that volume from another storm. Typically, under such a scheme, the water in these tanks would be drained “away” over that period and so would not be available to provide irrigation supply, since little if any irrigation would be required within that 48-hour period.
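As a rough illustration of the sizing involved, here is a minimal sketch assuming a simple capture-depth approach to the WQV; the actual WQV formula and capture depth are set by the applicable rules, so the 0.5-inch figure and the 2,000 sq. ft. roofprint used here are just placeholder assumptions.

```python
# Minimal sketch of WQ tank sizing for rooftop runoff (illustrative assumptions).
# The applicable rules define the actual WQV calculation; a simple
# capture-depth-times-roof-area approach is assumed here for illustration.

GAL_PER_CUFT = 7.48

def wq_volume_gal(roof_area_sqft: float, capture_depth_in: float = 0.5) -> float:
    """Water quality volume, assuming WQV = capture depth x roof area."""
    return roof_area_sqft * (capture_depth_in / 12.0) * GAL_PER_CUFT

def drawdown_rate_gpm(wqv_gal: float, hours: float = 48.0) -> float:
    """Average outflow rate needed to evacuate the WQV within the required period."""
    return wqv_gal / (hours * 60.0)

roof = 2000.0   # assumed sq. ft. roofprint
wqv = wq_volume_gal(roof)
print(f"WQV ~ {wqv:.0f} gal; 48-hour drawdown ~ {drawdown_rate_gpm(wqv):.2f} gpm")
```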

Under the scheme illustrated above, however, these tanks would drain into the rainwater harvesting (RWH) cisterns, which would be buried in the back yard, and only when those tanks were full would water pond up in the WQ tanks. The water in the RWH cisterns could be held there as long as required until needed for irrigation, for which a high efficiency subsurface drip irrigation field would be installed. Thus much of the water that would have been “disposed of” would instead be retained and used to meet irrigation demands.

Note that the RWH cisterns could be installed at the homeowner’s option, required only if the homeowner does plan to irrigate the back yard. If the RWH cisterns were not there, the water quality management concept employing the WQ tanks would still operate in a “normal” mode, as this method is set forth in the applicable rules. As reviewed below, the arrangement shown in the figure above simply enhances that “normal” water quality management scheme.

It does that by capturing, and retaining on the site, a somewhat larger portion of the roof runoff than would be retained if the RWH cisterns were not there. The WQ tanks would drain into the RWH cisterns until they are full. Typically the RWH cisterns would not be full at the start of most storms, since the typical inter-storm period is long enough that some portion of the water in the RWH cisterns would have been evacuated to run the irrigation system. In that case, the volume captured would be whatever amount is required to fill up the RWH cisterns plus, if that storm produced more than enough rain to fill them, whatever amount ponded up into the WQ tanks, up to their overflow level. Only the water that had ponded up in the WQ tanks would be drained in short order, with the rest being retained in the RWH cisterns until irrigation was needed.

The majority of the rainfalls over the annual cycle are less than the depth that would generate runoff equal to the WQV and so fill the WQ tanks. Many rainfalls may not even cause water to pond up into the WQ tanks at all; rather, all the roof runoff from those storms may flow into the RWH cisterns. Thus a large portion of the rainfall onto the roof would be slowly infiltrated into the soil, through the irrigation system. That water would evapotranspire rather than flow “away”, just as most of the rainfall over the annual cycle would have infiltrated into the area of soil now covered by the house roof. This greatly blunts the impact of placing impervious cover on the site and so provides a level of water quality protection superior to a system that simply sequesters and releases the WQV within 48 hours, as current rules require. And it does this while also providing a water supply for irrigation. Again, the stormwater management function is integrated with the water supply function, rendering each of them more efficient.

This scheme also proposes that whatever volume does pond in the WQ tanks would drain from them into the drip lines. Knowing the flow rate of the drip emitters at the head created by the ponding depth in the WQ tanks, the number of emitters required to drain the WQV in 24 hours can be calculated, and that many emitters made available to receive this water. Note that this would be done regardless of whether or not the homeowner were to install the RWH cisterns, so that even in that case, the WQV would be largely stored in the soil, enhancing long-term soil moisture, instead of directly flowing “away”.
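A minimal sketch of that emitter calculation is shown below; the WQV and the per-emitter flow rate under the tank head are assumed placeholder values, to be replaced with the calculated WQV and the manufacturer’s emitter curves.

```python
# How many drip emitters does it take to drain the WQV in 24 hours? (illustrative)
# The emitter flow rate under the head created by the WQ tank ponding depth is an
# assumed value; actual rates come from the emitter manufacturer's curves.
import math

def emitters_needed(wqv_gal: float, emitter_gph: float, drain_hours: float = 24.0) -> int:
    """Number of emitters required to pass the WQV within the drain period."""
    return math.ceil(wqv_gal / (emitter_gph * drain_hours))

# Assumed example: ~620-gal rooftop WQV, emitters passing ~0.9 gph under the tank head.
print(emitters_needed(620.0, 0.9))   # -> 29 emitters dedicated to WQ tank drawdown
```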

With this arrangement, draining the WQ tanks in 24 hours, the WQV would also defray any stormwater detention volume that may be required for this project under rules governing the control of peak runoff rates. That would decrease, gallon for gallon, the size of any detention facilities that must be built, another savings for the developer.

As noted, any irrigation system in the back yard would be a subsurface drip irrigation field, fed by a pump in the RWH cistern. This would render the irrigation process most efficient, so that the harvested water would provide as much irrigation benefit as practically attainable. When installed in conjunction with improving the soil to support landscaping in the back yard, a subsurface drip irrigation field would not be significantly more expensive than a less efficient spray system covering the same area. Any installation cost difference would deliver long-term value to the homeowner.

The RWH cistern volume needed so that the back yard could be irrigated with just this water supply can be determined by running a rainwater harvesting model. A fairly cost efficient installation would be two concrete tanks (modified septic tanks), each having a capacity of 2,500 gallons, so that the total RWH cistern volume would be 5,000 gallons. The details won’t be belabored here, but the model will show that, presuming the house roofprint is 2,000 sq. ft. and that the landscaping is composed of plants needing a “low” amount of irrigation, this system would cover the irrigation demands of an area the size of this back yard in all but such severe drought years as 2011 was around here.
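At its simplest, the kind of rainwater harvesting model referred to here is a daily mass balance on the cisterns. Below is a minimal sketch of such a model; the runoff coefficient and the daily irrigation demand are assumptions (the actual demand would come from the landscape plan), and the rainfall record should be actual daily data for this area.

```python
# Minimal daily mass-balance sketch of a rainwater harvesting cistern (illustrative).
# Roof area, runoff coefficient, cistern volume, and irrigation demand are assumptions;
# `daily_rain_in` should be an actual local daily rainfall record (inches per day).

GAL_PER_SQFT_PER_INCH = 0.623   # gallons of runoff per sq ft of roof per inch of rain

def simulate_cistern(daily_rain_in, roof_sqft=2000.0, cistern_gal=5000.0,
                     runoff_coeff=0.9, demand_gpd=60.0):
    """Return (gallons of unmet irrigation demand, gallons overflowed) over the record."""
    stored = 0.0
    shortfall = 0.0
    overflow = 0.0
    for rain in daily_rain_in:
        inflow = rain * roof_sqft * GAL_PER_SQFT_PER_INCH * runoff_coeff
        stored += inflow
        if stored > cistern_gal:            # excess ponds up in the WQ tanks and overflows
            overflow += stored - cistern_gal
            stored = cistern_gal
        draw = min(stored, demand_gpd)      # today's irrigation draw
        stored -= draw
        shortfall += demand_gpd - draw      # demand not met from the cisterns
    return shortfall, overflow

# Example with a made-up record: a dry month broken by one 1.2-inch storm.
rain = [0.0] * 20 + [1.2] + [0.0] * 9
print(simulate_cistern(rain))
```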

About that low water demanding landscaping: if the homeowner wanted a back yard covered mainly with turf, as is often the case – for instance, to provide a playspace for children – then this would argue for using the sort of drought tolerant turf offered by Native American Seed under the name Thunder Turf or by the Wildflower Center under the name Habiturf. The irrigation demand profile suggested by these organizations to maintain this turf in a fairly lush condition was presumed in running that rainwater harvesting model. To whatever extent turf would be displaced with shrub beds, ground cover, etc., those plants should also be drought-tolerant natives. This illustrates the need to move to more regionally appropriate landscaping as part of the overall water management strategy here.

Before proceeding to consider the general strategy for managing runoff from ground level surfaces, note an optional “wrinkle” suggested in the figure above for managing driveway runoff. If driveways were constructed using permeable pavement, in essence they would create their own water quality management system. The rain falling on the driveway would not run off, rather would infiltrate through the pavement surface, to be held in a gravel bed below until it infiltrated. That bed would be sized to hold at least the WQV calculated for the driveway pavement area. Until that bed filled up, no runoff would flow “away”. Again, over the annual cycle a majority of the rainfalls have a depth less than the WQV capture depth, so would not produce runoff from this pavement. This impervious surface then would have a rainfall-runoff response somewhat similar to the area of natural soil it displaces.

The driveway pavement could be composed of pavers, porous asphalt, or porous concrete. Being a fairly small area of pavement, any cost bump over a “normal” concrete driveway – the typical installation – would be “small” for each home. This would be defrayed by there being less impervious surface draining to the other ground level stormwater controls, decreasing their sizes, and thus their total costs.

A variation of this would be to make the driveway a rainwater harvesting system as well, by installing material below the pavement that would hold a significantly greater volume than the calculated WQV. This could be done by installing a deeper gravel bed to create more void volume. However, there are a number of products on the market made just for the purpose of creating a water storage volume below a paved surface which may provide this storage more cost efficiently. This stored water could also be used to defray irrigation needs.
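As a rough illustration of the storage involved, here is a minimal sketch of the gravel bed depth needed under the driveway; the capture depth and the gravel void ratio are assumptions, not values from the applicable rules or any particular product.

```python
# Sketch of the gravel storage bed depth under a permeable driveway (illustrative).
# Capture depth and gravel void ratio are assumptions, not regulatory or product values.

def bed_depth_in(capture_depth_in: float = 0.5, void_ratio: float = 0.35,
                 storage_multiple: float = 1.0) -> float:
    """Gravel depth (inches) needed to store `storage_multiple` x the WQV in the voids,
    with the bed covering the same footprint as the pavement draining to it."""
    return capture_depth_in * storage_multiple / void_ratio

print(f"WQV only:        {bed_depth_in():.1f} in of gravel")                      # ~1.4 in
print(f"2x WQV storage:  {bed_depth_in(storage_multiple=2.0):.1f} in of gravel")  # ~2.9 in
```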

Now to the management of stormwater runoff from ground level surfaces. As shown on the overall development plan above, bioretention beds would be built downslope of all developed areas. These beds would intercept and treat an amount of runoff from each storm up to the WQV calculated for the area tributary to each bioretention bed, and would also infiltrate that WQV. The general scheme for construction of the bioretention beds is shown in the figure below, on this particular site taking advantage of the sloping ground on which those beds would be arrayed.

Bioretention bed section detail_CROP

[click on image to enlarge]

The berms built to contain the bioretention beds could use material excavated on the project site for roads and house foundations, providing both on site “disposal” of this spoil and a cost efficient source for that material. The core of the berm could be the poor quality subsoil, while the top and downslope surfaces of the berm would be covered with salvaged topsoil so that those surfaces could be effectively restored to prevent erosion. All such disturbed surfaces on the project site would be restored with native grasses and/or wildflowers so that no long-term irrigation of those surfaces would be needed. The bioretention bed surfaces would also be restored with native plants, chosen to endure both short-term inundation and long-term dry spells. With good planning, these plants would enhance and blend in with the vegetation on the slopes below the developed areas to provide a pleasant viewscape from the house yards.

A bioretention bed stores the WQV until it can be infiltrated into the soil below it. A combination of the void volume in the bioretention media and a ponded depth above the media provides this storage. The minimum required footprint of the bioretention bed is determined by the infiltration rate of the soil underlying the bed. The height of the berm would be set so that the bioretention bed surface would run far enough up the slope to provide the required area. Any runoff into the bioretention bed in excess of the WQV would overflow the berm, either as sheet flow or through defined overflow weirs, in which case provisions would have to be made to avoid erosion due to those channelized flows.
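A minimal sketch of that sizing logic is below; the capture depth, media depth and porosity, maximum ponding depth, and soil infiltration rate are all placeholder assumptions, to be replaced with the values from the applicable rules and the site’s soils data.

```python
# Sketch of bioretention bed sizing (illustrative assumptions throughout).
# Storage for the WQV comes from the media void volume plus the ponded depth;
# the soil infiltration rate governs how fast a full bed drains down.

def bed_area_sqft(tributary_sqft: float, capture_depth_in: float = 0.5,
                  media_depth_in: float = 18.0, media_porosity: float = 0.3,
                  max_pond_in: float = 9.0) -> float:
    """Footprint needed to store the WQV in media voids plus surface ponding."""
    wqv_cuft = tributary_sqft * capture_depth_in / 12.0
    storage_cuft_per_sqft = (media_depth_in * media_porosity + max_pond_in) / 12.0
    return wqv_cuft / storage_cuft_per_sqft

def drawdown_hours(media_depth_in: float = 18.0, media_porosity: float = 0.3,
                   max_pond_in: float = 9.0, soil_rate_in_per_hr: float = 0.3) -> float:
    """Rough time for a full bed to infiltrate into the underlying soil."""
    stored_depth_in = max_pond_in + media_depth_in * media_porosity
    return stored_depth_in / soil_rate_in_per_hr

area = bed_area_sqft(tributary_sqft=40_000)   # e.g., roughly an acre of tributary area
print(f"~{area:,.0f} sq ft of bed; ~{drawdown_hours():.0f} hours to drain when full")
```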

By capturing and infiltrating the WQV in these bioretention beds, we would be holding on the land at least as much rainfall as would have been infiltrating under the “natural” condition of this land. The way these beds would be constructed, they could be rather cost efficiently “oversized” relative to the calculated WQV to retain more runoff. It would indeed be good to hold more rainfall on the land, since the “natural” condition of this particular property is rather hydrologically degraded, the legacy of poor land management practices by previous generations. This scheme for stormwater quality management can thus help to “heal” the land as well as provide superior water quality management. This strategy will greatly blunt the degradation of water quality over this watershed that the placement of development on this degraded landscape could readily impart.

Another benefit of this scheme is that, with the thin soils covering this landscape, much of the water that infiltrates below the bioretention beds would probably not be retained in the soil over the long term, rather it would migrate to rock shelves or other slowly permeable strata and move downslope, where a significant portion of it may emerge at seeps. This is why it is important to provide the high level of pretreatment that a bioretention bed can impart, which along with migration through the soil will deliver a highly “renovated” water to the seep faces.

This is a benefit because these seeps could flow into streams, increasing and extending baseflow in the creeks downstream. That would benefit the riparian environment generally, but in this particular area, it could provide another benefit. The creeks which drain this area run through the Recharge Zone of the Edwards Aquifer, and in creek beds are where a large majority of this aquifer’s recharge occurs. Understand that under a conventional stormwater quality management scheme, which focuses on making the runoff flow “away” after short-term detention, adding impervious surfaces would make creek flow more subject to “flash” hydrology, with a large flow surge following a storm and then no flow between storms. Those large flow surges would more readily flow on by recharge features in the creek beds, and so would less “efficiently” recharge the aquifer. Therefore, extending the period of flow at a lower rate – that is, enhancing baseflow at the expense of quickflow runoff – would enhance recharge, and so enhance the water supply obtainable from this aquifer.

As for the cost efficiency of this strategy, notice in the overall neighborhood plan there are no storm sewers. Indeed, flows would only be channelized in the street gutters, flows which are directed into bioretention beds where they are spread out and infiltrated. It has been typically observed that this sort of stormwater management concept, centered on distributed green infrastructure rather than gathering runoff into more centralized end-of-pipe devices, is significantly less expensive, even as it does a better job of water quality management. Add on the water supply value the suggested scheme provides and this approach is no doubt significantly more cost efficient, both directly for the developer and globally for society.

In summary, this stormwater management strategy of integrating rooftop runoff with rainwater harvesting and capturing, treating and infiltrating ground level runoff provides superior water quality protection while also augmenting water supply. Overall, just as for the “waste” water management strategy reviewed in the last post, this integrated stormwater management strategy offers a win-win-win for society, for the developer, and for the residents of this project. It is another aspect of the sustainable water strategy that should be considered for all development in this region.



Posted September 24, 2014 by waterbloguer
Categories: Uncategorized

In the last post it was asked if Dripping Springs and developers there could bust out of their 19th century approach to water resources management. In this post, we look at how they can do that, reviewing a plan for a decentralized concept wastewater system, with the reclaimed water used for irrigation. And in the next post we’ll look at a stormwater management plan based on distributed Low-Impact Development (LID) techniques, integrated with rainwater harvesting to also supply irrigation water. This integrated water management strategy is a 21st century approach.

To address our 21st century water challenges, the over-arching goal of employing these strategies is to pretty much take irrigation off the potable water system. It is typically expected that, averaged over the year, about 40% of projected water demands would be for irrigation. So if we attain our aim, the actual water required for new development could be only 60% of the currently projected demand. That would drastically reduce the strain on existing supplies from local groundwater and the Highland Lakes, and it would blunt the need to develop new supplies, like schemes to import water to this area from the Simsboro Aquifer. That would be very expensive and likely unsustainable over the long term, as reviewed in “One More Generation”. Decentralized concept wastewater systems and LID stormwater management can therefore help to move us toward sustainable water in this region.

A neighborhood in the proposed Headwaters project is used to offer an example of these techniques. This is one of the developments in the hinterlands around Dripping Springs to which the city is proposing to extend an interceptor and take their wastewater “away”. An overall draft plan for Headwaters is shown below. The neighborhood we’ll focus on is along the first side street you come to as you run on down the main road entering the project off U.S. Hwy. 290. This street and one cul-de-sac off of it are fronted by 29 houses.

hd conceptual yield study - full plan 2014

[click on image to enlarge]



Before proceeding to review the details within this neighborhood, let’s look at how the decentralized concept strategy works with the “time value of money”. The development-scale conventional centralized wastewater system currently permitted for this project would have the treatment plant located in the lowest area down toward Barton Creek. So if the development were to begin with lots on the “higher” end, closer to Hwy. 290 to minimize the length of the main road and waterline to be initially installed, then they’d have to build a long run of wastewater interceptor main down to the treatment plant site, all of which would have to be sized to carry the flow from all of the development that would eventually occur along its route. This would impart a large “carrying charge” since much of that development wouldn’t be built for many years, thus much of that investment would lie “idle” for a long time. If, on the other hand, the first neighborhoods to be developed were down close to the treatment plant site, then a long run of road and waterline would have to be built, also imparting a “carrying charge”. As we will see, under the decentralized concept strategy, the whole wastewater system for each neighborhood is self-contained, built on a “just in time” basis, so one could start development anywhere desired without incurring “carrying charges” for that function.
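To put a rough number on that “carrying charge”, here is a minimal sketch; the capital amount, interest rate and idle period are all hypothetical figures, purely illustrative of the compounding cost of capital that sits idle ahead of the development it was built to serve.

```python
# Rough "carrying charge" illustration for up-front, oversized infrastructure.
# All inputs are hypothetical; the point is the compounding cost of idle capital.

def carrying_charge(capital: float, annual_rate: float, idle_years: int) -> float:
    """Interest accrued on capital spent up front but not yet serving development."""
    return capital * ((1.0 + annual_rate) ** idle_years - 1.0)

# Hypothetical example: a $7.78M interceptor financed at 4%, with much of its
# capacity sitting unused for 10 years while the service area builds out.
print(f"${carrying_charge(7_780_000, 0.04, 10):,.0f} accrued over 10 idle years")
```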

Back to that example neighborhood, a plan approximating its layout is shown below, along with a decentralized concept wastewater system to serve those 29 houses. The plan utilizes the three essential tools of the decentralized concept to simplify the system, to make it more robust, to reduce its costs, and to maximize water utilization efficiency:

  • Effluent sewerage.
  • “Fail-safe” treatment.
  • Beneficial reuse of the water resource.

Headwaters neighborhood ww sketch plan

[click on image to enlarge]

Effluent Sewers – Simpler and Less Costly

In an effluent sewer system, wastewater from a house runs through the house drain into an interceptor tank (primary septic tank). These interceptor tanks hold and digest the settleable solids, so that only a liquefied effluent would run to the treatment unit. This allows use of small-diameter effluent sewer lines. These can run at very small and variable grades, typically with the lay of the land, in shallow, narrow trenches.

The interested reader can find a thorough review of the effluent sewer concept and its advantages here. But basically we use it because it is less expensive than conventional large-pipe sewers that have to run on larger and uniform grades, and it creates a simple, easy sludge management system – just pumping the interceptor tanks at multi-year intervals. Because the whole system is contained within the neighborhood, the overall cost of the collection system would be significantly less than a conventional collection system. That system would include not only the more costly lines within this neighborhood but also the large interceptor mains outside the neighborhood leading to the centralized treatment plant, the lines that would incur those “carrying charges”.

Ideally the area tributary to a treatment unit would be those houses that could drain by gravity through an effluent sewer system to the plant location. But where the topography dictates, one or more interceptor tanks could drain to a pump station, as shown in the drawing, to be pumped from there to the treatment unit, or to a point in the pipe system where gravity flow could take over. Such pump lines would also be small-diameter pipes installed in shallow, narrow trenches. Where they run parallel to the effluent gravity sewer, they would be in the same trench, so the cost of the pressure sewer would essentially be just the cost of the second pipe in that trench. Also an effluent pump station would be simpler and less problematic, and significantly less costly, than a conventional lift station, which would be needed at that point in a conventional centralized system.

“Fail-Safe” Treatment is Essential

The concept of “fail-safe” treatment bears a bit of explanation. I always use quotes in setting forth this tool, as nothing is ever completely fail-safe. Every sort of treatment unit will need proper operations and maintenance (O&M) in order to continue to function over time. However, there are some treatment technologies which, by their very nature, are resilient and robust, whose failure modes are slow and gradual, so they can consistently and reliably produce a high-quality effluent even in the face of temporarily poor operating conditions. One such technology is a variant of recirculating sand filter technology that I have labeled the high performance biofiltration concept. The interested reader can go here to get a thorough rundown of how this concept works and why it is highly robust, able to run with minimal routine oversight.

That’s a sharp contrast to the inherently unstable activated sludge process that is almost exclusively used by the mainstream in their centralized plants. And it is essential to a strategy entailing many small, distributed treatment units. Using activated sludge plants as distributed treatment units would be a disaster, as the O&M liabilities would be untenable. The high performance biofiltration concept, however, can run with little active oversight over long periods, so policing multiple plant sites would not create a great burden. This simple operating concept also uses far less energy than an activated sludge plant.

This treatment system will consistently and reliably produce an effluent quality better than that produced by most municipal treatment plants, including removing well more than half of the nitrogen from the wastewater. Recall from the last post that nitrogen was identified as a problematic pollutant, so care must be taken to remove it before any of the reclaimed water might seep into a creek.

Maximizing Irrigation Reuse, Minimizing Pollution

The reclaimed water coming out of the treatment unit would be routed into subsurface drip irrigation fields, arrayed as much as possible to irrigate areas that would be irrigated in any case, so maximizing the reuse value of this water resource. As the plan shows, much of the reclaimed water feed pipe could run in a common trench with the effluent sewer pipes. So the cost of much of this distribution system would basically be just the cost of the second pipe in that trench.

As noted, this decentralized concept plan aims to take irrigation demands off the potable water system. The plan shows dispersal of the reclaimed water is focused on front and side yards and parkways, in the “public” spaces, leaving the back yards – the “private” spaces – unencumbered by the drip fields, allowing the owners to install patios, pools, etc., there. (As will be reviewed in the next post, those private areas could be irrigated with harvested rainwater, integrating that into the stormwater management scheme, so taking that irrigation off the potable water system as well.)

The area of drip irrigation field is based on a design hydraulic application rate onto the drip fields of 0.1 gallon per square foot per day, about the average year-round evapotranspiration rate in this area. This plan provides more than enough space in areas that could be beneficially irrigated to meet that criterion. Note, however, that this rate dictates that, on average, the field would be under-loaded through the heat of the summer and over-loaded through the winter. Thus, the drip field would act like a “drainfield” through part of the year, and indeed on any days at any time of year when a significant amount of rain fell. This may raise a concern about public health and environmental protection.
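
Before turning to that concern, the sizing arithmetic can be made concrete with a short sketch in Python. The application rate and the house count come from this post; the per-house wastewater flow is an assumed value supplied only for illustration, since no design flow is stated here.

```python
# Drip field sizing at the design hydraulic application rate from the text.
# The wastewater flow per house is an assumption, for illustration only.

APPLICATION_RATE_GPSFD = 0.1         # gallons per square foot per day (from the text)
ASSUMED_FLOW_GPD_PER_HOUSE = 180     # hypothetical: e.g., ~3 occupants at 60 gal/person/day
HOUSES = 29                          # houses in the example neighborhood

field_area_per_house_sqft = ASSUMED_FLOW_GPD_PER_HOUSE / APPLICATION_RATE_GPSFD
total_field_area_sqft = field_area_per_house_sqft * HOUSES

print(f"Drip field needed per house: {field_area_per_house_sqft:,.0f} sq ft")
print(f"Total for the 29-house neighborhood: {total_field_area_sqft:,.0f} sq ft "
      f"(~{total_field_area_sqft / 43_560:.2f} acres)")
```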

With the high quality effluent produced by the high performance biofiltration concept, and adding on UV (ultraviolet) disinfection as a safety factor, subsurface dispersal would pose no public health hazard. The water would be sequestered below the surface so contact potential would be extremely low. And any water that percolates through the soil, perhaps eventually to emerge at seeps as is likely in this topography, would have been completely “renovated” by passage through the improved soil that would be installed over the drip field areas.

Regarding environmental protection, the nitrogen concentration of the wastewater will be knocked down in the treatment unit, so it would be loaded at a rate more closely matching the uptake of nitrogen by plants covering the drip fields. Over the annual cycle the vast majority of the reclaimed water entering the drip irrigation fields would exit by way of evapotranspiration into the air instead of percolation down into the soil, where it could perhaps migrate to seeps and on into streams. So the potential mass loading of nitrogen into streams would be inherently very low. In any case, it can be expected that most of the nitrogen in the reclaimed water that is not taken up by plants would be eliminated by in-soil denitrification, gassing off the nitrogen into the atmosphere in the same manner it is eliminated in the treatment process. The soil is also by far the best medium for eliminating/assimilating the contaminants of emerging concern, such as pharmaceuticals, which would be so very problematic if the effluent were discharged to a stream.
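
For a feel for the magnitudes involved, here is an illustrative mass-loading sketch. Every numeric value in it is a hypothetical assumption chosen only for illustration; neither the effluent nitrogen concentration nor the flow is a design or measured figure for this project.

```python
# Back-of-envelope nitrogen loading onto the drip fields.
# Every numeric value here is a hypothetical assumption, for illustration only.

GAL_TO_L = 3.785                   # liters per gallon
MG_TO_LB = 2.2046e-6               # pounds per milligram

assumed_flow_gpd = 29 * 180        # hypothetical neighborhood flow, gallons/day
assumed_effluent_tn_mg_l = 15      # hypothetical total nitrogen after treatment, mg/L
assumed_field_area_acres = 1.2     # hypothetical total drip field area (as in the sketch above)

n_lb_per_year = assumed_flow_gpd * GAL_TO_L * assumed_effluent_tn_mg_l * MG_TO_LB * 365
n_lb_per_acre_year = n_lb_per_year / assumed_field_area_acres

print(f"Nitrogen applied: ~{n_lb_per_year:.0f} lb/year, "
      f"~{n_lb_per_acre_year:.0f} lb/acre/year over the drip fields")
# Plant uptake and in-soil denitrification act on this applied load before any
# percolating water could carry the remainder toward seeps or streams.
```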

Indeed, critics of dispersing the reclaimed water in uncontrolled access areas as shown in the plan – front and side yards, parkways, a park – would be hard-pressed to show why this would be unsound practice in regard to either public health or environmental protection. As reviewed in “Slashing pollution, saving water – the classic win-win (but ignored by society)”, our controlling institutions allow – indeed they support – the spewing over the surface, all across Hill Country watersheds, of water that has been questionably treated in home-sized activated sludge units subjected to a meaningless level of oversight and then rather questionably disinfected in a drop-feed tablet chlorinator. And many other houses have subsurface drainfields dispersing water into the soil, with no organized oversight at all. Contrast that with what is posed here – a professionally managed, highly robust and resilient treatment unit with subsurface dispersal at irrigation rates after highly effective UV disinfection – inherently far less problematic.

By the same token, concerns about marketability of a development with this sort of wastewater management system ring hollow. Again, there are all those houses being sold with that (smelly) activated sludge unit sitting right next to the house, with the poorly treated water sprayed around the lot. Whole large subdivisions, including some within Dripping Springs’ jurisdiction, employ that as the wastewater management strategy, and the builders don’t seem to be batting an eye. Indeed, it is the builders who insist upon installing the relatively cheap activated sludge and spray dispersal system, who refuse to consider using a “fail-safe” treatment unit and drip irrigation. So to suggest that the decentralized concept scheme would negatively impact marketability is disingenuous, to say the least.

Back to real issues, irrigation of front and side yards with the reclaimed water would relieve the homeowners of water bills to irrigate these spaces. But, as noted, the amount of water each house would produce as wastewater would leave these areas under-loaded through the peak irrigation season, IF conventional turf and/or high-water-demand “exotic” plants were used to create those front yard landscapes. This suggests another strategy to match the needs of the landscape to the water made available through the wastewater system – a regionally appropriate landscaping aesthetic/ethic. A front yard landscape employing native and native-adapted plants, such as shown in the picture below, could thrive on the amount of water the wastewater system could provide, so no draw on the potable water system for additional irrigation would be needed through the peak irrigation season.


[Figure: front yard landscape of native and native-adapted plants]

This sort of landscape might be institutionalized as the “face” of this development, displacing the sterile patch of water-guzzling turf that is the “stock” aesthetic in such places. The developer might deliver the home to the buyer with a basic native plant palette in place over the mulched and improved soil bed, as required to support drip dispersal of reclaimed water. The homeowner might be given an account at a participating native plant nursery and some assistance/instruction in native plants so that he/she can enhance the landscape as desired, creating buy-in to this aesthetic.

What Does It Cost?

Rough cost estimates were made for the effluent sewer system, the treatment unit, and the reclaimed water feed system shown in this neighborhood plan, yielding an estimate of about $8,000/house. To complete the entire system, the cost of the drip irrigation fields would have to be added. It is questionable, however, whether those are not, at least in part, costs that would be borne in any case, given that much of the area shown on the plan as irrigation field might be irrigated anyway. It can be argued that, in this terrain, amended soil would cover the front yards and parkways to support improved landscaping without regard to whether it would be needed to support environmentally sound drip dispersal of the reclaimed water. Indeed, minimum soil depth on lots for landscaping is required under the Fish & Wildlife MOU that would allow this development to use water delivered by the Hwy. 290 pipeline from Lake Travis. And installing drip lines in this amended soil would not be significantly more costly than a spray irrigation system, which the drip fields displace.

While hard to compare without more details than I currently have, it is expected that these costs compare well with what would be needed to implement the conventional centralized system. The conventional collection system within this neighborhood would incur a similar cost to the effluent sewer system within it, with the interceptor tanks included, and then to that you’d have to add a share of the cost of the interceptors and lift stations needed to get wastewater from this neighborhood down to the centralized treatment plant. That plant would no doubt cost less per gallon per day of capacity than the small decentralized plant, but here again the centralized plant would be sized for the flow at buildout. So the total cost would be much greater, with the capacity that would not be fully utilized for many years imparting a “carrying charge”.

Also, under the centralized plan, the dispersal of the treated water would be a “land dumping” operation, with the cost of the dispersal system not providing any benefit other than making that water go “away”. So the entire reclaimed water distribution system and the entire dispersal system would all be extra costs, instead of displacing irrigation systems that would otherwise be installed anyway. And all that water would be wasted, even as the homeowners purchase potable water to run their irrigation systems.

If, instead of implementing their development-scale conventional centralized system, the developers connected to the City of Dripping Springs system, it does not appear that their cost situation would be much, if any, better. The estimated cost of the “east” interceptor that would receive wastewater from Headwaters is $7.78 million (per the Dripping Springs PERP dated July 2013). While of course that interceptor would eventually serve other development, its major reason for being in the Dripping Springs plan is to incorporate Headwaters into the city’s proposed conventional centralized system. Dividing that cost by the planned 1,000 lots in Headwaters, the cost per lot is $7,780, by itself almost as much as the rough estimate for collection and treatment of the wastewater and redistribution of reclaimed water under the decentralized plan. This would be in addition to the internal sewer network, including several lift stations, within Headwaters. There’d also be a charge for buy-in to the city’s treatment capacity.

Of course, this all presumes that the city could have that interceptor and the lift station(s) associated with it on line, as well as its treatment plant expanded, before the first house in Headwaters becomes occupied. As Headwaters has filed a preliminary plan for its first phase of 208 lots, that is an open question. Dripping Springs has not yet even released its revised PERP, thus has not even begun the permitting process at TCEQ, which can be expected to run about a year. If all goes well, that is; it could take longer.

Also note that it would be only those 208 lots, not 1,000 lots, that the developer could spread the buy-in costs over. But the entire $7.78 million must be put in the ground up front, along with the $8+ million for the treatment plant expansion. And the estimated cost for permitting is another $1 million in up front money. It will not be the developer who will be prevailed upon to cover all these costs, rather they will be covered by bonds, the payments for which will doubtless be spread over the entire city’s ratepayer base. So there is an aspect of social equity to be considered here too, as existing ratepayers will be required to help pay the costs incurred due to growth. Which, it has been asserted, will never pay back through tax revenues what it costs to install and maintain the infrastructure needed to actuate it, at least if that continues to be the conventional infrastructure.
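
The per-lot arithmetic running through the last few paragraphs can be laid out in a few lines. The dollar figures and lot counts are the ones quoted above, with the plant expansion taken as a round $8 million; only the division is being added here.

```python
# Per-lot arithmetic for the city's "east" interceptor, using figures quoted above.

interceptor_cost = 7_780_000   # "east" interceptor, per the July 2013 PERP
plant_expansion = 8_000_000    # treatment plant expansion ("$8+ million" above)
permitting = 1_000_000         # estimated permitting cost

planned_lots = 1_000           # planned lots in Headwaters at buildout
first_phase_lots = 208         # lots in the filed first-phase preliminary plan

up_front = interceptor_cost + plant_expansion + permitting

print(f"Interceptor cost per lot, spread over buildout: ${interceptor_cost / planned_lots:,.0f}")
print(f"Total up-front spending before the first house: ${up_front:,.0f}")
print(f"That total spread over the first phase alone:   ${up_front / first_phase_lots:,.0f} per lot")
```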

A note about operations and maintenance. Understand that all of the decentralized concept systems would be under unified management, either by the MUD organized by the developer or as an integral part of the Dripping Springs wastewater system. While it won’t be belabored here, it is expected that the O&M costs of the decentralized concept systems would be less, perhaps significantly so, than for the conventional centralized system.

All things considered, the price of the decentralized concept system appears likely to be a sweet deal for the developer, even without taking into account the “time value of money” benefits of installing the wastewater system infrastructure on a “just in time” basis, to serve only the neighborhoods slated for imminent development. And because of the social equity issues, it would be a sweet deal for the existing citizens of Dripping Springs as well.


Then add on the reuse benefits, displacing irrigation demand from the potable water system, which will further benefit all the citizens of the area by delaying, perhaps obviating, the need to implement a very costly long-distance “California-style” water transfer scheme that would greatly increase water rates. Altogether this is a win-win-win. So clearly it would be in the interests of Dripping Springs and the developers there to give meaningful consideration to a decentralized concept wastewater management strategy.


Can Dripping Springs, and developers there, bust out of the 19th century?

Posted September 8, 2014 by waterbloguer
Categories: Uncategorized

Or will they choose to remain stuck there? Because, you know, that is a choice they are free to make.

It’s a simple proposition, really. If your aim is to maximize use of the water resource we mistakenly call “wastewater” to defray demands on the area’s water supplies, then it just makes sense to design the “waste” water system around that principle. It doesn’t make sense to instead use a large majority of the money dedicated to this function to build a large-scale system of pipes and pump stations focused on making what’s misperceived as a nuisance to go “away”, then to spend even more money on another large-scale system of pipes and pumps to run the reclaimed water back to where it came from in the first place!

That’s the standard MO of our mainstream institutions, like the City of Dripping Springs and the engineers who advise it and developers whose projects would feed into the city’s centralized wastewater system. This centralized management concept was a response to the conditions considered paramount in the 19th century. The industrial revolution was in full force, city populations were exploding, the stuff was littering the streets, creating a stench and a serious threat of epidemic disease. The response was to pipe it “away”, to be deposited in the most conveniently available water body. Later, as it was realized those water bodies were being turned into foul open sewers, creating a threat of disease in downstream cities that withdrew their water supplies from them, treatment at the end of the pipe was considered, and eventually adopted as the standard.

The intellectual leadership of the centralized pipe-it-away strategy was centered in well-watered areas like northern Europe and the northeastern and midwestern areas of the US. So the resource value of that “waste” water was never part of the equation. This water, and the nutrients it contains, was viewed solely and exclusively as a nuisance, to be made to go to that magical place we call “away” – the working definition of which is apparently “no longer noticeable by me.” This centralized pipe-it-away strategy became institutionalized as the manner in which cities manage wastewater.

Of course, that strategy flies in the face of the circumstances confronting us here in Central Texas in the 21st century – that water, all water, is a valuable resource which we can no longer afford to so cavalierly waste by addressing it solely and exclusively as if it were just a nuisance, simply because that is what the prevailing mental model dictates. Rather, it’s imperative we practically maximize the resource value of that water, using it to defray demands on the area’s water supplies, which are being stressed by both chronic drought and population growth.

In the Texas Hill Country, we also have an issue with surface discharge of wastewater, even when treated to the highest standards that the Texas Commission on Environmental Quality (TCEQ) has so far formulated. And before proceeding I’d note that this issue would remain even if the whole system were to operate perfectly all the time. But of course, it will not; there will inevitably be “incidents”. Which brings up the issue of the vulnerability created by centralization. I’ve often said, not entirely tongue-in-cheek, that the real point of regionalization – TCEQ-speak for centralizing flow from as far and wide as can be attained – is to gather all this stuff together at one point where it can really do some damage. Indeed, the whole organizational strategy is a “vulnerability magnet”. Large flows being run through one treatment plant or one lift station or one transmission main means that any mishap may create large impacts.

Back to the issue with discharge in the Hill Country, the major problem is those nutrients in the wastewater, in particular nitrogen. A discharge of the magnitude that an expanded Dripping Springs system would create, centralizing wastewater flow from developments for miles around the city in every direction, would make the receiving stream effluent-dominated. This would be partly an artifact of the drawdown of local aquifers drying up springs and thus reducing natural streamflow – again highlighting how critical it is to defray demands on these local water resources – but in larger part due simply to the magnitude of the wastewater flow. Highlighting the problematic nature of “permitted pollution” when the flow has been centralized so that, even with low concentration limits, the mass loadings may still be “large”. The nitrogen would cause chronic algal blooms in the creeks, making them very green most of the time, and then depleting oxygen in the water when the algae die off, degrading the riparian environment.

This is deemed an aesthetic affront by downstream landowners. But even more critical, the stream that would receive Dripping Springs’ discharge is Onion Creek, a major source of recharge to the Edwards Aquifer. That’s a sole source aquifer supplying water to about 60,000 people and is the source of Barton Springs, which is home to endangered species. So there’s great antipathy to any plan by Dripping Springs to discharge.

The “standard” option is to continue to “land apply” the effluent from its wastewater treatment plant – “irrigating” land for the sole purpose of making the water go “away” rather than to enhance the landscape or grow a cash crop – which the city does under its current permit. This practice is more accurately termed “land dumping”, and in this region, in this time, it is an unconscionable waste of this water resource.

At least discharge would have some utility, providing more constant flow in the creek, enhancing the riparian environment, and a more constant recharge of the Edwards Aquifer. That is, it would have utility if the water were to be treated to a standard that would preclude the “insults” noted above.

In regard to nutrients, that is technically possible – albeit unlikely to be required by TCEQ – but it would be quite expensive. Burnet discovered that treating to a higher standard to allow them to discharge into Hamilton Creek, which eventually flows into the Highland Lakes, would add about $10 million to the cost of their treatment plant. But that still won’t attain the high removal rate demanded for discharge into Hill Country creeks that recharge the Edwards Aquifer.

But nutrients aren’t all there is to be concerned about. There are also “contaminants of emerging concern” – pharmaceuticals, in particular endocrine disruptors. What it would cost to make discharge “safe” in this regard is an open question – another subject for another time. Suffice it to note here that TCEQ has no standards addressing these pollutants, thus there is no requirement to even consider what might be “safe”.

The latest word is that the overwhelming dissatisfaction with a discharge scheme has prompted Dripping Springs to drop its plans to seek a discharge permit – for the present. It’s unclear if that means it would just expand its “land dumping” system (a rather costly proposition, due to the land requirements, so Dripping Springs might soon decide that’s just too expensive and would request a permit to discharge). Or would the city pursue any and all opportunities to route the treated effluent to beneficial reuse? Likely mainly within the developments generating the flow, as few other opportunities have been identified, the 8-acre city park being the only one mentioned in the version of the Preliminary Engineering Planning Report (PERP) the city released last summer.

Which brings us to how the city would create a system plan predicated on beneficial reuse of this water resource to defray demand on other water supplies. The city appears to be leaning toward simply appending onto the already costly 19th century conventional centralized wastewater system another whole set of costly infrastructure to redistribute the water, once treated, back to the development that generated it. Note, however, that as TCEQ presently interprets its rules, the city will still be required to have a full-blown “disposal” system in place regardless of how much of that water they expect to route to beneficial reuse, making that whole concept somewhat problematic if indeed no discharge option would be sought. This focus of TCEQ rules, as currently applied, on “disposal” of a perceived nuisance, to the exclusion of focusing on integrated management of water resources, is an issue for any sort of plan the city may consider, highlighting the need to press TCEQ to reconsider that focus.

Indeed the city’s centralized plan would be costly. Dripping Springs is keeping its present engineering analyses close to the vest, but according to the version of the PERP released last summer, the three interceptor mains in that plan – denoted “east”, “west” and “south” (leaving us to wonder what will be done with development that may occur to the north) – and their associated lift stations would have a total cost of about $17.5 million. These are costs, along with the estimated $8.1 million for treatment plant expansion and an estimated $1 million for permitting, that must be sunk into the system prior to being able to provide service to the first house in the developments this system would cover. Then there is the cost of centralized collection infrastructure within the developments, to get their wastewater to those interceptors, no doubt running into the tens of millions at complete buildout.

And for this, all they get is “disposal” of a perceived nuisance!

With, as noted, the issue of how the water would be “disposed of”, if it is not discharged, still to be resolved – and paid for. If it is to be redistributed back to the far flung developments generating the flow, the facilities to do that will add many more millions to the overall cost of the complete system.

Far less costly, in both up-front and long-term costs, would be the creation of a 21st century system that would be designed around reuse, rather than “disposal”, of this water resource right from its point of generation. The city could pursue a decentralized concept strategy, focused on treatment and reuse of this water as close to where it is generated as practical, obviating the high cost of both the conventional centralized collection system and the reclaimed water distribution system.

Because such a strategy entails a number of small-scale systems designed into rather than appended onto development, it is highly doubtful that the city could unilaterally impose that sort of system. The large developments around Dripping Springs are all planning – indeed they have obtained TCEQ permits for – smaller conventional centralized systems within each of them, featuring “land dumping” as the intended fate of the water. In fact, Dripping Springs has “sponsored” the permit for one of those developments, so is actively promoting this strategy. The development agreement with another large project specifies that the wastewater generated in that development must be run into the city interceptor whenever it is built, despite the development-scale system being in place. So if the city does develop interceptors that would drain wastewater from those developments to an expanded centralized plant, then these development-scale systems would be stranded assets, sunk costs incurred simply to allow development to begin prior to completion of the city interceptor, then to be abandoned, basically wasting the fiscal resources required to install them.

It’s clear then that Dripping Springs could pursue a decentralized concept strategy to expand service capacity to encompass those developments only if each of them were to cooperate in planning, designing, permitting and implementing the decentralized system, instead of those development-scale centralized systems they’re presently planning to build. But of course, unless Dripping Springs presumes a leadership role, the developers have no impetus to consider that. They must presume they’d have to abandon any sort of development-scale system and run their wastewater “away” into the city’s centralized system whenever interceptors were extended to their properties.

To pursue a decentralized concept strategy it must be determined how such a system would be organized and how it could be permitted, given the “disposal”-centric focus of how TCEQ wields its rule system. This is a complex subject that does not well lend itself to this medium. Complicated by the decentralized concept remaining “non-mainstream” despite it having been out there for quite a long time – I defined the decentralized concept in 1986, and it was “ratified” as a fully legitimate strategy in a 1996 report to Congress, among other milestones – so its means and methods remain largely unfamiliar to regulators, engineers and operating authorities. Further, being designed into rather than appended onto development, the details would be sensitive to context; while there are recognized organizing principles, there is no “one size fits all” formula.

For the interested reader, a broad overview is “The Decentralized Concept of Wastewater Management” (in the “Decentralized Concept” menu at www.venhuizen-ww.com), and a basic review of those organizing principles is set forth in this document, reviewing wastewater management options in the nearby community of Wimberley. But a review of exactly how to design a decentralized concept system for any given project in and around Dripping Springs is properly the subject of a PERP for each project, not something that can be credibly described here, absent any context. The means and methods are, however, all well understood technologies that can readily be implemented to cost efficiently maximize reuse of this water resource. [Note that a stab at detailing exactly how to do a decentralized concept system in the context of one of the developments in Dripping Springs’ hinterlands is offered in the next post.]

All of this highlights that the most salient feature of a decentralized concept strategy in the context of this region is the “short-stopping” of the long water loops characteristic of the conventional centralized strategy, so that reuse of the water resource would be maximized at the least cost. It is this 21st century imperative that should motivate Dripping Springs and the developers working in that area to explore the decentralized concept. A necessary part of that exploration is to press TCEQ to reconsider how it interprets and applies its present rules, and perhaps to consider the need for “better” rules that recognize our current water realities. None of this can be served up for the city or the developers as a fait accompli in this medium; it is a job they have to undertake. One which we all need them to undertake, for the benefit of this region’s citizens, current and future.

But from all indications to date, it does not appear they will even try – they just can’t seem to expand their mental model of wastewater management to encompass it. The result of which is that most of this wastewater will live down to its name for a long time to come, driving us ever further away from sustainable water. So the question is posed: Can Dripping Springs, and the developers there, bust out of the 19th century – or will they choose to remain stuck there?



Water for DFW – Building-scale rainwater harvesting vs. Marvin Nichols

Posted August 7, 2014 by waterbloguer
Categories: Uncategorized

In the last post we reviewed the potential of building-scale rainwater harvesting (RWH) as a water supply strategy in the high-growth area around Austin, in Central Texas. Here, we examine its potential in another high-growth area of Texas, the Dallas-Fort Worth area, commonly called the Metroplex. And then we will contrast that strategy with doubling down on the watershed-scale rainwater harvesting strategy, as may be represented by the proposed Marvin Nichols Reservoir.

To gain an appreciation for the potential of building-scale RWH in and around the Metroplex, modeling was executed for the following locations: Athens and Terrell to the east-southeast, Ferris closer in to the south, Cleburne to the southwest, Weatherford to the west, Bowie to the northwest, Sherman to the north-northeast, and Denton closer in to the north-northwest. Ringing the Metroplex, these locations offer an overview of conditions all around it.

As was the case for the modeling results of the Central Texas locations, it was seen that “right-sized” building-scale RWH systems around the Metroplex would have provided 97-99% of total interior supply through the recent drought period for houses modeled with a presumed average water usage rate of 45 gallons/person/day. But around the Metroplex, the “right-sized” systems would be somewhat smaller than would be required around Austin. Recall that the “right-sized” system there to serve a 4-person household would be a roofprint of 4,500 sq. ft. and a cistern volume of 35,000 gallons. In Bowie, Weatherford and Cleburne, the “right-sized” system for a 4-person household would require only 3,750 sq. ft. of roofprint, paired with a 25,000-gallon cistern in Cleburne and Weatherford and a 27,500-gallon cistern in Bowie. All other locations would require 3,250-3,500 sq. ft. of roofprint and 20,000-25,000 gallons of cistern capacity. It is expected that a one-story house plan with a 2-car garage plus a “typical” area of covered patios/porches could provide a roofprint of 3,000-3,500 sq. ft., so these modeling results indicate many houses in/around the Metroplex would not require any “extra” roofprint to be added on.

As reviewed in the last post, a usage rate of 45 gallons/person/day should be readily attainable by most people, given a house fitted with the current stock of water fixtures, but a lower rate could be routinely attained by people even moderately attentive to conserving water. If a usage rate of 40 gallons/person/day were routinely attained around the Metroplex, the “right-sized” systems that would have provided 97-100% of total interior supply for a 4-person household through the recent drought period would require 3,000-3,500 sq. ft. of roofprint and 17,500-20,000 gallons of cistern capacity.

Just as in Central Texas, with the baby boomers reaching retirement age and demographics tending toward more one and two-person households in all age groups, a significant part of the market might be made up of houses that could be “right-sized” for a 2-person occupancy. Modeling this occupancy around the Metroplex, at a water usage rate of 45 gallons/person/day the “right-sized” system that would have covered 97-99% of total interior demand would require 1,750-2,000 sq. ft. of roofprint and a cistern capacity of 10,000-15,000 gallons. At a water usage rate of 40 gallons/person/day, a “right-sized” system covering 97-100% of interior demand would require a roofprint of 1,750 sq. ft. and a cistern capacity of only 10,000 gallons, except for Bowie where a 12,500-gallon cistern would have been required. Since it is expected that a one-story house plan plus garage or carport and modest area of covered patios/porches would provide about 2,000 sq. ft. of roofprint, this market could use building-scale RWH without requiring any “extra” roofprint, and would incur relatively modest cistern costs.

So the water supply potential of building-scale RWH around the Metroplex is pretty clear. Yet there is not a mention of this strategy in the planning documents of state planning Region C, the area around the Metroplex. Actually there is no respect shown for this strategy in any of the regional plans, and the state water plan explicitly dismisses it, stating, “While it is often a component of municipal water conservation programs, rainwater harvesting was not recommended as a water management strategy to meet needs since … the volume of water may not be available during drought conditions.” Which is to say that because a “right-sized” system may need 1-3% of the total supply from alternative sources during severe drought periods, this strategy is deemed not to exist at all!

This is likely due to the water planners being guided by a mental model that does not comprehend building-scale RWH as a consciously chosen broadscale strategy, as perhaps the water supply strategy in whole developments. This was the subject of an investigation, funded by the Texas Water Development Board, that I ran a couple years ago, in which it was brought out that this strategy confers a number of advantages relative to conventional – or watershed-scale RWH – water supply systems. One of the issues considered was provision of backup supply, but only on the basis of the “mechanics” of delivering it. Not being fettered by the mainstream’s mental model, I had not thought to question the whole strategy just because some small amount of backup supply would no doubt be needed – indeed, the whole idea of “right-sizing” was to cover water demands in all but the worst drought periods and plan on providing a backup supply, presuming that the relieved capacity offered by building-scale RWH would make such a supply available from the sources so relieved.

Still, this does raise the question of exactly where that backup supply would be derived. As noted in the last post, the building-scale RWH strategy should be considered in the context of “conjunctive management”. Building-scale RWH would divert the vast majority of the demand off of the conventional sources, so decreasing the routine drawdown of those supplies, thus leaving in them the capacity to provide the small amount of backup supply. Of course, if it is presumed that any development on building-scale RWH is in addition to rather than in place of development drawing from those conventional supplies, and that this other development would be of such extent that it would tax the available supply sources during those drought periods, then there may indeed be a question of whether the capacity to provide backup supply for building-scale RWH systems would be available. It will require another whole study to examine how a conjunctive management concept could work in practice. Until the mainstream water planners can get around their mental model and recognize the inherent potential of building-scale RWH, however, it is unlikely that any such study would get funded.

Around the Metroplex, however, modeling shows that, unless the drought gets more severe than has been experienced since 2007, essentially 100% of interior demands could be provided by upsizing the roofprint and/or cistern volume only a modest amount above what is reported above. The worst case would be in Bowie, where a roofprint of 4,000 sq. ft. and a cistern capacity of 30,000 gallons would be required for a 4-person household using water at a rate of 45 gallons/person/day.

So we can provide interior water usage with building-scale RWH, but why should we, rather than continuing to expand and perpetuate the watershed-scale RWH strategy? Consideration of the problems and hazards of building Marvin Nichols Reservoir offers some insights into that.

Marvin Nichols Reservoir would be located in northeast Texas, about 115 miles east-northeast of the Metroplex. The Region C report offers this about that project:

“As a major reservoir project, Marvin Nichols Reservoir will have significant environmental impacts. The reservoir would inundate about 68,000 acres. The 1984 U.S. Fish and Wildlife Service Bottomland Hardwood Preservation Program classified some of the land that would be flooded as a Priority 1 bottomland hardwood site, which is “excellent quality bottomlands of high value to key waterfowl species.” … Permitting the project and developing appropriate mitigation for the unavoidable impacts will require years, and it is important that water suppliers start that process well in advance of the need for water from the project. Development of the Marvin Nichols Reservoir will require an interbasin transfer permit to bring the water from the Sulphur River Basin to the Trinity River Basin. The project will include a major water transmission system to bring the new supply to the Metroplex.”

Unstated is that many people in the area that would be impacted are highly opposed to this project, due in large part to those “unavoidable impacts.” This is a battle of economic interests – those in the Metroplex that purport a need for this water vs. those, such as the timber producers, that would be eliminated by the reservoir. Indeed, the official position of the planning process in planning Region D, where the reservoir would be located, is in opposition to the project, and it is not included in their plan. This contrasts with deriving “new” water supply from building-scale RWH, which would have positive economic impacts in Region C – benefiting businesses that would design, install and maintain the building-scale RWH systems – and no negative impacts in Region D.

As noted, utilizing in the Metroplex any of the water collected in this reservoir would require a huge investment in transmission facilities – pipelines and pump stations – and on-going operating costs to maintain them and for energy to run the pumps. Of course the water would need to be treated, also entailing considerable energy requirements. Since it takes water to make energy, this would cut into the water use efficiency from this source. And making that energy would also generate greenhouse gases, which would exacerbate the already problematic impacts of climate change on regional water resources. This contrasts with the building-scale RWH strategy, which would not require any transmission facilities and would require far less energy to treat and pressurize the water for use within the building.

As the Region C report states, it will take a long time to permit and build this reservoir and the transmission facilities, meaning delivery of the first drop of water is decades away. In contrast, the building-scale RWH strategy could begin delivering water supply immediately, and grow in lockstep with demand, one building at a time.

The passage from the Region C report refers only peripherally to the ecosystem services that flooding the land would eliminate or damage, noting only loss of habitat for “key waterfowl species”, without quantifying how critical to the well-being or survival of those species that loss may be. That of course would be sorted out in the process of preparing the environmental impact analysis that will be required as part of the permitting process, another expense that would be obviated by the building-scale RWH strategy. But those ecosystem services go well beyond their impact on birds. Eliminating the timberlands loses the oxygen production and carbon sequestration they provide, along with habitat for many other plants and animals. Forests are also important to maintaining water quality and to the storage and release of water for environmental flows, which would instead need to be provided “artificially”, with water from the reservoir of degraded quality, including thermal impacts. None of these “externalities” figure into the cost of water projected for this strategy, significantly “warping” the analysis.

There would also be significant losses from the watershed-scale rainwater harvesting system this reservoir would create. Huge evaporation losses from the reservoir would be incurred, and there would be significant losses in the transmission system. In contrast, the building-scale RWH strategy would suffer no such losses.

The Region C report also states, “… the unit cost [of the water supply the reservoir would provide] is less than that of most other major water management strategies.” While at the end of the day the overall direct cost of Marvin Nichols Reservoir and its required infrastructure might be less than the aggregate direct cost of the number of building-scale RWH systems that would provide equivalent supply – which it is noted has not been developed in the Region C report for comparison – much of the cost of the former would need to be expended well up front of delivering the first drop of water to the Metroplex, and all that investment would be at risk. The costs of the building-scale RWH strategy, on the other hand, would be incurred incrementally, one building supply system at a time, so the delivery of supply would pretty directly track the capital requirements. This works with the “time value of money” to defray the global long-term cost of the building-scale RWH strategy. So it is not at all clear that the global cost of the Marvin Nichols option, even neglecting the externalities which the Region C report ignores, would be less.
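
The “time value of money” point can be made concrete with a simple present-value comparison. The amount, time horizon and discount rate below are purely illustrative placeholders, not figures from the Region C report.

```python
# Present value of spending a fixed sum up front vs. incrementally over time.
# The amount, horizon and discount rate are hypothetical, for illustration only.

def pv_of_incremental_spending(total, years, rate):
    """Present value of `total` spent in equal annual increments over `years`."""
    annual = total / years
    return sum(annual / (1 + rate) ** t for t in range(1, years + 1))

total_cost = 1_000_000_000   # hypothetical supply-project cost
horizon = 30                 # hypothetical buildout period, years
discount = 0.04              # hypothetical real discount rate

incremental_pv = pv_of_incremental_spending(total_cost, horizon, discount)
print(f"Spent up front, present value:      ${total_cost:,.0f}")
print(f"Spent incrementally, present value: ${incremental_pv:,.0f}")
# Deferring spending until the corresponding supply is actually needed lowers
# the present value of the same nominal outlay.
```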

In summary, broadscale implementation of building-scale rainwater harvesting may provide sufficient supply so that the conventional sources would be sufficiently “relieved”, allowing growth to be sustained without requiring new reservoirs. And it may do so at a cost that would be competitive with the global costs of continuing to extend and perpetuate the watershed-scale rainwater harvesting strategy, which would require going far afield to obtain additional new supply. Yet this is, quite consciously, the road not taken by the water planners in Region C. Or, as noted, anywhere else in the state where building new reservoirs, raiding remote aquifers, and other conventional supply strategies are purported to be needed to support projected growth. Time to re-evaluate?


Rainwater Harvesting for Water Supply – By The Numbers

Posted July 3, 2014 by waterbloguer
Categories: Uncategorized

In “Zero Net Water” the case was made for centering water supply on building-scale rainwater harvesting (RWH). Here we look in more detail into the potential of that strategy to provide water supply in Central Texas, parts of which are forecast to have considerable population growth over the next few decades. Since it is in new development where the Zero Net Water concept would be best applied, this area is a prime target for that strategy.

As reviewed in “Zero Net Water”, a modeling process was used to determine the “right-size” of a rainwater harvesting system to supply interior usage in houses. Modeling was executed presuming a 4-person occupancy in “standard” subdivisions and a 2-person occupancy in subdivisions targeted at seniors. A “right-sized” system is one that has a roofprint and cistern volume relative to the expected water demand profile such that backup supply would only be required in the worst drought years, and even then would be rather limited. This is specified so that the demand for backup supply in these houses from our “normal” supply sources would be minimized, and in recognition that a trucked-in backup supply – expected to be the dominant mode of providing that supply for a number of reasons that are not belabored here – would be stressed if backup supply requirements were not so limited.
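
For readers who want to see the mechanics, the kind of daily water-balance accounting described here can be sketched in a few lines of Python. This is only an illustration of the “right-sizing” logic, not the actual model behind the results reported in this post; the synthetic rainfall series, runoff coefficient and starting storage are placeholders that real modeling would replace with local daily rainfall records.

```python
# Minimal daily cistern water balance, illustrating the "right-sizing" logic.
# A sketch of the approach only, not the model behind the results in this post.

def simulate(daily_rain_inches, roofprint_sqft, cistern_gal,
             occupants, gal_per_person_day, runoff_coeff=0.85):
    """Return (fraction of interior demand met, total backup gallons needed)."""
    GAL_PER_SQFT_INCH = 0.623        # one inch of rain falling on one square foot
    storage = cistern_gal            # assume the cistern starts full
    demand = occupants * gal_per_person_day
    met = backup = 0.0
    for rain in daily_rain_inches:
        harvested = rain * roofprint_sqft * GAL_PER_SQFT_INCH * runoff_coeff
        storage = min(cistern_gal, storage + harvested)
        draw = min(demand, storage)
        storage -= draw
        met += draw
        backup += demand - draw      # shortfall to be covered by trucked-in supply
    return met / (demand * len(daily_rain_inches)), backup

# Synthetic illustration: 0.5 inch of rain every sixth day (roughly 30 inches/year).
rain_series = [0.5 if day % 6 == 0 else 0.0 for day in range(365)]
coverage, backup_gal = simulate(rain_series, roofprint_sqft=4_500, cistern_gal=35_000,
                                occupants=4, gal_per_person_day=45)
print(f"Demand covered: {coverage:.1%}; backup needed: {backup_gal:,.0f} gal")
```

Running this loop against actual daily rainfall records for a given location, and varying roofprint and cistern size until backup needs shrink to a few percent of demand in the worst years, is the essence of the “right-sizing” exercise.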

First we examine locations tributary to the Highland Lakes, which currently provide the water supply for Austin and much of the area around the lakes, including such fast-growing places as Bee Cave and Dripping Springs. The inherently greater efficiency of building-scale RWH vs. watershed-scale RWH noted in “Zero Net Water” is illustrated by modeling these locations in that tributary area:  Brownwood, Burnet, Fredericksburg, Llano, Menard, San Saba and Spicewood. Only in Brownwood and Menard, located further to the north and west in this area, does the modeling indicate that any backup supply would have been required after the extreme drought year of 2011, while the “right-sized” RWH systems would have provided all the interior water supply since then in all the other locations. This contrasts to how the lakes have “performed” as the watershed-scale “cistern” over that period, as they remain chronically low, not “recovering” after 2011 in the way the “right-sized” building-scale RWH systems would have.

The “right-sized” building-scale RWH systems would have provided 95-98% of the interior demands over the recent drought period at these locations. Using building-scale RWH for interior water supply would have relieved the lakes of having to provide that supply, thus they would have been drawn down more slowly if that had been a broadscale practice. So even though backup supplies to provide the 2-5% deficit may have been drawn out of the lakes – or withdrawn from streams flowing into them – the overall result would have been to significantly conserve region-wide water supply over the modeling period.

Now looking at Austin proper, and at Dripping Springs, as representative of the high-growth areas in this region, we see that a “right-sized” building-scale RWH system would have provided 96-98% of interior demands in the recent drought period through 2013. Indeed, even with 2014 having been very dry well into May, the models show that no backup supply would have been required to date in 2014 as well.

Based on a modeled demand rate of 45 gallons/person/day and an occupancy of 4 persons, “right-sized” systems for single-family homes around Austin and Dripping Springs would require 4,500 sq. ft. of roofprint and a 35,000-gallon cistern to have provided 97-98% of interior demand through the current drought period. These are fairly large, and would impose significant costs, so the impact of better demand control – water conservation – was also examined.
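
Before turning to demand control, a quick average-year yield check helps show why a roofprint and cistern of that size can carry a household. The rainfall figure is an approximation I am supplying for illustration (roughly the long-term Austin-area average), and the collection efficiency is an assumption; neither comes from the modeling itself.

```python
# Rough annual yield vs. demand check for the 4-person, 45 gal/person/day case.
# The rainfall and collection efficiency figures are assumptions, for illustration.

GAL_PER_SQFT_INCH = 0.623
roofprint_sqft = 4_500
assumed_annual_rain_in = 34          # approximate long-term Austin-area average
collection_efficiency = 0.85         # assumed losses to first-flush, splash, etc.

annual_yield_gal = (roofprint_sqft * assumed_annual_rain_in
                    * GAL_PER_SQFT_INCH * collection_efficiency)
annual_demand_gal = 4 * 45 * 365

print(f"Average-year roof yield: {annual_yield_gal:,.0f} gal")
print(f"Annual interior use:     {annual_demand_gal:,.0f} gal")
# In an average year the roof yields more than the house uses; the 35,000-gallon
# cistern is what carries the household across extended dry spells.
```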

A demand rate of 45 gallons/person/day is reported by the American Water Works Association to be routinely expected for a residence equipped with state-of-the-art fixtures in which the users give “reasonably” conscientious attention to demand control – e.g., it presumes minimal leakage losses, “reasonable” showering time, etc. It is understood, however, that better demand control is readily attainable. My personal experience is a case in point. According to our winter water bills my wife and I have an average interior demand rate of 37 gallons/person/day for our two-person household. As we are served by the watershed-scale Austin Water RWH system, not a building-scale RWH system, we have no particular impetus to “highly” conserve, as would a rainwater harvester who could see the cistern volume dwindling when rain is scarce. The only “highly” efficient appliance in our house is a front-loading washing machine; all the rest are 1990s-era fixtures. One can conclude, therefore, that something in the range of 35-40 gallons/person/day is a demand rate that is readily attainable without any “crimping” of lifestyle.

Indeed, a lower demand rate is typically presumed by those who design and install building-scale RWH systems, with 35 gallons/person/day being routinely presumed. So the models were also run using demand rates of 40 and 35 gallons/person/day. At 40, a “right-sized” system that would have attained that same 97-98% coverage of interior water demand requires 4,000 sq. ft. of roofprint and a 30,000-gallon cistern. At 35 gallons/person/day, 96-97% of interior demand would have been covered with a 3,500 sq. ft. roofprint and a 25,000-gallon cistern. All these results presume 4-person occupancy in the house, which is above what demographic data indicate is the average household size in most single-family residential developments around Austin and in the Hill Country, so it is expected that these sizing criteria would adequately supply the demands in most new houses.

These findings indicate that attaining very good demand control can significantly decrease the scale of facilities needed to “right-size” the building-scale RWH system, which would significantly reduce their costs. A single-story house plus garage and a “normal” area of covered porches/patios might provide 3,500 sq. ft. of roofprint, so an RWH house “right-sized” for a demand rate of 35 gallons would not require “extra” roofprint to be fit into the plan, and thus would not entail a cost increase to provide the required roofprint. And with the cistern being the costliest component of a building-scale RWH system, reducing its size contributes significantly to rendering the overall system more cost efficient.

With the baby boomers coming to retirement age, and single people and “DINKS” (dual income, no kids) being significant demographics, many building-scale RWH systems may be sized to serve 2-person households, for which the “right-sized” systems would be much smaller. Modeling in Austin and Dripping Springs shows that, with a demand rate of 45 gallons/person/day, a roofprint of 2,500 sq. ft. and a cistern volume of 17,500 gallons would have covered 97-98% of interior demands through the recent drought period. At a demand rate of 40 gallons/person/day, this result would have been attained with a roofprint of only 2,000 sq. ft. along with that 17,500-gallon cistern. If demand rate averaged 35 gallons/person/day, then a roofprint of 2,000 sq. ft. along with a 12,500-gallon cistern would have covered 97-98% of total interior demand. A small single-story house plus garage or carport and a “reasonable” area of covered porch/patio would provide that 2,000 sq. ft. roofprint, thus requiring no “extra” roofprint to be paid for. So, with significantly smaller cisterns being required, this market could more cost efficiently employ a building-scale RWH water supply strategy.

A model was also run covering the drought of record period from the late 1940s to the mid-late 1950s. The worst portion of that drought was from 1950 to 1956. Model results show that for all the scenarios reported above, a “right-sized” building-scale RWH system would have covered 92-95% of the interior water demands through that period. Comparing the rainfall deficits relative to long-term averages, it is seen that the 1950-1956 period was somewhat more “intense” than the recent drought period; while 2011 was the worst year on record, overall the current drought has not (yet) approached the severity of the drought of record. Even under the drought of record condition, however, it is seen that a “right-sized” building-scale RWH system would have provided the vast majority of interior water demands.

Many commercial and institutional buildings would also have a roofprint to water demand ratio that would be favorable to building-scale RWH. For example, a system for a two-story office building in which water usage rate is 5 gallons/person/day (typical toilet and lavatory use by an office employee) might have provided ~99% of water demand through the recent drought period. Whole campuses of such buildings might be built without having to install any conventional water and wastewater infrastructure, using wastewater treated at the building scale, perhaps supplemented by condensate capture, to supply toilets and all irrigation of the grounds, so allowing a smaller cistern to be installed, or allowing a higher water usage rate – e.g., to also cover food service – while still providing essentially all the demand. Capturing roof runoff in the RWH system would also reduce the stormwater management problem in such a development, enhancing the benefit of this strategy.

We can see therefore that building-scale RWH has great potential for relieving stress on the watershed-scale RWH systems that compose our “normal” water supply strategies, and could blunt the need for such high-cost options as desalination, direct potable reuse, or long-distance transfers from remote water sources. So even though building-scale RWH is relatively expensive in capital costs, it may be cost efficient relative to other options, while also offering low long-term operating costs.

One of those costs is for energy to pump and treat water. Building-scale RWH is a strategy that would entail relatively low energy use. Since the water loop is “tight”, water would be pumped only very short distances with little elevation head to overcome. This would save even more water, since it takes water to produce electricity to drive pumps – the so-called “water-energy nexus”.

On the basis of water usage efficiency, then, the building-scale rainwater harvesting strategy is well worth serious consideration as a major means of serving the increasing demands which would be imparted by the projected growth in Central Texas. The same can be demonstrated for other high-growth regions in Texas, such as the Dallas-Fort Worth area.

Yet the present State Water Plan utterly rejects building-scale RWH as having any merit as a water supply strategy. I am told the reason for this is because the mental model of our controlling institutions sees building-scale RWH as “unreliable” because the cisterns may run dry during severe drought and require those minor fractions of total supply to be added to them from other sources. The counter to this is to think of it as “conjunctive management” of the total water resource, with the RWH systems diverting demand from other sources, decreasing their routine drawdown so that they have the capacity to provide the backup supply.

This highlights that, as noted in “Zero Net Water”, there are challenges to be addressed, but those challenges may be less problematic than those posed by desalination, direct potable reuse or long-distance transfer schemes. So water policy makers should be called upon to recognize this clear potential and to incorporate this strategy into their water planning going forward.

It is noted in closing that the analyses reported in this post addressed only interior water usage. As reviewed in “Zero Net Water”, that concept envisions exterior usage – irrigation – to be largely supplied by localized reclamation and reuse of the “waste” water produced in the buildings being supplied by building-scale rainwater harvesting. In itself that tight-looped “decentralized concept” of wastewater management is a more highly efficient strategy – in regard to both money and water – than the conventional long-looped “regional” system, as was generally reviewed in “It’s the infrastructure, stupid”. That aspect of the Zero Net Water concept will be further considered in a future post.