
The Zero Net Water Concept

December 12, 2025

A thoroughly “One Water” way to manage water resources on developments in the Wimberley Valley

1. INTRODUCTION

Imagine a water management strategy that would accommodate growth and development without unsustainably pumping down aquifers or incurring the huge expense and societal disruption of building reservoirs or transporting water from remote supplies to developing areas. Welcome to the concept of Zero Net Water. As the name implies, it is a management concept that would result in zero net demand on our conventional water supplies. Well, practically speaking, approaching zero, as will be reviewed below.

The Zero Net Water concept combines three components:

  1. Building-scale Rainwater Harvesting (RWH), the basis of water supply
  2. The Decentralized Concept of Wastewater Management, focused on reuse: the hard-won rainwater supply, after being used once in the building, is reclaimed to provide irrigation water
  3. The Low-Impact Development (LID) stormwater management strategy, employed to maintain the hydrologic integrity of the land, and so of the watershed, as it is developed.

An indication of the potential of building-scale rainwater harvesting as a water supply in the Wimberley Valley is offered by a simple calculation. In one recent year, the total water supply produced by the Wimberley Water Supply Corporation was about 154 MILLION gallons. The lowest annual rainfall recorded in Austin during the 1950s drought of record was 11.55 inches. At that rainfall, a little over 4 BILLION gallons would fall on the 20 square mile Wimberley WSC service area, so annual water usage would be about 4% of the rain falling on the area. At the average annual rainfall in Wimberley of 32.87 inches, over 11 BILLION gallons would fall on the service area, and usage would be only about 1.5% of it. Clearly, there is far more water falling than could be used, even in a year of very low rainfall.
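For readers who want to check that arithmetic, the calculation can be reproduced in a few lines (assuming 27,878,400 sq. ft. per square mile and 7.48 gallons per cubic foot):

```python
# Back-of-the-envelope check of the rainfall-vs-usage figures above.
SQFT_PER_SQMI = 27_878_400   # square feet in one square mile
GAL_PER_CUFT = 7.48          # gallons in one cubic foot

def rainfall_gallons(area_sqmi: float, rain_inches: float) -> float:
    """Total rain volume (gallons) falling on an area in a year."""
    area_sqft = area_sqmi * SQFT_PER_SQMI
    return area_sqft * (rain_inches / 12.0) * GAL_PER_CUFT

usage = 154e6    # Wimberley WSC annual production, gallons
area = 20.0      # service area, square miles

drought = rainfall_gallons(area, 11.55)    # 1950s drought-of-record year
average = rainfall_gallons(area, 32.87)    # average Wimberley year

print(f"Drought-year rainfall: {drought/1e9:.1f} billion gal; "
      f"usage = {100*usage/drought:.1f}% of rainfall")
print(f"Average-year rainfall: {average/1e9:.1f} billion gal; "
      f"usage = {100*usage/average:.1f}% of rainfall")
```

The computed percentages land right around the rounded figures quoted in the text.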

Of course, only a very minor fraction of the total area could be covered with suitable collection surfaces. And no one is suggesting that all other water supplies be abandoned, to rely only on building-scale rainwater harvesting. This just shows that the ability is there to significantly augment those supplies to serve growth. In the next section, it will be reviewed just how much this practice could off-load the conventional supply sources at each building.

But as noted, Zero Net Water is not just building-scale rainwater harvesting; it is an integrated strategy that creates greater overall water use efficiency. Water supply is integrated with wastewater management by centering the latter on decentralized reclamation and reuse, mainly for irrigation supply. Water supply is also integrated with stormwater management by using Low-Impact Development, green infrastructure, and volume-based hydrology methods to hold water on the land, maintaining the hydrologic integrity of the watershed as it is developed and reducing the area of landscaping that might need to be irrigated.

This integrated management results in minimal disruption of flows through the watershed, even as water is harvested at the site scale to be used – and reused – there to support development, creating a sustainable water development model. This integrated model would better utilize the water supply that routinely falls upon the Wimberley Valley, so rendering the whole system more efficient. This all is of course the essence of the “One Water” idea.

2. RAINWATER HARVESTING FOR WATER SUPPLY

Using building-scale rainwater harvesting for water supply instead of conventional sources can actually increase the overall efficiency of the water system. This is accomplished by taking advantage of the inherent difference in capture efficiency between building-scale and watershed-scale rainwater harvesting. All our conventional water supplies are, after all, watershed-scale rainwater harvesting systems, and they are inherently inefficient.

A really big inefficiency, as can be seen in the illustration in Figure 1, is that only a minor fraction of rainfall over a watershed, something like 15%, makes it into streams, reservoirs or aquifers, the “cisterns” of the watershed-scale system. Most of it is lost to evapotranspiration, which of course supports plant life in the watershed.

Figure 1

What does get into reservoirs is subject to very high evaporation loss. To offer a quantitative feel for how big a loss this is, evaporative loss each year from the Highland Lakes is more water than Austin treats and uses annually.

Then the water from those cisterns must be distributed back to where it is to be used, losing more water. Industry standards say 15% transmission loss is “good practice”, and many distribution systems have much higher losses. The Aqua system serving Woodcreek, for example, has reported a loss rate of around 30%.

There are also what might be called “quality inefficiencies”. Water collected in reservoirs is somewhat impaired relative to the quality of rainfall, and needs considerable treatment. And even though it is generally considered potable, water in aquifers can be hard, or contain things like sulfur or iron, and also needs some treatment. Treatment imparts costs and consumes energy.

There certainly is a high energy cost. Besides treatment, it takes a lot of energy to lift water out of aquifers, to pump it long distances, and to pressurize the far-flung distribution system.

The building-scale rainwater harvesting system, illustrated in Figure 2, blunts all those inefficiencies. The inherent capture efficiency off a rooftop into a cistern approaches 100%. The actual efficiency will be less, of course, due to cistern overflows, wind conditions, gutter overflows during intense rainfalls, and so on, but still it will be very high.

Figure 2

Since the cisterns are covered vessels, there will be negligible evaporation loss. The building-scale distribution system is “short”, largely within the building, but in any case is built, and can realistically be maintained, “tight”, so transmission loss is also negligible.

Rainwater captured directly off a roof is very lightly impaired, so doesn’t need much treatment – cartridge filtration and UV disinfection is the typical system – so a lot of energy is being saved there. With a low lift out of the cistern and a very short run to where the water is used, the building-scale system uses far less energy to move water to the point of use. Given the water-energy nexus – it can take a lot of water to make energy – this also renders the overall system more efficient.

Growth dynamics make the building-scale system economically efficient. Since this system is installed one building at a time, it “grows” the water supply in direct proportion with demand. So besides more efficiently transforming rainfall into a supply usable by humans, creating a more sustainable water supply system, this strategy also creates a more economically sustainable system, because supply is built, and paid for, only in response to imminent demand, one building at a time.

So it can reasonably be asked, which of these models is more sane? Collect high quality water, very efficiently, where it falls and use it there? OR collect it at very low efficiency, with impaired quality, over the whole watershed, then lose a bunch – and use a lot of energy – running it through a long loop … back to where the water fell to begin with? The building-scale system is the clear winner on this question.

3. THE LOW-IMPACT DEVELOPMENT STRATEGY

Before going on, a myth about this strategy must be addressed. Some say harvesting off all the roofs would rob streamflow, that it wouldn’t more efficiently harvest water, it would just rob some from downstream users who depend on that flow. But if we examine that premise …

The Zero Net Water strategy would be used to serve new development, and when raw land is developed, as illustrated in Figure 3, quickflow runoff increases at the expense of decreased infiltration. When the land is transformed from a natural site to just 10-20% impervious cover, as in an exurban subdivision, the runoff would double, and with housing at more like urban intensity, typically imparting 35-50% impervious cover, it would triple. To prevent flooding, water quality degradation, and channel erosion problems, measures must be taken to mitigate that shift from infiltration to runoff.

Development Impacts On Hydrology

Figure 3
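A simple area-weighted sketch lands in the same range as those doubling and tripling figures. The runoff coefficients here are assumptions for illustration, not from the post: pervious ground converting roughly 15% of rainfall to quickflow (the watershed capture figure cited earlier) and impervious surfaces roughly 95%.

```python
# Rough sketch of how impervious cover shifts infiltration toward runoff.
# ASSUMED coefficients (illustrative only):
PERVIOUS_RUNOFF = 0.15     # fraction of rain running off natural ground
IMPERVIOUS_RUNOFF = 0.95   # fraction of rain running off pavement/rooftops

def runoff_fraction(impervious_cover: float) -> float:
    """Area-weighted fraction of rainfall leaving a site as quickflow."""
    return ((1 - impervious_cover) * PERVIOUS_RUNOFF
            + impervious_cover * IMPERVIOUS_RUNOFF)

natural = runoff_fraction(0.0)
for ic in (0.10, 0.20, 0.35, 0.50):
    r = runoff_fraction(ic)
    print(f"{ic:.0%} impervious: runoff fraction {r:.2f} "
          f"({r/natural:.1f}x the natural site)")
```

Under these assumptions, 10-20% impervious cover roughly doubles runoff and 35-50% roughly triples it, consistent with the figures in the text.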

Development rules in this area typically require that runoff be captured and treated, to protect water quality and minimize channel erosion. Unfortunately, much of mainstream practice centers on large end-of-pipe devices, like sand filters, which largely pass the water through, letting it flow away rather than holding it on the land.

This is where the Low-Impact Development stormwater runoff management strategy comes into play, to mitigate that problem. LID practices like the bioretention bed illustrated in Figure 4 can retain and infiltrate a large portion of the total annual rainfall, shifting the balance back toward infiltration. Instead of a bare sand filter, this is site beautification, designed into the development plan rather than just appended on, as those sandboxes are. It is also landscaping that doesn’t need routine irrigation, saving some water there.

Bioretention Bed

Figure 4

An even better strategy is to scale these bioretention beds down and spread them around. At this scale, as shown in Figure 5, these installations are usually called by their more colloquial name, “rain gardens”. With this strategy, the rainfall is captured and infiltrated in a manner that more closely mimics how that is done on the native site, on a highly distributed basis. Using many distributed rain gardens instead of one end-of-pipe pond somewhat restores the “hydrologic roughness” which is characteristic of raw land, so holding water on the land. This maximizes hydrologic integrity, achieving that aspect of Zero Net Water. And again, creates interesting landscape elements that don’t need routine irrigation.

Figure 5

Back to the myth: if building-scale rainwater harvesting is integrated into that capture process, harvesting runoff from the rooftop portion of new impervious cover, this just further mitigates the negative hydrologic changes that development causes. Because development causes such large increases in runoff, runoff from the developed site typically still increases over what runs off the native site, even with rooftop rainwater harvesting and the LID strategy applied.

And then too, water collected in the cistern does not exit the watershed. No one is putting it in a truck and hauling it away. It is just being held up for a while, then rejoins the hydrologic cycle in the vicinity where the rain fell to begin with. Most of the water collected for interior use eventually appears as wastewater. Under Zero Net Water, that’s used for irrigation, so the water that was harvested, after being used once in the building, is still maintaining plant cover in the watershed.

The bottom line is that some of that additional runoff created by development is captured and utilized. That is done instead of allowing the additional runoff to become quickflow, which if not mitigated in some other way creates water quality, channel erosion and flooding problems. Thus, with building-scale rainwater harvesting, overall yield of water supply usable by humans can be fundamentally improved, increasing the usable water yield without any significant impact on streamflow.

4. THE CAVEAT TO “ZERO”, CREATING PRACTICAL RAINWATER HARVESTING BUILDINGS

The building-scale cistern operates just like a reservoir, holding water for future use; indeed, it is a “distributed reservoir”. So like any reservoir, it will produce a “firm yield” of water supply, that will cover a certain amount of demand.

As will be reviewed shortly, considerations of cost efficiency lead to the concept of “right-sizing” the building-scale system – its roofprint and cistern capacity – so that its “firm yield” would cover demands most of the time. Instead of spending a lot more upsizing the system to cover the last little bit of demand in a drought, a backup supply would be brought in to cover that, when, and only when, conditions get bad enough that it is needed.

Of course, that backup supply would come from the watershed-scale system, which would itself be most stressed just when that backup supply is needed. So it must be considered, what impact would that have on the watershed-scale system?

The total market for development is set by a whole lot of factors other than, “Is there a water supply available?” So it can be presumed that whatever development is served by building-scale rainwater harvesting would not be additional development; it would just displace some of the development that would otherwise have drawn from the watershed-scale system. Indeed, the point of Zero Net Water is to off-load the watershed-scale system, to avoid costly expansions of that system, like the long pipelines from far-away aquifers that have been proposed to be run into the Wimberley Valley. And also to avoid depleting aquifers and drying up springs, as might result in the Wimberley Valley if development continues to rely on local groundwater.

Therefore building-scale RWH systems would be “right-sized” to off-load the watershed-scale system most of the time, allowing it to retain that supply, so it could provide a small percentage of total building-scale system demand through a drought. And then, when the rains do come, the watershed-scale system would also recover faster, because the draw on it for backup supply of the building-scale systems would stop, again off-loading the watershed-scale system.

A historical rainfall model is used to determine what would be a “right-sized” system. Here a model using rainfall data for the Austin area covering the years 2007 to 2024 is used to illustrate that. This covers the 2008-2014 drought period, now recognized as the new drought of record for the Highland Lakes, and the more recent somewhat dry years, so this period presents a fairly severe scenario for the sustainability of building-scale rainwater harvesting.

In the model, the front page of which is shown in Figure 6 – this is the initial year of the model, 2007 in this case – entries are made for a collection area (the roofprint), for the storage (the cistern volume), and for a water usage profile. The model shows how that system would have performed over the modeling period. Again, the goal is to have needed only a limited backup supply, only in the most severe drought years. Covering a rather severe drought period, the results of this model offer a reasonable expectation of water supply coverage by the building-scale system in the future.

Rainwater Harvesting Model

Figure 6
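The heart of any such model is a running water balance on the cistern: rain in, demand out, overflow lost, shortfalls met by backup. The sketch below is only illustrative of that logic, with a made-up monthly rainfall series and an assumed capture efficiency; it is not the actual model, which runs on the Austin rainfall record for 2007-2024.

```python
# Minimal sketch of a cistern water-balance model (illustrative only).
GAL_PER_SQFT_PER_INCH = 0.623   # gallons collected per sq ft of roof per inch of rain

def simulate(monthly_rain_inches, roof_sqft, cistern_gal,
             persons=4, gpcd=45, capture_eff=0.90):
    """Run a monthly water balance; return fraction of demand covered by rainwater."""
    storage = cistern_gal            # assume the cistern starts full
    backup = 0.0
    total_demand = 0.0
    for rain in monthly_rain_inches:
        inflow = rain * roof_sqft * GAL_PER_SQFT_PER_INCH * capture_eff
        demand = persons * gpcd * 30.4                  # average-length month
        storage = min(cistern_gal, storage + inflow)    # overflow is lost
        draw = min(storage, demand)
        storage -= draw
        backup += demand - draw                         # shortfall met by backup
        total_demand += demand
    return 1.0 - backup / total_demand

# Example: a hypothetical dry year of 12 inches, spread unevenly across months
dry_year = [0.5, 0.8, 1.5, 1.2, 2.0, 1.0, 0.3, 0.2, 1.5, 1.8, 0.7, 0.5]
coverage = simulate(dry_year, roof_sqft=4000, cistern_gal=30000)
print(f"Coverage in this hypothetical dry year: {coverage:.1%}")
```

The real model simply runs this same bookkeeping across 18 years of recorded rainfall, which is what lets it report coverage over a full drought cycle rather than a single year.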

Model runs were conducted for two scenarios: (1) a “standard housing” subdivision, in which the presumed nominal occupancy is 4 persons; (2) a “seniors-oriented” subdivision, in which the presumed nominal occupancy is 2 persons. For each scenario, models were run presuming per person usage rates of 45, 40, 35 and 30 gallons per day. 45 gallons/person/day is a usage rate expected not to be routinely exceeded by most people in a house with all current state-of-the-art water fixtures, as determined by surveys by the American Water Works Association. 35 gallons/person/day is the usage rate commonly assumed by people who do use rainwater harvesting for their water supply, expecting they would be quite conscious of water conservation. Experience has shown that this usage rate is readily attainable, as indeed is 30 gallons/person/day, based on a number of reports by people who keep track of their water usage.

The modeling outcomes are shown in Figure 7. The coverage that needs to be attained to render a system “right-sized” would be determined from the expectations of local water suppliers. For this discussion, it is asserted that coverage of 97.5% or more of total water demand over the 18-year modeling period, leaving no more than 2.5% to be covered by backup supply, constitutes a “right-sized” system.

Figure 7

On that basis, the table shows that for a “standard” subdivision, at a water usage rate of 45 gallons/person/day, a “right-sized” system would need a roofprint of 4,000 sq. ft. and a cistern volume of 30,000 gallons, as this would provide 97.6% coverage of total water demand through the modeling period. This compares with the 5,000 sq. ft. roofprint and 42,500-gallon cistern that would have been required to provide 100% coverage. Following the “right-sizing” strategy, the savings on each house would be 1,000 sq. ft. of roofprint and 12,500 gallons of cistern volume – a rather large savings in the cost of building the RWH system, purchased by using backup supply to cover the last 2.4% of demand, needed only in the rather severe drought years of 2008-2009 and 2011.
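That 97.6% coverage figure can be cross-checked against the 28,000-gallon backup total quoted in Section 5 for this same 4-person, 45 gallons/person/day case:

```python
# Cross-check: coverage implied by the backup total quoted for this case.
persons, gpcd, years = 4, 45, 18
total_demand = persons * gpcd * 365 * years   # 18-year interior demand, gallons
backup = 28_000                               # backup supply over the period

coverage = 1 - backup / total_demand
print(f"Total 18-year demand: {total_demand:,} gal")
print(f"Coverage: {coverage:.1%}")
```

The result comes out to 97.6%, consistent with the table.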

If the water usage rate were 40 gallons/person/day, a full coverage system would require a roofprint of 4,750 sq. ft. and a cistern volume of 37,500 gallons. A “right-sized” system providing 98.3% coverage would require a roofprint of 3,750 sq. ft. and a cistern volume of 27,500 gallons, a savings of 1,000 sq. ft. of roofprint and 10,000 gallons of storage on each house.

At a water usage rate of 35 gallons/person/day, a full coverage system would require a roofprint of 4,000 sq. ft. and a cistern volume of 33,000 gallons. A “right-sized” system providing 97.5% coverage would require a roofprint of 3,000 sq. ft. and a cistern volume of 25,000 gallons, again a savings per house of 1,000 sq. ft. of roofprint, and a savings of 8,000 gallons of storage.

A water usage rate of 30 gallons/person/day would require a roofprint of 3,750 sq. ft. and a cistern volume of 27,500 gallons to provide 100% coverage. A “right-sized” system providing 98.3% coverage would require a roofprint of 2,750 sq. ft. and a cistern volume of 20,000 gallons, once again a savings of 1,000 sq. ft. of roofprint, and a savings of 7,500 gallons of storage.

Similar observations can be made for the seniors-oriented development scenario. All this well illustrates the high value of practicing good water conservation, affording significant savings in building a sustainable RWH system.

One may look at those roofprints and conclude that the building-scale RWH strategy would only “reasonably” apply to rather high-end developments in which the houses would be “large”. But it is important to understand that the roofprint is not the living space in the house, it is the somewhat larger roof area. As can be seen in Figure 8, this would include in addition to the living area the roof overhangs, the garage (or carport), and covered porches and patios, or verandas. And it is the latter which could readily be expanded somewhat over what might be “normally” built, to provide relatively inexpensive additional roof area to create sustainable systems.

House Roof Area Illustration

Figure 8

Obviously, single-story houses would be favored, so as to create as much “routine” roofprint as practical. Single-story house plans of several builders active in the Hill Country were examined to see where the addition of verandas could be accommodated, such as is illustrated in Figure 9. It does appear that significant roofprint could readily be added, so that houses with “moderate” areas of living space could have enough roofprint added to obtain the “right-sized” roofprints. Of course, if the house were designed around this idea from the start, these designs could no doubt provide the needed roofprint more cost efficiently, and would yield more attractive designs. So it is suggested that architects and builders consider this, and create “Hill Country vernacular rainwater harvesting house designs”.

Veranda Strategy Illustration

Figure 9

In a seniors-oriented development, where the nominal occupancy would be 2 persons, a single-story house with a fairly “normal” area of verandas, along with a garage/carport, could readily provide the roofprint needed to create “right-sized” systems.

Another advantage to this “veranda strategy” is that the cistern could be integrated into the house design. With a large area of veranda, the cistern need not be very deep under the veranda floor to attain the storage needed. An illustration of building the storage in under the veranda floor is shown in Figure 10.

Built-In Cistern Under Verandas

Figure 10
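A quick check shows why the cistern stays shallow. The 1,200 sq. ft. of veranda floor assumed here is purely a hypothetical figure for illustration, not from any particular design:

```python
# Depth of water storage needed under a given veranda floor area.
GAL_PER_CUFT = 7.48   # gallons in one cubic foot

def required_depth_ft(cistern_gal: float, floor_sqft: float) -> float:
    """Interior water depth (feet) needed to hold cistern_gal under floor_sqft."""
    return (cistern_gal / GAL_PER_CUFT) / floor_sqft

for gallons in (20_000, 30_000):
    d = required_depth_ft(gallons, 1200)
    print(f"{gallons:,} gal under 1,200 sq ft: about {d:.1f} ft of water depth")
```

Even a 30,000-gallon cistern would need only a few feet of depth under that much floor area, well within what a slab-top tank around the foundation could provide.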

Of course the cistern could also be built under part of the house too – that has indeed been done in some Hill Country houses – but with this concept, the cistern could be built around the foundation, not impinging on the building itself, so the builder could build rather “normally” within the house envelope. With this strategy, space would not be taken up on the lot for a free-standing cistern. That will be more important on smaller lots, of course.

With the roofprint areas needed to “right-size” RWH systems in the Wimberley Valley, it may be questioned if this strategy is indeed only applicable in developments with “large” lots. The example of an urban lot in a South Austin neighborhood offers a look at that. Illustrated in Figure 11, this is a 1/5-acre lot with a 1,580 sq. ft., 3-bedroom, 2-bath house, and a garage, fairly typical in that neighborhood.

Rainwater Harvesting on an Urban Lot

Figure 11

Figure 11 illustrates that a system quite sufficient to be “right-sized” for a 2-person household might be accommodated, but would come up somewhat short for a 4-person household. While adding cistern volume might be problematic, as illustrated in Figure 12 roofprint could be added around the house, an adaptation of the veranda strategy noted above, to obtain a roofprint that would be more sustainable for the larger population.

Added Roofprint Illustration

Figure 12

Of course, it is really pretty clumsy to try to retrofit a 1960s era house like this, but if the RWH strategy were to be designed into a new house from the start, it does appear that rainwater harvesting could be a reasonably attainable water supply system on urban-sized lots.

Before moving on, it should be noted that this strategy does not have to be implemented with just individual house RWH systems. As illustrated in Figure 13, to broaden the use of rainwater, RWH could be integrated into the conventional supply with collective, conjunctive use systems, maximizing the contribution to water supply from whatever roofprint can be installed. While this scheme would require permitting as a Public Water Supply System, the concept should be entertained, to maximize rainwater harvesting in more intense single-family development and to extend it to multi-family projects too, significantly defraying demands on the conventional supply system and thus blunting the need to expand it with costly, disruptive projects.

Collective Conjunctive Use Strategy

Figure 13

5. IRRIGATION WATER SUPPLY FROM WASTEWATER REUSE – CLOSING THE WATER LOOP

Under the Zero Net Water concept, wastewater reuse would be maximized to supply non-potable water demand. Particularly for irrigation in this climate. Indeed, wastewater reuse is highly valuable to rainwater harvesters, if they want to maintain an irrigated landscape. That’s because, as we’ve seen, a “right-sized” system just for interior use is already “large”. So to also supply irrigation directly out of the cistern, the system would have to be made even larger in order to maintain sustainability of the water supply system, and that would be costly, of course. Or the users would have to bring in a MUCH larger backup supply. But neither is needed, as there is already a flow of water sitting right there that could be used for irrigation instead of drawing that water directly from the cistern. A supply for which a hefty price has already been paid to gather. That’s the wastewater flowing from the house. Don’t lose it, reuse it!

An example of the savings potential from using the reclaimed wastewater for irrigation rather than providing that supply directly from the cistern is offered by looking at the 4-person house with a water usage rate of 45 gallons/person/day. The “right-sized” system for supplying just the interior usage requires a 4,000 sq. ft. roofprint with a 30,000-gallon cistern. Per the modeling, that system would have required backup supply in 3 out of the 18 years in the modeling period, totaling 28,000 gallons.

For this illustration, it is presumed the irrigated area to be supplied is 2,400 sq. ft., which would be the nominal size for an on-site wastewater system dispersal field for a 3-bedroom house. If that system were to provide the modeled irrigation demand directly out of the cistern, backup supply would have been needed in 12 of the 18 years, totaling 222,000 gallons! The rainwater system would have covered only 85.6% of total water demand – interior plus irrigation usage – over the modeling period. In order to maintain the sustainability intended to be imparted by the “right-sizing” strategy, the model shows that the roofprint would have had to be increased by 1,500 sq. ft., to 5,500 sq. ft., and the cistern volume increased by 20,000 gallons, to 50,000 gallons. Again, very pricey.

But if the wastewater were reused instead, the system that was “right-sized” to provide only interior use would have also covered most of the irrigation demand by reuse, again having needed backup supply only in the same 3 years, totaling 38,000 gallons. A 10,000 gallon increase in total backup supply from the “base case”, but a far lower demand on the watershed-scale system than if reuse were not practiced.
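Working backward from the coverage percentages and backup totals just quoted gives a rough consistency check on these comparisons (this is arithmetic on the quoted figures, not the model itself):

```python
# Comparing the two irrigation-supply options from the figures quoted above.
interior_demand = 4 * 45 * 365 * 18          # 18-year interior demand, gallons

# Option 1: irrigation supplied directly from the cistern.
# Quoted: 222,000 gal of backup needed, at 85.6% coverage of total demand.
backup_direct = 222_000
total_demand = backup_direct / (1 - 0.856)   # implied interior + irrigation demand
print(f"Implied total demand: {total_demand:,.0f} gal "
      f"(irrigation share ~{total_demand - interior_demand:,.0f} gal)")

# Option 2: irrigation supplied by wastewater reuse instead.
# Quoted: only 38,000 gal of backup needed over the same period.
backup_reuse = 38_000
coverage_reuse = 1 - backup_reuse / total_demand
print(f"Coverage with reuse: {coverage_reuse:.1%}")
```

Reuse brings coverage of the combined interior-plus-irrigation demand back up to about 97.5%, right at the “right-sized” threshold, without enlarging the roofprint or cistern at all.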

The reuse can be done house by house if the development features larger lots, with wastewater service provided by on-site wastewater systems, commonly called “septic” systems. Those on-site systems can employ the highly stable and reliable High Performance Biofiltration Concept system mated with a subsurface drip irrigation dispersal system. This concept is illustrated in Figure 14.

Figure 14

This strategy has been implemented in several jurisdictions in Texas, including Hays County, over the last 3 decades. Examples of drip irrigation fields on some of those lots are shown in Figure 15. On-site reuse, to render the RWH systems more sustainable while maintaining grounds beautification, is readily doable in the Wimberley Valley. Indeed, a larger scale version of this system was implemented at the Blue Hole Elementary School, the so-called “One Water” school.

On-Site Wastewater System Drip Irrigation Fields

Figure 15

If the nature of the development requires a collective wastewater system – typically needed if a development in the Wimberley Valley has lots smaller than an acre – the reuse can readily be accomplished, rather cost efficiently, with a “decentralized concept” wastewater system. That concept, detailed in “This is how we do it”, is illustrated in the sketch plan in Figure 16, showing a system serving a neighborhood in a rather typical Hill Country residential development. Of course this strategy makes great sense no matter where the water supply comes from. Every development can use this strategy instead of paying the high cost of centralizing wastewater to one far-away location, and then discharging it. Throwing away the water needed for irrigation!

Figure 16

As can be seen in Figure 16, this strategy is about water management, not about “disposal” of a “waste”. Reuse is designed into the very fabric of the development, as if it were a central function, rather than just appended on (maybe) to the system at the end of the pipe, as if it were just an afterthought. And by distributing the system like this, all the large-scale infrastructure outside the neighborhood – the large interceptors and lift stations – would be eliminated, imparting considerable savings.

This highlights that TCEQ must be engaged in rethinking wastewater management, to move to this sort of integrated water management model, instead of viewing wastewater management as being focused on “disposal” of a perceived nuisance – which makes this wastewater live down to its name, truly wasting this water resource.

6. MINIMUM NET WATER – WORKING THE CONCEPT INTO DEVELOPMENT MORE GENERALLY

Of course, if there are conventional water lines already close to a development, the developer would greatly prefer to hook up to them, even if, long-term, very costly measures – like the waterlines some have proposed to run into the Wimberley Valley – would have to be undertaken to keep water flowing to that development. The developer will invariably choose a “normal” water system if it’s available, so the builders can do their normal building designs. Those long-term costs to keep water flowing in those pipes typically accrue to others, so builders and developers wouldn’t see the price signals that might favor a Zero Net Water strategy.

Still, a variant of Zero Net Water that might be called “Minimum Net Water” can deliver some of the benefits. If the development will get water service from a watershed-scale system, it can still be arranged to pretty much take irrigation off that potable supply, by rainwater harvesting and/or wastewater reuse. In this region, that would typically lop off about 40% of total annual demand. It would be an even higher percentage of peak period demand, and it is peak demand that drives a lot of water system investment, so that is very valuable.

This could be done with a combination of the neighborhood-scale wastewater reclamation system, illustrated above, using that water to irrigate the public areas – front yards, parkways, parks and such – and rainwater harvesting to irrigate the private spaces, the back yards. Under this scheme, roof runoff would collect in “water quality” tanks, as shown in the illustration in Figure 17. That part of it conforms to what Austin sets forth in its rules as “rainwater harvesting” for water quality control, under which these tanks would have to drain within 48 to 72 hours. In this scheme, however, these tanks would be connected to rainwater harvesting cisterns, where that water could be held until whenever it is needed to irrigate the back yard. And only when those cisterns were full would any water even pond up in the water quality tanks, to eventually flow “away”. So a lot of the roof runoff would be harvested, to be used for irrigation, instead of running “away”.

Figure 17

By taking care of water quality management of rooftop runoff like this, the green infrastructure needed to treat runoff from the rest of the site could be downsized. Bioretention beds would need to be sized only for the area and the impervious cover level obtained by omitting all the rooftops. That saves more money, while still well maintaining the hydrologic integrity of the site.

The bottom line is water quality management of this rooftop runoff would be integrated with rainwater harvesting to create an irrigation water supply. Doing this would impart superior water quality management of that runoff, restoring the hydrologic integrity of this patch of ground now covered with the rooftop, and providing irrigation water for the back yard. A lot of benefit obtained by a pretty simple, straightforward system.

7. COMMERCIAL DEVELOPMENT, A PRIME OPPORTUNITY

To this point, the Zero Net Water strategies have mainly been focused on housing developments, but commercial and institutional buildings are also a major opportunity for Zero Net Water. The ratio of roofprint to water use in those buildings typically favors rainwater harvesting. And condensate capture could also provide a significant water supply in this climate.

Along with RWH providing the original water supply, project-scale wastewater reuse could be employed for irrigation, and also for toilet flush supply, and the LID/green infrastructure stormwater management scheme could harvest runoff from paved areas of the site too, and feed it into landscape elements – rain gardens – that, again, don’t need routine irrigation. An illustration of applying these strategies to a commercial building is shown in Figure 18.

Commercial Building Reuse and Site-Harvested Water Systems

Figure 18

Using these strategies, commercial and institutional buildings, or whole campuses of these buildings, could readily be water-independent – “off-grid”, not drawing any water from the conventional water system. That could save a lot of water, and a lot of money for conventional water and wastewater infrastructure that would not have to be built to serve these projects.

8. COSTS … AND VALUE, A CLOSING STATEMENT

And finally, about cost. Of course, that’s always going to be a major factor in whether any of these strategies are put to work in service of society, to deliver water systems that are more societally responsible and more environmentally benign. But it is suggested that the discussion be framed in terms of VALUE. Oscar Wilde famously quipped that a cynic knows the price of everything and the value of nothing. That just might be the case here.

By conventional accounting, water supply obtained from building-scale rainwater harvesting will no doubt appear quite expensive per unit of yield, mainly because of the high relative cost of cisterns, the distributed reservoir. That viewpoint is exacerbated by treating the watershed-scale storage and distribution systems as sunk costs, rather than the avoided costs they would be – or at least expansions of them would be – if building-scale systems were used instead of the watershed-scale system for whole developments. So mainstream institutions tend to dismiss this strategy out of hand. Indeed, the State Water Plan assigns little value to building-scale rainwater harvesting as a water supply strategy, and water suppliers in this region appear not to consider it to be any part of their portfolios.

But there are many reasons to consider whether the Zero Net Water strategy delivers value that should not be ignored. Here are a few of them, very briefly, as each of these could be a day-long discussion in itself:

  • The Zero Net Water strategy minimizes depletion of local groundwater and loss of springflow. That is very important in places like the Wimberley Valley, indeed in much of the Hill Country. As the aquifers to the east of the IH-35 corridor are dewatered by importing that water into Hays County, Zero Net Water could be valuable over much of that area as well.
  • That’s because Zero Net Water would blunt the “need” to draw down aquifers or take land to build reservoirs, avoiding all their attendant societal issues. An example is the brewing water war in East Texas, where it is proposed to take land to build reservoirs and to export groundwater out of that area as well.
  • As has been noted, Zero Net Water will be sustainable over the long term, since development would largely live on the water that’s falling on it, not by depleting water that’s been stored in aquifers over centuries, or getting it from reservoirs that will silt up.
  • Zero Net Water is an economically efficient strategy, since facilities are built, and paid for, only to provide water supply that is imminently needed.
  • Zero Net Water also minimizes public risk. Since the system is designed into the site as it is developed, the cost of creating water supply is largely borne by those who benefit directly from that development, rather than by society at large through public debt.
  • It must be understood that growth projections for this region are not manifest destiny. While large water infrastructure projects tend to rest on an “if you build it, they will come” sort of justification, those growth projections depend on a continuation of the trends they are based on, and that continuation might be disrupted by any number of circumstances. So perhaps it should be asked: what if they built it and no one came? Or just not enough came to pay for it? Perhaps the projected growth is indeed inevitable, making that a remote risk, but still it’s one that could be avoided by shifting to Zero Net Water.

Zero Net Water is about society collectively saying, this is how we will secure our water supply. We will integrate all aspects of water management, we will address all water as a resource, we will tighten up the water loops and maximize efficiency, to create a sustainable water strategy – Zero Net Water.

The Rainwater Harvesting Argument

June 19, 2025

The Waterblogue has featured the idea that building-scale rainwater harvesting (RWH) can provide a significant contribution to water supplies in this region; for example here, here, and here. This raises the obvious question: what actual contribution could this strategy make?

Discussing the merit of and prospects for building-scale RWH as a water supply strategy in this region with a colleague, I was surprised when he confronted me with this proposition. According to state policy, in order to be considered a functional water supply strategy, a method must be capable of delivering a “firm yield” through a repeat of the “drought of record”, and under that definition, most building-scale RWH systems simply do not exist, do not deliver any recognized water supply!

Let that sink in for a moment. A well-planned, well-designed building-scale RWH system around here is typically expected to be able to provide in excess of 95% of total demand over a period of years, which would include a period of drought. An example is the “right-sized” system summary shown below, produced by modeling the period 2007-2023 with Austin rainfall records over that period as input. This time period includes 2008-2014, which is reported to be the new “drought of record” period for the Highland Lakes, the major watershed-scale RWH system that serves this region.

[click on image to enlarge]

As you see, we can readily choose system sizing relative to the expected level of water usage to create systems that can indeed deliver 95% or more of the total water supply over the modeling period. That brings us to the question: if the system does not deliver the total water supply needed through a drought period, what would it take to assure this strategy does deliver a functional, secure and assured water supply?

I posed this matter in the TWDB-funded investigation “Rainwater Harvesting as a Development-Wide Water Supply Strategy” that I ran for the Meadows Center at Texas State University in 2011-2012. There I set forth the idea of “right-sizing”: rather than having to pay to upsize the RWH system to cover the last little bit of demand, it would be more cost-efficient society-wide to install a “right-sized” system that would provide the vast majority of total water demand, and to provide a backup supply for the small shortfall that would be needed only through bad drought periods. As can be seen in the table above, for example, to provide 100% supply at 45 gallons/person/day to a 4-person household would require 5,000 sq. ft. of roofprint and a 42,500-gallon cistern. But that system could be downsized considerably, to 4,000 sq. ft. of roofprint with a 30,000-gallon cistern – saving a ton of money – and would still have provided 97.5% of total demand through the modeling period. So the question becomes how to assure that the 2.5% shortfall could be provided by other means.

The presumption behind this concept is that there is not an unlimited market for development. The development that would be provided water supply by building-scale RWH systems would displace development that would have otherwise drawn its water supply from the watershed-scale RWH system, rather than be development in addition to that. So the supply being provided by building-scale RWH would be supply that would be left in the watershed-scale system storage pool most of the time, so presumably not drawing it down as severely as it would have been if all those building-scale systems had instead been routinely supplied by the watershed-scale system. Thus the watershed-scale system would have the “slack” to provide the relatively small amount of backup supply to the building-scale systems through the drought periods.

In the example above, “right-sizing” at 4,000 sq. ft. and 30,000 gallons, the table shows a total backup supply of 28,000 gallons would have been needed through the drought period 2008-2014, or just 4,000 gallons per year on average, out of a total modeled demand of 65,700 gallons/year. That system would have been 94% supplied by the building-scale RWH system through that 7-year drought period, and as noted above 97.5% supplied through the total 17-year modeling period.
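The percentages quoted above can be checked with simple arithmetic from the figures in the table. A minimal sketch, assuming the 45 gallons/person/day, 4-person demand profile and the 28,000-gallon backup total from the modeling summary:

```python
# Back-of-envelope check of the "right-sizing" figures: a 4-person household
# at 45 gal/person/day, with a 28,000-gallon total backup need through the
# 2008-2014 drought, over a 17-year (2007-2023) modeling period.

GAL_PER_PERSON_PER_DAY = 45
PERSONS = 4

annual_demand = GAL_PER_PERSON_PER_DAY * PERSONS * 365  # 65,700 gal/year

backup_drought = 28_000   # gallons of backup supply over the drought period
drought_years = 7         # 2008-2014
model_years = 17          # 2007-2023

# Fraction of demand the RWH system itself carried
rwh_share_drought = 1 - backup_drought / (annual_demand * drought_years)
rwh_share_overall = 1 - backup_drought / (annual_demand * model_years)

print(f"Annual demand: {annual_demand:,} gal/year")
print(f"Average backup during drought: {backup_drought / drought_years:,.0f} gal/year")
print(f"RWH share through the drought: {rwh_share_drought:.1%}")   # ~94%
print(f"RWH share over the full period: {rwh_share_overall:.1%}")  # ~97.5%
```

The 94% and 97.5% figures in the text fall straight out of this arithmetic.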

The question, of course, is if indeed the watershed-scale RWH systems – such as the Highland Lakes in this area – would have the capacity to provide that backup supply demand through a drought period, as well as continuing to serve all the development that routinely draws from it. My colleague, while acknowledging the logic of my argument, asserted that the “growth model” presumes that the watershed-scale systems serving any given area would indeed become completely encumbered by development they serve directly – based I’m guessing on the very circumstance that overall growth around here is projected to exceed the capacity of existing water supplies to service it – so that it’s presumed there would be NO capacity available in that system through a drought of record period.

As best I can translate, it is asserted that the “right-sizing” strategy is illegitimate because there would be no sources available for backup supply through a drought. That yields the evaluation noted above: if the building-scale system would not carry 100% of the projected supply needs, then for the purpose of planning water supply strategy it is presumed that the system provides NO water supply, and is of NO value to the regional water economy.

I find that viewpoint to be, well, strange, contrary to common sense. Does it not seem that if a building-scale system provides in excess of 95% of the total supply over a period of years, that is supply the watershed-scale system is relieved of having to provide, and so this is effective water resource conservation that does have value to the regional water economy? It seems rather dogmatic to simply “erase” the whole building-scale RWH water supply strategy because it would need a minor portion of total supply to be provided out of the watershed-scale system, which the building-scale systems would be relieving much of the time. Indeed, one wonders where else in public policy a 95+% “success” rate is deemed “unreliable”. Yet that is what my colleague contends state planning principles presume “must” be so when considering whether to base water supply strategy on any use of building-scale RWH over any given area: that only the capacity of systems sized to deliver 100% of the projected supply can be deemed to exist.

It is little wonder then that we do not see the building-scale RWH strategy being set forth in any of the regional water plans, and thus not having been meaningfully incorporated into the State Water Plan. Here is the sum total of what the 2022 Texas State Water Plan says about building-scale RWH as a water supply strategy:

“Rainwater harvesting involves capturing, diverting, and storing rainwater for landscape irrigation, drinking and domestic use, aquifer recharge, and stormwater abatement. Rainwater harvesting can reduce municipal outdoor irrigation demand on potable systems. Building-scale level of rainwater harvesting, as was generally considered by planning groups and which meets planning rules, requires active management by each system owner to economically develop it to a scale that is large and productive enough to ensure a meaningful supply sustainable through a drought of record. About 5,000 acre-feet per year of supply from rainwater harvesting strategies is recommended in 2070 to address needs for select water users that have multiple additional recommended strategies.”

To put that projection of supply to be provided by building-scale RWH in perspective, if we presume a typical system does provide supply at 45 gallons/person/day for 4 persons, or 180 gallons/day total, each such system would supply 65,700 gallons/year, or about 0.2 acre-feet/year. So a contribution of 5,000 acre-feet/year would require 5000/0.2 = 25,000 RWH systems of this size, or the functional equivalent, to be put in place. How much growth of this strategy does that projection imply?
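The acre-feet conversion above can be verified directly (1 acre-foot is 325,851 gallons):

```python
# Sanity check: how many 4-person household RWH systems would it take to
# supply the State Water Plan's 5,000 acre-feet/year projection?

GALLONS_PER_ACRE_FOOT = 325_851

household_gpd = 45 * 4                       # 180 gal/day
household_per_year = household_gpd * 365     # 65,700 gal/year
acre_feet_per_system = household_per_year / GALLONS_PER_ACRE_FOOT  # ~0.2

systems_needed = 5_000 / acre_feet_per_system

print(f"{acre_feet_per_system:.2f} acre-feet/year per system")
print(f"~{systems_needed:,.0f} systems to supply 5,000 acre-feet/year")
```

The result lands right at the ~25,000 systems stated in the text.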

While there is no authoritative database that would provide the number of existing RWH systems, a rough guess that one expert on the subject offered is that there are likely in excess of a quarter million RWH systems – 10 times the number calculated above – in just 7 states, with Texas being the site of a goodly portion of those. This indicates that the 5,000 acre-feet by 2070 projection does not even come close to representing what is already on the ground, routinely producing water supply today.

But, as reviewed above, those who set water planning policy in Texas are loath to accord this strategy any actual contribution to supply, because of that “firm yield” requirement. So we need to consider whether that is indeed sound reasoning, whether it is a sufficient reason to exclude all contributions by building-scale RWH systems all of the time, or whether we should rethink it.

Might, for example, society be better served by planning for building-scale RWH systems within a “conjunctive use” strategy, under which whatever the backup supply source is would have that capacity “reserved” in some manner? Just as this concept is applied to co-managing surface water and groundwater, so that one source might “fill in the gaps” of the other source’s capacity. To do this of course would require conscious consideration of and planning for building-scale RWH as a contribution to area-wide water supply. Which is absent at present, and so this matter remains “fallow”.

There are hundreds if not thousands of houses, businesses too, around here where folks are making building-scale RWH work as a water supply strategy, successfully arranging for whatever backup supply their systems need on an ad hoc basis. All of the water hauling companies that provide that backup supply report they are confident their business model will remain viable, so those supplies can be maintained into the future. So it would seem that building-scale RWH could indeed be a broad scale water supply strategy, with some intentional planning for assuring backup supplies are provided at need.

The situation can be summed up, that building-scale RWH is not meaningfully included in water resources planning in Texas, upon the “reasoning” that this method would not provide a “firm yield” through a repeat of the “drought of record”. This ignores any prospect for co-managing this strategy with the watershed-scale RWH systems to assure that whatever gaps in firm yield would result would be covered out of the watershed-scale systems. Does this not seem to show a lack of vision among the mainstreamers who control those planning processes?

In pursuit of society’s best interests, it is suggested that this whole viewpoint be revisited. This is another example of how we need to take a peek down the road not taken … so far. As that could make all the difference.

Can we pretty much take irrigation off the potable supply?

June 8, 2025

It is commonly reported that something like 40% of annual municipal water demand is typically used for landscape irrigation. This usage is also a major driver of demand peaking through the summer in our Central Texas climate. In Georgetown, for example, in a recent Inside Climate News article (you can read that here) on the city’s efforts to obtain water supply from the Simsboro Aquifer to its east, the city manager is reported to have said “most” of that water will serve new residential developments and will be used “primarily” to irrigate lawns and other neighborhood landscaping. So it does seem that taking this irrigation off the potable water supply would be of huge benefit, especially to cities, such as Georgetown, that are spending significant sums to import water from remote supplies. Shaving peak demands would be particularly beneficial, as peaking drives the costs of much of our water supply infrastructure – like those hugely costly pipelines from the Simsboro.

Thus, it is reasonable to investigate, how might we accomplish taking irrigation off the potable water supply system?

Consideration of what this might entail reveals three main strategies that could be pursued:

  1. Move to a more regionally appropriate landscaping ethic, minimizing turf and other landscaping that would need routine irrigation, prioritizing instead native plant landscapes, which would need far less irrigation – and would enhance the “sense of place” in the built environment.
  2. Employ building-scale rainwater harvesting (RWH) to create the irrigation water supply, utilizing a resource that otherwise may become a stormwater management problem.
  3. Create wastewater reuse systems to provide the irrigation water supply.

Along with all this, rendering landscape irrigation systems more efficient to reduce the demand to begin with, such as was reviewed on the Waterblogue here, must be an on-going effort, just as a matter of course.

On landscaping of a residential lot, simply because I am intimately familiar with it, I will use the landscaping at our house to illustrate the potential for measures 1 and 2. Here is the lot plan, showing the current configuration.

[click on image to enlarge]

A street view of the current state of the front yard landscape is shown below.

And here is a closer view of the front patio area, the house entry visual.

And a bit of whimsy – SHARK!

When I moved in back in 1997, this front yard was rather typical for the neighborhood: two non-native trees and a swath of St. Augustine turf. Beginning in 2002, I started to transform it to what you see above, taking out a chunk of turf at a time, taking down the non-native trees and subbing in mountain laurels, a lacy oak and a cedar elm. This native landscaping is rarely watered; once the plants are well established, it is irrigated only during drought stress periods, and sparingly then, so it incurs rather minimal irrigation water demand. The exception is the blackberries that I decided to try growing a couple of years ago, which are in front of the raised planter with the accordion trellis on top of it. These plants do require routine irrigation to be viable, but the total amount of water they require is rather minimal.

In the back yard, we are doing some food gardening, in the raised beds, the in-ground potato bed, and blueberries in containers …

… which all does have to be watered frequently. But here again the areas are small, so the total water demand for the food crop irrigation is low, and generally all supplied from the rainwater tanks. We also have some citrus trees in containers – lemons and limes …

… and more blackberries in the back yard.

The sandbox, shown below, which my grandson has outgrown, is planned to become a milkweed bed to feed monarch butterflies.

As you can see in the picture above, the trellises along the yard walls, and in the picture below, around the veranda cover …

… are native plants, which are irrigated only during extended drought stress, when a bucket of water will be spread on them every so often. There are a number of hanging baskets and potted ornamentals that do require routine watering, but again the total amount of water required is very low.

The remainder of the back yard is covered with a small rain garden – you see that on the left edge of the picture above, it was featured previously on the Waterblogue here – with mulched beds containing native plants, and with non-irrigated “turf” areas. In the summer those “turf” areas become mainly populated with horseherb as the grasses fade – you can see that ground cover at the bottom of the picture just above. During long drought periods the horseherb becomes largely “dormant” – but always surges back green when the rains come.

You can see then that it is quite possible to move to a more regionally appropriate landscape style here in Central Texas without giving up on having an attractive, interesting landscape. A landscape requiring minimal irrigation does not have to be, as an Austin city council member once dimly put it, all rocks and cactus. While the turf aesthetic does still dominate the neighborhood, I have witnessed a number of similar transitions of front yards in the 20+ years since I started transforming mine. And the appearance of more and more rainwater tanks.

Leading us to strategy 2, using roof-harvested rainwater instead of potable water for irrigation supply. As shown on the lot plan and in the pictures below, we have a rainwater tank at each corner of the house, to capture all the roof areas.

The tank capacities, shown on the lot plan, total 2,875 gallons of storage capacity. As you can see in the tank pictures, the shorter tanks are set on pedestals so that the tops of the tanks are all at about the same elevation. This was done because the tanks are tied together by 1-1/2” pipes running underground between them, so they all operate as one hydraulic unit, all overflowing at the same elevation. The tanks are tied together because each intercepts different roof areas; if they were not, one tank could be overflowing before the others are full. Tying them together maximizes the collection efficiency off the entire roofprint.

So far, this RWH system has been efficient enough that over the last few years, since we started growing food as well as maintaining the visual landscape, only in the depths of the drought in the summer of 2024 did we have to use any significant amount of potable water for irrigation. You can see then that even such a limited RWH system can largely maintain a regionally appropriate native landscape, such as we have, as well as some food gardening.

Turning now to strategy 3, wastewater reuse. Here in the middle of the city on such a small lot as ours, reuse would generally not be feasible. Unless of course Austin Water were to install the “purple pipe” system redistributing reclaimed water from its centralized treatment plants into neighborhoods like ours, a very unlikely scenario, since that would be inordinately expensive. On any lot out in the countryside, however, that has a “septic” system – or OSSF (On-Site Sewage Facility) in regulatory system speak – folks could install a system that produces and reuses reclaimed water, as is reviewed on the Waterblogue here, and on my company website here. Really, just about every OSSF could be a reuse system, defraying whatever amount of irrigation may be needed on each lot. Cumulatively, over this region, that could conserve a lot of water.

In new development, however, a collective wastewater reuse system to defray irrigation demands can be quite feasible, even saving money relative to the cost of installing and running a conventional wastewater system to serve these developments. As was reviewed in “This is how we do it”, the dispersal fields of a decentralized concept wastewater system could be arrayed to irrigate front yards, parkways, parks, the margins of walking trails, etc. The illustration from that blog is shown below. By this means, a large portion of irrigation demands in new development – of any sort, not just residential projects – could be cost efficiently carried by wastewater reuse.

[click on image to enlarge]

Before proceeding, it is noted that by integrating rainwater harvesting with the water quality management strategies that development rules in this region typically require, a further contribution to irrigation water supply can be realized in the sort of development illustrated above. This strategy was reviewed on the Waterblogue in “… and Stormwater Too”. The illustration of the strategy in that piece is shown below. The rooftop runoff, and perhaps the water falling on the driveway too, can be stored and used for irrigation water. Depending on the volume of storage provided, this could largely obviate using potable water for back yard irrigation, on top of the savings from wastewater reuse to irrigate the front yards. Between them, pretty much taking irrigation off the potable water supply.

Now let’s look beyond the residential lot level, to multi-family and commercial developments. Below is the street-facing landscape of an apartment village, a large area of turf.

And right across the street from that, we see the same thing, very large areas covered by turf, that appear to be well irrigated, given that they stay as green as they appear in these pictures through the Austin summer.

Note that these turf areas serve no purpose that would dictate that turf must be the landscape. No, this is simply an aesthetic, someone’s evaluation of what such areas “should” look like. Might such an evaluation be called to question, on sustainability grounds? Why could not such areas be instead native wildflower meadows? Looking something like this?

This sort of landscaping would present the project’s street view just as well, in any functional sense. Indeed, it would impart a “sense of place” to the site. And it would require far less irrigation to maintain it in “presentable” condition. It would also require much less routine maintenance, perhaps “trimming” a couple times a year, and no applications of fertilizers, herbicides or pesticides. So the labor and materials inputs to maintain such areas would cost the project owners less, beyond the avoided cost of irrigation water.

Now in such projects as those apartment communities shown in the above photos, it may indeed be reasonable to implement a point-of-use wastewater reclamation system – again in the same manner as was illustrated in “This is how we do it” – to produce whatever irrigation water may be needed to maintain such landscapes. As well as to irrigate the turf that “normally” covers the grounds of such projects, like we see in the picture below.

Being within the area already served by the conventional sewer system, such a project-scale wastewater system could provide the reclaimed water for irrigation on demand, and whatever treated water flow is not required for irrigation supply could simply overflow into the sewer, reducing the organic load on the conventional wastewater system.

It’s an open question of course what the payback on avoided irrigation water costs may be – given that water is currently priced well below the costs of replacement water supplies, such as importing water from remote aquifers – or what the operations and maintenance costs of such project-scale systems may be, understanding they are essentially “redundant” to the conventional system. These are matters that should be investigated, comparing these costs to the long-run marginal costs of other options for obtaining new water supplies, given the sustainability benefits of taking irrigation water supply off the potable water system. Noting that, while such actions might appear to be a stretch as retrofits of existing development, it may be quite cost efficient to design them into the very fabric of new development.

It is also noted in passing that covering such areas with wildflower meadows instead of turf would impart stormwater management benefits. This is because the wildflower meadow would have a somewhat lower propensity to produce runoff than would the same area covered by turf. “Curve Number” (CN) is the measure of the runoff potential of a soil-plant complex under the Soil Conservation Service method for evaluating stormwater drainage issues; the higher the CN, the more runoff would issue from the area, given the same rainfall. For a wildflower meadow on the areas shown in the pictures of the apartment communities above, CN would be about 65, while if covered by conventional turf it would be about 79. The wildflower meadow would not start issuing runoff until a rainfall depth of 1.08”, while the turf area would start issuing runoff at a rainfall depth of only 0.53”, so on an annual basis the wildflower meadow would issue somewhat less runoff to be managed in stormwater facilities.
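The runoff thresholds quoted above follow from the standard SCS curve number relations, where potential retention is S = 1000/CN − 10 (inches) and runoff begins once rainfall exceeds the initial abstraction, conventionally taken as Ia = 0.2S. A short sketch of those relations:

```python
# SCS curve number relations: potential retention S, initial abstraction Ia,
# and runoff depth Q for a given rainfall P (all in inches). This uses the
# conventional Ia = 0.2*S assumption of the SCS method.

def retention(cn: float) -> float:
    """Potential maximum retention S (inches) for a given curve number."""
    return 1000.0 / cn - 10.0

def initial_abstraction(cn: float) -> float:
    """Rainfall depth (inches) below which no runoff occurs."""
    return 0.2 * retention(cn)

def runoff_depth(p: float, cn: float) -> float:
    """SCS runoff depth Q (inches) produced by rainfall P (inches)."""
    s = retention(cn)
    ia = 0.2 * s
    if p <= ia:
        return 0.0
    return (p - ia) ** 2 / (p - ia + s)

for label, cn in [("wildflower meadow", 65), ("conventional turf", 79)]:
    print(f"{label}: CN {cn}, runoff begins at {initial_abstraction(cn):.2f} in, "
          f"Q for a 3-in storm = {runoff_depth(3.0, cn):.2f} in")
```

Running this reproduces the 1.08” (CN 65) and 0.53” (CN 79) runoff-onset depths cited in the text.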

And these wildflower meadows shouldn’t need any stormwater quality management at all; they themselves might even serve as the stormwater quality management device for runoff from development upslope. Peak flow rates to be managed would also be somewhat reduced. Applied at scale in a watershed, this could significantly reduce the costs of installing and maintaining stormwater ponds, rain gardens, and such. Furthering the benefit of adopting the more regionally appropriate landscaping ethic.

Still, however, there will be turf to irrigate. In discussing this matter with the city councilman representing my district, he asserted that folks in an affluent area of town would not willingly part with their turf-covered yards. So let’s look at that sort of situation.

Looking at the area of town the councilman referred to on Google Earth, I see the lot shown in the picture below. Not at all atypical of that neighborhood, this one just happens to be fairly free of tree cover, making it easier to measure the area of turf and of roofprints available for rainwater harvesting.

Roughly measuring off Google Earth, I calculate there is about 3,500 sq. ft. of roofprint covering the house and the garage, and I calculate there is about 5,000 sq. ft. of turf landscape, and minimal other landscaping on this lot. So let’s look at how much water would be required to keep that turf looking lush, and the prospects for supplying it with other than potable water.

Inserting those areas into the rainwater harvesting model, programmed with Austin historic rainfalls over the period 2007-2023 and an irrigation water demand profile suitable for turf around here, and presuming a 10,000-gallon cistern were installed, RWH would have covered 62% of the total irrigation water demand over this period. In the severe drought year of 2011, only 24% coverage would have been provided by harvested rainwater. 100% coverage of irrigation demand would have been provided in only one of the 17 years covered by this model.

If a 20,000-gallon cistern had been installed, the overall coverage would have risen to 81%, with the 2011 coverage rising to 37%. 100% coverage would have been provided in 9 of the 17 years covered by the model. Increasing cistern size to 30,000 gallons, overall coverage would have risen to 90%, and the 2011 coverage to 51%. 100% coverage would have been attained in 11 of the 17 years.
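The coverage figures above come from a daily water-balance model. As an illustrative sketch of how such a model works, here is a minimal version, using a synthetic rainfall series and an assumed turf demand profile rather than the actual Austin record, so the percentages it prints are not the ones quoted above:

```python
# Illustrative daily water-balance of a roof-harvest irrigation system:
# rain fills the cistern (overflow is lost), irrigation demand draws it down,
# and coverage is the fraction of total demand met from storage. The rainfall
# series here is SYNTHETIC (~33 in/year in sporadic events); the real model
# used Austin daily rainfall for 2007-2023.

import random

GAL_PER_SQFT_PER_INCH = 0.623   # gallons per sq ft of roof per inch of rain
ROOF_SQFT = 3_500               # roofprint from the example lot
COLLECTION_EFF = 0.85           # assumed losses to first-flush, splash, etc.

def simulate(cistern_gal, daily_rain_in, daily_demand_gal):
    """Return the fraction of total demand met from the cistern."""
    stored = met = total = 0.0
    for rain, demand in zip(daily_rain_in, daily_demand_gal):
        harvest = rain * ROOF_SQFT * GAL_PER_SQFT_PER_INCH * COLLECTION_EFF
        stored = min(cistern_gal, stored + harvest)   # excess overflows away
        draw = min(stored, demand)
        stored -= draw
        met += draw
        total += demand
    return met / total

random.seed(1)
days = 365 * 5
rain = [random.expovariate(1 / 0.45) if random.random() < 0.2 else 0.0
        for _ in range(days)]
# Assumed demand profile: heavy turf irrigation in summer, light otherwise
demand = [450 if 120 < (d % 365) < 270 else 100 for d in range(days)]

for size in (10_000, 20_000, 30_000):
    print(f"{size:,}-gal cistern: {simulate(size, rain, demand):.0%} of demand met")
```

As in the modeled results above, coverage rises with cistern size but with diminishing returns, since beyond some point the roof yield, not storage, is the binding constraint.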

It would of course need to be investigated where a cistern might be accommodated on the lot and what such a rainwater harvesting system would cost. Besides the cistern, the costs of proper guttering, collection lines, a treatment unit – consisting of only a cartridge unit for this irrigation usage – pressurization facilities, and on-going system operations and maintenance would have to be calculated to determine the amortized cost of this rainwater supply. Once again, payback may not appear favorable relative to the defrayed cost of the potable water supply for this irrigation, since the actual replacement cost of an equivalent amount of water supply from other new sources would no doubt be much higher than the current water rates charged by Austin Water. But it may compare favorably to the long-run marginal costs of other new supply sources.

Turning now to on-lot wastewater reuse for irrigation supply, as was noted above, installing an OSSF to produce reclaimed water would greatly defray draw on the potable water supply. As for what such an on-lot treatment unit might look like, we see such an installation on a lot in an urban fringe neighborhood in the picture below.

As you see, not really very “intrusive”, with only the top of the filter bed and the tank hatch lids visible on the surface. If this were deemed too visually intrusive, these facilities could be tucked into a corner of the lot and screened with plants, such as the Texas sage plants in the above picture.

How much irrigation water this system would produce would of course be set by the amount of wastewater produced in the house, which would be largely driven by the house population. Presuming a 4-person occupancy, the average daily flow would likely be in the range of 200 gallons/day. With a 5-person occupancy, it might be about 250 gallons/day. Over the 5,000 sq. ft. of turf area on the lot we are looking at here, this would provide an irrigation rate of about 0.45-0.56 inches/week. That would be more than sufficient over 7 months of the year, but deficient in the other 5 months, May thru September, which is the peak irrigation season. In those months this reclaimed water flow would provide between about 1/2 and 2/3 of the total modeled irrigation demand. RWH could defray the rest.
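The conversion from a daily reclaimed-water flow to an equivalent irrigation depth is simple unit arithmetic, sketched below. The function name is mine, not from any published model; the only constants are the U.S. gallon (231 cubic inches) and square inches per square foot.

```python
# Convert a reclaimed-water flow (gallons/day) applied over a turf area
# (square feet) into an equivalent irrigation depth (inches/week).

CUBIC_INCHES_PER_GALLON = 231  # definition of the U.S. gallon

def irrigation_depth_in_per_week(gal_per_day: float, area_sq_ft: float) -> float:
    weekly_volume_cu_in = gal_per_day * 7 * CUBIC_INCHES_PER_GALLON
    area_sq_in = area_sq_ft * 144  # 144 square inches per square foot
    return weekly_volume_cu_in / area_sq_in

# 4-person household (~200 gal/day) and 5-person (~250 gal/day), 5,000 sq. ft. turf
print(round(irrigation_depth_in_per_week(200, 5000), 2))  # -> 0.45 in/week
print(round(irrigation_depth_in_per_week(250, 5000), 2))  # -> 0.56 in/week
```

Which reproduces the 0.45-0.56 inches/week range cited above.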

Again, it would have to be investigated what such an OSSF would cost, including the subsurface irrigation system that would disperse the reclaimed water underground, to minimize contact hazard potential in such an “unsupervised” environment. This would set the cost of this new water supply, which could then be compared with the long-run replacement cost of water supplies by other means.

While it may appear questionable whether providing new water supply on-lot with rainwater harvesting at the scale reviewed above, or by installing an OSSF, would “pencil out”, it is noted that no such options have so far been considered and evaluated by Austin Water as it conducts planning under its Water Forward program, touted as a 100-year water plan for Austin. Given that long time frame, it is called to question whether failing to consider such “outside the box” options as these is missing an opportunity.

Noting again that while retrofits such as reviewed here may not be deemed fiscally viable, the situation in new development, where these facilities may be designed in from the start, may be very favorable. Cities in the situation Georgetown finds itself in – and San Marcos and Liberty Hill and Dripping Springs, anywhere that growth is straining current water supplies – could go a long way toward blunting their problems by taking the 3 measures we’ve reviewed to pretty much take irrigation off the potable water supply:

  1. Move to a more regionally appropriate landscaping ethic.
  2. Employ building-scale rainwater harvesting to create the irrigation water supply.
  3. Create wastewater reuse systems to provide the irrigation water supply.

These opportunities are there for the taking in many, many situations all over this region. Again, particularly in new development – understanding it is growth that is largely straining current water supplies – where these strategies could be essentially designed into the very fabric of the development, providing opportunities to actually save money while saving water, as has been argued in several other Waterblogue posts – see for example here, here and here.

Despite this, such strategies have not appeared in any of the regional water plans, nor as noted in the Water Forward process in Austin. It is suggested that water planners give these strategies a bit more attention than they have received so far. As was noted on the Waterblogue in this post, society would be well served by taking a peek down the road not taken … so far, as that could make all the difference. Are the water planners around here up to that task?

Astounding in its lack of ambition

December 5, 2024

“Truly, Councilman, can you stand behind a plan that proposes to focus on an infrastructure model that, by Austin Water’s own reckoning, will only attain 12% reuse in 55 years? Do you not find that astounding in its lack of ambition?”

That is the message I wrote to the Austin City Council member representing my district, following up my argument to Austin Water about Water Forward, the city’s plan for how water is to be managed around here over the next 100 years. Sent under the subject line, “Austin Water Backward” – stark, and yeah, a bit snarky. But read the argument for yourself, and make your own determination if clinging to the past’s business as usual is likely to be the most fiscally sound plan, the plan that would best serve the local and regional water economy over the next century. Or whether, since we have management concepts available that could approach 100% reuse at the project scale, that backward looking plan is really Austin’s pathway to sustainable water, or whether it lacks sufficient ambition to get us there.

*************************************************************************************

A look at the table entitled “Summary of Water Forward 2024 Strategies, 2030-2080” on page 31 of the 2024 Water Forward plan update shows that Austin’s planning is rooted in the presumption that the prevailing, essentially 19th century water infrastructure model will simply be extended and perpetuated, with little focus on anything else. I argue that this sort of backward looking strategy will not be a sufficiently ambitious pathway to sustainable water, and that Austin Water’s apparent conclusion that it is the cost efficient manner of proceeding is likely rooted in flawed analysis.

The situation is exemplified by the “Non-Potable Water Reuse Strategies” part of the table, recapitulated below. This indeed indicates the expectation that wastewater system development going forward will continue to focus on piping flows from hither and yon to centralized treatment plants, and that non-potable reuse will continue to be focused mainly on the “purple pipe” system redistributing reclaimed water from those plants to points of use. The latter is a strategy that has been, throughout its 30 year operating history, kept afloat by infusions of money from other sources, and is not expected to ever pay for itself. Yet, the table shows, Austin Water seems intent on doing it mainly that way.

Non-Potable Water Reuse Strategies (Acre-Feet/Year)

Water Forward Strategies | 2030 | 2040 | 2050 | 2060 | 2070 | 2080 | Annual Cost ($/ac-ft/year)
Centralized Reclaimed | 1,100 | 8,200 | 12,900 | 17,600 | 22,300 | 26,900 | $2,243
Decentralized Reclaimed | 0 | 200 | 500 | 800 | 1,100 | 1,300 | $5,158
Onsite Reuse | 1,100 | 4,000 | 5,700 | 7,300 | 9,000 | 10,600 | $8,957
Total Reuse Attained | 2,200 | 12,400 | 19,100 | 25,700 | 32,400 | 38,800 | –

On “decentralized reclaimed”, I’m still wondering how that whole idea is understood at Austin Water, as outreach efforts to discuss this matter have been … well, I’d say rebuffed, but to be rebuffed, there would have to be evidence that it was looked at to begin with … Sorry, snarky again. But just highlighting that all throughout the decade that Water Forward has been in process, this whole idea – that can approach 100% reuse on each project – has been marginalized. The table indicates so little consideration of this strategy that it’s expected NO “decentralized reclaimed” would come on line by 2030. And it shows that “decentralized reclaimed” is projected to be a paltry 5% of “centralized reclaimed” reuse in 2080. This does indicate there will be little focus on if and how we might transform the infrastructure model over the next 55 years!

That 55-year timeline we’re talking about here bears some consideration. In 1999 I wrote a piece, aimed at fostering a more focused discussion of the then-pending 50-year water deal with LCRA, entitled “Is a Billion Dollar Crapshoot Our Best Public Policy?” The example of “silicon” plants – a totally unknown technology in 1949 – having become a significant part of the industrial sector here, drawing a sizable population to Austin, was offered to illustrate that it’s pretty impossible to know with any certainty what Austin will be like 50 years into the future. Indeed, it’s a crapshoot. The article posed the many ways the water future may be different from the past, how the water infrastructure model may evolve considerably over the ensuing 50 years, all being echoed by many “futurists” even back then, a quarter century ago. Most of which could be encapsulated under the now popular banner of “One Water”, the idea that all water flows have resource value that should be maximized, rather than those flows being “disposed of”. A perusal of that review would immediately illuminate that Austin Water has not really pursued much of that over the last 25 years. And again, the apparent presumption of the 2024 Water Forward plan is that, in regard to wastewater management, little will be done in that vein over the next 55 years.

Which we can see by examining the projections in the Water Forward plan. Austin Water seems not to want to share the magnitude of current wastewater flow. The best estimate I’ve been able to derive is from the latest rate study, which seems to imply that current wastewater flow is about 87,000 acre-feet over the year. If this is taken as the flow in 2030 – noting it will undoubtedly increase by then – reuse through the “purple pipe” system in 2030 would be only 1100/87000 = 1.3% of total wastewater flow. This shows the limited reach of this mode of reuse: after 30 years of developing the “purple pipe” system, this is the paltry level of reuse being attained.

Water usage projections in the Water Forward plan can be parsed to guess that wastewater flow in 2080 would be around 2.5 times the current flow, or 2.5 x 87000 = 217,500 acre-feet over the year. Presuming this basis, the table shows expected “centralized reclaimed” reuse through an expanded “purple pipe” system would be 26900/217500 = 12.4% of total wastewater flow. A robust growth, at almost a 10-fold increase in share of total flow, but still far short of making the “One Water” ideal, utilization of the water resource rather than “disposal” of a perceived nuisance, the focus of this societal function. Certainly, since the collection system typically represents 70-80% of total wastewater system cost, the pipe-it-away infrastructure model reflects that fiscal resources would be far more focused on “disposal” than on reuse.

Meanwhile, it’s projected that “decentralized reclaimed” would have achieved a penetration of only 1300/217500 = 0.6% by 2080. Reflecting, over 55 years, an almost total lack of focus on if, where and how to transform the infrastructure model, apparently rejecting the “One Water” decentralized concept strategy, which can approach 100% reuse on each project.
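The reuse shares quoted in the last few paragraphs are simple ratios of the table’s acre-feet figures to the estimated total wastewater flows. A sketch of that arithmetic, noting that the 87,000 and 217,500 acre-feet flow figures are this post’s own estimates, not numbers published by Austin Water:

```python
# Reuse as a share of estimated total wastewater flow (acre-feet/year).
FLOW_2030 = 87_000    # rough current flow, taken as the 2030 basis
FLOW_2080 = 217_500   # ~2.5x current flow, per the plan's usage projections

def share_pct(reuse_af: float, flow_af: float) -> float:
    """Percent of total wastewater flow represented by a reuse figure."""
    return 100 * reuse_af / flow_af

print(f"Centralized, 2030:   {share_pct(1_100, FLOW_2030):.1f}%")   # -> 1.3%
print(f"Centralized, 2080:   {share_pct(26_900, FLOW_2080):.1f}%")  # -> 12.4%
print(f"Decentralized, 2080: {share_pct(1_300, FLOW_2080):.1f}%")   # -> 0.6%
print(f"Onsite reuse, 2080:  {share_pct(10_600, FLOW_2080):.1f}%")  # -> 4.9%
```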

Looking at costs for these options, Austin Water projects the cost of implementing “decentralized reclaimed” would be 2.3 times that of “centralized reclaimed” reuse, $5,158 vs. $2,243 per acre-foot/year. One might guess this is their basis for continuing to focus system development on the prevailing conventional pipe-it-away infrastructure model. But based on everything I have observed in this field, this does not make any sense. The decentralized concept strategy, in the setting of urban fringe and hinterlands development, consistently proves more cost efficient on a global cost efficiency rating. As set forth, for example, in This is how we do it and, more explicitly, in Let’s Compare. Which is likely the key to why the Austin Water projections show the opposite, rather extremely so; global cost accounting is not their standard.
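The cost multiples cited here and below follow directly from the annual-cost column of the table:

```python
# Ratios of the table's annual costs ($/acre-foot/year).
centralized, decentralized, onsite = 2_243, 5_158, 8_957
print(round(decentralized / centralized, 1))  # -> 2.3 (decentralized vs. centralized)
print(round(onsite / centralized, 1))         # -> 4.0 (onsite vs. centralized)
```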

One guesses that the costs of “centralized reclaimed” are limited to the “purple pipe” system, not accounting for the cost of “producing” that reclaimed water flow to begin with – the costs of wastewater collection and treatment – and probably not accounting for costs to implement end uses either. While for “decentralized reclaimed”, it is guessed that the total system costs were presumed – the costs to collect, treat, redistribute and utilize this water resource. Rather distorted, not an apples-to-apples comparison. But what Austin Water seems to be basing its desire not to examine the infrastructure model upon, rather wanting to simply forge ahead presuming the future will be just like the past.

Now of course Austin Water can legitimately claim that the costs of collection and treatment to “produce” that reclaimed water supply are sunk costs, as those facilities are already in place, so should not be accounted as a cost factor for “centralized reclaimed”. True enough for the portion of the system that is currently in place. Remember though we are looking at policy and strategy over a 55 year timeline here, over which it is projected that wastewater flow would increase 2.5-fold. This will require that lines be extended and/or upgraded, new lift stations would be needed and/or current lift stations upgraded, and new treatment capacity would have to be added. None of these are sunk costs.

Austin Water might also argue that these new costs would be incurred in any case to impart wastewater management regardless of whether the effluent would be reused. But of course that argument only holds water if wastewater system capacity would be expanded only via extension of the prevailing conventional pipe-it-away infrastructure model. If areas of new development – on the urban fringe or in the hinterlands where lines would need to be extended, or for infill development that would require upgrading of existing facilities all along the way from the development to the centralized treatment plant – were to be managed instead by pursuing the “One Water” decentralized concept strategy, those costs may be obviated, saving as noted in First ‘Logue in the Water untold costs by never having to build another trunk main. Or to extend/upgrade the loss leader “purple pipe” system.

This leads us to the “onsite reuse” category, for which about a 10-fold growth is projected, from 1,100 acre-feet/year in 2030 to 10,600 acre-feet/year in 2080. Impressive on the surface, but is it really? When EVERY commercial and institutional building is a candidate for this form of water conservation, what level of dedication to reuse does a 10600/217500 = 4.9% penetration over 55 years represent? One wonders what Austin Water thinks would so limit proliferation of this practice.

For one thing, maybe it’s a poor appreciation of “reuse”? Austin’s highly touted Onsite Reuse Ordinance does not actually very much address reuse. It’s focused on site-derived flows, mainly harvested rainwater and condensate, the usage of which would be “original” use, not REuse. That is a term of art generally applied to reclaimed wastewater, which this ordinance gives very short shrift. It does not even ALLOW “onsite” reuse of reclaimed wastewater except by variance. So it appears that Austin Water is really not that into this “One Water” idea of reuse at the building scale.

Which is a shame, since in its most “advanced” form, this strategy would render commercial and institutional buildings, or whole campuses of such buildings, water independent, running on Zero Net Water, as laid out over a decade ago. The legitimacy of which is now well recognized, being marketed, for example, by Texas Water Trade under the banner of “Net Zero Water”. But Austin Water seems to not expect such developments to ever “unhook” and become their own water supply centers, rather they might only just defray some non-potable usage, while the prevailing conventional pipe-it-away infrastructure model will continue to be the major mode of management across such developments. Failing to approach the essentially 100% reuse that is readily available here.

As to the cost of “onsite reuse”: at $8,957, 4 times that of “centralized reclaimed”, it doesn’t seem all that cost efficient. But … it is not likely that relieved capacity on either the potable water supply system or the centralized wastewater system is accounted for. If a building were to go Zero Net Water, there would be no need to extend – or in the case of the “nodal densification” that was all the rage in the Imagine Austin program some years ago, upgrade – waterlines to and wastewater lines from these developments. Little doubt none of this was recognized in the cost accounting of these “onsite reuse” systems. Here again, the cost of a system to collect, treat, redistribute and utilize the site-derived water flows is likely being compared to the cost of ONLY the redistribution system for the “centralized reclaimed” option. Not the apples-to-apples comparison that, in a rational process, would be the standard.

Also, the asserted cost may be inflated over what it needs to be, because it has so far appeared that Austin Water, and Texas Water Trade as well, presume that the treatment units for building-scale reuse would be activated sludge units, imparting high operating costs. As discussed in Appropriate Technology, using technologies more appropriate to the building scale would render these systems more cost efficient, so would further reduce – eliminate? – the asserted cost disadvantage of building-scale reuse.

All of this reflects that there seems to be no “Visionary-in-Chief” at Austin Water, and so the process is being controlled by “mainstreamer” viewpoints, failing to imagine that the future might be anything but an extension of the past. So what we have here could indeed be quite fairly seen not as a Water Forward plan, rather a Water Backward plan. Certainly, in regard to the matters discussed here, a backward looking plan. One which just presumes that the future will be very like the past, just presumes that the prevailing, essentially 19th century water infrastructure model will be extended and perpetuated, world without end.

As was set forth in One More Generation, we need to get past this backward looking mental model, essentially just rearranging the deck chairs as the ship goes down. On behalf of everyone’s children and grandchildren, we should all ask Austin Water to consider if this is the manner in which it really should be proceeding. Should we really be good with a plan this astounding in its lack of ambition?

If you’re going to call it a vision …

January 29, 2024

A development called Mirasol Springs is being proposed in Central Texas, along the Pedernales River on the Travis County–Hays County border. The development scheme is shown in the schematic below. It includes a “resort” hotel (the Inn), “branded residential” homes, “resort” cottages, “conservation” lot houses, a research facility, and a farm. This area is a somewhat “pristine” landscape, in particular including much of the – so far – “undisturbed” Roy Creek watershed, renowned by naturalists as a great example of a “native” Hill Country landscape. Thus it is considered an imperative to develop in this area with great sensitivity to this landscape, in particular in regard to water resources management, to blunt the draw on this area’s limited water supply resources and to minimize water quality degradation. The developer’s scheme proposing to accomplish that is set forth on the project’s website here, offering his team’s vision of how to best manage water resources – water supply, “waste” water management, and stormwater management. The “header” of this page reads, “Mirasol Springs will set a new standard for environmentally focused Hill Country development.” Raising the question, would it really?

As reported in “One Water” = the “Decentralized Concept”, there is a broadly supported, but so far largely unrealized on the ground, idea that engineering practice in this area needs to move toward “One Water” practices, and that a better understanding of the way we do this needs to be fleshed out. It was argued in that post that the “One Water” ideal would be most effectively and beneficially delivered by designing efficient water management into the very fabric of development, as if it were a central point, rather than to first arrange for water to “go away” and then to attempt to append on that efficient management at the “end of the pipe”, as if it were an afterthought. With the inevitable conclusion from this being that imparting “One Water” practice will rely in large part on employing distributed management schemes, such as the decentralized concept described in the piece linked above. Let’s take a look at how all that might play out in a setting like Mirasol Springs.

Water Supply

In the vision the developer sets forth on the project website, listed under “Water Use” are four components:  surface water, reclaimed water, rainwater harvesting, and groundwater.

Under “surface water”, the website states, “Surface water purchased from the LCRA [Lower Colorado River Authority] will be the base water supply for Mirasol Spring’s [sic] potable water and will meet 100% of our demand.” It is first brought to question, just how is this so very conventional idea – extracting a water supply from the watershed-scale rainwater harvesting system that supplies the vast majority of water supply in the Central Texas region – a “new standard”?

In this case, as can be seen in the graphic below, the proximate source of the potable water supply would be the Pedernales River, which runs along the border of the project site. The water withdrawn from the river would be pumped into a water supply reservoir to be built on the site. Water would be withdrawn from that reservoir and run through a water treatment plant. The treated water would be distributed in a conventional distribution system, routing water to all of the buildings on the development, requiring distribution lines to be extended to all the various developed areas on this site.

This conventional water supply system would entail a great deal of site disruption. This includes installing the intake structure in the river and a pump system and delivery pipe running up the bluff on the Mirasol Springs side of the river, and installing the reservoir, which would entail excavating the pond and distributing the excavated material on the project site. The distribution lines would cause disruption over and between the developed areas, in particular to get to all the “conservation” lots in the more “pristine” parts of the site, in the Roy Creek watershed.

Raising the obvious question, what could the developer do instead to create a water supply system for the project? Skipping to the “rainwater harvesting” component, the website states, “Rainwater collection from rooftops will be a requirement for larger structures constructed across the property, a practice that is already in use on the ranch. Deed restrictions for home sites will include water capture for irrigation purposes and guidelines for non-water intensive vegetative covers, water conservation-oriented landscapes and xeriscapes. Landscape irrigation on home sites will be restricted to rainwater collection only; no potable water will be allowed for landscape use.” While this laudably proposes to make rainwater collected on site from rooftops a primary supply for irrigation needs, it neglects considering the most “One Water thing” one could conceive here, maximizing the resource value of the water falling upon this site. So perhaps obviating all the expense and disruption of creating and operating the surface water supply system.

Consider the benefits of a water supply derived from distributed building-scale rainwater harvesting (RWH) vs. the surface water system. First and foremost is the efficient use of the area’s strained water resources. The very reason why the developer would pursue groundwater as a backup supply, reviewed below, is that they conceive the possibility that the Pedernales River would run dry, or dry enough to have their water supply curtailed. So why not consider the prospect of not depleting that surface water resource at all?

Second, as noted, with the facilities arrayed at the building scale, site disruption to install the storage pond and the water distribution system would be avoided. As would the inevitable leakage losses that plague such water distribution systems, so largely avoiding that often rather sizable source of water use inefficiency.

Third, the energy requirements to run the building-scale RWH systems would be considerably lower than those required to run the surface water supply system. In the latter, considerable energy would be required to lift the water from the river to the on-project water supply pond, and from the pond to the water treatment plant, to run the more energy-intensive surface water treatment unit, and then to pressurize and move water through the distribution system. In the building-scale systems, any lift from a cistern would be low and the water would only have to be run a very short distance. The treatment unit required to render the roof-harvested rainwater to potable quality would require far less energy than the conventional surface water plant. Not only would this be a fiscal plus for the MUD that will pay the energy bills; since it takes water to make energy – the so-called water-energy nexus – all this energy conservation would enhance the overall efficiency of the region-wide water system.

Fourth, under the surface water supply scheme, a considerable evaporative loss from the on-project water storage pond would be incurred, at its maximum just when drought would typically be at its worst. Evaporative losses from the covered building-scale cisterns would be minimal, a not-insignificant efficiency advantage for the building-scale RWH strategy.

Fifth and finally, pretty much the entire surface water system would have to be planned, designed, permitted and installed before the first building on the project could be provided a water supply. This is a hefty amount of up-front cost that must be incurred before any revenue-generating facilities may come on line, imposing a considerable “time value of money” detriment. The building-scale RWH facilities, on the other hand, don’t need to come on line until the building(s) each unit serves would be built, so the lag between incurring those costs and being able to derive revenue from each building could be much shorter. Also, it is expected that each of the building-scale systems – except for the Inn – would fall below the threshold to be classified as a Public Water Supply System, so the long and expensive process of permitting these systems through TCEQ could be avoided, a further “time value of money” benefit.

To determine the degree to which building-scale RWH could create a sufficient supply to meet the potable water demands in the buildings, a model would be used, into which the roofprint (water collection) area, the cistern (water storage) volume, and the expected water usage profile would be input. The model would be run over a number of years of historic monthly rainfalls to see how much, and how often (if at all), backup supply would have been needed in each year through the cycles of drought and plenty, and how much water supply would have been lost via cistern overflows during large storms and through extended rainy periods.
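The sort of model described above can be sketched as a monthly mass balance on the cistern. This is a minimal illustration only: the collection efficiency, rainfall series, and demand figures below are hypothetical placeholders, not the inputs used in any modeling cited in these posts.

```python
# Monthly mass balance on a rainwater cistern: inflow from the roofprint,
# outflow to demand, bounded by cistern capacity. Tracks the backup supply
# needed in shortfall months and the water lost to overflow in wet ones.

GAL_PER_SQFT_INCH = 0.623  # gallons collected per sq. ft. of roof per inch of rain

def simulate(roof_sqft, cistern_gal, rain_in, demand_gal, efficiency=0.85):
    stored, backup, overflow = cistern_gal, 0.0, 0.0  # start with a full cistern
    for rain, demand in zip(rain_in, demand_gal):
        stored += rain * roof_sqft * GAL_PER_SQFT_INCH * efficiency
        if stored > cistern_gal:               # cistern full: excess is lost
            overflow += stored - cistern_gal
            stored = cistern_gal
        if stored >= demand:
            stored -= demand
        else:                                  # shortfall: backup supply needed
            backup += demand - stored
            stored = 0.0
    return backup, overflow

# Hypothetical dry year: 12 monthly rainfalls (inches) and monthly demands (gallons)
rain = [0.5, 1.0, 0.8, 1.5, 2.0, 1.0, 0.3, 0.2, 1.2, 2.5, 1.0, 0.7]
demand = [2000] * 12
backup, overflow = simulate(3000, 10_000, rain, demand)
print(f"backup needed: {backup:.0f} gal, overflow lost: {overflow:.0f} gal")
```

Run over a multi-decade historic rainfall record instead of one synthetic year, this is the kind of tally that would reveal how much and how often backup supply would have been needed, and how much supply would have been lost to overflows.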

Based on the outcomes, “appropriate” building design, to increase roofprints – for example the “veranda strategy”, adding on covered patios and porches to add relatively inexpensive additional collection area – and “proper” cistern sizes, as well as the target conservation behavior, could be chosen to make the system as robust as desired. Past modeling of and experience with building-scale RWH in this region indicates that this strategy could provide a quite sufficient supply for much of the interior (potable) water uses at Mirasol Springs.

Through this means, the potential for building-scale RWH could have been evaluated, and the costs of using it could have been compared to the costs of the conventional surface water supply system. And the benefits of avoiding the site-wide disruption entailed in the conventional strategy could also have been evaluated. None of this appears to have been considered by the developer, rather it seems to have been simply presumed that the surface water supply, the watershed-scale RWH system, was “needed”, that building-scale RWH could be no more than an adjunct supply to defray irrigation usage. Opportunity to set an actual “new standard” foregone.

Now consider the “groundwater” component. The website says: “Groundwater will only be used if surface water is unavailable or curtailed. The goal of the project is to significantly limit the use of groundwater through conservation, including the use of reclaimed water and harvested rainwater, noted above, to meet non-potable water demands. When surface water is not available, Mirasol Springs will utilize groundwater to service the demand for domestic use. No groundwater will be used for landscape irrigation. Good stewardship of groundwater resources will be supported through additional planning and holistic water management measures. There will be no individual water wells. Water availability studies have demonstrated that adequate groundwater is available from the underlying aquifer when the project is required to use groundwater.” Quite a number of claims and caveats there to be considered.

While there is no indication what the parameters of the deal to purchase water from the LCRA may be, it is expected that any curtailment or unavailability would be predicated on the flow in the Pedernales River, which would rise and fall with cycles of drought and wetter times. If a drought were of such severity that river flow would drop so low that LCRA would curtail or ban further withdrawal of the surface water, it would be exactly such a time period that the region’s aquifers would also be under maximum stress.

There is no analysis, however, of “[w]hen surface water is not available”, and so when/if groundwater might be “needed” is entirely opaque. There is no indication, no standard for what would constitute “[g]ood stewardship of groundwater resources”, no idea offered for how those resources “will be supported through additional planning and holistic water management measures.” It all seems to be a “just trust us” proposition, hardly any sort of “new standard for environmentally focused Hill Country development.”

Thus, by plan, groundwater would be prevailed upon to carry the entire potable water supply just when that source too would be most stressed, and so when groundwater withdrawals would be most problematic. But again there is absolutely no analysis of when/if groundwater might be “needed”. So it may be called to question if indeed “adequate groundwater [would be] available from the underlying aquifer when the project is required to use groundwater.” There is no indication that a drought-stressed local aquifer could provide the full potable water demand over any given period, for this or any other developments in this area. Indeed, it is the questionable future condition of the local aquifer that urged the developer to look to a surface water supply to begin with.

Then too it can be called to question if the treatment requirements for a groundwater supply would be the same, using the same sort of treatment train, as for the surface water drawn from the river. Water quality of groundwater varies considerably across the Hill Country, and “over-drawing” aquifers can cause the quality of water from some wells to degrade. So this is another aspect of the overall scheme that appears to be a bit open-ended.

All this would be imparted by choosing to ignore the readily available “One Water” strategy, an actual “new standard” strategy, of maximizing supply from water falling onto this site.

“Waste” Water Management

While those water supply matters basically rest on analyses that the developer chose not to pursue, and almost certainly sells short the “One Water” supply strategy, in the “waste” water arena, there is a much more clear-cut choice. For the “reclaimed water” component, the website states:  “Mirasol Springs will reclaim wastewater from the Inn, the Farm, the University of Texas Hill Country Field Station, and all the home sites in a centralized collection facility that is aesthetically integrated into the landscape. There will be no septic systems. The facility will be equipped with the best technology available for nutrient removal and will reclaim 100% of the effluent for irrigation uses. This wastewater will be treated and used to offset irrigation needs for the property and other non-potable uses. No potable water will be used for landscape irrigation. Also known as beneficial reuse, this process completes the effort to maximize the lifecycle of water usage onsite. There will be no discharge into any creek or river. All wastewater will be collected and returned to a treatment plant.” This word salad begs for examination.

As stated, and as seen on the water systems graphic above, the developer is proposing that a conventional centralized system be installed, collecting all the “waste” water to be treated at one centralized facility. That includes the large “conservation” lots, entailing a rather long run of sewer line through the Roy Creek valley, the most environmentally sensitive portion of this site, to collect a relatively small portion of the total amount of “waste” water that would be generated on the overall project. For those lots, to avoid the disruption and pollution – and the cost – those lines would impart, the developer should consider on-lot systems, to treat and reuse the water on each lot to serve irrigation demands there. Which, I expect, requires a dose of perspective.

It is noted that the website explicitly states, “There will be no septic systems.” As if that were a good thing. One can read between the lines here that a “septic system” is deemed, at best, a second-rate option, and likely is presumed to be a source of pollution, one the developer sees as being eliminated by centralizing the “waste” water from the various lots. That ignores, of course, that components of the centralized system would themselves be pollution vulnerabilities. Conventional collection lines leak – longer runs of lines impart more leakage, and this becomes worse as the lines age – and manholes in those lines overflow. Lift stations, unavoidable in the terrain on this site, will inevitably fail and overflow at intervals. And all this is in addition to the widespread disruption of the landscape that would be entailed in installing the centralized collection system.

This could all be obviated by choosing to pursue a decentralized concept “waste” water management strategy, treating and reusing the “waste” water as close to where it is generated as practical. Again, for the dispersed “conservation” lots, this would be a no-brainer strategy, presuming the use of the sort of “septic system” that is equal to the task at hand: a system providing high quality pretreatment – including removal of a large majority of the nitrogen from the “waste” water prior to dispersal – consistently and reliably, while imparting rather minimal O&M liabilities. The reclaimed water would then be dispersed in a subsurface drip irrigation field, arrayed as much as possible to serve grounds beautification – the landscaping that would be irrigated in any case, whether the reclaimed water was there or not – so practically maximizing beneficial reuse of the “waste” water resource in the on-lot environment.

The High Performance Biofiltration Concept treatment unit – set forth for distributed treatment duty in “This is how we do it”, and more fully described here – fits the bill here, being by its very nature stable, benign and robust. This is the very treatment technology used, for example, at the highly touted Wimberley “One Water” school, exactly because of that.

Unfortunately, this treatment concept is not very broadly known, as the “septic system” market in Texas is so dominated by the “aerobic treatment unit” (ATU), which is a small, home-sized, bastardized version of the activated sludge treatment process, a process that is by its very nature inherently unstable, and even more unstable in these bastardized incarnations. And, as reviewed in “Averting a Crisis”, the “septic system” regulatory process in Texas is legendary for neglecting ongoing O&M, making it even more critical that “fail-safe” systems like the High Performance Biofiltration Concept be used, especially in a setting like Mirasol Springs.

It is noted, however, that assuring “proper” O&M “shouldn’t” be an issue at Mirasol Springs, as the entire “waste” water system, no matter how deployed, would be professionally operated and maintained by the MUD the developer proposes to establish to run the water utilities on this project. All the more reason to use the inherently stable and robust, more “fail-safe” High Performance Biofiltration Concept treatment unit instead of ATUs, to reduce the load on that O&M system.

Indeed, even the centralized treatment plant the developer proposes would be episodically loaded, with flows rising and falling through the diurnal cycle. So using an inherently unstable activated sludge system for that plant would be a vulnerability, urging the use of the “fail-safe” option there as well.

But again, the major vulnerabilities would be avoided by not centralizing all flows, rather by distributing the system to each building or set of buildings, as would be most cost efficient in each circumstance. Note in particular how this disperses risk. Any problem with the centralized treatment plant would affect the entire flow, while a problem with any of the distributed treatment units would affect only a minor fraction of the total flow. And again, there we would be using the low risk, “fail-safe” treatment technology.

But the developer foregoes this opportunity, in fealty to the conventional understanding that it is best to centralize all flows to one treatment unit, despite all the pollution potential, and disruption, inherent in gathering flows to that central point. And despite the cost of running the collection lines out to each developed area, and – if the reclaimed water is to be reused for landscape irrigation as the “vision” asserts – the redistribution lines to send water from the central treatment plant to the areas to be irrigated. Again, all that would be avoided under the decentralized concept strategy.

Then there is the matter of the “time value of money”. The centralized system is an “all or none” proposition. The treatment plant would be initially built with the capacity to treat flows from all the buildings on the project, while portions of the development would come on line in phases, so that some of the treatment plant capacity would lie idle in the interim until the project was built out. It is also likely that the collection and redistribution lines would all have to be installed to get the water from all areas to the treatment plant and back to irrigation sites. Here too, investments would sit in the ground, not fully utilized, until the project built out.

A distributed system would obviate all that idle investment. First, the collection and redistribution lines would not have to be installed at all. Then, the improved type of “septic system” noted above could be installed for each development area on a “just in time” basis, to serve only imminent uses. For development other than the “conservation” lots, this would likely mean a collective system serving more than one building at each treatment center, but still an overall distributed system, not requiring any investment in the larger-scale collection and redistribution lines. This further realizes the “time value of money” by building only those systems needed to serve imminent development, rather than having to plan, design, permit and install the entire centralized system before service could be provided to the first building.
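The “time value of money” point can be illustrated with a simple present-value calculation. To be clear, every number below – the costs, the phasing schedule, and the discount rate – is a hypothetical assumption for illustration, not a figure from the Mirasol Springs project; the sketch only shows why deferring capital to match build-out phases reduces the discounted cost of a system.

```python
# Hypothetical present-value sketch of "all up front" vs. "just in time"
# capital spending. All dollar figures, years, and the discount rate are
# illustrative assumptions, not project data.

def present_value(payments, rate):
    """Discount a list of (year, cost) payments back to present value."""
    return sum(cost / (1 + rate) ** year for year, cost in payments)

RATE = 0.05  # assumed annual discount rate

# Centralized: plant plus collection/redistribution lines built at year 0.
centralized = [(0, 5_000_000)]

# Distributed: the same nominal total, built in phases as areas develop.
distributed = [(0, 1_250_000), (3, 1_250_000), (6, 1_250_000), (9, 1_250_000)]

pv_central = present_value(centralized, RATE)
pv_distrib = present_value(distributed, RATE)
print(f"Centralized PV: ${pv_central:,.0f}")
print(f"Distributed PV: ${pv_distrib:,.0f}")
```

Even with identical nominal totals, the phased spending carries a lower present cost, and the gap widens if later phases are delayed or never built.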

As for treatment quality, the developer appears to presume a need for “the best technology available for nutrient removal”, even though all the reclaimed water would be dispersed in subsurface drip irrigation fields, providing all the treatment and “buffering” that the soil-plant complex offers. The High Performance Biofiltration Concept treatment unit can consistently and reliably produce an effluent with low – typically about 10 mg/L – BOD and TSS, the two basic measures of how well treated the water is. 20-30 mg/L is deemed “secondary” treatment, which is the minimum required to disperse the reclaimed water in subsurface drip irrigation fields. This is mainly to assure that drip irrigation emitters would not clog, as the level of “dirtiness” of the water as measured by BOD and TSS, as long as it is in the “secondary” range, is otherwise irrelevant in a soil dispersal system.

The High Performance Biofiltration Concept unit can also routinely remove a large majority of the nitrogen from the “waste” water, typically producing an effluent concentration of about 15 mg/L. Less than 20 mg/L is deemed to be a “safe” level that would, along with plant uptake and in-soil denitrification we have in this climate, result in a vanishingly small amount of wastewater-derived nitrogen flowing into environmental waters when dispersed in a TCEQ-compliant subsurface drip irrigation field.
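To put those concentrations in perspective, a back-of-envelope mass calculation is useful. The 15 mg/L effluent and 20 mg/L “safe” levels come from the discussion above; the per-home wastewater flow of 200 gallons per day is a hypothetical assumption for illustration only.

```python
# Back-of-envelope nitrogen loading sketch. The 15 and 20 mg/L values are
# from the text; the per-home flow is a hypothetical assumption.

GAL_TO_L = 3.785
flow_gpd = 200            # assumed wastewater flow per home, gallons/day
effluent_n_mg_per_l = 15  # typical biofilter effluent nitrogen, per the text
threshold_mg_per_l = 20   # "safe" subsurface dispersal level, per the text

flow_l_per_day = flow_gpd * GAL_TO_L
n_grams_per_day = flow_l_per_day * effluent_n_mg_per_l / 1000
n_kg_per_year = n_grams_per_day * 365 / 1000

print(f"Effluent N load: {n_grams_per_day:.1f} g/day "
      f"(~{n_kg_per_year:.1f} kg/yr per home, before plant uptake "
      f"and in-soil denitrification)")
print("Under the 20 mg/L dispersal level:",
      effluent_n_mg_per_l < threshold_mg_per_l)
```

Under these assumptions, each home disperses only a few kilograms of nitrogen per year into the drip field, before plant uptake and denitrification remove most of it.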

Phosphorus – the pollutant of greatest concern in discharges to streams – would be irrelevant here, since at the concentrations found in domestic wastewaters, phosphorus would be fully “sorbed” in any soil mantle decent enough to serve as a growing medium.

Bottom line, the High Performance Biofiltration Concept treatment system would deliver an effluent that would be highly protective of the environment, even in this sensitive area, assuming of course that the subsurface drip irrigation systems were well designed, well implemented and well operated. Which of course would be the same condition that the conventional system the developer proposes would have to meet.

The conclusion is that the decentralized concept strategy described here, utilizing distributed “fail-safe” treatment units and dispersing the reclaimed water into subsurface drip irrigation fields, would produce a “waste” water system for this project that would be more fiscally reasonable, more societally responsible, and more environmentally benign than would be offered by the conventional centralized system – with reuse appended on – that the developer proposes.

Stormwater Management

The website is light on how stormwater would be managed on this project. It states under the heading “Watershed Protection and Storm-water Runoff”, “The ultimate goal is to maintain the hydrology of the environment in its current state. This will be accomplished through short-term construction site management strategies that include silt fencing, soil berms and wattles to prevent erosion and silting of nearby streams.” Which seems to confine the effort to mitigating water quality degradation due to construction activities. Necessary of course as the development is being built, but the major task is indeed to “maintain the hydrology of the environment in its current state.”

In that quest, the website states, “There will only be a few homesites in the Roy Creek watershed, all with a 1,000-foot land buffer between the home [and] Roy Creek [sic]. The engineers recommend allowing for the native vegetation and soil to act as a natural ‘filter,’ as it has done for thousands of years, rather than trying to capture it and then release it from a pond or other structural water quality controls that would unnecessarily disrupt the natural character of the land.” The actual solution here is restricted to the “conservation” lots only, leaving it open what is to be done elsewhere, but implying the only option is the conventional view of stormwater management, that the site “should” be efficiently drained into an “end-of-pipe” facility – “a pond”. It seems to deny the “One Water” strategy of collecting and infiltrating the runoff on a more distributed basis, the Low-Impact Development (LID) strategy utilizing permeable pavement, Green Stormwater Infrastructure (GSI), etc.

The obvious measure of maintaining “the hydrology of this environment in its current state” is to render the rainfall-runoff response of the developed site as close as practical to that of the native site. This dictates that, up to the rainfall depth where runoff would begin on the native site – after all the “initial abstraction” was “filled” – all runoff should be intercepted and caused to infiltrate. While some of that infiltration might be imparted by flow over the “natural filter” in downslope areas, more generally some of that infiltration would have to be “forced”, with permeable pavement or by running the water through GSI, such as distributed rain gardens. It seems the developer has not considered this basic “One Water” concept, choosing to rely on a more conventional end-of-pipe management scheme, perhaps entailing the installation of grey infrastructure to convey flows from developed sites to ponds and such, as seems to be implied in the “Mirasol Water Systems” schematic above. It is called to question how well this could maintain a rainfall-runoff response very similar to that of the native site.
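How much development shifts the rainfall-runoff response can be sketched with the standard SCS curve number method (NRCS TR-55), which builds in exactly the “initial abstraction” idea discussed above. The curve numbers and storm depth below are illustrative assumptions, not values derived from the Mirasol Springs site.

```python
# Sketch of native vs. developed runoff using the SCS curve number method.
# Curve numbers and rainfall depth are hypothetical illustrations.

def scs_runoff(p_in, cn):
    """Runoff depth (inches) for rainfall p_in over a surface with curve
    number cn, using the conventional initial abstraction Ia = 0.2 * S."""
    s = 1000.0 / cn - 10.0  # potential maximum retention, inches
    ia = 0.2 * s            # initial abstraction
    if p_in <= ia:
        return 0.0          # rainfall fully absorbed; no runoff begins
    return (p_in - ia) ** 2 / (p_in - ia + s)

rain = 2.0                        # inches, an example storm
native_cn, developed_cn = 70, 85  # assumed curve numbers
print(f"Native site runoff:    {scs_runoff(rain, native_cn):.2f} in")
print(f"Developed site runoff: {scs_runoff(rain, developed_cn):.2f} in")
```

Under these assumptions, the developed surface sheds roughly three times the runoff of the native surface for the same storm, which is the gap that distributed LID/GSI infiltration would have to close to mimic the native hydrology.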

The website further asserts, “Considerations will also include restrictions on impervious cover to prevent run-off and divert water into the aquifer.” Disregarding the non sequitur, it should be clear that the LID/GSI strategy, infiltrating runoff from impervious surfaces on a highly distributed basis, is the manner in which one could reasonably “prevent run-off and divert water into the aquifer”, particularly on the more intensive portions of the development, like the Inn and resort cottages, and perhaps the “branded residential” homesites too. We’d just have to deal with pavement, since rainwater harvesting would basically take rooftops “out of play”. The water that would have infiltrated over the area covered by rooftops would instead be captured, stored and later infiltrated – either through irrigation directly, or after being used in the buildings, becoming “waste” water, and then being irrigated. That concept was explained in this post.

Now as noted the 1,000-foot “land buffer” would indeed be quite effective in mitigating pollution and the increases in runoff imparted by development on the “conservation” lots, but of course there would have to be constructions to cause any concentrated flows to disperse into overland flow, so the scheme would not be quite so “automatic” as the website appears to present it. GSI, such as full infiltration rain gardens, should be installed there as well, to intercept flows off of impervious covers, to directly infiltrate some of the flow and to spread flows over that “natural filter”. This would be particularly so for any rainwater cistern overflows, which would be “concentrated” flows out of a pipe.

The website is totally silent on this LID/GSI approach — the “One Water” approach — implying that the only alternative the developer can conceive to letting stormwater runoff simply flow away downslope would be to first route it to “a pond or other structural water quality controls that would unnecessarily disrupt the natural character of the land.” Noting of course that it is the disruption of the “natural character” of the land caused by development that any such constructions would be installed to mitigate. Again this seems to reflect that conventional bias for gathering runoff into end-of-pipe “ponds”, rather than running it through highly distributed constructions like full infiltration rain gardens, with only “large” storm runoff overflowing on down the slope – somewhat better mimicking the hydrology of the native site.

Summary

So it is that the water “vision” of the developer can readily be called to question. To sum it up, if the developer of Mirasol Springs is going to style its water management scheme as a “vision”, then perhaps it should impart some. It should follow the best “One Water” practices available in this setting, the water supply, “waste” water management, and storm water management practices reviewed above. Presenting the conventional scheme the developer proposes as a “new standard” can be quite fairly seen as simply greenwashing that very conventional scheme. If this project is to deliver on its promise of preserving and protecting this “pristine” landscape, a more holistic, more “One Water” strategy will be required.

Or so is my view of this matter. What’s your view?