
The Zero Net Water Concept

December 12, 2025

The very “One Water” way to manage water resources on developments in the Wimberley Valley

1. INTRODUCTION

Imagine a water management strategy that would accommodate growth and development without unsustainably pumping down aquifers or incurring the huge expense and societal disruption of building reservoirs or transporting water from remote supplies to developing areas. Welcome to the concept of Zero Net Water. As the name implies, it is a management concept that would result in zero net demand on our conventional water supplies. Well, practically speaking, approaching zero, as will be reviewed below.

The Zero Net Water concept combines 3 components:

  1. Building-scale Rainwater Harvesting (RWH), the basis of water supply
  2. The Decentralized Concept of Wastewater Management, focused on reusing the wastewater, so that the hard-won rainwater supply, after being used once in the building, is reused to provide irrigation water.
  3. The Low-Impact Development (LID) stormwater management strategy, employed to maintain the hydrologic integrity of the land, and so of the watershed, as it is developed.

An indication of the potential of building-scale rainwater harvesting as a water supply in the Wimberley Valley is offered by this simple calculation. In one recent year, the total water supply that the Wimberley Water Supply Corporation produced was about 154 MILLION gallons. The lowest annual rainfall recorded in Austin during the 1950s drought of record was 11.55 inches. At that rainfall, the amount of water falling on the 20 square mile Wimberley WSC service area would be a little over 4 BILLION gallons, so annual water usage would be about 4% of the rain falling on the area. At the average annual rainfall in Wimberley of 32.87 inches, the amount falling on the service area would be over 11 BILLION gallons, and the usage would be only about 1.3% of the water falling on this area. Clearly, there is far more water falling than could be used, even in a year of very low rainfall.
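
For those who want to check the arithmetic, here is a rough sketch of that calculation, using the figures above as inputs (the 20 square mile service area, the 154 million gallons produced, and the two rainfall depths); the conversion factors are the standard ones (27,878,400 sq. ft. per square mile, about 7.48 gallons per cubic foot).

```python
# Back-of-the-envelope check: rainfall volume over the Wimberley WSC
# service area versus annual production, using the figures quoted above.

SQ_FT_PER_SQ_MILE = 27_878_400
GALLONS_PER_CUBIC_FOOT = 7.48

service_area_sq_ft = 20 * SQ_FT_PER_SQ_MILE   # ~20 square mile service area
annual_production_gal = 154e6                 # ~154 million gallons produced

def rainfall_volume_gal(rain_inches: float) -> float:
    """Gallons of rain falling on the service area for a given annual depth."""
    return service_area_sq_ft * (rain_inches / 12.0) * GALLONS_PER_CUBIC_FOOT

for label, inches in [("1950s drought low", 11.55), ("average year", 32.87)]:
    volume = rainfall_volume_gal(inches)
    share = annual_production_gal / volume * 100
    print(f'{label}: {inches}" of rain -> {volume / 1e9:.1f} billion gallons; '
          f"usage is about {share:.1f}% of the rain falling on the area")
```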

Of course, only a very minor fraction of the total area could be covered with suitable collection surfaces. And no one is suggesting that all other water supplies be abandoned, to rely only on building-scale rainwater harvesting. This just shows that the ability is there to significantly augment those supplies to serve growth. In the next section, it will be reviewed just how much this practice could off-load the conventional supply sources at each building.

But as noted, Zero Net Water is not just building-scale rainwater harvesting – it is an integrated strategy that creates greater overall water use efficiency. Water supply is integrated with wastewater management by centering the latter on decentralized reclamation and reuse, mainly for irrigation supply. Water supply would also integrate with stormwater management by using Low-Impact Development, green infrastructure and volume-based hydrology methods to hold water on the land, maintaining the hydrologic integrity of the watershed as it is developed and reducing the area of landscaping that might need to be irrigated.

This integrated management results in minimal disruption of flows through the watershed, even as water is harvested at the site scale to be used – and reused – there to support development, creating a sustainable water development model. This integrated model would better utilize the water supply that routinely falls upon the Wimberley Valley, so rendering the whole system more efficient. This all is of course the essence of the “One Water” idea.

2. RAINWATER HARVESTING FOR WATER SUPPLY

Using building-scale rainwater harvesting for water supply instead of conventional sources can actually increase the overall efficiency of the water system. This is accomplished by taking advantage of the inherent difference in capture efficiency between building-scale and watershed-scale rainwater harvesting. Of course, all our conventional water supplies are watershed-scale rainwater harvesting systems, which are inherently inefficient.

A really big inefficiency, as can be seen in the illustration in Figure 1, is that only a minor fraction of rainfall over a watershed, something like 15%, makes it into streams, reservoirs or aquifers, the “cisterns” of the watershed-scale system. Most of it is lost to evapotranspiration, which of course supports plant life in the watershed.

Figure 1

What does get into reservoirs is subject to very high evaporation loss. To offer a quantitative feel for how big a loss this is, evaporative loss each year from the Highland Lakes is more water than Austin treats and uses annually.

Then the water from those cisterns must be distributed back to where it is to be used, losing more water. Industry standards say 15% transmission loss is “good practice”, and many distribution systems have much higher losses. The Aqua system serving Woodcreek, for example, has reported a loss rate up around 30%.

There are also what might be called “quality inefficiencies”. Water collected in reservoirs is somewhat impaired relative to the quality of rainfall, and needs considerable treatment. And even though it is generally considered potable, water in aquifers can be hard, or contain things like sulfur or iron, and so also needs some treatment. Treatment imparts costs and consumes energy.

There certainly is a high energy cost. Besides treatment, it takes a lot of energy to lift water out of aquifers, to pump it long distances, and to pressurize the far-flung distribution system.

The building-scale rainwater harvesting system, illustrated in Figure 2, blunts all those inefficiencies. The inherent capture efficiency off a rooftop into a cistern approaches 100%. The actual efficiency will be less, of course, due to cistern overflows, wind conditions, gutter overflows during intense rainfalls, and so on, but still it will be very high.

Figure 2

Since the cisterns are covered vessels, there will be negligible evaporation loss. The building-scale distribution system is “short”, largely within the building, but in any case is built, and can realistically be maintained, “tight”, so transmission loss is also negligible.

Rainwater captured directly off a roof is very lightly impaired, so it doesn’t need much treatment – cartridge filtration and UV disinfection make up the typical system – so a lot of energy is saved there. With a low lift out of the cistern and a very short run to where the water is used, the building-scale system uses far less energy to move water to the point of use. Given the water-energy nexus – it can take a lot of water to make energy – this also renders the overall system more efficient.

Growth dynamics make the building-scale system economically efficient. Since this system is installed one building at a time, it “grows” the water supply in direct proportion with demand. So besides more efficiently transforming rainfall into a supply usable by humans, creating a more sustainable water supply system, this strategy also creates a more economically sustainable system, because supply is built, and paid for, only in response to imminent demand, one building at a time.

So it can reasonably be asked, which of these models is more sane? Collect high quality water, very efficiently, where it falls and use it there? OR collect it at very low efficiency, with impaired quality, over the whole watershed, then lose a bunch – and use a lot of energy – running it through a long loop … back to where the water fell to begin with? The building-scale system is the clear winner on this question.

3. THE LOW-IMPACT DEVELOPMENT STRATEGY

Before going on, a myth about this strategy must be addressed. Some say harvesting off all the roofs would rob streamflow, that it wouldn’t more efficiently harvest water, it would just rob some from downstream users who depend on that flow. But if we examine that premise …

The Zero Net Water strategy would be used to serve new development, and when raw land is developed, as illustrated in Figure 3, quickflow runoff increases at the expense of decreased infiltration. When the land is transformed from a natural site to just 10-20% impervious cover, as in an exurban subdivision, the runoff would double, and with housing at more like urban intensity, typically imparting 35-50% impervious cover, it would triple. To prevent flooding, water quality degradation, and channel erosion problems, measures must be taken to mitigate that shift from infiltration to runoff.

Development Impacts On Hydrology

Figure 3

Development rules in this area typically require that runoff be captured and treated, to protect water quality and minimize channel erosion. Unfortunately, much of mainstream practice centers on large end-of-pipe devices, like sand filters, which largely pass the water through, letting it flow away rather than holding it on the land.

This is where the Low-Impact Development stormwater runoff management strategy comes into play, to mitigate that problem. LID practices, like the bioretention bed illustrated in Figure 4, can retain and infiltrate a large portion of the total annual rainfall, shifting the balance back toward infiltration. Rather than a bare sand filter, this is site beautification, designed into the development plan instead of just appended on, as those sandboxes are. It is also landscaping that doesn’t need routine irrigation, saving some water there.

Bioretention Bed

Figure 4

An even better strategy is to scale these bioretention beds down and spread them around. At this scale, as shown in Figure 5, these installations are usually called by their more colloquial name, “rain gardens”. With this strategy, the rainfall is captured and infiltrated in a manner that more closely mimics how that is done on the native site, on a highly distributed basis. Using many distributed rain gardens instead of one end-of-pipe pond somewhat restores the “hydrologic roughness” which is characteristic of raw land, so holding water on the land. This maximizes hydrologic integrity, achieving that aspect of Zero Net Water. And again, creates interesting landscape elements that don’t need routine irrigation.

Figure 5

Back to the myth: if building-scale rainwater harvesting is integrated into that capture process, harvesting runoff from the rooftop portion of new impervious cover, this just further mitigates the negative hydrologic changes that development causes. Because development causes such large increases in runoff, runoff from the developed site typically still increases over what runs off the native site, even with rooftop harvesting and the LID strategy applied.

And then too, water collected in the cistern does not exit the watershed. No one is putting it in a truck and hauling it away. It is just being held up for a while, then rejoins the hydrologic cycle in the vicinity where the rain fell to begin with. Most of the water collected for interior use eventually appears as wastewater. Under Zero Net Water, that’s used for irrigation, so the water that was harvested, after being used once in the building, is still maintaining plant cover in the watershed.

The bottom line is that some of that additional runoff created by development is captured and utilized. That is done instead of allowing the additional runoff to become quickflow, which if not mitigated in some other way creates water quality, channel erosion and flooding problems. Thus, with building-scale rainwater harvesting, overall yield of water supply usable by humans can be fundamentally improved, increasing the usable water yield without any significant impact on streamflow.

4. THE CAVEAT TO “ZERO”: CREATING PRACTICAL RAINWATER HARVESTING BUILDINGS

The building-scale cistern operates just like a reservoir, holding water for future use; indeed, it is a “distributed reservoir”. So like any reservoir, it will produce a “firm yield” of water supply, that will cover a certain amount of demand.

As will be reviewed shortly, considerations of cost efficiency lead to the concept of “right-sizing” the building-scale system – its roofprint and cistern capacity – so that its “firm yield” would cover demands most of the time. Instead of spending a lot more upsizing the system to cover the last little bit of demand in a drought, a backup supply would be brought in to cover that, when, and only when, conditions get bad enough that it is needed.

Of course, that backup supply would come from the watershed-scale system, which would itself be most stressed just when that backup supply is needed. So it must be considered, what impact would that have on the watershed-scale system?

The total market for development is set by a whole lot of factors other than, “Is there a water supply available?” So it can be presumed that whatever development is served by building-scale rainwater harvesting would not be additional development; it would just displace some of the development that would otherwise have drawn from the watershed-scale system. Indeed, the point of Zero Net Water is to off-load the watershed-scale system, to avoid costly expansions of that system, like the long pipelines from far-away aquifers that have been proposed to be run into the Wimberley Valley. And also to avoid depleting aquifers and drying up springs, as could happen in the Wimberley Valley if development continues to rely on local groundwater.

Therefore building-scale RWH systems would be “right-sized” to off-load the watershed-scale system most of the time, allowing it to retain that supply, so it could provide a small percentage of total building-scale system demand through a drought. And then, when the rains do come, the watershed-scale system would also recover faster, because the draw on it for backup supply of the building-scale systems would stop, again off-loading the watershed-scale system.

A historical rainfall model is used to determine what would be a “right-sized” system. Here a model using rainfall data for the Austin area covering the years 2007 to 2024 is used to illustrate that. This covers the 2008-2014 drought period, now recognized as the new drought of record for the Highland Lakes, and the more recent somewhat dry years, so this period presents a fairly severe scenario for the sustainability of building-scale rainwater harvesting.

In the model, the front page of which is shown in Figure 6 – this is the initial year of the model, 2007 in this case – entries are made for a collection area (the roofprint), for the storage (the cistern volume), and for a water usage profile. The model shows how that system would have performed over the modeling period. Again, the goal is to have needed only a limited backup supply, only in the most severe drought years. Covering a rather severe drought period, the results of this model offer a reasonable expectation of water supply coverage by the building-scale system in the future.
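
The engine behind such a model is essentially a daily mass balance on the cistern: rain captured off the roofprint goes in, daily usage comes out, overflow is lost, and any shortfall is logged as backup supply. The spreadsheet itself isn’t reproduced here, but a minimal sketch of that logic might look like the following; the 0.9 collection efficiency, the full-cistern starting condition, and the placeholder rainfall series are illustrative assumptions, not values from the model.

```python
def simulate_cistern(daily_rain_in, roofprint_sqft, cistern_gal,
                     daily_demand_gal, collection_eff=0.9):
    """Daily cistern mass balance: capture minus demand, bounded by storage.

    Returns total demand, total backup supply needed, and coverage fraction.
    """
    GAL_PER_SQFT_INCH = 0.623      # gallons per sq. ft. per inch of rain
    stored = cistern_gal           # illustrative: assume a full cistern at the start
    backup = 0.0
    for rain in daily_rain_in:
        harvested = rain * roofprint_sqft * GAL_PER_SQFT_INCH * collection_eff
        stored = min(cistern_gal, stored + harvested)      # overflow is lost
        if stored >= daily_demand_gal:
            stored -= daily_demand_gal
        else:                                              # shortfall -> backup supply
            backup += daily_demand_gal - stored
            stored = 0.0
    total_demand = daily_demand_gal * len(daily_rain_in)
    return total_demand, backup, 1.0 - backup / total_demand

# Illustrative run: 4 persons at 45 gallons/person/day, a 4,000 sq. ft. roofprint
# and a 30,000-gallon cistern. In the real model, `daily_rainfall` would be the
# 2007-2024 Austin daily rainfall record; this is just placeholder data.
daily_rainfall = [0.0, 0.0, 0.4, 0.0, 1.2, 0.0, 0.1] * 52
demand, backup, coverage = simulate_cistern(daily_rainfall, 4000, 30000, 4 * 45)
print(f"coverage over the period: {coverage:.1%}; backup needed: {backup:,.0f} gallons")
```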

Rainwater Harvesting Model

Figure 6

Model runs were conducted for two scenarios: (1) A “standard housing” subdivision, in which the presumed nominal occupancy is 4 persons; (2) A “seniors oriented” subdivision, in which the presumed nominal occupancy is 2 persons. For each scenario, models were run presuming per person usage rates of 45, 40, 35 and 30 gallons per day. 45 gallons/person/day is a usage rate that is expected to not be routinely exceeded by most people in a house with all current state-of-the-art water fixtures, as determined by surveys by the American Water Works Association. 35 gallons/person/day is the commonly assumed usage rate by people who do use rainwater harvesting for their water supply, expecting they would be quite conscious of water conservation. Experience has shown that this usage rate is readily attainable. As is indeed 30 gallons/person/day, based on a number of reports by people who keep track of their water usage.
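
For reference, the annual interior demands these occupancy and usage-rate combinations imply are simple to tabulate; a short sketch:

```python
# Annual interior water demand for the two occupancy scenarios
# at each of the usage rates modeled above.
for persons, label in [(4, "standard housing"), (2, "seniors oriented")]:
    for gpcd in (45, 40, 35, 30):
        annual_gal = persons * gpcd * 365
        print(f"{label}: {persons} persons x {gpcd} gal/person/day "
              f"= {annual_gal:,} gallons/year")
```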

The modeling outcomes are shown in Figure 7. The coverage that needs to be attained to render the systems “right-sized” would be determined based on the expectations of local water suppliers. For this discussion, it is asserted that a coverage of 97.5% or more of total water demand over the 18-year modeling period, leaving no more than 2.5% to be covered by backup supply, would define a “right-sized” system.

Figure 7

On that basis this table shows that for a “standard” subdivision, at a water usage rate of 45 gallons/person/day, a “right-sized” system would need to have a roofprint of 4,000 sq. ft. and a cistern volume of 30,000 gallons, as this would provide 97.6% coverage of total water demand through the modeling period. This compares with a 5,000 sq. ft. roofprint and a 42,500-gallon cistern that would have been required to have provided 100% coverage. Following the “right-sizing” strategy, the savings on each house would be 1,000 sq. ft. of roofprint and 12,500 gallons of cistern volume. This would be a rather large savings in costs to build the RWH systems, purchased by using backup supply to cover the last 2.4% of demand, needing backup supply only in the rather severe drought years of 2008-2009 and 2011.

If the water usage rate were 40 gallons/person/day, a full coverage system would require a roofprint of 4,750 sq. ft. and a cistern volume of 37,500 gallons. A “right-sized” system providing 98.3% coverage would require a roofprint of 3,750 sq. ft. and a cistern volume of 27,500 gallons, a savings of 1,000 sq. ft. of roofprint and 10,000 gallons of storage on each house.

At a water usage rate of 35 gallons/person/day, a full coverage system would require a roofprint of 4,000 sq. ft. and a cistern volume of 33,000 gallons. A “right-sized” system providing 97.5% coverage would require a roofprint of 3,000 sq. ft. and a cistern volume of 25,000 gallons, again a savings per house of 1,000 sq. ft. of roofprint, and a savings of 8,000 gallons of storage.

A water usage rate of 30 gallons/person/day would require a roofprint of 3,750 sq. ft. and a cistern volume of 27,500 gallons to provide 100% coverage. A “right-sized” system providing 98.3% coverage would require a roofprint of 2,750 sq. ft. and a cistern volume of 20,000 gallons, once again a savings of 1,000 sq. ft. of roofprint, and a savings of 7,500 gallons of storage.
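
Pulling the full-coverage and “right-sized” figures quoted above for the standard housing scenario into one place, the savings pattern is easy to see; this small sketch simply tabulates the numbers already given:

```python
# (full-coverage roofprint sq ft, full-coverage cistern gal,
#  right-sized roofprint sq ft, right-sized cistern gal)
# for the standard housing scenario, as quoted in the text above.
cases = {
    45: (5000, 42500, 4000, 30000),
    40: (4750, 37500, 3750, 27500),
    35: (4000, 33000, 3000, 25000),
    30: (3750, 27500, 2750, 20000),
}
for gpcd, (roof_full, cist_full, roof_rs, cist_rs) in cases.items():
    print(f"{gpcd} gal/person/day: save {roof_full - roof_rs:,} sq ft of roofprint "
          f"and {cist_full - cist_rs:,} gallons of cistern by right-sizing")
```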

Similar observations can be made for the seniors oriented development scenario. All this well illustrates the high value of practicing good water conservation, affording significant savings to build a sustainable RWH system.

One may look at those roofprints and conclude that the building-scale RWH strategy would only “reasonably” apply to rather high-end developments in which the houses would be “large”. But it is important to understand that the roofprint is not the living space in the house, it is the somewhat larger roof area. As can be seen in Figure 8, this would include in addition to the living area the roof overhangs, the garage (or carport), and covered porches and patios, or verandas. And it is the latter which could readily be expanded somewhat over what might be “normally” built, to provide relatively inexpensive additional roof area to create sustainable systems.

House Roof Area Illustration

Figure 8

Obviously, single-story houses would be favored, so as to create as much “routine” roofprint as practical. Single-story house plans of several builders active in the Hill Country were examined to see where the addition of verandas could be accommodated, such as is illustrated in Figure 9. It does appear that significant roofprint could readily be added, so that houses with “moderate” areas of living space could have enough roofprint added to obtain the “right-sized” roofprints. Of course, if the house were designed around this idea from the start, these designs could no doubt provide the needed roofprint more cost efficiently, and would yield more attractive designs. So it is suggested that architects and builders consider this, and create “Hill Country vernacular rainwater harvesting house designs”.

Veranda Strategy Illustration

Figure 9

In a seniors oriented development, where the nominal occupancy would be 2 persons, a single-story house with a fairly “normal” area of verandas, along with a garage/carport could readily provide the roofprint needed to create “right-sized” systems.

Another advantage to this “veranda strategy” is that the cistern could be integrated into the house design. With a large area of veranda, the cistern need not be very deep under the veranda floor to attain the storage needed. An illustration of building the storage in under the veranda floor is shown in Figure 10.

Built-In Cistern Under Verandas

Figure 10

Of course the cistern could also be built under part of the house – that has indeed been done in some Hill Country houses – but with this concept, the cistern could be built around the foundation, not impinging on the building itself, so the builder could build rather “normally” within the house envelope. With this strategy, space would not be taken up on the lot for a free-standing cistern. That will be more important on smaller lots, of course.

With the roofprint areas needed to “right-size” RWH systems in the Wimberley Valley, it may be questioned if this strategy is indeed only applicable in developments with “large” lots. The example of an urban lot in a South Austin neighborhood offers a look at that. Illustrated in Figure 11, this is a 1/5-acre lot with a 1,580 sq. ft., 3-bedroom, 2-bath house, and a garage, fairly typical in that neighborhood.

Rainwater Harvesting on an Urban Lot

Figure 11

Figure 11 illustrates that a system quite sufficient to be “right-sized” for a 2-person household might be accommodated, but would come up somewhat short for a 4-person household. While adding cistern volume might be problematic, as illustrated in Figure 12 roofprint could be added around the house, an adaptation of the veranda strategy noted above, to obtain a roofprint that would be more sustainable for the larger population.

Added Roofprint Illustration

Figure 12

Of course, it is really pretty clumsy to try to retrofit a 1960s era house like this, but if the RWH strategy were to be designed into a new house from the start, it does appear that rainwater harvesting could be a reasonably attainable water supply system on urban-sized lots.

Before moving on, it should be noted that this strategy does not have to be implemented with just individual house RWH systems. As illustrated in Figure 13, to broaden the use of rainwater, RWH could be integrated into the conventional supply with collective, conjunctive use systems, maximizing the contribution to water supply from whatever roofprint can be installed. While this scheme would require permitting as a Public Water Supply System, this concept should be entertained, to maximize rainwater harvesting in more intense single-family development, and to extend it to multi-family projects too, significantly defraying demands on the conventional supply system and thus blunting the need to expand it with costly, disruptive projects.

Collective Conjunctive Use Strategy

Figure 13

5. IRRIGATION WATER SUPPLY FROM WASTEWATER REUSE – CLOSING THE WATER LOOP

Under the Zero Net Water concept, wastewater reuse would be maximized to supply non-potable water demand. Particularly for irrigation in this climate. Indeed, wastewater reuse is highly valuable to rainwater harvesters, if they want to maintain an irrigated landscape. That’s because, as we’ve seen, a “right-sized” system just for interior use is already “large”. So to also supply irrigation directly out of the cistern, the system would have to be made even larger in order to maintain sustainability of the water supply system, and that would be costly, of course. Or the users would have to bring in a MUCH larger backup supply. But neither is needed, as there is already a flow of water sitting right there that could be used for irrigation instead of drawing that water directly from the cistern. A supply for which a hefty price has already been paid to gather. That’s the wastewater flowing from the house. Don’t lose it, reuse it!

An example of the savings potential from using the reclaimed wastewater for irrigation rather than providing that supply directly from the cistern is offered by looking at the 4-person house with a water usage rate of 45 gallons/person/day. The “right-sized” system for supplying just the interior usage requires a 4,000 sq. ft. roofprint with a 30,000-gallon cistern. Per the modeling, that system would have required backup supply in 3 out of the 18 years in the modeling period, totaling 28,000 gallons.

For this illustration, it is presumed the irrigated area to be supplied is 2,400 sq. ft., which would be the nominal size for an on-site wastewater system dispersal field for a 3-bedroom house. If that system were to provide the modeled irrigation demand directly out of the cistern, backup supply would have been needed in 12 of the 18 years, totaling 222,000 gallons! The rainwater system would have covered only 85.6% of total water demand – interior plus irrigation usage – over the modeling period. In order to maintain the sustainability intended to be imparted by the “right-sizing” strategy, the model shows that the roofprint would have had to be increased by 1,500 sq. ft., to 5,500 sq. ft., and the cistern volume increased by 20,000 gallons, to 50,000 gallons. Again, very pricey.

But if the wastewater were reused instead, the system that was “right-sized” to provide only interior use would have also covered most of the irrigation demand by reuse, again having needed backup supply only in the same 3 years, totaling 38,000 gallons. A 10,000 gallon increase in total backup supply from the “base case”, but a far lower demand on the watershed-scale system than if reuse were not practiced.
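
The coverage figures in this comparison follow from the backup volumes and the modeled demands. Here is a quick check using the numbers above; note that the annual irrigation demand is back-calculated from the quoted 85.6% coverage, as an assumption, since the model’s irrigation figure isn’t stated directly.

```python
# Quick check of the coverage figures quoted above for the 4-person,
# 45 gal/person/day house over the 18-year modeling period.
years = 18
interior_demand = 4 * 45 * 365 * years        # 1,182,600 gallons of interior use

# Interior use only, "right-sized" 4,000 sq ft / 30,000 gallon system:
backup_interior_only = 28_000
print(f"interior-only coverage: {1 - backup_interior_only / interior_demand:.1%}")

# Irrigation supplied directly from the cistern: 85.6% coverage with
# 222,000 gallons of backup implies this total (interior + irrigation) demand:
total_with_irrigation = 222_000 / (1 - 0.856)
irrigation_demand = total_with_irrigation - interior_demand
print(f"implied irrigation demand: ~{irrigation_demand / years:,.0f} gallons/year")

# Irrigation supplied by wastewater reuse instead, with 38,000 gallons of backup:
print(f"coverage with reuse: {1 - 38_000 / total_with_irrigation:.1%}")
```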

The reuse can be done house by house if the development features larger lots, with wastewater service provided by on-site wastewater systems, commonly called “septic” systems. Those on-site systems can employ the highly stable and reliable High Performance Biofiltration Concept system mated with a subsurface drip irrigation dispersal system. This concept is illustrated in Figure 14.

Figure 14

This strategy has been implemented in several jurisdictions in Texas, including Hays County, over the last 3 decades. Examples of drip irrigation fields on some of those lots are shown in Figure 15. On-site reuse, to render the RWH systems more sustainable while maintaining grounds beautification, is readily doable in the Wimberley Valley. Indeed, a larger scale version of this system was implemented at the Blue Hole Elementary School, the so-called “One Water” school.

On-Site Wastewater System Drip Irrigation Fields

Figure 15

If the nature of the development requires a collective wastewater system – typically needed if a development in the Wimberley Valley has lots smaller than an acre – the reuse can readily be accomplished, rather cost efficiently, with a “decentralized concept” wastewater system. That concept, detailed in “This is how we do it”, is illustrated in the sketch plan in Figure 16, showing a system serving a neighborhood in a rather typical Hill Country residential development. Of course this strategy makes great sense no matter where the water supply comes from. Every development can use this strategy instead of paying the high cost of centralizing wastewater to one far-away location, and then discharging it. Throwing away the water needed for irrigation!

Figure 16

As can be seen in Figure 16, this strategy is about water management, not about “disposal” of a “waste”. Reuse is designed into the very fabric of the development, as if it were a central function, rather than just appended on (maybe) to the system at the end of the pipe, as if it were just an afterthought. And by distributing the system like this, all the large-scale infrastructure outside the neighborhood – the large interceptors and lift stations – would be eliminated, imparting considerable savings.

This highlights that TCEQ must be engaged in rethinking wastewater management, to move to this sort of integrated water management model, instead of viewing wastewater management as being focused on “disposal” of a perceived nuisance – a view that makes wastewater live down to its name, truly wasting this water resource.

6. MINIMUM NET WATER – WORKING THE CONCEPT INTO DEVELOPMENT MORE GENERALLY

Of course if there are conventional water lines already close to a development, the developer would greatly prefer to hook up to them, even if, long-term, very costly measures – like the waterlines some have proposed to run into the Wimberley Valley – would have to be undertaken to keep water flowing to that development. The developer will invariably choose a “normal” water system if it’s available, so the builders can do their normal building designs. Those long-term costs to keep water flowing in those pipes typically accrue to others, so builders and developers wouldn’t see the price signals that might favor a Zero Net Water strategy.

Still, a variant of Zero Net Water that might be called “Minimum Net Water” can deliver some of the benefits. If the development will get water service from a watershed-scale system, it can still be arranged to pretty much take irrigation off that potable supply, by rainwater harvesting and/or wastewater reuse. In this region, that would typically lop off about 40% of total annual demand. It would be an even higher percentage of peak period demand, and it is peak demand that drives a lot of water system investment, so that is very valuable.

This could be done with a combination of the neighborhood-scale wastewater reclamation system, illustrated above, using that water to irrigate the public areas – front yards, parkways, parks and such – and rainwater harvesting to irrigate the private spaces, the back yards. Under this scheme, roof runoff would collect in “water quality” tanks, as shown in the illustration in Figure 17. That part of it conforms to what Austin sets forth in its rules as “rainwater harvesting” for water quality control, under which these tanks would have to drain within 48 to 72 hours. In this scheme, however, these tanks would be connected to rainwater harvesting cisterns, where that water could be held until whenever it is needed to irrigate the back yard. And only when those cisterns were full would any water even pond up in the water quality tanks, to eventually flow “away”. So a lot of the roof runoff would be harvested, to be used for irrigation, instead of running “away”.

Figure 17

By taking care of water quality management of rooftop runoff like this, the green infrastructure needed to treat runoff from the rest of the site could be downsized. Bioretention beds would need to be sized only for the area and the impervious cover level obtained by omitting all the rooftops. That saves more money, while still well maintaining the hydrologic integrity of the site.

The bottom line is water quality management of this rooftop runoff would be integrated with rainwater harvesting to create an irrigation water supply. Doing this would impart superior water quality management of that runoff, restoring the hydrologic integrity of this patch of ground now covered with the rooftop, and providing irrigation water for the back yard. A lot of benefit obtained by a pretty simple, straightforward system.

7. COMMERCIAL DEVELOPMENT, A PRIME OPPORTUNITY

To this point, the Zero Net Water strategies have mainly been focused on housing developments, but commercial and institutional buildings are also a major opportunity for Zero Net Water. The ratio of roofprint to water use in those buildings typically favors rainwater harvesting. And condensate capture could also provide a significant water supply in this climate.

Along with RWH providing the original water supply, project-scale wastewater reuse could be employed for irrigation, and also for toilet flush supply. The LID/green infrastructure stormwater management scheme could harvest runoff from paved areas of the site too, feeding it into landscape elements – rain gardens – that, again, don’t need routine irrigation. An illustration of applying these strategies to a commercial building is shown in Figure 18.

Commercial Building Reuse and Site-Harvested Water Systems

Figure 18

Using these strategies, commercial and institutional buildings, or whole campuses of these buildings, could readily be water-independent – “off-grid”, not drawing any water from the conventional water system. That could save a lot of water, and a lot of money for conventional water and wastewater infrastructure that would not have to be built to these projects.

8. COSTS … AND VALUE, A CLOSING STATEMENT

And finally, about cost. Of course, that’s always going to be a major factor in whether any of these strategies are put to work in service of society, to deliver water systems that are more societally responsible and more environmentally benign. But it is suggested that the discussion be framed in terms of VALUE. To paraphrase Oscar Wilde, this society knows the price of everything and the value of nothing. That just might be the case here.

Because, by conventional accounting, water supply obtained from building-scale rainwater harvesting will no doubt appear quite expensive per unit of yield, mainly because of the high relative cost of cisterns, the distributed reservoir. A viewpoint exacerbated by presuming the watershed-scale storage and distribution systems are sunk costs, rather than the avoided costs they would be – or at least expansions of them would be – if building-scale systems were used instead of the watershed-scale system for whole developments. So mainstream institutions tend to dismiss this strategy out of hand. Indeed, the State Water Plan assigns little value to building-scale rainwater harvesting as a water supply strategy, and water suppliers in this region appear not to consider it to be any part of their portfolios.

But there are many reasons to consider if the Zero Net Water strategy delivers value that should not be ignored. Here are a few of them, very briefly, as each of these could be a day-long discussion themselves:

  • The Zero Net Water strategy minimizes depletion of local groundwater and loss of springflow. That is very important in places like the Wimberley Valley, indeed in much of the Hill Country. As the aquifers to the east of the IH-35 corridor are dewatered by importing that water into Hays County, Zero Net Water could be valuable over much of that area as well.
  • That’s because Zero Net Water would blunt the “need” to draw down aquifers, or take land to build reservoirs, and all their attendant societal issues. An example is the brewing water war in East Texas, where it is proposed to take land to build reservoirs and export groundwater out of that area as well.
  • As has been noted, Zero Net Water will be sustainable over the long term, since development would largely live on the water that’s falling on it, not by depleting water that’s been stored in aquifers over centuries, or getting it from reservoirs that will silt up.
  • Zero Net Water is an economically efficient strategy, since facilities are built, and paid for, only to provide water supply that is imminently needed.
  • Zero Net Water also minimizes public risk. Since the system is designed into the site as it is developed, the cost of creating water supply is largely borne by those who benefit directly from that development, rather than by society at large through public debt.
  • It must be understood that growth projections for this region are not manifest destiny. While large water infrastructure projects tend to rest on an “if you build it, they will come” sort of justification, those growth projections depend on a continuation of the trends they are based on. And that might be disrupted by any number of circumstances. So perhaps it should be asked, what if they built it and no one came? Or just, not enough to pay for it came? Perhaps the projected growth is indeed inevitable, and so that’s a remote risk, but still it’s one that could be avoided by shifting to Zero Net Water.

Zero Net Water is about society collectively saying, this is how we will secure our water supply. We will integrate all aspects of water management, we will address all water as a resource, we will tighten up the water loops and maximize efficiency, to create a sustainable water strategy – Zero Net Water.

The Rainwater Harvesting Argument

June 19, 2025

The Waterblogue has featured the idea that building-scale rainwater harvesting (RWH) can provide a significant contribution to water supplies in this region. For example here, here, and here. Raising the obvious question, what actual contribution could this strategy make?

Discussing the merit of and prospects for building-scale RWH as a water supply strategy in this region with a colleague, I was surprised when he confronted me with this proposition: according to state policy, in order to be considered a functional water supply strategy, a method must be capable of delivering a “firm yield” through a repeat of the “drought of record”, and under that definition, most building-scale RWH systems simply do not exist, do not deliver any recognized water supply!

Let that sink in for a moment. A well-planned, well designed building-scale RWH system around here is typically expected to be able to provide in excess of 95% of total demand over a period of years, which would include a period of drought. An example is the “right-sized” system summary shown below produced by modeling the period 2007-2023, inputting Austin rainfall records over that period. This time period includes 2008-2014, which is reported to be the new “drought of record” period for the Highland Lakes, which is the major watershed-scale RWH system that serves this region.


As you see, we can readily choose system sizing relative to the expected level of water usage to create systems that can indeed deliver 95% or more of the total water supply over the modeling period. So to the question, if the system does not deliver the total water supply needed through a drought period, what would it take to assure this strategy does deliver a functional, secure and assured water supply?

I posed this matter in the TWDB-funded investigation “Rainwater Harvesting as a Development-Wide Water Supply Strategy” that I ran for the Meadows Center at Texas State University in 2011-2012. I set forth the idea of “right-sizing”: rather than having to pay to upsize the RWH system to cover the last little bit of demand, it would be more cost efficient society-wide to install a “right-sized” system that would provide the vast majority of total water demand, and to provide for a backup supply of the small shortfall, which would be needed only through bad drought periods. As can be seen in the table above, for example, to provide 100% supply at 45 gallons/person/day to a 4-person household would require 5,000 sq. ft. of roofprint and a 42,500-gallon cistern. But that system could be downsized considerably, to 4,000 sq. ft. of roofprint with a 30,000-gallon cistern – saving a ton of money – and would still have provided 97.5% of total demand through the modeling period. So the question becomes, how to assure that 2.5% shortfall could be provided by other means.

The presumption behind this concept is that there is not an unlimited market for development. The development that would be provided water supply by building-scale RWH systems would displace development that would have otherwise drawn its water supply from the watershed-scale RWH system, rather than be development in addition to that. So the supply being provided by building-scale RWH would be supply that would be left in the watershed-scale system storage pool most of the time, so presumably not drawing it down as severely as it would have been if all those building-scale systems had instead been routinely supplied by the watershed-scale system. Thus the watershed-scale system would have the “slack” to provide the relatively small amount of backup supply to the building-scale systems through the drought periods.

In the example above, “right-sizing” at 4,000 sq. ft. and 30,000 gallons, the table shows a total backup supply of 28,000 gallons would have been needed through the drought period 2008-2014, or just 4,000 gallons per year on average, out of a total modeled demand of 65,700 gallons/year. That system would have been 94% supplied by the building-scale RWH system through that 7-year drought period, and as noted above 97.5% supplied through the total 17-year modeling period.
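
Those percentages follow directly from the backup volume and the modeled demand; a quick check:

```python
# Coverage check for the "right-sized" 4,000 sq ft / 30,000 gallon example above.
annual_demand = 4 * 45 * 365          # 65,700 gallons/year for the 4-person house
backup_total = 28_000                 # gallons of backup over the 2008-2014 drought

drought_years, model_years = 7, 17
drought_coverage = 1 - backup_total / (annual_demand * drought_years)
period_coverage = 1 - backup_total / (annual_demand * model_years)
print(f"coverage through the drought: {drought_coverage:.0%}")           # ~94%
print(f"coverage over the full modeling period: {period_coverage:.1%}")  # ~97.5%
```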

The question, of course, is if indeed the watershed-scale RWH systems – such as the Highland Lakes in this area – would have the capacity to provide that backup supply demand through a drought period, as well as continuing to serve all the development that routinely draws from it. My colleague, while acknowledging the logic of my argument, asserted that the “growth model” presumes that the watershed-scale systems serving any given area would indeed become completely encumbered by development they serve directly – based I’m guessing on the very circumstance that overall growth around here is projected to exceed the capacity of existing water supplies to service it – so that it’s presumed there would be NO capacity available in that system through a drought of record period.

As best I can translate, it is asserted that the “right-sizing” strategy is illegitimate, because there would be no sources available for backup supply through a drought. That leads to the evaluation noted above: if the building-scale system would not carry 100% of the projected supply needs, then for the purpose of planning water supply strategy it is presumed the system provides NO water supply, is of NO value to the regional water economy.

I find that viewpoint to be, well, strange, contrary to common sense. Does it not seem that if a building-scale system provides in excess of 95% of the total supply over a period of years, that would be supply that the watershed-scale system is relieved of having to provide, and so this is effective water resource conservation, that does have value to the regional water economy? It seems rather dogmatic to simply “erase” the whole building-scale RWH water supply strategy because it would need a minor portion of total supply to be provided out of the watershed-scale system, which the building-scale systems would be totally relieving much of the time. Indeed, one wonders where else in public policy is a 95+% “success” rate deemed “unreliable”? Yet that is what my colleague contends state planning principles presume “must” be so when considering whether to base water supply strategy on any use of building-scale RWH over any given area: only the capacity of systems sized to deliver 100% of the projected supply can be deemed to exist.

It is little wonder then that we do not see the building-scale RWH strategy being set forth in any of the regional water plans, and thus not having been meaningfully incorporated into the State Water Plan. Here is the sum total of what the 2022 Texas State Water Plan says about building-scale RWH as a water supply strategy:

“Rainwater harvesting involves capturing, diverting, and storing rainwater for landscape irrigation, drinking and domestic use, aquifer recharge, and stormwater abatement. Rainwater harvesting can reduce municipal outdoor irrigation demand on potable systems. Building-scale level of rainwater harvesting, as was generally considered by planning groups and which meets planning rules, requires active management by each system owner to economically develop it to a scale that is large and productive enough to ensure a meaningful supply sustainable through a drought of record. About 5,000 acre-feet per year of supply from rainwater harvesting strategies is recommended in 2070 to address needs for select water users that have multiple additional recommended strategies.”

To put that projection of supply to be provided by building-scale RWH in perspective, if we presume a typical system does provide supply at 45 gallons/person/day for 4 persons, or 180 gallons/day total, each such system would supply 65,700 gallons/year, or about 0.2 acre-feet/year. So a contribution of 5,000 acre-feet/year would require 5,000/0.2 = 25,000 RWH systems of this size, or the functional equivalent, to be put in place. How much growth in this strategy does that projection represent?
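
That arithmetic, using the standard conversion of 325,851 gallons per acre-foot, works out as follows; it lands at roughly the 25,000 systems noted above.

```python
# Number of building-scale RWH systems implied by 5,000 acre-feet/year of supply,
# assuming each serves 4 persons at 45 gallons/person/day.
GAL_PER_ACRE_FOOT = 325_851

per_house_gal_yr = 4 * 45 * 365                          # 65,700 gallons/year
per_house_af_yr = per_house_gal_yr / GAL_PER_ACRE_FOOT   # ~0.2 acre-feet/year
systems_needed = 5_000 / per_house_af_yr
print(f"{per_house_af_yr:.2f} acre-feet/year per house; "
      f"about {systems_needed:,.0f} systems to supply 5,000 acre-feet/year")
```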

While there is no authoritative database that would provide the number of existing RWH systems, a rough guess that one expert on the subject offered is that there are likely in excess of a quarter million RWH systems – 10 times the number calculated above – in just 7 states, with Texas being the site of a goodly portion of those. This indicates that the 5,000 acre-feet by 2070 projection does not even come close to representing what is already on the ground, routinely producing water supply today.

But, as reviewed above, those who set water planning policy in Texas are loath to accord this strategy any actual contribution to supply, because of that “firm yield” requirement. So we need to consider if that is indeed sound reasoning, if that is a sufficient reason to exclude all contributions by building-scale RWH systems all of the time, or if we should rethink that.

Might, for example, society be better served by planning for building-scale RWH systems within a “conjunctive use” strategy, under which whatever the backup supply source is would have that capacity “reserved” in some manner? Just as this concept is applied to co-managing surface water and groundwater, so that one source might “fill in the gaps” of the other source’s capacity. To do this of course would require conscious consideration of and planning for building-scale RWH as a contribution to area-wide water supply. Which is absent at present, and so this matter remains “fallow”.

There are hundreds if not thousands of houses, businesses too, around here where folks are making building-scale RWH work as a water supply strategy, successfully arranging for whatever backup supply their systems need on an ad hoc basis. All of the water hauling companies that provide that backup supply report they are confident their business model will remain viable, so those supplies can be maintained into the future. So it would seem that building-scale RWH could indeed be a broad scale water supply strategy, with some intentional planning for assuring backup supplies are provided at need.

The situation can be summed up this way: building-scale RWH is not meaningfully included in water resources planning in Texas, upon the “reasoning” that this method would not provide a “firm yield” through a repeat of the “drought of record”. This ignores any prospect for co-managing this strategy with the watershed-scale RWH systems to assure that whatever gaps in firm yield result would be covered out of the watershed-scale systems. Does this not seem to show a lack of vision among the mainstreamers who control those planning processes?

In pursuit of society’s best interests, it is suggested that this whole viewpoint be revisited. This is another example of how we need to take a peek down the road not taken … so far. As that could make all the difference.

Can we pretty much take irrigation off the potable supply?

June 8, 2025

It is commonly reported that something like 40% of annual municipal water demand is typically used for landscape irrigation. This usage is also a major driver of demand peaking through the summer in our Central Texas climate. In Georgetown, for example, in a recent Inside Climate News article (you can read that here) on the city’s efforts to obtain water supply from the Simsboro Aquifer to its east, the city manager is reported to have said “most” of that water will serve new residential developments and will be used “primarily” to irrigate lawns and other neighborhood landscaping. So it does seem that taking this irrigation off the potable water supply would be of huge benefit, especially to cities, such as Georgetown, that are spending significant sums to import water from remote supplies. And shaving peak demands would be particularly beneficial, as peaking drives the costs of much of our water supply infrastructure. Like those hugely costly pipelines from the Simsboro.

Thus, it is reasonable to investigate, how might we accomplish taking irrigation off the potable water supply system?

Consideration of what this might entail reveals three main strategies that could be pursued:

  1. Move to a more regionally appropriate landscaping ethic, minimizing turf and other landscaping that would need routine irrigation, prioritizing instead native plant landscapes that would need far less irrigation – and would enhance the “sense of place” in the built environment.
  2. Employ building-scale rainwater harvesting (RWH) to create the irrigation water supply, utilizing a resource that otherwise may become a stormwater management problem.
  3. Create wastewater reuse systems to provide the irrigation water supply.

Along with all this, rendering landscape irrigation systems more efficient to reduce the demand to begin with, such as was reviewed on the Waterblogue here, must be an on-going effort, just as a matter of course.

For landscaping of a residential lot, simply because I am intimately familiar with it, I will use the landscaping at our house to illustrate the potential of measures 1 and 2. Here is the lot plan, showing the current configuration.


A street view of the current state of the front yard landscape is shown below.

And here is a closer view of the front patio area, the house entry visual.

And a bit of whimsy – SHARK!

When I moved in back in 1997, this front yard was rather typical for the neighborhood, two non-native trees and a swath of St. Augustine turf. Beginning in 2002, I started to transform it to what you see above, taking out a chunk of turf at a time, taking down the non-native trees and subbing in mountain laurels, a lacy oak and a cedar elm. This native landscaping is rarely watered. After the plants are well established, only during drought stress periods and sparingly then, so this landscaping incurs rather minimal irrigation water demand. The exception is the blackberries that I decided to try growing a couple years ago, that are in front of the raised planter with the accordion trellis on top of it. These plants do require routine irrigation to be viable, but the total amount of water they require is rather minimal.

In the back yard, we are doing some food gardening, in the raised beds, the in-ground potato bed, and blueberries in containers …

… which all does have to be watered frequently. But here again the areas are small, so the total water demand for the food crop irrigation is low, and generally all supplied from the rainwater tanks. We also have some citrus trees in containers – lemons and limes …

… and more blackberries in the back yard.

The sandbox, shown below, which my grandson has outgrown, is planned to become a milkweed bed to feed monarch butterflies.

As you can see in the picture above, the trellises along the yard walls, and in the picture below, around the veranda cover …

… are covered with native plants, which are irrigated only during extended drought stress, when a bucket of water will be spread on them every so often. There are a number of hanging baskets and potted ornamentals that do require routine watering, but again the total amount of water required is very low.

The remainder of the back yard is covered with a small rain garden – you see that on the left edge of the picture above, it was featured previously on the Waterblogue here – with mulched beds containing native plants, and with non-irrigated “turf” areas. In the summer those “turf” areas become mainly populated with horseherb as the grasses fade – you can see that ground cover at the bottom of the picture just above. During long drought periods the horseherb becomes largely “dormant” – but always surges back green when the rains come.

You can see then that it is quite possible to move to a more regionally appropriate landscape style here in Central Texas, and not give up on having an attractive, interesting landscape. A landscape requiring minimal irrigation does not have to be, as an Austin city council member once dimly put it, all rocks and cactus. While the turf aesthetic does still dominate the neighborhood, I have witnessed a number of similar transitions of front yards in the 20+ years since I started transforming mine. And the appearance of more and more rainwater tanks.

Leading us to strategy 2, using roof-harvested rainwater instead of potable water for irrigation supply. As shown on the lot plan and in the pictures below, we have a rainwater tank at each corner of the house, to capture all the roof areas.

The tank capacities, shown on the lot plan, total 2,875 gallons of storage. As you can see in the tank pictures, the shorter tanks are set on pedestals so that the tops of the tanks are all at about the same elevation. This was done because the tanks are tied together by 1-1/2” pipes running underground between them, so they all operate as one hydraulic unit, all overflowing at the same elevation. Since each tank intercepts different roof areas, if they were not tied together one tank could be overflowing before the others are full, so this arrangement maximizes the collection efficiency off the entire roofprint.

So far, this RWH system has been efficient enough that over the last few years, since we started growing food as well as maintaining the visual landscape, only in the depths of the drought in the summer of 2024 did we have to use any significant amount of potable water for irrigation. You can see then that even such a limited RWH system can largely maintain a regionally appropriate native landscape, such as we have, as well as some food gardening.

Turning now to strategy 3, wastewater reuse. Here in the middle of the city on such a small lot as ours, reuse would generally not be feasible. Unless of course Austin Water were to install the “purple pipe” system redistributing reclaimed water from its centralized treatment plants into neighborhoods like ours, a very unlikely scenario, since that would be inordinately expensive. On any lot out in the countryside, however, that has a “septic” system – or OSSF (On-Site Sewage Facility) in regulatory system speak – folks could install a system that produces and reuses reclaimed water, as is reviewed on the Waterblogue here, and on my company website here. Really, just about every OSSF could be a reuse system, defraying whatever amount of irrigation may be needed on each lot. Cumulatively, over this region, that could conserve a lot of water.

In new development, however, a collective wastewater reuse system to defray irrigation demands can be quite feasible, even saving money relative to the cost of installing and running a conventional wastewater system to serve these developments. As was reviewed in “This is how we do it”, the dispersal fields of a decentralized concept wastewater system could be arrayed to irrigate front yards, parkways, parks, the margins of walking trails, etc. The illustration from that blog is shown below. By this means, a large portion of irrigation demands in new development – of any sort, not just residential projects – could be cost efficiently carried by wastewater reuse.


Before proceeding, it is noted that by integrating rainwater harvesting with the water quality management strategies that development rules in this region typically require, a further contribution to irrigation water supply can be realized in the sort of development illustrated above. This strategy was reviewed on the Waterblogue in “… and Stormwater Too”. The illustration of the strategy in that piece is shown below. The rooftop runoff, and perhaps the water falling on the driveway too, can be stored and used for irrigation water. Depending on the volume of storage provided, this could largely obviate using potable water for back yard irrigation, on top of the savings from wastewater reuse to irrigate the front yards. Between them, pretty much taking irrigation off the potable water supply.

Now let’s look beyond the residential lot level, to multi-family and commercial developments. Below is the street-facing landscape of an apartment village, a large area of turf.

And right across the street from that, we see the same thing, very large areas covered by turf, that appear to be well irrigated, given that they stay as green as they appear in these pictures through the Austin summer.

Note that these turf areas serve no purpose that would dictate that turf must be the landscape. No, this is simply an aesthetic, someone’s evaluation of what such areas “should” look like. Might such an evaluation be called into question, on sustainability grounds? Why could not such areas instead be native wildflower meadows? Looking something like this?

This sort of landscaping would present the project’s street view just as well, in any functional sense. Indeed, it would impart a “sense of place” to the site. And it would require far less irrigation to maintain it in “presentable” condition. It would also require much less routine maintenance, perhaps “trimming” a couple times a year, and no applications of fertilizers, herbicides or pesticides. So the labor and materials inputs to maintain such areas would cost the project owners less, beyond the avoided cost of irrigation water.

Now in such projects as those apartment communities shown in the above photos, it may indeed be reasonable to implement a point-of-use wastewater reclamation system – again in the same manner as was illustrated in “This is how we do it” – to produce whatever irrigation water may be needed to maintain such landscapes. As well as to irrigate the turf that “normally” covers the grounds of such projects, like we see in the picture below.

Being within the area already served by the conventional sewer system, such a project-scale wastewater system could provide the reclaimed water for irrigation on demand, and whatever treated water flow is not required for irrigation supply could simply overflow into the sewer, reducing the organic load on the conventional wastewater system.

It’s an open question, of course, what the payback on avoided irrigation water costs may be – given that water is currently priced well below the cost of replacement supplies, such as importing water from remote aquifers – or what the operations and maintenance costs of such project-scale systems may be, understanding they are essentially “redundant” to the conventional system. These are matters that should be investigated, comparing these costs to the long-run marginal costs of other options for obtaining new water supplies, given the sustainability benefit of taking irrigation water supply off the potable water system. While such actions might appear to be a stretch as retrofits of existing development, it may be quite cost efficient to design them into the very fabric of new development.

It is also noted in passing that covering such areas with wildflower meadows instead of turf would impart stormwater management benefits, because the wildflower meadow would have a lower propensity to produce runoff than the same area covered by turf. “Curve Number” (CN) is the measure of how readily a soil-plant complex sheds runoff under the Soil Conservation Service method for evaluating stormwater drainage issues; the higher the CN, the less absorptive the surface and the more runoff would issue from the area, given the same rainfall. For a wildflower meadow on the areas shown in the pictures of the apartment communities above, CN would be about 65, while for conventional turf it would be about 79. The wildflower meadow would not start issuing runoff until a rainfall depth of 1.08”, while the turf area would start issuing runoff at a rainfall depth of only 0.53”, so on an annual basis the wildflower meadow would send somewhat less runoff to be managed in stormwater facilities.
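To make that comparison concrete, here is a minimal sketch of the standard SCS (now NRCS) Curve Number runoff calculation. The CN values of 65 and 79 are those cited above; the 0.2 initial-abstraction ratio is the conventional assumption, and the 3-inch storm is just an illustrative figure, not a design storm from any analysis referenced here.

```python
# Minimal sketch of the SCS (NRCS) Curve Number runoff method, using the
# CN values cited above; the 0.2*S initial abstraction is the standard assumption.

def scs_runoff(rain_in: float, cn: float) -> float:
    """Return runoff depth (inches) for a given rainfall depth and Curve Number."""
    s = 1000.0 / cn - 10.0   # potential maximum retention, inches
    ia = 0.2 * s             # initial abstraction: no runoff until rain exceeds this
    if rain_in <= ia:
        return 0.0
    return (rain_in - ia) ** 2 / (rain_in - ia + s)

for label, cn in [("wildflower meadow", 65), ("conventional turf", 79)]:
    s = 1000.0 / cn - 10.0
    print(f"{label}: CN={cn}, runoff begins at {0.2 * s:.2f} in of rain, "
          f"runoff from a 3-inch storm = {scs_runoff(3.0, cn):.2f} in")
```

Running this reproduces the 1.08” and 0.53” runoff-initiation depths quoted above.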

And these wildflower meadows shouldn’t need any stormwater quality management at all; they themselves might even serve as the stormwater quality management device for runoff from development upslope. Peak flow rates to be managed would also be somewhat reduced. Applied at scale in a watershed, this could significantly reduce the costs of installing and maintaining stormwater ponds, rain gardens, and such. Furthering the benefit of adopting the more regionally appropriate landscaping ethic.

Still, however, there will be turf to irrigate. In discussing this matter with the city councilman representing my district, he asserted that folks in an affluent area of town would not willingly part with their turf-covered yards. So let’s look at that sort of situation.

Looking at the area of town the councilman referred to on Google Earth, I see the lot shown in the picture below. Not at all atypical of that neighborhood, this one just happens to be fairly free of tree cover, making it easier to measure the area of turf and of roofprints available for rainwater harvesting.

Roughly measuring off Google Earth, I calculate there is about 3,500 sq. ft. of roofprint covering the house and the garage, about 5,000 sq. ft. of turf landscape, and minimal other landscaping on this lot. So let’s look at how much water would be required to keep that turf looking lush, and the prospects for supplying it with something other than potable water.

Inserting those areas into the rainwater harvesting model, programmed with Austin historic rainfalls over the period 2007-2023 and an irrigation water demand profile suitable for turf around here, and presuming a 10,000-gallon cistern were installed, RWH would have covered 62% of the total irrigation water demand over this period. In the severe drought year of 2011, only 24% coverage would have been provided by harvested rainwater. 100% coverage of irrigation demand would have been provided in only one of the 17 years covered by this model.

If a 20,000-gallon cistern had been installed, the overall coverage would have risen to 81%, with the 2011 coverage rising to 37%. 100% coverage would have been provided in 9 of the 17 years covered by the model. Increasing cistern size to 30,000 gallons, overall coverage would have risen to 90%, and the 2011 coverage to 51%. 100% coverage would have been attained in 11 of the 17 years.
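For readers curious about the mechanics, below is a minimal sketch of the daily water-balance logic that a cistern-sizing model of this sort typically runs. The roof area and cistern size follow the lot described above, but the runoff coefficient, the rainfall series and the demand figures are placeholders a user would replace with local data, so the number it prints will not reproduce the coverage percentages quoted here.

```python
# Minimal sketch of a daily cistern water-balance model of the sort described
# above. Roof area and cistern size follow the text; the rainfall series and
# turf demand profile are placeholder assumptions, not the model actually used.

GAL_PER_CUFT = 7.48

def simulate(daily_rain_in, demand_gal_per_day, roof_sqft=3500,
             cistern_gal=10000, runoff_coeff=0.85):
    """Return fraction of irrigation demand met from harvested rainwater."""
    stored = 0.0
    met = needed = 0.0
    for rain, demand in zip(daily_rain_in, demand_gal_per_day):
        # capture: roof runoff for the day, truncated at cistern capacity
        harvested = rain / 12.0 * roof_sqft * GAL_PER_CUFT * runoff_coeff
        stored = min(cistern_gal, stored + harvested)
        # supply: draw from storage first; any shortfall is potable make-up
        draw = min(stored, demand)
        stored -= draw
        met += draw
        needed += demand
    return met / needed if needed else 0.0

# Example with made-up data: a 20-day dry spell, one 2-inch storm, constant demand.
rain = [0.0] * 20 + [2.0] + [0.0] * 9
demand = [150.0] * 30   # gal/day, placeholder turf demand for ~5,000 sq. ft.
print(f"coverage: {simulate(rain, demand, cistern_gal=10000):.0%}")
```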

It would of course need to be investigated where a cistern might be accommodated on the lot and what such a rainwater harvesting system would cost. Besides the cistern, the costs of proper guttering, collection lines, a treatment unit – only a cartridge filter would be needed for this irrigation usage – pressurization facilities, and ongoing operations and maintenance would have to be tallied to determine the amortized cost of this rainwater supply. Once again, payback may not appear favorable relative to the defrayed cost of the potable water supply for this irrigation, since the actual replacement cost of an equivalent amount of water from other new sources would no doubt be much higher than the current rates charged by Austin Water. But it may compare favorably to the long-run marginal costs of other new supply sources.

Turning now to on-lot wastewater reuse for irrigation supply: as was noted above, installing an OSSF that produces reclaimed water would greatly defray the draw on the potable water supply. As for what such an on-lot treatment unit might look like, we see such an installation on a lot in an urban fringe neighborhood in the picture below.

As you see, not really very “intrusive”, with only the top of the filter bed and the tank hatch lids visible on the surface. If this were deemed too visually intrusive, these facilities could be tucked into a corner of the lot and screened with plants, such as the Texas sage plants in the above picture.

How much irrigation water this system would produce would of course be set by the amount of wastewater generated in the house, which is largely driven by the house population. Presuming a 4-person occupancy, the average daily flow would likely be in the range of 200 gallons/day; with a 5-person occupancy, it might be about 250 gallons/day. Spread over the 5,000 sq. ft. of turf on the lot we are looking at here, this would provide an irrigation rate of about 0.45-0.56 inches/week. That would be more than sufficient over 7 months of the year, but deficient in the other 5 months, the peak irrigation season of May through September. In those months this reclaimed water flow would provide between about 1/2 and 2/3 of the total modeled irrigation demand. RWH could defray the rest.
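The arithmetic behind that inches-per-week range is simple enough to show; the flows and turf area are those cited above.

```python
# Converting a reclaimed-water flow into an equivalent irrigation depth over
# the 5,000 sq. ft. of turf discussed above.

GAL_PER_CUFT = 7.48
TURF_SQFT = 5000

for gpd in (200, 250):
    weekly_cuft = gpd * 7 / GAL_PER_CUFT      # cubic feet of water applied per week
    depth_in = weekly_cuft / TURF_SQFT * 12   # spread over the turf area, in inches
    print(f"{gpd} gal/day -> {depth_in:.2f} inches/week")
```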

Again, it would have to be investigated what such an OSSF would cost, including the subsurface irrigation system that would disperse the reclaimed water underground, to minimize contact hazard potential in such an “unsupervised” environment. This would set the cost of this new water supply, that could then be compared with the long-run replacement cost of water supplies by other means.

While it may appear questionable whether providing new water supply on-lot – with rainwater harvesting at the scale reviewed above, or by installing an OSSF – would “pencil out”, it is noted that no such options have so far been considered and evaluated by Austin Water as it conducts planning under its Water Forward program, touted as a 100-year water plan for Austin. Given that long time frame, it is called into question whether failing to consider such “outside the box” options amounts to a missed opportunity.

Noting again that while retrofits such as those reviewed here may not be deemed fiscally viable, the situation in new development, where these facilities may be designed in from the start, may be very favorable. Cities in situations such as Georgetown finds itself – and San Marcos and Liberty Hill and Dripping Springs, anywhere that growth is straining current water supplies – could go a long way toward blunting their problems by taking the 3 measures we’ve reviewed to pretty much take irrigation off the potable water supply:

  1. Move to a more regionally appropriate landscaping ethic.
  2. Employ building-scale rainwater harvesting to provide part of the irrigation water supply.
  3. Create wastewater reuse systems to provide the rest of the irrigation water supply.

These opportunities are there for the taking in many, many situations all over this region. Again, particularly in new development – understanding it is growth that is largely straining current water supplies – where these strategies could be essentially designed into the very fabric of the development, providing opportunities to actually save money while saving water. As has been argued in several other Waterblogue posts – see for example here, here and here.

Despite this, such strategies have not appeared in any of the regional water plans, nor as noted in the Water Forward process in Austin. It is suggested that water planners give these strategies a bit more attention than they have received so far. As was noted on the Waterblogue in this post, society would be well served by taking a peek down the road not taken … so far, as that could make all the difference. Are the water planners around here up to that task?

Astounding in its lack of ambition

December 5, 2024

“Truly, Councilman, can you stand behind a plan that proposes to focus on an infrastructure model that, by Austin Water’s own reckoning, will only attain 12% reuse in 55 years? Do you not find that astounding in its lack of ambition?”

That is the message I wrote to the Austin City Council member representing my district, following up my argument to Austin Water about Water Forward, the city’s plan for how water is to be managed around here over the next 100 years. Sent under the subject line, “Austin Water Backward” – stark, and yeah, a bit snarky. But read the argument for yourself, and make your own determination whether clinging to the past’s business as usual is likely to be the most fiscally sound plan, the plan that would best serve the local and regional water economy over the next century. Or whether, since we have management concepts available that could approach 100% reuse at the project scale, that backward looking plan is really Austin’s pathway to sustainable water, or lacks the ambition to get us there.

*************************************************************************************

A look at the table entitled “Summary of Water Forward 2024 Strategies, 2030-2080” on page 31 of the 2024 Water Forward plan update shows that Austin’s planning is rooted in the presumption that the prevailing, essentially 19th century water infrastructure model will simply be extended and perpetuated, with little focus on anything else. I argue that this sort of backward looking strategy will not be a sufficiently ambitious pathway to sustainable water, and that Austin Water’s apparent conclusion it is the cost efficient manner of proceeding is likely rooted in flawed analysis.

The situation is exemplified by the “Non-Potable Water Reuse Strategies” part of the table, recapitulated below. This indeed indicates the expectation that wastewater system development going forward will continue to focus on piping flows from hither and yon to centralized treatment plants, and that non-potable reuse will continue to be focused mainly on the “purple pipe” system redistributing reclaimed water from those plants to points of use. The latter is a strategy that has been, throughout its 30 year operating history, kept afloat by infusions of money from other sources, and is not expected to ever pay for itself. Yet, the table shows, Austin Water seems intent on doing it mainly that way.

Non-Potable Water Reuse Strategies (Acre-Feet/Year)

Water Forward Strategies     2030     2040     2050     2060     2070     2080   Annual Cost ($/ac-ft/yr)
Centralized Reclaimed       1,100    8,200   12,900   17,600   22,300   26,900   $2,243
Decentralized Reclaimed         0      200      500      800    1,100    1,300   $5,158
Onsite Reuse                1,100    4,000    5,700    7,300    9,000   10,600   $8,957
Total Reuse Attained        2,200   12,400   19,100   25,700   32,400   38,800   —

On “decentralized reclaimed”, I’m still wondering how that whole idea is understood at Austin Water, as outreach efforts to discuss this matter have been … well, I’d say rebuffed, but to be rebuffed, there would have to be evidence that it was looked at to begin with … Sorry, snarky again. But just highlighting that all throughout the decade that Water Forward has been in process, this whole idea – that can approach 100% reuse on each project – has been marginalized. The table indicates so little consideration of this strategy that it’s expected NO “decentralized reclaimed” would come on line by 2030. And it shows that “decentralized reclaimed” is projected to be a paltry 5% of “centralized reclaimed” reuse in 2080. This does indicate there will be little focus on if and how we might transform the infrastructure model over the next 55 years!

That 55-year timeline we’re talking about here bears some consideration. In 1999 I wrote a piece, aimed at fostering a more focused discussion of the then-pending 50-year water deal with LCRA, entitled “Is a Billion Dollar Crapshoot Our Best Public Policy?” The example of “silicon” plants – a totally unknown technology in 1949 – having become a significant part of the industrial sector here, drawing a sizable population to Austin, was offered to illustrate that it’s pretty impossible to know with any certainty what Austin will be like 50 years into the future. Indeed, it’s a crapshoot. The article posed the many ways the water future may be different from the past, how the water infrastructure model may evolve considerably over the ensuing 50 years, all being echoed by many “futurists” even back then, a quarter century ago. Most of which could be encapsulated under the now popular banner of “One Water”, the idea that all water flows have resource value that should be maximized, rather than those flows being “disposed of”. A perusal of that review would immediately illuminate that Austin Water has not really pursued much of that over the last 25 years. And again, the apparent presumption of the 2024 Water Forward plan is that, in regard to wastewater management, little will be done in that vein over the next 55 years.

Which we can see by examining the projections in the Water Forward plan. Austin Water does not seem to want to share the magnitude of current wastewater flow. The best estimate I’ve been able to derive is from the latest rate study, which seems to imply that current wastewater flow is about 87,000 acre-feet per year. If this is taken as the flow in 2030 – noting it will undoubtedly be higher by then – reuse through the “purple pipe” system in 2030 would be only 1100/87000 = 1.3% of total wastewater flow. This indeed shows the limited reach of this mode of reuse: after 30 years of developing the “purple pipe” system, this is the paltry level of reuse being attained.

Water usage projections in the Water Forward plan can be parsed to estimate that wastewater flow in 2080 would be around 2.5 times the current flow, or 2.5 x 87000 = 217,500 acre-feet per year. Presuming this basis, the table shows expected “centralized reclaimed” reuse through an expanded “purple pipe” system would be 26900/217500 = 12.4% of total wastewater flow. That would be robust growth, an almost 10-fold increase in share of total flow, but still far short of making the “One Water” ideal – utilization of the water resource rather than “disposal” of a perceived nuisance – the focus of this societal function. Certainly, since the collection system typically represents 70-80% of total wastewater system cost, the pipe-it-away infrastructure model reflects that fiscal resources would be far more focused on “disposal” than on reuse.

Meanwhile, it’s projected that “decentralized reclaimed” would have achieved a penetration of only 1300/217500 = 0.6% by 2080. Reflecting, over 55 years, an almost total lack of focus on if, where and how to transform the infrastructure model, apparently rejecting the “One Water” decentralized concept strategy, which can approach 100% reuse on each project.
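For transparency, the percentage figures in the last few paragraphs are just ratios of the table values to the estimated total wastewater flows; a quick sketch of that arithmetic, using the ~87,000 acre-feet/year current-flow estimate and the 2.5x growth factor discussed above:

```python
# Back-of-the-envelope reuse shares derived from the Water Forward table above,
# using the ~87,000 ac-ft/yr current-flow estimate and the assumed 2.5x growth by 2080.

flow_2030 = 87_000            # ac-ft/yr; treated as the 2030 flow (it will actually be higher)
flow_2080 = 2.5 * flow_2030   # ac-ft/yr, as parsed from the plan's usage projections

shares = {
    "centralized reclaimed, 2030": 1_100 / flow_2030,
    "centralized reclaimed, 2080": 26_900 / flow_2080,
    "decentralized reclaimed, 2080": 1_300 / flow_2080,
    "onsite reuse, 2080": 10_600 / flow_2080,
}
for label, share in shares.items():
    print(f"{label}: {share:.1%}")
```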

Looking at costs for these options, Austin Water projects the cost of implementing “decentralized reclaimed” would be 2.3 times that of “centralized reclaimed” reuse – $5,158 vs. $2,243 per acre-foot per year. One might guess this is their basis for continuing to focus system development on the prevailing conventional pipe-it-away infrastructure model. But based on everything I have observed in this field, this does not make any sense. The decentralized concept strategy, in the setting of urban fringe and hinterlands development, consistently shows itself to be more cost efficient on a global cost efficiency basis. As set forth, for example, in This is how we do it and, more explicitly, in Let’s Compare. Which is likely the key to why the Austin Water projections show the opposite, rather extremely so: global cost accounting is not their standard.

One guesses that the costs of “centralized reclaimed” are limited to the “purple pipe” system, not accounting for the cost of “producing” that reclaimed water flow to begin with – the costs of wastewater collection and treatment – and probably not accounting for costs to implement end uses either. While for “decentralized reclaimed”, it is guessed that the total system costs were presumed – the costs to collect, treat, redistribute and utilize this water resource. Rather distorted, not an apples-to-apples comparison. But what Austin Water seems to be basing its desire not to examine the infrastructure model upon, rather wanting to simply forge ahead presuming the future will be just like the past.

Now of course Austin Water can legitimately claim that the costs of collection and treatment to “produce” that reclaimed water supply are sunk costs, as those facilities are already in place, so should not be accounted as a cost factor for “centralized reclaimed”. True enough for the portion of the system that is currently in place. Remember though that we are looking at policy and strategy over a 55-year timeline here, over which it is projected that wastewater flow would increase 2.5-fold. This will require that lines be extended and/or upgraded, that new lift stations be built and/or current ones upgraded, and that new treatment capacity be added. None of these are sunk costs.

Austin Water might also argue that these new costs would be incurred in any case to impart wastewater management, regardless of whether the effluent would be reused. But of course that argument only holds water if wastewater system capacity would be expanded only via extension of the prevailing conventional pipe-it-away infrastructure model. If areas of new development – on the urban fringe or in the hinterlands where lines would need to be extended, or infill development that would require upgrading of existing facilities all along the way from the development to the centralized treatment plant – were instead managed by pursuing the “One Water” decentralized concept strategy, those costs may be obviated, saving, as noted in First ‘Logue in the Water, untold costs by never having to build another trunk main. Or to extend/upgrade the loss leader “purple pipe” system.

This leads us to the “onsite reuse” category, for which about a 10-fold growth is projected, from 1,100 acre-feet/year in 2030 to 10,600 acre-feet/year in 2080. Impressive on the surface, but is it really? When EVERY commercial and institutional building is a candidate for this form of water conservation, what level of dedication to reuse does a 10600/217500 = 4.9% penetration over 55 years represent? One wonders what Austin Water thinks would so limit the proliferation of this practice.

For one thing, maybe it’s a poor appreciation of “reuse”? Austin’s highly touted Onsite Reuse Ordinance does not actually very much address reuse. It’s focused on site-derived flows, mainly harvested rainwater and condensate, the usage of which would be “original” use, not REuse. That is a term of art generally applied to reclaimed wastewater, which this ordinance gives very short shrift. It does not even ALLOW “onsite” reuse of reclaimed wastewater except by variance. So it appears that Austin Water is really not that into this “One Water” idea of reuse at the building scale.

Which is a shame, since in its most “advanced” form, this strategy would render commercial and institutional buildings, or whole campuses of such buildings, water independent, running on Zero Net Water, as laid out over a decade ago. The legitimacy of which is now well recognized, being marketed, for example, by Texas Water Trade under the banner of “Net Zero Water”. But Austin Water seems to not expect such developments to ever “unhook” and become their own water supply centers, rather they might only just defray some non-potable usage, while the prevailing conventional pipe-it-away infrastructure model will continue to be the major mode of management across such developments. Failing to approach the essentially 100% reuse that is readily available here.

As to the cost of “onsite reuse”: at $8,957 per acre-foot per year, 4 times that of “centralized reclaimed”, it doesn’t seem all that cost efficient. But … it is not likely that relieved capacity on either the potable water supply system or the centralized wastewater system is accounted for. If a building were to go Zero Net Water, there would be no need to extend – or, in the case of the “nodal densification” that was all the rage in the Imagine Austin program some years ago, upgrade – waterlines to and wastewater lines from these developments. Little doubt none of this was recognized in the cost accounting of these “onsite reuse” systems. Here again, the cost of a system to collect, treat, redistribute and utilize the site-derived water flows is likely being compared to the cost of ONLY the redistribution system for the “centralized reclaimed” option. Not the apples-to-apples comparison that, in a rational process, would be the standard.

Also, the asserted cost may be inflated over what it needs to be, because it has so far appeared that Austin Water, and Texas Water Trade as well, presume that the treatment units for building-scale reuse would be activated sludge units, imparting high operating costs. As discussed in Appropriate Technology, using technologies more appropriate to the building scale would render these systems more cost efficient, so would further reduce – eliminate? – the asserted cost disadvantage of building-scale reuse.

All of this reflects that there seems to be no “Visionary-in-Chief” at Austin Water, and so the process is being controlled by “mainstreamer” viewpoints, failing to imagine that the future might be anything but an extension of the past. So what we have here could indeed be quite fairly seen not as a Water Forward plan, rather a Water Backward plan. Certainly, in regard to the matters discussed here, a backward looking plan. One which just presumes that the future will be very like the past, just presumes that the prevailing, essentially 19th century water infrastructure model will be extended and perpetuated, world without end.

As was set forth in One More Generation, we need to get past this backward looking mental model, essentially just rearranging the deck chairs as the ship goes down. On behalf of everyone’s children and grandchildren, we should all ask Austin Water to consider if this is the manner in which it really should be proceeding. Should we really be good with a plan this astounding in its lack of ambition?

A peek down the road not taken … so far

May 27, 2024

The following was originally written for and submitted to the Texas Water Development Board as feedback on how to use the $1 billion Texas Water Fund to “create” some “new” water supply.

Two roads diverged in a wood, and I—

I took the one less traveled by,

And that has made all the difference.

  • Robert Frost

In considering the rollout of the Texas Water Fund, it was noticed that the heading of one of the surveys sent out by the Texas Water Development Board seems to indicate there is a bias cooked in for certain sorts of water projects – more on that below. What is offered here is another look at water resources management, one that suggests it will be beneficial to our water economy if we apply and implement a broader vision of how we manage water, and that may make all the difference in just how effectively we can “create” any “new” water supplies. So let’s take a peek down that road not taken … so far.

Regarding that apparent bias, many years ago now a blog post entitled “They’d rather have a root canal …” was published, which noted that the mainstream of the water resources management field was all over actions that could be implemented at the beginning of the pipe – e.g., desalinization – or at the end of the pipe – e.g., direct potable and “purple pipe” reuse – but it seemed they would rather have a root canal than delve into how anything in between the beginning and end of the pipe might be rethought, re-planned, re-engineered, to produce systemically more efficient systems. This missive delves into that and elucidates such opportunities.

It starts with broad systemic thinking, of the sort set forth on the Waterblogue in One More Generation. It is reviewed there how society seems to “think” that a “reasonable” strategy is to raid remote aquifers and transport that water, in “California style” long-distance water transfer schemes, to areas which are growing, so presumed to “need” that water supply. It appears that little consideration is being given to the long-term sustainability of the aquifers being raided. As a prominent water professional once observed, our groundwater is being “managed to depletion”. By depleting that water, we would grow a population over the next 50 years or so that (we hope anyway) would not then dry up and blow away – a population that will still be here, still needing water supply for the next 50 years, and the 50 after that. So this looks like a plan to run society into a box canyon, to just rearrange the deck chairs as the ship goes down, leaving future society high and dry. So it is suggested that we consider how we can “create” some “new”, and renewable, water supply as part and parcel of the development that will serve that growth.

Look! Up in the Sky!

Let’s begin that broad systemic thinking with consideration of building-scale rainwater harvesting (RWH) vs. watershed-scale RWH as the basis of water supply. Of course, since almost all of our conventional water supplies ultimately derive from precipitation, they are all watershed-scale RWH systems, with the watershed being the collection surface, and streams, reservoirs and aquifers serving as the “cisterns”. So first, it should be understood there is nothing “different”, nothing “exotic” about the very idea of building-scale RWH as a water supply strategy. We simply have to consider it more systemically than we have, in the main – a road not taken … so far.

As is reviewed in a piece on the Waterblogue entitled Water for DFW – Building-scale rainwater harvesting vs. Marvin Nichols, the systemic differences between the two approaches to “growing” water supply favor, even strongly so, the building-scale strategy. These differences include:

  • eliminating transmission losses;
  • eliminating the energy cost of long-distance transmission;
  • eliminating lake evaporation losses;
  • avoiding the environmental and economic disruption and destruction that building reservoirs entails;
  • allowing simpler and less costly water treatment for potable supply;
  • minimizing or eliminating climate change impacts (oh, wait, we can’t talk about that in Texas, sorry);
  • water savings due to the reduced energy needed to deliver the supply, since producing that energy itself consumes water – the so-called water-energy nexus;
  • lower impact on environmental flows;
  • matching up the timing of costs to implement supply more closely with imminent need for supply, the so-called “time value of money”, understanding that under the watershed-scale RWH strategy, a whole lot of investment must be put in the ground before water supply can be provided to the first house or business, while under the building-scale RWH strategy, the supply for each building is installed on a “just in time” basis; and
  • because of this, there would be a much lower investment put at risk.

It is noted in passing that some folks are touting transmission line loss control as the epicenter of how we can “create” some “new” water supply, by better husbanding the water we already have. While of course any and all conservation efforts must be a necessary component of this whole effort, it is observed that this simply highlights the flawed nature of the watershed-scale RWH water supply strategy. Even with the very best loss control deemed feasible, there would still be considerable transmission losses. Maybe “good enough” as long as water supply doesn’t get too tight, but maybe too big a chunk out of supply when it is tight. So it is asserted that a better way to blunt those losses would be an infrastructure model that would pretty much completely obviate line losses, the building-scale RWH strategy.

Yet, we have seen no meaningful consideration of building-scale RWH as a broad scale water supply strategy. To the extent it is considered as a strategy at all, it is relegated to individual building owners, evaluated in a micro-economic context that does not take into consideration any of these systemic advantages; the long-run economic analysis is pretty universally ignored. So it is asserted that this is an arena ripe for investigation, with the potential for yielding huge amounts of “new” supply over much of the state in a globally cost efficient and resource efficient manner. The Texas Water Fund should direct resources toward broad proliferation of building-scale RWH.

Waste Not, Want Not

The other major arena in which better systemic thinking is needed is wastewater reclamation and reuse. As noted, there is a mainstream bias for understanding reuse pretty much exclusively as something that would be appended on to the system at the end of the pipe, really as if it is just an afterthought rather than a central function. It is suggested that this whole paradigm be rethought, that we focus on designing reuse right into the very fabric of development, as if it indeed were a central function of this societal process we call wastewater management.

To do this would entail reconfiguring the wastewater system architecture, so that this water resource would be treated, and reused to the maximum extent feasible, as close to where this water is “generated” as practical. Of course the rub here is “as practical”. The mainstreamer “understanding”, even codified in TCEQ rule, is that wastewater is “best” managed on a “regional” basis, urging the routing of flows from as far and wide “as practical” to one point source.

That reconfiguration of wastewater system architecture was labeled the “decentralized concept” of wastewater management back in 1986, set forth as an alternative organizing paradigm for a wastewater system of any overall scale. All that was based on works prior to that, so really there is nothing “new” about the decentralized concept strategy; it too is just a road not taken … so far.

As has been reviewed in many works, the “regional” system is really just the product of 19th century tradition, when, with the stuff running in the streets, the whole focus was on piping it “away”. That tradition has simply been “extended”, with no thought on whether that would be the best manner of managing this societal function in the dynamic development market, and with the water supply challenges, we have in Texas today. It is high time that society engages in that rethinking and considers the systemic efficiency of distributed vs. “regional” systems.

A look at just how the decentralized concept could be implemented in a hinterlands development in the Hill Country is set forth on the Waterblogue in This is how we do it. A comparison of this strategy vs. piping such a development into a “regional” system is offered in Let’s Compare. As these works review, using the decentralized concept strategy in lieu of the conventional “regional” strategy can deliver fiscal, societal and environmental advantages.

In the fiscal arena, we can realize several advantages:

  • With the treatment units being distributed to the “neighborhood” scale, all of the large-scale collection system infrastructure, trunk mains and lift stations, would be obviated. As the collection system is typically a large majority of total “regional” wastewater system cost, this would impart a huge cost savings.
  • For any local collection system that is needed, the effluent sewerage concept would be favored, rendering those lines more cost efficient, as reviewed on the Waterblogue.
  • Little to no infiltration/inflow would plague the effluent sewerage collection systems, decreasing maintenance costs and peaking loads on treatment units, allowing some components to be downsized.
  • Using effluent sewerage renders the sludge management function more cost efficient, and otherwise less burdensome.
  • With treatment units distributed, the reclaimed water redistribution system would also be smaller scale, thus far less costly than “purple pipe” systems redistributing water from “regional” plants. That renders reuse more affordable, and thus more fiscally feasible, delivering cost – and sustainability – benefits by displacing demands on the potable water system.
  • The distributed system architecture would also typically obviate pumps (lift stations) in the collection system, so saving all that energy, saving money.
  • The type of treatment unit favored for the distributed treatment centers – the High Performance Biofiltration Concept (recirculating packed-bed filter) unit, as reviewed in This is how we do it – would incur fairly minimal routine O&M cost.
  • The High Performance Biofiltration Concept treatment unit would require a lot less energy to operate than an activated sludge system, practically the knee-jerk choice by the mainstreamers, so saving a lot of money.

Societal implications would be significant. More fully reviewed in Let’s Compare, those implications are listed here:

  • As noted above, reuse of effluent would become more cost efficient.  The effluent would be made available throughout the service area, nearer to points of potential reuse, greatly decreasing the cost of the reclaimed water distribution system.  Non-potable demands such as landscape irrigation, toilet flush supply, and cooling tower makeup could be more expeditiously and cost efficiently served with reclaimed water, spurring on this action, so defraying demands on our increasingly stressed water supplies.
  • Requiring far less energy for treatment and to move water around, this distributed treatment and reuse would also reduce greenhouse gas emissions (oh, sorry again to mention climate change, but this IS a societal advantage).
  • The management system would be able to accommodate any level of water conservation found to be economically attractive or ecologically necessary.  Only liquid effluent is transported, so reduced wastewater flows due to water conservation measures would not cause clogging problems in the collection system, as has occurred in conventional centralized systems.
  • The decentralized concept system is easier to plan and finance.  Each project is small compared to the typical “regional” system expansion.  The management needs of each area or new development are considered directly and could be implemented independently. Contrast this with planning large-scale facilities over an area-wide system, with much less definite growth prospects in any given area.
  • Capacity expansion – and therefore capital requirements – would track demand much more closely, minimizing the amount of money spent to construct facilities which would not be fully utilized for years to come, as occurs routinely in the course of expanding and upgrading conventional centralized systems. Again, the “time value of money”.
  • Much of the cost could be privatized or assigned directly to the activity generating new demands on a much fairer basis, so that existing ratepayers and/or taxpayers would not have to serve as the “bank” for this infrastructure, which is typically the case as upgrades and expansions of the conventional centralized system are bond financed by public agents.
  • This is all money “at risk”. If, for example, we were to experience another “crash” such as occurred in 2008, the pace of development might slow, even stop altogether for a time. But once the money is borrowed and the system built, the payments would be due whether development came on line to fund those payments or not. So whoever financed that infrastructure would be on the hook to make those payments. If these facilities were publicly financed, it would be all of the ratepayers, and/or taxpayers, who would be called upon to pony up. This could balloon their wastewater rates and/or tax bills. All that would be avoided under a decentralized concept strategy, which assigns that risk to the developer, who would be putting relatively small amounts at risk at a time.
  • With the hardware systems decentralized, there would be no compelling reason to impose a “one size fits all” management approach.  Different strategies could be employed in various parts of the service area, responding in the most fiscally efficient and environmentally responsible manner to each set of circumstances – e.g., on-site systems in low density development, distributed systems on the urban fringe and in the hinterlands, connection to a centralized system where that might be the cost efficient strategy – with all those systems under one unified management system.
  • The system can be designed and installed in a manner which is “growth-neutral”, whereas centralized systems often spur growth, even requiring it to be fiscally viable in many cases.
  • Distributed management reduces vulnerability to pollution.

This leads us to considering the environmental advantages of the decentralized concept strategy, which include:

  • Scale is a major driver of environmental vulnerability. Centralization causes large flows to be concentrated through one pipe or lift station or treatment plant.  This implies that any mishap would have large consequences.  In a decentralized concept system, the flows at any point remain small, implying less environmental damage from any mishap.
  • In a distributed system, any mishap that may occur would impact on only a minor portion of the overall system, as each distributed system is disconnected from all the others, so imparting far less overall vulnerability.
  • In any case, bypasses, leaks, overflows, etc., would be far less likely in a decentralized concept system. More “fail-safe” treatment technologies would be employed, and lift stations would be eliminated or greatly reduced in number.  Carrying only liquid effluent to multiple treatment centers, the effluent sewer collection system would consist of shorter runs of smaller pipes containing fewer openings, implying far less potential for infiltration and exfiltration and for overflows.
  • There would be less environmental disturbance from system construction.  The smaller effluent sewer collection system pipes would be installed at shallow depths and could be more flexibly routed.  There would be no large interceptor mains, which typically run in riparian areas, requiring these riparian environments to be disturbed to install the mains.
  • Environmental disturbance would be minimized over the long term as well because existing lines would not need to be torn up to upgrade system capacity.  System expansion would be afforded by adding a new treatment center for each newly developing area rather than by routing ever more flow to existing centers.
  • Treatment and reuse can be “tailored” to the waste stream.  Industrial waste need not be commingled with domestic wastes; rather these generators can be required to implement treatment methods specific to their wastewater characteristics and reuse opportunities.

While all these fiscal, societal and environmental advantages should be creating high incentive to consider organizing wastewater systems in accord with the decentralized concept, most important here is the ability to more cost efficiently route the reclaimed water to beneficial reuse, so spurring on that practice, thereby defraying demands on our increasingly stressed water supplies. In regard to the Texas Water Fund, that of course is the main reason to move management in that direction. Again by designing reuse into the very fabric of development, this can “create” a “new” water supply in a far more fiscally efficient manner than can generally be done under the conventional “regional” system paradigm. So the Texas Water Fund should direct resources toward proliferation of reuse-focused decentralized concept “waste” water systems.

Don’t Drain It, Retain It

Before proceeding, it is noted in passing that sustainable water can be enhanced by moving storm water management practice to distributed Low-Impact Development (LID) strategies, for example as outlined on the Waterblogue in … and Stormwater Too. Rather than draining “away” the increased runoff created by development, it should be retained and infiltrated, so as not to desertify the landscape as it is developed. Some of that increased runoff could be captured and directly used for water supply. As reviewed on the Waterblogue, rainwater harvesting for irrigation supply – and in settings where those uses come into play, for toilet flushwater supply and cooling tower makeup supply too – can readily be integrated with the storm water quality control function.

Really, this is a subset of the building-scale RWH strategy. Attention specifically to this is merited, however, as there is a considerable market for this practice, in both new development and in retrofit projects. So this is a road now being partially trod, but the journey lacks much institutional support; it is largely driven by individuals considering their immediate needs. Thus in regard to how society is proceeding to consider how we “create” any “new” water supplies, this too is a road not taken … so far. The Texas Water Fund should direct resources toward expanding these storm water management practices.

Examples of the Opportunities

To illustrate the currency of these manners of “creating” any “new” water supplies, let’s look at some examples.

The first example is a small-bore project, but one that if repeated over and over could add up to a whole lot of “new” water supply. It is a hinterlands project in the Hill Country, called Lunaroya, proposing 28 lots with areas between about 1.5 and 2.5 acres each, located on Barton Creek. The conventional plan for water supply would entail a public water supply well, storage tank and distribution system, all of which would have to be planned, designed and permitted “up front” as a Public Water Supply System (PWSS). It was reported that the water quality out of the well would be such that reverse osmosis (R.O.) treatment would be needed to render the water to potable quality, and this treatment would result in something like 70% of the water supplied from the well becoming R.O. “reject” water, a brine toxic to plants, and so there would have to be a “disposal” system for that water created as well.

It is suggested that all this waste could be obviated by using building-scale RWH for water supply to the houses in Lunaroya. In this area, these acreage lots will be rather expensive, so it is to be expected that houses built there would be “large”, with the roofprint area needed to make RWH systems highly sustainable being installed as a matter of course, between the house, the garage, and covered porches and patios. This leaves the cistern and the rainwater treatment unit – a simple cartridge system and UV disinfection unit – as “extra” costs for an RWH system at each house.

By going with building-scale RWH, not only is the high level of water waste out of the well obviated, but so are the costs of the well, the storage tank, the distribution system, the brine disposal system, and all the planning, design, and permitting costs of those components. This would save the developer a very large sum in up front costs that would need to be paid before water service could be provided to the first house in Lunaroya. This savings could translate to lower lot prices, so defraying the cost of the cistern. Since the RWH systems for each house would be planned, designed and implemented by the owner/builder of each house, all the developer would need to pay up front for water supply would be for the planning items specified by the county as part of the platting process in subdivisions proposing building-scale RWH as the water supply strategy, a very minor fraction of all those up front costs for the conventional water supply system.

As for marketability of houses that use building-scale RWH for water supply, there are hundreds of houses using this supply strategy in the general area where Lunaroya is located. So it is a well known commodity, which can readily be financed by lenders. With a bit of education, and some modeling to show how sustainable this supply can be made, this strategy should readily be rendered “non-scary” to any potential buyers of lots in Lunaroya. Rainwater is also reputed to be a superior water supply – naturally soft, and so gentle on skin, clothes and fixtures – and people who use it tend to love it. So the developer would have no reason to fear the lots could not be marketed.

For “waste” water management, it was proposed to use On-Site Sewage Facilities (OSSFs), more popularly called “septic systems”. In particular, it was proposed that a spray field be sited on each lot to “dispose” of this “waste” water, after pretreatment in a home-sized bastardized version of the activated sludge process, commonly called an ATU – “aerobic treatment unit”. So really it is proposed that 100% of the water drawn from the well would be wasted: about 70% directly as R.O. reject water, and the other 30% or so “disposed of” – thrown away – after being used once in the house.

It is suggested that the OSSFs be recast as reclamation and reuse systems. Each one would be composed of a High Performance Biofiltration Concept treatment unit, which would remove the majority of the nitrogen from the “waste” water – important in the Barton Creek watershed – feeding the effluent into a subsurface drip irrigation field, arrayed as much as practical to irrigate landscaping that would be irrigated in any case. This type of OSSF has been designed, permitted and installed in several jurisdictions in Texas for almost 30 years now, so is a well known, well understood type of “septic system”.

By this means, a large majority of the “waste” water generated in the house over the annual cycle would be beneficially reused for irrigation, so greatly defraying demands on the potable water supply to the house. For the owners of houses in Lunaroya, defraying irrigation demand would allow them to downsize their RWH systems – the roofprint area and cistern volume – while still maintaining high sustainability of that water supply and maintaining some irrigated landscaping.

The cost of implementing this form of OSSF would not be all that much more than what the ATU-spray system would cost. Especially since the drip irrigation field would in some part displace an irrigation system that would be installed in any case, if the homeowner intended to maintain any irrigated landscaping.

Thus it’s apparent that the alternative scheme, a set of “One Water” strategies, would save a whole lot of water. A permit for 37 acre-feet per year is being requested from the governing groundwater district, which calculates out to an average demand of about 1,180 gallons/day per house over the 28 lots. Note again, only about 30% of that flow, or about 350 gallons/day per house, would be a usable water supply; the rest would become R.O. reject water, an utter waste of this water, greatly downgraded in quality by the R.O. treatment, and so posing an expense for proper “disposal”. Going with building-scale RWH instead, society would save all that water, the developer would immediately save a large sum in all those obviated up front costs of implementing the PWSS, and the residents would enjoy the superior quality of the rainwater supply. This is a win-win-win: for the developer, for the homeowner, for the water environment. It is the sort of action that the Texas Water Fund should have some mechanism for fostering.
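For clarity, the per-house figures above follow directly from the permit request and the 28-lot count; a quick check of that arithmetic:

```python
# Quick check of the per-house demand implied by the groundwater permit request
# for the 28-lot Lunaroya project discussed above.

GAL_PER_ACRE_FOOT = 325_851
LOTS = 28

annual_gal = 37 * GAL_PER_ACRE_FOOT       # 37 ac-ft/yr requested
per_house_gpd = annual_gal / 365 / LOTS   # average demand per house
usable_gpd = 0.30 * per_house_gpd         # only ~30% survives R.O. treatment as usable supply

print(f"average demand: {per_house_gpd:,.0f} gal/day per house")
print(f"usable after ~70% R.O. reject: {usable_gpd:,.0f} gal/day per house")
```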

The next example is the Mirasol Springs project, presently in the planning and permitting stage, along the Pedernales River near where it is crossed by Hamilton Pool Road. The development comprises an Inn, restaurants, some large “conservation” lots, some “branded residential” houses, “resort cottages”, an event barn, a University of Texas field research station, and a farm. The situation is described on the Waterblogue in If you’re going to call it a vision …, reviewing how the developer is presuming to call a very conventional set of water resources management practices a “new standard for environmentally focused Hill Country development”. It is suggested that an actual “new standard” would be a “total One Water” scheme, essentially following the water supply, wastewater management, and storm water management concepts set forth above. The basics of this program would include:

  • All the houses on the “conservation” lots, arrayed on both sides of the environmentally sensitive Roy Creek Canyon, shall have water supply provided by building-scale RWH.
  • All the houses on the “conservation” lots shall have wastewater service provided by OSSFs, using nitrogen-reducing recirculating packed-bed filter (High Performance Biofiltration Concept) treatment units and dispersing the reclaimed water in subsurface drip irrigation fields, arrayed to the maximum extent attainable to irrigate the highest value landscaping on the lot, landscaping that would likely be irrigated in any case, to minimize need for harvested rainwater for irrigation.
  • Together, these strategies will obviate all water and wastewater lines running through/across Roy Creek Canyon, eliminating that source of water quality degradation and saving the developer a very large sum in up front costs that would otherwise deliver a very limited amount of service.
  • All the developer would have to do to market those lots is plat them and provide road access. Paying for and installing the RWH systems and the OSSFs would fall to the owners/builders on each lot.
  • Savings on the obviated water and wastewater lines would be far more than the cost of a tanker truck, which the manager of the overall water supply system could use to provide backup supply to the houses if ever needed, giving these homeowners high assurance of backup water at need. In the meantime, with it being needed only rarely, that truck could be leased out or used for other needs, to defray its cost.
  • This same strategy may be applied to the “branded residential” houses, similarly eliminating up front costs for the developer, obviating water distribution line runs and wastewater collection and reuse redistribution lines, and their costs and environmental liabilities. Also reducing the capacity needed in the surface water and/or groundwater supply and wastewater management systems, so reducing those costs, which the developer will have to pay up front.
  • This same strategy may also be readily applied over the “resort cottages”, where all units are likely to be on a single lot under single ownership, so a collective RWH system and a collective OSSF would be readily doable, institutionally as well as technically. Again, saving on water distribution lines and wastewater collection and redistribution lines, and eliminating their environmental liabilities. And further reducing the capacity needed in the surface water and/or groundwater supply and in the wastewater management system. Noting that due to the nature of this service, this RWH system may have to be permitted as a PWSS, but that is readily doable, and that PWSS could very readily be operated by the entity established to run the overall water supply system.
  • Building-scale RWH for water supply and an OSSF for wastewater management could also service the research center. In particular as it appears this facility would be “off to the side” at some distance from other facilities, this would save on water distribution lines and wastewater collection and redistribution lines, and again reduce the capacity needed in the surface water and/or groundwater supply and the wastewater management system. All of these costs could likely be shunted off to the entity that develops and builds the research center facilities, so here too reducing the developer’s up front costs.
  • It appears practical to use building-scale RWH for water supply and an OSSF for wastewater management for the event barn. Again, savings on water and wastewater lines, savings in water supply capacity, and elimination of environmental liabilities that wastewater lines would entail.
  • This leaves “only” the Inn and the restaurants to be provided water supply from surface water and/or groundwater. Even for those facilities, however, rainwater supply off those roofs can defray water demands. Here again, even if the RWH system would need to be permitted as a PWSS.
  • Consider using slow sand filtration for the surface water treatment process, a technology very well suited to treating river water. It would likely be less expensive to build – it could readily be phased as needed by installing multiple beds – and far less expensive to operate and maintain than the conventional water treatment plant the developer is proposing.
  • The Inn and restaurants would have design flow rates above 5,000 gpd, so the wastewater system(s) to serve those facilities would have to be permitted at TCEQ under the “municipal” permitting process. But the form of such systems could be the same as the OSSFs, using the highly stable and robust nitrogen-reducing recirculating packed-bed filter (High Performance Biofiltration Concept) treatment unit – rather than the inherently unstable, high operating cost activated sludge unit, which would need cost increasing modifications to attain significant nitrogen reduction – and dispersing the reclaimed water as much as practical to irrigate grounds beautification. Understanding that the subsurface drip irrigation fields to irrigate that landscaping would be in effect the “TLAP” fields for this system. Installing drip fields in the “waste meadows” could be largely, if not completely, obviated. All of this will also decrease the developer’s up front costs.
  • The “farm” is a wild card, but it is quite likely that everything but crop irrigation (but maybe some of that too) could have water supply provided by building-scale RWH and all wastewater generated there could be managed in an OSSF. It may be that some of the drip irrigation field receiving reclaimed water from the Inn and restaurants could be on the farm, irrigating some of the crops.
  • There should also be discussion of the form of storm water management facilities. Note that a very robust program of rainwater harvesting will basically take rooftops “out of play” in regard to storm water quality management, and reduce the scale of water quantity management (detention) facilities. Maximizing use of permeable pavement would also reduce need for water quality management installations. All this too is quite likely to reduce the developer’s up front costs.

The developer has applied to use surface water from the Pedernales River and groundwater from the local aquifer to provide water supply for this project. The adoption of this “total One Water” set of strategies would obviate a large portion of those draws on these sources, each of which is understood to be rather limited. Indeed, it is the expectation that either the surface water or the groundwater supply might be curtailed during future droughts that motivated the developer to apply for access to both supply sources.

Given the clear opportunity to render the water resources management infrastructure more fiscally efficient – including the “time value of money” benefit of being able to defer installation of infrastructure until the building it serves would be built, noting this project has a projected 10-year timeline to completion – and given the higher quality of the rainwater supply, the “total One Water” set of strategies presents a win-win-win prospect for the developer, for society, and for the eventual users/residents of this project. That this sort of approach appears to be ignored by the developer illustrates the need for the Texas Water Fund to include some means of promoting, maybe even incentivizing, the move toward sustainable water practices, so as to proliferate them throughout the development community.

Those two examples are out in the hinterlands, so they do not address how to apply these concepts within the context of a community-scale utility system. For that, we look first at the situation facing the City of Liberty Hill. The city has applied to TCEQ for a permit to increase the capacity of its wastewater system from 2 Mgd to 4 Mgd. That doubling indicates that a lot of the future flows would derive from new development, mainly located on the urban fringe and in the surrounding hinterlands. Which in turn suggests that, for all the reasons reviewed above, the city might consider moving to a decentralized concept strategy to serve that growth, rather than expanding the capacity of its centralized point source treatment plant and extending sewers ever further into the hinterlands.

This strategy may be doubly beneficial to Liberty Hill because the new permit carries with it a more stringent restriction on phosphorus concentrations in the effluent to be discharged into the South San Gabriel River. This was imposed because the current discharge has stimulated massive algal blooms in the river, contended to be a violation of the Clean Water Act. So, while the city would have to take measures to reduce effluent phosphorus concentration in the existing flow, it could save money by not having to increase the capacity of the phosphorus removal process(es) for the additional 2 Mgd of flow. Particularly since none of that flow currently exists, the investment required for increasing the central point source treatment plant capacity would be money not fully utilized for many years into the future. The “time value of money” thing again.

It was communicated to the Liberty Hill mayor and city council members how they could go about moving to a decentralized concept strategy to serve new development in and around the city. And perhaps to even “unhook” some existing development, using the “sewer mining” strategy. Then perhaps to also install a “purple pipe” system from the central point source treatment plant to minimize the amount of water that would need to be discharged into the river. By these means, they could reduce the considerable investment needed for phosphorus removal processes.

As argued above, it is to be expected that by going with the decentralized concept strategy to serve new development, the city could expand its wastewater system capacity at lower cost, and – again the major point here – would put a lot of that wastewater to work for society, beneficially reusing it to defray demands on the area’s water supply sources. Blunting any future “need” to import water from remote aquifers, and all its attendant inefficiencies, and apparent unsustainability, as was reviewed above.

What is not addressed in those communications is that there would also be opportunities to make building-scale RWH part and parcel of the water supplies for the new development. Together with the “new” water supply “created” by reusing rather than discharging the flows from all these sources, there is a win-win-win to be had here, for the city’s ratepayers/taxpayers, for the developers, and for society by moving practice toward sustainable water. Again highlighting how the Texas Water Fund needs to create some means of promoting, perhaps incentivizing, the move to sustainable water practice, at all scales.

The last example offered is the “Zero Net Water” concept that had been posed a decade ago, proposing to maximize building-scale RWH, condensate capture, and building-scale/campus-scale wastewater reuse to provide water supplies, an exponent of the strategies reviewed above. One example of this has been set out by Texas Water Trade (TWT) just recently as part of its Strategic Plan, which it has labeled “Net Zero Water”. As TWT styles it, this concept – which is well set forth in its Net Zero Water Toolkit – is envisioned to be applied mainly in “big box” projects, like large institutional or commercial buildings. But of course, as reviewed above, it could also be applied in many sorts of developments, at various scales.

The opportunity to cost-efficiently extend the Zero Net Water strategy to smaller projects, specifically in regard to wastewater reuse, is reviewed on the Waterblogue in Appropriate Technology. This hinges on moving to technologies better suited to distributed deployment, rather than simply scaling down the conventional technologies used at large centralized treatment plants.

However it is done, going with the Zero Net Water strategy would “create” considerable “new” water supply, through building-scale RWH, condensate capture, and distributed wastewater reuse. While market forces are already moving this strategy forward – and as noted it could be cost efficiently extended into markets that appear to be not currently recognized – this is another arena in which the Texas Water Fund could promote and incentivize that move, relieving stress on the watershed-scale RWH systems, saving a whole lot of water for society.

Stimulating Action

While the arguments for moving water resources management practices toward sustainable water are solid, compelling even, it is clear that the mainstream “thinking” in this field does not, in the main, recognize those practices. Thus there is little if any such change in practice showing up in regional water plans, and so in the State Water Plan. Evidencing, as is suggested throughout this document, that measures to proliferate the vision, knowledge and skills required to put these sustainable water practices on the ground need to be stimulated and pursued.

It is suggested that, in terms of the programmatic/bureaucratic activities that create the regional water plans, there needs to be a more inviting forum, one that would encourage the pursuit of the sorts of sustainable water practices highlighted herein. Attendance at a regional planning group meeting has been an experience of sitting through mind-numbingly programmatic/bureaucratic discussions and machinations, with such ideas as basing broad-scale water supply strategy on building-scale RWH or the decentralized concept of wastewater management never even coming up for discussion. Much less is there any consideration of incorporating any of that into the regional plan. Rather, everything is focused on extending and perpetuating practices that constitute business as usual in the water resources planning and engineering field. Indeed, the focus is on such actions as desalination at the beginning of the pipe or direct potable and “purple pipe” reuse at the end of the pipe, neglecting how we might rethink, re-plan and re-engineer anything in between. It seems they would rather have a root canal …

Because this groupthink is “expected”, it remains quite difficult to gain any traction in getting those who would implement water resources management infrastructure – municipalities, utilities, developers, etc. – to even entertain a peek down the road not taken. So there is much work to be done.

We began this missive with the closing lines of a Robert Frost poem. Let’s end it with some more poetry, these song lyrics by Neil Peart of the band Rush, from “Vital Signs”:

Leave out the fiction
The fact is this friction
Will only be worn by persistence

Leave out conditions
Courageous convictions
Will drag the dream into existence

So who is going to apply the persistence, who will impart the courageous convictions, who is going to drag the dream into existence? Who will lead in taking a peek down the road not taken … so far? If Texas is to avoid a dark water future – one in which it becomes unable to accommodate more growth, or even to sustain beyond one more generation the population growth now being bought by “managing to depletion” its existing water supplies – it must consider how to move practice toward sustainable water. And that could make all the difference.

If you’re going to call it a vision …

January 29, 2024

A development called Mirasol Springs is being proposed in Central Texas, along the Pedernales River on the Travis County–Hays County border. The development scheme is shown in the schematic below. It includes a “resort” hotel (the Inn), “branded residential” homes, “resort” cottages, “conservation” lot houses, a research facility, and a farm. This area is a somewhat “pristine” landscape, in particular including much of the – so far – “undisturbed” Roy Creek watershed, renowned by naturalists as a great example of a “native” Hill Country landscape. Thus it is considered an imperative to develop in this area with great sensitivity to this landscape, in particular in regard to water resources management, to blunt the draw on this area’s limited water supply resources and to minimize water quality degradation. The developer’s scheme proposing to accomplish that is set forth on the project’s website here, offering his team’s vision of how to best manage water resources – water supply, “waste” water management, and stormwater management. The “header” of this page reads, “Mirasol Springs will set a new standard for environmentally focused Hill Country development.” Raising the question, would it really?

As reported in “One Water” = the “Decentralized Concept”, there is a broadly supported idea, so far largely unrealized on the ground, that engineering practice in this area needs to move toward “One Water” practices, and that a better understanding of the way to do this needs to be fleshed out. It was argued in that post that the “One Water” ideal would be most effectively and beneficially delivered by designing efficient water management into the very fabric of development, as a central aim, rather than first arranging for water to “go away” and then attempting to append that efficient management at the “end of the pipe”, as an afterthought. The inevitable conclusion is that imparting “One Water” practice will rely in large part on employing distributed management schemes, such as the decentralized concept described in the piece linked above. Let’s take a look at how all that might play out in a setting like Mirasol Springs.

Water Supply

In the vision the developer sets forth on the project website, listed under “Water Use” are four components:  surface water, reclaimed water, rainwater harvesting, and groundwater.

Under “surface water”, the website states, “Surface water purchased from the LCRA [Lower Colorado River Authority] will be the base water supply for Mirasol Spring’s [sic] potable water and will meet 100% of our demand.” It is first brought to question, just how is this so very conventional idea – extracting a water supply from the watershed-scale rainwater harvesting system that supplies the vast majority of water supply in the Central Texas region – a “new standard”?

In this case, as can be seen in the graphic below, the proximate source of the potable water supply would be the Pedernales River, which runs along the border of the project site. The water withdrawn from the river would be pumped into a water supply reservoir to be built on the site. Water would be withdrawn from that reservoir and run through a water treatment plant. The treated water would be distributed in a conventional distribution system, routing water to all of the buildings on the development, requiring distribution lines to be extended to all the various developed areas on this site.

This conventional water supply system would entail a great deal of site disruption. This includes installing the intake structure in the river and a pump system and delivery pipe running up the bluff on the Mirasol Springs side of the river, and installing the reservoir, which would entail excavating the pond and distributing the excavated material on the project site. The distribution lines would cause disruption over and between the developed areas, in particular to get to all the “conservation” lots in the more “pristine” parts of the site, in the Roy Creek watershed.

Raising the obvious question, what could the developer do instead to create a water supply system for the project? Skipping to the “rainwater harvesting” component, the website states, “Rainwater collection from rooftops will be a requirement for larger structures constructed across the property, a practice that is already in use on the ranch. Deed restrictions for home sites will include water capture for irrigation purposes and guidelines for non-water intensive vegetative covers, water conservation-oriented landscapes and xeriscapes. Landscape irrigation on home sites will be restricted to rainwater collection only; no potable water will be allowed for landscape use.” While this laudably proposes to make rainwater collected on site from rooftops a primary supply for irrigation needs, it neglects considering the most “One Water thing” one could conceive here, maximizing the resource value of the water falling upon this site. So perhaps obviating all the expense and disruption of creating and operating the surface water supply system.

Consider the benefits of a water supply derived from distributed building-scale rainwater harvesting (RWH) vs. the surface water system. First and foremost is the efficient use of the area’s strained water resources. The very reason the developer would pursue groundwater as a backup supply, reviewed below, is that they conceive the possibility that the Pedernales River would run dry, or at least run low enough that their surface water supply would be curtailed. So why not consider the prospect of not depleting that surface water resource at all?

Second, as noted, with the facilities arrayed at the building scale, the site disruption entailed in installing the storage pond and the water distribution system would be avoided. So too would the inevitable leakage losses that plague such water distribution systems, often a rather sizable source of water use inefficiency.

Third, the energy requirements to run the building-scale RWH systems would be considerably lower than those required to run the surface water supply system. In the latter, considerable energy would be required to lift the water from the river to the on-project water supply pond, and from the pond to the water treatment plant, and also to run the more energy-intensive surface water treatment unit, and then to pressurize and move water through the distribution system. In the building-scale systems, any lift from a cistern would be low and the water would only have to be run a very short distance. The treatment unit required to render the roof-harvested rainwater to potable quality would require far less energy than the conventional surface water plant. Not only would this be a fiscal plus for the MUD that will pay the energy bills; since it takes water to make energy – the so-called water-energy nexus – all this energy conservation would also enhance the overall efficiency of the region-wide water system.

Fourth, under the surface water supply scheme, a considerable evaporative loss from the on-project water storage pond would be incurred, at its maximum just when drought would typically be at its worst. Evaporative losses from the covered building-scale cisterns would be minimal, a not-insignificant efficiency advantage for the building-scale RWH strategy.

Fifth and finally, pretty much the entire surface water system would have to be planned, designed, permitted and installed before the first building on the project could be provided a water supply. This is a hefty amount of up-front cost that must be incurred before any revenue-generating facilities may come on line, imposing a considerable “time value of money” detriment. The building-scale RWH facilities, on the other hand, don’t need to come on line until the building(s) each unit serves would be built, so the lag between incurring those costs and being able to derive revenue from each building could be much shorter. Also, it is expected that each of the building-scale systems – excepting the Inn – would fall below the threshold to be classified as a Public Water Supply System, so the long and expensive process of permitting these systems through TCEQ could be avoided, a further “time value of money” benefit.

To determine the degree to which building-scale RWH could create a sufficient supply to meet the potable water demands in the buildings, a model would be used, into which the roofprint (water collection) area, the cistern (water storage) volume, and the expected water usage profile would be input. The model would be run over a number of years of historic monthly rainfalls to see how much, and how often (if at all), backup supply would have been needed in each year through the cycles of drought and plenty, and how much water supply would have been lost via cistern overflows during large storms and through extended rainy periods.

Based on the outcomes, “appropriate” building design, to increase roofprints – for example the “veranda strategy”, adding on covered patios and porches to add relatively inexpensive additional collection area – and “proper” cistern sizes, as well as the target conservation behavior, could be chosen to make the system as robust as desired. Past modeling of and experience with building-scale RWH in this region indicates that this strategy could provide a quite sufficient supply for much of the interior (potable) water uses at Mirasol Springs.
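
To make that modeling step concrete, here is a minimal sketch of such a monthly water-balance model, in Python. The 0.623 gallons per square foot per inch of rain conversion is standard rainwater-harvesting arithmetic; the collection efficiency, roofprint, cistern size, demand, and rainfall record shown are illustrative assumptions, not values from any actual design for this project.

    # Minimal monthly water-balance sketch for a building-scale RWH system.
    # All inputs are illustrative assumptions, not project data.

    GALLONS_PER_SQFT_INCH = 0.623  # 1 inch of rain on 1 sq ft of roof yields ~0.623 gallons

    def simulate_rwh(monthly_rainfall_in, roof_area_sqft, cistern_gal,
                     monthly_demand_gal, collection_efficiency=0.85):
        """Run a simple mass balance; return total backup water needed and overflow lost."""
        storage = cistern_gal  # assume a full cistern at the start of the record
        backup_needed = 0.0
        overflow_lost = 0.0
        for rain in monthly_rainfall_in:
            harvested = rain * roof_area_sqft * GALLONS_PER_SQFT_INCH * collection_efficiency
            storage += harvested
            if storage > cistern_gal:          # cistern full: the excess overflows
                overflow_lost += storage - cistern_gal
                storage = cistern_gal
            if storage >= monthly_demand_gal:  # demand met from storage
                storage -= monthly_demand_gal
            else:                              # cistern runs dry: backup makes up the shortfall
                backup_needed += monthly_demand_gal - storage
                storage = 0.0
        return backup_needed, overflow_lost

    # Example: 3,000 sq ft roofprint, 30,000 gal cistern, 4,500 gal/month indoor use,
    # run over a hypothetical dry-year rainfall record (inches per month).
    dry_year = [0.5, 0.8, 1.2, 1.0, 2.1, 1.5, 0.3, 0.2, 1.8, 1.4, 0.9, 0.9]
    backup, overflow = simulate_rwh(dry_year, 3000, 30000, 4500)
    print(f"Backup needed: {backup:,.0f} gal   Overflow lost: {overflow:,.0f} gal")

Run month by month over the full multi-decade rainfall record, this same balance shows how often backup deliveries would have been called for and how much water would have been lost to overflow, which is exactly the information needed to choose roofprint and cistern sizes.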

Through this means, the potential for building-scale RWH could have been evaluated, and the costs of using it could have been compared to the costs of the conventional surface water supply system. And the benefits of avoiding the site-wide disruption entailed in the conventional strategy could also have been evaluated. None of this appears to have been considered by the developer; rather, it seems to have been simply presumed that the surface water supply, the watershed-scale RWH system, was “needed”, and that building-scale RWH could be no more than an adjunct supply to defray irrigation usage. An opportunity to set an actual “new standard” foregone.

Now consider the “groundwater” component. The website says: “Groundwater will only be used if surface water is unavailable or curtailed. The goal of the project is to significantly limit the use of groundwater through conservation, including the use of reclaimed water and harvested rainwater, noted above, to meet non-potable water demands. When surface water is not available, Mirasol Springs will utilize groundwater to service the demand for domestic use. No groundwater will be used for landscape irrigation. Good stewardship of groundwater resources will be supported through additional planning and holistic water management measures. There will be no individual water wells. Water availability studies have demonstrated that adequate groundwater is available from the underlying aquifer when the project is required to use groundwater.” Quite a number of claims and caveats there to be considered.

While there is no indication what the parameters of the deal to purchase water from the LCRA may be, it is expected that any curtailment or unavailability would be predicated on the flow in the Pedernales River, which would rise and fall with cycles of drought and wetter times. If a drought were severe enough that river flow dropped so low that LCRA would curtail or ban further withdrawal of the surface water, it is exactly in such a period that the region’s aquifers would also be under maximum stress.

There is no analysis, however, of “[w]hen surface water is not available”, and so when/if groundwater might be “needed” is entirely opaque. There is no indication, no standard for what would constitute “[g]ood stewardship of groundwater resources”, no idea offered for how those resources “will be supported through additional planning and holistic water management measures.” It all seems to be a “just trust us” proposition, hardly any sort of “new standard for environmentally focused Hill Country development.”

Thus, by plan, groundwater would be prevailed upon to carry the entire potable water supply just when that source too would be most stressed, and so when groundwater withdrawals would be most problematic. But again there is absolutely no analysis of when/if groundwater might be “needed”. So it may be called to question if indeed “adequate groundwater [would be] available from the underlying aquifer when the project is required to use groundwater.” There is no indication that a drought-stressed local aquifer could provide the full potable water demand over any given period, for this or any other developments in this area. Indeed, it is the questionable future condition of the local aquifer that urged the developer to look to a surface water supply to begin with.

Then too it can be called to question if the treatment requirements for a groundwater supply would be the same, using the same sort of treatment train, as for the surface water drawn from the river. Water quality of groundwater varies considerably across the Hill Country, and “over-drawing” aquifers can cause the quality of water from some wells to degrade. So this is another aspect of the overall scheme that appears to be a bit open-ended.

All this would be imparted by choosing to ignore the readily available “One Water” strategy, an actual “new standard” strategy, of maximizing supply from water falling onto this site.

“Waste” Water Management

While those water supply matters basically rest on analyses that the developer chose not to pursue, and almost certainly sells short the “One Water” supply strategy, in the “waste” water arena, there is a much more clear-cut choice. For the “reclaimed water” component, the website states:  “Mirasol Springs will reclaim wastewater from the Inn, the Farm, the University of Texas Hill Country Field Station, and all the home sites in a centralized collection facility that is aesthetically integrated into the landscape. There will be no septic systems. The facility will be equipped with the best technology available for nutrient removal and will reclaim 100% of the effluent for irrigation uses. This wastewater will be treated and used to offset irrigation needs for the property and other non-potable uses. No potable water will be used for landscape irrigation. Also known as beneficial reuse, this process completes the effort to maximize the lifecycle of water usage onsite. There will be no discharge into any creek or river. All wastewater will be collected and returned to a treatment plant.” This word salad begs for examination.

As stated, and as seen on the water systems graphic above, the developer is proposing that a conventional centralized system be installed, collecting all the “waste” water to be treated at one centralized facility. That includes flows from the large “conservation” lots, entailing a rather long run of sewer line through the Roy Creek valley, the most environmentally sensitive portion of this site, to collect a relatively small portion of the total amount of “waste” water that would be generated on the overall project. For those lots, to avoid the disruption and pollution — and the cost — those lines would impart, the developer should consider on-lot systems, to treat and reuse the water on each lot to serve irrigation demands there. Which, I expect, requires a dose of perspective.

It is noted that the website explicitly states, “There will be no septic systems.” Like that is a good thing. One can read between the lines here that a “septic system” is deemed, at best, a secondary good, and likely is presumed to be a source of pollution, one that the developer sees as being eliminated by centralizing the “waste” water from the various lots. Ignoring of course that components of the centralized system would themselves be pollution vulnerabilities. Conventional collection lines leak – longer runs of lines impart more leakage, and this becomes worse as the lines age – and manholes in those lines overflow. Lift stations, inevitably needed in the terrain on this site, will fail and overflow at intervals. And all this is in addition to the widespread disruption of the landscape that would be entailed in installing the centralized collection system.

This could all be obviated by choosing to pursue a decentralized concept “waste” water management strategy, treating and reusing the “waste” water as close to where it is generated as practical. Again, for the dispersed “conservation” lots, this would be a no-brainer strategy, presuming the use of the sort of “septic system” that is equal to the task at hand. A system providing high quality pretreatment – including removal of a large majority of the nitrogen from the “waste” water prior to dispersal – consistently and reliably while imparting rather minimal O&M liabilities. Then dispersing the reclaimed water in a subsurface drip irrigation field, arrayed as much as possible to serve grounds beautification, the landscaping that would be irrigated in any case, whether the reclaimed water was there or not, so practically maximizing beneficial reuse of the “waste” water resource in the on-lot environment.

The High Performance Biofiltration Concept treatment unit – set forth for distributed treatment duty in “This is how we do it”, and more fully described here – fits the bill here, being by its very nature stable, benign and robust. This is the very treatment technology used, for example, at the highly touted Wimberley “One Water” school, exactly because of that.

Unfortunately this treatment concept is not very broadly known, as the “septic system” market in Texas is so dominated by the “aerobic treatment unit” (ATU), a small, home-sized, bastardized version of the activated sludge treatment process, a process that is by its very nature inherently unstable, and even more so in these downsized incarnations. And, as reviewed in “Averting a Crisis”, the “septic system” regulatory process in Texas is legendary for neglecting ongoing O&M, making it even more critical that “fail-safe” systems like the High Performance Biofiltration Concept be used, especially in a setting like Mirasol Springs.

Noting, however, that assuring “proper” O&M “shouldn’t” be an issue on Mirasol Springs, as the entire “waste” water system, no matter how deployed, would be professionally operated and maintained by the MUD the developer proposes to establish to run the water utilities on this project. All the more reason to use the inherently stable and robust, the more “fail-safe” High Performance Biofiltration Concept treatment unit instead of ATUs, to reduce the load on that O&M system.

Indeed, even the centralized treatment plant the developer proposes would be episodically loaded, with flows rising and falling through the diurnal cycle. So using an inherently unstable activated sludge system for that plant would be a vulnerability, urging the use of the “fail-safe” option there as well.

But again, the major vulnerabilities would be avoided by not centralizing all flows, rather by distributing the system to each building or set of buildings, as would be most cost efficient in each circumstance. Note in particular how this disperses risk. Any problem with the centralized treatment plant would impact on the entire flow, while a problem with any of the distributed treatment units would impact on only a minor fraction of the total flow. And again, there we would be using the low risk “fail-safe” treatment technology.

But the developer foregoes this opportunity, in fealty to the conventional understanding that it is best to centralize all flows to one treatment unit, despite all the pollution potential, and disruption, inherent in gathering flows to that central point. And despite the cost of running the collection lines out to each developed area, and – if the reclaimed water is to be reused for landscape irrigation as the “vision” asserts – the redistribution lines to send water from the central treatment plant to the areas to be irrigated. Again, all that would be avoided under the decentralized concept strategy.

Then there is the matter of the “time value of money”. The centralized system is an “all or none” proposition. The treatment plant would be initially built with the capacity to treat flows from all the buildings on the project, while portions of the development would come on line in phases, so that some of the treatment plant capacity would lie idle in the interim until the project was built out. It is also likely that the collection and redistribution lines would all have to be installed to get the water from all areas to the treatment plant and back to irrigation sites. Here too, investments would sit in the ground, not fully utilized, until the project built out.

A distributed system would avoid all that idle investment. First by not having to install the collection and redistribution lines at all. And then by using the improved type of “septic system” noted above, installed for each development area on a “just in time” basis, to serve only imminent uses. For development other than the “conservation” lots, this would likely be a collective system serving more than one building at each treatment center, but still overall a distributed system, not requiring any investment in the larger-scale collection and redistribution lines. Further realizing the “time value of money” by building only those systems needed to serve imminent development, rather than having to plan, design, permit and install the entire centralized system before service could be provided to the first building.

As for treatment quality, the developer appears to presume a need for “the best technology available for nutrient removal”, even though all the reclaimed water would be dispersed in subsurface drip irrigation fields, providing all the treatment and “buffering” that the soil-plant complex offers. The High Performance Biofiltration Concept treatment unit can consistently and reliably produce an effluent with low – typically about 10 mg/L – BOD and TSS, the two basic measures of how well treated the water is. 20-30 mg/L is deemed “secondary” treatment, which is the minimum required to disperse the reclaimed water in subsurface drip irrigation fields. This is mainly to assure that drip irrigation emitters would not clog, as the level of “dirtiness” of the water as measured by BOD and TSS, as long as it is in the “secondary” range, is otherwise irrelevant in a soil dispersal system.

The High Performance Biofiltration Concept unit can also routinely remove a large majority of the nitrogen from the “waste” water, typically producing an effluent concentration of about 15 mg/L. Less than 20 mg/L is deemed to be a “safe” level that would, along with plant uptake and in-soil denitrification we have in this climate, result in a vanishingly small amount of wastewater-derived nitrogen flowing into environmental waters when dispersed in a TCEQ-compliant subsurface drip irrigation field.

Phosphorus – the pollutant of greatest concern in discharges to streams – would be irrelevant here, since at the concentrations found in domestic wastewaters, phosphorus would be fully “sorbed” in any soil mantle that would provide a decent growing medium.

Bottom line, the High Performance Biofiltration Concept treatment system would deliver an effluent that would be highly protective of the environment, even in this sensitive area, assuming of course that the subsurface drip irrigation systems were well designed, well implemented and well operated. Which of course would be the same condition that the conventional system the developer proposes would have to meet.

The conclusion is that the decentralized concept strategy described here, utilizing distributed “fail-safe” treatment units and dispersing the reclaimed water into subsurface drip irrigation fields, would produce a “waste” water system for this project that would be more fiscally reasonable, more societally responsible, and more environmentally benign than would be offered by the conventional centralized system – with reuse appended on – that the developer proposes.

Stormwater Management

The website is light on how stormwater would be managed on this project. It states under the heading “Watershed Protection and Storm-water Runoff”, “The ultimate goal is to maintain the hydrology of the environment in its current state. This will be accomplished through short-term construction site management strategies that include silt fencing, soil berms and wattles to prevent erosion and silting of nearby streams.” Which seems to confine the effort to mitigating water quality degradation from construction activities. Necessary of course as the development is being built, but the major task is indeed to “maintain the hydrology of the environment in its current state.”

In that quest, the website states, “There will only be a few homesites in the Roy Creek watershed, all with a 1,000-foot land buffer between the home [and] Roy Creek [sic]. The engineers recommend allowing for the native vegetation and soil to act as a natural ‘filter,’ as it has done for thousands of years, rather than trying to capture it and then release it from a pond or other structural water quality controls that would unnecessarily disrupt the natural character of the land.” The actual solution here is restricted to the “conservation” lots only, leaving open what is to be done elsewhere, but implying that the only alternative is the conventional view of stormwater management, that the site “should” be efficiently drained into an “end-of-pipe” facility – “a pond”. It seems to deny the “One Water” strategy of collecting and infiltrating the runoff on a more distributed basis, the Low-Impact Development (LID) strategy utilizing permeable pavement, Green Stormwater Infrastructure (GSI), etc.

The obvious measure of maintaining “the hydrology of this environment in its current state” is to render the rainfall-runoff response of the developed site as close as practical to that of the native site. This dictates that, up to the rainfall depth where runoff would begin on the native site, after all the “initial abstraction” were “filled”, all runoff should be intercepted and caused to infiltrate. While some of that infiltration might be imparted by flow over the “natural filter” in downslope areas, more generally some of that infiltration would have to be “forced”, with permeable pavement or by running it through GSI, such as distributed rain gardens. It seems the developer has not considered this basic “One Water” concept, choosing to rely on a more conventional end-of-pipe management scheme. Perhaps entailing the installation of grey infrastructure to convey flows from developed sites to ponds and such, as seems to be implied in the “Mirasol Water Systems” schematic above. It is called to question how well this could maintain a rainfall-runoff response similar to that of the native site.
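
To put rough numbers on that threshold, one common (if simplified) way to estimate the rainfall depth at which runoff begins is the NRCS curve-number method, in which the potential retention is S = 1000/CN − 10 (inches) and the initial abstraction is commonly taken as about 0.2·S. The curve numbers below are illustrative assumptions for comparing a native Hill Country cover with a developed surface, not values from any study of this site.

    # Rough estimate of the rainfall depth at which runoff begins ("initial abstraction"),
    # using the NRCS curve-number relationships: S = 1000/CN - 10, Ia ~= 0.2 * S (inches).
    # The curve numbers used here are illustrative assumptions, not site-specific values.

    def initial_abstraction_in(cn):
        s = 1000.0 / cn - 10.0   # potential maximum retention, inches
        return 0.2 * s           # rain depth "absorbed" before runoff begins

    for label, cn in [("native brush/grass cover (assumed CN ~ 70)", 70),
                      ("developed area with impervious cover (assumed CN ~ 85)", 85)]:
        print(f"{label}: runoff begins after about {initial_abstraction_in(cn):.2f} in of rain")

Under these assumed numbers, the developed surfaces would begin shedding runoff after roughly a third of an inch of rain, versus nearly an inch on the native cover, and it is that gap the distributed permeable pavement and GSI measures would be sized to close.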

The website further asserts, “Considerations will also include restrictions on impervious cover to prevent run-off and divert water into the aquifer.” Disregarding the non-sequitur, it should be clear that the LID/GSI strategy, infiltrating runoff from impervious surfaces on a highly distributed basis, is the manner in which one could reasonably “prevent run-off and divert water into the aquifer”, particularly on the more intensive portions of the development, like the Inn and resort cottages, perhaps the “branded residential” homesites too. We’d just have to deal with pavement, since rainwater harvesting would basically take rooftops “out of play”. The water that would have infiltrated over the area covered by rooftops would instead be captured, stored and later infiltrated, either directly through irrigation or, after being used in the buildings and becoming “waste” water, through irrigation with the reclaimed water. That concept was explained in this post.

Now as noted the 1,000-foot “land buffer” would indeed be quite effective in mitigating pollution and the increases in runoff imparted by development on the “conservation” lots, but of course there would have to be constructions to cause any concentrated flows to disperse into overland flow, so the scheme would not be quite so “automatic” as the website appears to present it. GSI, such as full infiltration rain gardens, should be installed there as well, to intercept flows off of impervious covers, to directly infiltrate some of the flow and to spread flows over that “natural filter”. This would be particularly so for any rainwater cistern overflows, which would be “concentrated” flows out of a pipe.

The website is totally silent on this LID/GSI approach — the “One Water” approach — implying that the only alternative the developer can conceive to letting stormwater runoff flow away downslope would be to first route it to “a pond or other structural water quality controls that would unnecessarily disrupt the natural character of the land.” Noting of course that it is the disruption of the “natural character” of the land caused by development that any such constructions would be installed to mitigate. Again this seems to reflect the conventional bias for gathering runoff into end-of-pipe “ponds”, rather than running it through highly distributed constructions like full-infiltration rain gardens, with only “large” storm runoff overflowing on down the slope. That would somewhat better mimic the hydrology of the native site.

Summary

So it is that the water “vision” of the developer can readily be called to question. To sum it up, if the developer of Mirasol Springs is going to style its water management scheme as a “vision”, then perhaps it should impart some. It should follow the best “One Water” practices available in this setting, the water supply, “waste” water management, and storm water management practices reviewed above. Presenting the conventional scheme the developer proposes as a “new standard” can be quite fairly seen as simply greenwashing that very conventional scheme. If this project is to deliver on its promise of preserving and protecting this “pristine” landscape, a more holistic, more “One Water” strategy will be required.

Or so is my view of this matter. What’s your view?

Moving off top dead center …

December 6, 2023

A while back I attended a confab put on by the Save Barton Creek Association (SBCA) at which a couple of developments south of Austin, in the vicinity of the City of Hays and the City of Buda, labeled “Hays Commons” and “Persimmon”, were discussed. The presenters, representing SBCA along with the Save Our Springs (SOS) Alliance and the Greater Edwards Aquifer Alliance (GEAA), asserted that the scale and nature of these developments were “incompatible” with the area and represented a threat to water quality, as they were on or adjacent to the environmentally sensitive Edwards Aquifer Recharge Zone. All of this was presented in rather general terms, as the presenters did not have any actual development plans to show, just generalities about the scale and nature of the proposed projects. And the only actual “prescription” the presenters seemed to have was “just don’t build there”.

But of course, this being a capitalist society, the folks who own that land have made investments, and are no doubt responsible to investors who expect a return. So in an attempt to further the conversation, and maybe get into how we might blunt or obviate the problems the presenters asserted would be imparted by the developments they feared, I asked the obvious question: what would you have these folks do with this land instead? This seemed to stump them, but one of the presenters eventually said, “more of the same as what’s there, 1-acre lots on septic.” Apparently unaware of the impacts of that development concept on land fragmentation, and of how unfriendly to water quality the types of “septic” systems the county will approve would be. See, for example, Averting a Crisis.

Leading me to understand, we need to have a conversation about what sort of development folks think “should” be the manner in which much of the Texas Hill Country develops. Because in the current market, develop much of it will. So I set my mind to looking at the land upon which the Hays Commons project was being planned.

Some years ago I had met with the owner of that property, Bill Walters, to talk about the manner in which he might install water and “waste” water service to develop that property. This was years before any thought of extending City of Austin water and wastewater utility lines might have been entertained that far out from the limits of their system at that time, and even back then the limited supply capacity from the aquifer was understood, so our conversation centered on what folks might term today “One Water” practices, that term having come into vogue in the interim. We talked about using building-scale rainwater harvesting (RWH) for water supply, and about distributed (decentralized concept) wastewater systems that would focus on producing a reclaimed water supply to serve non-potable demands, to reduce the draw on the “original” water supply source. Also a heavy dose of low-impact development (LID) stormwater management techniques, to greatly blunt the impacts of development on runoff water volume and quality.

I could see how these concepts might allow Walters to install water and wastewater services on Hays Commons in a manner that would save him time and money while we all saved water. You see, the working plan of Walters’ development partner Milestone is to get the City of Austin to extend water and sewer lines to the project. That would, of course, require a very significant investment up front, to install the waterlines, pump stations, and storage tanks, and to install the wastewater mains and lift stations. Note that in addition to those line extensions, the developer would still also have to install water distribution lines and sewer lines within the development. Which would all take quite a bit of time, during all of which the actual development – the thing that would produce revenue for the developer – would have to cool its heels. And because of those high up front costs and time delays, the developer would be highly motivated to go with a more intense development scheme. That too would entail costs and further time delays to the developer to deal with opposition to denser development from the likes of SBCA, SOS and GEAA, as we’ve already seen.

Enter a better plan, a “total One Water” scheme, to eliminate most of the up front costs and the delays that spending that money would entail, and to also save a whole lot of water. AND it would also greatly blunt water quality degradation. Here is that scheme, as I set it before Walters:

First, thank you for taking some of your time to talk with me about the project next to the City of Hays, about the prospects for considering a water infrastructure plan that would have rather reduced up-front costs and regulatory approvals timeline, and would be more sustainable. We need to be pursuing sustainable water strategies, of course, as that’s really a matter of long-term sustainability, including having the water supplies to continue to develop in this region. A current series of articles in the New York Times is highlighting how America is draining away a “legacy” groundwater supply, overdrawing aquifers all over the country. Which of course is true around here. Indeed, one wonders, are we really acting in a manner such that we’re planning for only One More Generation, and after that, we really don’t have a plan? So maybe this development should be “saved” from becoming just one more exponent of that “one more generation” attitude?

For water supply, you noted that you’ve drilled 3 wells on this property, each of which was found to yield enough flow to support at least some of the desired development. Given the drought-induced issues with declining aquifers around here, one must wonder just how sustainable those yields may be. So maybe consider greatly “extending” that supply with a “conjunctive management” scheme, entailing:

  • Create a PWSS [public water supply system] using one or more of the wells as its water supply source. Determine which buildings would have a level of service that would rise to the level needing the water supply to be a PWSS, and install a limited distribution system to those buildings only. That, along with the permitting of the PWSS, and of course one or more ground storage tanks, would be all of the water supply facilities that would need to be “papered”, financed and installed prior to being able to start selling those lots to builders, and to start building on them.
  • The rest of the development – be they commercial or residential properties – would employ building-scale RWH as the water supply strategy. This would entail no up-front costs, as the whole water supply system would be built as the building is built. And with building-scale RWH at a level of service that does not rise to being classified as a PWSS, there would be very little regulatory burden.
  • Of course, the building-scale RWH systems should be designed and installed so as to be as sustainable “as practical”. This would dictate that building roofprints may need to be larger than would “normally” be the case for the sorts of buildings being considered. And that of course would have impacts on building styles, so the developer, and builders, would have to be willing to consider all that.
  • As for how to accommodate the “large” roofprints, please consider what I call the “veranda strategy” for adding relatively inexpensive roofprint, by adding “verandas” around the building. By this means, expanded roofprints that would render RWH systems sustainable through worse drought periods can be provided relatively cost efficiently (a rough yield calculation is sketched just after this list). So we’d want to look at the sorts of spaces the developer wants to be able to market, and consider whether buildings housing those spaces could be practically built, and marketed, to provide the needed roofprint.
  • Of course, there would be a cost for the cisterns required for each building-scale RWH system, but perhaps this cost could be blunted – if not “relieved” altogether? – by lower lot prices availed by there being very little up-front costs to create an overall water supply system.
  • The PWSS drawing from the well would also be the source of backup water supply for the building-scale RWH systems, if drought became too severe and the building-scale RWH systems’ cisterns became depleted. This would put control of backup supply availability within the development, so that all owners of RWH-served properties would be assured of having a backup supply whenever needed. The backup supply would be delivered from a ground storage tank via tanker truck, so that a backup water distribution system would not have to be installed.
  • Because a very large fraction of the total water supply needed in the development would be provided by the building-scale RWH systems, so “relieving” the aquifer of that routine demand, the aquifer level could be somewhat “preserved”, so that the water would indeed be there if needed as supplemental supply during prolonged drought periods.
  • Note that buildings to be served by building-scale RWH may be started, and marketed, without having to wait for the several month (minimum) permitting process for the PWSS, and for getting those facilities designed, bid and installed, so likely imparting a “time value of money” benefit.
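
To give a feel for what added roofprint buys under the “veranda strategy” noted in the list above, here is a small sketch of the added collection potential, using the standard 0.623 gallons per square foot per inch of rain conversion. The veranda area, rainfall, and collection efficiency are illustrative assumptions, not figures from this property.

    # Rough annual yield of added roofprint: area (sq ft) x rain (in) x 0.623 gal/(sq ft * in),
    # derated by a collection efficiency. All inputs are illustrative assumptions.

    GALLONS_PER_SQFT_INCH = 0.623

    def annual_yield_gal(roof_sqft, annual_rain_in, efficiency=0.85):
        return roof_sqft * annual_rain_in * GALLONS_PER_SQFT_INCH * efficiency

    veranda_sqft = 600    # added covered porch/patio collection area (assumed)
    dry_year_rain = 15.0  # inches in an assumed drought year

    added = annual_yield_gal(veranda_sqft, dry_year_rain)
    print(f"Added supply from {veranda_sqft} sq ft of veranda in a {dry_year_rain}-in year: {added:,.0f} gal")

Even at drought-level rainfall, a modest wrap-around veranda adds several thousand gallons of supply, the sort of margin that can let a cistern ride through longer dry spells before calling on the trucked backup.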

I trust you can see the “charm” of such a scheme, in particular from the developer’s standpoint due to the low up-front cost of water supply infrastructure, and the minimization of regulatory lead time in order to begin selling lots and building on them. Again, this could be so without regard to the overall intensity of development proposed on this property … within limits, of course; there’d have to be room for the larger roofprints and such, but the overall development intensity would have to become rather “extreme” for that to functionally come into play, seems to me.

For the “waste” water system, the idea would be to creatively plat lots for “condo” development, with the total amount of development on any given lot imparting a design flow rate <5,000 gallons/day (gpd), so that the “waste” water systems could all be “septic” systems, or On-Site Sewage Facilities (OSSFs) in rules-speak, permitted at Hays County – nominally a 30-day permitting time – rather than TCEQ-permitted “municipal” systems, entailing a year or more to permit and a whole lot more paperwork, thus more cost for technical and legal services. This scheme would entail:

  • Determining the building types desired to be developed and how they may be arrayed on the property. Then gathering them into groups that would create a total design flow rate <5,000 gpd, and platting lots surrounding each such group of buildings. All of the buildings would be under “condo” ownership, with the ground being owned in common. These are the conditions required in order to use OSSFs for the “waste” water system, since TCEQ kicked “cluster systems” out of Chapter 285, the on-site wastewater rules, in 2003, and has never seen fit to formulate a “middle way”, short of the far more onerous “municipal” permitting process, in the 20 years since.
  • Presently, Hays County rules include a requirement that each lot cover an area large enough that the total design flow rate would not exceed the equivalent of 300 gpd/acre. So for a 5,000 gpd system, the minimum lot size would have to be 5000/300 = 16.67 acres. 5,000 gpd would cover 27 2-bedroom houses or 20 3-bedroom houses, so even with this restriction, a net density of over one house per acre would be attained (a worked check of this arithmetic is sketched just after this list).
  • In working on another project that was proposing to similarly use OSSFs for the “waste” water service, it was floated to Hays County that the 300 gpd/acre requirement might be applied over the whole property, rather than imposed lot by lot. This would allow using non-lotted property – e.g., the floodplain on the property in question – as a part of the required area, and so allow denser development on the lots. Hays County had responded favorably to that proposal. (Unfortunately that project “died” due to the Dripping Springs development moratorium, so that concept was never tested and confirmed.)
  • Presuming that the 300 gpd/acre rule did not limit the total number of units that could be installed on this property “too severely”, or if that rule could be “excused”, then it’s pretty clear that this “waste” water service plan would entail no up-front costs to the overall project developer, other than what Hays County would require during the platting process to show OSSFs could provide the required level of service. This can very readily be done by posing a standard OSSF design using the High Performance Biofiltration Concept (recirculating packed-bed filter) treatment unit, which would disperse the high quality effluent this unit would consistently and reliably produce in subsurface drip irrigation fields. This is the most environmentally benign sort of system that could be practically employed in such a distributed management system.
  • Each lot owner – which may be the overall project developer, or the developer(s) who would build on each lot – would then plan, design and install the OSSF to serve the buildings on that lot. This process would entail fairly minimal lead time, after the lots had been platted and made available for building upon.
  • The drip fields would ideally be arrayed to irrigate the highest value landscaping on each lot, likely to be grounds beautification around each building, or high value common areas, like a park or “common”. Note it is not at all uncommon to import soil to create “improved” soils for landscaping around houses and other buildings in the Hill Country, so we’d be placing the drip fields in the best soils on the property. In any case, the soil depth would have to be shown to meet the rules over the whole field area, something that’s not very well “guaranteed” in land application systems under the municipal permitting process.
  • By these means, most (all?) of the grounds irrigation would be taken off the potable supply, so rather drastically conserving the overall level of water use on this property.
  • There would of course be a management system created. Formally, each OSSF would have to be covered by a maintenance contract with a TCEQ-qualified/licensed maintenance company. In practice it would be most rational to have one “master contract” that would maintain all the OSSFs on the whole development, making this essentially the “wastewater utility” covering this development. By assuring that this management system was organized and run to properly oversee the OSSFs, long-term good performance could be practically assured.
  • It may be that some buildings – like the commercial buildings that it is understood would be part of the desired development plan – could implement a flush-water recycling scheme, further saving water supply. In such buildings, only a small “residual” wastewater flow would be created by water used in lavatories, slop sinks, break room sinks, etc., and that flow could be readily dispersed over the landscaping around such buildings. Again optimally focused on maintaining the highest value landscaping, the grounds beautification. That sort of scheme was discussed in Appropriate Technology.
  • Finally, by not collecting all the “waste” water in a conventional centralized scheme, we’d avoid the environmental liabilities inherent in that configuration, due to line leaks, manhole overflows, lift station bypasses, and from the disruption inherent in installing conventional sewer lines. So this distributed scheme is inherently more environmentally benign simply by how it is arrayed.
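
As a worked check of the density arithmetic in the 300 gpd/acre item above: the per-house design flows used below (180 gpd for a 2-bedroom house, 240 gpd for a 3-bedroom house) are assumptions consistent with the 27-house and 20-house counts cited in that item, not figures taken from any plat or rule text.

    # Worked check of the lot-size and density arithmetic, under assumed per-house design flows.
    # 180 gpd (2-bedroom) and 240 gpd (3-bedroom) are assumptions consistent with the
    # house counts cited above, not values from any plat or rule text.

    MAX_SYSTEM_GPD = 5000           # keep each OSSF below the TCEQ "municipal" threshold
    AREAL_LIMIT_GPD_PER_ACRE = 300  # the Hays County areal loading requirement cited above

    min_lot_acres = MAX_SYSTEM_GPD / AREAL_LIMIT_GPD_PER_ACRE
    print(f"Minimum lot size: {min_lot_acres:.2f} acres")  # -> 16.67 acres

    for label, gpd_per_house in [("2-bedroom", 180), ("3-bedroom", 240)]:
        houses = MAX_SYSTEM_GPD // gpd_per_house
        print(f"{label} houses: {houses} per lot, about {houses / min_lot_acres:.1f} houses/acre")

Which confirms that, even with the 300 gpd/acre restriction applied lot by lot, a net density a bit above one house per acre is attainable.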

I trust you can also appreciate the “charm” of this scheme for the project developer, as it minimizes up-front costs for the “waste” water infrastructure and time delays for dealing with regulatory processes.

Part and parcel of the overall development scheme would be a very robust LID/green infrastructure stormwater management scheme. This would be designed to retain at least as much water on the land as infiltrates, rather than runs off, on the “native” site, so as not to “desertify” the land by draining “away” the increased runoff imparted by covering the land with impervious surfaces and otherwise modifying the land surface. By this means, water quality impacts of stormwater runoff would also be practically minimized. That scheme could be rather readily designed into each lot as it is developed, so here also entailing minimal lead time to design and install. That strategy would minimize any need for end-of-pipe ponds and such, which may also pose an up-front cost and time delay that could be avoided.

By going with this sort of sustainable water infrastructure scheme, society would be saving a lot of water – that is my main interest, indeed getting society to create more sustainable water systems, to start planning beyond “one more generation” – while saving the developer time and money.

Unfortunately, when presented with this “total One Water” scheme, Walters and Milestone proved to be strangely uninterested in saving time and money, and as far as I know are still pursuing utility extensions from the City of Austin. But you can see the potential here for a development scheme that would indeed save time and money for the developer, save water for society, and would significantly blunt water quality degradation. It also avoids sprawl-inducing utility extensions, the cost of which would militate for more, and more intense, development, exactly the outcome we’d like to avoid on this land. This sort of win-win-win might provide a template for how the environmentally sensitive Hill Country is to be developed, again noting that in the current market in this region, a whole lot of it will be developed.

So I would urge SBCA, SOS, GEAA and their allies to consider all this, to move off top dead center, and urge developers to be a bit more creative and innovative than going with “1-acre lots on septic.”

Or so is my view on this matter. What is your view?

“One Water” = the “Decentralized Concept”

October 16, 2023

I am part of a group that is putting together a course, perhaps to be offered by Texas A&M, to teach the “One Water” concept, culminating with a “certificate” that is expected to have some meaning for practice in this arena. The major activity in the first gathering of that group was to set forth ideas on what the course content should cover. As a “conversation starter” I offered the group a rundown on what the “components” of a “One Water” scheme might include. A major theme of that document was that “One Water” schemes would be basically “distributed” concepts. Let’s explore that whole idea.

A working definition of “One Water” is offered by the Meadows Center for Water and the Environment at Texas State University as:

“One Water is an intentionally integrated approach to water that promotes the management of all water – drinking water, wastewater, stormwater, greywater – as a single resource.” [emphasis added]

This definition can be illustrated in practice by these examples:

  • Wastewater and water supply can be integrated by designing and developing the “waste” water system to focus effort and resources on producing a reclaimed water supply, preferably close to the point of reuse, which would be utilized to defray demands on the “original” water supply to the project.
  • Stormwater management and water supply can be integrated by capturing the additional runoff caused by development in cisterns and/or landforms, rather than allowing it to “efficiently” drain “away”. The captured water can be used as explicit water supply – e.g., building-scale rainwater harvesting, capturing roof runoff into cisterns to defray, or displace, demands on the “original” water supply (a minimal cistern balance is sketched just after this list). Or it can be used to enhance the hydrologic integrity of the site by maintaining the rainfall-runoff characteristics of the “native” site on the developed site, holding water on the land instead of “desertifying” the land by draining “away” water that would otherwise have infiltrated to maintain deep soil moisture, to recharge aquifers, etc.
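
Here is the minimal cistern balance referred to in the second example above, a monthly mass balance on a building-scale rainwater harvesting cistern. The roof area, cistern size, demand, rainfall series and capture efficiency are all hypothetical, chosen only to show the bookkeeping.

```python
# Minimal monthly mass-balance sketch of a building-scale rainwater
# harvesting cistern used to defray demand on the "original" supply.
# Roof area, rainfall, demand and cistern size are illustrative assumptions.

ROOF_AREA_SQFT = 2_500
CISTERN_CAPACITY_GAL = 20_000
MONTHLY_DEMAND_GAL = 5_000
CAPTURE_EFFICIENCY = 0.85        # assumed losses to first flush, splash, etc.
GAL_PER_SQFT_IN = 0.623

# Assumed monthly rainfall, inches (a hypothetical dry-ish year)
rain_in = [1.5, 1.8, 2.0, 2.2, 3.5, 3.0, 1.0, 1.2, 2.5, 3.2, 2.0, 1.8]

storage = CISTERN_CAPACITY_GAL / 2    # assume the cistern starts half full
supplied, shortfall = 0.0, 0.0
for r in rain_in:
    inflow = r * ROOF_AREA_SQFT * CAPTURE_EFFICIENCY * GAL_PER_SQFT_IN
    storage = min(storage + inflow, CISTERN_CAPACITY_GAL)  # overflow spills
    draw = min(storage, MONTHLY_DEMAND_GAL)
    storage -= draw
    supplied += draw
    shortfall += MONTHLY_DEMAND_GAL - draw

print(f"Demand met from the cistern: {supplied:,.0f} gal "
      f"({supplied / (12 * MONTHLY_DEMAND_GAL):.0%} of annual demand)")
print(f"Makeup needed from the 'original' supply: {shortfall:,.0f} gal")
```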

These examples highlight a fundamental imperative of “One Water” practice – to maximize the sustainability of water supplies, these integrated management strategies must focus on addressing all water as a resource to be husbanded, not as a “nuisance” to be gotten rid of, to be wasted as expeditiously as possible, which has been the focus of conventional stormwater and wastewater management practice. Of course, the water does not actually “go away” – the hydrologic cycle is a closed system on a global scale – but those practices are wasteful in that they expend resources to route the water out of, rather than to maximize its beneficial use within, the immediate environs. Each gallon that is so externalized (wasted) is another gallon that must be imported into the project, and (in this region particularly), all existing water supplies are becoming increasingly strained.

A basic principle, highlighted for example by Paul Brown of CDM – a voice from the very heart of the mainstream – in his preface to Cities of the Future, is that water is most sustainably managed by maximizing the beneficial use of these resources within the project, as much as practical, “tightening up” the loops of the hydrologic cycle, rather than externalizing those resources and then importing water from “traditional” supplies to make up for these externalized – wasted – resources. By following this “One Water” practice, the developer, the eventual users of the project, and the community-at-large will realize fiscal, societal, and environmental benefits.

A simple schematic comparing a non-integrated – or “silo’d” – system with an integrated management system is shown in the figures below, illustrating how husbanding of the water resource may be enhanced by this “tightening” of the water loops. As this illustrates, the same overall function might be obtained while imparting about one-half the draw on the “original” water supply to a project.
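
The arithmetic behind that “about one-half” outcome can be roughed out as follows. The demand split and the reuse and rainwater-harvesting fractions are hypothetical stand-ins, not values read from the schematic.

```python
# Rough sketch of how "tightening the loops" roughly halves the draw on the
# "original" water supply. The demand split and the capture/reuse fractions
# are hypothetical illustrations, not values from the schematic itself.

indoor_demand = 100.0      # arbitrary units
irrigation_demand = 60.0

# "Silo'd" system: every demand is met from the original supply,
# while wastewater and stormwater are simply routed away.
silo_draw = indoor_demand + irrigation_demand

# Integrated ("One Water") system: reclaimed wastewater serves irrigation,
# and harvested roof runoff serves part of the indoor demand.
reuse_to_irrigation = min(0.85 * indoor_demand, irrigation_demand)  # assumed 85% of indoor use returns as reusable flow
rwh_to_indoor = 0.20 * indoor_demand                                # assumed rainwater harvesting offset
integrated_draw = (indoor_demand - rwh_to_indoor) + (irrigation_demand - reuse_to_irrigation)

print(f"Silo'd draw on original supply:     {silo_draw:.0f} units")
print(f"Integrated draw on original supply: {integrated_draw:.0f} units")
print(f"Ratio: {integrated_draw / silo_draw:.2f}")
```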

As this illustrates, the “One Water” methods would integrate into the land plan, rather than being appended on, divorced from the land plan, as is typical under the “silo’d” conventional management strategies. In short, water resources sustainability would be designed into the very fabric of each project. And thus, we see that “One Water” would typically be most effectively accomplished by decentralizing the water resources management infrastructure.

Before proceeding, it is important to acknowledge that context is important, and in each circumstance, the full range of available options should always be evaluated, including both “centralized” and “decentralized” schemes. So while it is presented here that distributed infrastructure seems typically more likely to produce the better “One Water” schemes, there will of course be instances in which a more conventional looking “centralized” infrastructure would prove to be the “better” option. For example, in an area with the conventional centralized “waste” water system architecture already in place, the “best” option for assuring that “waste” water realizes its resource value may be the installation of a “purple pipe” system to redistribute water from the centralized treatment plant to points of reuse.

Moving on … While as noted the intention should always be to integrate the various water management functions, in practice each of them – water supply, “waste” water management, and storm water management – is typically addressed individually, with the “integrations” generally falling out of the means and methods that are utilized. Referring to the examples listed above, water supply and “waste” water management could be integrated by utilizing the “waste” water resource to defray demands on the “original” water supply source for the project being served by that “waste” water system. Note that the water supply system itself would not be “perturbed” by this scheme; it’s just that part of the demand would now be met by the “waste” water being reused to create that adjunct supply.

Generally, as also noted above, that integration, the defraying of water supply demand, would be maximized by designing that whole process into the very fabric of the development. Which brings us to the “decentralized concept” of “waste” water management.

Noting that the “One Water” concept generally entails “tightening” the water loops, integrating water management into the very fabric of development, it becomes rather obvious that “waste” water systems would be “distributed”, rather than “regionalized”, as is the mantra of much of conventional practice. The “decentralized concept of ‘waste’ water management” – set forth quite consistently by the author over the last 37 years – embodies this whole idea. This is an alternative organizing paradigm for a “waste” water system of any overall scale. Here is the basic idea, as set forth in a 1988 paper by the author:

“Stated simply, the decentralized concept holds that ‘waste’ water should be treated—and the effluent reused, if possible—as close to its source of generation as practical. In particular, it is suggested that the first stage of treatment—for which a septic tank is preferred, as outlined later—be placed at or very near to the ‘waste’ water source, regardless of how centralized the rest of the system is. The conventional “on-site” system might be viewed as the ultimate embodiment of this concept, and on-site systems might indeed be the appropriate technology for parts of the service area. In areas where soil and/or site conditions dictate that conventional on-site systems would not be environmentally sound, the septic tank effluent might be routed through further treatment processes before dispersal or reuse.

“However, the concept is very “elastic”.  In practice, it may be beneficial to aggregate several waste generators into one septic tank or to route septic tank effluent from several generators into a collective treatment and dispersal system.  The most appropriate level of aggregation at any stage of treatment would be determined by a number of considerations, such as topography, development density, type of land use, points of potential reuse, or locations where discharge is allowable.

“Judicious choice of technology at each stage of the collection and treatment system can help advance fiscal, societal and environmental goals.”

Note in particular three items in that description:

  1. The term “waste” is in quotes to highlight that “wastewater” is a complete misnomer for designating what, within a “One Water” concept, must be addressed as a water resource to the maximum extent practical in each circumstance.
  2. The over-arching aim of “wastewater” management is not “disposal” – as basically defines conventional practice – rather it must be focused on producing a reclaimed water, to be used to defray demands on the “original” water supply to the project being served by the “waste” water system, to the maximum practical extent in the circumstances at hand.
  3. With the overall system distributed, practical operation of multiple distributed treatment units demands “judicious choice” of the technologies used to assemble the system.

This all leads to identifying the basic “tools” of the decentralized concept, as has been set forth by the author and others in any number of works in this field, those tools being:

  • Effluent sewerage is highly favored for any collection of “waste” water beyond the building site level. This is the concept of intercepting flows at, or very near to, the site of their generation in septic tanks – termed “interceptor tanks” within the effluent sewerage concept because they intercept the “big chunks”, leaving only liquid effluent with very low levels of settleable solids to be transported any further, so allowing the use of effluent sewers, instead of conventional “big pipe” sewers. This reduces the cost of the “remnant” collection system that remains in the distributed system and minimizes the environmental impacts of the collection system, practically eliminating leaks, bypasses and overflows, not to mention minimizing the degree of disruption entailed in installing the smaller, shallower effluent sewer collection lines. (A rough capacity check on such small-diameter lines is sketched just after this list.)
  • Treatment beyond the septic tank must be done with “fail-safe” technologies. The term “fail-safe” is in quotes because any sort of treatment unit must be properly operated and maintained so as to continuously and reliably produce the expected effluent quality, but certain technologies, by dint of their very nature, are more robust and “forgiving”, and so can stay “on track” with rather minimal O&M effort and attention. This is essential to creating a management system that would not become overtaxed by needing to police the multiple treatment centers that following the decentralized concept would create. Which types of treatment units fall into the “fail-safe” category, and how to design those units, will likely be a matter of opinion in this field. It is the author’s long-held view that inherently stable technologies like the recirculating packed-bed filter and constructed wetlands be highly preferred in lieu of the activated sludge process that is practically the “knee-jerk” choice in conventional practice. This is because the activated sludge technology is inherently unstable, an effect that becomes more problematic in “smaller” treatment plants – really in anything but a “base loaded” plant, a condition extant only in large “regional” treatment plants.
  • Again, the fate of the effluent, the reclaimed water, is to serve water uses that would need to be met whether or not the effluent were available for that usage, applying the reclaimed water to serve those demands and so defray demands on the “original” water supply source. It is to be expected that irrigation is likely the most “available” and readily served use in many cases, in particular in climates such as exist in this region. For that use, subsurface drip irrigation – itself a “fail-safe” technology of sorts – should be preferred, to maximize irrigation efficiency – also of course a basic “One Water” principle – and to sequester the reclaimed water underground to minimize potential for human contact in the highly distributed reuse sites. Other uses may include toilet flushwater supply, cooling tower blowdown makeup, and perhaps even laundry water supply.
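
As a rough illustration of why effluent sewers can be so much smaller than conventional sewers, here is a capacity check on a small-diameter gravity line using the standard Manning equation. The pipe size, slope, roughness, number of homes, per-home flow and peaking factor are all assumptions chosen for illustration; actual effluent sewer design involves considerations not captured here.

```python
import math

# Rough capacity check on a small-diameter effluent sewer line, using the
# standard Manning equation for full-pipe gravity flow (US units).
# Pipe size, slope, roughness, flows and peaking factor are all assumptions.

D_FT = 3.0 / 12.0        # assumed 3-inch pipe
SLOPE = 0.01             # assumed 1% slope
N = 0.010                # assumed Manning roughness for smooth plastic pipe

area = math.pi * D_FT ** 2 / 4.0          # flow area, sq ft
hyd_radius = D_FT / 4.0                   # hydraulic radius of a full pipe
q_cfs = (1.49 / N) * area * hyd_radius ** (2.0 / 3.0) * math.sqrt(SLOPE)
q_gpd = q_cfs * 7.48 * 86_400             # capacity if flowing full all day

# Assumed service: 60 homes at 180 gal/day each, with the interceptor tanks
# attenuating peaks to an assumed factor of 3 on the average rate.
peak_gpd = 60 * 180 * 3

print(f"Full-pipe capacity of the assumed 3-inch line: {q_gpd:,.0f} gal/day")
print(f"Assumed peak flow delivered to it:             {peak_gpd:,.0f} gal/day")
```

Under those assumptions, even a very small line carries the flow with ample margin, which is the point of intercepting the solids at the source.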

An example of how these tools could be employed to create a decentralized concept system strategy was reviewed in “This is how we do it”, showing the benefits, in particular to practically maximize the resource value of the “waste” water, of the decentralized concept scheme vs. a conventional development-scale system or a conventional centralized system.

Thus we see that in terms of “waste” water management, “One Water” basically equals the “decentralized concept” of “waste” water management. This strategy focuses investments on properly treating and effectively reusing the water resource, rather than on just moving the stuff around – which is all the far-flung collection system within the conventional centralized system architecture does – by eliminating most of that collection system. Employing creative system concepts and more “fail-safe” technologies, this scheme also makes the overall system cost effective to operate, and blunts environmental impacts inherent in “waste” water management.

Under a “One Water” strategy, water supply and storm water management would also be generally more distributed systems than they are under conventional practice.

The “traditional” view of storm water management centers on “efficient” drainage “away” from the development of increased runoff imparted by development, so as not to impart nuisance flooding of project grounds, and to assure that as this water flows “away”, it does not create downstream problems, either due to streambank erosion or overbank flooding. This “efficient drainage” viewpoint often results in whatever constructions used to attain those aims being “end-of-pipe”, essentially appended onto the project, rather than distributed facilities designed into the very fabric of the development.

An essential “One Water” strategy is to “retain, not drain” the runoff, at least up to the point that this water would have infiltrated rather than flowed “away” from the “native” site. The aim is to maintain as much as practical the “hydrologic integrity” of the site – that is, to maintain the rainfall-runoff response as similar to that of the “native” site as practical – and by treating multiple sites in a watershed in this manner, to maintain rather than degrade the hydrologic integrity of the watershed.

The means by which this would be accomplished would be implementing the “low-impact development” (LID) concept, imparting rainwater harvesting – either “formally” in cisterns or by capturing runoff in specialized landforms generally termed “green stormwater infrastructure” – on a highly distributed basis. Again, designed into rather than appended onto the development. When I first learned of LID, it immediately hit me that this is basically a “decentralized concept of storm water management”. So here too, “One Water” = the “Decentralized Concept”.

An example of this sort of distributed LID concept was illustrated in “… and Stormwater Too”, showing how rainwater harvesting from rooftops, permeable pavement, and rain gardens (bioretention beds) would create a highly efficient scheme to both hold more water on the land and to utilize the rooftop runoff to further defray landscape irrigation demands.

Another example is applying this concept to a big-box store parking lot, as is illustrated here. This shows how to indeed maintain, or better, the rainfall-runoff response of the “native” site, while providing water quality treatment just as a matter of course as the water flows through the site.

Regarding water supply, not only would building-scale rainwater harvesting (RWH) typically be a component of the storm water management scheme – so potentially creating a supply that could be used to defray water demands rather than just being drained “away” after being detained in the cistern – but some, even all, of the water supply strategy could be based on building-scale RWH. This all plays into what I have termed the Zero Net Water concept.

True to its name, Zero Net Water is a water management strategy that would result in zero net demand on our conventional water supplies – rivers, reservoirs and aquifers. Under the Zero Net Water development concept, water supply is centered on building-scale rainwater harvesting, “waste” water management centers on project-scale reclamation and reuse, and stormwater management employs distributed green infrastructure to maintain the hydrologic integrity of the site. So basically it’s really another name for “One Water”. Together these result in minimal disruption of flows through a watershed even as water is harvested at the site scale and used – and reused – to support development there. In the general case, this concept might only be approached – you might call it “minimum net water” – defraying but not eliminating demands on the conventional water supply sources.
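
The bookkeeping implied by the name can be sketched in a few lines. All of the demand and yield numbers below are hypothetical, chosen only to show how the “net” is computed and how, with less favorable numbers, the result becomes “minimum net water” rather than a true zero.

```python
# Sketch of the "net water" bookkeeping behind the Zero Net Water idea.
# Every number here is a hypothetical illustration, not data for any project.

indoor_demand = 75_000        # gal/yr, assumed, served by building-scale RWH
irrigation_demand = 45_000    # gal/yr, assumed

rwh_supplied = min(75_000, indoor_demand)      # assumed cistern yield
reclaimed_available = 0.6 * indoor_demand      # assumed 60% of indoor use returns as reusable flow
reuse_supplied = min(reclaimed_available, irrigation_demand)

total_demand = indoor_demand + irrigation_demand
net_draw = max(total_demand - rwh_supplied - reuse_supplied, 0)

print(f"Total demand:                      {total_demand:,.0f} gal/yr")
print(f"Met by rainwater harvesting:       {rwh_supplied:,.0f} gal/yr")
print(f"Met by reclaimed 'waste' water:    {reuse_supplied:,.0f} gal/yr")
print(f"Net draw on conventional supplies: {net_draw:,.0f} gal/yr")
```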

Our conventional supply systems are watershed-scale rainwater harvesting systems, utilizing reservoirs, aquifers and streams as the “cisterns” to hold the water supplies awaiting various uses. Note that an essential water use is maintenance of environmental integrity – e.g., maintaining aquifer levels so as not to reduce spring flows, and keeping enough flow in streams to service various environmental functions, including delivery of water into bays and estuaries to maintain those ecologies. So we need to be saving as much water as we reasonably can, and minimizing disruption of flow through the watershed. The “minimum net water” approach would serve that end.

While it will not be belabored here, distributing the water supply system to the building scale creates an inherently more efficient system, as it eliminates transmission losses and evaporation losses from reservoirs, and it requires less energy, as the water needs to be moved only short distances, with small elevation heads. It would also be more economically efficient, as the water supply would be “grown” – thus paid for – in fairly direct proportion to water demand, one building at a time.
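
The energy point can be illustrated with the basic pumping-energy relation (energy scales with the volume moved times the head it is lifted or pushed through, divided by pump efficiency). The heads, volume and efficiency below are assumed round numbers for illustration, not measurements of any actual system.

```python
# Rough sketch of the pumping-energy point: energy scales with the head the
# water must be lifted/pushed through. Heads, volume and pump efficiency
# below are illustrative assumptions only.

RHO_G = 9810            # N/m^3, unit weight of water
VOLUME_M3 = 380         # about 100,000 gallons, assumed annual household use
PUMP_EFFICIENCY = 0.6   # assumed wire-to-water efficiency

def pumping_kwh(total_head_m):
    joules = RHO_G * VOLUME_M3 * total_head_m / PUMP_EFFICIENCY
    return joules / 3.6e6    # convert J to kWh

# Assumed total dynamic heads: a cistern pump pushing water a few meters
# vs. a conventional system lifting and transmitting water over long distances.
print(f"Building-scale RWH:   {pumping_kwh(15):,.0f} kWh/yr")
print(f"Centralized delivery: {pumping_kwh(150):,.0f} kWh/yr")
```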

In closing, a more recent focus of “One Water” practice is the capture of air conditioner (AC) condensate, applying the water so captured to providing water supply, to defray demands on the “original” water supply source to the site. Clearly, condensate capture would also be a very highly distributed strategy, executed at the building or campus scale.

So it is that not only does “One Water” = the “Decentralized Concept” in the “waste” water management arena, “One Water” would optimally be a rather highly distributed strategy for water supply and storm water management as well. Moving from the conventional centralized “waste” water system architecture to the decentralized concept strategy has been a challenge for the mainstream of this field. Over the almost 4 decades I’ve been advocating for that paradigm shift, very little movement has been seen on that front. It is to be expected that moving off “end-of-pipe” management to a highly distributed LID strategy, and integrating building-scale RWH into the water supply strategy on a broad scale will be similarly challenging to the field, as these moves also require a paradigm shift.

Despite the “One Water” concept having been “a thing” in this field for many years now, despite years of “happy talk” around moving society toward a “One Water future” and such, we have seen precious little of all this hitting the ground. So much so that, when a local environmental activist queried the U.S. Water Alliance – a major “cheerleader” for “One Water” – on examples where “total One Water” schemes, entailing all the water management functions, could be found, the response was … [crickets chirping]. One wonders, how much is that lack of movement in practice due to the paradigm shift to more decentralized schemes not having taken hold in the water resources management field? Indeed, we have a lot more work to do before it will become broadly recognized that “One Water” = the “Decentralized Concept”. But until that happens, it’s highly likely the movement toward “One Water” will continue to be very slow.

Or so is my view on this matter. What’s your view?

Be a beaver

May 13, 2023

I recently read the book Water Always Wins by Erica Gies. A major theme is that to best manage water resources as they flow through our environment, in particular to promote and enhance sustainable water supplies, we need to be imparting a “Slow Water” regime. We need to use management methods that blunt the rushing off of runoff and instead install means and methods that slow the flow, causing it to spread so that more of the water soaks in instead of running off. By these means, Gies argues, we can retain, and restore, the hydrologic integrity of sites – and, by so treating a multiplicity of sites, of the watershed – so that it would deliver more water supply.

A chapter in the book is about beaver, the undisputed champion in the animal kingdom at modifying and improving the hydrology of watersheds. The instinctive dam-building activities of beaver indeed create a Slow Water regime. They dam the flow, causing water to spread across the floodplain, filling up the “sponge” that landform creates, so storing much more water in the watershed than would occur otherwise. This slowing and retention of the water decreases downstream flooding, as more rainfall is detained/retained rather than zooming downstream. A good bit of this retained water would over time augment downstream baseflow, so improving the riparian ecology. The dams also retain sediment and so improve downstream water quality. All of this also stores carbon, and so helps to blunt climate change.

It struck me while reading that chapter that being like a beaver and creating a Slow Water regime is essentially what we did in the Villa Court project. The 3505 Villa Court project lies between Garden Villa and South 5th Street, in the Bouldin Creek watershed in South Austin. It’s a 13-unit townhome project that was installed on 1.43 acres of formerly “vacant” land. The final layout of the project imparted 67% impervious cover of the land.

In 2010, I was approached by PSW, the developer of many townhome projects in Austin, to help them obtain the water quality permit they would need to execute this development. They related that a senior level person in the City of Austin Watershed Protection Department had told them, “You need a retention-irrigation system.” Won’t bore you with the details of that method, except to say it would have entailed encumbering the back yards of all the townhomes along the downslope edge of the property with tanks to store the runoff, and taken up just about every square inch of greenspace in the development plan to “irrigate” the water gathered into those tanks. I looked at that for, oh, about 2 or 3 nanoseconds, then said, “This is insane.”

I told them, we could provide the water quality treatment for this project with rain gardens, and we could do that without needing any variances. This surprised them, and the folks at Watershed Protection too, as no one had run this idea at them before that. But I showed them that the rules did indeed support the rain garden scheme. Besides meeting the formal water quality management requirements, the rain gardens would capture and insoak the “excess” runoff created by development and keep about as much water on the land as had been soaked up by the land in its “natural” state. This despite a rather high percentage of the land having impervious cover. So we proceeded to be like a beaver.

“Rain garden”, as a formal water quality device, is a term often used for a full infiltration bioretention bed. This device is a vegetated bed of a specialized soil mix that intercepts runoff from its drainage area. Under Austin rules, the volume of water to be captured is termed the “water quality volume”. This is the volume of runoff from the drainage area generated by the “water quality depth” of runoff from the area. The water quality depth required to be captured under Austin rules increases with the percentage of impervious cover over the drainage area tributary to each rain garden. The idea is that the more impervious cover, the more the balance would be shifted from infiltration to runoff of rainfall, so a larger volume of water would need to be captured in order to blunt that shift.

The captured water volume would be stored in the bed until it is infiltrated into the soil under and surrounding the rain garden. This process both intercepts pollution entrained in the runoff – “treating” it as it flows down through the biofiltration bed root zone – and retains the water quality volume on site, rather than allowing it to run directly off, so helping to control and mitigate downstream flooding and channel erosion.
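
The sizing logic just described can be sketched as follows. The relation “water quality volume = water quality depth × contributing drainage area” is as described above, but the depth-versus-impervious-cover function, the drainage area, the garden area and the infiltration rate used here are hypothetical stand-ins, not the actual City of Austin criteria or the Villa Court design values (only the 67% overall impervious cover figure comes from the project description).

```python
# Sketch of the rain garden sizing logic described above:
# water quality volume = water quality depth x contributing drainage area,
# with the required depth increasing with impervious cover. The depth
# function, areas and infiltration rate here are hypothetical stand-ins,
# NOT the actual City of Austin criteria.

GAL_PER_SQFT_IN = 0.623

def water_quality_depth_in(impervious_fraction):
    # Hypothetical: half an inch plus more as impervious cover rises.
    return 0.5 + 0.5 * impervious_fraction

drainage_area_sqft = 8_000          # assumed area draining to this garden
impervious_fraction = 0.67          # the overall project figure, assumed here for one garden

wq_depth = water_quality_depth_in(impervious_fraction)
wqv_gal = wq_depth * drainage_area_sqft * GAL_PER_SQFT_IN
wqv_cuft = wqv_gal / 7.48

# Drawdown time if the bed ponds that volume over an assumed 400 sq ft of
# garden and the underlying soil accepts an assumed 0.5 inch per hour.
garden_area_sqft = 400
ponded_depth_in = 12 * wqv_cuft / garden_area_sqft
drawdown_hours = ponded_depth_in / 0.5

print(f"Water quality depth:  {wq_depth:.2f} in")
print(f"Water quality volume: {wqv_gal:,.0f} gal ({wqv_cuft:,.0f} cu ft)")
print(f"Approximate drawdown time: {drawdown_hours:.0f} hours")
```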

The 3505 Villa Court water quality management plan entailed several rain gardens, distributed around the site. Five of them intercepted runoff from rooftops, pavement and green spaces. Another seven captured runoff from a rooftop that could not be routed to one of the other five rain gardens, so that these areas of impervious cover would be “treated”. The project layout is shown in the figure below.

This shows how indeed we intercepted, spread and infiltrated runoff flows. As noted, this restores to some degree the rainfall-runoff response of the “natural” site, as it blunts the increase in runoff immediately leaving the site that installing impervious cover on the site would otherwise impart. This indeed mimics to an extent the impacts that beaver have on a stream, only this is applied in the uplands rather than in a streambed, intercepting and insoaking the increased runoff on its way to a stream. But the ultimate impact is largely the same, damming up the flows, holding – storing – water on the land. And reaping the benefits of that.

Some of this water would infiltrate and ultimately migrate to a stream, to impart baseflow, just as water seeping out of the floodplain “sponge” that beaver create would impart baseflow downstream. As noted, the rain gardens also intercept and store sediment entrained in runoff, just as beaver ponds do, and also intercept and “treat” pollutants that developing land causes to be entrained in runoff. Then too, there would be carbon sequestration in the rain garden beds. So in imparting the water quality management scheme at Villa Court, we were being like a beaver, mimicking the ecological wisdom they evolved to deliver to the landscape.

Now of course the retention-irrigation scheme, entailing interception of runoff and subsequently spraying it over landscape to largely infiltrate the water that would have otherwise run off, would have also helped to maintain the hydrologic integrity of the site. But that would have been acting like a human, using a failure-prone mechanical system, requiring assured power supply, to redistribute the water over the site. Note that little actual irrigation benefit would be obtained by spraying the captured runoff over the green spaces, because the storage tanks have to be evacuated only a couple of days after the rain fell – rain which had just “irrigated” those spaces. Any sediment removal would have to be accomplished by “actively” cleaning the storage tanks, into which the “raw” runoff would flow. Rather than incorporating the sediment into the plant-soil system on the site just as a matter of course, this would create a “waste” stream that would have to be “disposed of”. Then too, the shallow-rooted turf that would be “irrigated” would not sequester anywhere near the level of carbon that the deep, more biologically complex rain garden root zone would.

All this illustrates we would be well served to be like a beaver in the way we design sites for water quality management, in the manner we did at Villa Court. Damming up water flows to create a Slow Water regime at the site level, holding up and spreading out the flows, infiltrating on the site “excess” runoff imparted by development, rather than it rushing away, so retaining/restoring the hydrologic integrity of the site. And by a multiplicity of such sites in a watershed, we can retain/restore the hydrologic integrity of the watershed. It could fairly be stated that if every site in the Bouldin Creek watershed had been treated like Villa Court was, we would still have baseflow in Bouldin Creek. And that would enhance the ecology of the whole watershed.

Almost like re-introducing beaver to the watershed would. So when considering how to manage storm water as a site is being developed, be a beaver.

Comment

March 28, 2022

March 28, 2022

Office of the Chief Clerk, MC 105

Texas Commission on Environmental Quality

P.O. Box 13087

Austin, TX 78711-3087

COMMENTS TO TCEQ REGARDING “PRISTINE STREAMS PETITION”

A petition, styled as the “Pristine Streams Petition”, petition no. 2022-014-PET-NR, has been filed with the Texas Commission on Environmental Quality (TCEQ) requesting that TCEQ adopt a rule that would bar any discharges from TCEQ-permitted “waste” water treatment plants into 23 stream segments that monitoring has found to have very low “native” concentrations of phosphorus. The aim of this proposed rule is to keep these stream segments “pristine” in regard to their phosphorus concentrations, and thus free of the “excess” algae growth that, as has been documented, “waste” water effluent discharges into such streams will induce.

I agree that such a rule should be promulgated, in an attempt to accomplish its stated aim. This rule is truly the minimal effort that the State of Texas should consider in an effort to blunt degradation of water quality in the waters of the state. It is in fact rather astounding that it would take a petition to get TCEQ to consider such a rule, as protecting “pristine streams” from the sort of degradation due to “waste” water discharges that have been witnessed is really in the “no-brainer” category. But as is reviewed below, establishing such a rule really needs to be just the beginning point of a larger effort to improve the state of “waste” water management in the state of Texas, including – if not especially – in regard to the fate of this water resource that we mistakenly identify as “wastewater”.

You see, a very basic problem that plagues this societal function is that TCEQ insists upon viewing this function through the lens of “disposal of a nuisance”, when really what society needs – in this region in particular, with the looming water supply issues we face – is for us to address this function with a strong emphasis on “utilization of a resource”. As TCEQ appears to “understand” this function, it is required that a full-blown “disposal” system be in place before an applicant can even start to consider the resource value of the water and how best to reuse this water so as to defray demands on the “original” water supply.

The practical outcome of TCEQ’s insistence on the “disposal of a nuisance” focus is that it is difficult to design reuse into the very fabric of development, so as to practically maximize the reuse benefit of this water resource. Which is how we should be addressing this societal function, just as a matter of course.

Under the “disposal of a nuisance” construct, the permitting process is typically a two-part endeavor; first a permit for the “disposal” system, and then another permit to route this water resource to reuse. The “clunkiness” of this process is an open-ended “invitation” to applicants to just throw this water resource away, to just do the “disposal” system and not incur the extra work, and cost, of addressing the reuse part. That is why it has been witnessed that the majority of “land application” systems are in effect “land dumping” systems, with the effluent routed to “disposal fields” – where any irrigation benefit is just coincidental – as the manner in which to make this water “go away”. Indeed, in every reference to the Dripping Springs application to expand its dispersal fields, for example, the proposed fields are called “disposal” fields. Indeed, while Dripping Springs asserts that they intend to reuse their effluent rather than “dispose” of it, it appears that all their efforts to date indeed focus on “disposal”, with no actual reuse system infrastructure being apparent. This is a dynamic that has to be blunted, with the process reoriented to focus on the beneficial reuse of this water resource, if local society is to most effectively address its looming water supply crisis.

Then too it must be recognized that centralization of “waste” water issuing from miles around to one point source treatment plant is the whole predicate for even considering stream discharge as the fate of this water resource. Indeed it is a motivator to just discharge into a stream to “get rid of it”. Once the “waste” water has been gathered to that one point source, stream discharge will always be the “cost efficient” way to “manage” that flow (depending perhaps on whether TCEQ would assign effluent limits that would not degrade the receiving stream’s water quality, which it does not seem inclined to do), especially given the high cost that would already be incurred to gather that flow to the centralized point source. A large majority of the total cost of a conventional centralized “waste” water system is the cost of the collection system, investments that really do nothing but move the stuff around, not really contributing to resolution of the root problem of managing that water resource. So once a community has dedicated a considerable investment to making that water “go away”, they would be less than excited to incur a similar investment to redistribute this water resource from the centralized point source to points where it could be beneficially reused, often being back where the water came from in the first place!

So it is that we should be considering how we can decentralize the “waste” water system, to shorten the water loops so as to minimize the investment dedicated to just moving the stuff around, and better enable designing reuse of this water resource into the very fabric of development. But this runs afoul of TCEQ’s “regionalization” policy, stated in Texas Water Code Section 26.081. That this holds sway over the institutional infrastructure that addresses this societal function is attested to in Dripping Springs’ “moratorium” ordinance, which states that the city “understands” the conventional centralized system architecture to be the only form of “waste” water system infrastructure that is “blessed” by TCEQ. As if this is religion, not science and technology, so that no considerations of the form and function of that infrastructure that do not conform to the “Book of TCEQ”, Chapter 26, verse 081, are allowed to be countenanced. A moment’s reflection should be plenty to conclude that there is more than one way to skin this cat. But at present it appears that most of our institutional infrastructure will not countenance any such reflection. Again, as if this is religion.

The need for re-examining the infrastructure model is perhaps most critical in exactly the areas such as those where those 23 “pristine” streams lie. A major reason they remain “pristine” is because there has been little development in areas tributary to these streams. So these are places where we have a “blank slate”, where we are not beholden to sunk costs in the conventional centralized system architecture, so could readily entertain another infrastructure model. That was, for example, the situation in which Dripping Springs found itself, yet again they chose fealty to the “Book of TCEQ” over a rational consideration of the full range of options available to them.

One such option is a “decentralized concept” strategy. Cut to its most basic, the decentralized concept holds that “waste” water is most effectively and efficiently managed by treating, and reusing to the maximum practical extent, the “waste” water as close as practical to where it is generated. As noted, this approach would work with, rather than against, the whole idea of integrating reuse into the very fabric of development, as if management of this water as a resource from its very point of generation was a central point. Rather than considering that whole matter as something you might append on to redistribute the water gathered at the end of the pipe, as if the whole matter of this water being a resource was just an afterthought.

This “decentralized concept” idea has been out there for decades, having been the subject of many works considering the concept, much of it funded and conducted under the auspices of EPA. Indeed, a “finding” was issued by Congress in the 1990s that this general approach is a legitimate manner of addressing “waste” water management needs. It is generally understood among those who have chosen to examine this matter that the decentralized concept strategy has the potential to produce “waste” water systems that are more fiscally reasonable, more societally responsible, and more environmentally benign than those systems which implement the conventional centralized system architecture. But in practice any meaningful consideration of the road not taken has been blunted, so that good examples of such practice remain rather few and far between. In no small part due to such circumstances as “regionalization” being an “article of faith” in TCEQ.

However, in the context of “developing” areas, this matter has been presented to TCEQ, and to society’s various institutional actors – city administrations and utility operators, the engineering community, the development community, and the environmental community. An example of this was presented on the Waterblogue in 2014, entitled “This is how we do it” (https://waterblogue.com/2014/09/24/this-is-how-we-do-it/, considered to be part and parcel of this comment), showing in the context of one development in the Dripping Springs hinterlands how a reuse-focused decentralized concept strategy could work in that environment, in a manner that would not only integrate reuse into the very fabric of the development, thus maximizing ability to defray demands on the “original” water supply, but would also be more globally cost efficient than centralizing that development into a Dripping Springs “regional” system. The fiscal, societal and environmental advantages of this strategy were further reviewed on the Waterblogue in 2016, in the piece entitled “Let’s Compare” (https://waterblogue.com/2016/09/26/lets-compare/, considered to be part and parcel of this comment). This application of the concept was explicitly reviewed with TCEQ, and it was the conclusion of the folks with whom it was reviewed that this strategy could be permitted under current rules. So it could readily deliver those fiscal, societal and environmental advantages in many developments in the hinterlands, such as those that may be installed on or near the “pristine” streams, NOW.

So it is that there is ample reason to reconsider the “dogma” of “regionalization” and to consider the road not taken. In particular in areas where there is little sunk cost in the ground that must be respected going forward, such as the areas where, in the main, those 23 “pristine” streams are located. Reinventing the “waste” water system infrastructure model in those sorts of areas can deliver systems that are more fiscally reasonable, more societally responsible and more environmentally benign than would be attained by pressing down on the cookie cutter and spewing out the conventional centralized system architecture, without regard to the nature of the circumstances. TCEQ must examine this matter and consider how it can best serve the citizens of Texas in regard to how the “waste” water resource is to be managed, especially around those “pristine” streams.

So again, please do protect those 23 “pristine” streams from “waste” water discharges. This is simply a “no-brainer” thing to do. But also open up the whole process of planning for how growth and development will be managed in such areas, to take a long hard look down the road not taken. As that all the difference could make. [apologies to Robert Frost]

Respectfully submitted,

David Venhuizen, P.E.

Austin, Texas