Archive for the ‘Uncategorized’ category

Water for DFW – Building-scale rainwater harvesting vs. Marvin Nichols

August 7, 2014

In the last post we reviewed the potential of building-scale rainwater harvesting (RWH) as a water supply strategy in the high-growth area around Austin, in Central Texas. Here, we examine its potential in another high-growth area of Texas, the Dallas-Fort Worth area, commonly called the Metroplex. And then we will contrast that strategy with doubling down on the watershed-scale rainwater harvesting strategy, as may be represented by the proposed Marvin Nichols Reservoir.

To gain an appreciation for the potential of building-scale RWH in and around the Metroplex, modeling was executed for the following locations: Athens and Terrell to the east-southeast, Ferris closer in to the south, Cleburne to the southwest, Weatherford to the west, Bowie to the northwest, Sherman to the north-northeast, and Denton closer in to the north-northwest. Ringing the Metroplex, these locations offer an overview of conditions all around it.

As was the case for the modeling results of the Central Texas locations, the modeling showed that “right-sized” building-scale RWH systems around the Metroplex would have provided 97-99% of total interior supply through the recent drought period for houses modeled with a presumed average water usage rate of 45 gallons/person/day. But around the Metroplex, the “right-sized” systems would be somewhat smaller than would be required around Austin. Recall that the “right-sized” system there to serve a 4-person household would require a roofprint of 4,500 sq. ft. and a cistern volume of 35,000 gallons. In Bowie, Weatherford and Cleburne, the “right-sized” system for a 4-person household would require only 3,750 sq. ft. of roofprint, paired with a 25,000-gallon cistern in Cleburne and Weatherford and a 27,500-gallon cistern in Bowie. All other locations would require 3,250-3,500 sq. ft. of roofprint and 20,000-25,000 gallons of cistern capacity. Since a one-story house plan with a 2-car garage plus a “typical” area of covered patios/porches could be expected to provide a roofprint of 3,000-3,500 sq. ft., these modeling results indicate that many houses in and around the Metroplex would not require any “extra” roofprint to be added on.

As reviewed in the last post, a usage rate of 45 gallons/person/day should be readily attainable by most people, given a house fitted with the current stock of water fixtures, and a lower rate could be routinely attained by people even moderately attentive to conserving water. If a usage rate of 40 gallons/person/day were routinely attained around the Metroplex, the “right-sized” systems that would have provided 97-100% of total interior supply for a 4-person household through the recent drought period would require 3,000-3,500 sq. ft. of roofprint and 17,500-20,000 gallons of cistern capacity.

Just as in Central Texas, with the baby boomers reaching retirement age and demographics tending toward more one and two-person households in all age groups, a significant part of the market might be made up of houses that could be “right-sized” for a 2-person occupancy. Modeling this occupancy around the Metroplex, at a water usage rate of 45 gallons/person/day the “right-sized” system that would have covered 97-99% of total interior demand would require 1,750-2,000 sq. ft. of roofprint and a cistern capacity of 10,000-15,000 gallons. At a water usage rate of 40 gallons/person/day, a “right-sized” system covering 97-100% of interior demand would require a roofprint of 1,750 sq. ft. and a cistern capacity of only 10,000 gallons, except for Bowie where a 12,500-gallon cistern would have been required. Since it is expected that a one-story house plan plus garage or carport and a modest area of covered patios/porches would provide about 2,000 sq. ft. of roofprint, this market could use building-scale RWH without requiring any “extra” roofprint, and would incur relatively modest cistern costs.

So the water supply potential of building-scale RWH around the Metroplex is pretty clear. Yet there is not a mention of this strategy in the planning documents of state planning Region C, the area around the Metroplex. Actually there is no respect shown for this strategy in any of the regional plans, and the state water plan explicitly dismisses it, stating, “While it is often a component of municipal water conservation programs, rainwater harvesting was not recommended as a water management strategy to meet needs since … the volume of water may not be available during drought conditions.” Which is to say that because a “right-sized” system may need 1-3% of the total supply from alternative sources during severe drought periods, this strategy is deemed not to exist at all!

This is likely due to the water planners being guided by a mental model that does not comprehend building-scale RWH as a consciously chosen broadscale strategy, as perhaps the water supply strategy for whole developments. This was the subject of an investigation, funded by the Texas Water Development Board, that I ran a couple of years ago, which brought out that this strategy confers a number of advantages relative to conventional – or watershed-scale RWH – water supply systems. One of the issues considered was provision of backup supply, but only on the basis of the “mechanics” of delivering it. Not being fettered by the mainstream’s mental model, I had not thought to question the whole strategy just because some small amount of backup supply would no doubt be needed – indeed, the whole idea of “right-sizing” was to cover water demands in all but the worst drought periods and plan on providing a backup supply, presuming that the relieved capacity offered by building-scale RWH would make such a supply available from the sources so relieved.

Still, this raises the question of exactly where that backup supply would come from. As noted in the last post, the building-scale RWH strategy should be considered in the context of “conjunctive management”. Building-scale RWH would divert the vast majority of the demand off of the conventional sources, so decreasing the routine drawdown of those supplies, thus leaving in them the capacity to provide the small amount of backup supply. Of course, if it is presumed that any development on building-scale RWH is in addition to rather than in place of development drawing from those conventional supplies, and that this other development would be of such extent that it would tax the available supply sources during those drought periods, then there may indeed be a question of whether the capacity to provide backup supply for building-scale RWH systems would be available. It will require another whole study to examine how a conjunctive management concept could work in practice. Until the mainstream water planners can get around their mental model and recognize the inherent potential of building-scale RWH, however, it is unlikely that any such study would get funded.

Around the Metroplex, however, modeling shows that, unless the drought gets more severe than has been experienced since 2007, essentially 100% of interior demands could be provided by upsizing the roofprint and/or cistern volume only a modest amount above what is reported above. The worst case would be in Bowie, where a roofprint of 4,000 sq. ft. and a cistern capacity of 30,000 gallons would be required for a 4-person household using water at a rate of 45 gallons/person/day.

So we can provide interior water usage with building-scale RWH, but why should we, rather than continuing to expand and perpetuate the watershed-scale RWH strategy? Consideration of the problems and hazards of building Marvin Nichols Reservoir offers some insights into that.

Marvin Nichols Reservoir would be located in northeast Texas, about 115 miles east-northeast of the Metroplex. The Region C report offers this about that project:

“As a major reservoir project, Marvin Nichols Reservoir will have significant environmental impacts. The reservoir would inundate about 68,000 acres. The 1984 U.S. Fish and Wildlife Service Bottomland Hardwood Preservation Program classified some of the land that would be flooded as a Priority 1 bottomland hardwood site, which is “excellent quality bottomlands of high value to key waterfowl species.” … Permitting the project and developing appropriate mitigation for the unavoidable impacts will require years, and it is important that water suppliers start that process well in advance of the need for water from the project. Development of the Marvin Nichols Reservoir will require an interbasin transfer permit to bring the water from the Sulphur River Basin to the Trinity River Basin. The project will include a major water transmission system to bring the new supply to the Metroplex.”

Unstated is that many people in the area that would be impacted are highly opposed to this project, due in large part to those “unavoidable impacts.” This is a battle of economic interests – those in the Metroplex that purport a need for this water vs. those, such as the timber producers, that would be eliminated by the reservoir. Indeed, the official position of the planning process in planning Region D, where the reservoir would be located, is in opposition to the project, and it is not included in their plan. This contrasts with deriving “new” water supply from building-scale RWH, which would have positive economic impacts in Region C – benefiting businesses that would design, install and maintain the building-scale RWH systems – and no negative impacts in Region D.

As noted, utilizing in the Metroplex any of the water collected in this reservoir would require a huge investment in transmission facilities – pipelines and pump stations – and on-going operating costs to maintain them and for energy to run the pumps. Of course the water would need to be treated, also entailing considerable energy requirements. Since it takes water to make energy, this would cut into the water use efficiency from this source. And making that energy would also generate greenhouse gases, which would exacerbate the already problematic impacts of climate change on regional water resources. This contrasts with the building-scale RWH strategy, which would not require any transmission facilities and would require far less energy to treat and pressurize the water for use within the building.

As the Region C report states, it will take a long time to permit and build this reservoir and the transmission facilities, meaning delivery of the first drop of water is decades away. In contrast, the building-scale RWH strategy could begin delivering water supply immediately, and grow in lockstep with demand, one building at a time.

The passage from the Region C report refers only peripherally to the ecosystem services that flooding the land would eliminate or damage, noting only loss of habitat for “key waterfowl species”, without quantifying how critical to the well-being or survival of those species that loss may be. That of course would be sorted out in the process of preparing the environmental impact analysis that will be required as part of the permitting process, another expense that would be obviated by the building-scale RWH strategy. But those ecosystem services go well beyond their impact on birds. Eliminating the timberlands would forfeit the oxygen production and carbon sequestration they provide, along with habitat for many other plants and animals. Forests are also important to maintaining water quality and to the storage and release of water for environmental flows, which would instead need to be provided “artificially”, with reservoir water of degraded quality, including thermal impacts. None of these “externalities” figure into the cost of water projected for this strategy, significantly “warping” the analysis.

There would also be significant losses from the watershed-scale rainwater harvesting system this reservoir would create. Huge evaporation losses from the reservoir would be incurred, and there would be significant losses in the transmission system. In contrast, the building-scale RWH strategy would suffer no such losses.

The Region C report also states, “… the unit cost [of the water supply the reservoir would provide] is less than that of most other major water management strategies.” While at the end of the day the overall direct cost of Marvin Nichols Reservoir and its required infrastructure might be less than the aggregate direct cost of the number of building-scale RWH systems that would provide equivalent supply – which, it is noted, has not been developed in the Region C report for comparison – much of the cost of the former would need to be expended well in advance of delivering the first drop of water to the Metroplex, and all that investment would be at risk. The costs of the building-scale RWH strategy, on the other hand, would be incurred incrementally, one building supply system at a time, so the delivery of supply would pretty directly track the capital requirements. This works with the “time value of money” to lower the global long-term cost of the building-scale RWH strategy. So it is not at all clear that the global cost of the Marvin Nichols option, even neglecting the externalities which the Region C report ignores, would be less.
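The “time value of money” point can be illustrated with a toy present-value comparison. The discount rate, horizon and capital figure below are purely hypothetical, not drawn from the Region C report; the point is only that the same nominal capital costs less in present-value terms when spent incrementally:

```python
# Sketch of the "time value of money" argument. All figures are
# purely illustrative, not from the Region C report.

def present_value(cashflows, rate):
    """Discount a list of annual cashflows (year 0 first) to the present."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(cashflows))

rate = 0.04                      # hypothetical discount rate
years = 30                       # hypothetical planning horizon
total_capital = 3_000_000_000    # hypothetical total capital, dollars

# Reservoir-style: most capital spent up front, before any water flows
upfront = [total_capital] + [0] * (years - 1)

# RWH-style: same nominal total, spent evenly as buildings are added
incremental = [total_capital / years] * years

print(f"Up-front PV:    ${present_value(upfront, rate):,.0f}")
print(f"Incremental PV: ${present_value(incremental, rate):,.0f}")
```

Under these assumptions the incremental spending pattern comes out well below the up-front one in present-value terms, which is the sense in which delivery tracking capital “works with” the time value of money.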

In summary, broadscale implementation of building-scale rainwater harvesting may provide sufficient supply so that the conventional sources would be sufficiently “relieved”, allowing growth to be sustained without requiring new reservoirs. And it may do so at a cost that would be competitive with the global costs of continuing to extend and perpetuate the watershed-scale rainwater harvesting strategy, which would require going far afield to obtain additional new supply. Yet this is, quite consciously, the road not taken by the water planners in Region C. Or, as noted, anywhere else in the state where building new reservoirs, raiding remote aquifers, and other conventional supply strategies are purported to be needed to support projected growth. Time to re-evaluate?

 

Rainwater Harvesting for Water Supply – By The Numbers

July 3, 2014

In “Zero Net Water” the case was made for centering water supply on building-scale rainwater harvesting (RWH). Here we look in more detail into the potential of that strategy to provide water supply in Central Texas, parts of which are forecast to have considerable population growth over the next few decades. Since it is in new development where the Zero Net Water concept would be best applied, this area is a prime target for that strategy.

As reviewed in “Zero Net Water”, a modeling process was used to determine the “right-size” of a rainwater harvesting system to supply interior usage in houses. Modeling was executed presuming a 4-person occupancy in “standard” subdivisions and a 2-person occupancy in subdivisions targeted at seniors. A “right-sized” system is one that has a roofprint and cistern volume relative to the expected water demand profile such that backup supply would only be required in the worst drought years, and even then would be rather limited. This is specified so that the demand for backup supply in these houses from our “normal” supply sources would be minimized, and in recognition that a trucked-in backup supply – expected to be the dominant mode of providing that supply for a number of reasons that are not belabored here – would be stressed if backup supply requirements were not so limited.
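As a rough illustration of how such a model works – this is a minimal sketch, not the author's actual model, and the rainfall series, collection efficiency and sizing below are hypothetical assumptions – a daily mass balance over roofprint, cistern and demand might look like this:

```python
# Minimal daily mass-balance sketch for "right-sizing" a building-scale
# RWH system. All parameters and the rainfall series are illustrative.

GAL_PER_SQFT_INCH = 0.623   # gallons of runoff per sq. ft. per inch of rain

def simulate(rain_inches, roof_sqft, cistern_gal,
             occupants=4, gpcd=45, efficiency=0.85):
    """Return the fraction of total interior demand met from the cistern."""
    demand = occupants * gpcd          # gallons/day
    stored = cistern_gal               # assume the cistern starts full
    supplied = 0.0
    for rain in rain_inches:
        stored += rain * roof_sqft * GAL_PER_SQFT_INCH * efficiency
        stored = min(stored, cistern_gal)   # overflow is lost
        draw = min(demand, stored)          # any shortfall needs backup
        stored -= draw
        supplied += draw
    return supplied / (demand * len(rain_inches))

# Hypothetical dry year: a handful of rain events, nothing otherwise
rain = [0.0] * 365
for day, inches in [(30, 1.2), (95, 2.5), (160, 0.8), (250, 3.0), (300, 1.5)]:
    rain[day] = inches

coverage = simulate(rain, roof_sqft=4500, cistern_gal=35000)
print(f"Demand covered from the cistern: {coverage:.0%}")
```

A “right-sizing” exercise would run a model like this over a multi-year historical rainfall record, varying roofprint and cistern volume until backup supply is needed only in the worst drought years.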

First we examine locations tributary to the Highland Lakes, which currently provide the water supply for Austin and much of the area around the lakes, including such fast-growing places as Bee Cave and Dripping Springs. The inherently greater efficiency of building-scale RWH vs. watershed-scale RWH noted in “Zero Net Water” is illustrated by modeling these locations in that tributary area: Brownwood, Burnet, Fredericksburg, Llano, Menard, San Saba and Spicewood. Only in Brownwood and Menard, located further to the north and west in this area, does the modeling indicate that any backup supply would have been required after the extreme drought year of 2011, while the “right-sized” RWH systems would have provided all the interior water supply since then in all the other locations. This contrasts with how the lakes have “performed” as the watershed-scale “cistern” over that period, as they remain chronically low, not “recovering” after 2011 in the way the “right-sized” building-scale RWH systems would have.

The “right-sized” building-scale RWH systems would have provided 95-98% of the interior demands over the recent drought period at these locations. Using building-scale RWH for interior water supply would have relieved the lakes of having to provide that supply, thus they would have been drawn down more slowly if that had been a broadscale practice. So even though backup supplies to provide the 2-5% deficit may have been drawn out of the lakes – or withdrawn from streams flowing into them – the overall result would have been to significantly conserve region-wide water supply over the modeling period.

Now looking at Austin proper, and at Dripping Springs, as representative of the high-growth areas in this region, we see that a “right-sized” building-scale RWH system would have provided 96-98% of interior demands in the recent drought period through 2013. Indeed, even with 2014 having been very dry well into May, the models show that no backup supply would have been required to date in 2014 as well.

Based on a modeled demand rate of 45 gallons/person/day and an occupancy of 4 persons, “right-sized” systems for single-family homes around Austin and Dripping Springs require 4,500 sq. ft. of roofprint and a 35,000-gallon cistern to have provided 97-98% of interior demand through the current drought period. These are fairly large, and would impose significant costs, so the impact of better demand control – water conservation – was also examined.

A demand rate of 45 gallons/person/day is reported by the American Water Works Association to be routinely expected for a residence equipped with state-of-the-art fixtures in which the users give “reasonably” conscientious attention to demand control – e.g., it presumes minimal leakage losses, “reasonable” showering time, etc. It is understood, however, that better demand control is readily attainable. My personal experience is a case in point. According to our winter water bills my wife and I have an average interior demand rate of 37 gallons/person/day for our two-person household. As we are served by the watershed-scale Austin Water RWH system, not a building-scale RWH system, we have no particular impetus to “highly” conserve, as would a rainwater harvester who could see the cistern volume dwindling when rain is scarce. The only “highly” efficient appliance in our house is a front-loading washing machine; all the rest are 1990s-era fixtures. One can conclude, therefore, that something in the range of 35-40 gallons/person/day is a demand rate that is readily attainable without any “crimping” of lifestyle.

Indeed, a lower demand rate is typically presumed by those who design and install building-scale RWH systems, with 35 gallons/person/day being routinely presumed. So the models were also run using demand rates of 40 and 35 gallons/person/day. At 40, a “right-sized” system that would have attained that same 97-98% coverage of interior water demand requires 4,000 sq. ft. of roofprint and a 30,000-gallon cistern. At 35 gallons/person/day, 96-97% of interior demand would have been covered with a 3,500 sq. ft. roofprint and a 25,000-gallon cistern. All these results presume 4-person occupancy in the house, which is above what demographics indicate is the average household size in most single-family residential developments around Austin and in the Hill Country, so it is expected that this sizing criterion would adequately supply the demands in most new houses.

These findings indicate that attaining very good demand control can significantly decrease the scale of facilities needed to “right-size” the building-scale RWH system, which would significantly reduce their costs. A single-story house plus garage and a “normal” area of covered porches/patios might provide 3,500 sq. ft. of roofprint, so an RWH house “right-sized” for a demand rate of 35 gallons would not require “extra” roofprint to be fit into the plan, and so would not entail a cost increase to provide the required roofprint. And with the cistern being the costliest component of a building-scale RWH system, reducing its size contributes significantly to rendering the overall system more cost efficient.

With the baby boomers coming to retirement age, and single people and “DINKS” (dual income, no kids) being significant demographics, many building-scale RWH systems may be sized to serve 2-person households, for which the “right-sized” systems would be much smaller. Modeling in Austin and Dripping Springs shows that, with a demand rate of 45 gallons/person/day, a roofprint of 2,500 sq. ft. and a cistern volume of 17,500 gallons would have covered 97-98% of interior demands through the recent drought period. At a demand rate of 40 gallons/person/day, this result would have been attained with a roofprint of only 2,000 sq. ft. along with that 17,500-gallon cistern. If demand rate averaged 35 gallons/person/day, then a roofprint of 2,000 sq. ft. along with a 12,500-gallon cistern would have covered 97-98% of total interior demand. A small single-story house plus garage or carport and a “reasonable” area of covered porch/patio would provide that 2,000 sq. ft. roofprint, thus requiring no “extra” roofprint to be paid for. So, with significantly smaller cisterns being required, this market could more cost efficiently employ a building-scale RWH water supply strategy.

A model was also run covering the drought of record period from the late 1940s to the mid-late 1950s. The worst portion of that drought was from 1950 to 1956. Model results show that for all the scenarios reported above, a “right-sized” building-scale RWH system would have covered 92-95% of the interior water demands through that period. Comparing the rainfall deficits relative to long-term averages, it is seen that the 1950-1956 period was somewhat more “intense” than the recent drought period; while 2011 was the worst year on record, overall the current drought has not (yet) approached the severity of the drought of record. Even under the drought of record condition, however, it is seen that a “right-sized” building-scale RWH system would have provided the vast majority of interior water demands.

Many commercial and institutional buildings would also have a roofprint to water demand ratio that would be favorable to building-scale RWH. For example, a system for a two-story office building in which water usage rate is 5 gallons/person/day (typical toilet and lavatory use by an office employee) might have provided ~99% of water demand through the recent drought period. Whole campuses of such buildings might be built without having to install any conventional water and wastewater infrastructure, using wastewater treated at the building scale, perhaps supplemented by condensate capture, to supply toilets and all irrigation of the grounds, so allowing a smaller cistern to be installed, or allowing a higher water usage rate – e.g., to also cover food service – while still providing essentially all the demand. Capturing roof runoff in the RWH system would also reduce the stormwater management problem in such a development, enhancing the benefit of this strategy.
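As a rough plausibility check of the office-building case – all figures here (roof size, occupancy density, workdays, local rainfall) are assumptions for illustration, not the post's modeled data – an annual supply-versus-demand comparison might look like:

```python
# Back-of-envelope annual balance for a hypothetical two-story office
# on building-scale RWH. All inputs are illustrative assumptions.

GAL_PER_SQFT_INCH = 0.623   # gallons of runoff per sq. ft. per inch of rain

roof_sqft = 20_000          # footprint of a hypothetical two-story office
annual_rain_in = 30         # roughly typical annual rainfall, Central Texas
efficiency = 0.85           # assumed collection losses

# Two floors at an assumed ~200 sq. ft. per employee
occupants = 2 * roof_sqft // 200
work_days = 250
demand = occupants * 5 * work_days   # 5 gallons/person/work day

supply = roof_sqft * annual_rain_in * GAL_PER_SQFT_INCH * efficiency
print(f"Annual supply: {supply:,.0f} gal vs. demand: {demand:,.0f} gal")
print(f"Supply/demand ratio: {supply / demand:.1f}x")
```

Under these assumptions annual collection comfortably exceeds annual demand, which is why a modestly sized cistern could plausibly carry such a building through all but the deepest droughts.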

We can see therefore that building-scale RWH has great potential for relieving stress on the watershed-scale RWH systems that compose our “normal” water supply strategies, and could blunt the need for such high-cost options as desalination, direct potable reuse, or long-distance transfers from remote water sources. So even though building-scale RWH is relatively expensive in capital costs, it may be cost efficient relative to other options, while also offering low long-term operating costs.

One of those costs is for energy to pump and treat water. Building-scale RWH is a strategy that would entail relatively low energy use. Since the water loop is “tight”, water would be pumped only very short distances with little elevation head to overcome. This would save even more water, since it takes water to produce electricity to drive pumps – the so-called “water-energy nexus”.

On the basis of water usage efficiency, then, the building-scale rainwater harvesting strategy is well worth serious consideration as a major means of serving the increasing demands which would be imparted by the projected growth in Central Texas. The same can be demonstrated for other high-growth regions in Texas, such as the Dallas-Fort Worth area.

Yet the present State Water Plan utterly rejects building-scale RWH as having any merit as a water supply strategy. I am told the reason for this is because the mental model of our controlling institutions sees building-scale RWH as “unreliable” because the cisterns may run dry during severe drought and require those minor fractions of total supply to be added to them from other sources. The counter to this is to think of it as “conjunctive management” of the total water resource, with the RWH systems diverting demand from other sources, decreasing their routine drawdown so that they have the capacity to provide the backup supply.

This highlights that, as noted in “Zero Net Water”, there are challenges to be addressed, but those challenges may be less problematic than those posed by desalination, direct potable reuse or long-distance transfer schemes. So water policy makers should be called upon to recognize this clear potential and to incorporate this strategy into their water planning going forward.

It is noted in closing that the analyses reported in this post addressed only interior water usage. As reviewed in “Zero Net Water”, that concept envisions exterior usage – irrigation – to be largely supplied by localized reclamation and reuse of the “waste” water produced in the buildings being supplied by building-scale rainwater harvesting. In itself that tight-looped “decentralized concept” of wastewater management is a more highly efficient strategy – in regard to both money and water – than the conventional long-looped “regional” system, as was generally reviewed in “It’s the infrastructure, stupid”. That aspect of the Zero Net Water concept will be further considered in a future post.

 

Stormwater Management Can Be “Green” Too

March 16, 2014

Meet Dr. Katherine Lieberknecht. She is a professor in the University of Texas School of Architecture who proposes the revolutionary idea that stormwater runoff can – and should – be managed as a water resource, rather than as nuisance to be drained “away” as “efficiently” as practical. This is “revolutionary”, of course, only to the conventional mindset, whose nuisance-centric mental model of stormwater management unfortunately continues to hold sway over much of the regulatory machinery and design community. Coming to understand the management strategy that Katherine advocates and to move that to the fore of mainstream practice is a deep conservation strategy; it is how we can make stormwater management “green” too, helping to move us toward sustainable water.

Katherine suggests capturing natural cycles of water movement through the landscape at the neighborhood scale, asserting this would lead to cost savings – including, notably, energy savings by minimizing irrigation needs – and would enhance habitat value of the landscaping. Medians, rights-of-way, parks – just about any greenspace can be made multi-functional, designed to hold onto stormwater instead of to shed it, serving as beautification and accomplishing water quality and quantity management goals. In this vein, Katherine suggests that a community attitude should be fostered. Shared, collaborative and cooperative “storage” of stormwater is urged, utilizing otherwise relatively hydrologically functionless areas – for example church grounds – to hold more water on the land. Throw in green roofs and the areas covered by buildings can also be made more hydrologically functional, holding water on the site instead of making it flow “away”. Or harvest the roof runoff and store it in cisterns to provide irrigation supply, which also effectively holds more water on the land.

The landscape-based strategy can be manifested in two basic ways. One is to design the greenspace as “formal” water quality management devices, such as bioretention beds, that infiltrate the water that flows into them. The other is to use a type of landscaping, such as wildflower meadows or restored native prairie, that naturally holds more water on the land, and perhaps to enhance that by utilizing permaculture techniques that create micro-ponding areas to further increase the amount of runoff that gets infiltrated. Consulting a table of curve numbers (CN) – a parameter that determines the propensity to shed or infiltrate runoff; the higher the number, the more runoff – shows that for a Group D soil (the class that produces the most runoff), a conventional turf landscape would have a CN of about 84 while a wildflower meadow or native prairie would have a CN of about 73. This would create a very significant reduction in runoff, and result in a whole lot more water held on the land, contributing to deep soil moisture and so sustaining the landscape better through drought periods.
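The runoff difference those curve numbers imply can be checked with the standard SCS runoff equation; the 3-inch storm used here is an illustrative choice, not a specific design storm from any regulation:

```python
# SCS curve-number runoff estimate: Q = (P - 0.2*S)^2 / (P + 0.8*S),
# where S = 1000/CN - 10 (inches) and Q = 0 when P <= 0.2*S.

def scs_runoff(p_inches, cn):
    """Runoff depth in inches for rainfall p_inches over a surface with CN cn."""
    s = 1000.0 / cn - 10.0    # potential maximum retention (inches)
    ia = 0.2 * s              # initial abstraction
    if p_inches <= ia:
        return 0.0
    return (p_inches - ia) ** 2 / (p_inches + 0.8 * s)

storm = 3.0  # inches, illustrative storm depth

turf = scs_runoff(storm, cn=84)      # conventional turf, Group D soil
meadow = scs_runoff(storm, cn=73)    # wildflower meadow / native prairie

print(f"Turf (CN 84):   {turf:.2f} in of runoff")
print(f"Meadow (CN 73): {meadow:.2f} in of runoff")
print(f"Reduction:      {1 - meadow / turf:.0%}")
```

For this storm the CN 73 landscape sheds well under two-thirds of the runoff of the CN 84 turf, with the balance held on the land, which is the “very significant reduction” referred to above.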

The native landscape would also demand far less routine maintenance – very little mowing, no fertilization, little or no irrigation. This brings up a critical aspect, Katherine stresses: follow-through with appropriate O&M, to assure that the hydrologic function of this distributed system is maintained. Indeed, in setting forth this overall concept to the local regulatory system, it is concern over the O&M costs of many small installations that is its major objection to “mainstreaming” this concept. This highlights that we must use low-O&M strategies, like “passive” infiltration rain gardens and low-maintenance native plantings, as the mode of implementing this distributed concept.

But considering this concern for O&M also highlights the inherent resilience of this decentralized concept of stormwater management. The “failure” – however one might define that – of any one distributed component – e.g., “clogging” of a rain garden – impacts only a very small part of the overall system. And so that overall system would continue to provide good overall performance, assuming there is sufficient O&M provided so that the occasional such failures would be detected and corrected before they become so widespread as to meaningfully impact the overall system. Again, by choosing practices which would entail low O&M to begin with, such as choice of landscaping and passive infiltration devices, that “sufficient O&M” would entail a fairly low liability.

Another component of keeping O&M manageable would be education, so that the property owners would understand what that landscaped depression in the corner of their lot is and what function it provides, so that it would be less likely that they might do something “stupid”, like fill it in or radically alter the landscaping. The literature on Low-Impact Development (LID) – of which this distributed “green” approach is an exponent – universally notes that education is a fundamental component of that strategy. The concerns of the regulatory system could be blunted by requiring that the educational component be part and parcel of implementing that strategy.

This whole idea of rendering the landscape more hydrologically functional and lower maintenance, along with allied practices, can yield a number of benefits. To quantify some of those benefits, let me introduce Tom Hegemier. Now working in the private sector, Tom produced some estimates of water savings potential from pursuing various distributed strategies when he worked on water supply issues at the Lower Colorado River Authority, the agency that manages the lower part of the Colorado River in Texas. Tom’s calculations are not belabored here – I can provide the methodology to anyone who is interested – but they indicate there is huge potential for water savings.

Basically, Tom asked what if half of all new housing built in Travis County – where Austin, the largest city in the lower Colorado basin, is located – between now and 2040 utilized one or more of a suite of water management options. These include:

  • The “Hill Country Landscape Option”, which minimizes turf in favor of native plants and emphasizes improving the soil so that its water-holding capacity is enhanced. This would result in significantly decreased demand for landscape irrigation water.
  • Building-scale rainwater harvesting to capture water to provide landscape irrigation water.
  • Wastewater reclamation and reuse to defray landscape irrigation water demands.

Tom’s estimates of water savings were as follows:

  • Application of the Hill Country Landscape Option – water demand reduction ranging from 12,000 to 15,000 acre-feet per year.
  • Application of the Hill Country Landscape Option plus rainwater harvesting – water demand reduction ranging from 17,000 to 19,000 acre-feet per year.
  • Application of the Hill Country Landscape Option plus wastewater reuse to defray landscape irrigation demands – water demand reduction ranging from 20,500 to 24,000 acre-feet per year.
  • A combination of all three strategies – water demand reduction ranging from 25,000 to 28,500 acre-feet per year.

As a point of comparison, total water use in Austin was about 170,000 acre-feet in 2011, and it is projected to be about 300,000 acre-feet in 2040. So a 50% penetration of just these site-based strategies would accomplish almost a 10% reduction in demand by 2040. Tom went on to ask what would happen if, in addition to these strategies, LID practices like those advocated by Katherine plus rainwater harvesting were universally employed as the manner in which stormwater is managed as sites are developed, opining that this would move development a long way toward being “water neutral” – or what I set forth in the previous post to this blog as Zero Net Water.
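As a rough arithmetic check on that penetration figure, the savings ranges above can be compared directly against the projected 2040 demand. This is only a sketch; the savings ranges are Tom's estimates and the 2040 figure is the projection cited above.

```python
# Rough check of the ~10% demand-reduction figure cited above.
# Savings ranges (acre-feet/year) are Tom Hegemier's estimates from the text;
# projected 2040 Austin water use is the ~300,000 acre-feet cited above.
projected_2040_use_af = 300_000

savings_af = {
    "Hill Country Landscape Option": (12_000, 15_000),
    "Landscape Option + rainwater harvesting": (17_000, 19_000),
    "Landscape Option + wastewater reuse": (20_500, 24_000),
    "All three strategies combined": (25_000, 28_500),
}

for strategy, (low, high) in savings_af.items():
    pct_low = 100 * low / projected_2040_use_af
    pct_high = 100 * high / projected_2040_use_af
    print(f"{strategy}: {pct_low:.1f}-{pct_high:.1f}% of projected 2040 demand")
# The combined strategies work out to roughly 8-10% of projected demand,
# consistent with the "almost a 10% reduction" stated above.
```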

An underappreciated facet of the savings potential is the reduction in peak water demands these strategies can offer. In the climate of Central Texas, annual peaking is driven by irrigation demands. So when irrigation demand is reduced, peaking is reduced. And when peaking is reduced, the sizes of all manner of water supply infrastructure can be reduced, or their implementation can be put off further into the future.

None of this is inherently difficult to accomplish. It hinges on the choice to view rain falling on the site as a water resource to be husbanded to the maximum practical extent, instead of as a nuisance to be shed and made to go “away”. The means to do this, in the physical sense, are readily available and are largely cost efficient. Institutionally, it is “merely” a matter of making that choice and setting regulations and accepted best practices to husband that water resource. Particularly in areas like Central Texas, where water supplies are becoming stressed by growth and that stress is being exacerbated by chronic drought, it is high time that our controlling institutions make that choice.

 

Zero Net Water

January 21, 2014

A sustainable water development concept for the Texas Hill Country – and beyond

Imagine a water management strategy that would accommodate growth and development without unsustainably pumping down aquifers or incurring the huge expense and societal disruption to build reservoirs or transport water from remote supplies to developing areas. Welcome to the concept of Zero Net Water.

As the name implies, Zero Net Water is a water management strategy that results in zero demand on our conventional water supplies – rivers, reservoirs and aquifers. Under the Zero Net Water development concept, water supply is centered on building-scale rainwater harvesting, “waste” water management centers on project-scale reclamation and reuse, and stormwater management employs distributed green infrastructure to maintain the hydrologic integrity of the site. Together these result in minimal disruption of flows through a watershed even as water is harvested at the site scale and used – and reused – to support development.

The key is taking advantage of the difference in capture and distribution efficiency between a building-scale rainwater harvesting system and the watershed-scale rainwater harvesting systems that compose all of our conventional water supplies. The basis for this is illustrated in the schematics below.

[Schematic: the watershed-scale water supply system]

The prevailing conventional water supply strategy – again, this is watershed-scale rainwater harvesting – is illustrated in this schematic. Typically only a very minor fraction of the total rain falling onto the watershed makes it into the “cisterns” of that rainwater harvesting system – the aquifers and reservoirs. The rest is lost to evapotranspiration, a “loss” which maintains the ecology of the watershed. Water that does make it into reservoirs is subject to high losses to evaporation (see a discussion of the severity of that here). So the inherent capture efficiency of this system is quite low.

Water supply produced from these watershed-scale “cisterns” is distributed to points of use – where some of the rain fell to begin with – a process which also suffers significant losses. Water industry standards recognize a 15% water loss in the distribution system as “good” performance, and many water systems have much greater losses. So here too we suffer an inherent inefficiency in turning rainfall into water supply that is available for human use.

[Schematic: the building-scale rainwater harvesting water system]

The building-scale rainwater harvesting concept is illustrated in this schematic. Close to 100% of the rain falling onto a rooftop can be captured and converted into usable water supply. There will be some losses to cistern overflows in large storms or when there is an extended period of wet weather, so the actual efficiency will vary with weather patterns, but in a system properly sized relative to the water usage pattern, it will be consistently very high. The building-scale distribution system is typically – and can practically be – maintained “tight” so there would be negligible distribution losses.

This high capture and distribution efficiency allows the water supply to be “grown” in fairly direct proportion to water demand, one building at a time, thus rendering this a more sustainable water supply strategy. And because the water supply would be provided, and paid for, only to serve imminent development, this strategy is also economically efficient, and thus more fiscally sustainable.

An immediate, practically knee-jerk, objection to this water supply strategy is that harvesting rainwater off rooftops would “rob” the watershed of streamflow and/or recharge, and would thus produce no net gain in the available, usable water supply. As can be inferred from the illustrations above, this is not the case. When not directly harvested, a large majority of roof runoff would be lost to abstraction – interception, evaporation and infiltration – within the watershed. In any case, when land is developed, the amount of rainfall that becomes quickflow – water that runs directly off the land – increases, and the amount that infiltrates is reduced. Because of the other impervious surfaces besides the rooftops that development adds, the volume of runoff would typically increase even if building-scale rainwater harvesting were to be implemented on all the buildings in the development, as noted in the illustration below.

[Schematic: the water cycle under building-scale rainwater harvesting]

Indeed, because development increases runoff, development regulations generally require that steps be taken to treat and detain this excess runoff. Broadscale practice of building-scale rainwater harvesting can actually reduce the magnitude of this problem. The net result in any case is that the post-development runoff volume is typically greater than the predevelopment runoff volume, thus there would be no “robbing” of flows into the watershed-scale water supply system, relative to the pre-development flow regime.

In any case, the water sequestered in the building-scale cisterns is not removed from the watershed. Its release back into the hydrologic cycle is simply delayed. Most of this water, once used in the building, appears as wastewater flow. As reviewed below, and illustrated in the schematic above, under the Zero Net Water concept this flow would preferably be used to defray irrigation demands, so doing a better, more targeted job of maintaining some of the plant life in the watershed. If instead the wastewater were discharged into streams (after treatment of course), the result would be to create a more steady flow of this water over time, as opposed to the “flash” hydrology imparted by direct runoff from the rooftop.

A simple way to encapsulate all this is that we capture and utilize on site much of the additional runoff imparted by placing impervious surfaces over the land. We do this instead of allowing this additional runoff to become an increased quickflow that, if not mitigated in some other way, creates water quality, channel erosion, and flooding problems. So bottom line, broadscale practice of rainwater harvesting off all the buildings in a watershed would actually improve the overall yield from the watershed of water that would be directly usable by humans, without any significant impact on the rest of the ecology, in particular on “environmental flows” in our rivers.

“Right-Sizing”

There is a caveat on the “zero” in Zero Net Water. The cistern in a building-scale rainwater harvesting system operates in the same manner as a reservoir in a conventional surface water supply system – it stores the water for future use. Just like a reservoir, a building-scale cistern has a “firm yield” that will cover a given water demand profile. The building-scale cistern is typically sized to cover most conditions, with imported backup supply added to get through the worst drought periods. Considerations of cost efficiency and the sustainability of the backup supply system lead to the concept of “right-sizing” of the building-scale rainwater harvesting system. This is the combination of roofprint and cistern volume relative to the expected water usage profile that would result in only limited backup supply requirements, needed only during the worst drought periods.

The backup supply would of course be drawn from the conventional water supply systems, from aquifers and/or reservoirs. So there would be some small draw of water from the watershed-scale system to get the building-scale rainwater harvesting systems through the droughts. The magnitude would depend on how well the building-scale systems were “right-sized” and on whether the users of those systems practiced “sufficient” conservation, and also of course on the happenstance of the rainfall patterns over the area. Still, modeling indicates that the vast majority of the water supply for these buildings would be provided by direct capture of the rainfall onto the building’s roofprint.

The “right-sized” facilities vary around the state, depending on the area’s climate. In the Texas Hill Country, a 4-person household which is “reasonably” conservative with its water use typically requires a roofprint of 4,500 sq. ft. and a cistern volume of 35,000 gallons. These sizes could be decreased if the users practice very good water conservation. How to most cost efficiently incorporate “extra” roofprint, and perhaps integrate the cistern into the building envelope, is the province of building design. It is suggested that efforts be made to formulate a “Hill Country rainwater harvesting vernacular” house design concept to address those matters. This needs to be taken up by architecture schools, working architects and homebuilders.
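The “right-sizing” analysis rests on a running water balance on the cistern, in the same spirit as a reservoir firm-yield calculation. Here is a minimal sketch of that balance. The rainfall series is synthetic and purely illustrative; real sizing would run a long local daily rainfall record through the balance. The 45 gallon/person/day usage rate comes from the earlier posts, while the 85% roof capture efficiency is an assumption for the sketch.

```python
# Minimal daily water-balance sketch for a building-scale RWH system.
# Assumptions (illustrative only): 85% of roof rainfall is captured,
# demand is 4 people x 45 gal/person/day, and the rainfall series is
# synthetic; a real "right-sizing" study would use a long local record.
GAL_PER_SQFT_PER_INCH = 0.623  # gallons yielded per sq. ft. per inch of rain

def simulate(daily_rain_inches, roof_sqft=4500, cistern_gal=35_000,
             demand_gal_per_day=4 * 45, capture_eff=0.85):
    """Return (fraction of demand met from the cistern, backup gallons needed)."""
    stored = cistern_gal  # start full
    backup = 0.0
    total_demand = demand_gal_per_day * len(daily_rain_inches)
    for rain in daily_rain_inches:
        inflow = rain * roof_sqft * GAL_PER_SQFT_PER_INCH * capture_eff
        stored = min(cistern_gal, stored + inflow)  # overflow is lost
        draw = min(stored, demand_gal_per_day)
        backup += demand_gal_per_day - draw  # shortfall met from backup supply
        stored -= draw
    return (total_demand - backup) / total_demand, backup

# Synthetic two-year record: ~34 inches/year falling in regular storms.
rain = ([0.0] * 9 + [0.93]) * 73  # 730 days, one 0.93-inch event every 10 days

frac, backup = simulate(rain)
print(f"Fraction of demand met by rainwater: {frac:.1%}, backup: {backup:.0f} gal")
# With this wet, regular synthetic series the cistern never runs dry;
# a drought-stressed record would show the backup requirement emerging.
```

The interesting output of such a model is how the backup fraction responds to roofprint and cistern size across historical drought periods, which is exactly the trade the “right-sizing” exercise explores.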

Wastewater Reuse

That “right-sized” system noted above will only cover interior water use. Supplying landscape irrigation directly from the cistern would require either a significantly larger system or significantly greater backup supplies. However, there is a flow of water right there, water that has already been provided for use in the house – the wastewater flow out of the house. This flow can be treated and dispersed in a subsurface drip irrigation field to defray landscape irrigation demands. Modeling shows that, by doing this, a sizable area of irrigated landscaping can be maintained without having to either upsize the cistern and roofprint or incur much greater backup supplies.

This strategy was reviewed in “Slashing pollution, saving water – the classic win-win (but ignored by society)”. As set forth there, this sort of reuse system has been implemented on the site scale for over two decades, and doing so will provide superior environmental protection, particularly in sensitive watersheds. It is a small step to do this same process on a project scale, if the nature of the development requires that it employ a collective wastewater system, rather than an individual on-site system for each house. That project-scale reclamation and reuse concept must be part and parcel of the Zero Net Water concept if irrigated landscaping is to be supported.

Stormwater Management

As noted previously, development causes an increase in quickflow runoff at the expense of infiltration due to some of the ground area having been covered with impervious surfaces. These impervious surfaces also increase levels of pollution entrained in the runoff. So development regulations typically require that methods be implemented to blunt both the pollution and the impacts of the additional runoff on downstream flooding and on channel erosion. The building-scale rainwater harvesting systems can help to blunt all these impacts by sequestering roof runoff in the cisterns.

Runoff from the rest of the development and any cistern overflows can, and should, be addressed using distributed low-impact development (LID) practices, focusing on intercepting and infiltrating an initial depth of runoff deemed to have entrained most of the pollution. The aim of the LID strategy is to restore the rainfall-runoff response of the developed site as close as practical to that of the pre-development site. This matching of runoff to pre-development conditions would maintain the hydrologic integrity of the site, and by a multiplicity of sites so treated, would maintain the hydrologic integrity of the watershed. This whole area of “green” stormwater management is the subject of a future entry on this blog. Suffice it here to note that it is an important element of Zero Net Water, as it holds more water on the land and thus blunts the “desertification” of the site that development typically imparts.

Confirmation – and Challenges

Modeling indicates that for all locations in and around the Hill Country, “right-sized” rainwater harvesting systems would not have required any backup supplies after the severe 2010-2011 drought broke in late 2011, even though the general impression is that drought has persisted in this region. One indicator of this is that water levels in Lake Travis and Lake Buchanan remain very low. Indeed, it is reported that inflow to the lakes in 2012 was the 6th lowest on record, and in 2013 the 2nd lowest, above only the extreme drought year of 2011. This occurred despite total annual rainfall over the drainage basins feeding the lakes having been generally around the long-term average in those two years.

This is simply a confirmation that the capture efficiency of the building-scale rainwater harvesting system is inherently much higher than that of the watershed-scale system. The low inflows to the lakes are a happenstance of rainfall patterns, failing to create the large runoff events needed to significantly raise lake levels. But those same rainfall patterns would result in high capture efficiency off of a rooftop, and so the building-scale systems would not be under the same stress that persists in the watershed-scale system.

Despite the overall efficiency of the building-scale rainwater harvesting system, the Zero Net Water development concept faces challenges to becoming commonly practiced. The building design issues were noted above. The large roofprint required to “right-size” systems in Central Texas would require “right-sized” lots to accommodate it. And two-story houses would clearly be problematic under this concept. Multi-family housing, as presently configured, would also be hard pressed to provide roofprint commensurate with water demand. Then too storage cisterns would take up space, unless they were integrated into the building envelope. All that would have implications for development style, and so would require some tinkering with prevailing development models.

On the other hand, with a typical occupancy of only 2 persons, water demand in seniors-oriented developments – which may be a considerable portion of new development in Central Texas – would be supported by the roofprint typically provided by a one-story house plus garage. Many commercial and institutional buildings would also have a favorable relationship of roofprint to water use in the building. Indeed, as asserted in “First ‘Logue in the Water”, these would be prime candidates for a Zero Net Water strategy. Employing some combination of building-scale rainwater harvesting, condensate capture, and project-scale wastewater reclamation and reuse, those types of buildings would draw no water from the watershed-scale systems. That would relieve a significant portion of demand due to growth, and as a bonus would also blunt stormwater impacts in those sorts of projects, which typically entail high impervious cover, the roofprint being a significant portion of it.

Cost is, of course, a primary consideration, for society at large as well as development principals. Where there is already a conventional water system nearby with capacity to serve the development, the cost of the building-scale systems could not be justified relative to installing conventional distribution infrastructure. Where capacity is a problem, however, the real cost of increasing capacity has to be figured in. Where those costs are very large – e.g., building a new reservoir, or tapping a remote aquifer and building pipelines to deliver water from it to growth areas – building-scale facility costs may be globally competitive. And as noted previously, building-scale facilities require money to be spent only to support buildings as they are built, while those area-wide strategies require huge investments up front, in advance of being able to sell the first lot, so the Zero Net Water concept is inherently economically efficient.

We must indeed consider costs globally, not just the immediately apparent costs of continuing with “business as usual”. This comes starkly into play in the Hill Country, where aquifers are under stress even at current usage rates, and serving considerable new development out of them will only “mine” them further. The drawdown created is drying up springs that historically flowed all throughout the Hill Country. That creates a cost to the local ecology, and will fundamentally alter the character of the region. The impact of this degradation was encapsulated in the title of an article appearing in the Texas Observer a couple years ago, “The End of the Hill Country”. This is not to mention reducing water availability from the rivers flowing out of the Hill Country, water which is depended upon for both water supply and ecological services all the way to the Gulf of Mexico. Thus the Zero Net Water development concept may be particularly valuable in the Hill Country.

While the Zero Net Water development concept faces fiscal and institutional challenges, the prospects for sustainably accommodating growth in a globally more cost efficient manner urge its consideration as a water management strategy for new development over much of Texas – particularly in areas like the Hill Country, where aquifers are under stress and the only other option is a long-distance “California style” transfer from remote aquifers or reservoirs, which may entail both fiscal and ecological sustainability issues. The Zero Net Water concept offers a pathway toward sustainable water even where high growth rates are forecast. It remains only to address the challenges and put it into practice.

One More Generation

October 29, 2013

By the rude bridge that arched the flood,

Their flag to April’s breeze unfurled,

Here once the embattled farmers stood

And fired the shot heard round the world.

– Ralph Waldo Emerson

The “rude bridge” around here is in Lee County. Rather than a store of weapons, the “embattled farmers” there, and in neighboring Bastrop County, are defending the long-term sustainability of their water supply, and thus their own economic future. The British Redcoats in this analogy are the “water hustlers” and their allies who are attempting to gain the ability to pump the aquifer storing that water supply at unsustainable rates, which will result in large drawdown, and eventual depletion, of that aquifer as a usable water supply. In short, Bastrop and Lee counties are seen as water “colonies”.

One ally in particular, Hays County, has entered into an agreement with one of those water hustlers to create an unsustainable draw on that aquifer. Purportedly this is to meet the future water needs in Hays County, on the presumption its recent growth trend will continue for the next few decades. This is simply a taking from the future of the “colonies” to secure their own. That’s why King George was taxing the American colonies, right? You might say, therefore, that a shot has been fired in the Great Texas Water War. How far and wide that shot is heard remains to be seen.

According to the groundwater model accepted by the Texas Water Development Board as an accurate picture of the impact on this aquifer, the drawdown that the currently demanded pumpage would create would leave the Simsboro Aquifer in Lee County significantly dewatered, headed toward depletion, in one more generation. And that is just considering the water to be exported, leaving little to support economic development in “the colonies”. Indeed, it is projected that, if the current drought in this region endures, Bastrop and Lee counties will have a water supply deficit for their own municipal needs even without that water being exported.

Coincidentally, one more generation is how far my “horizon of explicit concern” for the well being of regional society has just been extended, with the recent birth of my first grandchild. Which raises the question, what would I have actors like Hays County do to assure there is water available to sustain a healthy society in this region through that child’s lifetime? Isn’t it a function of government to take actions to assure a secure water supply for its citizens on into the future?

Sure, but does that really need to be done at the expense of the economic future of neighboring counties, and in a manner that is not sustainable? Understand that Hays County feels it needs water from Lee County in order to serve an expanding population over the next 50 years, but that population will not – everyone hopes at least – just go away then. Rather it will still be there, continuing to need water for the 50 years beyond that, and so on into the future. So when they suck Lee County dry, then what? A pipeline from the Great Lakes?

It should be quite clear, therefore, that the extractive once-through 19th-century water infrastructure model that Hays County – and all the rest of regional society – seems intent on perpetuating simply cannot be sustained in this region, certainly not by actions like “mining” the Simsboro Aquifer. This hits on the continuing theme of this blog: that we need to transcend the mental model that (mis)informs that course of action and transition to a water infrastructure model that will lead us toward, rather than ever further away from, sustainable water.

As reviewed in previous posts, a sustainable water infrastructure model will impart deep conservation – durable increases in water use efficiency that are inherent in the water management methods being employed. Hays County, and similarly situated entities concerned about future water supply, could be leaders in moving society toward sustainable water, rather than ever further away with the sort of essentially “stopgap” (from a long-term perspective) projects like the water grab they are pursuing – really a slow-motion rearranging of the deck chairs as the ship goes down.

What we are seeing here is a classic “tragedy of the commons”. Indeed it is an enduring tragedy of the human condition that what is perceived to be needed for short-term well being, particularly of the large fiscal interests that exert great influence on our controlling institutions, is rather blindly pursued without much regard to the long-term well being of society. So understanding that short-term interests will invariably prevail over the long-term implications, the task at hand is to show how those short-term interests can be adequately served in a manner that would not run us into a box canyon – having, in effect, promoted the growth of a population in this region while drawing down its water supply to the point of effective depletion, leaving that population high and dry.

It should be understood that, besides its lack of sustainability, there are fiscal reasons to question that course of action. Consider the pitfalls of pursuing a long-distance “California-style” water transfer scheme in a place like Hays County. It seems to be just presumed that the water price this would induce would be deemed “affordable”, perhaps simply because it is presumed that no other options exist, thus since people need water they will pay whatever that price turns out to be. But is that growth actually “manifest destiny”, whatever it will cost?

The purported need for this additional water supply is to support a projected growth in population, in the case of Hays County a projected 4-fold growth by 2040. It should be clear that the upward-trending “J-curve” growth that Hays County has experienced over the last few decades is not any more sustainable than is the over-pumping of the Simsboro Aquifer. Indeed, that growth is predicated on simple extrapolation of historical trends which were based on societal conditions over the period during which those trends were observed. The State Water Plan, which projects the populations upon which future water demand is predicated, states, “… the county’s population is projected one year at a time by applying historical growth rates, survival rates, and net migration rates to individual cohorts ….” [emphasis added]

Much of the projected growth would be due to net migration to/from the area. That decision to come, or to stay or go, would be predicated on a number of factors, the cost of living being among them, and a hugely inflated cost of water would impact upon that. But perhaps more basic is the prospect of a job in that area, and the cost of water will be a factor in a job creator choosing to establish operations in an area. Thus the future cost of water imposed by pursuing that long-distance water transfer scheme would seem to be a very important consideration, one which may significantly alter the conditions on which the growth projections are based.

In the particular case of Hays County, it does not appear that much has been done to evaluate what impact this would have on the cost of water there. Hays County Judge Bert Cobb has stated that no plans have been evaluated to even pipe the water to Hays County, much less to treat and distribute this water within the county. Indeed, the $5 million that Hays County agreed to pay their water hustler ally reportedly is just a “reservation fee” and the price of water at the wellhead is not even clear. So at present it appears that Hays County does not know the cost of water this supply scheme would create.

I was involved in a study of water supply alternatives conducted by Hays County in the late 1980s which projected water prices due to importation schemes from sources much closer than Lee County of up to $20/thousand gallons. In a study conducted by Hays County a couple of years ago exploring water supply options for the western portion of the county, the price to import water from a relatively nearby source into the Wimberley Valley ran above $10/thousand gallons, even though water treatment costs were not accounted for in that analysis. These prices compare with the roughly $3/thousand gallons typically charged at present by water supply entities in Hays County. So it can be anticipated that the scheme to raid the Simsboro Aquifer would lead to severe rate shock in Hays County.
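To put those per-unit prices in household terms, consider the monthly water portion of a bill at each price level. The 6,000 gallons/month usage figure here is a hypothetical level chosen for illustration, not a number from the studies cited.

```python
# Illustrative monthly-bill comparison at the per-unit prices cited above.
# The 6,000 gal/month household usage is a hypothetical figure for scale.
monthly_use_kgal = 6.0  # thousand gallons per month

prices_per_kgal = {
    "current typical Hays County rate": 3.0,
    "Wimberley Valley import study (untreated)": 10.0,
    "late-1980s import study": 20.0,
}

for source, price in prices_per_kgal.items():
    print(f"{source}: ${monthly_use_kgal * price:.2f}/month")
# At $20/thousand gallons, the water portion of the bill is nearly
# seven times what it would be at the current typical rate.
```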

An issue is that such price increases would not kick in until the investments have been made and the water starts flowing into the county. Again, Hays County’s presumption is that this water is “needed” to enable the projected growth, but those investments – pretty much an “all or none” proposition – would be incurred before that growth is in place, on speculation so to speak. The price signal that would urge pursuit of alternate strategies would lag the decision to impose the higher prices. Thus a possibility would be that they built it but no one came – the very high cost of water having deflected growth elsewhere.

This brings us back to the argument that we should first pursue those alternate strategies – a sustainable water infrastructure model, a more resilient, more decentralized infrastructure, imparting deep conservation. A model that can be implemented only as required to serve imminent development, thus matching required investments with actual growth. The central questions about any such strategy are of course, where will the water come from, and what will that water cost?

As reviewed in previous posts, a fundamental transformation of the form and function of the water infrastructure system can “wring” considerably more function out of existing supplies. And it is argued that this “new water” could be made available while saving money, because the sustainable water infrastructure model would be more cost efficient – as reviewed, for example, in “It’s the infrastructure, stupid”, in “A $13 million failure of imagination in Center Point” and in “Motherless in Bee Cave.” Imagine indeed if growth in irrigation demand were – very cost efficiently – supplied by distributed wastewater reuse instead of imported water. And as pointed out in “Irrigation efficiency – a new ‘reservoir’ for your city”, there is huge potential for relieving current demand for irrigation water, freeing up that amount of existing supply for growth. And then there is a move to a more regionally appropriate landscaping ethic, centered on native plants that do not require much irrigation even in severe drought, which could also free up a considerable amount of the existing water supply.

All this may pale, however, in comparison with a move to a “zero net water” development model. Under this concept, water supply would be centered on building-scale rainwater harvesting, rather than on the watershed-scale rainwater harvesting model which composes all of our conventional water supply systems. The inherent efficiency at which rainfall collected directly off a roof can be converted to a water supply usable by humans is close to 100%, and there is no transmission loss in a building-scale system. In the watershed-scale system, only 10-15% of rainfall onto a watershed typically makes it into an aquifer or a reservoir, then there is very high evaporation loss from reservoirs – up to 50% is reported – and considerable transmission loss in distribution of the water to points of use – 15% loss is considered excellent by the water industry, and much higher losses are commonly experienced. So due to that large increase in capture and delivery efficiency, the building-scale rainwater harvesting system can essentially “grow” the water supply in proportion to water demand, one building at a time.
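The loss chain described above can be combined multiplicatively to compare end-to-end efficiencies. This is a back-of-envelope sketch using the figures cited in that paragraph; actual values vary widely by watershed and system, and the 90% building-scale capture figure is an assumed allowance for cistern overflows.

```python
# Back-of-envelope end-to-end efficiency comparison, using the loss
# figures cited above. Each stage's retention fraction multiplies through.

def end_to_end(capture, evap_loss, distribution_loss):
    """Fraction of rainfall that ultimately reaches points of use."""
    return capture * (1 - evap_loss) * (1 - distribution_loss)

# Watershed-scale: 10-15% capture, up to 50% reservoir evaporation loss,
# 15% distribution loss ("excellent" performance per the text).
ws_low = end_to_end(0.10, 0.50, 0.15)
ws_high = end_to_end(0.15, 0.0, 0.15)  # best case: no evaporation loss
print(f"Watershed-scale: {ws_low:.1%} to {ws_high:.1%} of rainfall delivered")

# Building-scale: near-100% capture off the roof (90% here, an assumed
# allowance for cistern overflows), negligible distribution loss.
bldg = end_to_end(0.90, 0.0, 0.0)
print(f"Building-scale: {bldg:.1%} of roof rainfall delivered")
```

Even under the most generous watershed-scale assumptions, the building-scale system delivers several times more of the rain that falls on its collection surface, which is the arithmetic behind “growing” supply one building at a time.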

Of course, the building-scale rainwater harvesting system will have implications for building design. In Hays County, for example, modeling indicates that a roofprint of about 4,500 sq. ft. would be required for a typical 3-bedroom home to be essentially water-independent – that is, needing a very limited amount of backup supply from the watershed-scale system only during the most severe drought years. The house plus a garage and covered patios/porches in “standard” one-story house plans would typically provide 3,000-3,500 sq. ft. of roofprint, so additional roofprint would have to be built on. Two-story houses, typically having an even smaller roofprint, would clearly be problematic under this concept. Multi-family housing, as presently configured, would also be hard pressed to provide roofprint commensurate with water demand. Then too, storage cisterns impose a considerable cost and take up space. All of that would have implications for development style, and so would require some tinkering with prevailing development models. How cost efficient this water supply strategy may be would depend on the setting – sustainability of groundwater in the face of continuing development, distance of the development from an existing waterline, etc. – and of course on the price of using a piped-in supply instead, which as noted may become far higher than currently prevailing prices.
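A rough supply-vs-demand check shows why a roofprint on that order is the right ballpark. The 45 gal/person/day usage rate and 4-person household come from the modeling described in these posts; the average rainfall and collection-efficiency figures below are my assumptions for illustration, not numbers from the model:

```python
# Rough annual supply-vs-demand check for the ~4,500 sq. ft. roofprint cited
# above. Assumed inputs (NOT from the post's model): ~34 in/yr average
# rainfall for the Hays County area, and 85% collection efficiency.

GAL_PER_SQFT_PER_INCH = 0.623  # one inch of rain on one square foot of roof

roofprint_sqft = 4_500
rainfall_in_per_yr = 34        # assumed long-term average; drought years run lower
collection_eff = 0.85          # assumed first-flush/splash/filtration losses

supply_gal = roofprint_sqft * rainfall_in_per_yr * GAL_PER_SQFT_PER_INCH * collection_eff
demand_gal = 4 * 45 * 365      # 4 persons at 45 gal/person/day

print(f"average-year supply: {supply_gal:,.0f} gal")
print(f"annual demand:       {demand_gal:,.0f} gal")
# Supply comfortably exceeds demand in an average year; the large cistern is
# what carries the household through multi-year drought, when rainfall falls
# well below average.
```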

On the other hand, with a typical occupancy of only 2 persons, water demand in seniors-oriented developments – which may be a considerable portion of new development in Hays County – would be supported by the roofprint typically provided by a one-story house plus garage. The same would be so for many commercial and institutional buildings. Indeed, as asserted in “First ‘Logue in the Water”, employing some combination of building-scale rainwater harvesting, condensate capture, and project-scale wastewater reclamation and reuse, those types of buildings would draw no water from the watershed-scale systems. That would relieve a significant portion of demand due to growth, and as a bonus would also blunt stormwater impacts in those sorts of projects, which typically entail high impervious cover, the roofprint being a significant portion of it.

Finally, as also noted in “First ‘Logue in the Water”, the energy demands of all this decentralized infrastructure would be significantly lower than required to run the prevailing infrastructure model. The low lift out of a cistern and the very short distance to the point of use impart drastically lower energy requirements for building-scale rainwater harvesting systems. Likewise, the greatly shortened water loops in a decentralized concept wastewater reclamation and reuse system would greatly reduce energy demanded by those systems. Since it takes water to make energy – this is the so-called “water-energy nexus” – moving to the sustainable water infrastructure model would save a lot of that water too.

As noted, places like Hays County could be the leaders in moving society toward a more sustainable water future in this region. Given the projected 4-fold population increase, a lot of the growth would be on presently vacant land to which no services have been extended, so there is no sunk cost in conventional water and wastewater systems that needs to be respected. In large part, then, they have a “blank canvas”. They can choose to “paint by numbers” and repeat in rote manner the prevailing extractive, once-through 19th century infrastructure model, adding on the long-distance transfer of water from a source that would not be sustainable over the long term in a desperate attempt to extend that model’s usefulness one more generation. Or they can choose to move boldly into the 21st century and create a sustainable water future.

Motherless in Bee Cave

September 23, 2013

“I am the Lorax.  I speak for the trees.”

 So spoke the hero of the famous Dr. Seuss tale about the wanton destruction of his land’s forest resources by the Once-lers, intent on their profit and convenience of the moment.  Around here, we have the “Once-through-lers”, who seem intent on the wanton destruction of our land’s water resources, for their profit and convenience of the moment.  And just like the trees in the Dr. Seuss tale, no one speaks for the water.

A most excellent example of this is being played out around Bee Cave, a fast-developing community in the Texas Hill Country, just west of Austin.  An article in the Austin American-Statesman reviewed how the West Travis County Public Utility Agency (WTCPUA) – the entity that recently took over the wastewater system there from the Lower Colorado River Authority – has come to the realization that, with development activity in the area picking up steam, it does not have enough capacity to accommodate all the developments that have requested service or are expected to request service in the near future.  The situation was posed as a “crisis”, that development will go begging for service until capacity can be increased.

If, as the adage goes, necessity is the mother of invention, then it would appear they are motherless in Bee Cave. That is because, as the article relates it, the WTCPUA’s mental model can only accommodate a conventional centralized sewer system to provide wastewater management for these developments.  Without feeling any necessity to evaluate other options, they are making plans to extend and expand the capacity of the existing conventional centralized sewer system.  As a result, the wastewater, once treated at the centralized plant, would be spread upon land set aside for that purpose, to make it go “away”.  A water resource, addressed solely and exclusively as if it were a nuisance, used “once-through” and then thrown away, truly wasting this water.  Wanton destruction of water resources!

 It just so happens that I have a perhaps unique perspective on this matter.  I was contacted by a real estate broker, who is marketing some properties around Bee Cave, to talk about creating stand-alone wastewater systems on those properties.  He contacted me because he knew that I advocate a “decentralized concept” wastewater management strategy.  The basic idea of that concept is to address this water as a resource right from its point of generation, and to maximize the beneficial reuse of this water to defray non-potable water demands on or near the project generating the wastewater.  Which is to say, centering “waste” water management on water management, not on making a resource misperceived as a nuisance to go “away”.

 This broker knew of the difficulties being faced by proposed developments in the Bee Cave area which were queuing up for service.  Since delaying development would cost development interests (including him) money, he wanted to know if decentralized concept systems might be used to allow development to proceed without having to await the sewer system expansion.  This broker also asserted that Bee Cave is facing a water supply crisis, that it did not have sure access to supplies that would support continuing the rapid pace of development in this area.  All the more necessity to manage all water as a resource there, rather than to so gratuitously throw it away.

 One of the broker’s agents, who is friends with the Bee Cave mayor, arranged an audience with the mayor, also attended by the city administrator and the city planner.  The concept of managing wastewater as a water resource rather than as a nuisance was presented, showing how to practically do this with point-of-use treatment and reuse.  They were shown how this would focus a majority of the fiscal resources on utilization of this water resource to defray non-potable demands, rather than on running pipes all over the countryside to make a perceived nuisance to go “away”, and how this sort of strategy would be less costly, both to the developer and to society at large.  It was a cordial audience, they listened, asked relevant questions.  They suggested that some of the other developers with wastewater needs be contacted, but indicated no interest on the city’s part to even discuss the matter with WTCPUA.

A few inquiries were made.  No response, so I let it lie.  As I saw it, lacking any “enthusiasm” on the city’s part, getting a developer to “bite” was a long shot, so dogged persistence did not seem merited.  Then that article came out in the Statesman.  It was suggested to some of our local environmental activists who are concerned about water in the Hill Country, and to a few “water friendly” politicians, that their “crisis” could be our opportunity to press for Bee Cave to at least consider water management strategies that focus on the resource value of the water.  The “carrot” is that a decentralized concept wastewater system could grow “organically” with the development, rather than having to all be installed – and paid for – before the first house is put on the ground.  This would relieve their “crisis” – since capacity could be installed on an as-needed, or “just in time”, basis – while saving money for both the developer and the general public.

 That communication was then copied to the mayor, city administrator and city planner.  The mayor responded, saying she thought the idea had merit.  But rather than asking the WTCPUA to consider it, she suggested again that I, unilaterally, go to the developers and try to get them to – on their own, without “sponsorship” by WTCPUA – consider a wastewater system concept not recognized as an option by the controlling institutions, despite its fiscal and water resources benefits.  Because of that, again such a unilateral outreach to a developer was considered to be a long shot.  But I attempted to contact the developers the mayor identified anyway.  Again, no response.  The activists and politicians were advised that this may be an opening, and were asked for assistance in making the contacts.  No response. It appeared no one wanted to speak for the water.

 But I did dig up some details on the developments.  These indicate that every development being planned out around Bee Cave has its plan rooted in the presumption that its wastewater would indeed go “away”.  It’s like they see that as an entitlement!  That the WTCPUA simply had to extend a line to their property.  (And, one expects, the raw land prices these developers incurred were predicated on a unit yield that presumed this sewer service.)  It appears this is a deeply rooted mental model, the water must go “away” – to be wasted – or it will hurt “the deal”.

 However, a crude evaluation – not knowing the explicit character of each property – indicates that a decentralized concept strategy could be used on at least some of those developments without reducing the planned number of units.  For example, in the largest development investigated, only 23% of land area would be required to house drip irrigation fields, out of the 60% minimum total pervious area required by the Bee Cave ordinance.  With appropriate design, utilizing front yards, parkways, medians, greenbelts and common areas, it can be reasonably expected that a workable system could be installed.

 So indeed it appears the opportunity may be sitting right there to save water while also saving money.  Again, the monetary savings would be attained by: (1) eliminating all those pipes, and lift stations too, that would do nothing but move the stuff around, (2) allowing the wastewater system to be built only as required to serve imminent development, and (3) saving the money not spent to produce and deliver water to make up for what would otherwise be wasted.

However, with the WTCPUA appearing ready and willing to take the wastewater “away”, the developers would not likely be too keen on “encumbering” their projects with reclaimed water irrigation systems, even if the green space was there, even if it will be irrigated in any case.  Without WTCPUA taking over long-term operations, the developers would be very loath to consider a decentralized concept strategy, regardless of whatever savings in up front costs might be realized.

 We can all fully understand how the developers, and their agents like that broker I dealt with, would take the view that it is their deal which is of paramount importance.  If that entails the wanton destruction of water resources, that’s not their immediate concern.  They generally want the least hassle service plan that they deem affordable.

 The catch, of course, is that the conventional centralized plan may only be “affordable” if these developers are allowed to externalize some of their costs to society.  Society will pay in the long run for the value of the water wasted, day after day, year after year, by the conventional centralized, make-it-go-“away” wastewater service plan.  Anyone who doubts that, consider that this region is predicted to be importing water in the not too distant future, which will be very expensive – and paid for by the public.  Also, WTCPUA would quite likely raise rates on all its customers to cover the bonded indebtedness it would incur to install the system expansion it needs to serve these new developments, and for the increased on-going operational costs, so local society would also pay directly in the short term.

 What is harder to understand is why agents of the public interest would take the view that wasting water is just fine.  Shouldn’t the public expect that those agents would be open to even extraordinary efforts to avoid that, given the water realities of this region?  But they show no indication of interest. Again, it appears that no one will speak for the water.

 First there is WTCPUA, and all its participating entities.  One suspects that they perceive they can only get the revenues from these developments if they provide conventional centralized sewer service.  That’s their “deal of the moment”.  They also understand they’d have to deal with regulatory issues if they were to “sponsor” a decentralized concept management strategy.  It can be reasonably argued that those issues can be favorably resolved, but this effort is no doubt seen as “inconvenient”.

 The regulatory issues exist because another one of our institutions, the Texas Commission on Environmental Quality (TCEQ) – which should be an agent of the public interest – also focuses on its “deal of the moment”, defending rather than rationalizing its rules.  It runs a rule system that sees wastewater management as being ALL about “disposal” of a perceived nuisance.  These rules do not allow reuse to be contemplated until a fully developed “disposal” system is in place.  TCEQ would most definitely have a great deal of heartache about the very idea of a distributed, rather than a centralized, wastewater system, as it goes against its “regionalization” policy.  It appears it would be “inconvenient” to wrap their heads around the fact that water is fundamentally a resource, and so to regulate on that basis.

 Then there are the engineers that work for WTCPUA, who – if that entity were to ever consider its full range of options for the form and function of a wastewater system – would be called upon to inform it of its options and advise it on the merit of each.  No doubt they have on the line sizable contracts to plan, design and permit the sewer system expansion.  This “deal of the moment” would seem to make moving to a decentralized concept strategy rather “inconvenient” for them.

 Just like the developers, then, all these people too appear to be focused on their deals of the moment.  So it is that an opportunity to change the paradigm, to begin managing water resources as if water and the environment matter is quite certain to slide on by, because all these “Once-through-lers” are focused on their short-term profit and/or convenience.

 And no one speaks for the water.

 

A $13 million failure of imagination in Center Point

April 20, 2013

Center Point is a small unincorporated town in Kerr County, Texas, lying along the Guadalupe River, about midway between the larger towns of Kerrville and Comfort. The sewer plan proposed there is an excellent example of a mental model, when left unexamined, costing society way too much money to solve a fairly simple problem. At the same time this deprives society of an opportunity to implement deep conservation, and so move society toward, rather than ever further away from, sustainable water.

Researching this matter, it seems that the idea of sewering up Center Point had been cast about for many years. It was asserted that existing on-site wastewater systems were polluting the Guadalupe River, and relieving that condition creates the “need” for the sewer. In the recent reports, it is also asserted that the sewer is “needed” to accommodate growth, even though population growth projected through 2040 over a several square mile service area is fairly low, from a current estimate of 2,090 persons to a projected population of 2,519.

The facility plan I found in the Texas Water Development Board (TWDB) file was prepared by a large nationwide engineering firm, out of its San Antonio office. This plan offered as options only the continued use of conventional on-site systems – termed in the plan “septic tanks” – or piping the stuff “away” in a conventional large-pipe sewer system; in the recommended option, many miles away to the treatment plant in Comfort. The price tag for the collection system to get the wastewater there is listed at $13 million and change. Total system cost would also include the price of providing treatment capacity for this flow, a cost which was not evaluated in the planning documents I found in the TWDB files. Spread over the 900+ connections in the service area, the cost is over $14,000 per connection for just the collection system.
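The per-connection figure follows directly from the plan numbers, as a quick check shows:

```python
# Per-connection cost of the proposed Center Point collection system,
# using the figures quoted from the TWDB facility plan.
collection_cost = 13_000_000   # "$13 million and change" - collection system only
connections = 900              # "900+ connections" in the service area

cost_per_connection = collection_cost / connections
print(f"${cost_per_connection:,.0f} per connection")
# Note this buys the collection system only; treatment capacity in Comfort
# would add an amount not evaluated in the planning documents.
```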

This is a very impoverished range of options, ignoring everything between those two extremes, reflecting what I call the “dichotomy view”. That is a mental model which holds that wastewater management can be done in only two ways. One, totally within the confines of one’s lot, with the owner being unilaterally responsible for planning, design, permitting, funding, installing, and operating and maintaining that system. Or two, by dumping it in a pipe leading to the centralized treatment plant, with the user paying a fee and the sewer system operating authority doing all that. One or the other, no options in between. As observed in the case of Center Point, that is the commonly held “understanding” of our controlling institutions, including the large nationwide engineering firms.

Should the people in those institutions know better? In this case at least, ABSOLUTELY! Among the ranks of the engineering firm which prepared the Center Point facility plan are two employees with whom I am acquainted that are both long-time proponents of what I labeled in 1986 the “decentralized concept” of wastewater management. This concept adopts a “continuum view”, considering a range of options lying between the two extremes that compose the “dichotomy view”. The basic idea is that you treat – and beneficially reuse to the maximum extent feasible in the situation at hand – the wastewater as close to where it is generated as practical in the context at hand.

In many circumstances – as reviewed below, Center Point almost certainly being one of them – this strategy can deliver wastewater systems that are more fiscally reasonable, more societally responsible and more environmentally benign than conventional centralized systems. This is the so-called “triple bottom line” of sustainability. Those two employees that I know have been advancing that message for many years, including obtaining for their firm several contracts for various studies about decentralized concept strategies. So clearly, the knowledge exists within that firm to have known that a whole range of options was being ignored in Center Point.

It may be, however, that “mainstream” engineering firms avoid all that for “business reasons”. It is impossible to overestimate the level of resistance that the controlling institutions pose to the decentralized concept. As noted, the long-standing presumption has been that Center Point “needed” to be sewered up, so the local political leaders advanced that as the explicit goal, touting it in every interview I’ve read. These politicians no doubt want to be connected to a “grand project”, to deliver a large grant, and so are disinterested in advancing a “smaller” solution, no matter the relative merits. Then too, an engineering firm would be assured of a hefty design contract to implement the centralized system – indeed, for the Center Point project, TWDB has provided a $1.8 million grant for planning and design – while for a decentralized concept strategy, the available design fees are unknown. It’s also well understood that all the regulations, funding programs, etc., are heavily biased toward conventional projects. Then there are the legal firms, the financial advisers, etc., who stand to get a cut of the pie, all lined up cheerleading the centralized system. In the face of all that “weight of expectation”, it’s easy to understand why a firm which wants to keep on doing such projects might hesitate to suggest any “outside the box” options, perceiving a risk of being branded as “not a team player”, of being “blacklisted” on future projects.

The result is that decentralized concept solutions, which could deliver superior service at less cost – including putting that water resource to work in Center Point instead of shipping it “away” – will never even be put on the table for consideration. I have observed this same pattern of behavior repeated by firm after firm, in community after community. In effect, our controlling institutions are operating a conspiracy to constrain the options that get considered to those which match their current mental model of how one manages wastewater.

Options which could have been put on the table – should have been, routinely so, in a context like Center Point – include improved on-lot systems and “cluster” systems, at various scales. These systems would feature high quality treatment – using technologies that are robust and resilient, thus are manageable in distributed systems – and subsurface drip irrigation dispersal. Those practices would preclude whatever pollution issues are purported to be caused by existing “septic tanks”, and they would provide whatever level of service is needed for further development in and around the town.

I would expect the preferred configuration to be on-lot or small-scale cluster systems, so the reclaimed water could be most cost efficiently routed to the most beneficial irrigation usage. This perhaps could be in the pecan orchard or commercial nursery, each close to Center Point’s town center. Or it could provide much of the landscape irrigation water demand at houses, businesses, parks, etc.

That sort of strategy would save money in several ways. First, not all of the 900+ connections have failing “septic tanks” so the initial installation could be limited to properties where the existing on-lot systems are failing, and to commercial sites on which conventional “septic tank” systems are inappropriate. Thus, the cost of fixing the actual immediate problems may be drastically lower. Long term, some properties may never need anything but their existing conventional “septic tank” system, so the ultimate cost of the whole system would most likely be lower, even if the cost per house needing new service under the decentralized concept approach were to be greater than it is with the centralized system.

An evaluation of a decentralized concept system in a similar community, however, indicates that small-scale collective systems could be installed for less than $14,000 per connection. In contrast to what that buys in the centralized system, a collection system only, this buys a collection, treatment and reuse system. Therefore, even if every connection did require an upgraded system, the total cost of the decentralized concept strategy is almost certain to be way less than the centralized option, on an “apples to apples” basis.

Second, since facilities need be built only to serve existing development and imminent new development, this strategy does not speculate on the scale of future development. Whatever new development does occur would not incur costs until it actually hits the ground, so those costs would be delayed until actually needed. This works with the “time value of money” – a dollar you can put off paying until later is worth more than a dollar you have to spend now. This frees money for other investments in the meantime.
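The time-value-of-money point can be made concrete with a standard present-value calculation. The dollar amount, deferral period and 5% discount rate below are hypothetical, chosen only to illustrate the effect:

```python
# Illustration of the "time value of money" point: capacity whose cost can be
# deferred is cheaper in present-value terms. The 5% discount rate and the
# $1,000,000 capacity cost are hypothetical, for illustration only.

def present_value(cost, years_deferred, rate=0.05):
    """Present value of a cost paid `years_deferred` years from now."""
    return cost / (1 + rate) ** years_deferred

# Same capacity: built up front today vs. built when the development that
# needs it actually arrives, say 10 years out.
now = present_value(1_000_000, 0)
later = present_value(1_000_000, 10)

print(f"pay now:          ${now:,.0f}")
print(f"pay in 10 years:  ${later:,.0f} in today's dollars")
```

Under these assumptions the deferred build costs roughly 40% less in present-value terms, which is the money freed for other investments in the meantime.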

This also has a “social justice” aspect. Under a decentralized concept strategy, the costs of developing new capacity, whenever they occur, would be borne directly by the development generating the need for that capacity. This relieves the existing population from having to participate in financing of facilities to serve activities which may not benefit them in any way, as they will be forced to do under the centralized system. Indeed, Kerr County has asserted that it has not been requiring failing “septic tanks” to be upgraded in Center Point because the residents cannot afford it. Yet, the proposed plan would surely impose high monthly fees on these people, likely higher than the amortized cost of an upgraded on-lot system.

Third, regarding those monthly charges, the overall operations and maintenance costs of the decentralized concept system are likely to be lower. This is difficult to evaluate right now because the planning documents for the centralized system provide O&M costs only for maintaining the collection system. Those costs work out to about $40/connection/month, already a pretty high sewer rate. Decentralized concept systems in a somewhat similar community were projected to incur a considerably lower monthly charge than this for all the O&M. In Center Point, however, the users of the proposed system would also have to pay sewer fees to the treatment plant operator, increasing their total monthly payout by an undetermined amount. As noted above, however, under a decentralized concept option, not all the connections are likely to require upgraded service, so the total O&M cost would very likely be much lower.

Fourth, the water has value, so if reused to defray irrigation that would be done in any case, that would save water. Whether or not this yields direct savings on an explicit water bill, there is also another planning process in play to augment water supply in Center Point, entailing another multi-million dollar project. Defraying water demands in this community would limit the need for new water supply. In particular, irrigation supply creates demand peaking, so shaving that peak with reuse could reduce the scale of facilities, saving some of that money.

Fifth, optimizing the beneficial use of the reclaimed water benefits the regional water economy, so would likely put off or decrease the cost of future water supply projects generally. Other, less readily apparent benefits may also decrease global long-term costs to society. For example, salinity of the estuary at the end of the Guadalupe River is of concern, and lessening water demands anywhere in the river basin frees up water for environmental flows. This simply highlights the multi-faceted water challenges facing this region. Without this sort of deep conservation being built into the water management system at every opportunity, costs would be incurred to free up water from other sources to provide the environmental flows, or the estuary will suffer, damaging the economy that depends on its productivity.

It is understood that, given the high institutional bias for conventional projects, getting a decentralized concept system approved and funded might entail a “hassle factor”. But this is no reason to ignore it. Really this is just another cost factor to be evaluated, as the “hassle” translates into hours expended to work through the barriers. So it would seem that the rational course is to evaluate the relative costs and benefits of various options, and then consider whether any “hassle” really offsets the benefits that would be delivered by a “non-standard” project – not the least of which, once again, is moving us toward sustainable water. It seems that all concerned choose instead to simply presume, without analyzing it, that the conventional strategy is the only one that could be funded and approved. This avoids ever exposing the barriers and working toward their resolution. So the pattern repeats, with the engineer on the next project again fearing to venture “outside the box”.

The bottom line is that the work has not been done to know if the proposed centralized system is the most fiscally efficient, societally responsible or environmentally benign option available. Indeed, the necessary work to expose all the costs of that option has not even been done. Yet the controlling institutions are all conspiring to move this option forward, apparently unconcerned that other viable – and perhaps significantly superior – options have not been considered at all. In Center Point, this apparent compulsion to cater to a prevailing mental model is a $13 million – at least – failure of imagination.

In terms of both money and water, society cannot afford to continue to suffer such failures.

 

Slashing pollution, saving water – the classic win-win (but ignored by society)

March 24, 2013

In this entry, we’re going small-bore, looking at a rather localized and somewhat parochial issue. But one that highlights some of the challenges we face in moving society toward sustainable water, in stimulating deep conservation.

Barton Springs is the natural discharge point of the Barton Springs segment of the Edwards Aquifer, which lies to the south/southwest of Austin, Texas. Nitrate levels in Barton Springs have been increasing in recent years. And, according to a USGS report, a good deal of it is from wastewater sources. Who could have guessed? I mean, besides anyone who gave this matter a moment’s thought.

[Graphic: Nitrate in the Barton Springs Zone – USGS]

The graphic above shows the growth of those wastewater sources from 1990 to 2010. The top line of graphics shows the growth in number of OSSFs. That stands for on-site sewage facility, the Texas rules-speak name for what are popularly known as “septic” systems. The bottom line shows TLAP systems. That stands for Texas land application permit. In this type of wastewater system, the effluent is spewed out over an area that is “irrigated” mainly just to make the water go “away” rather than for an actual irrigation benefit, such as an improved landscape or growing a marketable crop. In the most used type of “septic” system – consisting of an “aerobic treatment unit” (ATU) and a couple of spray rotors – the wastewater is also spewed out over the ground, with little regard for the value of the water, or of the environmental impacts. Such as increasing the nitrate levels in Barton Springs.

As these graphics show, the density of the “septic” systems has increased very dramatically over the 2 decades they cover. Growth in the number of TLAP systems, while not so dramatic, was also considerable. Given the nature of those systems, spewing the water over land surfaces that, in the case of OSSFs, are not “qualified” at all and, in the case of TLAPs, are addressed in a rather cursory manner, it should not have been the least bit surprising that nitrate levels would be rising in the waters that drain out of this watershed. Indeed, particularly when combined with increases in pollution it was known would occur simply because development was occurring there, it should have been readily anticipated that this would be so.

The level of nitrate in Barton Springs is approaching 2 mg/L. The often quoted limit for nitrate in drinking water is 10 mg/L. This is what is termed the “enforcement limit”, the level at which definitive action would be required to reduce nitrate loadings into the groundwater. But there is another limit in the rules, 2 mg/L, which is termed the “preventative action limit”. That is the level at which actions to stem the shedding of nitrate into the groundwater – preventative action – are to be considered. We are there!  So it’s time to start taking preventative action, no?

The tragedy here is that this did not have to happen. Preventative action has been available all throughout those 2 decades. Wastewater could have been managed by means which would have greatly blunted, perhaps essentially eliminated, the shedding of nitrates from these wastewater sources. AND this could have been done at very low overall cost, perhaps at NO cost – or even at a savings – in terms of global life-cycle costs of this water management function, while at the same time conserving water. In any case, the tide can certainly be turned going forward by moving practice to those methods.

First, here is what is wrong with the currently prevailing methods. The ATU employs a technology, activated sludge, which is inherently unstable, and so typically suffers “excursions” in its treatment quality, particularly when used in the essentially unsupervised on-lot environment. As my realtor cousin once said of them, “They puke solids.” In any case, the ATU does not remove nitrogen from the wastewater. Spraying this effluent over the ground surface also limits the amount of denitrification – the biologically-mediated conversion of nitrate to nitrogen gas – attained in the soil. These on-lot systems spew the effluent onto the ground without regard to whether it’s raining or how wet the ground is. All this results in a nitrate-rich effluent being dispersed in a manner that heightens the likelihood a good bit of it would be shed, rather than assimilated in the plant/soil ecology, and so would appear in the waters that drain from this watershed.

That shedding of nitrate can be greatly blunted, perhaps even essentially eliminated, by a shift in the type of OSSF used. First, a treatment unit employing recirculating gravel filter (RGF) technology can be designed to remove a majority of the nitrogen from the wastewater prior to dispersal. This is a very robust, inherently stable treatment process, so it can consistently produce this high-quality, denitrified effluent in the lightly supervised on-lot operating environment. The major proof-of-concept field study of this technology was a project I ran on Washington Island, Wisconsin, in which nitrogen reduction of over 60%, and in some cases approaching 90%, was consistently achieved by systems subject to all the vagaries of operating in the on-lot environment. So using the RGF instead of the ATU for treatment will eliminate over half the nitrogen loadings prior to dispersal, consistently and reliably.
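To put those removal percentages in concrete terms, here is a minimal nitrogen mass-balance sketch in Python. The 60 mg/L total-nitrogen concentration is an assumed typical value for raw residential wastewater, and the 45 gallon/person/day flow is the usage rate presumed elsewhere on this blog; neither is a figure from the Washington Island study itself:

```python
# Rough annual nitrogen mass balance: ATU (little N removal) vs. RGF.
# All input values are illustrative assumptions, not field data.
GAL_TO_L = 3.785
MG_TO_LB = 2.2046e-6

def annual_n_load_lb(persons, gal_per_person_day, tn_mg_per_l, removal_frac):
    """Pounds of nitrogen dispersed to the soil per year after treatment."""
    daily_liters = persons * gal_per_person_day * GAL_TO_L
    daily_mg_n = daily_liters * tn_mg_per_l * (1.0 - removal_frac)
    return daily_mg_n * 365 * MG_TO_LB

house = dict(persons=4, gal_per_person_day=45, tn_mg_per_l=60)
atu = annual_n_load_lb(**house, removal_frac=0.0)      # ATU: no N removal assumed
rgf_60 = annual_n_load_lb(**house, removal_frac=0.60)  # RGF, typical performance
rgf_90 = annual_n_load_lb(**house, removal_frac=0.90)  # RGF, best observed
print(f"ATU:       {atu:5.1f} lb N/yr to dispersal")
print(f"RGF @ 60%: {rgf_60:5.1f} lb N/yr to dispersal")
print(f"RGF @ 90%: {rgf_90:5.1f} lb N/yr to dispersal")
```

On these assumptions, a 4-person household’s ATU sends roughly 33 pounds of nitrogen per year to the dispersal field, which the RGF would cut to about 13 – or as little as 3 – pounds before dispersal even begins.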

Then, instead of spewing it into the air, this effluent can be dispersed in a subsurface drip irrigation field. With the level of nitrogen in the effluent reduced, it is much more evenly matched to the uptake rate by plants. This dispersal method will also enhance in-soil denitrification. Together, these assure consistently more complete assimilation of the nitrogen that is dispersed into the soil. And subsurface dispersal eliminates runoff of effluent during rainy weather. The result is that very little nitrate will leach or flow “away” to appear in the waters that drain from the watershed.

The RGF/drip strategy is also a deep conservation measure that can move us toward sustainable water. Drip rather than spray dispersal can greatly serve the water economy by displacing potable water with this effluent to defray irrigation demands. Because spray dispersal entails a potential for contact with this partly treated water (which is also questionably disinfected, for reasons we won’t get into here), the spray heads are set away from the house, off somewhere on the lot where they won’t be “obtrusive”. But the improved landscaping, the plants that might be irrigated in any case, are typically up around the house, so these spray systems are hardly ever arrayed to serve that landscaping. Because the drip lines are subsurface, there is very low contact hazard, so the water can be dispersed anywhere on the lot where the owner chooses to install irrigated landscaping, and the effluent routed to that drip field would defray irrigation usage, pretty much gallon for gallon through the peak irrigation season.
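As a rough illustration of that gallon-for-gallon defrayal, the sketch below estimates how much landscaping a 4-person household’s effluent could fully irrigate at the summer peak. The 0.23 inch/day peak ET rate is an assumed placeholder, not a measured local value:

```python
# How much landscape could household effluent irrigate at the summer peak?
# Peak ET and per-capita flow are illustrative assumptions.
GAL_PER_SQFT_INCH = 0.623  # gallons needed to apply 1 inch of water to 1 sq ft

def area_served_sqft(persons, gal_per_person_day, peak_et_in_per_day):
    """Landscape area whose peak irrigation demand the drip field could cover."""
    effluent_gpd = persons * gal_per_person_day
    demand_gpd_per_sqft = peak_et_in_per_day * GAL_PER_SQFT_INCH
    return effluent_gpd / demand_gpd_per_sqft

area = area_served_sqft(persons=4, gal_per_person_day=45, peak_et_in_per_day=0.23)
print(f"~{area:.0f} sq ft of landscaping fully irrigated by effluent at peak")
```

On those assumptions, the household’s 180 gallons/day of effluent would cover the full peak-season demand of roughly 1,250 square feet of irrigated landscape.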

Also, irrigation efficiency of drip is inherently much greater than for spray. In any case, the rules require the design dispersal rate for spray systems to be very low, so not much irrigation benefit could be derived even if it did operate at higher efficiency. The rules allow the application rate for drip to be significantly higher, much more in line with irrigation rates through the peak irrigation season. So, in combination with the high irrigation efficiency of drip, a much higher irrigation benefit can be derived from drip dispersal.

Further, the rules do not require the area over which effluent is sprayed to be “qualified” in any meaningful way, in regard to soil depths and plant cover. In contrast, drip fields must have at least 6 inches of soil beneath the drip lines and 6 inches of cover over them. In the Hill Country terrain of the Barton Springs watershed, this often requires importing soil to attain these depths. Soil is often “enhanced” to create improved landscaping in any case, so with drip the OSSF dispersal field is typically placed in the best soils available on the lot. Better soil increases irrigation efficiency by providing more soil moisture storage capacity, and with there being more soil volume to “absorb” the water even when the soil is already wet from rainfall, it provides for better assimilation of nutrients.

The bottom line is that with higher quality pretreatment, including significant nitrogen reduction, and drip dispersal, the shedding of nitrate would be greatly blunted, if not essentially eliminated, and a very high percentage of the annual effluent flow could contribute to defraying water used for irrigation. The first benefit would halt whatever portion of the nitrate increase in Barton Springs has been due to OSSFs. The second benefit is a bonus, one that is very valuable to this water-challenged region. I’ve been designing this type of OSSF for over 20 years, and it has been approved by all the local jurisdictions. So it is clear that these are benefits which can be readily realized – and which could have been attained all along.

So why weren’t they? We won’t belabor the details here, but the installed cost of the RGF/drip system would be somewhat higher than an ATU/spray system. And that’s why the latter are so ubiquitous, because first cost typically rules the day. However, the life-cycle costs would be similar, at least if the cost of the water saved is taken into account. (Whether the cost of that water shows up on a monthly bill would depend on if the home were served by a well or by a piped water system.) Other savings derive from much lower power costs (also a benefit in regard to energy sustainability), from lower equipment replacement costs, and from not requiring chlorine for disinfection (another insult to the environment that is avoided by subsurface drip systems). So nitrate reduction could be realized at very low, or no, cost on a global, life-cycle basis.  Again, the barrier is first cost.

These same technologies could be just as readily used in those TLAP systems. In those systems, land application is, in theory, operated so that nitrogen loadings match plant uptake and in-soil denitrification rates. That could be much more readily, and cost efficiently, attained using the denitrifying RGF system for treatment and subsurface drip irrigation for dispersal. The shedding of nitrate could be further attenuated by placing the drip fields in areas that would be irrigated in any case. This improved landscaping would have better soils than the rangeland and cedar breaks typically constituting the dispersal fields in TLAP systems, to which the water is routed simply to make it go “away”, with no intent of defraying irrigation water usage in the development the TLAP system serves.

Again, the RGF/drip strategy is an exemplar of deep conservation – integrating water-efficient practices, instead of water-wasting practices, into the very fabric of development. Indeed, one could question why any responsible entity in this increasingly water-challenged region would allow water to be so gratuitously wasted, when there are readily available – and globally cost-efficient – methods that can blunt that water waste, to realize the resource value of what is now being so foolishly managed solely and exclusively as if it were a nuisance. That both state and local regulatory systems embrace and support those wasteful methods is testament to the institutional resistance to deep conservation.

Going forward, however, a win-win situation is there for the taking. At the same time that water use efficiency could be greatly enhanced, further increases in nitrate being shed into this watershed can be essentially eliminated by shifting to the appropriate technologies. Over time, the existing sources could also be phased out. As people come to value the water being thrown away in their sprayfields, the spray systems may be replaced with drip irrigation fields, arrayed to irrigate their highest value landscaping.

This could be spurred on if there continue to be water curtailments due to drought, since the drip field would “drought-proof” the landscaping it serves. That’s because water curtailments in all the local drought contingency plans impact only exterior water use. The wastewater dispersed in the drip field would derive from interior water use, which is not curtailed, so the landscaping over the drip field could continue to be irrigated through the drought.

Then too, as the ATUs wear out, or the owners get tired of the frequent replacement costs (or the stench they often produce), they could be retrofitted to an RGF, obtaining the nitrogen reductions in the treatment system as well. Together with the drip field replacing the spray system, again this would greatly blunt, if not essentially eliminate, the nitrate being shed by the existing OSSFs.

This is a fairly impressive list of benefits, for both water quality and water quantity, from simply plugging in the appropriate technologies for the circumstances at hand. As noted, the barrier is the first cost of those appropriate technologies, along with the inertia of the wastewater management field of practice, and the sad fact that ATU/spray systems are accorded what amounts to a “most favored status” in the OSSF rules system in Texas.

The latter two factors are matters of reforming the “culture” of the field, but the first cost issue is a ubiquitous problem in regard to all manner of efforts to enhance water sustainability. Society has not figured out how to send to those who incur the first costs the signal sent by the global life-cycle costs. The result is that choices are made which may well serve the short-term interests of those who bear those first costs but poorly serve the long-term best interests of society.

A solution to that conundrum could be provided by appropriate regulation to attain ends which do serve the long-term best interests of society. Like requiring OSSFs in nitrogen-sensitive watersheds to meet nitrogen reduction standards, while simultaneously significantly defraying irrigation demands on “original” water supplies. Here in Texas, society has not yet gotten around to considering its long-term best interests in these regards. So we’ve seen, and no doubt will continue to see, increases in the level of nitrate measured in Barton Springs. And all that water running through those wastewater systems will continue to be indeed wasted.

Irrigation Efficiency – a new “reservoir” for your city

March 11, 2013

Let’s start this one with a BIG NUMBER. To quote the web site of the Alliance for Water Efficiency, “The efficiency of overhead irrigation, such as rotors and pop-up sprayheads, is typically 50 percent and rarely exceeds 70 percent. The efficiency of a well-designed drip irrigation system can reach nearly 100 percent.” This indicates that irrigation efficiency could be as much as DOUBLED by converting to drip. Or, to put it more graphically, the same amount of irrigation would be accomplished using HALF THE WATER! System-wide, that would be a VERY BIG number.
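That doubling claim is just the arithmetic of efficiency: the gross water applied equals the net demand divided by the efficiency. A quick sketch, using a hypothetical 1,000-gallon net demand:

```python
# Gross water applied to meet a fixed net irrigation demand at each efficiency.
def gallons_applied(net_demand_gal, efficiency):
    """Water that must be applied so net_demand_gal reaches the root zone."""
    if not 0 < efficiency <= 1:
        raise ValueError("efficiency must be in (0, 1]")
    return net_demand_gal / efficiency

net = 1000  # gallons the plants actually need (hypothetical)
for label, eff in [("typical spray system", 0.50),
                   ("well-run drip system", 0.95),
                   ("sprinkler on the sidewalk", 0.20)]:
    print(f"{label:26s} ({eff:.0%}): {gallons_applied(net, eff):6.0f} gal applied")
```

At 50% efficiency you must apply 2,000 gallons to deliver 1,000; at 95%, only about 1,050 – just over half the water for the same irrigation benefit.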

[Photo: a hose-end sprinkler set on the sidewalk, spraying much of its water onto the pavement and street]

But wait a minute. Take a close look at the irrigation “system” in the above picture, a not at all atypical scene along the streets of my city, Austin. That 50-70% efficiency estimate is for a “designed” spray system – using rotors and spray heads, laid out in a pattern that provides head-to-head throw of water, uniformly covering the area to be irrigated, and hopefully with the spray arcs set so that very little water sprays over areas not intended to be irrigated, like sidewalks and driveways. What do you suppose the “efficiency” of the spray operation in the picture would be, spreading much of the water on the sidewalk and street? Maybe 20%? Or less?!

Now, sure, this picture was selected exactly because it is a particularly bad example, but as noted it is not all that atypical. One morning, I rode my bike through my South Austin neighborhood and took note of all the irrigation going on that day, at about 25 houses in all. Of those, only a couple were “solid set” systems using pop-up spray heads. The rest were hose-end sprinkler applications. And in only one of the operating systems was there no overspray onto pavement! Most of them were dropping A LOT of the water onto pavement, creating rivulets running along the curb, just like we see in that picture above. It’s a small sample of the entire city, to be sure, but it indicates that these low-efficiency operations are more common than well-designed spray systems.

So it may be that converting those irrigation operations to subsurface drip would perhaps TRIPLE – or more – the efficiency. System-wide, that is a VERY, VERY BIG number! As the title of this piece notes, it would be sort of like a whole new “reservoir” for your city’s water supply.

A “reservoir” of relieved capacity at just the time it is most needed! That’s because water savings obtained by increasing irrigation efficiency come directly off the peak demand, since that peak is driven almost exclusively by irrigation water use in this region. And the City of Austin asserted that growth in peak demand created the need to build, sooner rather than later, the new water treatment plant it is presently constructing. So, as is no doubt the case in many cities, measures to increase irrigation efficiency would be particularly valuable to the overall system, allowing sufficient service to be provided without having to increase peak supply capacity. Yet these measures generally remain quite neglected, in terms of any programs explicitly aimed to stimulate, promote or require them.

This highlights that increasing the efficiency of irrigation operations could be a huge water saver. Not in one fell swoop, but through the multiplicity of many, many small actions. And that is probably why aggressively pursuing irrigation efficiency has been pretty much neglected as a part of most cities’ water conservation programs – it would require the stimulation of many individual actions, through education, incentives and/or mandates. The city bureaucracies no doubt consider that “too hard” – much easier to just build more capacity, which they think is under their unilateral control, even though an ever-expanding supply is not sustainable and, as just noted, is unlikely to be the most cost-efficient strategy. To move toward sustainable water, it’s clear we will have to take on “distributed” measures like irrigation efficiency at some point. So why not now, BEFORE we put ourselves in hock for expanded peak supply capacity that could be avoided?

The application efficiency – accurately routing the water onto the plants you want to irrigate – is only part of the overall efficiency. Other aspects must also be addressed to maximize the savings. One of them is the quality and depth of the soil. The more soil over the irrigated area and the higher its “sponge effect”, the more water it can hold, so more water would be held in the soil until the plant roots can take it up, rather than draining through the soil and being lost to the plants. Because more depth of good quality soil also allows more rainwater to infiltrate and holds more rainfall in the root zone, irrigation can be delayed longer after a rainfall, also saving water. And because improving the soil reduces runoff, so blunting stormwater management problems, it’s a win-win-win sort of strategy.
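The “sponge effect” can be put in rough numbers: plant-available storage scales with soil depth and water-holding capacity, and that storage divided by daily ET sets how long irrigation can wait. The available water capacity (AWC) and ET figures below are illustrative assumptions, not measured Hill Country values:

```python
# Days the root zone can meet ET before irrigation is needed, for two soils.
# AWC and ET values are illustrative assumptions.
def days_between_irrigations(soil_depth_in, awc_in_per_in, et_in_per_day,
                             allowed_depletion=0.5):
    """Days of ET the soil can supply before refill, at 50% allowed depletion."""
    plant_available_in = soil_depth_in * awc_in_per_in * allowed_depletion
    return plant_available_in / et_in_per_day

shallow = days_between_irrigations(4, 0.08, 0.25)    # thin, rocky native soil
improved = days_between_irrigations(12, 0.15, 0.25)  # deeper, amended soil
print(f"shallow soil:  irrigate about every {shallow:.1f} days")
print(f"improved soil: irrigate about every {improved:.1f} days")
```

On these assumed values, deepening and improving the soil stretches the interval between waterings several-fold, and every skipped cycle is water saved – plus more room for rainfall to soak in instead of running off.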

Requiring a minimum depth of soil was urged by a resolution of Austin’s Resource Management Commission in early 2006, and was considered by the water conservation task force later that year, but it wasn’t included in the water conservation program, reportedly due to objections from builders. You see, builders are totally focused on the installation cost and aren’t impacted by the long-term costs of having to “over-water” because there’s not much soil there to hold the water. So the city, in its infinite wisdom, chose not to impose that cost on the builders, rather to in effect subsidize them by enduring the inefficient irrigation that results, so driving a perceived need to provide more water treatment capacity, for which the rest of us will pay.

This illustrates the insidious nature of allowing today’s first cost issues to dominate what should be a long-term strategy. This is a ubiquitous problem plaguing many efforts to instill deep conservation practices.

Another aspect of irrigation efficiency is watering at the optimum time. You don’t want to lose water to runoff or leaching below the root zone because watering took place when the soil was still “too wet” – either because the area had been recently watered or because there had been recent rainfall. To maximize this aspect of efficiency requires either real-time expert management, consistently applied – which simply does not happen, is not practical, for most irrigation systems – or using an irrigation control system which can sense when irrigation is needed. “Smart” irrigation control systems that can do this are readily available, and are cost efficient for high usage systems, where savings would be most significant.
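What a “smart” controller does can be sketched as a simple soil-water “checkbook”: carry a running root-zone deficit, credit rainfall against it, and irrigate only when the deficit reaches an allowed depletion. This is a toy model with assumed numbers, not any vendor’s actual control algorithm:

```python
# Toy water-balance scheduler: irrigate only when the root-zone deficit
# reaches the allowed depletion; rainfall pushes irrigation events back.
def schedule(daily_et_in, daily_rain_in, allowed_depletion_in=0.75):
    """Return inches irrigated each day under deficit bookkeeping."""
    deficit, events = 0.0, []
    for et, rain in zip(daily_et_in, daily_rain_in):
        deficit = max(0.0, deficit + et - rain)  # rain refills the root zone
        if deficit >= allowed_depletion_in:
            events.append(deficit)               # refill exactly the deficit
            deficit = 0.0
        else:
            events.append(0.0)                   # skip today; a timer wouldn't
    return events

et = [0.25] * 10                          # assumed hot-season ET, inches/day
rain = [0, 0, 0, 1.0, 0, 0, 0, 0, 0, 0]  # a 1-inch rain on day 4
events = schedule(et, rain)
print(events)
```

In this trace the controller waters only three times in ten days, each time replacing exactly the accumulated deficit, and the day-4 rain automatically delays the following events – while a fixed timer would have run on schedule, rain or shine.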

All these factors highlight the importance of good system design. As noted, it is likely that a lot of irrigation water runs through systems that are not designed at all, rather are simply a movable sprinkler at the end of a hose.

As noted, those hose-end systems may operate at very low efficiency. Look again at the picture, at the low regard for watering efficiency exhibited by setting the sprinkler on the sidewalk. Now, I would speculate that the person who did this is not doing it because he is dumb; rather, he is simply using the equipment he has to get water onto the parkway strip between the sidewalk and the street. He is doing this – gratuitously wasting a lot of water – rather than financing a highly efficient drip irrigation system, which is exactly the best way to water an area like that parkway strip. And no one is really telling him he should not be wasting water like that.

The Austin Water utility’s propaganda does say that intentionally spreading water on pavement is considered illegal, so I don’t mean that literally no one is telling this person that “irrigating the sidewalk” is not legal. I mean that he is not receiving any signal through either the billing system or through any incentive program that wasting water in this manner is not in the public interest, that it is so economically inefficient, that he – and all the rest of us – are paying for his wastefulness by financing increased water supply capacity, needed only for peak demands that are driven by that wastefulness.

It is also notable that considerable efficiency may be gained simply by better educating people who are using well-designed systems about the actual need for irrigation. One city’s conservation department compared actual irrigation rates to ET (evapotranspiration) rates obtained from weather stations and found that most users were drastically over-irrigating. This launched an effort to educate their irrigators, which reduced irrigation water usage city-wide substantially.

All this highlights the systematic neglect of irrigation efficiency on the part of most cities. It seems rather basic that they need to examine the various means of increasing irrigation efficiency that were reviewed above. They need to come up with estimates of the system-wide water savings that could be attained by widespread application of those measures and of the costs of implementing those actions. This then would reveal the price of this “relieved capacity”, and that could be compared with the price to be charged for adding that same supply capacity to the system. Then the city could incentivize deep conservation actions, like highly efficient irrigation systems, at the level that reflects their real value to the overall water supply system. Or they could mandate those that are clearly fiscally efficient for the end user (despite perhaps being more costly to the builder) – like requiring drip irrigation in all new projects – to forestall, or even avoid entirely, having to do things like spend a billion dollars to expand treatment capacity.
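The comparison proposed here is straightforward unit-cost arithmetic: capital cost divided by the gallons-per-day of peak capacity it adds or relieves. Every figure below is a hypothetical placeholder, including the per-site program cost and the per-site peak savings:

```python
# Unit cost of peak capacity: new treatment plant vs. an efficiency program.
# All dollar and capacity figures are hypothetical placeholders.
def cost_per_gpd(capital_cost_usd, capacity_gpd):
    """Capital cost per gallon-per-day of peak capacity added or relieved."""
    return capital_cost_usd / capacity_gpd

new_plant = cost_per_gpd(1_000_000_000, 50_000_000)  # ~$1B plant, assumed 50 MGD
# e.g., drip-conversion incentives: $2,000/site relieving 400 gal/day of peak
efficiency = cost_per_gpd(2_000, 400)
print(f"new treatment capacity:        ${new_plant:.2f} per gal/day")
print(f"irrigation-efficiency program: ${efficiency:.2f} per gal/day")
```

On these made-up numbers the relieved capacity costs a quarter of the built capacity. The point is not the particular figures, but that a city could compute both sides of this ledger and set its incentives accordingly.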

So back to that person who set the sprinkler on the sidewalk, you’ve got to figure out if the fiscal signals you can reasonably send will influence this behavior in a meaningful manner, and if not, then how efficiency could be enforced in order to proliferate it. This is an effort that most cities have so far chosen not to pursue, and so irrigation efficiency, despite that BIG NUMBER noted at the beginning, remains a neglected stepchild. Changing this might have, by itself, allowed Austin to delay construction of its new water treatment plant by a decade or more. How many more cities could, in essence, gain a new reservoir’s worth of capacity simply by investing aggressively in irrigation efficiency?

 

It starts with a vision …

February 28, 2013

As an aid to transcending the mental models that pose barriers to deep conservation, it is useful to offer a “vision” of organizing our water infrastructure to pursue that end. Back in 1996, I saw the need to set forth such a vision, conveying what a “decentralized concept” strategy of wastewater treatment and reuse might look like when it was fully developed and in place as the “normal” way of running our water resources management system. I drafted the following piece, looking 20 years into the future to 2016, to offer that vision. Never did get anyone to publish it. That is how uninterested most were in the prospects of water shortages at the time. What about now? It is offered again below, to help you better understand the fundamental transformation of the form and function of our water resources infrastructure that will implement deep conservation and move us toward sustainable water.

By the way, “the droughts of … 2009-2013” – just call me Nostradamus II 😉

******************************

IS “WASTE” WATER RECLAMATION AND REUSE IN YOUR FUTURE?

by David Venhuizen, P.E.

Despite a 1985 report by an engineering consultant which indicated that a system of decentralized, small-scale facilities would be the most cost efficient, environmentally benign, and societally responsible way to manage wastewater in the fast-developing Hill Country watersheds, the City of Austin has continued to extend conventional, centralized sewer service to that area.  The institution of a project to study decentralized management methods, ordered by the Austin City Council in 1993, appears to have done nothing to slow this trend.  Potentially leaky sewer lines and problematic lift stations continue to proliferate in the Lake Austin watershed and in the Edwards Aquifer Recharge Zone and tributary areas, all the while piping water away to be dumped in the river rather than beneficially reusing it.

In the meantime, where on-site wastewater systems – known popularly as “septic” systems – continue to be used, these are treated largely as plumbing projects, focusing on the cheapest way to pipe wastewater underground so it won’t come back to the surface on that lot.  In short, the emphasis has been on making it “go away” with very little concern for what happens when the water – and the pollutants it contains – gets to wherever “away” is.  This despite the environmental sensitivity of these areas, especially the recharge zone and nearby contributing area.  These “plumbing” systems also waste precious water resources.  Methods which are more environmentally sound and more societally responsible are readily available, but regulatory agencies are concerned about their operations and maintenance liabilities.

A solution is to integrate these “better” on-site systems into a decentralized management system which also addresses the needs of higher density development throughout the area.  This “alternative” wastewater management system may be more cost efficient, it would definitely be more environmentally benign, and it would drastically reduce overall water demand.  The importance of the latter in this region is highlighted by the current dry spell.

This begs the question — Can such an “alternative” decentralized wastewater management system actually be practical and workable?  Imagine this scenario.  The year is 2016 …

Jim wheels his electric car into the Uplands Commercial Center.  Got to get to the photovoltaic plant today for a new battery pack, he thinks as he drives over to the center’s management office.  Jim’s official title is Southwest Quadrant Water Reclamation Plant Engineer.  Far less officially, he is known by another title.

“Hey, if it isn’t the turd patrol!” cries Ken, operations manager for the commercial center, as Jim is ushered into his office.  Jim and Ken exchange insults and other pleasantries, then head out toward the back of the complex.

“The water reclamation plant is doing great,” Ken reports.  “We scraped the slow sand filter last week after a 4 month run, as usual.  And we pulled pump number 3 for its annual maintenance checkout.  That’s all we’ve had to do since your last inspection.”

Jim thinks as they walk along about that term, “water reclamation plant”.  They used to be called “wastewater treatment plants” he recalls.  Jim wonders why anyone would have ever thought of this resource as “waste” water.  In his job as the “turd patrol”, he provides quarterly inspection services for industries and commercial management companies that operate and maintain their own reclamation plants, and he is the chief inspector for all the city-owned reclamation plants that serve residential and neighborhood commercial areas.

As he and Ken approach the plant, Jim can’t help but notice the expanse of metal roof gleaming in the sunlight.  He knows that the commercial center’s large rooftop is a major rainwater harvesting facility in this section of town, providing all the water for the center – easy to do since all its “waste” water is reused – and part of the residential water demand in nearby areas.  Overall, rainwater harvesting supplies about a third of the annual water use in the southwest quadrant.

The water reclamation plant isn’t much to look at as Jim walks up to it.  Just concrete boxes and tanks with domes covering them.  The first box is a septic tank.  Water coming out of it is sprayed onto intermittent sand filter beds in the larger tanks.  Water coming out of that filter flows into a holding tank, to be routed through a slow sand filter in the last box, then finally through an ultra-violet light disinfection unit.  Pumps to move water through the system are the only mechanical parts.  The plant is so simple and unobtrusive, Jim thinks, that it’s no wonder Ken never has complaints about system operation.

Water coming out of this plant is practically up to drinking water standards.  But with all the non-potable water demands – toilet flushing, landscape irrigation and cooling tower supply – sitting right there to use the reclaimed water, Jim knows there is no point in further treatment.  He wonders why, back in the 1990’s, it was thought of as intelligent to use drinking quality water for these purposes at the same time all that “waste” water was piped away and – well – wasted, at considerable cost.

Jim marvels that before the “Water Revolution” was started by the City of Austin in the late 1990’s, bringing these simple technologies into broadscale municipal use for decentralized wastewater management, all of them had languished, hardly ever used by cities, even though they had been in existence for over 100 years at that point, and were known to be well-proven, reliable methods.  Times sure have changed, Jim thinks, since people actually thought this “waste” water should all be treated at one large, complex, electricity-hungry plant and then dumped in the river.

The Uplands Commercial Center was the first major project to be installed using this small-scale treatment and reuse concept.  Since then, all development in the southwest quadrant had employed these methods, and even formerly sewered areas had “unhooked” and converted to reuse systems.  Jim recalls the stir that was created, because of concern about aerosols from cooling towers, when Westlake Village became the first development to change over.  But reuse had been proven to be safe by that time, and people soon came to accept it as readily as they had previously accepted the liabilities of frequently overflowing lift stations used in the old centralized management concept.

It doesn’t take Jim very long to give the plant a good “once-over” and take a couple water quality samples.  He trades one last good-natured insult with Ken, then drives to the first of the four quarterly inspections of city water reclamation plants he has scheduled for this day.

As he drives along Bee Cave Road, Jim exchanges a wave with George, a field operator for the Cooperative Council.  Jim recalls how very small-scale reclamation systems – once called on-site wastewater systems – were also integrated into the overall area-wide management system in the late 1990’s. Introduced into this area by a local engineer way back in 1987, sand filter treatment and subsurface drip irrigation “disposal” became the standard small-scale reclamation system for the same reason the Uplands uses these technologies – they are extremely simple and stable.  All the local jurisdictions banded together, forming the Cooperative Council to coordinate management of these systems.  George was no doubt on his way to do semi-annual inspections of some of those small-scale reclamation systems.

On down the road, Jim passes the Lake Point subdivision, one of the last urban fringe developments to use a “waste” water system which mimicked the old centralized strategy – conventional sewers, a package plant, and a land-dumping disposal system.  They thought they were just waiting for a trunk sewer to be built to that area so they could waste their water more “efficiently”.  Jim remembered what a hassle it was dealing with that package plant.  Since it had been replaced with a sand filter plant and a reclaimed water distribution system was installed in 2010, the system had been far easier to manage, not to mention less wasteful.

Jim thinks as he drives along that it’s hard to believe people once had a problem with neighborhood treatment and with reusing reclaimed water for irrigation and toilet flushing, or that industries and commercial developments weren’t expected as a matter of course to recycle or reuse their “waste” water. Nowadays, he reflects, they enthusiastically embrace these concepts because of the water savings they afford.  The droughts of the late 1990’s and of 2009-2013 had a lot to do with that, of course.  By going to extensive reuse, the Austin area had managed to avoid the “water wars” which plagued other area cities until they got smart and did the same thing.  Now, Jim knows, about half of all water demand in the metropolitan area is supplied by direct reclamation and reuse.

But no time for idle thoughts today, Jim realizes.  He still has to inspect the city reclamation plants and get his water samples to the lab, then get the reports of the day’s inspections prepared.  The job is never over till the damn paperwork is done, Jim sighs.  And, oh yes, he still has to get to the photovoltaic plant for that battery pack ….