The Waterblogue has featured the idea that building-scale rainwater harvesting (RWH) can provide a significant contribution to water supplies in this region. For example here, here, and here. Raising the obvious question, what actual contribution could this strategy make?
Discussing the merit of and prospects for building-scale RWH as a water supply strategy in this region with a colleague, I was surprised when he confronted me with this proposition: according to state policy, in order to be considered a functional water supply strategy, a method must be capable of delivering a “firm yield” through a repeat of the “drought of record”, and under that definition most building-scale RWH systems simply do not exist; they deliver no recognized water supply!
Let that sink in for a moment. A well-planned, well designed building-scale RWH system around here is typically expected to be able to provide in excess of 95% of total demand over a period of years, which would include a period of drought. An example is the “right-sized” system summary shown below produced by modeling the period 2007-2023, inputting Austin rainfall records over that period. This time period includes 2008-2014, which is reported to be the new “drought of record” period for the Highland Lakes, which is the major watershed-scale RWH system that serves this region.
[click on image to enlarge]
As you see, we can readily choose system sizing relative to the expected level of water usage to create systems that can indeed deliver 95% or more of the total water supply over the modeling period. So to the question: if the system does not deliver the total water supply needed through a drought period, what would it take to assure this strategy does deliver a functional, secure and assured water supply?
I posed this matter in the TWDB-funded investigation “Rainwater Harvesting as a Development-Wide Water Supply Strategy” that I ran for the Meadows Center at Texas State University in 2011-2012. I set forth the idea of “right-sizing”, that rather than having to pay to upsize the RWH system to cover the last little bit of demand, it would be more cost-efficient society-wide to install a “right-sized” system that would provide the vast majority of total water demand, and to provide for a backup supply of the small amount of shortfall, which would be needed only through bad drought periods. As can be seen in the table above, for example, to provide 100% supply at 45 gallons/person/day to a 4-person household would require 5,000 sq. ft. of roofprint and a 42,500-gallon cistern. But that system could be downsized considerably, to 4,000 sq. ft. of roofprint with a 30,000-gallon cistern – saving a ton of money – and would still have provided 97.5% of total demand through the modeling period. So the question becomes, how to assure that 2.5% shortfall could be provided by other means.
The presumption behind this concept is that there is not an unlimited market for development. The development that would be provided water supply by building-scale RWH systems would displace development that would otherwise have drawn its water supply from the watershed-scale RWH system, rather than being development in addition to it. So the supply being provided by building-scale RWH would be supply left in the watershed-scale system storage pool most of the time, presumably not drawing it down as severely as if all those building-scale systems had instead been routinely supplied by the watershed-scale system. Thus the watershed-scale system would have the “slack” to provide the relatively small amount of backup supply to the building-scale systems through the drought periods.
In the example above, “right-sizing” at 4,000 sq. ft. and 30,000 gallons, the table shows a total backup supply of 28,000 gallons would have been needed through the drought period 2008-2014, or just 4,000 gallons per year on average, out of a total modeled demand of 65,700 gallons/year. That system would have been 94% supplied by the building-scale RWH system through that 7-year drought period, and as noted above 97.5% supplied through the total 17-year modeling period.
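These figures can be sanity-checked with a quick back-of-envelope calculation, using only the numbers quoted above (a sketch of the arithmetic, not the modeling itself):

```python
# Back-of-envelope check of the "right-sized" system figures quoted above.
demand_gpd = 45 * 4                  # 45 gal/person/day, 4-person household
annual_demand = demand_gpd * 365     # 65,700 gallons/year, as stated

# Drought period 2008-2014: 28,000 gallons of backup over 7 years
drought_backup = 28_000
drought_supplied = 1 - drought_backup / (annual_demand * 7)   # ~0.94

# Full 17-year (2007-2023) modeling period at a 2.5% shortfall
total_shortfall = 0.025 * annual_demand * 17   # ~27,900 gallons

print(annual_demand, round(drought_supplied, 3))
```

Note that the 2.5% shortfall over the full 17 years works out to roughly 27,900 gallons, which is consistent with the 28,000-gallon backup total: essentially all of the shortfall falls within the 2008-2014 drought period.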
The question, of course, is if indeed the watershed-scale RWH systems – such as the Highland Lakes in this area – would have the capacity to provide that backup supply demand through a drought period, as well as continuing to serve all the development that routinely draws from it. My colleague, while acknowledging the logic of my argument, asserted that the “growth model” presumes that the watershed-scale systems serving any given area would indeed become completely encumbered by development they serve directly – based I’m guessing on the very circumstance that overall growth around here is projected to exceed the capacity of existing water supplies to service it – so that it’s presumed there would be NO capacity available in that system through a drought of record period.
As best I can translate, it is asserted that the “right-sizing” strategy is illegitimate, because there would be no sources available for backup supply through a drought. Hence the evaluation noted above: if the building-scale system would not carry 100% of the projected supply needs, then for the purpose of planning water supply strategy it is presumed that the system provides NO water supply, is of NO value to the regional water economy.
I find that viewpoint to be, well, strange, contrary to common sense. Does it not seem that if a building-scale system provides in excess of 95% of the total supply over a period of years, that would be supply that the watershed-scale system is relieved of having to provide, and so this is effective water resource conservation, that does have value to the regional water economy? It seems rather dogmatic to simply “erase” the whole building-scale RWH water supply strategy because it would need a minor portion of total supply to be provided out of the watershed-scale system, which the building-scale systems would be totally relieving much of the time. Indeed, one wonders where else in public policy is a 95+% “success” rate deemed “unreliable”? Yet that is what my colleague contends state planning principles presume “must” be so when considering whether to base water supply strategy on any use of building-scale RWH over any given area. That only the capacity of systems sized to deliver 100% of the projected supply can be deemed to exist.
It is little wonder then that we do not see the building-scale RWH strategy being set forth in any of the regional water plans, and thus not having been meaningfully incorporated into the State Water Plan. Here is the sum total of what the 2022 Texas State Water Plan says about building-scale RWH as a water supply strategy:
“Rainwater harvesting involves capturing, diverting, and storing rainwater for landscape irrigation, drinking and domestic use, aquifer recharge, and stormwater abatement. Rainwater harvesting can reduce municipal outdoor irrigation demand on potable systems. Building-scale level of rainwater harvesting, as was generally considered by planning groups and which meets planning rules, requires active management by each system owner to economically develop it to a scale that is large and productive enough to ensure a meaningful supply sustainable through a drought of record. About 5,000 acre-feet per year of supply from rainwater harvesting strategies is recommended in 2070 to address needs for select water users that have multiple additional recommended strategies.”
To put that projection of supply to be provided by building-scale RWH in perspective, if we presume a typical system does provide supply at 45 gallons/person/day for 4 persons, or 180 gallons/day total, each such system would supply 65,700 gallons/year, or about 0.2 acre-feet/year. So a contribution of 5,000 acre-feet/year would require 5,000/0.2 = 25,000 RWH systems of this size, or the functional equivalent, to be put in place. How much growth in this strategy does this projection represent?
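The conversion can be checked directly (one acre-foot is 325,851 gallons):

```python
# How many household-scale RWH systems does 5,000 acre-feet/year represent?
GAL_PER_ACRE_FOOT = 325_851
demand_gpd = 45 * 4                      # 180 gallons/day per system
annual_gal = demand_gpd * 365            # 65,700 gallons/year
af_per_system = annual_gal / GAL_PER_ACRE_FOOT   # ~0.20 acre-feet/year
systems_needed = 5_000 / af_per_system           # ~24,800, i.e. roughly 25,000

print(round(af_per_system, 2), round(systems_needed))
```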
While there is no authoritative database that would provide the number of existing RWH systems, a rough guess that one expert on the subject offered is that there are likely in excess of a quarter million RWH systems – 10 times the number calculated above – in just 7 states, with Texas being the site of a goodly portion of those. Indicating that the 5,000 acre-feet by 2070 projection does not even come close to representing what is already on the ground, routinely producing water supply today.
But, as reviewed above, those who set water planning policy in Texas are loath to accord to this strategy any actual contribution to supply, because of that “firm yield” requirement. So we need to consider if that is indeed sound reasoning, if that is a sufficient reason to exclude all contributions by building-scale RWH systems all of the time, or if we should rethink that.
Might, for example, society be better served by planning for building-scale RWH systems within a “conjunctive use” strategy, under which whatever the backup supply source is would have that capacity “reserved” in some manner? Just as this concept is applied to co-managing surface water and groundwater, so that one source might “fill in the gaps” of the other source’s capacity. To do this of course would require conscious consideration of and planning for building-scale RWH as a contribution to area-wide water supply. Which is absent at present, and so this matter remains “fallow”.
There are hundreds if not thousands of houses, businesses too, around here where folks are making building-scale RWH work as a water supply strategy, successfully arranging for whatever backup supply their systems need on an ad hoc basis. All of the water hauling companies that provide that backup supply report they are confident their business model will remain viable, so those supplies can be maintained into the future. So it would seem that building-scale RWH could indeed be a broad scale water supply strategy, with some intentional planning for assuring backup supplies are provided at need.
The situation can be summed up, that building-scale RWH is not meaningfully included in water resources planning in Texas, upon the “reasoning” that this method would not provide a “firm yield” through a repeat of the “drought of record”. This ignores any prospect for co-managing this strategy with the watershed-scale RWH systems to assure that whatever gaps in firm yield would result would be covered out of the watershed-scale systems. Does this not seem to show a lack of vision among the mainstreamers who control those planning processes?
In pursuit of society’s best interests, it is suggested that this whole viewpoint be revisited. This is another example of how we need to take a peek down the road not taken … so far. As that could make all the difference.
A development called Mirasol Springs is being proposed in Central Texas, along the Pedernales River on the Travis County–Hays County border. The development scheme is shown in the schematic below. It includes a “resort” hotel (the Inn), “branded residential” homes, “resort” cottages, “conservation” lot houses, a research facility, and a farm. This area is a somewhat “pristine” landscape, in particular including much of the – so far – “undisturbed” Roy Creek watershed, renowned by naturalists as a great example of a “native” Hill Country landscape. Thus it is considered an imperative to develop in this area with great sensitivity to this landscape, in particular in regard to water resources management, to blunt the draw on this area’s limited water supply resources and to minimize water quality degradation. The developer’s scheme proposing to accomplish that is set forth on the project’s website here, offering his team’s vision of how to best manage water resources – water supply, “waste” water management, and stormwater management. The “header” of this page reads, “Mirasol Springs will set a new standard for environmentally focused Hill Country development.” Raising the question, would it really?
As reported in “One Water” = the “Decentralized Concept”, there is a broadly supported, but so far largely unrealized on the ground, idea that engineering practice in this area needs to move toward “One Water” practices, and that a better understanding of the way we do this needs to be fleshed out. It was argued in that post that the “One Water” ideal would be most effectively and beneficially delivered by designing efficient water management into the very fabric of development, as if it were a central point, rather than to first arrange for water to “go away” and then to attempt to append on that efficient management at the “end of the pipe”, as if it were an afterthought. With the inevitable conclusion from this being that imparting “One Water” practice will rely in large part on employing distributed management schemes, such as the decentralized concept described in the piece linked above. Let’s take a look at how all that might play out in a setting like Mirasol Springs.
Water Supply
In the vision the developer sets forth on the project website, listed under “Water Use” are four components: surface water, reclaimed water, rainwater harvesting, and groundwater.
Under “surface water”, the website states, “Surface water purchased from the LCRA [Lower Colorado River Authority] will be the base water supply for Mirasol Spring’s [sic] potable water and will meet 100% of our demand.” The first question this raises: just how is this very conventional idea – extracting a water supply from the watershed-scale rainwater harvesting system that supplies the vast majority of water supply in the Central Texas region – a “new standard”?
In this case, as can be seen in the graphic below, the proximate source of the potable water supply would be the Pedernales River, which runs along the border of the project site. The water withdrawn from the river would be pumped into a water supply reservoir to be built on the site. Water would be withdrawn from that reservoir and run through a water treatment plant. The treated water would be distributed in a conventional distribution system, routing water to all of the buildings on the development, requiring distribution lines to be extended to all the various developed areas on this site.
This conventional water supply system would entail a great deal of site disruption. This includes installing the intake structure in the river and a pump system and delivery pipe running up the bluff on the Mirasol Springs side of the river, and installing the reservoir, which would entail excavating the pond and distributing the excavated material on the project site. The distribution lines would cause disruption over and between the developed areas, in particular to get to all the “conservation” lots in the more “pristine” parts of the site, in the Roy Creek watershed.
Raising the obvious question, what could the developer do instead to create a water supply system for the project? Skipping to the “rainwater harvesting” component, the website states, “Rainwater collection from rooftops will be a requirement for larger structures constructed across the property, a practice that is already in use on the ranch. Deed restrictions for home sites will include water capture for irrigation purposes and guidelines for non-water intensive vegetative covers, water conservation-oriented landscapes and xeriscapes. Landscape irrigation on home sites will be restricted to rainwater collection only; no potable water will be allowed for landscape use.” While this laudably proposes to make rainwater collected on site from rooftops a primary supply for irrigation needs, it neglects considering the most “One Water thing” one could conceive here, maximizing the resource value of the water falling upon this site. So perhaps obviating all the expense and disruption of creating and operating the surface water supply system.
Consider the benefits of a water supply derived from distributed building-scale rainwater harvesting (RWH) vs. the surface water system. First and foremost is the efficient use of the area’s strained water resources. The very reason why the developer would pursue groundwater as a backup supply, reviewed below, is that they conceive the possibility that the Pedernales River would run dry, or dry enough to have their water supply curtailed. So why not consider the prospect of not depleting that surface water resource at all?
Second, as noted, with the facilities arrayed at the building scale, site disruption to install the storage pond and the water distribution system would be avoided. As would the inevitable leakage losses that plague such water distribution systems, so largely avoiding that often rather sizable source of water use inefficiency.
Third, the energy requirements to run the building-scale RWH systems would be considerably lower than those required to run the surface water supply system. In the latter, considerable energy would be required to lift the water from the river to the on-project water supply pond, and from the pond to the water treatment plant, to run the more energy-intensive surface water treatment unit, and then to pressurize and move water through the distribution system. In the building-scale systems, any lift from a cistern would be low and the water would only have to be run a very short distance. The treatment unit required to render the roof-harvested rainwater to potable quality would require far less energy than the conventional surface water plant. Not only would this be a fiscal plus for the MUD that will pay the energy bills; since it takes water to make energy – the so-called water-energy nexus – all this energy conservation would also enhance the overall efficiency of the region-wide water system.
Fourth, under the surface water supply scheme, a considerable evaporative loss from the on-project water storage pond would be incurred, at its maximum just when drought would typically be at its worst. Evaporative losses from the covered building-scale cisterns would be minimal, a not-insignificant efficiency advantage for the building-scale RWH strategy.
Fifth and finally, pretty much the entire surface water system would have to be planned, designed, permitted and installed before the first building on the project could be provided a water supply. This is a hefty amount of up-front cost that must be incurred before any revenue-generating facilities may come on line, imposing a considerable “time value of money” detriment. The building-scale RWH facilities, on the other hand, don’t need to come on line until the building(s) each unit serves would be built, so the lag between incurring those costs and being able to derive revenue from each building could be much shorter. Also, it is expected that each of the building-scale systems – except for the Inn – would fall below the threshold to be classified as a Public Water Supply System, so the long and expensive process of permitting these systems through TCEQ could be avoided, a further “time value of money” benefit.
To determine the degree to which building-scale RWH could create a sufficient supply to meet the potable water demands in the buildings, a model would be used, into which the roofprint (water collection) area, the cistern (water storage) volume, and the expected water usage profile would be input. The model would be run over a number of years of historic monthly rainfalls to see how much, and how often (if at all), backup supply would have been needed in each year through the cycles of drought and plenty, and how much water supply would have been lost via cistern overflows during large storms and through extended rainy periods.
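A minimal version of such a model is a monthly cistern mass balance. The sketch below assumes a typical runoff coefficient and the standard 0.623 gallons per square foot per inch of rain; the actual model behind the table above may differ in its details, and the rainfall series shown is hypothetical, not Austin’s record:

```python
def simulate_rwh(monthly_rain_in, roof_sqft, cistern_gal, demand_gpd,
                 runoff_coeff=0.85):
    """Monthly cistern mass balance; returns (fraction of demand met
    from rainwater, total backup gallons needed)."""
    GAL_PER_SQFT_PER_INCH = 0.623   # gallons collected per sq ft per inch
    stored = cistern_gal            # assume the cistern starts full
    backup = total_demand = 0.0
    for rain in monthly_rain_in:
        harvest = rain * roof_sqft * GAL_PER_SQFT_PER_INCH * runoff_coeff
        stored = min(stored + harvest, cistern_gal)   # overflow is lost
        demand = demand_gpd * 30.4                    # average days per month
        total_demand += demand
        draw = min(stored, demand)
        stored -= draw
        backup += demand - draw     # shortfall to be met by backup supply
    return 1 - backup / total_demand, backup

# Hypothetical illustrative year, ~32 inches with a dry summer stretch:
rain = [2.0, 2.5, 3.0, 2.5, 4.5, 3.5, 1.0, 0.5, 3.0, 4.0, 3.0, 2.5]
met, backup = simulate_rwh(rain, roof_sqft=4000, cistern_gal=30000,
                           demand_gpd=180)
```

Running such a balance over the full multi-year monthly rainfall record, while varying roof_sqft and cistern_gal, is what produces the kind of sizing trade-off table shown earlier.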
Based on the outcomes, “appropriate” building design, to increase roofprints – for example the “veranda strategy”, adding on covered patios and porches to add relatively inexpensive additional collection area – and “proper” cistern sizes, as well as the target conservation behavior, could be chosen to make the system as robust as desired. Past modeling of and experience with building-scale RWH in this region indicates that this strategy could provide a quite sufficient supply for much of the interior (potable) water uses at Mirasol Springs.
Through this means, the potential for building-scale RWH could have been evaluated, and the costs of using it could have been compared to the costs of the conventional surface water supply system. And the benefits of avoiding the site-wide disruption entailed in the conventional strategy could also have been evaluated. None of this appears to have been considered by the developer; rather, it seems to have been simply presumed that the surface water supply, the watershed-scale RWH system, was “needed”, that building-scale RWH could be no more than an adjunct supply to defray irrigation usage. Opportunity to set an actual “new standard” forgone.
Now consider the “groundwater” component. The website says: “Groundwater will only be used if surface water is unavailable or curtailed. The goal of the project is to significantly limit the use of groundwater through conservation, including the use of reclaimed water and harvested rainwater, noted above, to meet non-potable water demands. When surface water is not available, Mirasol Springs will utilize groundwater to service the demand for domestic use. No groundwater will be used for landscape irrigation. Good stewardship of groundwater resources will be supported through additional planning and holistic water management measures. There will be no individual water wells. Water availability studies have demonstrated that adequate groundwater is available from the underlying aquifer when the project is required to use groundwater.” Quite a number of claims and caveats there to be considered.
While there is no indication what the parameters of the deal to purchase water from the LCRA may be, it is expected that any curtailment or unavailability would be predicated on the flow in the Pedernales River, which would rise and fall with cycles of drought and wetter times. If a drought were of such severity that river flow would drop so low that LCRA would curtail or ban further withdrawal of the surface water, it would be exactly such a time period that the region’s aquifers would also be under maximum stress.
There is no analysis, however, of “[w]hen surface water is not available”, and so when/if groundwater might be “needed” is entirely opaque. There is no indication, no standard for what would constitute “[g]ood stewardship of groundwater resources”, no idea offered for how those resources “will be supported through additional planning and holistic water management measures.” It all seems to be a “just trust us” proposition, hardly any sort of “new standard for environmentally focused Hill Country development.”
Thus, by plan, groundwater would be prevailed upon to carry the entire potable water supply just when that source too would be most stressed, and so when groundwater withdrawals would be most problematic. But again there is absolutely no analysis of when/if groundwater might be “needed”. So it may be called to question if indeed “adequate groundwater [would be] available from the underlying aquifer when the project is required to use groundwater.” There is no indication that a drought-stressed local aquifer could provide the full potable water demand over any given period, for this or any other development in this area. Indeed, it is the questionable future condition of the local aquifer that led the developer to look to a surface water supply to begin with.
Then too it can be called to question if the treatment requirements for a groundwater supply would be the same, using the same sort of treatment train, as for the surface water drawn from the river. Water quality of groundwater varies considerably across the Hill Country, and “over-drawing” aquifers can cause the quality of water from some wells to degrade. So this is another aspect of the overall scheme that appears to be a bit open-ended.
All this would be imparted by choosing to ignore the readily available “One Water” strategy, an actual “new standard” strategy, of maximizing supply from water falling onto this site.
“Waste” Water Management
While those water supply matters basically rest on analyses that the developer chose not to pursue, and almost certainly sells short the “One Water” supply strategy, in the “waste” water arena, there is a much more clear-cut choice. For the “reclaimed water” component, the website states: “Mirasol Springs will reclaim wastewater from the Inn, the Farm, the University of Texas Hill Country Field Station, and all the home sites in a centralized collection facility that is aesthetically integrated into the landscape. There will be no septic systems. The facility will be equipped with the best technology available for nutrient removal and will reclaim 100% of the effluent for irrigation uses. This wastewater will be treated and used to offset irrigation needs for the property and other non-potable uses. No potable water will be used for landscape irrigation. Also known as beneficial reuse, this process completes the effort to maximize the lifecycle of water usage onsite. There will be no discharge into any creek or river. All wastewater will be collected and returned to a treatment plant.” This word salad begs for examination.
As stated, and as seen on the water systems graphic above, the developer is proposing that a conventional centralized system be installed, collecting all the “waste” water to be treated at one centralized facility. That includes the large “conservation” lots in the Roy Creek valley, the most environmentally sensitive portion of this site, entailing a rather long run of sewer line to collect a relatively small portion of the total amount of “waste” water that would be generated on the overall project. For those lots, to avoid the disruption and pollution – and the cost – those lines would impart, the developer should consider on-lot systems, treating and reusing the water on each lot to serve irrigation demands there. Which, I expect, requires a dose of perspective.
It is noted that the website explicitly states, “There will be no septic systems.” Like that is a good thing. One can read between the lines here that “septic system” is deemed, at best, a secondary good, and likely is presumed to be a source of pollution, that the developer sees as being eliminated by centralizing the “waste” water from the various lots. Ignoring of course that components of the centralized system would themselves be pollution vulnerabilities. Conventional collection lines leak – longer runs of lines impart more leakage, and this becomes worse as the lines age – and manholes in those lines overflow. Lift stations inevitably needed in the terrain on this site will inevitably fail and overflow at intervals. And all this is in addition to the widespread disruption of the landscape that would be entailed in installing the centralized collection system.
This could all be obviated by choosing to pursue a decentralized concept “waste” water management strategy, treating and reusing the “waste” water as close to where it is generated as practical. Again, for the dispersed “conservation” lots, this would be a no-brainer strategy, presuming the use of the sort of “septic system” that is equal to the task at hand. A system providing high quality pretreatment – including removal of a large majority of the nitrogen from the “waste” water prior to dispersal – consistently and reliably while imparting rather minimal O&M liabilities. Then dispersing the reclaimed water in a subsurface drip irrigation field, arrayed as much as possible to serve grounds beautification, the landscaping that would be irrigated in any case, whether the reclaimed water was there or not, so practically maximizing beneficial reuse of the “waste” water resource in the on-lot environment.
The High Performance Biofiltration Concept treatment unit – set forth for distributed treatment duty in “This is how we do it”, and more fully described here – fits the bill here, being by its very nature stable, benign and robust. This is the very treatment technology used, for example, at the highly touted Wimberley “One Water” school, exactly because of that.
Unfortunately this treatment concept is not very broadly known, as the “septic system” market in Texas is so dominated by the “aerobic treatment unit” (ATU), a small, home-sized, bastardized version of the activated sludge treatment process – a process that is by its very nature inherently unstable, and even more so in these bastardized incarnations. And, as reviewed in “Averting a Crisis”, the “septic system” regulatory process in Texas is legendary for neglecting on-going O&M, making it even more critical that “fail-safe” systems like the High Performance Biofiltration Concept be used, especially in a setting like Mirasol Springs.
Noting, however, that assuring “proper” O&M “shouldn’t” be an issue on Mirasol Springs, as the entire “waste” water system, no matter how deployed, would be professionally operated and maintained by the MUD the developer proposes to establish to run the water utilities on this project. All the more reason to use the inherently stable and robust, the more “fail-safe” High Performance Biofiltration Concept treatment unit instead of ATUs, to reduce the load on that O&M system.
Indeed, even the centralized treatment plant the developer proposes would be episodically loaded, with flows rising and falling through the diurnal cycle. So using an inherently unstable activated sludge system for that plant would be a vulnerability, urging the use of the “fail-safe” option there as well.
But again, the major vulnerabilities would be avoided by not centralizing all flows, rather by distributing the system to each building or set of buildings, as would be most cost efficient in each circumstance. Note in particular how this disperses risk. Any problem with the centralized treatment plant would impact on the entire flow, while a problem with any of the distributed treatment units would impact on only a minor fraction of the total flow. And again, there we would be using the low risk “fail-safe” treatment technology.
But the developer forgoes this opportunity, in fealty to the conventional understanding that it is best to centralize all flows to one treatment unit, despite all the pollution potential, and disruption, inherent in gathering flows to that central point. And despite the cost of running the collection lines out to each developed area, and – if the reclaimed water is to be reused for landscape irrigation as the “vision” asserts – the redistribution lines to send water from the central treatment plant to the areas to be irrigated. Again, all that would be avoided under the decentralized concept strategy.
Then there is the matter of the “time value of money”. The centralized system is an “all or none” proposition. The treatment plant would be initially built with the capacity to treat flows from all the buildings on the project, while portions of the development would come on line in phases, so that some of the treatment plant capacity would lie idle in the interim until the project was built out. It is also likely that the collection and redistribution lines would all have to be installed to get the water from all areas to the treatment plant and back to irrigation sites. Here too, investments would sit in the ground, not fully utilized, until the project built out.
A distributed system would obviate all that unrealized value. First by not having to install the collection and redistribution lines at all. And then, by using the improved type of “septic system” noted above, installed for each development area on a “just in time” basis, to serve only imminent uses. For development other than the “conservation” lots, likely a collective system serving more than one building at each treatment center, but still overall a distributed system, not requiring any investment in the larger-scale collection and redistribution lines. Further realizing the “time value of money” by building only those systems needed to serve imminent development, rather than having to plan, design, permit and install the entire centralized system before service could be provided to the first building.
As for treatment quality, the developer appears to presume a need for “the best technology available for nutrient removal”, even though all the reclaimed water would be dispersed in subsurface drip irrigation fields, providing all the treatment and “buffering” that the soil-plant complex offers. The High Performance Biofiltration Concept treatment unit can consistently and reliably produce an effluent with low – typically about 10 mg/L – BOD and TSS, the two basic measures of how well treated the water is. 20-30 mg/L is deemed “secondary” treatment, which is the minimum required to disperse the reclaimed water in subsurface drip irrigation fields. This is mainly to assure that drip irrigation emitters would not clog, as the level of “dirtiness” of the water as measured by BOD and TSS, as long as it is in the “secondary” range, is otherwise irrelevant in a soil dispersal system.
The High Performance Biofiltration Concept unit can also routinely remove a large majority of the nitrogen from the “waste” water, typically producing an effluent concentration of about 15 mg/L. Less than 20 mg/L is deemed to be a “safe” level that would, along with plant uptake and in-soil denitrification we have in this climate, result in a vanishingly small amount of wastewater-derived nitrogen flowing into environmental waters when dispersed in a TCEQ-compliant subsurface drip irrigation field.
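A back-of-the-envelope check of my own (not from the post) illustrates why ~15 mg/L effluent nitrogen is a manageable load for a soil dispersal system. The household flow, the drip field size, and the vegetative uptake rate are all hypothetical assumptions; only the 15 mg/L effluent concentration comes from the discussion above.

```python
# Illustrative sketch: annual nitrogen mass dispersed by a household
# system at ~15 mg/L effluent nitrogen, vs. rough plant uptake over a
# drip field. Flow, field size, and uptake rate are assumed values.

GAL_TO_L = 3.785       # liters per gallon
MG_PER_LB = 453_592    # milligrams per pound

flow_gpd = 180             # assumed: 4 people at 45 gallons/day each
effluent_n_mg_per_l = 15   # effluent total nitrogen cited above

# Mass of nitrogen dispersed per year, converted to pounds
n_mg_per_year = flow_gpd * GAL_TO_L * 365 * effluent_n_mg_per_l
n_lb_per_year = n_mg_per_year / MG_PER_LB

# Assumed: a 4,000 sq. ft. drip field with a modest vegetative uptake
# of ~2 lb N per 1,000 sq. ft. per year (site-specific in practice)
uptake_lb_per_year = 4_000 / 1_000 * 2

print(f"N applied: {n_lb_per_year:.1f} lb/yr, "
      f"plant uptake: {uptake_lb_per_year:.1f} lb/yr")
```

Under these assumed values the applied nitrogen is on the order of the plant uptake alone, before in-soil denitrification is even counted, which is the sense in which the residual reaching environmental waters would be “vanishingly small”.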
Phosphorus – the pollutant of greatest concern in discharges to streams – would be irrelevant here, since at the concentrations found in domestic wastewaters, phosphorus would be fully “sorbed” in any soil mantle that would provide a decent growing medium.
Bottom line, the High Performance Biofiltration Concept treatment system would deliver an effluent that would be highly protective of the environment, even in this sensitive area, assuming of course that the subsurface drip irrigation systems were well designed, well implemented and well operated. Which of course would be the same condition that the conventional system the developer proposes would have to meet.
The conclusion is that the decentralized concept strategy described here, utilizing distributed “fail-safe” treatment units and dispersing the reclaimed water into subsurface drip irrigation fields, would produce a “waste” water system for this project that would be more fiscally reasonable, more societally responsible, and more environmentally benign than would be offered by the conventional centralized system – with reuse appended on – that the developer proposes.
Stormwater Management
The website is light on how stormwater would be managed on this project. It states under the heading “Watershed Protection and Storm-water Runoff”, “The ultimate goal is to maintain the hydrology of the environment in its current state. This will be accomplished through short-term construction site management strategies that include silt fencing, soil berms and wattles to prevent erosion and silting of nearby streams.” Which seems to confine those efforts to mitigating water quality degradation due to construction activities. Necessary of course as the development is being built, but the major task is to indeed “maintain the hydrology of the environment in its current state.”
In that quest, the website states, “There will only be a few homesites in the Roy Creek watershed, all with a 1,000-foot land buffer between the home [and] Roy Creek [sic]. The engineers recommend allowing for the native vegetation and soil to act as a natural ‘filter,’ as it has done for thousands of years, rather than trying to capture it and then release it from a pond or other structural water quality controls that would unnecessarily disrupt the natural character of the land.” The actual solution here is restricted to the “conservation” lots only, leaving open what is to be done elsewhere, and implying the only alternative is the conventional view of stormwater management, that the site “should” be efficiently drained into an “end-of-pipe” facility – “a pond”. It seems to dismiss the “One Water” strategy of collecting and infiltrating the runoff on a more distributed basis – the Low-Impact Development (LID) strategy utilizing permeable pavement, Green Stormwater Infrastructure (GSI), etc.
The obvious measure of maintaining “the hydrology of this environment in its current state” is to render the rainfall-runoff response of the developed site as close as practical to that of the native site. This dictates that, up to the rainfall depth where runoff would begin on the native site, after all the “initial abstraction” were “filled”, all runoff should be intercepted and caused to infiltrate. While some of that infiltration might be imparted by flow over the “natural filter” in downslope areas, more generally some of that infiltration would have to be “forced”, with permeable pavement or by running it through GSI, such as distributed rain gardens. It seems the developer has not considered this basic “One Water” concept, choosing to rely on a more conventional end-of-pipe management scheme. Perhaps entailing the installation of grey infrastructure to convey flows from developed sites to ponds and such, as seems to be implied in the “Mirasol Water Systems” schematic above. It is called to question how well this could maintain a rainfall-runoff response very similar to that of the native site.
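The “initial abstraction” idea above can be sketched with a toy calculation of my own. This is a deliberate simplification, not a design method, and the abstraction depths, storm depth, and GSI capture depth are all hypothetical assumptions chosen to illustrate how distributed GSI can push a developed site’s runoff response back toward the native one.

```python
# Illustrative sketch: runoff from a native site vs. a developed site,
# with and without distributed GSI that intercepts and infiltrates
# runoff up to its design capture depth. All depths are assumed.

def runoff(rain_in, abstraction_in, gsi_capture_in=0.0):
    """Runoff depth (inches) after the initial abstraction is filled,
    less whatever the distributed GSI intercepts and infiltrates."""
    excess = max(0.0, rain_in - abstraction_in)
    return max(0.0, excess - gsi_capture_in)

storm = 2.0  # inches of rainfall in a single event (assumed)

native = runoff(storm, abstraction_in=1.5)    # native soils and vegetation
paved = runoff(storm, abstraction_in=0.2)     # impervious surfaces
paved_with_gsi = runoff(storm, abstraction_in=0.2,
                        gsi_capture_in=1.3)   # distributed rain gardens, etc.

print(f"native: {native:.2f} in, paved: {paved:.2f} in, "
      f"paved + GSI: {paved_with_gsi:.2f} in")
```

In this toy case, sizing the GSI to capture the difference in abstraction depths makes the paved area’s runoff match the native response for the modeled storm, with only larger storms overflowing downslope, which is exactly the mimicry argued for above.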
The website further asserts, “Considerations will also include restrictions on impervious cover to prevent run-off and divert water into the aquifer.” Disregarding the non-sequitur, it should be clear that the LID/GSI strategy, infiltrating runoff from impervious surfaces on a highly distributed basis, is the manner in which one could reasonably “prevent run-off and divert water into the aquifer”, particularly on the more intensive portions of the development, like the Inn and resort cottages, perhaps the “branded residential” homesites too. We’d just have to deal with pavement, since rainwater harvesting would basically take rooftops “out of play”. The water that would have infiltrated over the area covered by rooftops would be captured, stored and later infiltrated, either through irrigation directly or once used in the buildings and becoming “waste” water, then irrigated. That concept was explained in this post.
Now as noted the 1,000-foot “land buffer” would indeed be quite effective in mitigating pollution and the increases in runoff imparted by development on the “conservation” lots, but of course there would have to be constructions to cause any concentrated flows to disperse into overland flow, so the scheme would not be quite so “automatic” as the website appears to present it. GSI, such as full infiltration rain gardens, should be installed there as well, to intercept flows off of impervious covers, to directly infiltrate some of the flow and to spread flows over that “natural filter”. This would be particularly so for any rainwater cistern overflows, which would be “concentrated” flows out of a pipe.
The website is totally silent on this LID/GSI approach — the “One Water” approach — implying that the only alternative the developer can conceive to letting stormwater runoff flow away downslope would be to first route it to “a pond or other structural water quality controls that would unnecessarily disrupt the natural character of the land.” Noting of course that it is the disruption of the “natural character” of the land caused by development that any such constructions would be installed to mitigate. Again this seems to reflect that conventional bias for gathering runoff into end-of-pipe “ponds”, rather than running it through highly distributed constructions like full infiltration rain gardens, with only “large” storm runoff overflowing on down the slope. Somewhat better mimicking the hydrology of the native site.
Summary
So it is that the water “vision” of the developer can readily be called to question. To sum it up, if the developer of Mirasol Springs is going to style its water management scheme as a “vision”, then perhaps it should impart some. It should follow the best “One Water” practices available in this setting, the water supply, “waste” water management, and stormwater management practices reviewed above. Presenting the conventional scheme the developer proposes as a “new standard” can be quite fairly seen as simply greenwashing that very conventional scheme. If this project is to deliver on its promise of preserving and protecting this “pristine” landscape, a more holistic, more “One Water” strategy will be required.
Or so is my view of this matter. What’s your view?