HIGH FLYERS THINK TANK
Sponsored by:
Innovative technical solutions for water management in Australia
University of Adelaide, 30 October 2006
Group A Decision support sciences
Rapporteur: Dr Brett Bryan
We had a very productive discussion. We worked on filling in the squares of the matrix; the first was energy tradeoffs, in relation to how the decision support sciences can contribute to that area of interest.
Stuart Minchin started us off with a barb about a common perception that there is a technical solution to every water problem, and these solutions range from dragging icebergs up from Antarctica to piping water down from the north of Australia: they do exist, but at a price. And it is all about those tradeoffs where the decision support sciences can contribute to the argument.
There is a need for an informational approach. Tradeoffs require information; they require us to understand how the costs and benefits of water access and water supply trade off against one another.
Some approaches that we discussed were things like embodied energy, which is a common approach used in that sort of field. We extended that to embodied water, and further to energy footprints, carbon costs and so forth, all pointing down the track to more of a full cost accounting, or a triple bottom line analysis, of water supply, to account for those non-market or externality impacts. 'Life cycle analysis' is another term for this kind of thing.
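To make the full cost accounting idea concrete, here is a minimal sketch. Every figure in it (the supply options, their costs and energy intensities, and the energy and carbon prices) is a hypothetical placeholder, not data from the discussion; it simply adds an energy and carbon externality charge to the market cost of each supply option, in the spirit of embodied energy and triple bottom line thinking.

```python
# Illustrative sketch only: all options and figures below are hypothetical
# placeholders, not data from the discussion.
OPTIONS = {
    # option: (market cost $/kL, embodied energy kWh/kL)
    "dam_gravity_supply": (0.30, 0.1),
    "groundwater_pumping": (0.50, 0.6),
    "desalination": (1.10, 3.5),
    "recycled_stormwater": (0.80, 1.2),
}

ENERGY_PRICE = 0.15      # $/kWh, assumed
CARBON_INTENSITY = 0.9   # kg CO2 per kWh, assumed grid average
CARBON_PRICE = 0.03      # $/kg CO2, assumed shadow price on emissions

def full_cost_per_kl(market_cost, embodied_energy):
    """Market cost plus energy and carbon externalities, $/kL."""
    energy_cost = embodied_energy * ENERGY_PRICE
    carbon_cost = embodied_energy * CARBON_INTENSITY * CARBON_PRICE
    return market_cost + energy_cost + carbon_cost

# Rank the options by their full (market plus externality) cost.
for name, (cost, energy) in sorted(
        OPTIONS.items(), key=lambda kv: full_cost_per_kl(*kv[1])):
    print(f"{name:22s} full cost ~ ${full_cost_per_kl(cost, energy):.2f}/kL")
```

The same structure extends naturally to embodied water or other externalities: each one becomes another term in the full cost function.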
Decision support systems, as I mentioned, can contribute knowledge and information to support decisions made by individuals or government policy makers, enabling them to make knowledge-based decisions. They can then assess the costs and benefits and develop the tradeoff curves through these decision support systems.
In terms of tradeoffs, often we don't know where the 'sweet spot' is. And people's values along the tradeoff axes are different, so my optimal point of tradeoff between the cost and benefit of water supply will likely be very different from yours. Tradeoffs are also scale dependent: they play out differently at the local scale compared with the regional, catchment or continental scale, and they differ between geographical localities. We also talked about decision support sciences providing information but not making the decisions; people's decisions then get attenuated by all sorts of social, attitudinal, behavioural and political considerations.
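As an illustration of how a decision support tool might trace such a tradeoff curve and show that different values land at different 'sweet spots', here is a minimal sketch; the benefit and cost functions and the weights are hypothetical, chosen only to make the shape of the argument visible.

```python
import numpy as np

# Hypothetical tradeoff: as supplied volume rises, benefit saturates while
# cost grows faster than linearly (e.g. more expensive sources come online).
volume = np.linspace(0, 100, 201)           # GL supplied, illustrative
benefit = 100 * (1 - np.exp(-volume / 30))  # $m, diminishing returns
cost = 0.012 * volume ** 2                  # $m, rising marginal cost

def sweet_spot(weight_on_benefit):
    """Volume that maximises a weighted benefit-minus-cost score."""
    score = weight_on_benefit * benefit - (1 - weight_on_benefit) * cost
    return volume[np.argmax(score)]

# Two users with different values land at different points on the same curve.
for w in (0.4, 0.7):
    print(f"benefit weight {w:.1f} -> preferred supply ~ {sweet_spot(w):.0f} GL")
```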
A point was also made about environmental tradeoffs, and that often the cheapest water is the most environmentally damaging.
We brought this down to the householder level: start at the home. Decision support systems can help individuals inform their own decisions about water use. That extends to the use of rainwater tanks for capturing rainwater for use and reuse, and also to stormwater attenuation, farm dam releases, and grey water reuse and recycling.
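At the householder level, even a very small model can support a decision such as whether a rainwater tank is worth installing. The sketch below is a crude daily water balance with hypothetical roof area, tank size, demand and rainfall; a real tool would use local rainfall records, but the structure of the calculation is the point.

```python
import random

random.seed(1)

TANK_CAPACITY = 5000.0   # litres, assumed tank size
ROOF_AREA = 150.0        # m^2, assumed catch area
DAILY_DEMAND = 200.0     # litres/day of garden and toilet use, assumed

def simulate_year():
    """One year of a simple daily tank water balance; returns mains litres substituted."""
    level, saved = 0.0, 0.0
    for _ in range(365):
        rain_mm = random.expovariate(1 / 1.5)  # crude stand-in for daily rainfall
        level = min(TANK_CAPACITY, level + rain_mm * ROOF_AREA)  # 1 mm on 1 m^2 = 1 L
        use = min(level, DAILY_DEMAND)
        level -= use
        saved += use
    return saved

print(f"Mains water substituted ~ {simulate_year() / 1000:.1f} kL/year")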
We looked also at the concept of the electricity grid as applied to water. Can we return water to the grid? What are the impediments to quality, health impediments, infrastructure requirements? What about a stormwater grid?
Modelling is a key part of decision support. Decision support has to include multiple objectives, not only environmental and economic but also social objectives. A lot of the modelling so far has been concentrated on basic volumetrics and quantity assessments, and the more difficult and softer areas have not been so well modelled.
The problem formulation, then, and the way we think about these things, have to involve social sciences and economics. That, I think, reiterates a point made this morning.
Water transport and storage infrastructure, maintenance and engineering: our kick-off point was the projection that over the next 10 years there is likely to be about $30 billion of investment in water infrastructure. That needs very good forecasting and very good information. With the advent of water markets and major individual users, decision support has changed from individuals working in the Murray-Darling Basin Commission or the different agencies charged with managing resources to a distributed market.
So we have got a range of different users. Their uses are different and the tools they use are different, right through from coloured pencils and maps on walls to tables, charts, advanced software and fully integrated models. So there is a spectrum, a collection of tools that we discussed. There is a range of things in the modeller's toolbox, from modelling approaches and optimisation right through to qualitative and participatory techniques.
The applications are different, the geography is different and the hydrology is different, but the tools are often the same. We can often throw things from the same grab-bag of tools at similar problems in different areas.
Currently there can be a lack of sophistication of tools, especially visualisation tools, which leads to a lack of accuracy and to increased uncertainty, and that can lead to problems of oversupply, inefficiency and so forth.
We think it is best to start off with easy goals, simple models, and build in complexity as we go, as we start to kick goals. And in terms of water infrastructure, which this topic is about, it is a better defined system and perhaps there is less uncertainty, less complexity involved than in natural systems, so it is perhaps an easier target.
What about communication with the people who need their decisions supported by the decision sciences? 'Why do we need to adopt these new tools?' might be a question typically asked. Well, we need to prove the effectiveness of new tools to cover the transaction costs of people actually taking them up.
Do the users require training, education? Do they need the modellers to sit with them? There are new techniques coming out, like companion modelling and so forth, where models are built in conjunction with decision makers and stakeholders, and they have a key role in model development.
A key need is transparency. Users need to know what is in the model. It can't be a black box. They need to know how it works, even if at a fairly high level; that helps to build trust in users.
Modelling in decision support is 'horses for courses': different models for different problems. And different users require different levels of sophistication in modelling and decision support. So it is not 'one size fits all'.
What about softer DSS: qualitative methods and participatory approaches that bring in people and stakeholders, approaches that perhaps don't require so many decimal points but require more qualitative information?
There was a thought that industry may be more receptive to science than community groups and the individual user, so we need to tailor informational needs to those groups.
Standards, access rights, water quality and quantity, environmental allocations, seasonality: standards are required, obviously, and some exist. We talked about some, including the Australian standard for risk assessment. Standards limit the need to reinvent the wheel; they let us build on existing expertise rather than double up on it. We can build reusable components, and there are examples of that in this industry, the Catchment Toolbox and so forth. Agencies have a tendency to develop customised tools specifically tailored to their own problem, and there is a lot of overlap.
How do we use tools to help the decisions to release environmental water? There are a couple of examples there. Is it better to top up irrigation flows to flood wetlands? We looked at countercyclical trading and other techniques to piggyback on the flows for irrigation to get some environmental benefits as well.
Integration was a key thing we discussed. Models need to be integrated. For example, climate models are used to inform water models, but there is very little feedback so far on how the hydrology of the landscape affects the climate models themselves. So those complex feedbacks need to be included.
We need standards for analysis, in terms of the quality of analysis, as well as data standards. We talked a little bit about model choice, about when to apply different modelling technologies, because that also affects the quality of the solution. And we considered the relevance of historical data: if things are changing, how useful is data that perhaps doesn't cover the scope of potential future eventualities?
Finally, risk management: models need to quantify risk and make it explicit, by including things such as shocks and surprises and extremes. We saw that as being very important: don't only model the expected but model the unexpected to try and cover off on those extreme events. We need to cover uncertainty and quantify uncertainty, make it explicit, make people understand the nature and magnitude of the risk. Covering these shocks and surprises can inform crisis management: 'What do we do if?'
We need to get more sophisticated and perhaps extend scenario analysis, which takes three or four different scenarios and analyses those, with Monte Carlo simulation, which can provide probability distributions over the outcomes.
We need to provide this information as well to individual users. A farmer might think, 'What are my chances of getting a full allocation? Well, I know I need a minimum of 50 per cent of my water allocation. What are my chances of getting that this year? That will affect my decisions of what to plant, when to plant and so forth.'
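A minimal Monte Carlo sketch of that kind of question follows. The scenarios, their probabilities, the shock probability and the inflow needed for a full allocation are all hypothetical assumptions for illustration; the point is that sampling across scenarios, including an occasional extreme, turns a handful of storylines into a probability of getting at least a 50 per cent allocation.

```python
import random

random.seed(42)

# Hypothetical climate scenarios with assumed probabilities and inflow statistics.
SCENARIOS = [
    # (probability, mean inflow GL, std dev GL)
    (0.3, 900.0, 150.0),   # wet
    (0.5, 650.0, 120.0),   # average
    (0.2, 400.0, 100.0),   # dry
]
SHOCK_PROB = 0.05               # chance of an extreme drought on top of the scenario
FULL_ALLOCATION_INFLOW = 800.0  # GL of inflow needed for a 100% allocation, assumed

def sample_allocation():
    """Draw one year's allocation fraction from the scenario mixture."""
    weights = [s[0] for s in SCENARIOS]
    _, mean, sd = random.choices(SCENARIOS, weights=weights)[0]
    inflow = max(0.0, random.gauss(mean, sd))
    if random.random() < SHOCK_PROB:   # model the unexpected, not just the expected
        inflow *= 0.4
    return min(1.0, inflow / FULL_ALLOCATION_INFLOW)

draws = [sample_allocation() for _ in range(100_000)]
p_half = sum(a >= 0.5 for a in draws) / len(draws)
print(f"P(allocation >= 50%) ~ {p_half:.2f}")
```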
There are usually multiple working hypotheses, and these may occur simultaneously. Uncertainty needs to be assessed in those terms. Perhaps we should adopt a multimodel inference framework and allow the end user to explore differences in the different approaches.
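One way such a multimodel framework might be operationalised is sketched below: several candidate models each make a prediction, each gets a weight from an AIC-style score reflecting its fit to past data, and the user sees both the weighted ensemble and the spread between models. The models, predictions and scores here are all hypothetical.

```python
import numpy as np

# Three hypothetical candidate models of next-season inflow (GL), each with an
# assumed information-criterion style score from past fit (lower = better).
model_predictions = {"conceptual": 620.0, "statistical": 540.0, "physical": 700.0}
model_scores = {"conceptual": 102.0, "statistical": 100.0, "physical": 105.0}

# Akaike-style weights: relative support for each model given its score.
scores = np.array([model_scores[m] for m in model_predictions])
rel = np.exp(-0.5 * (scores - scores.min()))
weights = rel / rel.sum()

for (name, pred), w in zip(model_predictions.items(), weights):
    print(f"{name:12s} predicts {pred:5.0f} GL  (weight {w:.2f})")

ensemble = float(np.dot(list(model_predictions.values()), weights))
spread = max(model_predictions.values()) - min(model_predictions.values())
print(f"weighted ensemble ~ {ensemble:.0f} GL; spread between models = {spread:.0f} GL")
```

Presenting the spread alongside the ensemble lets the end user see where the working hypotheses genuinely disagree, rather than hiding that disagreement inside a single number.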
And we discussed the concept of an audit trail for decision making. This comes back to the point about transparency. People making decisions have to be accountable for them; there has to be an audit trail of the information that led to a decision, because these decisions about water have significant impacts at the local and catchment level.
Corporate memory was seen as important, and corporate memory loss in particular. The ability to recall the different models that were used at different times, and the different information, can disappear if a person moves from job to job or leaves the organisation. Capability development was also seen as important. There are not enough skills in this area, in the general quantitative and applied sciences, especially in decision-support modelling, statistics et cetera. As a result, the modelling environment in water resources is not as well developed as in some other fields such as stockmarket trading.
Issues of monitoring and data collection: the more data we collect, the lower our uncertainty, and the better our ability to assess tradeoffs.
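A quick illustration of that point: the uncertainty in an estimated mean shrinks roughly with the square root of the number of observations. The simulated 'gauge readings' below are hypothetical.

```python
import math
import random
import statistics

random.seed(0)
true_mean, sd = 250.0, 60.0   # hypothetical streamflow in ML/day

for n in (10, 40, 160, 640):
    sample = [random.gauss(true_mean, sd) for _ in range(n)]
    se = statistics.stdev(sample) / math.sqrt(n)   # standard error of the mean
    print(f"n={n:4d}: estimate {statistics.mean(sample):6.1f} +/- {1.96 * se:5.1f} ML/day")
```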
There are also definitional differences when we are working in an interdisciplinary environment. Social scientists, economists and biophysical scientists use the same terms in different ways, and we saw risk as a good example of that. People's definitions of what is a 'tolerable' risk might also differ.
Finally, we made some attempt to identify the priority issues, most of which I have already talked about. They include estimating uncertainty and covering off on extreme events; tradeoffs, costs and benefits; transparency, openness and the intellectual property involved in models; fairness, which was a key issue, as was data; the need to involve industry and engage stakeholders in modelling; the whole concept of integrated modelling, of system-wide understanding that includes not only environmental and biophysical integration but people and the economy; an emphasis on decision support, not decision making, because people's decision making is influenced by many external pressures, not just information on the state of the system; and a full cost accounting of benefits and costs.


