HIGH FLYERS THINK TANK


National Research Priorities Strategic Forum

The Shine Dome, Canberra, 26-27 June 2002

A critique of the status quo, with respect to the operation of research organisations
by Bruce Hobbs

Bruce Hobbs is Deputy Chief Executive, Strategic and Investment Planning, CSIRO. Within CSIRO, he was previously Deputy Chief Executive, Minerals and Energy; Chief, Exploration and Mining; Chief, Geomechanics; and Chief Research Scientist, Geomechanics. He was formerly Foundation Professor of Earth Sciences at Monash University. He is an Honorary Professor at Monash University and the Sir James Foots School of Mineral Resources, University of Queensland. He was awarded the Jaeger Medal by the Australian Academy of Science in 2001, for distinguished contributions to Earth sciences. His dominating interests at present involve the development and implementation of large national R&D projects that address significant problems facing Australia over the next 25 years.

I might say it is a great pleasure to be here, and some of the talks have been absolutely inspirational to me. I want to go through a bit of a journey that is still in progress. I have not taken the topic that was in the paper (I thought it was about investing in science and measuring the outcomes) but a bit of the critique will come in a moment.

Slide 2

This is 'our' map of MIT. I think it is a good map of a knowledge infrastructure. It has a whole lot of things in it to do with what universities normally do, and it has a whole lot of things to do with what other kinds of people normally do. What I worry about in the present trend that I see in this country is that people haven't defined what this is, as far as we as a country are concerned. It would be nice if we did. And, secondly, if it looked like this – and I think it does look something like this – then we would be trying to optimise all the bits without trying to optimise the system. My preference would be that we take a big overview of all this, as in David Strangway's beautiful example from Canada, and say: how do we actually optimise the system, rather than optimising, say, the outputs of new ventures or the outputs from teaching or something like that? Let's take a systems view of this, and say what Australia wants to achieve rather than what individual entities want to achieve.

Well, many governments are moving to this knowledge infrastructure type of approach, but the trouble is that most governments in Australia – by that I mean state and federal – still try to measure things as though they were a kind of Department of Administration, or a phone-call centre: 'How long did you spend on each phone call, and what did you achieve from it?' They therefore regard the R&D organisations, by which I mean the universities, CSIRO, AIMS and so forth, from a customer/provider point of view. In fact, if you go and talk to DoFA, the Department of Finance and Administration, which I have unfortunately spent a lot of time doing over the last year or so, you find they really have that customer/provider attitude to life. And from that point of view they have enormous difficulty in defining or understanding what the benefits of an R&D organisation are. They want to understand what you actually provide to them as a customer, and that difficulty arises because they cannot readily measure the effectiveness of the R&D organisation's outputs.

My message today is simple. In order to develop a world-class knowledge infrastructure for Australia, it is essential that both governments and R&D organisations view the funding process as an investment, rather than a cost. That is again a beautiful example from Canada. But implementation of such a view carries responsibilities for both governments and R&D organisations. I am going to continuously refer to Canada, because what has happened there is that the government has made it very clear what they expect, which is an accountability that says, 'This is what we achieved,' and that those outcomes are nicely measured by the framework that has been put in place and there is a responsibility on the R&D organisation to do that reporting and to make sure it happens properly. If the funding process is one of investment, then governments must have a way of being convinced of the magnitude of their return on investment – and that their investment is being managed properly.

An essential feature of the knowledge infrastructure of a country is the investment of government in publicly funded R&D organisations that undertake fundamental research, or strategic research that is too risky for industry to undertake. This investment culture implies a shift from a customer/provider to an investment/performer relationship. So my talk is all about: how do you measure performance? It necessitates a move to some form of performance-based contracts with, ideally, a quantitative measure of returns to government. You can see again that the Canadians have gone some way down that route towards quantitative measures, but it is very nicely balanced, I thought, between 'soft' measurement and 'hard' measurement.

This is the way in which many companies and individuals would behave if they wanted to invest in, say, BHP. They would look at the quality of management and management processes – that is, the governance, the finance systems, the strategic planning processes, all that kind of stuff; the track record of the organisation, which is the way in which we normally measure science in this country, say by citation indices, and which gives you some kind of hint as to the likelihood that your return on investment might be rather large; and then, lastly, the likely return on investment, measured in some ideally quantitative way. We measure the track record reasonably well in this country. We hardly ever really think about the quality of management and management processes; in fact, until recently most academic communities would say, 'We don't really want to have anything to do with that at all.' And we are very, very bad, all over the world, at measuring the likely return on investment in quantitative terms.

Here is a critique of the status quo:

  • Vague goals lead to perpetual programs achieving poor results.
  • We can rarely show what our R&D investments have produced, and we do not link information about performance to our decisions about funding.
  • Many R&D projects have ended up stepping beyond the legitimate purposes of government to compete with – or unnecessarily subsidize – commercial ventures.
  • Finally, many R&D projects directly benefit corporations that could fund their own R&D projects without federal assistance.

Who said that? President George Bush. And in The President's Management Agenda: Fiscal Year 2002, chapter 8, there is a whole discussion on what he – or his advisers – think about the way in which R&D is conducted in the United States. They have now run a trial with the Department of Energy to ask, 'How are we going to do this better?' But the general sentiment expressed in that quote is something that I hear continuously from government.

So a widely recognised problem in specifying outcomes is to define quantifiable measures of effectiveness. I don't know whether you can ever go all the way to getting quantitative measures of effectiveness, but the term 'effectiveness' is used to mean the impact of outputs upon specified outcomes. That is real, strict DoFA-ese that I can speak fluently now. And we heard it again from David Strangway this morning: what is the impact of the research, how do you go about measuring it, how do you go about telling a story that convinces the public, the government or whatever that you have actually been effective in the research that you have done?

The move internationally is away from vague anecdotal statements of achievements, which amount to an attempt to demonstrate compliance with what are commonly equally vague policy statements made by government, to contracts with government based on performance indicators and measures that ideally are quantitative in nature. That point there about 'equally vague policy statements by governments' is just as important as the equally vague, kind-of-anecdotal statements that R&D organisations make. And so this attempt to set priorities or to talk about priorities I consider to be an incredibly important thing to do, because it gets away from these incredibly vague statements that governments make, to saying something quite specific: what does this community expect of you as a group of R&D performers?

Best practice effectiveness measures

Slide 11

Slide 12

We have put this list together from a guy who worked for us in Canada, who is kind of the guru of R&D organisations around the world. I am not going to read this out, but really it says that if you are going to put together some measures, you have to be sure that they address these particular individual entities. Here is a continuation of that. And what is really important again is that if you do this properly, you have a reference for further review of R&D effort and infrastructure. There is a framework there that enables you to make rational decisions about the future, rather than just ad hoc ones. Even if you do change governments, you can go back to a history of performance that enables you to say, 'This was really important. It really delivered,' as opposed to something else that perhaps did not.

Slide 14

What are some measures of inputs and outputs? Some people like to put together diagrams that look like this, and I have just thrown this together. Here is a map of R&D expenditure in the United States, the top 125 or 130 R&D organisations, which are mainly universities. Where do they sit, and where do we sit? You can see that there are four in here and CSIRO actually rates fifth in that. But this is an incredibly good record. You can spread it all out, but you can say that all of the universities in Australia – those that I looked at, at least – fit in the top 120 or so of R&D expenditure in the United States. You can compare that with Oxford, Cambridge and so forth; the story is basically not too bad.

Slide 15

But you can go down this route too, that you can start to worry about patents and so forth. Scientific research leads to patents, which leads to commercialisation, which leads to new companies. So it is absolutely essential that this country have a fundamental basis in scientific research. If you don't have that – and there is a whole story around this that you can develop – you don't go down this track. We have been very bad at getting to here; equally, we have been worse at getting to there; and of course that is the reason we don't have a lot of that. But if we do put in place the processes that enable that stream to happen, I am convinced that this country can be a world beater.

Slide 16

No need to stress the importance of S&T indicators. Every business day there are that many science papers and that many new patent documents; patent revenues in the United States look something like that shown. This is another measure that we could use. How do we stack up against all that? We don't stack up very well. We are good at performance – and I am afraid that this data is a bit CSIRO-centric, but that points to another issue. It is incredibly difficult to get this data for Australia. One thing I would urge DEST or DoFA to do is to start putting in a series of processes that enable these kinds of things to be tracked.

Slide 17

You can see that, at least for CSIRO, of the top 22 areas that ISI looks at, we are in the top 1 per cent of the world in 11 of those fields. ANU is in the top 1 per cent of the world in 17 of those fields.

Slide 19

These are just normal ways in which many people measure inputs and outputs. Here is another little table, and you can see that there are blanks. The reason they are there – I could actually fill them in now, but I couldn't when this slide was made – is that it is very difficult to get this data, in a way that you can actually believe is true, from the annual reports of R&D organisations. The number of US patents that ANU has produced over this time span I still do not know. Somebody knows, but I don't. It really requires an enormous amount of work. If we are going to show the results of the outcomes of our research in any meaningful way in the future, we must have a national system that enables these kinds of things to be tracked.

Slide 20

Australia's patents scorecard shows the areas in which we are weak and declining; we are strong but declining; we are weak but growing; and we are strong and growing. Is that what we want as a country? This is another part of the priority setting process, in my view. What do we actually want that scorecard to look like? Do we want it to just happen in an ad hoc manner, or do we really want to make sure that that kind of thing changes? We will not know it has changed unless we do a reasonable amount of work in tracking that progress over time.

Slide 21

Slide 22

Patent trends in Australia on the basis of US-registered patents: again I am pleased to report that CSIRO does not do all that badly, and compared with the others – BHP in particular, and Comalco and ANU – it is not too bad. But we have not done a lot with it, and that is the issue. What needs to be done now is not just to say that it is there – and the growth rates are not bad either, especially for ANU. The point is that we need to have processes in place, which for me is a priority, to make sure that we do not just put that graph up but go to the next step, which says that the royalty returns have been pretty colossal.

Slide 23

If you look at the royalty returns within divisions of CSIRO – the star, I am sorry to say, is Plant Industry, but still you can't do much more about that – it is not bad, but there is something more important if you look at those comparisons. In other words, Jim Peacock brings in around about 9 per cent of his total income from the royalties stream. You can compare that with what some of the other groups on earth do, and what's the story? I'm not sure. Florida State obviously does something pretty special, but Yale or Stanford, or Harvard, are way below that in some instances.

So what do we want as a country? What do we want the universities or CSIRO or AIMS or whatever to look at on that range of things? Again we will not know unless we spend a reasonable amount of effort in trying to pull the data together. And I tell you, the data is incredibly difficult to get at, at the moment.

Slide 25

Measures of outcomes and effectiveness are a different issue. The NHMRC, as you know, has been quite successful at getting in extra resources, and they have gone about it in a very systematic manner, which is illustrated here. They have said what they want the investment to achieve, and they have then defined what the outcomes of that research are – for instance, research of high international standard. On that basis, they have then asked, 'How are we going to go about measuring this?', and so there is a group of ABS statistics that will talk about that, or ISI or somebody. They have put together a whole portfolio of measurements that would enable government to go back after five years and say, 'Yep, you did a very good job.' Most people in Australia can't do that.

Slide 26

Again the way in which they want to go about this is based on what has happened over the last 200 years or so in Australia, and they would put up many, many, many graphs that look like this, that say that if we really want to target something in this country, it is not much use looking at infectious diseases these days, unless it is a very special infectious disease. The priorities have to be out here in respiratory diseases, or stroke or whatever. So it is that kind of information that needs to be assembled for Australia that enables you to put effectiveness measures in place fairly readily.

Slide 27

CSIRO has tried to do the same kind of thing, to say, 'That is the outcome that we want – Healthy, wealthy and wise – and we are going to do it in these particular areas. These are the outputs that we expect: new knowledge, collaborative R&D, research services, and licensing, patents and spin-offs. And we need to know how you are going to measure all that.' One of the ways in which we did that was to go to the Centre for International Economics and show them maybe 100 of the projects that we have done over the last five or 10 years, asking them, 'What kind of system can you see in all this?' They have come out with effectiveness measures that are based on how CSIRO contributes to the triple bottom line of Healthy, wealthy and wise, and – because this is the kind of thing that the people in Finance want to know – how you price products, and what kind of quantity and quality you have. And then there is the kind of thing that we normally generate in universities: 'What is the amount of money you spend per research project? What does the ISI say about you?'

The systematic approach behind this is to show what CSIRO does and delivers: CSIRO's pathway to the triple bottom line.

Slide 29

What I am trying to get at here is that there are ways of putting together structures that say: what do I expect to deliver, what are the ways of measuring that, and what is the ultimate contribution of our research to the impact of that work on Australia? I believe that unless you go down that track, then it is going to be very difficult to make cases for your research actually having made an impact.

The effectiveness indicators that the Centre for International Economics came up with are here. They are about lower/more competitive production costs; improved quality goods and services; new products, services and businesses; reduced risk (economic, social, environmental); development of skills (enhanced human capital); improved human health, safety and well-being; informing policy (more cost-effective public programs); reduced pollution; and improved environmental health.

Let me cut to the chase. The point about it all is that it is possible to develop a structure like this which actually addresses the triple bottom line – the first three of those are in the economic area, the next three are in the social area and the last three are in the environmental area – which you can place, given enough money and time, some real quantitative measures on. Those quantitative measures then enable you to go through a process of developing a benefit-cost analysis, if that is the way you really want to go. We have done that quite a bit.
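The benefit-cost analysis mentioned above reduces, in its simplest form, to discounting streams of costs and benefits to present values. The sketch below is illustrative only: the project figures and the 7 per cent discount rate are invented for the example, not taken from any CSIRO or Centre for International Economics study.

```python
# Illustrative benefit-cost analysis for an R&D investment.
# All figures (costs, benefits, 7% discount rate) are hypothetical.

def npv(cash_flows, rate):
    """Net present value of yearly cash flows; year 0 is undiscounted."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def benefit_cost_ratio(benefits, costs, rate=0.07):
    """Ratio of discounted benefits to discounted costs."""
    return npv(benefits, rate) / npv(costs, rate)

# Hypothetical project: $2m/year research cost for 5 years,
# with benefits of $1m-$8m/year flowing in years 3-9.
costs = [2.0, 2.0, 2.0, 2.0, 2.0, 0, 0, 0, 0, 0]
benefits = [0, 0, 0, 1.0, 2.0, 4.0, 6.0, 8.0, 8.0, 8.0]

bcr = benefit_cost_ratio(benefits, costs)
print(f"Benefit-cost ratio: {bcr:.2f}")
```

With these invented numbers the ratio comes out well above one; the point is simply that once the effectiveness indicators are quantified, the same investment machinery applies to R&D as to anything else.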

But let me put a slightly different slant on this and describe how we have tried to go about developing priorities, at least over the last two years. We have gone through a process of asking, 'What is going to happen in Australia, say over the next 25 years?' It has to be based on this kind of data.

Slide 32

You would notice that the latest census came in with a total fertility rate of 1.7, and the government has just announced a net immigration per annum of 110,000. That can be tracked over time, but the implication of that is that the population of Australia in 2025 will be 25 million, and that has a whole lot of implications as far as energy production, water use, quality of life and so on are concerned.
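The arithmetic behind that projection is easy to check. In the sketch below, the 2002 base population and the natural-increase figure are round numbers assumed for illustration; only the net immigration of 110,000 per year comes from the announcement quoted above.

```python
# Back-of-envelope check of the 2025 population projection.
# Assumed inputs: a 2002 base of ~19.6 million and natural increase
# of ~120,000/year are illustrative round numbers, not ABS figures;
# net migration of 110,000/year is the announced figure.

def project_population(base_millions, years, net_migration_m, natural_increase_m):
    """Linear projection: base plus years of migration and natural increase."""
    return base_millions + years * (net_migration_m + natural_increase_m)

pop_2025 = project_population(
    base_millions=19.6,       # assumed 2002 population
    years=23,                 # 2002 -> 2025
    net_migration_m=0.110,    # 110,000 per year, as announced
    natural_increase_m=0.120, # assumed births minus deaths
)
print(f"Projected 2025 population: {pop_2025:.1f} million")  # -> 24.9 million
```

A simple linear projection with plausible 2002 inputs lands on roughly 25 million, consistent with the figure given in the talk.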

Slide 33

Out of that comes, after a lot of other analysis, 'All right, let's take one of those things, like CO2 emissions.' We know what the CO2 emissions are going to be if you stay as you are at the moment and don't do anything about it, but if you plotted out over time what were the technologies that would be available in order to reduce those CO2 emissions, they would automatically assemble themselves into two bits. One is to basically do things that you normally do now. Lots of companies do it: reducing the compression ratio, using variable valve timing and so forth. That will take the emissions from Australia to a certain level, but it is not incremental, it is something that has to be done, and in my book it would be a priority for Australia to carry that line of research – assuming that you think that reducing CO2 emissions is an important thing to do. On the other hand, you could take that bold step and say, 'We are going to go to fuel cells, the hydrogen economy and whatever,' and that would take CO2 emissions down much lower. So that is a paradigm shift, and in my view there are always these two lines.
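The two 'lines' described above can be sketched as diverging emission trajectories from a common baseline. All of the rates below are hypothetical, chosen only to illustrate the shape of the argument, not actual Australian emission figures.

```python
# Two illustrative CO2 trajectories from a common baseline:
# incremental efficiency gains versus a paradigm shift (e.g. fuel cells).
# All rates are hypothetical, chosen to show the shape of the argument.

def trajectory(base, annual_change, years):
    """Emissions in each year under a constant annual fractional change."""
    return [base * (1 + annual_change) ** t for t in range(years + 1)]

YEARS = 25
business_as_usual = trajectory(100.0, +0.015, YEARS)  # ~1.5%/yr growth
incremental = trajectory(100.0, -0.005, YEARS)        # ~0.5%/yr decline
paradigm_shift = trajectory(100.0, -0.040, YEARS)     # ~4%/yr decline

print(f"Year {YEARS}: BAU {business_as_usual[-1]:.0f}, "
      f"incremental {incremental[-1]:.0f}, "
      f"paradigm shift {paradigm_shift[-1]:.0f}")
```

The incremental curve flattens emissions relative to business as usual; the paradigm-shift curve takes them to a different level altogether, which is the distinction being drawn between the two lines of research.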

Slide 34

The same kind of thing can be done with respect to fuel efficiency. You can scale up, do steam temperature, but if you go into solar enhanced fuels et cetera, you are in a completely different class of things.

Slide 35

As far as we are concerned, that then places the strategies about how you develop portfolios around this into this kind of space, where one quadrant is strategic priority driven research; universities and ARC would want to have another axis on this graph that is all about curiosity-driven research, but we have one here called priority curiosity-driven research, without any blue-sky bit in it whatsoever. Along the horizontal axis, this is an important national activity and this is a complete paradigm shift. We have emerging areas of science, we have the so-called Flagship Projects, which Graham Harris will talk about in a minute, and we have simply priority-driven research. These are just as important things to do as these, but the thing about this is that the time scale and the magnitude of what you can actually achieve, because of thermodynamics or the nature of the problem or whatever, is intrinsically much smaller than what you can ultimately achieve here. But you must do both, and you must spend a lot of time in here, because this is the basis for ultimately doing these other two in the future.

Session 6 discussion