29 August 2023

Randomised testing good for policy development, says Leigh

By Chris Johnson

Policy needs more rigorous evaluation, says Andrew Leigh. Photo: George Tsotsos.

If we tested new ideas the same way we test new pharmaceuticals, we’d be a better nation.

That’s the opinion of Andrew Leigh, assistant minister for competition, charities, treasury and employment.

In an address to the National Press Club on Tuesday (29 August), Dr Leigh helped launch the Australian Centre for Evaluation, saying policy development in the public service should undergo more rigorous evaluation and include randomised testing.

At times speaking directly to public servants, the Federal Member for Fenner suggested that they themselves could drive change in how policy is developed.

“Public servants can become better consumers of evidence,” he said.

“When claims are made about the effectiveness of a program, ask about the quality of that evidence.

“Is it a single before-after study, or a systematic review of multiple randomised trials?

“Each of us can work to raise the evidence bar.

“Public servants can also help produce evidence about what works. If you spot an opportunity to run a high-quality evaluation in your agency, I encourage you to engage with the Australian Centre for Evaluation.”


Announced in the 2023 Federal Budget, the centre was given $10 million over four years and an initial 14 staff to work across government agencies to improve evaluation capabilities, practices and culture.

“A core role for the centre will be to champion high-quality impact evaluations, such as randomised policy trials,” Dr Leigh said.

“Past reports have clearly shown the need to improve the quality of evaluation across government.

“Work done for the Thodey Review of the public sector found that the quality of evaluation was ‘piecemeal’.

“Some high-quality evaluations have been conducted, including by the behavioural economics team in the Department of Prime Minister and Cabinet.

“But in many other areas, the capacity to conduct rigorous evaluation is lacking.”

Dr Leigh cited a report from the Australian Evaluation Society estimating that in 2021-22, the Commonwealth procured 224 evaluations from external consultants at a cost of $52 million.

“One problem for consultants is that there isn’t much incentive to undertake a high-quality evaluation,” he said.

“… the better that consultants design their evaluation, the less likely they are to produce a report that shows the program worked. Which may make it harder for them to win the next contract.

“That’s why we’re also encouraging agencies to rebuild their own in-house evaluation capabilities and consider partnering with the Australian Centre for Evaluation…

“Another reason that consultants’ evaluations may fall short is if they are commissioned to produce evaluations late in the process when there is insufficient planning and data available.

“So the Australian Centre for Evaluation will also be working with government agencies to strengthen evaluation planning, especially in new budget proposals, and ensure that evaluation is considered at all stages of policy and not seen as an afterthought.”

Dr Leigh said embedding evaluation in the work of government could take many forms over time – and not just running pilot studies.

He said when policies were rolled out to different sites over time, randomisation could be built into that rollout, which would guarantee a rigorous evaluation. When funds were being allocated across states and territories, it would be possible to provide resources for those jurisdictions willing to conduct rigorous experiments.
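
As a rough illustration of that idea (not part of Dr Leigh's speech, and using made-up site names and wave counts), a staged rollout can be randomised simply by shuffling the list of sites and splitting it into waves, so that sites still waiting to receive the program serve as a comparison group:

```python
# Illustrative sketch only: randomly assign hypothetical sites to rollout waves,
# so that sites still waiting for the program can serve as a comparison group.
import random

sites = ["Site A", "Site B", "Site C", "Site D", "Site E", "Site F"]  # hypothetical site names
n_waves = 3  # hypothetical: roll the program out in three stages

random.seed(2023)      # fixing the seed keeps the assignment reproducible and auditable
random.shuffle(sites)  # randomise the order in which sites receive the program

# Split the shuffled list into roughly equal waves
waves = [sites[i::n_waves] for i in range(n_waves)]

for wave_number, wave_sites in enumerate(waves, start=1):
    print(f"Wave {wave_number}: {', '.join(wave_sites)}")
```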


He said more support should be given to programs backed by the best evidence.

“In the face of major challenges, low-quality evaluation is a hindrance, not a help.

“Using dodgy impact evaluation techniques is like doing your running training with a slow watch. It might make you feel like you’re fleet-footed, but when it comes to race day, you’ll eventually be shown up.

“That’s why researchers in areas such as pharmaceutical development are committed to using randomised trials.

“They recognise the importance of accurately evaluating new treatments. They know poor evaluation of medical treatments can cost money and lives.”

Dr Leigh said the Australian Centre for Evaluation would seek to take the same approach to policy – testing new ideas with the same methods used to test new pharmaceuticals.

“In the face of hard problems, we must bring more than a crash-or-crash-through mentality. We need to show up with a willingness to rigorously evaluate those solutions,” he said.

“We need to bring enough modesty to the task to acknowledge that answers which sound right may not always work in the real world.

“To generate and sustain a culture of continual learning, we need to be open to being proven wrong, and to use that information to do better the next time.

“We need to accept honest feedback – not pretend to get by with a dodgy wristwatch.”

Join the conversation

I like Leigh; he’s one of the few trying to make the actual governing of us all better. But as Peter Graves points out below, this idea won’t go far with ministers or the secretaries employed to push their barrows. Robodebt is an extreme example, but an exemplar of how the modern public service has been reshaped over the last 25 years to deliver what the minister wants, and good policy be damned.

Stephen Saunders, 6:10 pm 31 Aug 23

Hypocrite. The most partisan backer ever, of massive levels of immigration. No question of evaluation, or taking voters into account, it’s his way or the highway.

How do you figure that? Immigration is the only thing that’s kept our economy going for decades, and more workers = more growth in most economic models, so why wouldn’t an economist support immigration?

Start with government pandemic policies.

Balance needed, 4:23 pm 30 Aug 23

Fascinating how evaluation is back on the agenda. When I headed up a Departmental Evaluation Unit in the mid-90s, “evaluation” was all the rage. Whole conferences were devoted to it. Our job was to instruct policy-developing areas to “think evaluation first”, build the strategy in from the start, and collect baseline data to allow effectiveness to be measured.
Then the buzz all seemed to disappear. Maybe Governments didn’t want their programs to be evaluated after all.

Peter Graves, 4:41 pm 30 Aug 23

Page 17 of “The Performance Framework of the Australian Government, 1987 to 2011” by Keith Mackay, OECD Journal on Budgeting, Volume 2011/3, provides the answer as to what happened to formal APS evaluation requirements in the 1990s. This is a verbatim extract:
3.2.3. Abolition of the evaluation strategy
There had been considerable opposition on the part of line department secretaries to the creation of the evaluation strategy in 1987, mainly on the grounds that they viewed it as an intrusion on their areas of responsibility. However, once the strategy had been established, there was little stated opposition to it during the following decade. All this changed after there was a change in government.
With the advent of the Howard government in 1996, and consistent with the government’s push for less “red tape”, line departments pressed for less oversight by, and reporting to, the DoF. Line departments also took the opportunity to highlight the burden to them of planning and conducting evaluations.
One particular concern was the requirement for preparation of portfolio evaluation plans (PEPs). Some of these had increased in size from a recommended 20 or 30 pages in length, to over 120 pages, with a concomitant increase in the administrative workload necessary to prepare them. A consensus had emerged within the bureaucracy that while it was important to have evaluation findings available to assist decision making by programme managers and by the Cabinet, detailed and elegantly worded plans were not necessary to achieve that objective. These arguments immediately found a receptive audience with the new government, which therefore decided in 1997 to abolish the evaluation strategy, including its four formal requirements:
● That every programme be evaluated every 3-5 years.
● That portfolios prepare annual portfolio evaluation plans.
● That new policy proposals indicate how they would be evaluated.
● That evaluation reports should normally be published.

With respect to the effectiveness/usefulness of randomised controlled trials, there are many, many RCT studies (you only need to look at the implementation science discipline in health) where this testing of ideas has failed to be scaled up or to provide any useful information for policy and contexts. There is ample literature that demonstrates clearly that RCTs are not suitable in complex social settings (including health), and that RCTs cannot answer certain research questions relevant to policy and program needs. They are expensive, time consuming and limited in scope, and whilst useful in lab conditions they typically break down in real-world settings.

Just a few questions for Andrew Leigh, in general about his new evaluation centre and specifically about his admiration of randomised controlled trials and their ability to be applied in complex social settings:

With respect to the government’s ‘new’ evaluation centre:

  • Hasn’t the federal government already got this infrastructure – for example, the Australian Institute of Health and Welfare, Australian Institute of Family Studies, Australian Institute of Criminology and the Australian Education Research Organisation, to name a few – all government statutory bodies with specific sector/subject-matter expertise as well as the ability to provide evaluations? The questions then are why previous governments were outsourcing evaluations to consulting firms, and why the current one is not utilising existing mechanisms – surely the 14 people in the new organisation do not have the expertise/knowledge of the many others in our existing ones?
  • Why do economists and the Department of Treasury think they are in the best position to evaluate effectiveness and efficiency in complex social settings such as health, education and housing? Whilst they can assess the dollar figures, they would have no context for, or insight into, the systems, contexts and values in which these problems occur.

Peter Graves, 12:25 pm 30 Aug 23

“Dr Leigh said embedding evaluation in the work of government could take many forms over time – and not just running pilot studies.”

There is a significant difference between beginning the implementation of evaluation across the APS and embedding its practice and responsiveness in management. One measure will show how seriously this is being taken by Secretaries: the level at which “evaluation” is carried out within each agency.

One Department I worked in initially had no Evaluation Unit; then a new Secretary introduced a Deputy Secretary (Evaluation); then that Secretary’s successor downgraded the responsibility to a Director (EL 2).

Evaluating effectiveness cannot be an optional practice, but has to be mandatory – as the American Foundations for Evidence-Based Policymaking Act has required since 2018. This was followed by Presidential OMB instructions on implementing the Act in June 2021. It is also about more than producing Australian evaluation reports – senior managers should be applying their recommendations.

The Public Service Act states that the APS is “an apolitical public service that is efficient and effective in serving the Government, the Parliament and the Australian public”. Evaluation is an important means of demonstrating that highly desirable effectiveness.
Dr Peter Graves
Lecturer, Policy Monitoring and Evaluation
University of Papua New Guinea.
