6.2. Responding to and using evaluation findings

For each evaluation report, the relevant agency should prepare a written response to the recommendations. The response might agree, partially agree or disagree with a recommendation and should provide an explanation for any partial acceptance or rejection of a recommendation.

Where recommendations have been accepted, or partially accepted, key follow-up actions should be identified, with a time frame specified and the responsible unit named. It is important to identify an individual to coordinate the overall management response, and to agree a deadline by which comments must be provided (usually within two months of receiving the final evaluation report). The management response must also be provided to the DTF. Further guidance on preparing management responses is available on the BetterEvaluation website.[1]

The Program evaluation framework outlines the responsibilities for ensuring evaluation findings are used, with roles for both central and line agencies.

The Cabinet handbook states that agencies are expected to incorporate lessons learned from previous evaluations into program and policy design. This includes relevant evaluations from other states, territories or countries, as appropriate.

CMC, OCPE and DTF have a role in promoting the use of evaluation in government decision making. The Program Evaluation Unit (PEU) within DTF will prepare an annual whole-of-government summary of evaluations for the Budget Review Subcommittee. This will include:

  • a list of the evaluations that have been completed in the previous year, including a status update on recommendation responses
  • a list of any evaluations that were scheduled but did not take place
  • an updated rolling schedule of evaluations for the next four years.

For the evaluations that have been completed in the previous year, the PEU will outline the recommendations from each and note whether or not they have been actioned. This will ensure that agencies respond to the recommendations from each evaluation and help close the loop between evaluation planning and evaluation use.

6.2.1. Program reality checks

There are sometimes gaps between the expectations of program proponents and the realities of program implementation. The Northern Territory Ombudsman has cautioned that:

“Government must be steadfast in its support of new approaches, recognising the realities discussed in the following table – realities that are often overlooked in the turmoil of spontaneous reaction to newsworthy events.”[2]

Box 3, sourced from the Ombudsman’s 2017 report Women in Prison II, outlines program reality checks – many of which are relevant to all programs.

Box 3: Program reality checks from the Ombudsman's 2017 report Women in Prison II[2]
  • No program solves every problem.
  • Anyone can point to a theoretical gap or snag.
  • ‘Better’ is a big step forward. Don’t expect a panacea.
  • No program gets it right from the start.
  • No battle plan survives contact with the enemy. Improving a program over time due to experience is a positive step, not a concession of failure.
  • We all make mistakes, prisoners and staff.
  • Individual failings may make a juicy story but they don’t mean a program is failing.
  • Expect the best programs to be challenging and expect people to falter from time to time.
  • No program works overnight.
  • Don’t expect results today.

These program reality checks are consistent with the learning mindset that is encouraged in the Program evaluation framework.

6.2.2. De-implementation

Keeping the program reality checks in mind, an evaluation may sometimes prompt an agency to consider partly or entirely de-implementing a program.

While detailed guidance on de-implementation is outside the scope of this toolkit, the Department of Education (DoE) has developed a De-implementation guide, which will soon be publicly available. This guide aims to assist Territory schools and DoE business units to reverse, reduce, replace or rethink programs that are not evidence-based. Although it has been written within an education context, the DoE De-implementation guide may help agencies consider what thoughtful de-implementation looks like in their own context.

[1] BetterEvaluation: Manager's guide to evaluation – 9. Disseminate reports and support use of evaluation: Support the use of evaluation findings

[2] Ombudsman NT: Investigation report – Women in Prison II - Alice Springs Women’s Correctional Facility: Volume 1

Last updated: 14 December 2020
