6. Disseminate results and support use of evaluation


Evaluation findings may be used for different purposes and should be received by key stakeholders (as outlined in section 2.7: Stakeholder engagement) as well as others who may not have a direct ‘stake’ in the evaluation but would benefit from the lessons learned.[1]

A common approach to communicating evaluation findings is through a report, which may be circulated internally or published on the agency’s website. The full report, or a summary of it, needs to reach the intended audiences in a way that is relevant and meaningful to them.

Depending on the audience and budget, communicating results could also include:

  • presentations at forums and conferences
  • developing a short video
  • sharing stories, photos or drawings
  • creating posters or infographics.[2]

A plan for budget repair noted that publicly releasing evaluation outcomes could increase trust in government and demonstrate a commitment to improving services and outcomes. If an evaluation has adverse findings, it is important to remember that it is better to find out and take corrective action than to assume a program is working.

The communication strategy should build on the stakeholder engagement plan (see section 2.7: Stakeholder engagement) and articulate how any lessons learned and recommendations will be communicated to stakeholders. Ultimately, the decision to publicly release evaluation findings rests with the relevant agency Minister(s).

[1] BetterEvaluation: Manager's guide to evaluation – Disseminate reports and support use of evaluation

[2] BetterEvaluation: Methods and processes – Report and support use of findings

For each evaluation report, the relevant agency should prepare a written response to the recommendations. The response might agree, partially agree or disagree with a recommendation and should provide an explanation for any partial acceptance or rejection of a recommendation.

Where recommendations have been accepted, or partially accepted, key follow-up actions should be identified, with a time frame specified and the responsible unit named. It is important to identify an individual to coordinate the overall management response and an agreed deadline by which comments must be provided (usually within two months of receiving the final evaluation report). The management response must also be provided to the DTF. Further guidance on preparing management responses is available on the BetterEvaluation website.[1]

The Program evaluation framework outlines the responsibilities for ensuring evaluation findings are used, with roles for both central and line agencies.

The Cabinet handbook states that agencies are expected to incorporate lessons learned from previous evaluations into program and policy design. This includes relevant evaluations from other states, territories or countries, as appropriate.

CMC, OCPE and DTF have a role in promoting the use of evaluation in government decision making. The Program Evaluation Unit (PEU) within DTF will prepare an annual whole-of-government summary of evaluations for the Budget Review Subcommittee. This will include:

  • a list of the evaluations that have been completed in the previous year, including a status update on recommendation responses
  • a list of any evaluations that were scheduled but did not take place
  • an updated rolling schedule of evaluations for the next four years.

For the evaluations that have been completed in the previous year, the PEU will outline the recommendations from each and note whether or not they have been actioned. This will ensure that agencies respond to the recommendations from each evaluation and help close the loop between evaluation planning and evaluation use.

6.2.1. Program reality checks

There are sometimes gaps between the expectations of program proponents and the realities of program implementation. The Northern Territory Ombudsman has cautioned that:

“Government must be steadfast in its support of new approaches, recognising the realities discussed in the following table – realities that are often overlooked in the turmoil of spontaneous reaction to newsworthy events.”[2]

Box 3, sourced from the Ombudsman’s 2017 report Women in Prison II, outlines program reality checks – many of which are relevant to all programs.

Box 3: Program reality checks from the Ombudsman's 2017 report Women in Prison II[2]
  • No program solves every problem.
  • Anyone can point to a theoretical gap or snag.
  • ‘Better’ is a big step forward. Don’t expect a panacea.
  • No program gets it right from the start.
  • No battle plan survives contact with the enemy. Improving a program over time due to experience is a positive step, not a concession of failure.
  • We all make mistakes, prisoners and staff.
  • Individual failings may make a juicy story but they don’t mean a program is failing.
  • Expect the best programs to be challenging and expect people to falter from time to time.
  • No program works overnight.
  • Don’t expect results today.

These program reality checks are consistent with the learning mindset that is encouraged in the Program evaluation framework.

6.2.2. De-implementation

Keeping the program reality checks in mind, an evaluation may sometimes prompt an agency to consider partly or entirely de-implementing a program.

While detailed guidance on de-implementation is outside the scope of this toolkit, the Department of Education (DoE) has developed a De-implementation guide, which will soon be publicly available. This guide aims to assist Territory schools and DoE business units to reverse, reduce, replace or rethink programs that are not evidence-based. Although it was written within an education context, the DoE De-implementation guide may help agencies consider what thoughtful de-implementation looks like in their own context.

[1] BetterEvaluation: Manager's guide to evaluation – 9. Disseminate reports and support use of evaluation: Support the use of evaluation findings

[2] Ombudsman NT: Investigation report – Women in Prison II - Alice Springs Women’s Correctional Facility: Volume 1

DTF is responsible for maintaining a register of all completed Territory Government evaluations. Agencies are encouraged to share internal reports and recommendations so that lessons learned, whether positive or negative, can help improve future programs and strengthen policy design across agencies. Publicly available reports will be added to the DTF website.

Last updated: 14 December 2020
