2.10 Reviewing the evaluation work plan

The evaluation plan should be developed well in advance of the start of the first evaluation (ideally, before program implementation) to allow time for review by relevant stakeholders, any necessary changes, ethical approval (where required) and pilot testing of data collection instruments (as needed).

The evaluation work plan needs to be submitted to DTF within six months of program approval. As programs may change over time, the evaluation work plan should be considered a ‘living document’. The program manager should review it periodically, or in response to significant program events, and provide DTF with updated versions in a timely manner.

Prior to and throughout the implementation of the evaluation, it is important to review the evaluation work plan to determine whether it:

  • is consistent with the available evaluation resources and agreed evaluation objectives
  • focuses on the most important information (‘need to know’ rather than ‘nice to know’)
  • does not place undue burden on project/program staff or participants
  • is ethical and culturally appropriate.

Reviewers could include DTF, project/program staff, internal or external evaluation experts, project/program participants and relevant community members.

2.10.1 Technical review of the evaluation design

Before finalising the design, it can be helpful to have a technical review by one or more independent evaluators. More than one reviewer may be needed to provide expert advice on the specific methods proposed, including the particular indicators and measures to be used. Ensure that reviewers are experienced in using a range of methods and designs, and well briefed on the program context, so they can provide situation-specific advice.

2.10.2 Review of the design by the evaluation management structure

In addition to being considered technically sound by experts, the evaluation design should be seen as credible by those who are expected to use it. Formal organisational review and endorsement of the design by an evaluation steering committee can assist in building credibility with users.

Where possible, undertake a data rehearsal of possible findings with the primary intended users. This is a powerful strategy for checking the appropriateness of the design: present mock-ups of the tables, graphs and quotes that the design might produce. It is best to produce at least two versions – one that would show the program working well and one that would show it not working.

Ideally, the primary intended users of the evaluation will review both designs and either confirm suitability or request amendments to make the potential findings more relevant and credible.[1]
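
As a simple illustration only, mock findings tables for the two scenarios could be drafted in a few lines of code. The sketch below uses hypothetical indicators and invented placeholder figures (not drawn from any program) to show the idea:

```python
# A minimal sketch of a 'data rehearsal': mock findings tables for two
# scenarios (program working well vs. not working). All indicators and
# figures are hypothetical placeholders, not real program data.
import pandas as pd

indicators = ["Participants completing the program (%)",
              "Participants reporting improved outcomes (%)",
              "Average satisfaction score (1-5)"]

# Scenario A: findings that would suggest the program is working well.
working_well = pd.DataFrame({"Indicator": indicators,
                             "Baseline": [55, 40, 2.8],
                             "Year 1": [78, 65, 4.1]})

# Scenario B: findings that would suggest the program is not working.
not_working = pd.DataFrame({"Indicator": indicators,
                            "Baseline": [55, 40, 2.8],
                            "Year 1": [54, 38, 2.9]})

for title, table in [("Scenario A: program working well", working_well),
                     ("Scenario B: program not working", not_working)]:
    print(title)
    print(table.to_string(index=False))
    print()
```

Presenting both versions lets intended users react to concrete outputs before any data are collected, rather than to an abstract description of the design.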

2.10.3 Review the program logic

When reviewing the program logic, the following questions should be addressed:

  • What evidence was the basis for its development? What additional evidence should be used in the review?
  • Whose perspective formed its basis? To what extent and in what ways were the perspectives of intended beneficiaries and partner organisations included?
  • Were there different views about what the intended outcomes and impacts were and/or how these might be brought about?
  • Has there been more recent research and evaluation on similar projects and programs which could inform the program logic?

[1] BetterEvaluation: Manager's guide to evaluation – 5. Manage development of the evaluation methodology
