2.4. Evaluation roles and responsibilities

There are many decisions to be made in an evaluation, including:

  • deciding the focus of the evaluation (including the key evaluation questions)
  • choosing the evaluator/evaluation team
  • approving the evaluation design
  • approving the evaluation report(s) and deciding who can access them.

BetterEvaluation’s Manager’s Guide to evaluation encourages consideration of who will be involved in making these decisions, what their role will be, and how the decisions will be made[1].


Contributors to involve in the decision-making process may include:

  • the program manager within the agency
  • an evaluation steering committee
  • a technical advisory group or a number of individual technical advisors (including service providers)
  • a community consultation committee or relevant people from the community.


The role of each individual or group in relation to specific decisions can be categorised as follows:

  • to consult: those whose opinions are sought (bilateral)
  • to recommend: those responsible for putting forward a suitable answer to the decision
  • to approve: those authorised to approve a recommendation
  • to inform: those who are informed after the decision has been made (unilateral).


One or more of the following processes may be employed in the decision-making process:

  • Decisions made based on support from the majority. Where decisions may be contentious, it is important to be clear about who is eligible to vote and whether proxy votes are allowed.
  • Decisions made based on reaching a consensus. In practical terms, that can mean giving all decision makers the right to veto.
  • Decisions made based on hierarchy (formal positions of authority).

The evaluation manager is often, but not always, the program manager. For large evaluations, the evaluation manager may be assisted by one or more staff members with specific responsibilities in the management process.

Table 7: Potential evaluation roles and responsibilities
Program manager
  • Educate the external evaluator(s) about the program's objectives, operations and intended beneficiaries; expectations for the evaluation; and any relevant organisational background.
  • Provide input and/or collate feedback on the evaluation plan.
  • Specify reporting requirements for progress on the implementation of the evaluation (including significant challenges and how they were resolved, and any issues that need to be escalated for decision making elsewhere).
  • Specify what is expected to be included in the formal evaluation report(s).
  • Keep the evaluator(s) apprised of any changes in the program's operations or evaluation context.
  • Provide regular updates on the evaluation process to all staff.
  • Monitor the implementation of the evaluation including completion of milestones/deliverables.
  • Facilitate program staff involvement in the evaluation, where relevant and agreed.
  • Serve as trouble-shooter, resolving problems or locating help to resolve them.
[Program name] Evaluation Steering Committee
  • Endorse the terms of reference and evaluation work plan.
  • Provide feedback on draft findings and recommendations and the draft evaluation report.
  • Chair of the Steering Committee to sign off on the final evaluation report.
  • Draft the evaluation terms of reference and evaluation plan for the evaluation.
  • Conduct, manage, or advise on evaluation activity as required.
  • Develop an evaluation plan, in conjunction with the program manager.
  • Provide monthly or quarterly progress reports on the implementation of the evaluation (written or in person).
  • Attend evaluation meetings.
  • Train data collectors on participant/case selection for sampling purposes, use of data collection instruments, and data quality assurance.
  • Ensure adherence to ethical standards (for example, confidentiality of data) during all phases of the evaluation.
  • Oversee implementation of data collection, such as: interviewing program staff and participants; conducting focus groups; observing service delivery activities; reviewing participant case records; developing data management procedures and tools (such as a database); coding and cleaning data; and analysing data.
  • Write interim (quarterly, biannual, yearly) evaluation reports and the final evaluation report.
  • Present findings.
  • Review the final evaluation plan.

[1] M. K. Gugerty, D. Karlan, The Goldilocks Challenge: Right Fit Evidence for the Social Sector, New York, Oxford University Press, 2018.

Last updated: 14 December 2020
