2.6. Budget and resourcing
When designing a program, it is important to develop an estimate of the resources that are available for evaluation and what will be required to do the evaluation well.
The resources needed for an evaluation include:
- existing data
- funding to engage an external evaluator or evaluation team, to have specific tasks undertaken, and to cover materials and travel
- the time, expertise and willingness of staff, program partners, technical experts and the wider community to be involved, whether as part of the evaluation team, evaluation governance and/or as relevant informants and data sources.
When considering data availability, look carefully at the quality of existing data and what format it is in. Also clarify the skills and availability of any people who will need to be involved in the evaluation.
There are a few ways to estimate the budget for an external evaluation:
- Calculating a percentage of the program or project budget – sometimes 1–5%: This is a crude rule of thumb approach. Large government programs with simple evaluation requirements may be around 1%; smaller government programs with more complex evaluations – for example, detailed testing and documentation of an innovation – may be around 5%.
- Developing an estimate of days needed and then multiplying by the average daily rate of an external evaluator: This can be useful for simple evaluations, especially those using a small team and a standardised methodology such as a few days of document review, a brief field visit for interviews and then a short period for report write up.
- Using the average budget for evaluations of a similar type and scope: This can be a useful starting point for budget allocation, providing that the amounts have been shown to be adequate (see Table 12 below).
- Developing a draft design and then costing it, including collection and analysis of primary data: This can be done as a separate project before the actual evaluation is contracted but will usually require staff with prior evaluation experience.
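The first two approaches above are simple arithmetic, which can be sketched as follows. All figures here are illustrative placeholders, not official rates:

```python
def budget_from_percentage(program_budget: float, rate: float = 0.03) -> float:
    """Rule-of-thumb estimate as a share of the total program budget.
    Per the guidance above, 'rate' is typically 0.01-0.05 (1% to 5%),
    depending on program size and evaluation complexity."""
    return program_budget * rate


def budget_from_days(days: float, daily_rate: float) -> float:
    """Estimate from days of evaluator time at an average daily rate."""
    return days * daily_rate


# Hypothetical example: a $2,000,000 program using a 2% allocation,
# versus 20 evaluator-days at an assumed $1,500 daily rate.
print(budget_from_percentage(2_000_000, rate=0.02))  # 40000.0
print(budget_from_days(20, 1_500))  # 30000
```

Comparing the two estimates can be a useful sanity check: a large gap between them suggests the evaluation scope or the rule-of-thumb percentage needs revisiting.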
Estimate the costs of collecting and analysing the data, as well as the project management and reporting time needed. Allow time to secure resources (for example, including them in an annual or project budget, or seeking someone with particular expertise). If ongoing evaluation input is needed, consider a staged approach to funding.
Table 12: Estimated costs for evaluation services
| Evaluation services | Scale of the program | Estimated cost |
|---|---|---|
| **Design and planning for evaluation** | | |
| Capability training for internal evaluation teams (one-day workshop) | any scale | $5,000–10,000 |
| Facilitate internal development of a program logic and the outcomes to be targeted by the recovery program | any scale | $5,000–10,000 |
| Evaluation of needs/needs analysis for evaluation design | large scale | $20,000–30,000 |
| | small or mid scale | $10,000–20,000 |
| Developing outcome indicators and a plan for measuring and monitoring progress toward outcomes (including planning workshop) | any scale | $5,000–15,000 |
| Developing a plan for measuring and monitoring progress toward outcomes, including development of indicators | large scale | $15,000–20,000 |
| | small or mid scale | $10,000–15,000 |
| Preparing an evaluation plan for a full outcome evaluation | any scale | $25,000–35,000 |
| Supporting an internal team to develop an evaluation plan for a full outcome evaluation (providing advice, reviewing documents, providing material and resources, small workshops) | any scale | $5,000–15,000 |
| Providing ongoing evaluation support and advice to an internal evaluation team | any scale | $10,000–20,000 |
| **Conducting process and/or outcome evaluation** | | |
| Process evaluation | large scale | $65,000–90,000 |
| | small or mid scale | $50,000–70,000 |
| Support and advice for an internally led process review | any scale | $20,000–30,000 |
| Interim evaluation of program process and progress toward outcomes | large or mid scale | $70,000–100,000 |
| Conduct a full outcome evaluation | small scale | $50,000–80,000 |
| Conduct a full outcome evaluation with multiple components | large scale | Over $175,000 |
| Outcome evaluation of a component of a larger-scale program (for example, social wellbeing, business recovery) | large or mid scale | $50,000–125,000 |
2.6.1. Evaluation on a shoestring
If the resources required for the evaluation are more than the resources available, additional resources will need to be found and/or strategies used to reduce the resources required. A hybrid approach to evaluation (where an evaluation is delivered using internal resources with support from specialist providers) can help keep evaluation costs down and build internal capability. Careful targeting of the evaluation within the context of existing evidence can also help keep the costs of evaluation down.
It is not feasible or appropriate to try to evaluate every aspect of a program. As such, evaluations need scope boundaries and a focus on key issues. For example:
- a program evaluation might look at implementation in the past three years, rather than since commencement
- a program evaluation could look at performance in particular regions or sites rather than across the whole Territory
- an outcome evaluation may focus on outcomes at particular levels of the program logic or for particular components of the program
- a process evaluation may focus on the activities of particular stakeholders, such as frontline staff, or interagency coordination.
Table 13: Possible options for reducing evaluation costs
| Cost reduction options | Risks | How to manage the risks |
|---|---|---|
| Reduce the number of key evaluation questions | The evaluation may no longer meet the needs of the primary intended users | Carefully prioritise the key evaluation questions. Review whether the evaluation is still worth doing |
| Reduce sample sizes | Reduced accuracy of estimates | Check the results will still be sufficiently credible and useful through data rehearsal (mock-ups of tables and graphs showing the type of data the evaluation could produce) |
| Make more use of existing data | Insufficiently accurate or relevant data may be used; cost savings may be minimal if data are not readily accessible | Only appropriate when the relevance, quality and accessibility of the existing data are adequate – check this is the case before committing to their use |
Note: Cost estimates are based on personal communication from Dr George Argyrous (Manager, Education and Research, Institute for Public Policy and Governance, University of Technology Sydney), drawing on evaluation costs in New South Wales.
In the TulaSalud case study (part of the Goldilocks Toolkit), Innovations for Poverty Action noted that the efficacy of the program’s practices was already documented in medical research. It therefore recommended the evaluation focus on the training of community health workers and their ability to use the system, because this was more relevant and less operationally burdensome than assessing the platform’s effect on health outcomes.
Last updated: 14 December 2020