Last updated 9 October, 2020 by Judy Oakden
Working Effectively in Evaluation
Pragmatica has an interest in how the evaluation process itself can have a positive impact and build capacity.
Evaluation in Complex Situations
Drawing on examples from the Sustainable Farming Fund Evaluation, this presentation offered practical tips for building usability into an evaluation from the start. The presenters showed how they built use into the evaluation at every stage of the process, including:
- the scoping, relationship building and contracting stages
- the way external evaluators engaged with commissioners to design a multi-purpose evaluation approach
- the stance taken to project management and client engagement
- planning communication from the outset, which ensured findings were shared with stakeholders.
Judy Oakden (external evaluator) and Clare Bear (commissioner) presented this paper at the ANZEA Conference, in Wellington.
Using a Complementary Evaluation Approach
Complementary evaluation is a valuable approach that not only meets demands for efficiency and effectiveness but also aligns with proven good practice in evaluation. It can promote both relationship building and capacity building. This paper presented a theoretical perspective, an applied perspective and a “lived” perspective on the benefits of using the approach. The session was aimed at people who commission evaluation, people who conduct evaluation, and those who wanted to learn ways of working in collaborative environments.
Experience showed this way of working was:
- genuine: enabling a wide range of people to engage in a fair and transparent way, and to come to well understood judgments about the evaluand
- robust: with the best possible use of existing and new data, and with multiple perspectives and values included in the data gathering
- practical: easily implemented and flexible enough to adapt to the needs of different organisations, while keeping the integrity of the model.
Carol Much, Nicola Maw and Judy Oakden presented this paper at the ANZEA Conference, in Hamilton.
To develop and use rubrics on a large, complex evaluation of the Food Systems Innovation (FSI) Initiative, team members sought mentoring from Pragmatica. This practice note, by Samantha Stone-Jovicich from CSIRO, describes the steps she went through and the challenges she overcame in developing rubrics for the FSI Initiative.
Samantha reflected afterwards:
[This was] a very large, complex project where we were trying to design a monitoring, evaluation and learning (MEL) system to capture the dynamics of innovation and its impacts. Judy helped us adapt and tailor a rubrics approach. Her abundance of knowledge and experience, coupled with her collegiality, collaborative nature, and flexibility and creativity, were instrumental to supporting us incorporating a useful and fit-for-purpose rubrics approach into our MEL system.