Working Effectively in Evaluation

Pragmatica is interested in how the evaluation process itself can have a positive impact and build capacity.

Evaluation in Complex Situations

It ain’t what you do, it’s the way that you do it: Evaluation processes that deliver maximum impact and influence  

Drawing on examples from the Sustainable Farming Fund Evaluation, this presentation offered practical tips for building usability into an evaluation from the start. The presenters showed how they built use into the evaluation at all stages of the process. This included:

  • the scoping, relationship building and contracting stages
  • the way external evaluators engaged with commissioners to design a multi-purpose evaluation approach
  • the stance taken to project management and client engagement
  • planning communication from the outset, which ensured findings were shared with stakeholders.

Judy Oakden (external evaluator) and Clare Bear (commissioner) presented this paper at the ANZEA Conference in Wellington.

Download PDF

Using a Complementary Evaluation Approach

Complementary evaluation – bringing theory and practice together through relationship building

Complementary evaluation is a valuable approach that not only meets demands for efficiency and effectiveness but also aligns with proven good practice in evaluation. It can promote both relationship building and capacity building. This paper presented a theoretical perspective, an applied perspective and a “lived” perspective on the benefits of using the approach. The session was aimed at people who commission evaluations, people who conduct them, and those who want to learn ways of working in collaborative environments.

Experience showed this way of working was:

  • genuine: enabling a wide range of people to engage in a fair and transparent way, and to come to well-understood judgments about the evaluand
  • robust: with the best possible use of existing and new data, and with multiple perspectives and values included in the data gathering
  • practical: easily implemented and flexible enough to adapt to the needs of different organisations, while keeping the integrity of the model.

Carol Much, Nicola Maw and Judy Oakden presented this paper at the ANZEA Conference in Hamilton.

Download PDF

Mentoring support

Providing mentoring support to develop rubrics

To develop and use rubrics in a large, complex evaluation of the Food Systems Innovation (FSI) Initiative, team members sought mentoring from Pragmatica. This practice note, by Samantha Stone-Jovicich from CSIRO, describes the steps she went through and the challenges she overcame to develop rubrics for the FSI Initiative.

Samantha reflected afterwards:

[This was] a very large, complex project where we were trying to design a monitoring, evaluation and learning (MEL) system to capture the dynamics of innovation and its impacts. Judy helped us adapt and tailor a rubrics approach. Her abundance of knowledge and experience, coupled with her collegiality, collaborative nature, and flexibility and creativity, were instrumental to supporting us incorporating a useful and fit-for-purpose rubrics approach into our MEL system.

Download PDF