Complexity

Evaluating in complexity: What have we learned about designing and running evaluations

Judy Oakden presented this paper at the American Evaluation Association Conference in Atlanta, Georgia on 28 September 2016.

In their article Evaluating Performance in Complex Adaptive Systems, Eoyang and Berkas (1998) suggested a number of systems thinking and complexity approaches that might be of use to evaluators working in complex adaptive systems. Eighteen years on, we have made considerable progress in developing tools and techniques for evaluation in these situations.

This demonstration was designed for evaluation practitioners looking to add specific skills to their evaluation toolkit. It suggested some approaches that are useful when navigating the characteristic behaviours of complex adaptive systems, including:

  • the development of evaluative criteria, which act as an ‘anchor point’ to hold the evaluation together
  • use of generic performance ratings to accommodate shifts in the system
  • use of pattern-spotting approaches in a sensemaking process that supports more nuanced judgments, taking into account the emerging context and competing values of stakeholders
  • thinking about different ways findings might be communicated.
Download PDF

Collective Sensemaking – How to Plan Yours to Great Effect

Along with Irene Guijt, Judy Oakden presented this paper at the European Evaluation Society Conference in 2016.

Sensemaking is essential in evaluation design, facilitating deeper stakeholder engagement that leads to better insights and greater utility. This paper discussed how to design a collective sensemaking process as part of your M&E practice in ways that help navigate the values and needs of different stakeholders and make evaluation more useful. Five questions were tackled:

  • What is ‘collective sensemaking’ in M&E?
  • What forms can it take?
  • When is collective sensemaking successful and what conditions are necessary to achieve this?
  • What role can it play in navigating values, needs and understandings responsibly?
  • Why is collective sensemaking often not (yet) part of evaluative processes? How can we strengthen this part of M&E practice?

In this session we observed that while there has been considerable innovation in qualitative data collection methods and in analytical procedures for quantitative reasoning in M&E, innovation in the analytical processes for mixed-methods M&E appears to be lagging behind.

Download PDF

What does good and bad procurement look like from a contracting and relationship management perspective?

Judy Oakden presented this paper at the International Year of Evaluation 2015 Commissioning Evaluation Seminar, hosted by the Ministry of Business, Innovation and Employment in Wellington.

At this seminar Judy discussed the challenges of contracting and looked at whether current practices are “fit for function”. This provided insight for people in procurement roles into how contracting and relationship management need to be set up in a way that supports the procurement process.

Download PDF