Current Work

This page showcases some of our recent work and thinking.

Exploring what COVID-19 means for changes to our evaluation practice

While COVID-19 has significantly disrupted our lives, a closer inspection shows we are not all impacted in the same ways. For example, some groups have experienced huge financial and social impacts, while others have not.

Recently, practitioners in the fields of economic development and evaluation made sense of the fast-shifting contexts they found themselves in during two events: the Economic Development New Zealand (EDNZ) online conference and the Australian Evaluation Society’s online FestEVAL. This post, based on learnings from these events, explores what the COVID-19 disruptions might mean for evaluation practice.

This is the first in a series of posts where a group of evaluators, Nan Wehipeihana, Judy Oakden, Kellie Spee and Kahiwa Sebire, will explore our current evaluation practice for signals to the future.


Contracting public health and social service: Insights from complexity theory for Aotearoa New Zealand

In a recently published article in Kōtuitui: New Zealand Journal of Social Sciences Online, Judy Oakden, along with co-authors Matt Walton and Jeff Foote, observes that public health and social services are often hard to specify, complex to deliver and challenging to measure. Using a complexity theory-informed lens, this research explores the challenges and opportunities of contracting out for public health and social services in Aotearoa New Zealand.

Our findings show that public sector managers are experimenting with different ways of contracting out, yet the underlying New Public Management ethos, which is being applied in many administrative arms of government, can hamper initiatives. There is a growing impetus to find alternative approaches to contract out more effectively. An alternative, complexity theory-informed, framing highlights where changes to contracting out organisation and practices may support more effective service provision. This research also provides insights into why achieving change is hard.


Envirolink Evaluation

The Ministry of Business, Innovation and Employment commissioned us to review the Envirolink scheme and inform them of:
* how well the scheme operates, and
* whether it achieves its intended outcomes and provides value for money.
This was the first review of the scheme since it commenced in 2005.

The results of the review were positive, and we rated the scheme very good overall. The review also provided evidence that the scheme makes a worthwhile and valuable contribution by supporting eligible councils to engage with and use environmental science, and by supporting uptake by users. The report’s recommendations will inform future funding decisions relating to Envirolink.


Chapter 13 “Evaluation” contributed to the textbook “Social science research in New Zealand: An introduction”, edited by Martin Tolich and Carl Davidson / Auckland University Press

The new textbook Social science research in New Zealand: An introduction has been released. This textbook is positioned as “the definitive introduction for students and practitioners undertaking social research in New Zealand”. Judy Oakden and Julian King contributed a chapter on evaluation, which describes the evaluation approach used for the Evaluation of the Sustainable Farming Fund.

The editors observed at the start of this chapter:
“This is an important chapter because evaluation is a discipline that is often misunderstood both among other researchers and the users of research outputs. Evaluation differs from other research approaches because it focusses on understanding the value of something. … Much of what researchers think of as ‘leading-edge’ practices have come from evaluation, and the authors capture much of that excitement and innovation here”.


Adaptive Evaluation: A synergy between complexity theory and evaluation practice

Some traditional evaluation designs assume high levels of predictability and control. The problem is that complex programs or contexts challenge these basic assumptions. Programs often involve emergent outcomes and objectives, adaptive program processes, nonlinear theories of change and evolving stakeholder expectations. In these instances, we need a more adaptive approach to evaluation: one that fits the environment without compromising rigour.

In this paper, we provide an overview of the challenge and previous efforts to address it, an introduction to the basic theory and practice of human systems dynamics (HSD), and the theoretical foundations for a new approach to evaluation in complex environments: Adaptive Evaluation. We demonstrate applications of this new evaluation practice in a case study. Finally, we articulate lessons learned and emerging questions.

Read the article Glenda Eoyang and Judy Oakden wrote about using generic rubrics in complex evaluation settings.