IDS Impact Innovation and Learning Workshop March 2013: Day 1, Paper Session 1
Bruno Marchal

Page 1

Conceptual distinctions: Complexity and Systems
Making sense of evaluation of complex programmes

IDS workshop, Brighton, 26-27 March 2013

Bruno Marchal, Institute of Tropical Medicine, Antwerp

Page 2

[Figure: the multipolar framework, based on Sicotte et al. (1999), with four interacting functions: values & organisational culture, goal attainment, service provision, and interacting with the environment]

Page 3

Making sense of development programmes

A programme represents a set of resources provided to people, who decide to use them (or not) to reach the programme's goals (or other goals)

‘Programmes are complex interventions introduced in complex social systems’ (Pawson 2013)

A programme is all about people

Page 4

An overview of systems in 2 minutes

A system = a unit made up of, and organised through, relations between elements (agents), structures and actions (processes) (Morin 1977)

Page 5

Three perspectives on systems:
• System as a machine: the mechanical system perspective
• Systems as living organisms: open systems
• Complex systems

Page 6

Open systems
• Open: constant interaction with the environment and between its open components
• Bounded: external & internal boundaries
• Negative entropy: requiring inputs (resource dependency)
• Embeddedness: part of a larger system and of the environment of other systems (co-evolution)

Systems thinking: Senge, The Fifth Discipline (1990)

Page 7

Complex systems
• Multiple interconnected elements
• Non-linear interaction, non-proportional effects
• Negative & positive feedback loops (time delays)
• A change in one element can change (the context of) all others
• Sensitive to initial conditions (see the sketch below)

'Positive feedback enables a system to escalate many tiny changes into globally different behaviour patterns' (Stacey 1995)
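Sensitivity to initial conditions can be made concrete with a toy model. The sketch below is my illustration, not part of the presentation: it uses the logistic map, a standard example of non-linear dynamics, to show how two trajectories that start a millionth apart escalate into globally different behaviour.

```python
# Illustrative toy model (not from the presentation): the logistic map
# x -> r * x * (1 - x) behaves chaotically for r = 4.0.
def logistic_map(x0, r=4.0, steps=30):
    """Return the trajectory of the logistic map starting from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_map(0.200000)
b = logistic_map(0.200001)  # initial condition differs by one millionth
for t in (0, 10, 20, 30):
    print(f"t={t:2d}  a={a[t]:.6f}  b={b[t]:.6f}  gap={abs(a[t] - b[t]):.6f}")
```

The tiny initial gap grows by orders of magnitude over the iterations, which is exactly why point predictions fail in such systems.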

Page 8

Complex systems
• Influenced by their past: path dependence
• The evolution of complex systems is not completely unpredictable: the future is boundable (see the sketch below)
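Path dependence and a 'boundable' future can likewise be illustrated with a toy model. The following is a minimal sketch (my illustration, not the author's) of a Polya urn: early chance events lock different runs into different long-run shares, yet every share remains bounded between 0 and 1.

```python
import random

# Illustrative toy model of path dependence: in a Polya urn, each draw
# reinforces the colour that was drawn, so early accidents shape the
# long-run share, which nevertheless stays bounded in [0, 1].
def polya_urn(draws=2000, seed=None):
    rng = random.Random(seed)
    red, blue = 1, 1
    for _ in range(draws):
        if rng.random() < red / (red + blue):
            red += 1   # drew red: add another red ball
        else:
            blue += 1  # drew blue: add another blue ball
    return red / (red + blue)

# Five histories, five different (but bounded) long-run shares of red.
print([round(polya_urn(seed=s), 2) for s in range(5)])
```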

Page 9

Complex adaptive systems
• Emergence: emergent behaviour (see the sketch below)
• Adaptation = the capability of learning and evolving: not just 'passive' adaptation to the environment, but essentially the human capacity to learn, adapt and survive
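Emergence can be illustrated with a toy agent-based model. The sketch below is an illustration of the concept, not the author's example: a one-dimensional Schelling-style model in which agents with only a mild preference for similar neighbours typically produce strong clustering at the system level, a pattern no individual agent aims for.

```python
import random

# Toy one-dimensional Schelling-style model: 'A' and 'B' agents move to a
# random empty cell ('.') when fewer than a third of their four nearest
# neighbours are of their own type. Mild individual preferences typically
# produce strong clustering at the system level.
def step(grid, rng, threshold=0.34):
    n = len(grid)
    for i in rng.sample(range(n), n):   # visit cells in random order
        agent = grid[i]
        if agent == ".":
            continue
        neighbours = [grid[(i + d) % n] for d in (-2, -1, 1, 2)]
        if sum(x == agent for x in neighbours) / 4 < threshold:
            j = rng.choice([k for k, x in enumerate(grid) if x == "."])
            grid[j], grid[i] = agent, "."   # unhappy agent relocates

rng = random.Random(1)
grid = list("AB.AB.AB.AB.AB.AB.AB.AB.")
print("start:", "".join(grid))
for _ in range(50):
    step(grid, rng)
print("end:  ", "".join(grid))
```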

Page 10

Consequences
• Complex adaptive systems can only be understood as a whole: their elements, relations and history all matter
• Their behaviour cannot be (fully) predicted: non-linear relations; agency-structure interaction
• … but their future is 'boundable': sensitivity to initial conditions & path dependence

These are the challenges of complexity for evaluators, planners, researchers, …

Page 11

How to deal with complexity?

Some sense-making frameworks:
• Stacey's diagram (Stacey 1995)
• Stacey et al. (2000)
• Snowden & Stanbridge (2004)
• Kurtz & Snowden (2003)

Page 12

The Cynefin framework (Kurtz & Snowden 2003)

Simple contexts: the ordered domain of well-known causes and effects
• Cause-and-effect relations: stable, clear, linear
• Known knowns
• Predictive models and best practices can be identified
• Straightforward management; structured techniques: standard best practices, command-and-control style

Evaluation
• Assessing impact is possible and straightforward
• Standardised approaches can be developed, requiring technical skills

Page 13

Complicated contexts: the ordered domain of knowable causes and effects
• Cause-and-effect relationships are known, but not clear to everyone
• Causal chains spread out over time and space
• Knowing cause-and-effect relations is difficult
• Known unknowns
• Effective management relies on experts & satisfying good practices

Evaluation
• The problem can be deconstructed into sets of simple problems
• But this requires expert evaluators

Page 14

Complex contexts: the unordered domain
• Cause-and-effect relations exist (multiple, non-linear)
• Patterns emerge, but are impossible to predict in most cases
• Retrospective coherence: we can understand why events happen only after the facts
• Unknown unknowns
• Expert opinion is of limited use, because it is based on understanding of, and experience with, hard-to-know but in essence predictable patterns
• Pattern matching, trajectory tracking, safe-fail experiments, …

Page 15

Complex contexts: Evaluation

Understanding requires probing, sensing, responding:
• Flexible designs, adaptation, piloting & testing
• Adopting multiple perspectives
• Multi-/interdisciplinary teams
• Participatory approaches

Page 16

The way forward for evaluation and research

Accepting uncertainty
• Dealing with complex issues requires "a willingness to be uncertain at times and to know that being uncertain is crucial to the process" (Zimmerman et al. 2012)

Expertise is relative
• Reflexivity: the ability to decontextualise experience and to recontextualise knowledge and know-how

Reflexive practice
• Kolb's experiential learning
• Learning organisation theory

Page 17

Capturing emergence

Dealing with unknown unknowns
• Alterations of the planned intervention, parallel events, context elements, etc.
• Use of a wide range of observation and data collection methods (see longitudinal approaches such as processual analysis, Pettigrew 1990)

Dealing with the social interaction that leads to emergent behaviour
• Flexible and adaptive designs that allow learning

Page 18

Figuring out causal attribution

• Complex problems can only be understood a posteriori
• Ex post: plausible explanations based on demonstrating mechanisms
• What is the relative contribution of the intervention to the observed outcome?
• Contribution analysis (Mayne 2001)
• Qualitative comparative analysis (Ragin 1999); see the sketch below

'Hindsight does not lead to foresight, because external conditions and systems constantly change' (Snowden & Boone 2007)
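To make the QCA reference concrete: crisp-set QCA codes each case as 0/1 on a set of conditions and on the outcome, then groups cases into a truth table to see which configurations of conditions are consistently linked to the outcome. A minimal sketch follows; the condition names (training, supervision, incentives) and the cases are hypothetical, chosen only to show the mechanics.

```python
from collections import defaultdict

# Hypothetical cases, coded 0/1 on three conditions and the outcome.
cases = [
    {"training": 1, "supervision": 1, "incentives": 0, "outcome": 1},
    {"training": 1, "supervision": 0, "incentives": 1, "outcome": 0},
    {"training": 0, "supervision": 1, "incentives": 1, "outcome": 1},
    {"training": 1, "supervision": 1, "incentives": 1, "outcome": 1},
    {"training": 1, "supervision": 0, "incentives": 1, "outcome": 1},
]
conditions = ["training", "supervision", "incentives"]

# Build the truth table: one row per observed configuration of conditions.
table = defaultdict(list)
for case in cases:
    table[tuple(case[c] for c in conditions)].append(case["outcome"])

# Consistency = share of cases with that configuration showing the outcome.
for config, outcomes in sorted(table.items()):
    row = ", ".join(f"{c}={v}" for c, v in zip(conditions, config))
    print(f"{row}  n={len(outcomes)}  consistency={sum(outcomes)/len(outcomes):.2f}")
```

In a full csQCA the truth table is then logically minimised (e.g. with the Quine-McCluskey procedure) into a parsimonious solution; the sketch stops at the table itself.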

Page 19

Dealing with social complexity

Developmental evaluation (Patton 2011)
• Focus on complex situations and interventions
• Continuous adaptation of the evaluation to the evolving intervention; monitoring and documentation of changes over time (emergent evaluation design)

Theory-driven approaches
• Theories of change
• Theory-based evaluation
• Realist evaluation

Page 20

Realist evaluation (Pawson & Tilley 1997)

Generative causality
• Actors have a potential for effecting change by their very nature (agency)
• Structural and institutional features exist independently of the actors (and researchers)
• Both actors and programmes are rooted in a stratified social reality, which results from an interplay between individuals and institutions
• Causal mechanisms reside in social relations as much as in individuals, and are triggered by the intervention only in specific contexts

Page 21

Realist evaluation = theory-driven
• Middle range theory (MRT) = a bridge between knowledge and empirical findings, and between cases
• The MRT is 'specified' in a process of cumulative testing
• Plausible explanations: realist evaluation indicates in which specific conditions a particular programme works (or not) and how (psychological, social or cultural mechanisms); see the sketch below
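Pawson & Tilley's heuristic for such explanations is the context-mechanism-outcome (CMO) configuration. Below is a minimal data-structure sketch of that heuristic; the example programme theory it holds is hypothetical, not taken from the presentation.

```python
from dataclasses import dataclass

# The CMO heuristic of realist evaluation: in which context does the
# intervention trigger which mechanism, producing which outcome?
@dataclass
class CMOConfiguration:
    context: str    # the conditions in which the programme operates
    mechanism: str  # the psychological, social or cultural response triggered
    outcome: str    # the resulting pattern of outcomes

# A hypothetical middle range theory, to be refined by cumulative testing.
hypothesis = CMOConfiguration(
    context="health district with supportive supervision structures",
    mechanism="staff feel trusted and take more initiative",
    outcome="more responsive service provision",
)
print(hypothesis)
```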

Page 22

Conclusion

• The Cynefin framework can help to make sense of when problems, interventions or situations are likely to be complex
• Practical consequences: each zone calls for specific evaluation approaches and capacities
• Currently, much attention goes to modelling approaches
• There is a need for innovative approaches to better deal with social complexity
• Theory-driven approaches, such as realist evaluation, allow a peek into the black box

Page 23

References

Kurtz C & Snowden D (2003) The new dynamics of strategy: sense-making in a complex and complicated world. IBM Systems Journal 42: 462-483

Marchal B et al. (2013) Complexity in health. Consequences for research & evaluation, management and decision making. Working Paper, Institute of Tropical Medicine, Antwerp

Mayne J (2001) Addressing attribution through contribution analysis: using performance measures sensibly. Canadian Journal of Program Evaluation 16(1): 1-24

Pawson R & Tilley N (1997) Realistic Evaluation. London: Sage

Pawson R (2013) The Science of Evaluation: A Realist Manifesto. London: Sage

Pettigrew A (1990) Longitudinal field research on change: theory and practice. Organization Science 1(3)

Ragin C (1999) Using qualitative comparative analysis to study causal complexity. Health Services Research 34(5 Pt 2): 1225-1239


Page 24

References

Senge P (1990) The Fifth Discipline. New York: Currency Doubleday

Snowden D & Boone M (2007) A leader's framework for decision making. Harvard Business Review, November 2007

Snowden D & Stanbridge P (2004) The landscape of management: creating the context for understanding social complexity. Emergence: Complexity and Organization 6(1-2): 140-148

Stacey R (1995) The science of complexity: an alternative perspective for strategic change processes. Strategic Management Journal 16: 477-495

Stacey R et al. (2000) Complexity and Management: Fad or Radical Challenge to Systems Thinking? London: Routledge

Zimmerman B, Dubois N, Houle J, Lloyd S, Mercier C, et al. (2012) How does complexity impact evaluation? An introduction to the special issue. Canadian Journal of Program Evaluation 26: v-xx
