Drawing From Complexity Science 

Date and time: Monday 8 May & Monday 15 May 2023, 10.00am to 1.00pm AEST (registration from 9.45am). This is a full-day workshop delivered in two sessions; registrants are to attend both.

Venue: Via Zoom. Details will be emailed to registrants just prior to the workshop start time

Facilitator: Jonathan Morell

Register online by: Friday 5 May 2023. Spaces limited to 25 participants.

Fees (GST inclusive): Members $260, Organisational member staff $375, Non-members $425, Student member $125*, Student non-member $210*
 * Students must send proof of their full-time student status to aes@aes.asn.au

Workshop Overview

Many of the programs we deal with exhibit complex system behaviour. Because they do, we need program theory, methodology, and data interpretation strategies that respect those behaviours. Doing so does not require exotic tools or methods outside the sphere of common evaluation knowledge. Rather, it requires understanding how that familiar knowledge can be applied to complex behaviour. This workshop will focus on a few critical concepts in Complexity Science that are especially useful for discerning how programs behave and what they accomplish.

The workshop will rely heavily on active participation by its attendees. Please join with the expectation that your experience and knowledge will play a crucial role in learning by the entire group.

Workshop Content

The workshop will focus on three constructs from Complexity Science – emergence, sensitive dependence, and attractors. Participants will work in groups to apply these concepts to evaluation scenarios, with the intent of providing a working knowledge of how these constructs affect decisions about theory, methodology, and data interpretation. Other constructs from Complexity Science will be added as appropriate as the evaluation exercises evolve. Because drawing from complexity is emphatically not always desirable, the workshop will provide an appreciation of the trade-offs between drawing from complexity and relying on traditional forms of reasoning about program operation and outcome. While it is desirable to think in terms of complexity early in an evaluation's lifecycle, the workshop will show how a great deal can be gained from applying complexity during data collection, analysis, and interpretation. Finally, there will be a focus on how evaluators can work with stakeholders to understand the implications of complex behaviour.

Workshop Outcomes 

At the conclusion of the workshop, participants will be able to answer the following questions:

  1. What is the value of thinking in terms of the behaviour of complex systems rather than complex systems themselves?
  2. How can reasoning in terms of “emergence”, “sensitive dependence” and “attractors” affect beliefs about program design, methodology, and data interpretation?
  3. How can the familiar corpus of evaluation knowledge be applied with respect to complexity?
  4. What are the trade-offs between applying constructs from Complexity Science and relying on traditional evaluation approaches?
  5. How can constructs from Complexity Science be applied to good advantage at different stages in the evaluation lifecycle?
  6. How can evaluators have meaningful discussions with program stakeholders with respect to complexity?

PL competencies

This workshop aligns with the AES Evaluators’ Professional Learning Competency Framework. The identified domains are:

  • Domain 2 – Evaluation theory
  • Domain 3 – Culture, stakeholders, and context
  • Domain 4 – Research methods and systematic inquiry

Who should attend?

The workshop will rely heavily on participants’ experience to inform discussion and conduct exercises. Therefore, the workshop is limited to people who have hands-on experience doing evaluation. Each participant should be able to make statements to the group such as: “This is what happened when I did X”, or “I had planned to use methodology Y and stakeholder involvement plan Z, but it did not work out for reasons A, B, and C”, or “This implementation tactic worked because of circumstances D, E, and F”. In addition to being able to make such statements, participants should be willing to do so.

About the facilitator

Dr Jonathan (Jonny) Morell is known for his writing and research on connections between Evaluation and Complexity Science. Topics he has studied include: 1) complexity constructs that are applicable to evaluation, 2) agent-based modelling for evaluation, 3) detection of hidden assumptions, 4) unintended consequences, 5) use of project schedules as program models, and 6) how evaluation can benefit by drawing from the domains of Ecology and Evolutionary Biology. Jonny believes that evaluation should draw from Complexity Science because interventions produce complex outcomes, but that evaluations should be as simple and straightforward as possible. His evaluation work includes safety in transportation settings, the R&D process, technology choice and implementation, and the application of event-based networks to the simulation of social behaviour. Jonny has been recognised by the American Evaluation Association, which awarded him its Paul F. Lazarsfeld Evaluation Theory Award. He is Editor-in-Chief Emeritus of the journal Evaluation and Program Planning, and owner of 4.669… Evaluation and Planning. His PhD is in Social Psychology from Northwestern University. Details of his work can be found at his website www.jamorell.com, his blog Evaluation Uncertainty: Surprises in Programs and Their Evaluations, and his YouTube channel. His mantra is: Respect data. Trust judgement.
