Workshop: Rubrics for Complexity and Innovation Settings
Date and time: Tuesday 21st February 2023, 10.00am to 1.00pm AEDT (registration from 9.45am). Half day workshop - 1 session
Venue: Via Zoom. Details will be emailed to registrants just prior to the workshop start time
Facilitator: Dr Ellise Barkley
Register online by: Monday 20 February 2023. Spaces limited to 25 participants
Fees (GST inclusive): Members $165, AES Organisational member staff $275, Non-members $315, Student members $95*, Student non-members $140*
* Students must send proof of their full-time student status to email@example.com
When evaluating social impact, systems change, and place-based approaches, evaluators need a toolkit that caters for complexity. Rubrics offer a flexible and effective tool for evidence-informed design and decision-making, shared learning and evaluation, and mapping the progress of collaborative-led change across long timeframes. The purpose of this workshop is to build participant capability in developing and applying rubrics for complexity, innovation, and community settings, to support participatory approaches to measurement, learning and evaluation.
The knowledge and techniques presented will draw on the facilitator’s extensive practice-based experience of designing and using bespoke rubrics for evaluating systems change and place-based approaches. As well as showcasing a range of examples, participants will learn some key principles, steps, and tips for designing and applying rubrics in complex and collaborative settings. This includes the use of participatory, community-informed, and culturally-appropriate processes for rubric development and use, and how to utilise qualitative and quantitative data, evidence and measures. The workshop will be interactive and include opportunities for peer-to-peer sharing.
Some of the rubrics covered in the workshop include:
- progress mapping tools for long-term change initiatives and systems change (useful for tracking sometimes 'invisible' outcomes, such as shifts in power sharing and community narratives, and for writing a '360-degree' view of stakeholders and the roles they play into the rubric)
- bespoke rubrics for design, innovation and pilots that support rapid learning and developmental evaluation
- the First Nations’ Ripple Mapping Tool, a rubric developed by Skye Trudgett (Kowa) and Clear Horizon with 'First Nations First' leadership at the centre
- rubrics to support contribution analysis and strength of evidence rating.
By the end of the workshop, participants will understand a selection of rubrics relevant to complexity contexts, along with key principles and steps for participatory rubric development and use. Participants will also develop basic applied skills in participatory design for building a rubric. Throughout the workshop, you will have the opportunity to set out a proposed approach to evaluation capacity building (ECB) in your context, which you can immediately start using (or at least consulting on) in your workplace.
The two key domains of relevance to the Framework are:
- Domain 3: Culture, Stakeholders and Context
- Domain 4: Research Methods and Systematic Inquiry
For Domain 3, the workshop builds competency in using rubrics to address diverse definitions of 'what counts as evidence' (a characteristic of collaborations and place-based approaches) and to include a multiplicity of perspectives, such as incorporating community voice alongside quantitative targets. We are finding rubrics to be a culturally appropriate method for engagement, evaluation processes, and progress reporting, and one that can cater to a wide diversity of stakeholders and contexts. In the workshop I will be promoting bespoke, context-specific rubric design.
For Domain 4, participants will build skill and understanding in the method and principles for developing rubrics suited to complexity settings. The case examples I will provide are rigorous and systematic approaches for evaluating systems change and place-based approaches. Rubrics are a versatile and effective tool for helping collaborations make evaluative judgements; however, they need to be structured to fit the purpose and objective of the evaluative activity. The workshop will cover identifying appropriate evaluative criteria and measures, developing rubrics inclusive of qualitative and quantitative data, and participatory processes for using rubrics to increase rigour.
Who should attend?
The workshop is designed for evaluators and change-makers working on, or supporting, complex social impact initiatives. It is pitched at intermediate level and would suit participants with moderate evaluation experience and some understanding of rubrics.
Workshop start times
- VIC, NSW, ACT, TAS: 10.00am
- QLD: 9.00am
- SA: 9.30am
- NT: 8.30am
- WA: 7.00am
- New Zealand: 12.00pm
- For other time zones please go to https://www.timeanddate.com/worldclock/converter.html
About the facilitator
Ellise is Lead Principal Consultant with Clear Horizon and specialises in evaluating social impact, systems change, and place-based initiatives. Ellise has over 20 years of experience across sectors in program design and evaluation and has a keen interest in participatory, learning-driven approaches. Ellise works with a mix of communities, governments, and non-government partners on long-term and complex projects, including many collective impact initiatives. Ellise played a key role in developing the Place-based Evaluation Framework and is the online learning facilitator for Clear Horizon Academy’s ‘Evaluating Systems Change and Place-based Approaches’ public course.