Saturday pre-conference workshops

Saturday 5 September 2015

>>> DOWNLOAD a printable pre-conference workshop program


View Sunday program here.

8am–9am REGISTRATION
9am–12:30pm WORKSHOP PROGRAM

Everyday program logic for organisations: doing it and using it (full day)
Zazie Tolmer; Lee-Anne Molony
BEGINNER/INTERMEDIATE
> Details
> Register

Survey design and item writing for evaluators (full day)
Kelly Hannum
BEGINNER/INTERMEDIATE
> Details
> Register

Developmental evaluation: applying systems thinking and complexity concepts to enhance innovation and use (full day)
Nan Wehipeihana; Kate McKegg
INTERMEDIATE
> Details
> Register

Demystifying data triangulation in an evaluation context (half day)
Timoci O'Connor; Ruth Aston
BEGINNER
> Details
> Register

The Most Significant Change (MSC) in use: an overview of the technique and practical examples of how it can be adapted to context (half day)
Tracey Delaney
BEGINNER
> Details
> Register

Developing monitoring and evaluation frameworks (full day)
Ian Patrick
INTERMEDIATE
> Details
> Register

Systems Thinking for evaluation practitioners (full day)
Chris Skelly; Samantha Abbato
BEGINNER
> Details
> Register

12:30pm–1:30pm LUNCH
1:30pm–5pm WORKSHOP PROGRAM
Zazie Tolmer; Lee-Anne Molony (continued)
Kelly Hannum (continued)
Nan Wehipeihana; Kate McKegg (continued)

The evaluation findings conference: a key step in drafting effective and useful evaluation reports and other products (half day)
David Turner
INTERMEDIATE
> Details
> Register

Bringing evaluation to life with improvisational facilitation (half day)
Vanessa Hood; Viv McWaters
BEGINNER TO ADVANCED
> Details
> Register

Ian Patrick (continued)
Chris Skelly; Samantha Abbato (continued)

SATURDAY 5 SEPTEMBER


Everyday program logic for organisations: doing it and using it

presented by Zazie Tolmer and Lee-Anne Molony; Clear Horizon Consulting Pty Ltd, Australia-wide

The purpose of the workshop is to demystify program logic and to equip participants with fundamental principles and practical techniques to effectively use program logic for monitoring and evaluation.

The workshop will clarify the basics of program logic before demonstrating how program logic supports the development of tailored monitoring and evaluation plans. Specifically, the workshop will focus on how program logic contributes to:

  • understanding how a program intends to effect change
  • scoping a monitoring and evaluation plan
  • meaningful and implementable monitoring and evaluation activities
  • program learning and improvement.  


The workshop will also explore the soft benefits of program logic for program performance, including developing a shared understanding of the program within the program team and supporting concise, clear communication about the program.

There are many approaches to developing program logic. For this workshop, a people-centred approach will be used. Social change happens through people changing their behaviour, practices and policies, and the people-centred approach places a strong focus on expected behavioural outcomes, i.e. who needs to be doing what differently. This approach is applicable to all sectors, but is particularly suited to programs and projects that involve elements of behavioural change.

Using a simple case study, participants will improve their understanding of program logic, experience building a real program logic model and explore how to use the model to inform monitoring and evaluation planning.

This workshop is aimed at the beginner to intermediate level, and is applicable to all sectors.

The workshop directly addresses two domains of the Evaluators' Professional Learning Competency Framework:

  • Research Methods and Systematic Inquiry – program logic is described as a tool competent evaluators use to undertake assessments of impact; and
  • Evaluation Activities – program logic is a well-known tool for describing programs in order that they can be evaluated.

> back to overview  > register


Survey design and item writing for evaluators

presented by Kelly Hannum; Aligned Impact LLC; Greensboro, United States of America

In this workshop, you will learn the fundamental elements of good survey design and item writing. If you develop or use surveys as part of your evaluation practice, but have had little or no formal training in survey design or item writing, then this workshop is for you. Poorly designed surveys yield inaccurate data, as well as data that can be difficult to analyse and interpret. By applying the principles and strategies of effective survey design, you will be more efficient in designing surveys, and the data you gather will be more accurate and easier to interpret. If you are currently designing a survey or want to improve a survey, bring a copy of it with you to work on. Sample surveys will be provided for those who do not bring their own, to enable hands-on learning.

You will learn about different types of surveys, how to determine what type of survey best fits different evaluation purposes and contexts, the essential elements of a survey, how to craft effective items (and how to identify poorly constructed items), how to create effective response options for items, strategies for assessing survey quality, and options for administering surveys (including a brief overview of survey administration software and how to decide what is right for your situation). 

Throughout the workshop we will discuss the appropriateness of different options in different contexts to underscore that while there are guidelines for practice, context must be considered in order to design an effective survey and survey process.

After this workshop you will be able to:

  • identify the type of survey needed based on the evaluation purpose and context
  • develop essential survey elements (instructions, response options, items)
  • organize survey elements into a well-designed survey
  • select and apply appropriate options to assess survey quality
  • identify and manage an effective survey process.

> back to overview  > register


Developmental evaluation: applying systems thinking and complexity concepts to enhance innovation and use

presented by Nan Wehipeihana, Research Evaluation Consultancy – a member of the Kinnect Group, Wellington, New Zealand and Kate McKegg; The Knowledge Institute – a member of the Kinnect Group; Hamilton, New Zealand

Developmental evaluation (DE) provides evaluative information and feedback to social innovators, and their funders and supporters, to inform adaptive development of change initiatives in complex dynamic environments. DE brings to innovation and adaptation evaluative tools, thinking and processes to inform adaptive management and development. A complex system is characterized by a large number of interacting and interdependent elements in which there is no central control. Patterns of change emerge from rapid, real time interactions that generate learning, evolution, and development – if one is paying attention and knows how to observe and capture the important and emergent patterns. This is the niche of DE.

The purpose of the workshop is to provide participants with a theoretical grounding in the application of complexity concepts to evaluation design and practice with practical guidance on conducting developmental evaluations.

Participants will learn about the unique niche of DE, its theoretical foundations, principles and practices that support the implementation of DE, challenges in DE and how to overcome these challenges.

Participants will learn:

  • key factors for assessing appropriateness and readiness of contexts and clients for DE
  • niche, strengths and limitations of DE
  • common barriers to engaging in developmental evaluations and how to overcome those barriers, including:
    – five critical roles and responsibilities in DE
    – eight essential principles of DE
  • implications of designing an evaluation through a complex dynamic systems lens
  • options for developmental evaluation design and methods based on situational responsiveness, adaptability and creativity.

Participants will have the opportunity to learn from world leading developmental evaluators, drawing on the latest research on DE and 12 global case exemplars. Participants will engage in facilitated, pragmatic, hands-on application of developmental evaluation’s concepts. The combination of theoretical and experiential learning will ensure the knowledge gained by participants about developmental evaluation can be more easily applied in real world contexts.

This workshop is intended for intermediate-level evaluation practitioners with sufficient experience to know the challenges of situational adaptability and of using diverse methods.

Evaluators' Professional Learning Competency Framework
It is clear from the purpose and objective statements that the workshop addresses the Theoretical Foundations and Research Methods and Systematic Inquiry domains. DE is a highly relational form of evaluation. For this reason the workshop also addresses the Evaluative Attitude and Professional Practice domain, as well as the Interpersonal Skills domain.

> back to overview  > register


Demystifying data triangulation in an evaluation context (HALF DAY – am)

presented by Timoci O’Connor and Ruth Aston, Centre for Program Evaluation, University of Melbourne, Melbourne, Australia

In the world of social programs and policy there is a wealth of existing data of varying quality. It is important that we as evaluators make use of this data, as well as any new data we collect. To come to an accurate, valid and reliable evaluative judgement using existing and newly collected data, a process of triangulation is required. This workshop addresses Domains 4 and 7 of the Evaluators' Professional Learning Competency Framework.

Underpinned by mixed methods methodology, the workshop will introduce participants to the concept of triangulation, discuss the background theory, and present a number of triangulation methods and techniques. Participants will then apply this knowledge and develop their skills by completing a data appraisal and triangulation exercise.

This workshop is targeted towards novice evaluators and is also open to students. Participants do not need an in-depth understanding of statistics to participate.

> back to overview  > register


The Most Significant Change (MSC) in use: an overview of the technique and practical examples of how it can be adapted to context (HALF DAY – am)

presented by Tracey Delaney; Independent Consultant; Melbourne, Australia

Despite Most Significant Change (MSC) now being a widely recognised tool for M&E, a number of myths and misconceptions exist regarding its purpose and use. This workshop aims to provide participants with a clear understanding of the role and value of MSC as well as a deeper appreciation of how it can be applied in different contexts. 

MSC involves the collection of stories of change. Stories have the potential to engage stakeholders in an initiative, as well as offer insight into what is really being achieved and what is valued. MSC takes an inductive approach, with participants making sense of events after they have happened. Each story therefore represents the storyteller’s personal interpretation of impact, which is then reviewed and discussed by a nominated group or panel. The process offers an opportunity for a diverse range of stakeholders to enter into a dialogue about program intention, impact and ultimately future direction. MSC therefore goes beyond merely capturing and documenting participants’ stories of outcomes, to offering a means of engaging in effective dialogue.

The specific objectives of the workshop are to:

  • increase understanding of MSC
  • build awareness of what is, and is not, MSC
  • expand knowledge of how MSC can be adapted to different settings.

Participants will be provided with comprehensive training notes and examples of real applications of the technique throughout the workshop. The training will be participatory and draw on the in-depth knowledge and experience of the trainer. The workshop is aimed at those with limited or no knowledge of MSC, as well as those with a basic understanding or experience of using MSC who would like to build on their existing knowledge.

This workshop addresses the following domains of the Evaluators' Professional Learning Competency Framework: Domains 1, 3 and 4.

> back to overview  > register


The evaluation findings conference: a key step in drafting effective and useful evaluation reports and other products (HALF DAY – pm)

presented by David Turner; David Turner Research Ltd; Wellington, New Zealand

This workshop will provide participants with an opportunity to practise a technique for preparing clear, useful, and effective reports and other evaluation products. The technique involves engagement between evaluators and other key stakeholders, who may include evaluation clients, programme staff, and/or programme participants. The findings conference is held after research and analysis are largely done, but early enough that the evaluation's tentative conclusions may be tested further if necessary. A findings conference is usually held face-to-face but can be conducted online, using a variety of software tools. Such a meeting provides a means for faster and more effective report writing.

This workshop provides training on how and why researchers can work with clients and other stakeholders to reach consensus on the structure and content of evaluation products, before writing a draft report. Participants will consider how to work with diverse stakeholders to identify the key messages to be included in an evaluation product, facilitate group discussions, identify any remaining issues to be researched at an early stage, and draft the structure of the intended report or other products. 

The workshop will begin with a presentation on the purpose and content of the findings conference, followed by one or more practice sessions, preferably using examples provided by participants. It will be targeted at participants with some experience in evaluation, but will not be presented at an advanced level.

The workshop addresses the evaluator competency of evaluation activities, particularly with regard to synthesising evaluative conclusions and reporting in ways that are culturally sensitive and useful.

> back to overview  > register


Bringing evaluation to life with improvisational facilitation (HALF DAY – pm)

presented by Vanessa Hood; independent facilitator; Melbourne, Australia and Viv McWaters; Beyond the Edge; Torquay, Australia

Increasingly, there is demand for evaluators to use participatory approaches. Projects that evaluators work on are also becoming more complex.  New participatory approaches are needed for evaluators to succeed in this changing context, and to have the ability to adapt in the moment.

Traditional training in evaluation and facilitation rarely provides the skills to improvise. Improvisation is a discipline, like any other, that can be learned, practised and applied. These skills allow evaluators to enhance the way they engage individuals and groups – in evaluation training, data collection or analysis.

Applied improvisation is a refined system for observing, connecting and responding (Bernard and Short, 2012). It is based on techniques used in theatre, where people work in small groups to spontaneously co-create scenes on stage. Improvisation is also used off-stage in communities, schools and businesses – anywhere there is a need for people to connect, communicate and understand each other. It allows learning to happen at a more rapid rate, and it enables people to be more observant and more aware of themselves and others.

This workshop is more than a traditional facilitation course. It also provides participants with practical skills to improvise. The basic principles of improvisation explored during the workshop include accepting offers, noticing, letting go, being affected, understanding status and making your partner look good (McWaters, 2012). 

This workshop is designed for evaluators who use participatory approaches (beginners to advanced). It is an interactive workshop based on adult learning principles. The workshop structure allows people to experientially learn new skills and relate their insights to their work. Participants are supported to apply their new knowledge during the workshop. They leave with new approaches and ways of thinking that can be applied immediately.

Resources and links to further information are provided (electronic and hard copy). This workshop directly relates to Domain 6 of the AES Evaluators’ Professional Learning Competency Framework: Interpersonal Skills.

> back to overview  > register


Developing monitoring and evaluation frameworks

presented by Dr Ian Patrick; Ian Patrick & Associates, Melbourne, Australia

The development and implementation of Monitoring and Evaluation Frameworks is a core skill in evaluation practice and requires knowledge and competencies in determining the role and purpose of monitoring and evaluation, predicting results, identifying areas of investigation and detailed planning for the collection and analysis of data. 

This workshop provides participants with useful, step-by-step practical guidance for developing a Monitoring and Evaluation Framework, supported by relevant background and theory. The focus will be on developing a Monitoring and Evaluation Framework for a program or related initiative, but the contents will have applicability to other contexts such as organisational and policy level evaluations.

The workshop presents a clear and staged conceptual model, discusses design and implementation issues, and considers possible barriers or impediments, with strategies for addressing these. The relationship between the Monitoring and Evaluation Framework and functions such as stakeholder engagement, accountability, reporting and learning will also be considered.

Participants will learn about the approach and format for developing a Monitoring and Evaluation Framework and the range of techniques and skills involved in its design and implementation, and will develop an appreciation of the parameters of the tasks involved and how to approach them. A case study approach will be utilised to provide a practical focus for application of knowledge and skills, combined with presentations and opportunities for discussion and sharing experience. The workshop is designed for intermediate level, and assumes some background with evaluation concepts and practical experience.

> back to overview  > register


Systems Thinking for evaluation practitioners

presented by Chris Skelly and Samantha Abbato, Samantha Abbato and Associates, Brisbane, Australia

The purpose of the workshop is to introduce evaluation practitioners to the theory, principles and practice of Systems Thinking (ST). It will show participants how to use ST to map, manage and communicate project, program, organisation and external environment information for evaluation.

Basic ST theory will be introduced, including the use of systems properties (e.g. boundaries) and methods that can be used in evaluating complexity (e.g. social network analysis, causal loop diagrams and outcome mapping). Participants will be shown how to use online tools to map relationships and will be given multiple opportunities in the sessions to learn while doing. Kumu and InsightMaker are free online tools that will be used in the workshop (www.kumu.io; insightmaker.com). Participants can use these on any computer with an Internet connection and an up-to-date web browser. These powerful tools give users the ability to better understand complexity, and to visualise and communicate it to clients.

Very brief lectures (each 10–15 min) will be used to present each topic and associated theory, followed by practical exercises (each 30–60 min). We will create system models that can be used to generate insights during client workshops and produce powerful ‘pictures’ for client meetings and evaluation reports. The workshop leader will provide a simulated evaluation case-study prior to each session. Each exercise will be followed by a discussion on the application of the techniques learned to evaluations that participants may be working on now or in the future.

Session structure:

  • Introduction to Systems Thinking: What is this all about and how can it benefit evaluation and the evaluation practitioner?
  • Program mapping with Kumu: maps of a program's ‘logic’ are usually represented as a simple linear progression, but what happens when you map all the interconnections?
  • Causal loop diagrams with InsightMaker: if we create a systems map of a program’s logic, can we evaluate whether the causal assumptions hold up?
  • Simulation for learning organisations with InsightMaker: What can we learn from the dynamic simulation of the program being evaluated?

At the end of this workshop, all participants should be able to take data from spreadsheets and documents and bring them alive with Kumu and InsightMaker to tell visual stories. Additionally, we’ll focus on guiding participants in how to create a template for future projects.

This workshop is aimed at evaluators with little or no ST knowledge. No previous experience is assumed. Participants must bring their own computer and have the ability to connect to the Internet. The workshop leader will assume only that participants are computer literate to the extent that they know how to use their own computer and have an up-to-date web browser.

> back to overview  > register