|9am–12:30pm WORKSHOP PROGRAM|
- Troubleshooting program logic and theory of change (half day) – Jess Dart; Zazie Tolmer
- Engaging stakeholders in the evaluation journey: strategic evaluation methods and tools to make a difference (full day) – Zita Unger; Anthea Rutter
- Fundamentals of good evaluation reporting (full day) – Anne Markiewicz
- Advanced skills in realist evaluation (full day) – Gill Westhorp
- Data visualisation in evaluation – seeing is believing (full day) – Ellen Vasiliauskas
- Managing external evaluations: getting value for your organisation (full day) – Marlène Läubli Loud

|1:30–5pm WORKSHOP PROGRAM|
- Dealing with the politics of evaluation (half day) – John Owen
- Survey research for evaluation: advanced issues in design and implementation (half day) – Benoît Gauthier
- All full-day workshops continue in the afternoon.
Troubleshooting program logic and theory of change (HALF DAY – am)
presented by Jess Dart and Zazie Tolmer; Clear Horizon Consulting Pty Ltd, Melbourne, Australia
Program logic/theory of change is rapidly spreading around the world in a wide variety of forms and for a wide range of uses. Program logic is used at many scales: at the project, program, initiative and strategy levels. Its use spans every sector, and program modalities range from legislative programs to empowerment programs, grants processes and large international facilities, to name a few.
Logic clarification is a fundamental step in scoping and framing an evaluation. It is also a valuable evaluative tool in its own right for interrogating and evaluating the underpinning strategy of a program. Program logic and theory of change is a well-established part of best practice in evaluation.
The aim of this training is to explore some of the more complex uses and limitations of program logic/theory of change and to open the floor to participants’ dilemmas and challenges. The day is organised into four sessions:
- a recap of the basics and development of a checklist for what makes good logic and theory of change
- workshopping key issues and challenges participants have encountered
- an introduction to five different approaches to developing a program logic/theory of change and when to use them
- exploring how to conduct program logic work in challenging contexts, including loose and emergent programs, collective impact initiatives, strategy logic and two-speed logic.
This workshop is pitched at the intermediate and advanced level of practice and is designed to be flexible; participants’ interests/challenges will guide part of the agenda. The workshop aims to draw on the collective experience and wisdom of the group and participants will be invited to bring their own dilemmas and challenges. We will draw on the whole group to collaboratively develop innovative solutions.
Dealing with the politics of evaluation (HALF DAY – pm)
presented by John Owen; Centre for Program Evaluation, The University of Melbourne, Australia
The purpose of the workshop is to equip participants to deal with the exercise of power that negatively influences the conduct of evaluations and the application of evaluation findings. It provides participants with
- an introduction to standards of practice that relate to the political viability of evaluation
- the opportunity to take into account these standards when planning an evaluation
- strategies for dealing with political issues when they arise during the conduct of an evaluation.
In this workshop we take (little p) politics to mean the exercise of power to influence the direction and use of an evaluation for the benefit of one or more groups of program stakeholders. Topics which fall under the rubric of the politics of evaluation include: conflict between stakeholder groups about issues to be addressed, attempts to influence the agenda of an evaluation as it proceeds, pressure from program stakeholders on evaluators when framing findings and recommendations, and the use and misuse of findings in decision-making for the benefit of sectional interests. While political power in this sense might have a positive effect on an evaluation, this workshop deals mainly with the exercise of power in ways that negatively influence the work of evaluators, as judged by the standards of evaluation that underlie our practice.
The workshop is based on adult learning principles. Participants will be provided with scaffolding for working through a case that illuminates key concepts related to the topic.
It is expected that participants will come to the workshop with some knowledge about evaluation, or at least have some background in social research.
The workshop will be mostly concerned with Domains 3 and 7: Attention to Culture, Stakeholders and Context, and Evaluation Activities, with particular reference to how evaluators can deal with political aspects as they arise in the course of an evaluation.
Survey research for evaluation: advanced issues in design and implementation (HALF DAY – pm)
presented by Benoît Gauthier; Consultant and President of the Canadian Evaluation Society, Canada
Survey research is ubiquitous in evaluation. While this method is extremely flexible and highly useful, it raises a number of issues that need to be addressed to ensure that results are useful to the evaluation process. The facilitator will structure a brief presentation of each topic and will frame the issues encountered as well as solutions he has identified in the literature and in his own practice. Participants will be expected to contribute with their own experience and examples, and to apply critical thinking throughout the seminar.
The purpose of this session will be to review and discuss four advanced topics:
- moving from the evaluation framework to the planning of a survey (operationalization of concepts, sampling, scales, budgeting);
- making sure that the meaning of questions is shared (pretesting and translation);
- approaches to enhancing response rates;
- assessing survey research conducted by others.
Engaging stakeholders in the evaluation journey: strategic evaluation methods and tools to make a difference
presented by Zita Unger; Ziman, Melbourne, Australia and Anthea Rutter; The Centre for Program Evaluation, Melbourne, Australia
This workshop will help participants appreciate how evaluation can be strategic in their organisation, and how this differs from traditional approaches that fail to ask strategic questions. Participants are introduced to a variety of tools that support a strategic evaluation approach and are familiarised with the facilitators’ model, Strategic and Tactical Evaluation Management (STEM).
The workshop will include:
- what strategic questions are and how they make a difference
- how evaluation can be strategic in the organisation
- overview of Investment Logic Maps in the workplace
- the STEM Stakeholder Information Needs approach, which supports strategic evaluation
- practical and case study examples.
The specific objectives of the workshop are to:
- provide a conceptual framework for strategic evaluation
- encourage participants to consider the benefits of strategic evaluation in their workplace
- discuss barriers and enablers for implementation.
The workshop is highly interactive, providing opportunities for group discussion and exercises based on practical tools and scenarios. Take-away reference materials are included. It is directed towards those working in the corporate, non-profit and government sectors, and is pitched at beginning/intermediate levels.
The Evaluators’ Professional Learning Competency Framework calls for the development of a range of competencies for conducting evaluation activities. This workshop enables participants to address the following domains of competence:
- Evaluation Theory
- Culture, Stakeholders and Context
- Research Methods and Systematic Inquiry
- Interpersonal Skills
- Evaluation Activities
The overall value of this workshop will be a sharing of experiences and case examples that incorporate the tensions and complexities of organisational development and change. The workshop will introduce key challenges for evaluation as well as implications for strategic development and learning in an organisational context.
Fundamentals of good evaluation reporting
presented by Anne Markiewicz; Director, Anne Markiewicz and Associates, Melbourne, Australia
The workshop will provide participants with a usable framework that imparts key principles and approaches for credible evaluation reporting. It will provide commissioners, producers and users of evaluation reports with guidance for assessing the quality and comprehensiveness of reports.
Participants will develop an appreciation and understanding of the key elements that make for effective evaluation reporting. The knowledge and practices will assist them in both producing credible evaluation reports and reviewing reports produced by others.
The workshop will cover:
- ten useful features of effective evaluation reporting that participants can use as a checklist when writing or reviewing evaluation reports
- why these features have been selected and the role they play in producing credible evaluation reports
- different types and formats of evaluation reporting
- a table of contents for a comprehensive summative evaluation report.
A mix of training methods will be used, including a PowerPoint presentation and opportunities for small-group discussion and structured exercises to assist with application of the principles.
Domains of the AES Evaluators’ Professional Competency Framework addressed in this workshop include the following:
- Evaluative Attitude and Professional Practice: the workshop focuses on achieving professionalism in the production of credible evaluation reports
- Evaluation Theory: the approach to reporting presented is strongly theory based, particularly utilising Program Theory, Program Logic and Evaluation Questions to structure evaluation reports
- Research Methods and Systematic Inquiry: the workshop adopts a strong emphasis on reports presenting the systematic enquiry from which evaluative judgments have been formed.
Advanced skills in realist evaluation
presented by Gill Westhorp; Community Matters Pty Ltd; Charles Darwin University; Adelaide, Australia
Realist evaluation is purpose-designed to ‘cross boundaries’. The realist approach to context makes realist approaches useful in multi-site, multicultural, multi-country and international development evaluations. The realist approach to ‘theory’ means that it can be used in many sectors and in evaluations that cross sectoral boundaries. It uses mixed methods and multiple data sources. Individual evaluations can be multi-disciplinary, crossing knowledge and practice boundaries.
The use of realist evaluation has been expanding over recent years, and a significant number of Australian evaluators have attended introductory training in the approach. This workshop will provide opportunities to develop more advanced skills in specific areas that are necessary for realist evaluation (Evaluators competencies 4 and 5). An intermediate level of evaluation knowledge and a working understanding of realist evaluation will be assumed.
The first session will provide a quick reminder of the key concepts in realist evaluation and explain why these concepts matter for ‘crossing boundaries’. It will explain how ‘context’ differs from ‘moderators’ and ‘mechanisms’ differ from ‘mediators’. It will consider different ways of thinking about mechanisms for different levels of systems and provide opportunities to practice identifying mechanisms at different levels.
The second session will address differences between traditional approaches to program theory and realist program theory, and provide opportunities to practice writing realist propositions to be tested through realist evaluations.
The third session will deal with realist interviewing. It will explain and provide opportunities to practice the specific skills and approaches required for realist interviewing, and their implications for research ethics applications, evaluation design, and evaluation management.
The final session will deal with analysis of interview data. It will compare ‘bottom up’ and ‘theory testing’ approaches. An opportunity to practice analysis of interview data will be included.
All sessions will be highly interactive and will include opportunities for questions and discussion.
Data visualisation in evaluation – seeing is believing
presented by Ellen Vasiliauskas; Director, d-sipher, Sunshine Coast, Queensland Australia
Participants will gain an overview of the meaning of data visualisation as it applies to evaluation and current trends. They will learn techniques for applying data visualisation to reporting and presentation of results in evaluation. The benefits of these techniques are many – from more effective communication of results and clarity on the meaning of large amounts of complex data to better engagement of decision makers and recall of information.
Participants will gain knowledge in various methods and processes of data visualisation in quantitative and qualitative applications. Techniques will include modelling, developing visual maps, the use of metaphor and storytelling, use of colour to convey meaning, graphical presentation, mind maps, storyboards and so forth.
The workshop will explore:
- definitions of data visualisation as it applies to evaluation
- the case for data visualisation and the needs of stakeholders
- how audiences absorb data and information
- how data can become clear, intuitive and even fun
- what makes for clear data visualisation and what makes things murky
- strategies for enhancing visual presentation and reporting
- rules for charts and simple quantitative presentation formats
- options for visualising qualitative data
- useful visualisation techniques in MS Excel.
The aims of this workshop are to:
- understand broad directions and trends in data visualisation
- understand the meaning of data visualisation as it applies to evaluation
- understand the relevance of data visualisation and needs of evaluation audiences and stakeholders
- understand the difference between good and bad data visualisation
- understand fundamental rules for charts and simple quantitative presentation formats
- explore options for visualising qualitative data
- be aware of the resources available in data visualisation.
Managing external evaluations: getting value for your organisation
presented by Marlène Läubli Loud; Consultant and Trainer, United Kingdom
Evaluation is a powerful tool that can provide useful, evidence-based information to help inform and influence policy and practice. But useful evaluations depend not only on the evaluators’ skills but also on those of the commissioners and contract managers. Evaluation commissioners and evaluation managers determine when and what should be evaluated; they also oversee the evaluation process and the dissemination of the evaluation findings and, especially, steer discussions on what use will be made of the conclusions and recommendations. Evaluative information is therefore more likely to be useful to the commissioning organisation, and effectively used, when it is well planned, coordinated and managed.
But useful evaluation relates not only to individual evaluation studies; evaluation’s utility for adding value to the organisation also depends on the degree to which an evaluation and learning culture is embedded in the organisation. This means that the organisation recognises and appreciates evaluation’s role and the diverse functions it can have, particularly in helping it understand what it is achieving and where and how improvements can be made.
This workshop will explore the two major frameworks that have been discerned in the literature for managing evaluation: (1) for the management of one or more specific studies, and (2) for the institutionalisation and management of the structures and processes needed to support a sustainable evaluation policy.
This workshop will:
- explain and discuss the six-stage approach to evaluation management regarding both (1) and (2) above
- explore the types of challenges common to evaluation management
- identify the main types of strategies being used to address these challenges in different settings
- explore workshop questions, practical problems and examples from the group.
It aims to enable participants who manage evaluation to improve their planning, design and management of evaluation studies. Equally, it aims to help them identify useful strategies for assuring a useful and sustainable evaluation service for their organisation.
At the end of the workshop, participants will:
- understand the key concepts of the two main evaluation frameworks
- be familiar with the key elements of a step-by-step procedure for planning, designing and managing useful and useable evaluation studies
- be able to compare and contrast their current work practices with course-derived knowledge so as to identify where positive changes could be made.
Course method: the method is based on the principles of reflective professional practice, achieved through both a participatory and case study approach, and will draw on practice-based experiences. The course will combine delivery methods including presentations, discussion and group exercises.
This workshop is for:
- staff responsible for commissioning and/or managing evaluations and monitoring processes in their organisation
- teachers of policy and programme evaluation
- evaluation practitioners
- anyone interested in learning about good practices in evaluation management, within their work setting.
The workshop is especially relevant to staff responsible for evaluation issues in non-governmental and governmental organisations alike. It offers those interested in learning about the challenges and experiences of managing evaluations (e.g. in an internal evaluation unit) the opportunity to share experiences and good practices with peers.