Pre-conference workshop program: Monday 17 September 2018



WORKSHOP DETAILS

Codesign and evaluation for social innovation

presented by Penny Hagen, Auckland Co-design Lab, The Southern Initiative, Auckland Council

FULL DAY – BEGINNER / INTERMEDIATE / ADVANCED

This workshop explores how co-design and evaluative practice are being brought together to support new and developmental ways of working across national and local government, NGOs, business, frontline workers and community members.

Co-design is increasingly being referred to and used in the context of developing social policy, programmes and innovations, and for driving ‘social innovation’. Tools from design and co-design are gaining traction as a means to better understand and respond to complex social challenges through a human-centred and systems lens. At the same time, evaluative practices and mindsets help us to know where to best focus our efforts and whether such efforts are making a difference. Even so, co-design and evaluation are most often commissioned as separate processes, functions or stages that have no formal relationship with each other. For social practitioners, evaluation may be perceived as ‘outside’ the real work of social change.

In this workshop we will use a range of examples to explore how these two disciplines or mindsets can borrow from and complement each other across the many different implementation scenarios in which they are applied. We acknowledge that this is still new territory. We will work together to explore the prospect of increasing social impact through the integration of co-design and evaluative practice.

The workshop will explore:

  • a working definition of co-design for social innovation
  • the relationship between co-design and evaluative practice
  • opportunities for using and growing evaluative thinking in co-design
  • opportunities for using and growing co-design in evaluation
  • critical contexts shaping co-design and evaluation (place, culture, timing)

This will be an interactive workshop where we will work to build a shared understanding of the opportunities in this integrative space. The workshop material will draw upon practice and research with collaborators from across Australia and Aotearoa New Zealand. This recent research has focused on the local and cultural dimensions of co-design and evaluative practice and on place-based social innovation initiatives.

About the presenter:
Dr Penny Hagen assists organisations and teams to apply co-design and design-led approaches to the design and implementation of strategy, programs, policies and services. She specialises in projects with social outcomes and supports capability building with teams and organisations wanting to increase their social impact through the adoption of more participatory and evaluative approaches.

Penny has supported cross-sector teams and communities in Australia and New Zealand responding to complex social issues including youth employment, youth mental health and wellbeing, chronic health conditions and family violence. Penny has a PhD in participatory design, and much of her work involves integrating disciplines such as health, design and evaluation to increase impact on social outcomes. She has worked with organisations such as the Young and Well CRC in Australia, developing a framework to integrate Participatory Design and traditional evidence-based approaches for the design of mental health interventions for young people. She has also worked with Lifehack in New Zealand, supporting the development of a design and evaluation model for innovation in youth wellbeing. She is currently Co-design Lead at the Auckland Co-design Lab, helping to build co-design and social innovation capacity across public service and community teams.



Developing Monitoring and Evaluation Frameworks

presented by Anne Markiewicz, Director of Anne Markiewicz and Associates

FULL DAY – BEGINNER / INTERMEDIATE

The development and implementation of Monitoring and Evaluation Frameworks at strategy, program and project levels are important for providing an indication of the results achieved and for resourcing organisational learning. A Monitoring and Evaluation Framework defines the parameters of the routine monitoring and periodic evaluation that will take place over the life of a program or initiative. The workshop provides participants with useful, step-by-step practical guidance for developing a Monitoring and Evaluation Framework, supported by relevant background and theory. It presents a clear and staged conceptual model, discusses design and implementation issues, and considers barriers and impediments together with strategies for addressing them.

Participants will learn the format and approach for developing a Monitoring and Evaluation Framework and the range of techniques and skills involved in its design and implementation, and will develop an appreciation of the parameters of the tasks involved and how to approach them. In particular, participants will learn:

  • the value and purpose of investing in and developing Monitoring and Evaluation Frameworks
  • the participatory approach and processes involved in developing such frameworks
  • the steps and stages involved and the suggested 'Table of Contents' for constructing a Monitoring and Evaluation Framework.

The trainer will alternate between a PowerPoint presentation and small-group interactive work. The workshop follows a case study approach and involves participants in developing a Monitoring and Evaluation Framework for the case study. In this way, the training is participatory and hands-on while still conveying sufficient theory and context.

About the presenter:
Anne Markiewicz is the Director of Anne Markiewicz and Associates, a consultancy that specialises in developing Monitoring and Evaluation Frameworks and in delivering training, mentoring and capacity building in monitoring and evaluation. Anne is the co-author of the textbook Developing Monitoring and Evaluation Frameworks (Sage, 2016). She has extensive experience in designing and implementing monitoring and evaluation frameworks for a wide range of initiatives and in building the capacity of organisations to plan for monitoring and evaluation. Anne has been an evaluator for over 20 years, has been recognised by the Australasian Evaluation Society with a number of awards for excellence in evaluation, and is a Fellow of the Society. She has delivered this training program extensively in Australia, the Pacific, the USA and the UK.



The basics of systems thinking and its application to systems evaluation

presented by Ralph Renger, Center for Rural Health, University of North Dakota, Grand Forks, North Dakota, USA

FULL DAY – INTERMEDIATE

The purpose of the workshop is to help evaluators understand the difference between using systems thinking and systems theory to evaluate programs and using them to evaluate modern-day systems.

The specific workshop objectives are to understand how to apply system evaluation theory (SET) to:

  1. define a system and know why this is critical for evaluating system efficiency and effectiveness
  2. evaluate system efficiency, including system attributes (e.g., leadership, training, information technology), feedback mechanisms, and trigger points for cascading failures
  3. evaluate system effectiveness.

The workshop will use mixed teaching methods. Group discussion will use the Socratic teaching method, posing questions to participants that draw on their own experience, to highlight the differences and benefits of systems thinking and systems theory. Then a simple, practical case example requiring no substantive content expertise will be used to teach participants how to apply SET to a) define a system and understand why this is critical for evaluating system efficiency and effectiveness, b) evaluate system efficiency, including system attributes (e.g., leadership, training, information technology), feedback mechanisms, and trigger points for cascading failures, and c) understand how interconnectedness and wholeness affect the evaluation of system effectiveness. Next, real-world case scenarios based on projects the presenter has personally evaluated will be presented to illustrate the application of SET. The workshop will conclude with a group discussion reflecting on how what was learned can be applied to participants’ current evaluation projects.

The workshop targets intermediate-level evaluators who are already proficient at program evaluation and would like to learn how those skills translate to conducting a systems evaluation.

About the presenter:
Dr Ralph Renger spent the first twenty years of his evaluation practice advancing theory-driven methods for improving program evaluation. Several publications arose from this work, including the ATM approach to logic modeling, using source documentation to reconstruct program theory, and using root cause analysis to construct a context map and to facilitate process evaluation. These methods were applied primarily in privately and government-funded programs in the health care sector. Over the last four years he has switched focus to developing theory-driven methods to advance system evaluation: this is NOT the application of systems thinking to program evaluation, but rather the application of systems thinking to evaluating modern-day systems. He has published a series of system evaluation articles in the Evaluation Journal of Australasia (EJA), including the seminal System Evaluation Theory (SET). He continues to evolve SET through its application in the emergency response and public health prevention sectors.



Making it stick – creating an evaluation report for impact and use

presented by Samantha Abbato, Director, Samantha Abbato and Associates

FULL DAY – BEGINNER / INTERMEDIATE

This workshop is designed for professionals who commission, write or use evaluation reports. It will benefit those who are new to evaluation as well as those at intermediate level.

It is for evaluators and other professionals who would like to:

  • Learn how to communicate evaluation findings effectively to a range of stakeholders.
  • Ensure the report communicates the key components of the evaluation.
  • Produce a report that is “sticky” and memorable.
  • Get their report noticed using design principles.
  • Learn how evaluation reports can be best utilised as an effective catalyst for positive organisational change.

The workshop will be interactive, involving the sharing of experiences as well as hands-on activities. Case studies from the presenter’s experience in health, community, Aboriginal and Torres Strait Islander, and culturally and linguistically diverse settings will be discussed. Participants will be provided with opportunities to apply new skills to their own work, and are invited to bring examples of completed or in-progress evaluation reports to discuss.

Participants will learn how to maximise report reach, engagement and use through applying strategies based on current psychological and communication principles and theory. Innovative reporting techniques using photo, film, story and online tools will be introduced.

Four main areas will be covered:

  1. Communication principles: Basic principles and models, communications planning, tailoring to stakeholders and utilisation-focused principles.
  2. Report content and structure: Essential and optional contents, what to look for when designing and assessing reports, thinking outside the traditional evaluation report.
  3. Making the report stick: Communicability and the rule of three, formulating feasible and actionable recommendations, story for memorability, readability.
  4. Getting the report noticed: Basic design principles and their practical application, catering to diverse stakeholders, and empathy mapping.


About the presenter:
Samantha Abbato is an evaluation consultant and director of Visual Insights, which takes a pictures-and-stories approach to evaluation. Sam has completed more than 100 evaluation and research reports and papers for a range of government and non-government organisations and community stakeholders. She has published numerous book chapters and peer-reviewed journal articles and worked as a freelance journalist for several years. Sam was the recipient of the 2015 AES Evaluation Publication Award (Caulley Tulloch Award). Her academic qualifications include a PhD (Epidemiology) and an MPH (Epidemiology/Biostatistics) from UC Berkeley (1997) and a Bachelor of Science in Mathematics with Honours in Physiology from Monash University (1988). In addition, she lectured in epidemiology and qualitative methods at the University of Queensland (1997-2000).

Sam has worked as an independent consultant for more than twelve years and has thirty years of health and community sector experience. She is a specialist in health and community sector evaluation with extensive experience in qualitative and quantitative evaluation approaches, working with Aboriginal and Torres Strait Islander communities and a range of culturally and linguistically diverse communities, including refugees.

With a passion for communication and maximising evaluation use, and an extensive understanding of the evaluation commissioner’s perspective through her Visual Insights organisational capacity-building work, Sam is able to offer a wealth of case studies of the good, the bad and the ugly of evaluation reporting. Drawing on her own shift from an academic approach to a “use focus” aimed at achieving outcomes in partnership with organisations, she shares effective engagement and communication strategies that are responsive to organisations’ needs whilst maintaining the integrity of method and evidence.



Valuing social outcomes to demonstrate impact

presented by Arjun Ravi and Taimur Siddiqi, The Incus Group

FULL DAY – INTERMEDIATE / ADVANCED

The funding landscape has been transformed in recent years as governments shift from block funding of initiatives towards payment by results, commercial interests are harnessed through impact investing, and funders increasingly expect real-time evidence and demonstrated value for money. Traditional evaluations that look at value for money either describe the value qualitatively or deploy cost-benefit analysis, which often does not quantify ‘intangible’ outcomes such as employment readiness or community cohesion, meaning these outcomes are overlooked or relegated in decision making. It is essential that evaluation practice evolves to counter this and values outcomes by blending economic valuation techniques with in-depth stakeholder engagement using Social Return on Investment (SROI). SROI is a leading, internationally recognised methodology for measuring and valuing the impact of a program, organisation or policy. By enabling social, environmental and economic costs and benefits to be represented in a common unit, SROI allows organisations and evaluators to speak the language of many funders while still ‘valuing what really matters’.
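
As a minimal illustration of the ‘common unit’ idea (the figures below are hypothetical and are not drawn from the workshop): outcomes are translated into dollar terms using financial proxies, discounted to present value, and compared with the value of the inputs to form a ratio:

\[
\text{SROI ratio} = \frac{\text{present value of outcomes}}{\text{value of inputs}} = \frac{\$300\,000}{\$100\,000} = 3:1
\]

That is, an estimated three dollars of social value for every dollar invested.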

This interactive workshop will focus specifically on how to undertake an SROI analysis to value outcomes as part of ongoing monitoring and evaluation (M&E) activities, using M&E data. It will also encourage participants to consider the benefits and challenges of valuing outcomes. The workshop is based on peer learning, with a series of cooperative learning exercises and opportunities for group discussion. Participants will be asked to bring their own examples and will be provided with take-home templates and resources to assist them with their analyses.

The learning outcomes are:

  • ability to apply SROI techniques in your M&E work
  • ability to identify appropriate financial proxies
  • ability to interpret SROI ratios
  • ability to use SROI principles to interpret programs more critically and make more effective funding decisions

This workshop enables participants to address the following domains of competence:

  • Culture, Stakeholders and Context
  • Research Methods and Systematic Inquiry
  • Evaluation Activities

The workshop is delivered by a Social Value International accredited practitioner and trainer, and is designed for those with intermediate evaluation experience and an interest in better measuring value for money and cost-effectiveness.

About the presenters:
Arjun Ravi is director of The Incus Group and an experienced impact measurement and evaluation professional with extensive experience advising clients in the corporate, not-for-profit and government sectors. Prior to founding The Incus Group in 2015, he spent five years with Net Balance and later EY Australia, establishing and building the social sustainability practice.

His experience in impact measurement has focused on leading innovative engagements that seek to measure, value and manage the often “intangible” impacts of organisations’ operations and investments. He is accredited by Social Value International as a Social Return on Investment (SROI) practitioner, is a licensed trainer of the SROI methodology, and has trained hundreds of diverse participants worldwide. Project highlights include working with Save the Children to forecast the social return of a suite of their child protection initiatives, evaluating the impact of a federally funded Indigenous health promotion grants program, and determining the social value of providing improved transportation access to persons with a disability in order to advocate for greater funding support.

Arjun holds a Bachelor of Finance degree from the University of Illinois and a Master of International Development from Monash University.

Taimur Siddiqi is an experienced practitioner and trainer of the Social Return on Investment (SROI) methodology who has completed numerous SROI projects and delivered accredited training to over 100 individuals. Taimur has applied SROI valuation techniques in his role as an evaluation consultant working with a range of public sector, corporate and not-for-profit organisations. His project work has included a peer-reviewed SROI analysis of Indigenous financial counselling, a cost-effectiveness evaluation of a child protection community legal pilot, and an extensive SROI research project for Foodbank Australia. Prior to co-founding The Incus Group, he spent four years with Net Balance, where he eventually managed the $2 million Social Impact business line.
