Conference hashtag #aes24MEL
Friday, September 20
 

9:00am AEST

Plenary: Indy Johar "Navigating transitions through risk and uncertainties"
Friday September 20, 2024 9:00am - 10:00am AEST
Indy Johar, RIBA registered architect, serial social entrepreneur, and Good Growth Advisor to the Mayor of London, UK

Keynote address: Navigating transitions through risk and uncertainties

Abstract to follow.
Chair
Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have co-founded and grown an organisation (First Person Consulting) to a team of 16 people working…
Speakers
Indy Johar

RIBA registered architect, serial social entrepreneur, and Good Growth Advisor to the Mayor of London, UK
Indy Johar is an RIBA registered architect, serial social entrepreneur, and Good Growth Advisor to the Mayor of London. Indy was born in Acton, West London and is a lifelong Londoner. He is focused on the strategic design of new super-scale civic assets for transition – specifically…
Friday September 20, 2024 9:00am - 10:00am AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Involving children and young people in evaluations: Equity through active participation
Friday September 20, 2024 10:30am - 11:00am AEST
Authors: Sharon Marra-Brown (ARTD Consultants), Moya Johansson (ARTD Consultants, AU)

Think it's important to enable children and young people to have a voice in evaluations, but find it challenging? This paper presents tried and tested strategies for ensuring ethical engagement with children and young people and encouraging meaningful participation.

Involving children and young people in evaluation is critical to ensure that we arrive at evaluations that accurately reflect their experiences and capture the outcomes they consider most important. Children and young people have the right to have a say about their experiences, and evaluations that avoid their involvement risk perpetuating ongoing inequities.

However, involving children and young people in evaluations can prompt ethical concerns in relation to their comprehension of research, their capacity to provide consent, potential coercion by parents, and potentially conflicting values and interests between parents and children. Depending on the subject, it can also create concerns about safety and readiness.

Based on our experience successfully achieving ethics approval for multiple evaluations of services for children and young people across Australia, which include interviews with children and young people who have accessed these services, we will talk through considerations for ensuring the voice of children and young people in evaluation while safeguarding them from unnecessary risks.

We will then take you through how we've overcome challenges engaging children and young people in evaluations with youth-centred innovative solutions, including carefully considering the language we use and how we reach out. We will demonstrate the developmental benefits of meaningful participation of children and young people once ethical considerations have been carefully considered and navigated.

Finally, we will take you through our tips for ensuring meaningful and safe engagement with children and young people. We will point you in the direction of Guidelines and practice guides for involving young people in research and evaluation in a safe and meaningful way.

The presenters are evaluators with extensive experience in designing, delivering and reporting on evaluations that include data collection with children and young people. This includes recently achieving ethics approval for, and commencing, interviews with children as young as seven who are accessing a suicide aftercare service.

While much attention is devoted to ensuring safe and inclusive data collection with various demographics, specific considerations for engaging children and young people remain relatively uncommon. Recognising the unique needs of this population, coupled with the understandably cautious stance of ethics committees, underscores the necessity for a thoughtful and deliberate approach to evaluations involving children and young people.

Given the additional complexities and ethical considerations involved, the default tendency can be to exclude children and young people from evaluation processes. However, it is important that children and young people are able to have a say in the programs, policies and services that they use. Participation in evaluations can be a positive experience, if risks are managed and the process is designed to be empowering.

This session will provide valuable insights, actionable strategies, and an opportunity for participants to reflect on their own practices, fostering a culture of inclusivity and responsiveness in evaluation.
Chair

Laura Bird

MERL Associate, Paul Ramsay Foundation
Speakers
Sharon Marra-Brown

Director, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence. Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade-plus in the public service, or how I weave…
Mitchell Rice-Brading

ARTD Consultants
I started with ARTD in early 2022 after completing my Bachelor of Psychological Science (Honours) in 2021. This, in combination with experience as a Psychology research assistant, helped me develop strong research skills, namely the ability to synthesise and critically evaluate qualitative…
Friday September 20, 2024 10:30am - 11:00am AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Following the (matched) data to understand impact: adventures in quasi-experimental evaluation
Friday September 20, 2024 10:30am - 11:00am AEST
Authors: Mohib Iqbal (Department of Education), Kira Duggan (Department of Education, AU), Ben McNally (Department of Education, AU)

This presentation will showcase the use of quasi-experimental impact evaluation and of a relatively new data linkage capability within the Victorian public sector.
Impact evaluation provides important evidence on program effectiveness and helps to inform government investment decisions. Quasi-experimental design identifies a comparison group that is similar to the treatment group/program participants in terms of baseline or pre-intervention characteristics. Statistical methods such as propensity score matching and regression discontinuity can create valid comparison groups with a reduced risk of bias (White & Sabarwal, 2014).
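
For readers unfamiliar with the technique, the matching idea described above can be sketched in a few lines of Python. The unit IDs, propensity scores and outcomes below are invented for illustration (they are not from the evaluations discussed), and a real analysis would first estimate the scores, typically via logistic regression of treatment status on baseline covariates.

```python
# Minimal sketch of nearest-neighbour propensity score matching.
# Scores are hard-coded here; in practice they come from a model
# of treatment status on pre-intervention characteristics.

# (unit_id, propensity_score, outcome) - illustrative values only
treated = [(1, 0.62, 14.0), (2, 0.48, 11.5), (3, 0.71, 15.2)]
controls = [(4, 0.60, 12.1), (5, 0.45, 11.0), (6, 0.70, 13.9), (7, 0.30, 9.8)]

def match_nearest(treated, controls):
    """Pair each treated unit with the closest-scoring unused control."""
    pairs, available = [], list(controls)
    for t_id, t_score, t_outcome in treated:
        best = min(available, key=lambda c: abs(c[1] - t_score))
        available.remove(best)  # matching without replacement
        pairs.append(((t_id, t_outcome), (best[0], best[2])))
    return pairs

pairs = match_nearest(treated, controls)

# Average treatment effect on the treated: mean outcome gap across pairs
att = sum(t_out - c_out for (_, t_out), (_, c_out) in pairs) / len(pairs)
```

With these toy numbers, treated units 1, 2 and 3 are matched to controls 4, 5 and 6, and the effect estimate is the average of the within-pair outcome differences.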

However, implementing these methods involves significant technical, data availability, and other challenges.
The Evaluation and Program Impact (EPI) branch at the Victorian Department of Education (DE) used quasi-experimental assessment as part of six different education program evaluations, spanning issues from teacher supply to support for vulnerable students. The approach was used both to evaluate impact/effectiveness and, in economic evaluations, to measure avoided costs. The presentation will outline the design, methodology and implementation of the quasi-experimental methods used across these six evaluations.

Key enablers of the use of quasi-experimental designs are data availability and expertise in undertaking advanced quantitative impact evaluations. This presentation will give an overview of the types of departmental data used (such as regularly administered student, parent/carer, teacher and school leader surveys, assessment results such as NAPLAN and administrative data) as well as the relatively new analytical capability available through linked service use data from the Victorian Social Investment Integrated Data Resource (VSIIDR) and Centre for Victorian Data Linkage (CVDL).
The presentation also contextualises quasi-experimental impact evaluations as one component of mixed-method approaches that were staged after evaluation of appropriateness, design and fidelity. Decisions on intervention effectiveness were made using a broader array of evidence, with quasi-experimental impact evaluation one of many sources.
Chair
Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre…
Speakers
Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development…
Mohib Iqbal

Senior Evaluation Officer, Department of Education
I am a multi-disciplinary evaluator and researcher with 15 years of experience across education, health, international development, social protection, and migration sectors. I currently work for the Department of Education in Victoria and have previously worked with the World Bank…
Ben McNally

Manager, Evaluation and Research, Department of Education, Victoria
I have worked on evaluation and social research projects in consultancy and public sector settings. This has included evaluating reform programs in social services, employment, and school education. Talk to me about: evaluation practice in the Victorian Public Sector; in-house evaluation…
Friday September 20, 2024 10:30am - 11:00am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

To tinker, tailor, or craft from scratch? The tension in using validated tools in novel contexts
Friday September 20, 2024 10:30am - 11:00am AEST
104
Authors: Matt Healey (First Person Consulting), Alicia McCoy (First Person Consulting, AU), Tenille Moselen (First Person Consulting, AU)

In the dynamic realm of evaluation methodology, the discourse surrounding the use of validated tools versus the customisation of evaluation metrics remains a contentious issue. This session delves into the inherent risks associated with both approaches. The issue is often compounded when those in positions of power prefer validated tools over context-specific data collection questions or approaches. The tension this elicits is only increasing when evaluating digital interventions for which there is no direct tool to draw upon, leaving evaluators to navigate uncharted territory.

Moreover, there is an ever-increasing range of validated tools available, but little direction for evaluators – particularly emerging and early-career evaluators – to assist in deciding among them. This session presents experiences from a range of digital and in-person projects, exploring scenarios where there was no 'obvious solution'. It will be of particular relevance to those undertaking evaluations of digital and novel programs.

Through candid dialogue and shared anecdotes, participants will reflect on their experiences in navigating decisions to adopt, adapt, or reject validated tools, and the learning that resulted. Embracing controversy, this session encourages attendees to challenge conventional wisdom and critically examine the balance between the reliability of validated tools, the importance of fitting data collection to context, and most importantly what 'good' looks like.

Join the conversation as we navigate the complex landscape of evaluation methodology, exploring the tensions between established practices and the pursuit of innovation in evaluation processes.

Chair
Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of…
Speakers
Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have co-founded and grown an organisation (First Person Consulting) to a team of 16 people working…

Tenille Moselen

First Person Consulting
https://www.fpconsulting.com.au/our-team.html
Alicia McCoy

Principal Consultant, First Person Consulting
Alicia has 15 years of experience leading research and evaluation teams in the not-for-profit sector and is passionate about the role that research and evaluation plays in creating lasting change for individuals, families and communities. Alicia's areas of interest include evaluation…
Friday September 20, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

The Road Home: an evaluation journey to discover and demonstrate a new and exciting way to deliver a crisis housing response
Friday September 20, 2024 10:30am - 11:30am AEST
105
Authors: Anne Smyth (LDC Group), Lesley Thornton (LDC Group, AU), Kym Coupe (First Step, AU), Caroline Lynch (Launch Housing, AU)

As all Wayfinders would understand, when we embarked on a developmental evaluation of the Road Home, we really had no idea how the program or evaluation would play out in practice. We did know however that the usual way of delivering crisis housing services was not working well for either clients or staff. Something needed to change. We needed to change. So, we did - we being the Road Home team working with the evaluators.

Road Home centres on a strong and engaged multidisciplinary team to deliver mental health, medical, legal and housing services to people in crisis accommodation, where they are, and when they need them most. This integrated way of working is in stark contrast to the conventional, single-discipline outreach and in-reach approaches that characterise service delivery in the community sector, and its impact has been significant.

This panel will bring leading representatives of the Road Home team and the evaluators together to explore with our audience what we have learned; what it takes to do this well, the benefits to clients, staff and participating organisations, the pitfalls and challenges and the value of developmental evaluation and its methods.

We now have a much better idea of what Road Home looks like, what it takes to support and enable it, to achieve valued outcomes and to meaningfully evaluate it. The role of evaluators and the project manager in holding the uncertain and evolving space characteristic of developmental evaluation and wayfinding is central - it has taken clarity, alignment of purpose, a lot of patience and much persistence, not to mention flexibility. It has been and remains quite the journey!
Speakers
Anne Smyth

Principal Consultant, LDC Group
I have extensive experience in working with the community and not-for-profit sectors. I am able to draw on 40 years of experience as an educator and researcher in university leadership and management development programs, and as a consultant in the fields of organisational change and…
Lesley Thornton

Principal Consultant, LDC Group
As an evaluator and organisational development consultant, I have extensive experience in government and not-for-profit sectors working in areas of policy and service development, evaluation, leadership and organisational development. Drawing on the theory and practice across these…
Kym Coupe

Project Manager, First Step
Kym is project lead for the collaborative partnership between First Step and Launch Housing and is passionate about the benefits – to both staff and consumers – of collaborative and integrated service delivery. Kym has a Master of Public Health and has worked in health and community…
Caroline Lynch

Service Manager - Women Services, Launch Housing
I am a trauma-informed, feminist leader who believes in using my influence for a more inclusive and equitable society. I oversee the four programs within the Women Services function at Launch Housing. This includes the Women's Only Crisis Accommodation, the Transitional Housing…
Friday September 20, 2024 10:30am - 11:30am AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Walking together: First Nations participation, partnerships and co-creation in Evaluation.
Friday September 20, 2024 10:30am - 11:30am AEST
106
Authors: Tony Kiessler (First Nations Connect), Alice Tamang (First Nations Connect, AU)

Effective First Nations engagement is integral in the design and delivery of culturally safe evaluations. The AES' First Nations Cultural Safety Framework discusses 10 principles for culturally safe evaluation and describes the journey of engagement. However, the question of how to engage effectively can be the first and most significant challenge faced by evaluators. There is little clarity on how to create opportunities for First Nations leadership and voices in our evaluations, how to engage appropriately, and who we should engage with. There is also the challenge of managing tight timeframes, client expectations and capabilities that can limit the focus on meaningful First Nations participation, partnership and co-creation.

This is a unique offering that enables practitioners and First Nations facilitators to walk together, explore shared challenges and identify opportunities to improve First Nations engagement. The session will explore the potential for partnerships in informing and implementing evaluations, opportunities to increase First Nations participation, privilege their experience and knowledge, and how evaluation practitioners can draw on these strengths through co-creation to amplify First Nations voices and leadership in evaluation practice.

This session aims to:
  • Explore a principles-based approach to First Nations engagement;
  • Discuss shared experiences on successful approaches to enhance First Nations partnership, participation and co-creation; and
  • Develop a shared understanding of how to take this knowledge forward through culturally safe evaluation commissioning, practice and reporting.

Discussion will draw on the collective experience of both the attendees and the facilitators, walking together. The sharing of ideas will be encouraged in a safe space that engages the audience in a collaborative dialogue with First Nations practitioners. This dialogue will explore current knowledge, capabilities and gaps, as well as the challenges (and how they can be overcome), as part of the broader journey to culturally safe evaluation practice.


Speakers
Tony Kiessler

Co-Convener, First Nations Connect
Tony is a Central Arrernte man, consultant and researcher living and working on beautiful Gundungurra Country in the NSW Southern Highlands. He is an evaluation, strategic planning and research consultant with a particular interest in health, human development and social inclusion…
Alice Tamang

Consultant, First Nations Connect
Alice is a Dharug woman based on Wurundjeri Country. She is a consultant and advisor, with a focus on facilitating connections between cultures, empowering individuals and communities to share knowledge and enhance cultural understanding. Alice primarily works on DFAT funded programs…
Nicole Tujague

Founder and Director, The Seedling Group
Nicole holds a Bachelor of Indigenous Studies (Trauma and Healing/Managing Organisations), 1st Class Honours in Indigenous Research, and a PhD in Indigenous-led Evaluation from Gnibi College, Southern Cross University. Nicole is a descendant of the Kabi Kabi nation from Mt Bauple, Queensland and the…
Friday September 20, 2024 10:30am - 11:30am AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Reviewing and writing for the Evaluation Journal of Australasia
Friday September 20, 2024 10:30am - 11:30am AEST
103
Authors: John Guenther (Batchelor Institute of Indigenous Tertiary Education), Anthea Rutter (University of Melbourne, AU), Yvonne Zurynski (Macquarie University, AU)

The Evaluation Journal of Australasia (EJA) supports evaluators who wish to share their knowledge and practical experiences in a peer-reviewed article. Documenting evidence, including for programs which do not achieve expected results, is critical for improving evaluation practice, building the evidence base, and advancing evaluation methodologies that are rigorous and ethical.

The EJA depends on volunteer reviewers who can offer critical feedback on articles that are submitted. Reviewers help to improve the quality of manuscripts the Journal receives.

The focus of this presentation is on how to write a good review: how to be academically critical, while at the same time providing constructive feedback that will benefit authors and readers. The presenters will offer step-by-step advice on what to look for, how to judge the quality of a manuscript, and how to make constructive suggestions for authors to consider.

The presentation will also explain how reviewing fits within the publication process, from submission to production. It will be most helpful to potential authors and current and potential reviewers. Authors will learn how to prepare their articles so they receive a favourable review, and reviewers will receive clear guidance on presenting their review feedback to authors.
Chair

Kate O'Malley

Consultant
I provide targeted policy, advocacy and evaluation support on refugee and migration matters drawing on a lengthy career in the United Nations and the Australian Public Service and post-graduate studies in evaluation and diplomatic practice.
Speakers
John Guenther

Research Leader, Education and Training, Batchelor Institute of Indigenous Tertiary Education
John Guenther is a senior researcher and evaluator with the Batchelor Institute of Indigenous Tertiary Education, based in Darwin. Much of his work has been based in the field of education. He has worked extensively with community-based researchers in many remote parts of the Northern…
Anthea Rutter

Research Fellow, Centre for Program Evaluation, The University of Melbourne
Anthea Rutter is a Senior Research Fellow in the Assessment and Evaluation Research Centre (formerly the Centre for Program Evaluation) at The University of Melbourne. She has extensive experience working with a wide range of community, state and national organisations. She is particularly…

Jeff Adams

Managing Editor | Senior Lecturer, Evaluation Journal of Australasia | Eastern Institute of Technology
I am the Managing Editor of the Evaluation Journal of Australasia - talk to me about publishing in, or reviewing for the journal. I also teach postgraduate Health Sciences at Eastern Institute of Technology, Auckland.
Friday September 20, 2024 10:30am - 11:30am AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Participatory Analysis Workshops: a novel method for identifying important factors across diverse projects
Friday September 20, 2024 11:00am - 11:30am AEST
104
Authors: Martina Donkers (Martina Donkers), Ellen Wong (ARTD, AU), Jade Maloney (ARTD, AU)

Some programs comprise a range of diverse projects striving towards a common goal - for example grant programs, where a wide range of different grantees are carrying out different projects with very different activities in pursuit of the grant program objectives.

These can be a challenge to evaluate - with so many different activities, outputs cannot be easily aggregated, and each project may be responding to its local context in unique but important ways. These programs need efficient ways to identify common factors affecting implementation and outcomes that reflect the richness of the activities undertaken, but do not place undue burden on organisations, particularly those receiving smaller grants.

We developed a novel method that uses participatory workshops to explore commonalities in implementation across projects and the various ways they seek to achieve common program outcomes. The theory-driven method builds on Most Significant Change and Qualitative Comparative Analysis, and combines data collection with collaborative participatory data analysis to build a rich qualitative understanding of projects in a relatively short timeframe with fewer resources. Active participation from project leaders (e.g. grant recipients) builds cohesion across the program, and helps project leaders feel more connected and supported.

This paper outlines the theory, approach and uses of Participatory Analysis Workshops, including their strengths and limitations and the types of data and insights the method can yield. We draw on our work with the NSW Reconstruction Authority, which used this method to evaluate the Covid Community Connection and Wellbeing Program, to illustrate what we've learnt about how the method works and in what circumstances, and then identify other potential use cases. Participants will have an opportunity to ask questions to help inform future uses of this method. This information will equip evaluators with tools to navigate varying territory together to understand progress toward program outcomes.

Chair
Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of…
Speakers
Martina Donkers

Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured…
Ellen Wong

Consultant, ARTD Consultants
I'm a consultant at ARTD with a background in human geography and environmental studies. I bring this lens to the work I do and am particularly passionate about the intersection between people and the environment. My portfolio spans environmental policy, disaster recovery and community…
Jade Maloney

Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research…
Friday September 20, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Impact evaluation: bringing together quantitative methods and program theory in mixed method evaluations
Friday September 20, 2024 11:00am - 12:00pm AEST
Authors: Harry Greenwell (Australian Treasury), To be determined (Australian Treasury, AU)

This session will provide an overview of some of the main quantitative methods for identifying the causal impacts of programs and policies, while emphasising the importance of mixed-methods that also incorporate program theory and qualitative research. It is intended for people unfamiliar with quantitative evaluation methods who would like to develop their understanding of these methods in order to better contribute to theory-based, mixed method impact evaluations.

The session will cover three of the most common quantitative approaches to separating causality from correlation: i) mixed-method RCTs, ii) discontinuity design, and iii) matching. Each method will be explained with real examples. The session will also cover the benefits and limitations of each method, and considerations for determining when such methods might be suitable either on their own or as a complement to other evaluation methods or approaches.
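
For readers new to discontinuity designs, the core idea can be illustrated with a small sketch: units just below a cutoff receive the program, units just above do not, and the effect is read off as the jump in outcomes at the cutoff. The numbers below are invented for illustration only; real analyses typically fit local linear regressions within a data-driven bandwidth rather than comparing raw local means.

```python
# Hypothetical sharp regression discontinuity design (RDD).
# Units with the running variable below the cutoff receive the program;
# the effect is estimated as the jump in mean outcomes at the cutoff,
# using only observations within a bandwidth on each side.

cutoff, bandwidth = 50.0, 10.0

# (running_variable, outcome) pairs - illustrative values only
data = [
    (42, 20.0), (46, 21.0), (49, 22.0),  # below cutoff: treated
    (51, 17.0), (54, 16.5), (58, 15.5),  # above cutoff: untreated
]

below = [y for x, y in data if cutoff - bandwidth <= x < cutoff]
above = [y for x, y in data if cutoff <= x < cutoff + bandwidth]

# Local mean difference at the cutoff approximates the program effect
effect = sum(below) / len(below) - sum(above) / len(above)
```

The credibility of the estimate rests on units just either side of the cutoff being comparable, which is why the comparison is restricted to a bandwidth around it.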

Special attention will be given to the ethical considerations inherent in the choice of impact evaluation method, including issues related to consent, fairness, vulnerability, and potential harm.

After attending this session, participants will have a better understanding of: how program theory can inform the design of quantitative impact evaluations, including through mixed-method impact evaluations; and how to identify when certain quantitative impact evaluation methods may be suitable for an evaluation.
Chair
Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre…
Speakers
Harry Greenwell

Senior Adviser, Australian Treasury
Harry Greenwell is Director of the Impact Evaluation Unit at the Australian Centre for Evaluation (ACE) in the Australian Treasury. He previously worked for five years at the Behavioural Economics Team of the Australian Government (BETA). Before that, he worked for many years in the…
Vera Newman

Assistant Director
Dr Vera Newman is an Assistant Director in the Impact Evaluation Unit at the Australian Centre for Evaluation. She has many years' experience conducting impact evaluations in the private and public sectors, and is dedicated to applying credible methods to public policy for generating…
Friday September 20, 2024 11:00am - 12:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Value Propositions: Clearing the path from theory of change to rubrics
Friday September 20, 2024 11:00am - 12:30pm AEST
Authors: Julian King (Julian King & Associates Limited), Adrian Field (Dovetail Consulting Limited, NZ)

Evaluation rubrics are increasingly used to help make evaluative reasoning explicit. Rubrics can also be used as wayfinding tools to help stakeholders understand and participate meaningfully in evaluation. Developing rubrics is conceptually challenging work and the search is on for additional navigation tools and models that might help ease the cognitive load.

As a preliminary step toward rubric development it is often helpful to co-create a theory of change, proposing a chain of causality from actions to impacts, documenting a shared understanding of a program, and providing a point of reference for scoping a logical, coherent set of criteria.

However, it's easy to become disoriented when getting from a theory of change to a set of criteria, because the former deals with impact and the latter with value. Implicitly, a theory of change may focus on activities and impacts that people value, but this cannot be taken for granted - and we argue that value should be made more explicit in program theories.

Specifying a program's value proposition can improve wayfinding between a theory of change and a set of criteria, addressing the aspects of performance and value that matter to stakeholders. Defining a value proposition prompts us to think differently about a program. For example, in addition to what's already in the theory of change, we need to consider to whom the program is valuable, in what ways it is valuable, and how the value is created.

In this presentation, we will share what we've learnt about developing and using value propositions. We'll share a simple framework for developing a value proposition and, using roving microphones, engage participants in co-developing a value proposition in real time. We'll conclude the session by sharing some examples of value propositions from recent evaluations.

Chair

Laura Bird

MERL Associate, Paul Ramsay Foundation
Speakers

Julian King

Director, Julian King & Associates
I’m an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I’m affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/... Read More →

Adrian Field

Director, Dovetail
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social... Read More →
Friday September 20, 2024 11:00am - 12:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Our five guiding waypoints: Y Victoria's journey and learning from applying organisation-wide social impact measurement
Friday September 20, 2024 11:30am - 12:00pm AEST
103
Authors: Caitlin Barry (Y Victoria), Eugene Liston (Clear Horizon Consulting, AU)

The demand for organisations to measure impact seems to be ever increasing. However, impact measurement looks different depending on the level at which you measure it (program, organisation-wide, ecosystem, etc.). While many organisations focus on measuring social impact at the program level, what appears to be less commonly achieved is the jump to effective measurement of impact at an organisation-wide level.

The literature providing guidance on how to implement org-wide social impact measurement makes it seem so straightforward, like a Roman highway - all straight lines. But what is it really like in practice? How does it differ from program-level impact measurement? How can it be done? What resources does it take? And what are the pitfalls?

The Y Victoria has spent the last three years on a journey to embed org-wide social impact measurement under the guidance of our evaluation partner. The Y Victoria is a large and diverse organisation covering seven sectors/service lines, with over 5,500 staff and more than 180 centres, delivering services to community members of all ages. This presented quite a challenge for measuring organisation-wide impact in a meaningful way.

While the journey wasn't 'straightforward', we've learnt a lot from navigating it. This presentation will discuss the approach taken, tell the story of the challenges faced, trade-offs and lessons learnt (from both the client's and the consultant's perspective), and describe how we have adapted along the way.

Chair

Kate O'Malley

Consultant
I provide targeted policy, advocacy and evaluation support on refugee and migration matters drawing on a lengthy career in the United Nations and the Australian Public Service and post-graduate studies in evaluation and diplomatic practice.
Speakers

Jess Boyden

Senior Social Impact Manager - Recreation, YMCA Victoria
Hello! I'm Jess and I bring 20 years of experience in program design, strategy and social impact measurement within international aid and local community development settings. I specialise in creating practical and meaningful approaches to measuring social impact, using the power... Read More →

Caitlin Barry

Principal Consultant, Caitlin Barry Consulting
Caitlin has extensive experience in monitoring and evaluation, and holds a Masters of Evaluation (First Class Honours) from the University of Melbourne and an Environmental Science Degree (Honours) from James Cook University. The focus of Caitlin's presentation will be from her work... Read More →
Friday September 20, 2024 11:30am - 12:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Reflections on a Developmental Evaluation of a traditional healing service model for the Kimberley region of Western Australia
Friday September 20, 2024 11:30am - 12:00pm AEST
106
Authors: Gillian Kennedy (The University of Notre Dame Australia), Tammy Solonec (Kimberley Aboriginal Law and Culture Centre, AU)

Traditional Healers, known in the Kimberley as mabarn (medicine men) and parnany parnany warnti (group of women healers), have been practising their craft for millennia; however, cultural forms of healing are not funded or incorporated into health services in Western Australia. In 2019 a Kimberley cultural organisation was funded to develop and trial a service delivery model of traditional healing. The trial ended in November 2023.

This presentation will reflect on a Developmental Evaluation (DE) that was undertaken throughout the model development and trial of this traditional healing service using a multi-method approach, incorporating participant observation; semi-structured interviews; small group discussions; and a client survey. Data was collated into a 'checklist matrix', using a traffic light system to show how each element of the model was tracking according to different stakeholder perspectives. This information was then provided back to the healing team iteratively to incorporate further into the model design.

The DE team acted as a 'critical friend' to the project. Two Aboriginal research assistants (one male and one female) were able to provide valuable cultural interpreting for the project to ensure that cultural sensitivities around the healing practices were carefully managed. The DE team also helped the healing team to develop a set of guiding principles and a Theory of Change to help the project stay true to their underpinning cultural values.

The DE process helped to inform a culturally-governed and owned clinic model, working with both men and women healers, that is unique to the Kimberley. DE puts the evaluation team inside the project. This relational element is reflective of Aboriginal worldviews but may bring challenges for perceptions of objectivity that are championed in traditional forms of evaluation. We argue that the evaluator as a trusted, critical friend was ultimately part of the success of the healing project.


Speakers

Tammy Solonec

Jalngangurru Healing Coordinator, Kimberley Aboriginal Law and Cultural Centre (KALACC)
Tammy Solonec is a Nyikina woman from Derby in the Kimberley of Western Australia. Since late 2020, Tammy has been engaged by KALACC as Project Coordinator for Jalngangurru Healing, formally known as the Traditional Healing Practices Pilot (THPP). Prior to that from 2014 Tammy was... Read More →

Gillian Kennedy

Translational Research Fellow, The University of Notre Dame Australia
Gillian Kennedy is a Translational Research Fellow with Nulungu Research Institute at The University of Notre Dame, Broome campus and has 20 years’ experience as an educator and facilitator. Her research focus is on program and impact evaluation within the justice, education, and... Read More →

Eva Nargoodah

Cultural advisor and healer, Jalngangurru Healing, Kimberley Aboriginal Law and Culture Centre
Eva Nargoodah is a senior Walmajarri woman who was born on Christmas Creek Station in the Kimberley region of Western Australia. As a child she lived at Christmas Creek Station, GoGo Station and Cherrabun Station. Eva completed her schooling in Derby and worked as a teacher. She has... Read More →
Friday September 20, 2024 11:30am - 12:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

The ACT Evidence and Evaluation Academy 2021-24: Lessons learned from a sustained whole-of-government ECB effort
Friday September 20, 2024 11:30am - 12:00pm AEST
105
Authors: Duncan Rintoul (UTS Institute for Public Policy and Governance (IPPG)), George Argyrous (UTS Institute for Public Policy and Governance (IPPG), AU), Tish Creenaune (UTS Institute for Public Policy and Governance (IPPG), AU), Narina Dahms (ACT Government: Chief Ministers, Treasury and Economic Development Directorate, AU), Peter Robinson (ACT Government: Chief Ministers, Treasury and Economic Development Directorate, AU), Robert Gotts (ACT Government: Chief Ministers, Treasury and Economic Development Directorate, AU)

The ACT Evidence and Evaluation Academy is a prominent and promising example of sustained central agency investment in evaluation capability building (ECB).

The Academy was launched in 2021 as a new initiative to improve the practice and culture of evidence-based decision-making in the ACT public sector. Its features include:
  • a competitive application process, requiring executive support and financial co-contribution
  • a series of in-person professional learning workshops where participants learn alongside colleagues from other Directorates
  • a workplace project, through which participants apply their learning, receive 1-1 coaching, solve an evaluation-related challenge in their work and share their insights back to the group
  • executive-level professional learning and practice sharing, for nominated evaluation champions in each Directorate
  • sharing of resources and development of evaluation communities of practice in the Directorates
  • an annual masterclass, which brings current participants together with alumni and executive champions.

Four years and over 100 participants later, the Academy is still going strong. There has been an ongoing process of evaluation and fine tuning from one cohort to the next, with encouraging evidence of impact. This impact is seen not only for those individuals who have taken part but also for others in their work groups, including in policy areas where evaluation has not historically enjoyed much of a foothold.

The learning design of the Academy brings into focus a number of useful strategies - pedagogical, structural and otherwise - that other central agencies and line agencies may like to consider as part of their own ECB efforts.

The Academy story also highlights some of the exciting opportunities for positioning evaluation at the heart of innovation in the public sector, particularly in the context of whole-of-government wellbeing frameworks, cross-agency collaboration and strategic linkage of data sets to support place-based outcome measurement.

Speakers

Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health... Read More →

George Argyrous

Head of Measurement, Evaluation, Research, and Learning, Paul Ramsay Foundation
Friday September 20, 2024 11:30am - 12:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Gamified, flexible, and creative tools for evaluating a support program for palliative children and their families
Friday September 20, 2024 11:30am - 12:00pm AEST
104
Authors: Claire Treadgold (Starlight Children's Foundation Australia), Erika Fortunati (Starlight Children's Foundation, AU)

Our program creates personalised experiences of fun, joy, and happiness for families with a palliative child, aiming to foster family connections and celebrate the simple joys of childhood during this challenging circumstance. Evaluating the program is of utmost importance to ensure that it meets the needs of the families involved. Equally, due to the program's sensitivity and deeply personal nature, a low-pressure, flexible evaluation approach is necessary.
In our session, we will showcase our response to this need and share our highly engaging, low-burden tools for gathering participant feedback, which leverage concepts of gamification and accessibility to boost evaluation responses and reduce participant burden. In particular, we will focus on our innovative “activity book”, which evaluates the program through artistic expression. By emphasising creativity and flexibility, our tools aim to enrich the evaluation process and respect the diverse preferences and abilities of the participating families.
The core argument will focus on our innovative evaluation methodology, how it aligns with best practices in the literature, and our key learnings. Key points include the considerations needed for evaluating programs involving palliative children, empowering children and young people through their active involvement in the evaluation process, and how gamification and creativity boost participation and engagement.
Outline of the session:
  • Introduction to the palliative care program and the need for flexible, creative, and respectful evaluation methods
  • What the literature tells us about evaluation methods for programs involving palliative children and their families
  • A presentation of our evaluation protocol
  • Case studies illustrating the feedback collected and its impact
  • Our learnings and their implications for theory and practice
Chair

Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of... Read More →
Speakers

Erika Fortunati

Research and Evaluation Manager, Starlight Children's Foundation Australia
Erika is the Research and Evaluation Manager at Starlight Children's Foundation, an Australian not-for-profit organisation dedicated to brightening the lives of seriously ill children. In her current role, Erika manages research projects and program evaluations to ensure that programs... Read More →
Friday September 20, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Designing baseline research for impact: The SKALA experience
Friday September 20, 2024 12:00pm - 12:30pm AEST
Authors: Johannes Prio Sambodho (SKALA), Ratna Fitriani (SKALA, ID)

SKALA (Sinergi dan Kolaborasi untuk Akselerasi Layanan Dasar - Synergy and Collaboration for Service Delivery Acceleration) is a significant Australian-Indonesian cooperation that focuses on enhancing parts of Indonesia's extensive, decentralised government system to accelerate better service delivery in underdeveloped regions. As part of its End of Program Outcome of greater participation, representation, and influence for women, people with disabilities, and vulnerable groups, SKALA is commissioning baseline research on multi-stakeholder collaboration for mainstreaming Gender Equality, Disability, and Social Inclusion (GEDSI) in Indonesia. The program has designed a mixed-method study: qualitative methods to assess the challenges and capacity gaps that GEDSI civil society organisations (CSOs) face in actively participating in and contributing to the subnational planning and budgeting process, coupled with a quantitative survey to measure trust and confidence between the same CSOs and the local governments with whom they engage.

The paper first discusses the baseline study's design and its alignment with SKALA's strategic goals, and considers how the research might itself contribute to improved working relationships in planning and budgeting at the subnational level. Second, it discusses the approaches the SKALA team took to design a robust programmatic baseline that is also clearly useful in program implementation. These include a) adopting an adaptive approach that translates key emerging issues from grassroots consultations and the broader governmental agenda into research objectives; b) locating the study within a broader empirical literature to balance practical baseline needs with academic rigour; and c) fostering collaboration with the program implementation team to ensure the study serves both evaluation and programmatic needs.
Lastly, based on SKALA experience, the paper will argue for closer integration of research and implementation teams within programs that can support systems-informed methodologies, and will consider ways in which this can be practically accomplished.
Chair

Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre... Read More →
Speakers

Johannes Prio Sambodho

Research Lead, SKALA
Dr. Johannes Prio Sambodho is the Research Lead for SKALA, a significant Australian-Indonesian development program partnership aimed at improving basic service governance in Indonesia. He is also a former lecturer in the Department of Sociology at the University of Indonesia. His... Read More →
Friday September 20, 2024 12:00pm - 12:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Embracing the L in "MEL": A Journey Towards Participatory Evaluation in Government Programs
Friday September 20, 2024 12:00pm - 12:30pm AEST
103
Authors: Milena Gongora (Great Barrier Reef Foundation)

Best practice in evaluation includes a crucial learning step, yet that step often receives inadequate emphasis, particularly within government-funded initiatives. Our paper documents the journey of transforming a top-down, prescriptive evaluation process within a government-funded program into an inclusive, consultative approach aligned with Monitoring, Evaluation, and Learning (MEL) principles.

Funded by the Australian Government and managed by the Great Barrier Reef Foundation, the Reef Trust Partnership (RTP) was launched in 2018 to enhance the resilience of the Great Barrier Reef. Within it, a $200 million portfolio aims to improve water quality by working with the agricultural industry. A framework for impact evaluation was developed in the program's early days. While appropriate, it was top-down in nature, owing to the need to comply with broader government requirements.

Four years into implementation, the Foundation was ready to synthesise, interpret and report on the program's impact. The Foundation could simply have reported "up" to government. However, we acknowledged that in doing so, we risked missing critical context, oversimplifying findings, misinterpreting information and presenting yet another tokenistic, meaningless report.

Interested in doing things better, we instead circled back with our stakeholders in a participatory reflection process. Through a series of carefully planned workshops, we invited on-ground program practitioners to ground-truth our findings, share contextual nuances, and collectively strategise for future improvements.

Despite initial reservations, participants embraced the opportunity, fostering an atmosphere of open dialogue and knowledge exchange. This reflective process not only enriched our understanding of program impact but also enhanced collaboration, strengthening overall program outcomes.

Our experience highlights the importance of transcending tokenistic evaluation practices, particularly in environments where top-down directives prevail. Participatory approaches can be implemented at any scale, contributing to a culture of continuous improvement and strategic learning, ultimately enhancing the impact and relevance of evaluation efforts.

Chair

Kate O'Malley

Consultant
I provide targeted policy, advocacy and evaluation support on refugee and migration matters drawing on a lengthy career in the United Nations and the Australian Public Service and post-graduate studies in evaluation and diplomatic practice.
Speakers

Milena Gongora

Associate Director - Water Quality, Great Barrier Reef Foundation
Milena’s area of interest is nature conservation. With over 14 years of experience, her work ranges from managing the Mekong River to enhancing the resilience of the Great Barrier Reef. Over most of this time, her roles have involved evaluating the success of conservation initiatives... Read More →
Friday September 20, 2024 12:00pm - 12:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

If the destination is improvement, recommendations are the signpost
Friday September 20, 2024 12:00pm - 12:30pm AEST
106
Authors: Laura Baker (ACIL Allen), Larissa Brisbane (Department of Climate Change, Energy, the Environment and Water NSW, AU)

Recommendations are the sharp end of evaluation, connecting evidence and insights to the improvement we aim to achieve. Many evaluation theories focus on framing and conducting evaluations, rather than developing recommendations or the associated organisational change required to complete the journey.

Recommendations point the way beyond an evaluation report, as the journey doesn't end when the report is produced. This presentation tells the story of recommendation wayfinding. We will share an evaluation practitioner's and a commissioner's journey in navigating the challenge of developing actionable recommendations that promote impact beyond program close and into future decisions.

Evaluators need ways to integrate diverse evidence sources and generate actionable insights. The consultant will share perspectives on where these insights and the associated recommendations "come from": how different data come together to inform insights, the process for developing recommendations (balancing independence and engagement from commissioners), and how to design recommendations for the program and beyond.

Commissioners need recommendations that make sense in their context. The commissioners will share considerations in what makes a recommendation useful, and how we used this evaluation journey to leverage learning, skill-building and improvement opportunities. They will also discuss the evaluation audience and how ambitious recommendations can be.

This work, carried out over a number of years, has helped build the evaluation knowledge base within our organisations. We will close with our recommendations to you: the top ideas we plan to take with us on our next evaluation journey.
Speakers

Larissa Brisbane

Team Leader, Strategic Evaluation, Dept of Planning and Environment NSW
It was only a short step from training in environmental science, and a background in cross-disciplinary problem-solving, to evaluation where I still ask 'why' and 'how do you know that'. I love hearing your stories of what you've done and what you've learned, especially in the areas... Read More →

Laura Baker

Principal, ACIL Allen
Friday September 20, 2024 12:00pm - 12:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Place-based evaluation: collaborating to navigate learning in complex and dynamic contexts
Friday September 20, 2024 12:00pm - 12:30pm AEST
105
Authors: Sandra Opoku (Relationships Australia Victoria), Kate Matthies-Brown (Relationships Australia Victoria, AU)

Yarra Communities That Care (CTC) is a network of 24 local partner agencies who share a commitment to support the healthy development of young people in the City of Yarra. One of the key initiatives of Yarra CTC is the collaborative delivery of evidence-based social and emotional messaging to families by a centrally coordinated Facilitator Network involving multiple partner agencies. Building on positive feedback and program achievements from 2017-2022, we led an evaluation of the collaborative and place-based approach of the Yarra CTC Facilitator Network to better understand its contribution to systemic change and apply learnings to future place-based approaches for our respective organisations. The evaluation project team adopted the 'Place-Based Evaluation Framework' and was informed by a comprehensive theory of change. This provided an anchor in an otherwise complex and dynamic environment and unfamiliar territory.
There is increasing focus on collaborative place-based approaches at the federal, state and local levels as a promising way to address complex social problems. Previous evaluations and the literature identify successful collaboration and a strong support entity or backbone as key factors that make place-based approaches successful. The collaborative place-based approach to strengthening family relationships in Yarra provides a local example of this.

Consistent with systems change frameworks, this evaluation provided evidence of structural changes. These changes, manifested as improved practices and dedicated resources and supports, ultimately led to effective collaborative and transformative change for the community.

This presentation will share the journey, key insights, and learnings of the evaluation project team over a two-year period to collaboratively gather evidence to inform ongoing program development and contribute to future place-based approaches. The Yarra CTC Facilitator Network serves as a valuable template for implementing best practices for place-based coalitions due to its focus on collaboration and fostering a sense of community.

Speakers

Sandra Opoku

Senior Manager Evaluation and Social Impact, Relationships Australia Victoria
My role leads impact, evidence and innovation activities at Relationships Australia Victoria. These activities contribute to achieving strategic objectives and improving outcomes for individuals, families and communities. This now also includes oversight of several key prevention... Read More →

Kate Matthies-Brown

Since 2022, Kate has supported RAV’s evaluation and social impact activities, including program evaluation, practice development, and evidence reviews. She is a qualified social worker with experience in family services, youth mental health and academia. Kate has experience with... Read More →
Friday September 20, 2024 12:00pm - 12:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

A sprint, not a marathon: Rapid Evaluation as an approach for generating fast evidence and insights
Friday September 20, 2024 12:00pm - 12:30pm AEST
104
Authors: Marnie Carter (Allen + Clarke Consulting)

Increasingly, evaluators are called upon to quickly equip decision makers with evidence from which to take action. A program may be imminently approaching the end of a funding cycle; a critical event may have taken place and leadership needs to understand the causes and learnings; or a new program of work is being designed for which it is important to ensure that finite resources are being directed to the most effective interventions. For such circumstances, Rapid Evaluation can be a useful tool.

Rapid Evaluation is not simply doing an evaluation quickly. It requires a deliberate, interlinked and iterative approach to gathering evidence to generate fast insights. What makes Rapid Evaluation different is that the evaluation design needs to be especially flexible, constantly adapting to the context. Data collection and analysis do not follow a linear sequence; rather, they iterate back and forth throughout the evaluation. Rapid Evaluation is often conducted in response to specific circumstances that have arisen, and evaluators therefore need to manage a high level of scrutiny.

This presentation will provide an overview of how to conduct a rapid evaluation, illustrated by practical examples including rapid evaluations of a fund to support children who have been exposed to family violence, and a quickly established employment program delivered during the COVID-19 pandemic. It will discuss the methodological approach to conducting a Rapid Evaluation, share lessons on managing the evolving nature of data collection as the evaluation progresses, and discuss how to maintain robustness while evaluating at pace.


Chair

Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of... Read More →
Speakers

Marnie Carter

Evaluation and Research Practice Lead, Allen + Clarke Consulting
Marnie is the Evaluation and Research Practice Lead for Allen + Clarke Consulting. She is experienced in program and policy evaluation, monitoring, strategy development, training and facilitation. Marnie is particularly skilled in qualitative research methods. She is an expert at... Read More →
Friday September 20, 2024 12:00pm - 12:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

From evaluation to impact: practical steps in a qualitative impact study
Friday September 20, 2024 1:30pm - 2:00pm AEST
Authors: Linda Kelly (Praxis Consultants), Elisabeth Jackson (La Trobe University, AU)

This presentation focuses on a multi-year, Australian-funded program that aims to empower people marginalised by gender, disability and other factors. Like similar programs, the work is subject to regular monitoring and evaluation, testing the effectiveness of program activities largely from the perspective of the Australian and partner-country governments.
But what of the views of the people served by the program? Is the impact of the various activities sufficient to empower them beyond their current condition? How significant are the changes introduced by the program, given the structural, economic, social and other disadvantages experienced by marginalised individuals and groups?
Drawing on feminist theory and qualitative research methods, and managed with local research and communication experts, this presentation outlines a study of the long-term impact of the program.

The presentation will outline the methodology and practical considerations in the development of the approach and data collection methods. It will highlight the value of exploring impact from a qualitative perspective, while outlining the considerable management and conceptual challenges involved in designing, introducing and supporting such an approach. It will consider some of the implications of shifting from traditional evaluation methods to more open-ended enquiry, and ask whose values are best served through evaluation versus impact assessment.


Chair

James Copestake

Professor, International Development, University of Bath, UK
James Copestake is Professor of International Development at the University of Bath in the UK, where he is also Director of Studies for the Doctorate in Policy Research and Practice at the Institute of Policy Research. His publications range broadly across international development... Read More →
Speakers
Linda Kelly

Director, Praxis Consultants
Elisabeth Jackson

Senior Research Fellow, Centre for Human Security and Social Change, La Trobe University
Dr Elisabeth Jackson is a Senior Research Fellow at the Centre for Human Security and Social Change where she conducts research and evaluation in Southeast Asia and the Pacific. She is currently co-leading an impact evaluation of a program working with diverse marginalised groups... Read More →
Friday September 20, 2024 1:30pm - 2:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Fidelity to context: A realist perspective on implementation science
Friday September 20, 2024 1:30pm - 2:00pm AEST
105
Authors: Andrew McLachlan (NSW Department of Education)

At first glance, realist methodology seems ideally suited to investigating implementation problems (Dalkin et al., 2021). It is versatile in that it draws on theories from diverse fields of social inquiry. It is pragmatic in that the theories it adopts are deemed useful only in so far as they offer explanatory insight. And it is transferable; realist methodology is less concerned with generalising findings than with understanding how programs work under different conditions and circumstances.

As for implementation science, its founding aim is purpose built for realist work; it seeks to improve the uptake of evidence-based practices by investigating the barriers and facilitators to implementation. Yet despite the affinity between realist methodology and implementation science, so far there have been few attempts to formalise the relationship (Sarkies et al., 2022).

This paper offers insights into how evaluators can harness realist methodology to better understand challenges of program implementation. It demonstrates how implementation concepts like fidelity (the degree to which a program is delivered as intended), adaptation (the process of modifying a program to achieve better fit), and translation (the ability to transfer knowledge across organisational borders) can be combined with realist concepts to develop a more active understanding of context.

In showing how to construct program theories that are responsive to changing conditions, the paper promises to equip evaluators with tools that can help them navigate the complexities of program implementation in their own work.



Speakers
Andrew McLachlan

Evaluation Lead - Strategy, NSW Department of Education
Andrew McLachlan is an Evaluation Lead for the NSW Department of Education. Before becoming an evaluator, Andrew had over 10 years of experience as a teacher, working in settings as diverse as far North Queensland and Bangladesh. Since 2021, Andrew has worked as an embedded evaluator... Read More →
Friday September 20, 2024 1:30pm - 2:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Finding ways to empower multicultural survivors of violence through evaluation: strategies, learnings and reflections
Friday September 20, 2024 1:30pm - 2:00pm AEST
104
Authors: Lydia Phillips (Lydia Phillips Consulting), Jo Farmer (Jo Farmer Consulting)

As evaluators, we often work with people who have experienced trauma and/or marginalisation (whether we realise it or not!). We're also seeing increased recognition in government and community organisations of the importance of lived experience and cultural safety in program design, implementation and evaluation.

Beginning an evaluation with a clear plan for how you'll engage and empower people from diverse cultural backgrounds and people who have experienced trauma can help to ensure success - of your project and of participants' experience.

So how can you design an evaluation framework to recognise diverse cultural backgrounds and empower survivors of violence?

And how can evaluators who don't have lived experience or don't identify with those cultural backgrounds best navigate the design process?

This session will share strategies, learnings and reflections from a project working with a multicultural family violence service to develop a culturally-safe, trauma-informed evaluation framework for a two-year program.

It will:
  • explore what worked well and what was challenging in the project
  • discuss similarities and differences in the concepts of culturally-safe and trauma-informed practice, drawing on current literature; and
  • pose questions and provide suggestions for evaluators who want to develop their skills in culturally safe and trauma-informed evaluation practice.

The session will offer key tips and strategies that are translatable to other contexts and conclude with reflective questions for attendees.

Chair
Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development... Read More →
Speakers
Lydia Phillips

Principal Consultant, Lydia Phillips Consulting
I operate an independent consulting practice, providing evaluation and social policy services to community organisations and government.With a background in law and social policy, I have more than 15 years' experience building and using evidence in order to create positive social... Read More →
Friday September 20, 2024 1:30pm - 2:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Learning from the past: Reflections and opportunities for embedding measurement and evaluation in the national agenda to end Violence against Women and Children
Friday September 20, 2024 1:30pm - 2:30pm AEST
106
Authors: Lucy Macmillan (ANROWS), Micaela Cronin (Domestic, Family and Sexual Violence Commission, AU), Tessa Boyd-Caine (ANROWS, AU), TBC TBC (National Lived Experience Advisory Council, AU)

As evaluators, we are often asked to examine complex, systems-change initiatives. Domestic, family and sexual violence is a national crisis. In late 2022, the Commonwealth Government, alongside all state and territory governments, released the second National Plan to End Violence against Women and Children 2022-2032. The plan provides an overarching national policy framework to guide actions across all parts of society, including governments, businesses, media, educational institutions and communities, to achieve a shared vision of ending gender-based violence in one generation.

After 12 years of implementation under the first National Plan, assessing whether our efforts had made a meaningful difference towards ending violence against women was a difficult task. We ask: As we embark on setting up measurement and evaluation systems against the second National Plan, how do we avoid making the same mistakes again?

The Domestic, Family and Sexual Violence Commission was established in 2022 to focus on practical and meaningful ways to measure progress towards the objectives outlined in the National Plan. This session will discuss:
  1. the current plans, opportunities and challenges in monitoring progress and evaluating the impact of this national framework, and
  2. the role of lived experience in evaluation, and how large publicly funded institutions can balance their monitoring and sensemaking roles at the national level with accountability to victim-survivors.

The panel will explore common challenges faced when seeking to monitor and evaluate complex national policy initiatives, including data capture, consistency and capacity, and explore some of the opportunities ahead.

The audience will have the opportunity to contribute their insights and expertise on how we, as evaluators, approach the evaluation of complex systems-change at a national scale, and over extended durations, while also prioritising the voices of those most affected. How do we collectively contribute to understanding if these national policy agendas will make a difference?


Chair
Milena Gongora

Associate Director - Water Quality, Great Barrier Reef Foundation
Milena’s area of interest is nature conservation. With over 14 years of experience, her work ranges from managing the Mekong River to enhancing the resilience of the Great Barrier Reef. Over most of this time, her roles have involved evaluating the success of conservation initiatives... Read More →
Speakers
Lucy Macmillan

Dir Evaluation & Impact, ANROWS
Friday September 20, 2024 1:30pm - 2:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Navigating ethics dilemmas when evaluating for government: The good, the bad and the ugly
Friday September 20, 2024 1:30pm - 2:30pm AEST
Authors: Kristy Hornby (Grosvenor), Eleanor Williams (Australian Centre for Evaluation), Mandy Charman (Outcomes, Practice and Evidence Network)

Navigating ethics is an essential part of any evaluation journey. As evaluators we often encounter complex situations that require thoughtful consideration of ethical principles and practice, far beyond the formal ethics process itself.

This session will explore real-world scenarios and provide attendees with actionable strategies to enhance ethical decision-making in their evaluation practice. The panel will speak to questions of managing commissioners' expectations, how to speak frankly to program areas where under-performance is found, issues of confidentiality, ensuring culturally sensitive practice, and ensuring power imbalances are acknowledged and addressed.

The panel presentation will take attendees through the journey of ethical practice and will consider:
- The overall significance of ethical thinking in evaluation
- Common ethical challenges faced by evaluators
- Practical tools and frameworks that empower evaluators to uphold their ethical standards and deliver meaningful results that can withstand scrutiny
- From an internal evaluator perspective, the balancing act of managing these tensions successfully
- Case studies that can illustrate the application of practical ethics in evaluation
- Takeaways and recommendations.

Eleanor Williams, Managing Director of the Australian Centre for Evaluation; Mandy Charman, Project Manager for the Outcome, Performance and Evidence Network in the Centre for Excellence in Child and Family Welfare; and Kristy Hornby, Victorian Program Evaluation Practice Lead at Grosvenor, will be the panellists. They will share de-identified war stories from their current and previous roles, setting out exactly what kinds of challenges evaluators can face in the conduct of their work, so that attendees can learn from the panellists' hands-on experience of what to do about them. Attendees will be encouraged to participate in a dynamic dialogue with the panellists and with each other, sharing their own experiences and strategies for addressing ethical concerns and building on the content shared through the session.
Chair
Sally Clifford

General Manager, Matrix on Board
Having graduated from QUT in Brisbane with a Bachelor of Arts in Drama (Hons) (1992) and then a Master of Arts (CCD in Healthcare settings) (1997), I worked for 6 years as a freelance community cultural development artist across Brisbane and SE Qld. In 1998 I was invited to develop... Read More →
Speakers
Mandy Charman

Project Manager, Outcome, Performance and Evidence Network, Centre for Excellence in Child and Family Welfare
Dr Mandy Charman is the Project Manager for the Outcome, Performance and Evidence Network (OPEN) in the Centre for Excellence in Child and Family Welfare. OPEN, which represents a sector–government–research collaboration, has been developed to strengthen the evidence base of the... Read More →
Kristy Hornby

Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and... Read More →
Eleanor Williams

Managing Director, Australian Centre for Evaluation
Eleanor Williams is a public policy, research and evaluation professional with 20 years' experience working with the public sector. She is the Managing Director of the Australian Centre for Evaluation and established the Australian Public Sector Evaluation Network in 2019. Eleanor... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia
  Footprints

1:30pm AEST

Investment logic mapping or evaluation logic modelling? Similarities and differences.
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Charlie TULLOCH (Policy Performance)

Evaluation logic modelling is a frequently used technique, looking at such things as inputs, activities, outputs and outcomes.
Since the early 2000s, the Department of Treasury and Finance (Victoria) has used an adapted logic modelling format called Investment Logic Mapping (ILM). It is now used nationwide and internationally to support resource allocation planning and stakeholder engagement.

ILMs and evaluation logic models have many similarities, but also some major differences.

This ignite presentation will compare and contrast these two tools, and describe when and why to use each.

Attendees will very quickly grasp the main similarities and differences, and the advantages and drawbacks of each.
Chair
Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and... Read More →
Speakers
Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Measuring Impact Through Storytelling: using Most Significant Change to evaluate the effectiveness of QHub for LGBTIQA+ young people.
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Gina Mancuso (Drummond Street Services), Arielle Donnelly (Drummond Street Services, AU)

LGBTIQA+ young people experience discrimination and marginalisation which contribute to poorer mental and physical health outcomes, compared to the general population. QHub is an initiative that creates safe spaces, offers mental health and well-being services, and provides outreach tailored for LGBTIQA+ young people in Western Victoria and the Surf Coast. QHub provides LGBTIQA+ young people and their families/carers with welcoming, inclusive and integrated support, as well as opportunities to connect with peers and older role models. This presentation will outline how the collection and selection of stories of change (Most Significant Change) is helping us evaluate the impact of QHub.
Chair
Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and... Read More →
Speakers
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Program Evaluation Fundamentals in the NSW Department of Planning, Housing and Infrastructure: An eLearning course on evaluation
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Anabelle (Pin-Ju) Chen (NSW Department of Planning, Housing and Infrastructure)

Introducing Program Evaluation Fundamentals (PEF) in the NSW Department of Planning, Housing and Infrastructure, an eLearning course designed to facilitate a coherent journey of learning within the department. With learning and adapting together in mind, the design of PEF empowers individuals at all levels to navigate the fundamentals of evaluation. Through interactive modules, learners will understand key evaluation concepts and cultivate best practices. PEF promotes transformative growth by emphasising foundational evaluation knowledge. By leveraging PEF, we can shift our approach, embrace innovation, and advance the field of evaluation across the public sector, fostering a supportive community of forward-thinking evaluators.
Chair
Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and... Read More →
Speakers
Anabelle (Pin-Ju) Chen

Senior Analyst, Evidence and Evaluation, NSW Department of Planning, Housing and Infrastructure
Anabelle (Pin-Ju) Chen is a distinguished senior analyst hailing from Taiwan, with a global perspective on evaluation, data analysis, and project management. Having studied in Taiwan, the United Kingdom, and Australia, Anabelle brings a diverse range of experiences and insights to... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Putting values on the evaluation journey map
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Samantha Abbato (Visual Insights People)

Values guide all evaluation processes, methods and judgements. Although evaluators are often not aware of the values shaping their work and can't readily name them, they know when they are straying off their values path through the experience of conflict or unease. Reflecting on the evaluation literature and two decades of evaluation practice from a 'values' perspective, it is argued that there has never been a more important time to build values literacy. This presentation demonstrates how values literacy can guide conversations with yourself, your team and others, and provide signposting and illumination for a more rigorous and ethical evaluation journey.
Chair
Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and... Read More →
Speakers
Samantha Abbato

Director, Visual Insights People
My twenty-plus years of evaluation experience are built on academic training in qualitative and quantitative disciplines, including mathematics, health science, epidemiology, biostatistics, and medical anthropology. I am passionate about effective communication and evaluation capacity-building... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Squaring up with rubrics
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Josh Duyker (Centre for Evaluation and Research Evidence, Victorian Department of Health)

Much like Felix the Cat, evaluators have a bag of tricks to get us out of sticky situations. But when you are staring face to face with a complex evaluand, juggling tricky stakeholders whilst sitting on a mountain of data, it's not always clear which 'trick' you need! One twisted potential solution is the colourful yet humble rubric. In this reflective practice ignite presentation, I will guide you through our journey of using rubrics as a tool to wayfind through an evaluation, and our key takeaways on how rubrics can support evaluators to make comprehensive and balanced evaluative judgements.
Chair
Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and... Read More →
Speakers
Josh Duyker

Evaluation and Research Officer, Centre for Evaluation and Research Evidence
I am an emerging evaluator, currently working at the Centre for Evaluation and Research Evidence in the Victorian Department of Health. I've completed a Master of Public Health and am embarking on a Masters of Evaluation. Through roles in the not-for-profit sector and my studies... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Voices of the Future: Elevating First Nations Leadership in the Evolution of Educational Excellence
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Skye Trudgett (Kowa), Sharmay Brierley (Kowa, AU)

This ignite presentation will delve into a pioneering evaluation within the education sector, where a series of education initiatives were designed and implemented by Aboriginal Community Controlled Organisations (ACCOs) and mainstream education partners to uplift and support young First Nations peoples. We will uncover how the initiative's evaluation framework was constructed, in a revolutionary way, with First Nations communities at its heart, applying the reimagining evaluation framework, utilising diverse data collection methods and producing Community Reports that reflect First Nations experiences and voices.

Attendees will be guided through the evaluative journey, showcasing the incorporation of wisdom to demonstrate the profound value of community-delivered initiatives that contribute to change. The session will highlight the success stories and learnings, emphasising how this approach not only benefits the current generation but also lays the groundwork for the prosperity of future generations.
Chair
Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and... Read More →
Speakers
Sharmay Brierley

Consultant, Kowa Collaboration
Sharmay is a proud Yuin woman and project lead at Kowa with prior experience supporting First Nations peoples across human services sectors. As a proud First Nations woman, and through lived experience, Sharmay has a strong understanding of the many challenges faced by First Nations... Read More →
Skye Trudgett

CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

A practical approach to designing and implementing outcome measures in psychosocial support services.
Friday September 20, 2024 2:00pm - 2:30pm AEST
Authors: Lauren Gibson (Mind Australia), Dr. Edith Botchway (Mind Australia, AU), Dr. Laura Hayes (Mind Australia, AU)

Outcome measurement in mental health services is recommended as best practice and provides an opportunity for clients and staff to track progress and navigate the complex road to recovery together. However, there are many barriers to embedding outcome measures in mental health services, including time constraints, low perceived value among staff and clients, and a lack of regular feedback on outcomes. To overcome these challenges, a national not-for-profit provider of residential and non-residential psychosocial support services created an innovative approach for designing and implementing outcome measures. The objective of our presentation is to describe this approach, which has resulted in average outcome measure completion rates of over 80% across 73 services in Australia.

Design
We believe the key to achieving these completion rates is understanding the needs of outcome measure end-users, including clients, carers, service providers, centralised support teams, and funding bodies. In this presentation we will share how we:
  • "Begin with the end in mind" by working with stakeholders to create user personas and program logics that identify meaningful outcomes and survey instruments.
  • Design easy-to-use digital tools to record quality data, and provide stakeholders with dashboards to review their outcomes in real time by visualising data at the individual client and service levels.

Implementation
Also key to embedding outcome measures is having a structured, multi-stage approach for implementation, with tailored support provided to:
  • Prepare services (e.g., Training)
  • Install and embed outcome measures in routine practice (e.g., Service champions)
  • Maintain fidelity over time (e.g., Performance monitoring)

The presentation will highlight the salient barriers and enablers identified during each design and implementation stage.

Overall, the presentation will provide a practical example of how to design and implement outcome measures in mental health services to ensure they are adding value for relevant stakeholders and enabling efficient and meaningful evaluation.

Chair
James Copestake

Professor, International Development, University of Bath, UK
James Copestake is Professor of International Development at the University of Bath in the UK, where he is also Director of Studies for the Doctorate in Policy Research and Practice at the Institute of Policy Research. His publications range broadly across international development... Read More →
Speakers
Lauren Gibson

Researcher, Mind Australia
Dr. Lauren Gibson’s research focuses on understanding the prevalence and impact of initiatives aimed at improving physical and mental health outcomes among mental health service users. She has been a researcher within the Research and Evaluation team at Mind Australia for over two... Read More →
Friday September 20, 2024 2:00pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

In the spotlight: An arts industry-led approach to evaluation
Friday September 20, 2024 2:00pm - 2:30pm AEST
105
Authors: Kirstin Clements (Arts Centre Melbourne)

How does a creative institution develop an effective evaluation framework that honours the artistic process while meeting rigorous research standards?

At Arts Centre Melbourne we asked ourselves, 'what if?'... What if we integrated the economic story into a fit-for-purpose value proposition? What if we saw the emotive, subjective nature of the arts as an asset, rather than a challenge, in our evaluation design? What if we tried to embed systems thinking and extend our approach beyond individual projects?

Like many purpose-driven industries, the arts face an increasingly competitive funding landscape and heightened expectations from stakeholders for evidence-based reporting on the value generated by initiatives. Historically, in the arts such reporting has been responsive to external demands and formats. One of our core goals has been to equip the organisation with the capability and capacity to pro-actively drive its own public value narrative through a transparent, consistent approach.

In this presentation, we spotlight Arts Centre Melbourne's innovative approach to building appetite for evaluation and to designing a fit-for-purpose organisational impact model and evaluation function. We offer insights into the conceptual and methodological approaches we've adopted to achieve our objectives: supporting effective advocacy for the public value of the arts, enhancing accountability to stakeholders, and fostering a culture of continuous learning.

In sharing how we have creatively navigated challenges and opportunities at Arts Centre Melbourne, we aim to provide valuable advice and inspiration for evaluators and supporting professionals, particularly those working in sectors where evaluation is yet to be understood as 'business-as-usual' activity.

Speakers
Kirstin Clements

Partner, Impact and Evaluation, Arts Centre Melbourne
Friday September 20, 2024 2:00pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Introducing a trauma informed AI assessment tool for evaluators of AI (artificial intelligence) assisted programs and services.
Friday September 20, 2024 2:00pm - 2:30pm AEST
104
Authors: Lyndal Sleep (Central Queensland University)

AI and other forms of digital technology are increasingly being used in program and service delivery. They promise increased efficiency, accuracy and objectivity; however, these technologies can also cause significant harm and trauma, as seen in Robodebt. It is vital for evaluators to be aware of the key questions to ask to prevent AI having unintended impacts on program processes, outputs and outcomes, or causing harm to service users.

Objective
This session aims to support evaluation of AI assisted programs and services by introducing evaluators to a new and innovative trauma informed AI assessment tool.

Core argument
(1) AI is increasingly being used in programs and services, and understanding the resulting risks is essential for evaluators to assess whether services are meeting intended outcomes.
(2) Many evaluators are unaware of what types of risks to look for when assessing AI-assisted services, or what questions to ask - especially when conducting trauma-informed evaluations.
(3) A practical trauma-informed AI assessment tool has been developed by researchers from [Universities omitted], with funding from [omitted], to address this need, and will be briefly introduced.

A short paper session will:
(1) Highlight the problem that AI is increasingly being used to assist program and service delivery, while many evaluators are unaware of the main risks to consider when evaluating these services.
(2) Suggest the solution: a practical tool which considers these risks, with technological knowledge and within a trauma-informed framework, that can be employed by evaluators.
(3) Introduce the trauma-informed AI assessment tool, the method used to develop it, and its intended practical use by evaluators (both internal and external to organisations).

There will be 10 minutes for questions and discussion at the end of the presentation.

Chair
Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development... Read More →
Speakers
Friday September 20, 2024 2:00pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Closing plenary: Panel, John Gargani "Finding Our Way to the Future Profession of Evaluation"
Friday September 20, 2024 2:30pm - 4:00pm AEST
More details of the closing plenary, including panelists, to be confirmed.

John Gargani, President of Gargani + Company, former President of the American Evaluation Association, USA

As the AES 2024 conference comes to a close, we gather one last time to consider the journey ahead. We seek a destination none have seen—a profession that in ten years’ time fully supports societal and planetary health—along a path we have never travelled. The urgency of existential threats such as AI, global heating, and pandemics calls into question the traditional ways our profession has navigated the future. And new players such as impact investors, social entrepreneurs, effective altruists, and socially responsible corporations ensure that the journey will be crowded and some routes cut off.

With this in mind, we pose two questions to our panelists.
  1. What milestones and songlines should guide us on our way to an imagined future profession? How will we know if we have lost our way?
  2. How should we interact with other professions on a similar journey? Like commuters on a train who dare not speak, shipwrecked strangers who must quickly band together, or something else altogether?
Followed by:
Conference close by the AES President, and handover to aes25 Ngambri/Canberra


Chair
Amy Gullickson

Associate Professor, The University of Melbourne
I'm an Associate Professor of Evaluation at the University of Melbourne Assessment and Evaluation Research Centre. I'm also a co-founder and current chair of the International Society for Evaluation Education https://www.isee-evaled.com/, a long-time member of the AES Pathways Committee (and its predecessors), and an architect of the University of Melbourne’s fully online, multi-disciplinary Master and Graduate Certificate of Evaluation programs https://study.unimelb.edu.au/find/courses/graduate/master-of-evaluation/ . I practice, teach, and proselytize evaluation... Read More →
Speakers
John Gargani

President (former President of the American Evaluation Association), Gargani + Company
Dr John Gargani is an evaluator with 30 years of experience and eclectic interests. He is President of the evaluation consulting firm Gargani + Company, served as President of the American Evaluation Association in 2016, coauthored the book Scaling Impact: Innovation for the Public... Read More →
James Copestake

Professor, International Development, University of Bath, UK
James Copestake is Professor of International Development at the University of Bath in the UK, where he is also Director of Studies for the Doctorate in Policy Research and Practice at the Institute of Policy Research.His publications range broadly across international development... Read More →
Doyen Radcliffe

Doyen Radcliffe is a Yamatji Naaguja man from the Midwest Region of Western Australia. Doyen is a community minded individual with a passion for empowering Indigenous communities to reach their real potential to improve quality of life, health, social and economic wellbeing, and inclusion... Read More →
Elizabeth Hoffecker

Lead Research Scientist, Local Innovation Group, Massachusetts Institute of Technology (MIT), USA
Elizabeth Hoffecker is a social scientist who researches and evaluates processes of local innovation and systems change in the context of addressing global development challenges. She directs the MIT Local Innovation Group, an interdisciplinary research group housed at the Sociotechnical... Read More →
Kiri Parata

President, Australian Evaluation Society
Kia ora, greetings all. I'm always excited to attend AES conferences, and the Wayfinding theme of aes24 speaks to my heritage and culture. This year with my family I holidayed in beautiful Rarotonga in the Cook Islands. We visited the site where the waka (Māori watercraft/vessel) of... Read More →
Friday September 20, 2024 2:30pm - 4:00pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia
 