Conference hashtag #aes24MEL
Foundational & Intermediate
Wednesday, September 18
 

11:00am AEST

The psychology of evaluation capacity building: Finding the way with the rider, elephant and the pathway
Wednesday September 18, 2024 11:00am - 12:00pm AEST
106
Authors: Samantha Abbato (Visual Insights People)

Evaluation capacity building is increasingly becoming a core part of evaluation practice and a critical part of incorporating evaluation into the everyday activity of organisations (Preskill & Boyle, 2008; White, Percy & Small, 2018). Reaching the point where evaluation becomes the way of doing business requires a change in knowledge, skills, and attitudes.

Changes need to happen at the level of individuals, teams, organisations, and partnerships. This journey requires supporting and managing change to systematic enquiry processes as much as it requires evaluation expertise. In this skill-building session, we introduce Jonathan Haidt's 'rider, elephant and pathway' metaphor as a framework to support change and strengthen evaluation capacity (Haidt, 2018).

Haidt's metaphor for change includes the rider (our rational thinking side) atop an elephant (our emotional side). Behaviour change for individuals and collectives requires steps that (1) support the rider, such as giving clear directions, (2) motivate the elephant by tapping into emotions, and (3) shape a pathway to change, including clearing obstacles. In this interactive session, the facilitator will provide case studies applying Haidt's metaphor spanning two decades. Through these examples, the power of this framework to support evaluation capacity building is demonstrated. Examples include using Haidt's framework for:
1. Building a Monitoring, Evaluation and Learning (MEL) system with a medium-sized community organisation;
2. Increasing the maturity of MEL in an existing large organisation; and
3. Increasing the impact of evaluation partnerships.

The active skill-building component incorporates:
  • Cartoon elephant, rider and pathway flashcards;
  • A 'snakes and ladders' style game; and
  • Evaluation-specific examples.

The combination of examples and activities is designed to support participant learning. The session will encourage discussion of barriers, enablers and actions to build evaluation capacity relevant to different situations and contexts.

Learning objectives include:
  • Knowledge of a sound and memorable psychological framework for supporting evaluation capacity building;
  • Ability to apply Haidt's metaphor.
Chair
Anthea Rutter

Research Fellow, Centre for Program Evaluation, The University of Melbourne
Anthea Rutter is a Senior Research Fellow in the Assessment and Evaluation Research Centre (formerly the Centre for Program Evaluation) at The University of Melbourne. She has extensive experience working with a wide range of community, state and national organisations. She is particularly...
Speakers
Samantha Abbato

Director, Visual Insights People
My twenty-plus years of evaluation experience are built on academic training in qualitative and quantitative disciplines, including mathematics, health science, epidemiology, biostatistics, and medical anthropology. I am passionate about effective communication and evaluation capacity-building...
Wednesday September 18, 2024 11:00am - 12:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Culturally Responsive Initiatives: Introducing the First Nations Investment Framework
Wednesday September 18, 2024 11:30am - 12:00pm AEST
104
Authors: Eugenia Marembo

Representatives of First Nations communities have been advocating for changes in the way initiatives are planned, prioritised, and assessed. This includes greater visibility on where funding is going, more partnerships on designing initiatives and more evaluation on the outcomes being achieved, to inform government decision making.

This paper presents key insights on what constitutes good practice when designing and appraising initiatives that affect First Nations people and communities. The National Agreement on Closing the Gap is built around four new Priority Reforms that will change the way governments work with Aboriginal and Torres Strait Islander people and communities. Priority Reform Three is about transforming government institutions and organisations. As part of this Priority Reform, parties commit to systemic and structural transformation of mainstream government organisations to improve accountability, and to respond to the needs of First Nations people.

The findings presented in this paper draw on insights from consultations with various First Nations community representatives and government stakeholders in New South Wales, and on the subsequent process of developing a government department's First Nations investment framework, which seeks to strengthen the evidence on what works to improve outcomes for First Nations people. Additionally, the framework aims to improve practice across government processes and better inform how initiatives are designed, prioritised and funded.
Chair
Alice Muller

Senior Monitoring & Evaluation Advisor: FMNR Scale Up, World Vision Australia
An environmental scientist working in international development, interested in evaluation and learning about all things community, trees, ecosystem restoration, climate action, scaling and systems transformation. I also really like coffee and chatting about gardening, travel and...
Speakers

Steven Legg

Associate Director, NSW Treasury

Eugenia Marembo

Senior Analyst, First Nations Economic Wellbeing, NSW Treasury
Wednesday September 18, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Enhancing evaluation value for small community organisations: A case example
Wednesday September 18, 2024 1:30pm - 2:00pm AEST
104
Authors: Stephanie Button (Assessment and Evaluation Research Centre, University of Melbourne), Allison Clarke (Assessment and Evaluation Research Centre, University of Melbourne, AU), Carolyn McSporran (Blue Light Victoria, AU)

This presentation aims to provide a case example of how two small-scale, standard process/outcomes evaluations for a low-budget community organisation increased value for the organisation by identifying and seizing opportunities for evaluation capacity building. Formal evaluations represent a significant financial commitment for low-budget community organisations. By maximising the value provided by such evaluations, evaluators can contribute more to these organisations' mission and ultimately to social betterment.

There are numerous evaluation capacity building models and frameworks, many of which appear to be quite complex (for example: Volkov & King, 2007; Preskill & Boyle, 2008). Many emphasise planning, documentation, and other resource-intensive components as part of any evaluation capacity building effort. This session provides a case example of intentional but light-touch and opportunistic evaluation capacity building. Through such an approach, evaluators may need to do only minimal additional activities to provide extra value to an organisation. Reflection-in-action during the evaluation process is as important as the final reporting (Schwandt & Gates, 2021). The session emphasises, though, that a critical enabler will be the organisation's leadership, culture, and willingness to seize the opportunity offered by a formal evaluation. The session is co-presented by two members of the evaluation team and the Head of Strategy, Insights, and Impact of the client organisation.
Chair
Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
Speakers
Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre...
Stephanie Button

Research Associate & Evaluator, Assessment & Evaluation Research Centre
Stephanie has worked as a policy manager, analyst, strategist, researcher, and evaluator across the social policy spectrum in the public and non-profit sector for over 12 years. She is passionate about evidence-based policy, pragmatic evaluation, and combining rigour with equitable...
Carolyn McSporran

Head of Strategy, Insights and Impact, Blue Light Victoria
Passionate about social inclusion, Carolyn's work has spanned diverse portfolios across the justice and social services sectors. With a fervent belief in the power of preventative and early intervention strategies, she is committed to unlocking the full potential of individuals and...
Wednesday September 18, 2024 1:30pm - 2:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Revitalising Survey Engagement: Strategies to Tackle Low Response Rates
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Kizzy Gandy

Surveys are an excellent data collection tool when they reach their target response rate, but low response rates hinder the generalisability and reliability of the findings.

This Ignite presentation will discuss techniques Verian evaluators have applied to increase survey response rates while also assessing the efficacy and efficiency of these techniques. We will also explore other evidence-based strategies for boosting response rates and the value of drawing on other data sources if your response rates are still low.
Speakers

Hannah Nguyen

Economist, Verian Group
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Sign here: Supporting Deaf participation in evaluation
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Susie Fletcher (Australian Healthcare Associates)

Auslan is a visual, signed language that was developed by and for the Australian Deaf community. People who use Auslan as their primary or preferred language are not necessarily fluent in English. Our team was engaged to review access to interpreter services for Auslan users, a population group that is often underrepresented in evaluation. In this presentation we will highlight some of the issues evaluators need to consider when working with this marginalised community, and share practical skills and techniques for making their evaluations more accessible.
Speakers
Susie Fletcher

Senior Consultant, Australian Healthcare Associates
Dr Susie Fletcher is an experienced health services researcher with over 50 peer-reviewed journal articles and 3 book chapters in mental health and primary care. She is passionate about improving health outcomes through integrating services across sectors; her recent work has included...
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Cultivating Equity: A Roadmap for New and Student Evaluators' Journeys
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
Authors: Ayesha Boyce (Arizona State University), Aileen Reid (UNC Greensboro, US)

Evaluation can be positioned as a social, cultural, and political force to address issues of inequity. We co-direct a 'lab' that provides new evaluators with hands-on applied research and evaluation experience to support their professional development. We are proud of our social justice commitments, and they show up in all aspects of our work. We believe the next generation of evaluators must be trained and mentored in high-quality technical, strengths-based, interpersonal, contextual, social justice-oriented, and values-engaged evaluation. We have found that novice evaluators are able to engage with culturally responsive approaches to evaluation at the conceptual level, but have difficulty translating theoretical constructs into practice. This paper presentation builds upon our experiences and previous work of introducing a framework for teaching culturally responsive approaches to evaluation (Boyce & Chouinard, 2017) and a non-course-based, real-world-focused, adaptable training model (Reid, Boyce, et al., 2023). We will discuss how we have taught new evaluators three formal and informal methodologies that have helped them align their values with praxis. Drawing from our work across multiple United States National Science Foundation-funded projects, we will overview how the incorporation of photovoice methodology, just-in-time feedback, and reflective practice has supported our commitments to meaningfully and respectfully attend to issues of culture, race, diversity, power, inclusion, and equity in evaluation. We will also discuss our thoughts on the implications of globalization, Artificial Intelligence, and shifting politics on evaluation capacity building and the training of new evaluators.

Chair
Nick Field

Director (Public Sector), Urbis
Nick has twenty years of public sector consulting experience, backed more recently by six years as a Chief Operating Officer in the Victorian Public Sector. A specialist generalist in a broad range of professional advisory services, Nick has expertise in the implementation of state-wide...
Speakers
Ayesha Boyce

Associate Professor, Arizona State University
Ayesha Boyce is an associate professor in the Division of Educational Leadership and Innovation at Arizona State University. Her research career began with earning a B.S. in psychology from Arizona State University, an M.A. in research psychology from California State University...
Aileen M. Reid

Assistant Professor, UNC Greensboro
Dr. Aileen Reid is an Assistant Professor of Educational Research Methodology in the Information, Library and Research Sciences department and a Senior Fellow in the Office of Assessment, Evaluation, and Research Services (OAERS) at UNC Greensboro. Dr. Reid has expertise in culturally...
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Link-Up Services and wayfinding: Co-creating and navigating a culturally safe national monitoring and evaluation strategy
Wednesday September 18, 2024 1:30pm - 3:00pm AEST
Authors: Kathleen Stacey (beyond...Kathleen Stacey & Associates Pty Ltd), Cheryl Augustsson (Yorgum Healing Services), Raelene Rosas (NT Stolen Generation Aboriginal Corporation), Pat Thompson (Link-Up (Qld) Aboriginal Corporation), Jamie Sampson (Link-Up (NSW) Aboriginal Corporation)

Link-Up Services support Aboriginal and/or Torres Strait Islander people who were forcibly removed, fostered or adopted from their families as children, and their descendants who live with the ongoing impact of forcible removal policies, to reconnect with family, community, culture and Country. Wayfinding is at the core of our work - navigating unfamiliar territory with clients towards a hoped-for destination of a greater sense of 'home', wherever this is possible, in a culturally safe, appropriate and trauma-informed manner.

In 2019, the National Indigenous Australians Agency funded the development of a national Link-Up monitoring and evaluation strategy with the eight Link-Up Services that operate across six jurisdictions. Each Link-Up Service is either a stand-alone Aboriginal community-controlled organisation or based in an Aboriginal community-controlled organisation.

This interactive workshop invites participants into our collective experiences of co-creating and implementing the M&E Strategy on a national basis, presented from the voices and position of Link-Up Services. We believe our experiences and learnings will be instructive for monitoring and evaluation activity with other Aboriginal and Torres Strait Islander organisations and programs.

Travel with us in reflecting on our monitoring and evaluation wayfinding journey over three phases of work. Pause with us at key points throughout the session to exercise your critical self-reflection and analysis skills, share your ideas and learn what has worked well or presented challenges for us and why in creating, navigating and implementing a culturally safe monitoring and evaluation strategy in a complex and demanding service context.
Speakers
Kathleen Stacey

Managing Director, beyond…(Kathleen Stacey & Associates)
Kathleen Stacey is the Managing Director and Principal Consultant at beyond... She spent her formative working years within the public sector and academia, before establishing and expanding beyond... into its current form. The company conducts consultancy, evaluation, research and...

Raelene Rosas

Interim CEO, Northern Territory Stolen Generations Corporation

Patricia Thompson AM

CEO, Link-Up Queensland
CEO of Link-Up (Qld), an organisation that celebrates 40 years of supporting Stolen Generations this year. Has a wealth of management experience across all levels of government and importantly at a community level. Has represented Aboriginal & Torres Strait Islander people at a...
Wednesday September 18, 2024 1:30pm - 3:00pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Navigating a path to system impact: designing a strategic impact evaluation of education programs
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
104
Authors: Amanda Reeves (Victorian Department of Education), Rhiannon Birch (Victorian Department of Education, AU), Eunice Sotelo (Victorian Department of Education, AU)

To provide insight to complex policy problems, evaluations need to adopt a systems perspective and examine the structures, relationships and contexts that influence program outcomes.

This paper outlines the design of a 4-year strategic evaluation that seeks to understand how a portfolio of over 25 education programs is interacting and collectively contributing to system-level outcomes. In this context, policy makers require evaluation to look beyond the boundaries of individual programs and assess the holistic impact of this investment, to inform where and how resources can be directed to maximise system outcomes.

The strategic evaluation presented is theory-based and multi-layered, using logic modelling to identify outcomes at the program, cluster and system level and to draw linkages to develop a causal pathway to impact. The strategic evaluation and the evaluations of individual education programs are being designed together to build in common measures, enabling meta-analysis and synthesis of evidence to assess system-level outcomes. The design process has been broad and encompassing, considering a diverse range of methods to understand impact, including quantitative scenario modelling and value-for-money analysis.

The authors will describe how the strategic evaluation has been designed to respond to system complexity and add value. The evaluation adopts an approach that is:
• interdisciplinary, drawing on a range of theory and methods to examine underlying drivers, system structures, contextual factors and program impacts
• collaborative, using expertise of both internal and external evaluators, to design evaluations that are aligned and can tell a story of impact at the system-level
• exploratory, embracing a learning mindset to test and adapt evaluation activities over time.

This paper will be valuable for anyone who is interested in approaches to evaluating the relative and collective contribution of multiple programs and detecting their effects at the system level to inform strategic decision-making.
Chair
Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
Speakers
Amanda Reeves

Principal Evaluation Officer, Victorian Department of Education
Amanda is an evaluation specialist with over 12 years' experience leading evaluation projects in government, not-for-profit organisations and as a private consultant. She has worked across a range of issues and sectors including in education, youth mental health, industry policy and...
Eunice Sotelo

Senior Evaluation & Research Officer, Department of Education (Victoria)
I'm here for evaluation but passionate about so many other things - education (as a former classroom teacher); language, neuroscience and early years development (recently became a mom so my theory reading at the moment is on these topics); outdoors and travel. Workwise, I'm wrangling...
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

When speed is of the essence: How to make sure the rubber hits the road
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
103
Authors: Kristy Hornby (Grosvenor)

There is a lot of interest in rapid M&E planning and rapid evaluations at present, born partly of the COVID-19 policy context and partly of shrinking appetites for the time and money spent on evaluations. It is unlikely this trend will reverse in the short term, so what do we do to acquit our responsibilities as evaluators, ethically and appropriately, in a rapid context? This session sets out a step-by-step approach to conducting a rapid evaluation, inviting attendees to follow along with their own program in mind so they come away from the session with a pathway for conducting their own rapid evaluation. The session uses a fictional case study to move the rapid evaluation approach forward, describing throughout how literature reviews, qualitative and quantitative data collection and analysis techniques, and report-writing approaches can be used innovatively to save time without compromising rigour.

We contend it is possible to do a rapid evaluation ethically and appropriately, but the backbone of doing so is good planning and execution. This session shares practical tips and approaches for doing so through each key phase of an evaluation, so attendees are well-equipped for their next rapid evaluation.

To consolidate the learning, attendees will be provided with a framework so they leave the session with a high-level plan for conducting their own rapid evaluation, increasing their chance of success.

Speakers
Kristy Hornby

Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and...
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Wayfinding for integrating social justice and culturally responsive and equitable evaluation practices in meta-evaluation: Learning from the UN evaluation quality assessments.
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
Authors: Sandra Ayoo (University of North Carolina Greensboro)

With quality interventions that improve societal and planetary wellbeing being the desired destination of evaluation, it is imperative that evaluators reflect on the meaning of quality and on methods to assess whether evaluation is achieving it. Meta-evaluation, a term coined by Michael Scriven in 1969, is the evaluation of evaluations; it aids in understanding how evaluations contribute to addressing structural and systemic problems in interventions and evaluation practice. Meta-evaluation has evolved over the past five decades and is included in the program standards of major professional associations. While the field of evaluation is confronted with major concerns regarding the centrality of social justice, there are currently no one-size-fits-all guidelines for meta-evaluation or for addressing social justice in evaluations. To address this, we reviewed the meta-evaluation literature, mapped the American Evaluation Association's foundational documents against the United Nations Evaluation Group's Norms and Standards to explore their intersection on social justice, and analyzed 62 United Nations Population Fund evaluation reports alongside their management responses. The study findings indicated that meta-evaluation is contingent on context rather than established standards. Thus, it is crucial for evaluators to intentionally prioritize social justice in evaluation design and implementation, and to select quality assurance tools that match the evaluation context and professional association guidelines to improve the quality of the intervention. I will share key characteristics of the United Nations Evaluation Group's Norms and Standards on social justice to stimulate discussions on evaluators' efforts to address systemic issues.
Collectively, participants will benefit from discussing and reflecting on their own practice by responding to questions such as: (a) what are examples of their work in collaborative and systems-informed ways to intentionally include social justice in their evaluations, and (b) what should the field of evaluation do to ensure that evaluations add value for people and planet?
Chair

Carlos Rodriguez

Senior Manager Strategy & Evaluation, Department of Energy Environment and Climate Action
Speakers
Sandra Ayoo

Assistant Professor, University of North Carolina Greensboro
Dr. Ayoo is an Assistant Professor of Educational Research Methodology in the Department of Information, Library, and Research Science at the School of Education, University of North Carolina Greensboro. She teaches graduate courses in program evaluation and research methodology...
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Developing a Tool for Measuring Evaluation Maturity at a Federal Agency
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
105
Authors: Eleanor Kerdo (Attorney-General's Department), Claudia Oke (Attorney-General's Department, AU), Michael Amon (Attorney-General's Department, AU), Anthony Alindogan (Attorney-General's Department, AU)

To embed a culture of evaluation across the Australian Public Service (Commonwealth of Australia, 2021), we must first have an accurate understanding of the current state of evaluation capability and priorities across Commonwealth agencies. This paper shares tools for building an effective measurement framework for evaluation culture, and discusses how to use these for evaluation capability uplift.

We explore quantitative and qualitative methods to gather and analyse data to measure an organisation's readiness to change its culture towards evaluation. This includes assessing staff attitudes towards evaluation, the level of opportunity for staff to conduct and use evaluation, and their confidence in their knowledge of evaluation.

We discuss the development of a staff evaluation culture survey based on Preskill & Boyle's ROLE, and how behavioural insight tools can be utilised to boost engagement. The paper discusses the utility of holding focus groups with senior leaders to understand authorising environments for evaluation and key leverage points. Also discussed are challenges and innovative solutions encountered throughout the assessment process.

This paper will be valuable for those who work in, or with, any government agency with an interest in evaluation capacity building and driving an evaluation culture within organisations. The paper explains each stage of measurement design, data analysis and results, and discusses opportunities for action.
1 Preskill, H., & Boyle, S. (2008). A Multidisciplinary Model of Evaluation Capacity Building. American Journal of Evaluation, 29(4), 443-459. https://journals.sagepub.com/doi/10.1177/1098214008324182

2 Michie S, Atkins L, West R. (2014) The Behaviour Change Wheel: A Guide to Designing Interventions. London: Silverback Publishing. www.behaviourchangewheel.com.

3 Lahey, R. (2009). A Framework for Developing an Effective Monitoring and Evaluation System in the Public Sector: Key Considerations from International Experience.
Chair
Marwan El Hassan

Director, Future Drought Fund Program Evaluation and Support, Department of Agriculture, Fisheries and Forestry
I am the director of the Program Evaluation and Support team at the Future Drought Fund (FDF). My team is responsible for supporting the FDF's program areas in their monitoring, evaluation and learning work, and for ensuring alignment of our MEL work with other areas around the department...
Speakers
Michael Amon

Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Anthony Alindogan

Evaluation Lead, Attorney-General's Department
Anthony is an experienced evaluator with a particular interest in outcomes measurement and value-for-money. He completed his Master of Evaluation degree at the University of Melbourne. Anthony is an enthusiastic writer and has publications in various journals including the Evaluation...

Claudia Oke

Project Officer / Data Analyst, Australian Public Service Commission
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Uncovering Hidden STEM Footprints: Leveraging Output Data from Questacon’s Outreach Programs
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
104
Authors: Jake Clark (Questacon - The National Science and Technology Centre), Jenny Booth (Questacon - The National Science and Technology Centre, AU), Sharon Smith (Questacon - The National Science and Technology Centre, AU), Nick Phillis (Questacon - The National Science and Technology Centre, AU)

Join our Monitoring and Evaluation team on an exciting journey as we delve into the untapped potential of output data in evaluation and how to reach beyond the 'low-hanging fruit'.

Notwithstanding the importance of evaluating outcomes to measure program success, monitoring the implementation and reach of initiatives is fundamental to good program management and evaluation. Output data on activity reach, target groups and participants often hold hidden gems of potential that are frequently overlooked. In this presentation we shine a spotlight on their significance and offer actionable tips to elevate monitoring data.

Our objective is to make this exploration enjoyable and enlightening, especially for foundational to intermediate level evaluators. We offer practical and universally applicable strategies for making the most of output data to enhance program insights.

KEY MESSAGES

Using existing tools and tapping into open-source data sets, you can create powerful visualisations and draw deeper inferences about your program's reach and participants.

I. Understanding equity and inclusion
• A better understanding of who is and isn't involved in your initiative.
• Looking for patterns using socio-demographic variables.
• Benchmarking your initiative against relevant population data.

II. Connecting outputs to outcomes
• Analysing participant characteristics and program journeys to illuminate differences in outcomes.
• Uncovering program and policy questions that need further exploration.

Design of the Session: Drawing tangible examples from the education and informal learning STEM sector, we bridge the gap between theory and practice. Real-world strategies are shared to encourage active participation along with useful resource links.
Chair Speakers
Jake Clark

Senior Monitoring and Evaluation Officer, Questacon - National Science and Technology Centre
What value does STEM outreach bring to an individual? How does it change someone's attitude/behaviour/disposition around STEM? And how do you quantify such probing questions? These are the types of queries I'm answering in my Senior Evaluation Officer role at Australia's National... Read More →
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Evaluation Lab: Using design to solve evaluation challenges
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
Authors: Matt Healey (First Person Consulting)

The Design and Evaluation Special Interest Group (DESIG) was established in 2017. Its primary aim has been to explore the intersection of evaluation and design, and that aim has been interpreted in different ways over time. In 2024, the DESIG identified an opportunity to take the SIG model in a slightly different direction, embarking on an innovative venture with the launch of the Evaluation Lab, an initiative aimed at turning talk into action and taking evaluators through a design process to address evaluation challenges.
Drawing inspiration from the concept of 'living labs,' which serve as real-world testing grounds, the Evaluation Lab created a space where evaluation professionals could come together. Employing a design-thinking process, the Lab guided participants through a structured expedition of defining, ideating, and prototyping solutions to tackle nominated challenges. Participants also learned pitch skills to communicate their solutions.
This Big Room Session provides an opportunity for the DESIG to outline the Evaluation Lab model, capped off with participants presenting their solutions through rapid-fire pitches, either live or pre-recorded, akin to explorers sharing tales of new lands discovered. The session's innovative twist lies in the audience's role, acting as both audience and judges. The audience will vote on their favourite solution, and be involved in crowning the first AES Evaluation Lab winner.
By blending lecture-style content with dynamic team presentations and active audience engagement, the Big Room Session not only highlights the critical role of design in navigating evaluation challenges but also demonstrates the practical application of these methodologies in charting a course through real-world problems.

Chair
Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health... Read More →
Speakers
Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have:Co-founded and grown an organisation (First Person Consulting) to a team of 16 people working... Read More →
Shani Rajendra

Principal Consultant & Head of Business Group (Social Impact), Clear Horizon
Shani is a Principal Consultant in Clear Horizon’s Social Impact team. Shani has extensive experience in community-led initiatives, organisational strategy, and social enterprise. She specialises in incorporating design thinking into evaluative practice. Having completed a Master... Read More →
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

5:00pm AEST

Trigger warnings - do they just trigger people more?
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
104
Authors: Kizzy Gandy (Verian (formerly Kantar Public) )

As evaluators, one of our key ethical responsibilities is not to cause psychological harm or distress through our methods. We often start workshops or interviews with a warning that the topic may be upsetting and provide participants with contact information for mental health services, under the assumption that this is the most ethical practice.

Trigger warnings are used with good intentions and are often recommended in evaluation ethics guidelines. However, what do we know about their impact? Is there a risk they actually trigger people more?

This talk examines the evidence on whether trigger warnings are an effective strategy for reducing the risk of trauma and re-traumatisation when discussing topics such as sexual assault, mental health, violence, drug use, and other sensitive issues. It also touches on new evidence from neuroscience about how emotions are understood differently now compared to in the past.

This session will not provide a definitive answer on when or how to use trigger warnings but aims to challenge the audience to think critically about whether trigger warnings are useful in their own work.
Chair Speakers
Kizzy Gandy

National Director, Program Evaluation, Verian
Dr Kizzy Gandy is Verian's National Director of Program Evaluation. She leads a team of expert methodologists and provides quality assurance. With 20 years’ experience in consultancy, federal and state government, and academia, Kizzy has overseen the design and evaluation of over... Read More →
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

5:00pm AEST

Development and implementation of a culturally grounded evaluation Framework: Learnings from an Aboriginal and Torres Strait Islander Peak.
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
Authors: Candice Butler (Queensland Aboriginal and Torres Strait Islander Child Protection Peak), Michelle McIntyre (Queensland Aboriginal and Torres Strait Islander Child Protection Peak, AU), John Prince (JKP Consulting, AU)

There is increasing recognition that evaluations of Aboriginal and Torres Strait Islander programs must be culturally safe and appropriate, and represent the worldviews, priorities, and perspectives of Aboriginal and Torres Strait Islander communities. Aboriginal and Torres Strait Islander peoples have the cultural knowledge and cultural authority to design appropriate evaluations that are safe, and that tell the true story of the impacts of our ways of working.

As a peak body for Aboriginal and Torres Strait Islander community-controlled organisations we wanted to ensure that the worldviews and perspectives of our members and communities are embedded in any evaluations of services delivered by our member organisations. This is a necessary step towards building an evidence base for our ways of working, developed by and for Aboriginal and Torres Strait Islander people. To that end we developed an evaluation framework to enable self-determination and data sovereignty in evaluation, and to build capacity among our member organisations to undertake and/or commission culturally grounded evaluations. Culturally grounded evaluations are led by Aboriginal and Torres Strait Islander people and guided by our worldviews and knowledge systems - our ways of knowing, being and doing.

This paper reports on the development and implementation process used in the project and describes the standards and principles which underpin the framework. An example of how the framework is being applied in practice is also outlined. Our principles for evaluation describe the core values which underpin culturally grounded and safe evaluation including self-determination; cultural authority; truth-telling; two-way learning; and holistic approaches. The evaluation standards and associated elements operationalise our principles and embed them in evaluative practice.
Chair
Carlos Rodriguez

Senior Manager Strategy & Evaluation, Department of Energy Environment and Climate Action
Speakers
Candice Butler

Executive Director, Centre of Excellence, Queensland Aboriginal and Torres Strait Islander Child Protection Peak
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Thursday, September 19
 

10:30am AEST

Evaluating capacity building for sustainability scientists: Pathways for early career researchers
Thursday September 19, 2024 10:30am - 11:00am AEST
104
Authors: Lisa Walker (CSIRO)

In the quest for sustainable solutions to pressing global challenges, transdisciplinary approaches that integrate insights and methods from various fields are increasingly recognised as key to driving change. This presentation will share insights from an evaluation of a five-year, $18 million sustainability science research program seeking not only to drive innovation but also to empower the next cohort of early career researchers (ECRs) to coordinate action across research, policy and practice to address complex sustainability problems.
Our formative, mixed-methods evaluation highlighted on-the-job learning, face-to-face engagement and networking as pivotal in building sustainability science capacity. We also found targeted recruitment, research team engagement and the provision of support and resources to supervisors to be essential, and sometimes overlooked, components. This work contributes to the broader discussion on how evaluation can enhance the development of sustainability science, proposing a framework that emphasises the individual, team and institutional support mechanisms necessary for effective ECR capacity building.
Novelty in our approach lies in the integration of evaluative practices within the capacity-building process, offering a reflective lens on how transdisciplinary endeavours can be optimised to address sustainability challenges. This is particularly relevant for evaluators wanting to build their own skills, or those of others, to engage on complex sustainability issues. The study also underscores the significance of adaptive learning and evaluation in navigating the complexities of sustainability science, inviting a broader conversation on how evaluation can be leveraged to facilitate meaningful contributions to societal and planetary well-being.
Chair
Su-Ann Drew

Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for... Read More →
Speakers
Lisa Walker

CSIRO
I am a social scientist with a background in program monitoring, evaluation and sustainable development. I am currently working with CSIRO's Valuing Sustainability Future Science Platform (VS FSP) and manage the Monitoring, Evaluation, Learning and Research project within the VS... Read More →
Thursday September 19, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

An evaluator in need of an evaluation
Thursday September 19, 2024 10:30am - 11:30am AEST
105
Authors: Dana Cross (Grosvenor), Kristy Hornby (Grosvenor)

"If all you have is a hammer, then everything looks like a nail." - Maslow/Kaplan/Unknown

Maslow's Hammer (aka the law of the instrument or golden hammer) and déformation professionnelle are concepts that speak to cognitive biases that can limit our effectiveness. Essentially, they mean that we use what we know; as evaluators, that is evaluation.

How can we as evaluators and commissioners of evaluations avoid cognitive bias and work effectively within (evaluation) policy parameters to ensure we are adding value and not using evaluation as the only tool in our toolbox?

We invite you to join us in a fast-paced interactive session to unpack:
  • the ways in which our expertise can get in our way
  • what it means to stay open to other tools as evaluation professionals and commissioners of evaluation
  • how this challenges us as individuals and as a profession.
Chair
Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member... Read More →
Speakers
Dana Cross

Associate Director, Grosvenor
Dana is a public sector expert, possessing over 17 years of deep experience advising government organisations on program evaluation, organisational review, service optimisation and performance management. She is a member of Grosvenor’s Executive Leadership Team as Head of Strategy... Read More →
Thursday September 19, 2024 10:30am - 11:30am AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Navigating the maze of causality: Understanding the relationship between carcinogenic betel nut consumption and learning outcomes
Thursday September 19, 2024 10:30am - 11:30am AEST
106
Authors: Kabira Namit (Abt Global), Kathryn Lee (Abt Global, AU)

This hands-on session is designed to strengthen emerging evaluators' intuition about causality in non-experimental evaluations.

In environments where conventional RCTs are unethical or unfeasible, identifying causal factors within the complex weave of societal factors and individual behaviours presents a significant challenge. Centred on a novel research project from Papua New Guinea, this session navigates this maze through real-world research (exploring the intricate relationship between the consumption of carcinogenic betel nut and its impact on educational outcomes). By focusing on this specific case study, we provide a concrete context for participants to understand the broader implications of causal explorations in fragile and sensitive settings.

Participants will actively engage in small group discussions in a collaborative learning environment where they can practice and refine their skills in causal evaluation by discussing scenarios that are reflective of real-world complexities.

This session aims to move beyond simply documenting correlations, encouraging a deep dive into the underlying dynamics of causal linkages. Through this exploration, we aim to eventually guide participants to discussions on pathways for targeted interventions and policy formulations which take causal chains into account.

Additionally, we aim to spark dialogue on the ethical dimensions of 'activist research,' exploring how evaluators can navigate moral dilemmas while advocating for meaningful change. This hands-on session not only seeks to build evaluative skills but also to inspire participants to consider the broader implications of their work on societal well-being and ethical research practices.
Chair
Mary Ann Wong

Research Specialist, California State University, Sacramento
Speakers
Thursday September 19, 2024 10:30am - 11:30am AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Commissioning evaluations - finding the way from a transactional to a relational approach
Thursday September 19, 2024 10:30am - 12:00pm AEST
Authors: Eleanor Williams (Australian Department of Treasury), Josephine Norman (Victorian Department of Health, AU), Melissa Kaltner (Lumenia, AU), Skye Trudgett (Kowa Collaboration, AU), George Argyrous (Paul Ramsay Foundation, AU), Luke Craven (National Centre for Place-Based Collaboration (Nexus), AU)

Delivering great evaluations requires a strong professional relationship between those commissioning and delivering the evaluation, as well as all relevant stakeholders.

Traditional evaluation commissioning approaches have tended to treat evaluation as a one-off exchange focusing on the completion of pre-defined tasks. However, the evolving landscape of policies and programs tackling complex issues demands a more nuanced and relational approach to get the most out of the journey of evaluation.

This big room panel session brings together speakers at the forefront of thinking around collaborative commissioning partnerships, from the perspectives of government, not-for-profit and Indigenous-led organisations, and the private sector, which can play the full suite of roles on the commissioning journey. The discussion will delve into the experiences of a range of organisations involved in commissioning who are seeking to build enduring relationships, and in some cases partnerships, between the commissioners, the evaluators and the stakeholders to whom we are accountable.

Drawing on real-world case studies and empirical evidence, the discussion will highlight the challenges and rewards of transitioning from a transactional model to a relational model. It will explore how this paradigm shift can enhance collaboration and ultimately lead to a range of positive outcomes.

Attendees will be invited to engage in dialogue with the panel, bringing the collective wisdom of the room together to consider what the destination of better commissioning relationships would look like, the practical obstacles we face on our pathway, and how we can reach our destination. To facilitate this active discussion, attendees will have the opportunity to use Sli.do throughout the session to provide input on key questions, share experiences in real time and ask questions of the expert panel.
Chair
Vanessa Hood

Associate Director, Rooftop Social
I've been working as a facilitator and evaluator for over 20 years, in a wide range of contexts, including horticulture, sustainability and financial literacy. Duncan Rintoul and I run Rooftop Social, which provides consulting services in evaluation, social research, facilitation... Read More →
Speakers
Eleanor Williams

Managing Director, Australian Centre for Evaluation
Eleanor Williams is a public policy, research and evaluation professional with 20 years' experience working with the public sector. She is the Managing Director of the Australian Centre for Evaluation and established the Australian Public Sector Evaluation Network in 2019. Eleanor... Read More →
Skye Trudgett

CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application... Read More →
George Argyrous

Head of Measurement, Evaluation, Research, and Learning, Paul Ramsay Foundation
Josephine Norman

Director, Centre for Evaluation and Research Evidence, Dept of Health/Dept of Families, Fairness and Housing
I run a large internal evaluation unit, directing a team of 30 expert evaluators and analysts to: directly deliver high priority projects; support program area colleagues to make the best use of external evaluators; and, build generalist staff capacity in evaluation principles and... Read More →
Luke Craven

Independent Consultant
Kristy Hornby

Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and... Read More →
Thursday September 19, 2024 10:30am - 12:00pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Bringing the "human" into measurement: From in-depth inquiry to systemic change
Thursday September 19, 2024 11:30am - 12:00pm AEST
104
Authors: Julia Suh (Tobias)

Humans are complex and diverse. To create social change, what do we need to understand about them?

Their behaviours and mindsets are key, but the broader context and systems they operate within paint a fuller picture of the multiple moving parts that need to change simultaneously for sustained impact. These changes can be mapped, with embedded evaluative thinking, building a pathway for formal evaluation.

In this session, experts in Human-Centred Design and social change share their innovative approaches to thinking beyond project- or program-level goals or organisational-level performance indicators. Examples are drawn from direct experiences working across various transformation projects, from reducing child sexual exploitation and preventing academic misconduct to improving the care economy and elevating patient outcomes. They demonstrate not only how program goals and a social change vision can be realised together, but also how a combination of strategic prioritisation, collaboration, capability building and networks can accelerate the process.
Chair
Su-Ann Drew

Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for... Read More →
Speakers
Julia Suh

Principal, Tobias
JESSICA LEEFE

Principal, Tobias
Thursday September 19, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Navigating complex government reforms: a tool to examine program theory. How complete and relevant is government program theory?
Thursday September 19, 2024 11:30am - 12:00pm AEST
105
Authors: Nerida Rixon

Developing program theory for complex government reforms and charting the 'how' we will reach our destination is not easy. Perhaps you, like me, rely on well-tested templates? Do we challenge them and continually refine them to reflect emerging research and make them more useful for our purposes?

This research is about public policy packages and reforms and the program theories (or reform theories) that explain them. This research emerges from a desire to build program theory better, particularly in the context of whole of government reforms. Better program theory can drive better planning, monitoring and evaluation of performance, and better policy and public good.

Evidence shows Australian governments are not effectively planning, monitoring and evaluating performance of programs and policy packages. Theory can support development of meaningful performance indicators to track progress. Without strong program theory and clear strategy, as the Productivity Commission's recent 'Review of the National Agreement on Closing the Gap' study report suggests, we risk a 'spray and pray' approach to change, prioritisation of the wrong things and siloed policy responses.

A literature informed checklist to analyse program theory for completeness and relevance to public administration is provided. Policy makers and evaluators are given a tool and lens to build more complete and relevant program theory and to improve existing program theory.

Analysis of program theory in 15 government reform strategies and outcomes frameworks is presented to show governments' strengths and opportunities. Governments are strong at identifying our destination, or the intended outcomes, though not always at identifying the 'how' we will get there. Governments could improve their program theory by making it more explicit and more complete by articulating the 'when' we expect to see changes from implementing the reforms. Government program theory might be more relevant if potential (unintended) outcomes are referenced.
Chair
Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member... Read More →
Speakers
Thursday September 19, 2024 11:30am - 12:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

A new tool for participatory evaluation: A case study of the process of conducting online workshops with young creators with disabilities to tell stories using virtual reality animation
Thursday September 19, 2024 11:30am - 12:30pm AEST
103
Authors: Samantha Abbato (Visual Insights People), Lisa Stafford (University of Tasmania, AU)

Researchers from fields including public health, community, and disability have been utilising film methods such as participatory video and digital storytelling for decades. Co-creation of film narratives for evaluation can engage participants as unique people with lived experiences connected to social and cultural worlds and relationships, including their connection with the community. Evaluation, however, has been reluctant to adopt participatory film methods.

Virtual Reality (VR) animation presents a distinctly participatory approach to evaluation data collection and a new lens for communicating findings. It places the participant in the driver's seat and the evaluation audience in the passenger seat alongside them. Using VR stories can increase the potential for the intended audience, including decision-makers, to engage deeply with the information communicated, through focused immersion in participant stories and the familiarity of local settings.

We present a case study examining the process of collaborating with young people with disabilities to tell their stories of inclusion in Tasmania, Australia. Three young people participated in online storyboarding and script-writing workshops over twelve months to develop short stories of everyday experiences in their community. An introduction to the participants and their stories, the three completed stories, and a collaborative call to action were made into a set of five connected VR short films. The films were displayed as a report on a website page and for viewing as a complete VR story on a headset.

The presenters examine the process of applying this new VR digital storytelling approach to participatory evaluation. The challenges and benefits of the approach for participants and its impact on the intended audience, including urban planning and design students, are discussed. Using the lessons learned from the case study, recommendations for evaluators considering using participatory digital storytelling and VR animation are made.
Chair
Michael Amon

Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD... Read More →
Speakers
Samantha Abbato

Director, Visual Insights People
My twenty-plus years of evaluation experience are built on academic training in qualitative and quantitative disciplines, including mathematics, health science, epidemiology, biostatistics, and medical anthropology. I am passionate about effective communication and evaluation capacity-building... Read More →
Thursday September 19, 2024 11:30am - 12:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Systems evaluation to the rescue!: How do we use systems evaluation to improve societal and planetary wellbeing?
Thursday September 19, 2024 12:00pm - 12:30pm AEST
104
Authors: Kristy Hornby (Grosvenor), Tenille Moselen (First Person Consulting)

Systems evaluation - many might have heard the term, but few have done one. This session shares two case studies of different systems evaluations and the learnings from these to benefit other evaluators who are conducting or about to begin a systems evaluation.

The session will open with an overview and explanation of what systems evaluation is, in terms of its key features and how it is distinguished from other forms of evaluation. The presenters will then talk through their case studies, one of which centres on the disability justice system in the ACT, while the other takes a sector-wide focus across the whole of Victoria. The co-presenters will share openly and honestly their initial plans for commencing the systems evaluations, how they had to amend those plans in response to real-world conditions, and the tips and tricks and innovations they picked up along the way.
Chair
Su-Ann Drew

Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for... Read More →
Speakers
Kristy Hornby

Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and... Read More →
Tenille Moselen

First Person Consulting
https://www.fpconsulting.com.au/our-team.html Tenille has qualifications in public health, with experience in mental health and wellbeing, alcohol and drug and international development. Her passion is creating change through design and bringing stakeholders together to address complex... Read More →
Thursday September 19, 2024 12:00pm - 12:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Optimising Evaluations of Wellbeing Programs in Schools
Thursday September 19, 2024 12:00pm - 12:30pm AEST
105
Authors: Tamara Van Der Zant (Australian Council for Educational Research), Katherine Dix (Australian Council for Educational Research, AU)

In this presentation we will discuss the diverse and flexible data collection methods suitable for program evaluation in the context of schools. We will discuss the types of evidence that can be used to evaluate social and emotional learning programs and wellbeing initiatives, specifically, and considerations when working with educators, children and young people. We will invite all to participate in discussions about challenges to the evaluation of these programs in complex, real-world contexts (including data quality, confounding factors, system requirements, etc.) and propose methods we use to navigate these challenges.

Optimising program evaluation methods is important because of the ever-growing number of wellbeing programs being offered to schools. Accordingly, the need for high quality evaluation to guide funding decisions and use of programs and initiatives to support student and educator wellbeing in schools has never been greater.

By drawing on comprehensive experience in undertaking wellbeing program evaluations, this presentation will share our lessons learnt and recommendations to support evaluators in crafting contextually appropriate evaluations. High quality program evaluations, often a requirement for ongoing funding, address the growing need for meaningful and accessible evidence that is currently being sought by schools, educators, funders, and policy decision makers.
Chair
avatar for Charlie Tulloch

Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member... Read More →
Speakers
avatar for Tamara Van Der Zant

Tamara Van Der Zant

Research Fellow, Australian Council for Educational Research
Tamara is a Research Fellow in the Educational Monitoring and Research Division at ACER. Prior to this role she completed her PhD in emotion research at the University of Queensland. She brings experience in research design, conducting research with diverse populations, broad data... Read More →
avatar for Dr Katherine Dix

Dr Katherine Dix

Principal Research Fellow, School and System Improvement, Australian Council for Educational Research
Dr Katherine Dix is a Principal Research Fellow at ACER, with over 20 years as a program evaluator, educational researcher and Project Director. Dr Dix is the National Project Manager for Australia’s participation in OECD TALIS 2024, and is a leading expert in wellbeing and whole-school... Read More →
Thursday September 19, 2024 12:00pm - 12:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

A tool for addressing violence against women: An examination of the creation, benefits, and drawbacks of the Evidence Portal
Thursday September 19, 2024 1:30pm - 2:30pm AEST
103
Authors: Charlotte Bell (Australia's National Research Organisation for Women's Safety (ANROWS)), Lorelei Hine (ANROWS, AU), Elizabeth Watt (ANROWS, AU), Rhiannon Smith (ANROWS, AU)

The first of its kind in Australia, the Evidence Portal is an innovative tool that captures and assesses impact evaluations of interventions from high-income countries that aim to address and end violence against women.

While we know high-quality evaluation evidence is an important component in informing and influencing policy and practice, decision-makers face a variety of potential barriers in accessing this evidence. By providing a curated repository of existing research, evidence portals can support policymakers, practitioners, and evaluators in their decision-making.

Our Evidence Portal consolidates and synthesises impact evaluation evidence via: (1) Evidence and Gap Maps, which provide a big-picture, visual overview of interventions; and (2) Intervention Reviews, which provide a succinct, standardised assessment of interventions in accessible language. Underpinned by a rigorous systematic review methodology, this tool seeks to:
  • Identify existing impact evaluations and gaps in the evidence base; and
  • Promote a collective understanding of the nature and effectiveness of interventions that aim to address violence against women.

Key points: This presentation will showcase the creation, benefits, and drawbacks of the Evidence Portal, with a focused discussion on the following areas:
  • What are evidence portals and how are they used to inform policy and practice?
  • Why and how was this evidence portal created?
  • What are the challenges in creating this tool and the learnings to date?
  • What other 'ways of knowing' should be considered?

This presentation begins with an in-depth exploration of the Evidence Portal and the important methodological decisions taken to build this tool. It then offers a reflection on our journey of creating this tool with a focus on significant learnings to date. You will gain an understanding of the Evidence Portal and key considerations for future evaluations of violence against women interventions.
Chair
PP

Prescilla Perera

Principal Monitoring and Evaluation Officer, DFFH
Speakers
avatar for Charlotte Bell

Charlotte Bell

Research Manager, ANROWS
Charlotte Bell is an experienced researcher who focuses on domestic, family and sexual violence. Charlotte is a Research Manager (Acting) at ANROWS, where she has worked for several years across multiple research projects. With a keen interest in evaluation and impact, and extensive... Read More →
avatar for Lauren Hamilton

Lauren Hamilton

Evaluation and Partnerships Manager, Australia's National Research Organisation for Women's Safety (ANROWS)
Lauren has over 10 years of experience in the evaluation, design and management of social programs, with a focus on violence against women and children, and women’s health. In her current role, Lauren works directly with frontline services and funders of domestic, family and sexual... Read More →
Thursday September 19, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

The Art of Qualitative Sensemaking: Exploring New Methods
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105
Authors: Alli Burness (Tetra Tech), Sharon Marra-Brown (ARTD, AU), Matt Healey (First Person Consulting, AU), Monica Wabuke (Tetra Tech, FJ)

Sensemaking is the process of making meaning and distilling the signal from the noise in primary research. Inclusive and transparent sensemaking ensures that the critical link between evidence and insights is maintained, that evidence is interpreted correctly, and that the views of participants are accurately understood. Using intentional sensemaking approaches with integrity can ensure transparency and logical rigour in an evaluation or research project.

Despite its critical nature, sensemaking can often be the most opaque step in an evaluation process. While replication is a hallmark of good sensemaking, especially in academia, this is not always feasible in the fast-paced world of evaluation. The time required to do sensemaking well, the importance of applying the correct approaches and engaging the correct parties, and the critical role of a lead facilitator can be overlooked or underestimated. By shining a spotlight on this step in an evaluation, this session will highlight inclusive and accessible sensemaking approaches used across the design and evaluation spectrum to identify new or emergent approaches. It will pay particular focus to sensemaking when working in complex systems.

Panellists bring deep experience in evaluation or design research in Australian or international consulting settings. They will touch on what sensemaking approaches can be used to maintain integrity through a rapid or agile sensemaking process common in large or complex evaluations; popular sensemaking processes for coding data and new or emerging methods; and how insights or recommendations emerge from the sensemaking process. The moderator will start the panel by reflecting on the definitions, understanding and application of sensemaking, with an emphasis on inclusive and accessible aspects. Our presenters will then explore methods through this same lens and with a focus on emergent or new approaches. Methods will be presented in a manner that audience members can learn and apply.
Chair
JC

Janet Conte

Principal Evaluation Officer, DPIRD
I live in Perth (Boorloo) and have 3 children. I really enjoy being a co-convenor of the WA branch of the AES with Lisette Kaleveld. I'm interested in learning more about systems evaluation and building an evaluation culture.
Speakers
avatar for Matt Healey

Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have: co-founded and grown an organisation (First Person Consulting) to a team of 16 people working... Read More →
avatar for Sharon Marra-Brown

Sharon Marra-Brown

Director, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence.Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade plus in the public service or how I weave... Read More →
avatar for Monica Wabuke

Monica Wabuke

Associate Director - Research, Monitoring and Evaluation Practice, Tetra Tech International Development - Asia Pacific
Monica Wabuke is an Associate Director within Tetra Tech’s Research, Monitoring and Evaluation Practice (RME). She brings 14 years of experience in design, monitoring and evaluation and has provided technical support to DFAT, MFAT, EU, USAID and World Bank-funded projects and programs... Read More →
avatar for Alli Burness

Alli Burness

Director, Australian Consulting, Tetra Tech
Alli is an Australian strategic designer and researcher with settler heritage, born and living on Bunurong Country. As Director of the Australian Consulting Practice at Tetra Tech International Development, Alli works with a First Nations team to support relational approaches across... Read More →
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Harnessing AI for Qualitative Data Analysis
Thursday September 19, 2024 2:00pm - 2:30pm AEST
Authors: Ethel Karskens (Clear Horizon)

This presentation covers the strategic integration of Artificial Intelligence (AI) methodologies for qualitative data analysis in evaluation processes. The increasing demand for sophisticated analytical tools necessitates a deep dive into AI's transformative potential in streamlining qualitative analysis. Through practical demonstrations and case studies, this session showcases how AI technologies can efficiently tackle the complexities of analysing qualitative data. Attendees will acquire actionable insights into leveraging AI to augment the efficiency and accuracy of qualitative analysis, empowering them to navigate the evolving landscape of evaluation methodologies.

Additionally, the presentation conducts a comprehensive comparative analysis of major AI models available in the market. By delineating their unique strengths and functionalities, participants will gain invaluable discernment in selecting the most appropriate AI model tailored to their evaluation objectives.

Moreover, the session delves into robust quality assurance (QA) strategies for validating AI-generated outputs, emphasising the essential role of evaluators as integral stakeholders in the analysis process. Attendees will explore techniques for seamlessly integrating human expertise with AI capabilities to refine and validate analysis outcomes. We will also explore ways to do this in a way that respects common data privacy laws and policies. By fostering a symbiotic relationship between AI technologies and human evaluators, this presentation underscores the importance of collaborative synergy in optimising evaluation efficacy.

In conclusion, this presentation offers a comprehensive exploration of the transformative potential of AI in qualitative data analysis within evaluation contexts. Attendees will depart equipped with actionable strategies and insights to harness AI's power effectively, elevating the quality and efficiency of their evaluation processes to new heights.
Chair
avatar for Emily Saurman

Emily Saurman

Delegate, University of Sydney - School of Rural Health
Speakers
avatar for Ethel Karskens

Ethel Karskens

Data and Insights Lead, Clear Horizon
I lead the data and insights strategy of Clear Horizon. This includes dashboard development and other data solutions to create insights for our clients. I am interested in innovation, data for good, and creating a data-driven culture in organisations.
Thursday September 19, 2024 2:00pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

An update on practical applications of machine learning in evaluation
Thursday September 19, 2024 2:30pm - 3:00pm AEST
Authors: Gerard Atkinson (ARTD Consultants)

Last year saw the rise of large language models, with names like ChatGPT and Bard becoming part of common discussion. The evaluation community was not immune to this trend, and papers were published examining how well machine learning approaches performed against human evaluators on tasks such as qualitative analysis and evaluative judgement. The answer? Not as well as you would think (but you could get wrong answers faster than ever!)

But the designers of these models took on the feedback and created newer and more sophisticated tools. In addition, there have been innovations in hybrid models which combine the best features of different methods while minimising their weaknesses. Coupled with this is the growing field of standalone models that can be run on a desktop computer yet produce responses that match or exceed cloud-based models, and models that can draw on rich contextual information (such as documentation or full interview transcripts) to make decisions.

This presentation provides an update on the state of machine learning in 2024 and presents new findings in relation to the performance of machine learning models on tasks including topic classification and rubric analysis.


Chair
avatar for Emily Saurman

Emily Saurman

Delegate, University of Sydney - School of Rural Health
Speakers
avatar for Gerard Atkinson

Gerard Atkinson

Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in: program and policy evaluation; workshop and community facilitation; machine learning and AI; market and social research; financial and operational modelling; and non-profit, government and business strategy. I am also a board member... Read More →
Thursday September 19, 2024 2:30pm - 3:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Journey Mapping: Visualising Competing Needs within Evaluations
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Jolenna Deo (Allen and Clarke Consulting)

Journey mapping acts as a GPS for grasping audience or consumer experience when evaluating policies or programs, highlighting twists, hidden gems, and pitfalls. It can be a useful tool to help evaluators capture disparities and competing needs among intended demographics. This session will discuss the journey mapping method, drawing from an evaluation of a Community Capacity Building Program which used journey mapping to illustrate key consumer personas. It will explore the integration of multiple data sources to provide a comprehensive understanding of complex disparities and the cultural and historical contexts in which these arise.
Chair
avatar for Claire Grealy

Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
avatar for Jolenna Deo

Jolenna Deo

Consultant, Allen and Clarke Consulting
Jolénna is a consultant at Allen + Clarke Consulting. She is a proud Mirriam Mer, Pasifika woman with a background in Development Studies, Pacific Studies and social policy, combining her interests in Indigenous methodologies and social justice. She is experienced in community and... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Reflections by a non-analyst on the use of state-wide data sets and modelled data in evaluation
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Gabby Lindsay-Smith 

Linked government data sets provide an opportunity to investigate the impact of state-wide programs and policies, but are often out of reach for many evaluators, especially non-analysts. This presentation will detail a non-analyst's experience incorporating state-linked data sets into a recent evaluation of a Victoria-wide family services program. The presentation will outline tips and tricks for those who may consider incorporating government-level linked data or simulation models into large program or policy evaluations in the future. It will cover areas such as: where to begin, navigating the data, and key tips for working with analysts.
Chair
avatar for Claire Grealy

Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

The evolution of evaluation: Retracing our steps in evaluation theory to prepare for the future
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: James Ong (University of Melbourne)

As new people enter the evaluation field and as evaluation marches forward into the future, it is important to learn from the evaluation theorists who have come before us. My Ignite presentation will argue that modern evaluation is built on evaluation theory, and will call for evaluators of all levels to learn evaluation theory to:
  1. Appreciate how evaluation has evolved;
  2. Strengthen their evaluation practice; and
  3. Navigate themselves around an ever-changing evaluation landscape.
Chair
avatar for Claire Grealy

Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
avatar for James Ong

James Ong

Research Assistant (Evaluations), University of Melbourne
My name is James Ong. I am an Autistic program evaluator working at the University of Melbourne, where I work on evaluation and implementation projects in various public health projects such as the AusPathoGen program and the SPARK initiative. I not only have a strong theoretical... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Growing Australia's future evaluators: Lessons from emerging evaluator networks across the Asia Pacific
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Authors: Amanda Mottershead (Tetra Tech International Development), Qudratullah Jahid (Oxford Policy Management Australia, AU), Eroni Wavu (Pacific Community, FJ)

The sustainability of the evaluation sector requires emerging evaluators to be supported in pursuing high-quality practice. What this support needs to be and how it should be developed is much less certain. What topics should we focus on? How should we deliver it? Who should we deliver it to? How can the broader evaluation community support emerging evaluators?

Global experiences in emerging evaluator support contain a treasure trove of lessons which can fill this knowledge gap and inform effective support here in Australia. Experiences show that fostering a strong evaluation community, that includes emerging evaluators, can nurture, ignite and shape future evaluation practices. A variety of approaches are being adopted across the region, and the globe, to foster this sense of community, that range from formal approaches to capacity building to more informal approaches that focus on experience sharing.

In this session, we bring together current and former emerging evaluator leaders from across the Asia Pacific region to answer some of these questions and understand what approaches could work best for the Australian context. This will include presentations and discussion on in-demand topics, how to formulate support, how to target emerging evaluators and the best means of delivery. The session will be highly interactive, engaging the audience in a question-and-answer forum on this important topic. All panel members have been engaged with emerging evaluator networks in their countries or regions and bring diverse experiences to facilitate cross learning. The session will provide practical ways forward for the broader evaluation community to grow and support the future of evaluation.
Speakers
avatar for Qudratullah Jahid

Qudratullah Jahid

Senior MEL Consultant, Oxford Policy Management
I am a monitoring, evaluation, research, and learning specialist with a background in bilateral and multilateral development organisations. With expertise in MEL frameworks and systems, I support OPM projects in the Indo-Pacific. My focus areas include MEL frameworks, mixed methods... Read More →
avatar for Amanda Mottershead

Amanda Mottershead

Consultant - Research, Monitoring and Evaluation, Tetra Tech International Development
I enjoy the breadth of evaluation in international development. I've had the opportunity to work across sectors including economic development, infrastructure, energy, education and inclusion. I enjoy generating evidence that promotes improvements to organisations, policies and programs... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Friday, September 20
 

10:30am AEST

To tinker, tailor, or craft from scratch? The tension in using validated tools in novel contexts
Friday September 20, 2024 10:30am - 11:00am AEST
104
Authors: Matt Healey (First Person Consulting), Alicia McCoy (First Person Consulting, AU), Tenille Moselen (First Person Consulting, AU)

In the dynamic realm of evaluation methodology, the discourse surrounding the use of validated tools versus the customisation of evaluation metrics remains a contentious issue. This session aims to delve into the inherent risks associated with both approaches. This is often compounded when those in positions of power prefer validated tools over fit-for-context data collection questions or approaches. The tension this elicits is only increasing at a time when evaluators of digital interventions often have no direct tool to draw upon, leaving them to navigate uncharted territory.

Moreover, there is an ever-increasing range of validated tools available, but little direction for evaluators, particularly emerging and early-career evaluators, to assist in deciding between them. This session presents experiences from a range of digital and in-person projects, exploring scenarios where there was no 'obvious solution'. It will be of particular relevance to those undertaking evaluations of digital and novel programs.

Through candid dialogue and shared anecdotes, participants will reflect on their experiences in navigating decisions to adopt, adapt, or reject validated tools, and the learning that resulted. Embracing controversy, this session encourages attendees to challenge conventional wisdom and critically examine the balance between the reliability of validated tools, the importance of fitting data collection to context, and most importantly what 'good' looks like.

Join the conversation as we navigate the complex landscape of evaluation methodology, exploring the tensions between established practices and the pursuit of innovation in evaluation processes.

Chair
avatar for Phillip Belling

Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of... Read More →
Speakers
avatar for Matt Healey

Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have: co-founded and grown an organisation (First Person Consulting) to a team of 16 people working... Read More →
avatar for Tenille Moselen

Tenille Moselen

First Person Consulting
https://www.fpconsulting.com.au/our-team.html
Tenille has qualifications in public health, with experience in mental health and wellbeing, alcohol and drug and international development. Her passion is creating change through design and bringing stakeholders together to address complex... Read More →
avatar for Alicia McCoy

Alicia McCoy

Principal Consultant, First Person Consulting
Alicia has 15 years of experience leading research and evaluation teams in the not-for-profit sector and is passionate about the role that research and evaluation plays in creating lasting change for individuals, families and communities. Alicia’s areas of interest include evaluation... Read More →
Friday September 20, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Walking together: First Nations participation, partnerships and co-creation in Evaluation.
Friday September 20, 2024 10:30am - 11:30am AEST
106
Authors: Tony Kiessler (First Nations Connect), Alice Tamang (First Nations Connect, AU)

Effective First Nations engagement is integral in the design and delivery of culturally safe evaluations. The AES' First Nations Cultural Safety Framework discusses 10 principles for culturally safe evaluation and describes the journey of engagement. However, the question of how to engage effectively can be the first and most significant challenge faced by evaluators. There is little clarity on how to create opportunities for First Nations leadership and voices in our evaluations, how to engage appropriately, and who we should engage with. There is also the challenge of managing tight timeframes, client expectations and capabilities that can limit the focus on meaningful First Nations participation, partnership and co-creation.

This is a unique offering that enables practitioners and First Nations facilitators to walk together, explore shared challenges and identify opportunities to improve First Nations engagement. The session will explore the potential for partnerships in informing and implementing evaluations, opportunities to increase First Nations participation, privilege their experience and knowledge, and how evaluation practitioners can draw on these strengths through co-creation to amplify First Nations voices and leadership in evaluation practice.

This session aims to:
  • Explore a principles-based approach to First Nations engagement;
  • Discuss shared experiences on successful approaches to enhance First Nations partnership, participation and co-creation; and
  • Develop a shared understanding of how to take this knowledge forward through culturally safe evaluation commissioning, practice and reporting.

Discussion will draw on the collective experience of both the attendees and the facilitators, walking together. The sharing of ideas will be encouraged in a safe space that engages the audience in a collaborative dialogue with First Nations practitioners. This dialogue will explore current knowledge, capabilities and gaps, as well as the challenges (and how they can be overcome), as part of the broader journey to culturally safe evaluation practice.


Chair
avatar for Rachel George

Rachel George

Director, Research, Monitoring and Evaluation Practice, Tetra Tech International Development
Speakers
avatar for Tony Kiessler

Tony Kiessler

Co-Convener, First Nations Connect
Tony is a Central Arrernte man, consultant and researcher living and working on beautiful Gundungurra Country in the NSW Southern Highlands. He is an evaluation, strategic planning and research consultant with a particular interest in health, human development and social inclusion... Read More →
avatar for Alice Tamang

Alice Tamang

Consultant, First Nations Connect
Alice is a Dharug woman based on Wurundjeri Country. She is a consultant and advisor, with a focus on facilitating connections between cultures, empowering individuals and communities to share knowledge and enhance cultural understanding. Alice primarily works on DFAT funded programs... Read More →
avatar for Nicole Tujague

Nicole Tujague

Founder and Director, The Seedling Group
Nicole Tujague: Bachelor of Indigenous Studies (Trauma and Healing/Managing Organisations); 1st Class Honours, Indigenous Research; PhD in Indigenous-led Evaluation, Gnibi College, Southern Cross University. Nicole is a descendant of the Kabi Kabi nation from Mt Bauple, Queensland and the... Read More →
Friday September 20, 2024 10:30am - 11:30am AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Impact evaluation: bringing together quantitative methods and program theory in mixed method evaluations
Friday September 20, 2024 11:00am - 12:00pm AEST
Authors: Harry Greenwell (Australian Centre for Evaluation), Peter Bowers (Australian Centre for Evaluation), Vera Newman (Australian Centre for Evaluation)

This session will provide an overview of some of the main quantitative methods for identifying the causal impacts of programs and policies, while emphasising the importance of mixed-methods that also incorporate program theory and qualitative research. It is intended for people unfamiliar with quantitative evaluation methods who would like to develop their understanding of these methods in order to better contribute to theory-based, mixed method impact evaluations.

The session will cover three of the most common quantitative approaches to separating causality from correlation: (i) mixed-method RCTs, (ii) discontinuity design, and (iii) matching. Each method will be explained with real examples. The session will also cover the benefits and limitations of each method, and considerations for determining when such methods might be suitable either on their own or as a complement to other evaluation methods or approaches.

Special attention will be given to the ethical considerations inherent in the choice of impact evaluation method, including issues related to consent, fairness, vulnerability, and potential harm.

After attending this session, participants will have a better understanding of: how program theory can inform the design of quantitative impact evaluations, including through mixed-method impact evaluations; and how to identify when certain quantitative impact evaluation methods may be suitable for an evaluation.
Chair
avatar for Allison Clarke

Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre... Read More →
Speakers
avatar for Peter Bowers

Peter Bowers

Assistant Director, Australian Centre for Evaluation (ACE)
I am part of the Australian Centre for Evaluation in Commonwealth Treasury that was set up to increase the volume, quality and use of evaluation across the Commonwealth government. I have a particular interest in RCTs. Come and speak to me if you would like to run an RCT in your... Read More →
Vera Newman

Assistant Director
Dr Vera Newman is an Assistant Director in the Impact Evaluation Unit at the Australian Centre for Evaluation. She has many years' experience conducting impact evaluations in the private and public sectors, and is dedicated to applying credible methods to public policy for generating...
Friday September 20, 2024 11:00am - 12:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Our five guiding waypoints: Y Victoria's journey and learning from applying organisation-wide social impact measurement
Friday September 20, 2024 11:30am - 12:00pm AEST
103
Authors: Caitlin Barry (Y Victoria), Eugene Liston (Clear Horizon Consulting, AU)

The demand for organisations to measure impact seems to be ever increasing. However, impact measurement looks different depending on the level at which you are measuring it (program level, organisation-wide, ecosystem level, etc.). While many organisations focus on measuring social impact at a program level, what appears to be less commonly achieved is the jump to effective measurement of impact at an organisation-wide level.

The literature providing guidance on how to implement org-wide social impact measurement makes it seem so straightforward, like a Roman highway - all straight lines. But what is it really like in practice? How does it differ from program-level impact measurement? How can it be done? What resources does it take? And, what are the pitfalls?

The Y Victoria has spent the last three years on a journey to embed org-wide social impact measurement under the guidance of our evaluation partner. The Y Victoria is a large and diverse organisation covering seven different sectors/service lines, with over 5,500 staff and more than 180 centres, delivering services to all ages of the community. This presented quite a challenge for measuring organisation-wide impact in a meaningful way.

While the journey wasn't 'straightforward', we've learnt a lot from navigating it. This presentation will discuss the approach taken, tell the story of the challenges faced, the trade-offs, the lessons learnt (from both the client's and the consultant's perspectives), and how we have adapted along the way.

Chair
Kate O'Malley

Consultant
I provide targeted policy, advocacy and evaluation support on refugee and migration matters, drawing on a lengthy career in the United Nations and the Australian Public Service and post-graduate studies in evaluation and diplomatic practice.
Speakers
Jess Boyden

Senior Social Impact Manager - Recreation, YMCA Victoria
Hello! I'm Jess and I bring 20 years of experience in program design, strategy and social impact measurement within international aid and local community development settings. I specialise in creating practical and meaningful approaches to measuring social impact, using the power...
Caitlin Barry

Principal Consultant, Caitlin Barry Consulting
Caitlin has extensive experience in monitoring and evaluation and holds a Master of Evaluation (First Class Honours) from the University of Melbourne and an Environmental Science degree (Honours) from James Cook University. The focus of Caitlin's presentation will be from her work...
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

The ACT Evidence and Evaluation Academy 2021-24: Lessons learned from a sustained whole-of-government ECB effort
Friday September 20, 2024 11:30am - 12:00pm AEST
105
Authors: Duncan Rintoul (UTS Institute for Public Policy and Governance (IPPG)), George Argyrous (UTS Institute for Public Policy and Governance (IPPG), AU), Tish Creenaune (UTS Institute for Public Policy and Governance (IPPG), AU), Narina Dahms (ACT Government: Chief Ministers, Treasury and Economic Development Directorate, AU), Peter Robinson (ACT Government: Chief Ministers, Treasury and Economic Development Directorate, AU), Robert Gotts (ACT Government: Chief Ministers, Treasury and Economic Development Directorate, AU)

The ACT Evidence and Evaluation Academy is a prominent and promising example of sustained central agency investment in evaluation capability building (ECB).

The Academy was launched in 2021 as a new initiative to improve the practice and culture of evidence-based decision-making in the ACT public sector. Its features include:
  • a competitive application process, requiring executive support and financial co-contribution
  • a series of in-person professional learning workshops where participants learn alongside colleagues from other Directorates
  • a workplace project, through which participants apply their learning, receive 1-1 coaching, solve an evaluation-related challenge in their work and share their insights back to the group
  • executive-level professional learning and practice sharing, for nominated evaluation champions in each Directorate
  • sharing of resources and development of evaluation communities of practice in the Directorates
  • an annual masterclass, which brings current participants together with alumni and executive champions.

Four years and over 100 participants later, the Academy is still going strong. There has been an ongoing process of evaluation and fine-tuning from one cohort to the next, with encouraging evidence of impact. This impact is seen not only for the individuals who have taken part but also for others in their work groups, including in policy areas where evaluation has not historically enjoyed much of a foothold.

The learning design of the Academy brings into focus a number of useful strategies - pedagogical, structural and otherwise - that other central agencies and line agencies may like to consider as part of their own ECB efforts.

The Academy story also highlights some of the exciting opportunities for positioning evaluation at the heart of innovation in the public sector, particularly in the context of whole-of-government wellbeing frameworks, cross-agency collaboration and strategic linkage of data sets to support place-based outcome measurement.

Chair
Speakers
Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
George Argyrous

Head of Measurement, Evaluation, Research, and Learning, Paul Ramsay Foundation
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Designing baseline research for impact: The SKALA experience
Friday September 20, 2024 12:00pm - 12:30pm AEST
Authors: Johannes Prio Sambodho (SKALA), Ratna Fitriani (SKALA, ID)

SKALA (Sinergi dan Kolaborasi untuk Akselerasi Layanan Dasar - Synergy and Collaboration for Service Delivery Acceleration) is a significant Australian-Indonesian cooperation that focuses on enhancing parts of Indonesia's extensive, decentralised government system to accelerate better service delivery in underdeveloped regions. As part of its End of Program Outcome for greater participation, representation, and influence for women, people with disabilities, and vulnerable groups, SKALA is commissioning baseline research focused on understanding multi-stakeholder collaboration for mainstreaming Gender Equality, Disability, and Social Inclusion (GEDSI) in Indonesia. The program has designed a mixed-method study consisting of qualitative methods to assess the challenges and capacity gaps of GEDSI civil society organisations (CSOs) in actively participating and contributing to the subnational planning and budgeting process, coupled with a quantitative survey to measure trust and confidence between the same CSOs and the local governments with whom they engage.

The paper first discusses the baseline study's design and its alignment with SKALA's strategic goals, and considers how the research might itself contribute to improved working relationships in planning and budgeting at the subnational level. Second, the paper discusses approaches taken by the SKALA team to design a robust programmatic baseline that is also clearly useful in program implementation. These include: a) adopting an adaptive approach that translates key emerging issues from grassroots consultations and the broader governmental agenda into research objectives; b) locating the study within a broader empirical literature to balance practical baseline needs with academic rigour; and c) fostering collaboration with the program implementation team to ensure the study serves both evaluation and programmatic needs.
Lastly, based on the SKALA experience, the paper will argue for closer integration of research and implementation teams within programs, which can support systems-informed methodologies, and will consider ways in which this can be practically accomplished.
Chair
Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre...
Speakers
Johannes Prio Sambodho

Research Lead, SKALA
Dr. Johannes Prio Sambodho is the Research Lead for SKALA, a significant Australian-Indonesian development program partnership aimed at improving basic service governance in Indonesia. He is also a former lecturer in the Department of Sociology at the University of Indonesia. His...
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Finding ways to empower multicultural survivors of violence through evaluation: strategies, learnings and reflections
Friday September 20, 2024 1:30pm - 2:00pm AEST
104
Authors: Lydia Phillips (Lydia Phillips Consulting), Jo Farmer (Jo Farmer Consulting)

As evaluators, we often work with people who have experienced trauma and/or marginalisation (whether we realise it or not!). We're also seeing increased recognition in government and community organisations of the importance of lived experience and cultural safety in program design, implementation and evaluation.

Beginning an evaluation with a clear plan for how you'll engage and empower people from diverse cultural backgrounds and people who have experienced trauma can help to ensure success - both of your project and of participants' experience.

So how can you design an evaluation framework to recognise diverse cultural backgrounds and empower survivors of violence?

And how can evaluators who don't have lived experience, or who don't identify with those cultural backgrounds, best navigate the design process?

This session will share strategies, learnings and reflections from a project working with a multicultural family violence service to develop a culturally-safe, trauma-informed evaluation framework for a two-year program.

It will:
  • explore what worked well and what was challenging in the project
  • discuss similarities and differences in the concepts of culturally-safe and trauma-informed practice, drawing on current literature; and
  • pose questions and provide suggestions for evaluators who want to develop their skills in culturally safe and trauma-informed evaluation practice.

The session will offer key tips and strategies that are translatable to other contexts and conclude with reflective questions for attendees.

Chair
Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development...
Speakers
Lydia Phillips

Principal Consultant, Lydia Phillips Consulting
I operate an independent consulting practice, providing evaluation and social policy services to community organisations and government. With a background in law and social policy, I have more than 15 years' experience building and using evidence in order to create positive social...
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Measuring Impact Through Storytelling: Using Most Significant Change to evaluate the effectiveness of QHub for LGBTIQA+ young people.
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Gina Mancuso (Drummond Street Services), Arielle Donnelly (Drummond Street Services, AU)

LGBTIQA+ young people experience discrimination and marginalisation which contribute to poorer mental and physical health outcomes, compared to the general population. QHub is an initiative that creates safe spaces, offers mental health and well-being services, and provides outreach tailored for LGBTIQA+ young people in Western Victoria and the Surf Coast. QHub provides LGBTIQA+ young people and their families/carers with welcoming, inclusive and integrated support, as well as opportunities to connect with peers and older role models. This presentation will outline how the collection and selection of stories of change (Most Significant Change) is helping us evaluate the impact of QHub.
Chair
Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and...
Speakers
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Squaring up with rubrics
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Josh Duyker (Centre for Evaluation and Research Evidence, Victorian Department of Health)

Much like Felix the Cat, evaluators have a bag of tricks to get us out of sticky situations. But when you are staring face to face with a complex evaluand, juggling tricky stakeholders whilst sitting on a mountain of data, it's not always clear which 'trick' you need! One potential solution is the colourful, yet humble rubric. In this reflective practice ignite presentation, I will guide you through our journey of using rubrics as a tool to wayfind through an evaluation, and our key takeaways on how rubrics can support evaluators to make comprehensive and balanced evaluative judgements.
Chair
Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and...
Speakers
Josh Duyker

Evaluation and Research Officer, Centre for Evaluation and Research Evidence
I am an emerging evaluator, currently working at the Centre for Evaluation and Research Evidence in the Victorian Department of Health. I've completed a Master of Public Health and am embarking on a Master of Evaluation. Through roles in the not-for-profit sector and my studies...
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Voices of the Future: Elevating First Nations Leadership in the Evolution of Educational Excellence
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Skye Trudgett (Kowa), Sharmay Brierley (Kowa, AU)

This ignite presentation will delve into a pioneering evaluation within the education sector, where a series of education initiatives were designed and implemented by Aboriginal Community Controlled Organisations (ACCOs) and mainstream education partners to uplift and support young First Nations peoples. We will uncover how the initiative's evaluation framework was constructed with First Nations communities at its heart, applying the reimagining evaluation framework, utilising diverse data collection methods and producing Community Reports that reflect First Nations experiences and voices.

Attendees will be guided through the evaluative journey, showcasing the incorporation of wisdom to demonstrate the profound value of community-delivered initiatives that contribute to change. The session will highlight the success stories and learnings, emphasising how this approach not only benefits the current generation but also lays the groundwork for the prosperity of future generations.
Chair
Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and...
Speakers
Sharmay Brierley

Consultant, Kowa Collaboration
Sharmay is a proud Yuin woman and project lead at Kowa with prior experience supporting First Nations peoples across human services sectors. As a proud First Nations woman, and through lived experience, Sharmay has a strong understanding of the many challenges faced by First Nations...
Skye Trudgett

CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application...
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Introducing a trauma-informed AI assessment tool for evaluators of AI (artificial intelligence) assisted programs and services.
Friday September 20, 2024 2:00pm - 2:30pm AEST
104
Authors: Lyndal Sleep (Central Queensland University)

AI and other forms of digital technology are increasingly being used in program and service delivery. They promise increased efficiency, accuracy and objectivity; however, these technologies can also cause significant harm and trauma, as seen in Robodebt. It is vital for evaluators to be aware of the key questions to ask to prevent AI having unintended impacts on program processes, outputs and outcomes, or causing harm to service users.

Objective
This session aims to support the evaluation of AI-assisted programs and services by introducing evaluators to a new and innovative trauma-informed AI assessment tool.

Core argument
(1) AI is increasingly being used in programs and services, and understanding the resulting risks is essential for evaluators to assess whether services are meeting intended outcomes.
(2) Many evaluators are unaware of what types of risks to look for when assessing AI-assisted services, or what questions to ask, especially when conducting trauma-informed evaluations.
(3) A practical trauma-informed AI assessment tool has been developed by researchers from [Universities omitted], with funding from [omitted], to address this need; it will be briefly introduced.

A short paper session will:
(1) Highlight the problem that AI is increasingly being used to assist program and service delivery, but many evaluators are unaware of the main risks to consider when evaluating these services.
(2) Suggest, as a solution, a practical tool that considers these risks, combining technological knowledge with a trauma-informed framework, and that can be employed by evaluators.
(3) Introduce the trauma-informed AI assessment tool, the method used to develop it, and its intended practical use by evaluators (both internal and external to organisations).

There will be 10 minutes for questions and discussion at the end of the presentation.

Chair
Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development...
Speakers
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia
 