Conference hashtag #aes24MEL
Thursday, September 19
 

8:30am AEST

Plenary: Elizabeth Hoffecker "Wayfinding tools for learning and evaluation in complex systems" (followed by panel)
Thursday September 19, 2024 8:30am - 10:00am AEST
Keynote address: Elizabeth Hoffecker 8:30-9:30, followed by plenary panel 9:30-10:00

Lead Research Scientist, Local Innovation Group, Massachusetts Institute of Technology (MIT), USA

What does a wayfinding approach look like when seeking to learn from and evaluate interventions into complex systems? 

Many of the most intractable challenges facing communities around the world are system challenges requiring system-level responses. Development-focused donors and implementers at various levels are recognizing this and funding system-strengthening and systems-change work across a variety of systems. Monitoring, evaluation, and learning work, however, has traditionally been focused at the project level, not the level of the dynamic local systems in which projects operate. A new kind of evaluation is needed for this work and is in the early stages of being developed, tested, and improved through learning-by-doing.

In forums such as the UNDP’s M&E Sandbox and the BMGF-funded Systems Monitoring, Learning, and Evaluation initiative, development donors, implementers, and evaluators are asking questions such as: "What evaluation designs and approaches are most suitable for learning from and evaluating system- and portfolio-level interventions?" and "How do we know if we are making progress, generating results, and contributing to positive change in a complex system?"
Drawing on experience implementing "complexity-aware" evaluations of system-change interventions in Northern India and Guatemala, this session develops and explores responses to these questions. The presentation shares an evaluation approach and six related tools that are being used to evaluate, learn, and implement adaptively in these two very different system contexts. The tools, while humble and likely familiar, can become powerful wayfinding instruments for navigating complexity when combined with a systems-informed evaluation design. This session introduces this approach through a keynote presentation and then further develops it through an interactive panel with systems-informed evaluators working both internationally and domestically in Australia.

Panel: Exploring learning and evaluation tools for complex systems, 9:30-10:00
 
Panel: Elizabeth Hoffecker, Matt Healey, Donna Loveridge, Tony Kiessler
Chair: Jess Dart

Elizabeth Hoffecker will be joined by a panel to explore how the learning and evaluation tools she presents in the keynote address are being applied across different contexts.


Chair
Jess Dart
Chief Evaluator and Founder, Clear Horizon Consulting
Dr Jess Dart is the founder and Chief Evaluator of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australian Evaluation Society (AES), Jess is a recognised leader with over 25 years of...
Speakers
Matt Healey
Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have: co-founded and grown an organisation (First Person Consulting) to a team of 16 people working...
Donna Loveridge
Impact strategy and evaluation consultant
I work with public sector and not for profit organisations and businesses to design and conduct evaluations and embed evaluative thinking in management systems and processes to strengthen learning and decision-making. Most of my work focuses on inclusive economic growth through impact...
Elizabeth Hoffecker
Lead Research Scientist, Local Innovation Group, Massachusetts Institute of Technology (MIT), USA
Elizabeth Hoffecker is a social scientist who researches and evaluates processes of local innovation and systems change in the context of addressing global development challenges. She directs the MIT Local Innovation Group, an interdisciplinary research group housed at the Sociotechnical...
Tony Kiessler
Co-Convener, First Nations Connect
Tony is a Central Arrernte man, consultant and researcher living and working on beautiful Gundungurra Country in the NSW Southern Highlands. He is an evaluation, strategic planning and research consultant with a particular interest in health, human development and social inclusion...
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Evaluating capacity building for sustainability scientists: Pathways for early career researchers
Thursday September 19, 2024 10:30am - 11:00am AEST
104
Title: Evaluating sustainability science capacity building: Pathways for early career researchers

Authors: Lisa Walker (CSIRO)

In the quest for sustainable solutions to pressing global challenges, transdisciplinary approaches, which integrate insights and methods from various fields, are increasingly recognised as key to driving change. This presentation will share insights from an evaluation of a five-year, $18 million sustainability science research program seeking not only to drive innovation but also to empower the next cohort of early career researchers (ECRs) to coordinate action across research, policy and practice to address complex sustainability problems.
Our formative, mixed-methods evaluation highlighted on-the-job learning, face-to-face engagement and networking as pivotal in building sustainability science capacity. We also found that targeted recruitment, research team engagement and the provision of support and resources to supervisors are essential, and sometimes overlooked, components. This work contributes to the broader discussion on how evaluation can enhance the development of sustainability science, proposing a framework that emphasises the individual, team and institutional support mechanisms necessary for effective ECR capacity building.
Novelty in our approach lies in the integration of evaluative practices within the capacity-building process, offering a reflective lens on how transdisciplinary endeavours can be optimised to address sustainability challenges. This is particularly relevant for evaluators wanting to build their own skills, or those of others, to engage on complex sustainability issues. The study also underscores the significance of adaptive learning and evaluation in navigating the complexities of sustainability science, inviting a broader conversation on how evaluation can be leveraged to facilitate meaningful contributions to societal and planetary well-being.
Chair
Su-Ann Drew
Management Consultant (Manager), Grosvenor
Speakers
Lisa Walker
CSIRO
I am a social scientist with a background in program monitoring, evaluation and sustainable development. I am currently working with CSIRO's Valuing Sustainability Future Science Platform (VS FSP) and manage the Monitoring, Evaluation, Learning and Research project within the VS...
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

National impact, regional delivery: Robust M&E for best practice Australian horticulture industry development
Thursday September 19, 2024 10:30am - 11:00am AEST
Authors: Ossie Lang (RMCG), Donna Lucas (RMCG), Carl Larsen (RMCG), Zarmeen Hassan (AUSVEG), Cherry Emerick (AUSVEG), Olive Hood (Hort Innovation)

How do you align ten regionally delivered projects with differing focus topics to nationally consistent outcomes? Take advantage of this opportunity to explore the journey of building and implementing a robust Monitoring and Evaluation (M&E) program that showcases regional nuances and aligns national outcomes, making a significant contribution to the success of this horticultural industry extension project.

Join us for an insightful presentation on how a national vegetable extension project, focused on adoption of best management practices on-farm, has successfully implemented a dynamic M&E program. Over the two and a half years of project delivery, the national M&E manager, in collaboration with ten regional partners, has crafted a program that demonstrates regional impact consistently on a national scale and adapts to the project's evolving needs.

The presentation will highlight the team's key strategies, including the upskilling of Regional Development Officers in M&E practices. Learn how templates and tools were designed to ensure consistent data collection across approximately 40 topics. The team will share the frameworks utilised to capture quantitative and qualitative monitoring data, providing a holistic view of tracking progress against national and regional outcomes and informing continuous improvement in regional delivery.

Flexibility has been a cornerstone of the M&E program, allowing it to respond to the changing needs of growers, industry, and the funding partner and seamlessly incorporate additional data points. Discover how this adaptability has enhanced the project's overall impact assessment and shaped its delivery strategy.

The presentation will not only delve into the national perspective but also feature a firsthand account from one of the Regional Development Officers. Gain insights into how the M&E program has supported their on-the-ground delivery, instilling confidence in providing data back to the national project manager. This unique perspective offers a real-world understanding of the national program's effectiveness at a regional level.
Speakers
Ossie Lang
Consultant-Regional Development Officer, RMCG
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Navigating organisational turbulence: An evaluation-based strategic learning model for organisational sustainability
Thursday September 19, 2024 10:30am - 11:00am AEST
103
Authors: Shefton Parker (Monash University), Amanda Sampson (Monash University)

Increasingly turbulent and rapidly changing global operating environments are disrupting institutions' plan implementation and strategy realisation. The session introduces a novel organisational collaborative strategic learning and effectiveness model, intended to bolster organisational resilience amidst such turbulence.
A scarcity of suitable organisational strategic learning systems-thinking models that utilise evaluation methodology in a joined-up way prompted the presenters to develop a model. The model is tailored for strategic implementation in a complex organisational system environment, operating across decentralised portfolios with multiple planning and operational layers. It amalgamates evaluation methodologies to identify, capture, share and respond to strategic learning in a complex system. It is hypothesised that the model will outperform conventional organisational performance-based reporting systems in terms of organisational responsiveness, agility, adaptability, collaboration, and strategic effectiveness.
The presentation highlights the potential value of integrating and embedding evaluation approaches into an organisation's strategy, governance and operations using a three-pronged approach:
- Sensing: Gathering relevant, useful, timely data (learning);
- Making sense: Analysing and contextualising learning data alongside other relevant data (institutional performance data, emerging trends, policy and legislative reform, etc.); and
- Good sense decisions: Providing timely and relevant evaluative intelligence and insights to support good, evidence-based decision-making.
The presenters advocate for a shift from viewing evaluation use as a 'nice to have' to a 'must have' aspect of organisational growth and sustainability. The model aims to foster a leadership culture where decision makers value the insights that contextualised, holistic organisational intelligence can provide for:

i) Strategic planning: Enhanced planning and strategic alignment across portfolios;

ii) Operational efficiency: Reducing duplication in strategic effort and better collaboration towards strategic outcomes;

iii) Business resilience and sustainability: Improved identification and quicker response to emerging opportunities and challenges; and

iv) Strategic effectiveness: Informing activity adaptation recommendations for strategic goal realisation.
Chair
Michael Amon
Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Speakers
Shefton Parker
Senior Evidence & Evaluation Adviser, Monash University - Institutional Planning
Dr Shefton Parker is an evaluator and researcher with over 15 years of specialist experience in program and systems evaluation within the Vocational and Higher Education sectors. Recently, his evaluations of innovative education programs were referenced as evidence in the University...
Amanda Sampson
Senior Manager, Institutional Planning, Monash University
I am leading the development and implementation of an Institutional Evaluation Model within a complex organisation to support organisational resilience, strategic adaptation and execution to realise the 10-year organisational strategic objectives. I am interested in learning how to...
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

An evaluator in need of an evaluation
Thursday September 19, 2024 10:30am - 11:30am AEST
105
Authors: Dana Cross (Grosvenor), Kristy Hornby (Grosvenor)

"If all you have is a hammer, then everything looks like a nail." - Maslow/Kaplan/Unknown

Maslow's Hammer (aka the law of the instrument or golden hammer) and déformation professionnelle are concepts that speak to cognitive biases that can limit our effectiveness. Essentially, they mean that we use what we know and as evaluators, that is evaluation.

How can we as evaluators and commissioners of evaluations avoid cognitive bias and work effectively within (evaluation) policy parameters to ensure we are adding value and not using evaluation as the only tool in our toolbox?

We invite you to join us in a fast-paced interactive session to unpack:
  • the ways in which our expertise can get in our way
  • what it means to stay open to other tools as evaluation professionals and commissioners of evaluation
  • how this challenges us as individuals and as a profession.
Chair
Charlie Tulloch
Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Speakers
Dana Cross
Associate Director, Grosvenor
Dana is a public sector expert, possessing over 17 years of deep experience advising government organisations on program evaluation, organisational review, service optimisation and performance management. She is a member of Grosvenor’s Executive Leadership Team as Head of Strategy...
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Navigating the maze of causality: Understanding the relationship between carcinogenic betel nut consumption and learning outcomes
Thursday September 19, 2024 10:30am - 11:30am AEST
106
Authors: Kabira Namit (Abt Global), Kathryn Lee (Abt Global, AU)

This hands-on session is designed to strengthen emerging evaluators' intuition about causality in non-experimental evaluations.

In environments where conventional RCTs are unethical or unfeasible, identifying causal factors within the complex weave of societal factors and individual behaviours presents a significant challenge. Centred on a novel research project from Papua New Guinea, this session navigates this maze through real-world research exploring the intricate relationship between consumption of carcinogenic betel nut and educational outcomes. By focusing on this specific case study, we provide a concrete context for participants to understand the broader implications of causal explorations in fragile and sensitive settings.

Participants will actively engage in small group discussions in a collaborative learning environment where they can practice and refine their skills in causal evaluation by discussing scenarios that are reflective of real-world complexities.

This session aims to move beyond simply documenting correlations, encouraging a deep dive into the underlying dynamics of causal linkages. Through this exploration, we aim to eventually guide participants to discussions on pathways for targeted interventions and policy formulations which take causal chains into account.

Additionally, we aim to spark dialogue on the ethical dimensions of 'activist research,' exploring how evaluators can navigate moral dilemmas while advocating for meaningful change. This hands-on session not only seeks to build evaluative skills but also to inspire participants to consider the broader implications of their work on societal well-being and ethical research practices.
Chair
Mary Ann Wong
Research Specialist, California State University, Sacramento
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Commissioning evaluations - finding the way from a transactional to a relational approach
Thursday September 19, 2024 10:30am - 12:00pm AEST
Authors: Eleanor Williams (Australian Department of Treasury), Josephine Norman (Victorian Department of Health, AU), Melissa Kaltner (Lumenia, AU), Skye Trudgett (Kowa Collaboration, AU), George Argyrous (Paul Ramsay Foundation, AU), Luke Craven (National Centre for Place-Based Collaboration (Nexus), AU)

Delivering great evaluations requires a strong professional relationship between those commissioning and delivering the evaluation, as well as all relevant stakeholders.

Traditional evaluation commissioning approaches have tended to treat evaluation as a one-off exchange focusing on the completion of pre-defined tasks. However, the evolving landscape of policies and programs tackling complex issues demands a more nuanced and relational approach to get the most out of the journey of evaluation.

This big room panel session brings together speakers who are at the forefront of thinking around collaborative commissioning partnerships from the perspectives of government, not-for-profit and Indigenous-led organisations, and the private sector, who can play the full suite of roles on the commissioning journey. The discussion will delve into the experiences of a range of organisations involved in commissioning who are seeking to build enduring relationships, and in some cases partnerships, between the commissioners, the evaluators and the stakeholders to whom we are accountable.

Drawing on real-world case studies and empirical evidence, the discussion will highlight the challenges and rewards of transitioning from a transactional model to a relational model. It will explore how this paradigm shift can enhance collaboration and ultimately lead to a range of positive outcomes.

Attendees will be invited to engage in dialogue with the panel, bringing the collective wisdom of attendees together to consider how the destination of better commissioning relationships would look, the practical obstacles we face on our pathway, and how we can reach our destination. To facilitate this active discussion, attendees will have the opportunity to use Sli.do throughout the session to provide input on key questions, share experiences in real time and ask questions of the expert panel.
Chair
Vanessa Hood
Associate Director, Rooftop Social
I've been working as a facilitator and evaluator for over 20 years, in a wide range of contexts, including horticulture, sustainability and financial literacy. Duncan Rintoul and I run Rooftop Social, which provides consulting services in evaluation, social research, facilitation...
Speakers
Eleanor Williams
Managing Director, Australian Centre for Evaluation
Eleanor Williams is a public policy, research and evaluation professional with 20 years' experience working with the public sector. She is the Managing Director of the Australian Centre for Evaluation and established the Australian Public Sector Evaluation Network in 2019. Eleanor...

Skye Trudgett
CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application...

George Argyrous
Head of Measurement, Evaluation, Research, and Learning, Paul Ramsay Foundation

Josephine Norman
Director, Centre for Evaluation and Research Evidence, Dept of Health/Dept of Families, Fairness and Housing
I run a large internal evaluation unit, directing a team of 30 expert evaluators and analysts to: directly deliver high priority projects; support program area colleagues to make the best use of external evaluators; and build generalist staff capacity in evaluation principles and...

Luke Craven
Independent Consultant

Kristy Hornby
Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and...
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Learn, evolve, adapt: Evaluation of climate change and disaster risk reduction programs
Thursday September 19, 2024 11:00am - 11:30am AEST
104
Authors: Justine Smith (Nation Partners)

There is a pressing need to reduce the risks associated with climate change and the disasters that are likely to increase as a result. Along with the need to take action comes the need to show we are making a difference, or perhaps more importantly, the need to learn and evolve to ensure we are making a difference. However, when operating in an ever-changing, uncertain environment, with layers of complexity and outcomes that may not be realised for some time, or until disaster strikes, evidence of impact is not always easy to collect, nor always a priority.

Drawing on experience developing evaluation frameworks and delivering evaluation projects in the areas of climate change and disaster and emergency management, I will present some of the challenges and opportunities I have observed. In doing so, I propose that there is no 'one way' to do things. Rather, taking the time to understand what we are evaluating and to continually learn, evolve and adjust how we evaluate is key. This includes having clarity on what we really mean when we are talking about reducing risk and increasing resilience. Ideas I will explore include:
  • The concepts of risk reduction and resilience.
  • The difference between evaluation for accountability and for genuine learning and improvement.
  • Balancing an understanding of and progress towards big picture outcomes with project level, time and funding bound outcomes.
  • The challenge and potential benefits of event-based evaluation to learn and improve.

Evaluation has the capacity to contribute positively to action taken to reduce climate change risks and improve our management of disasters and recovery from disasters. As evaluators we too need to be innovative and open-minded in our approaches, to learn from and with those working directly in this space for the benefit of all.
Chair
Su-Ann Drew
Management Consultant (Manager), Grosvenor
Speakers
Justine Smith
Principal, Nation Partners
With a background spanning research, government, non-government organisations and consulting, Justine brings technical knowledge and over 10 years of experience to the projects she works on. As a highly experienced program evaluator and strategic thinker, Justine has applied her skills...
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Culturally inclusive evaluation with culturally and linguistically diverse communities in Australia
Thursday September 19, 2024 11:00am - 11:30am AEST
Authors: Lena Etuk (CIRCA Research, AU)

In this presentation we will outline an approach to culturally inclusive evaluation with people from culturally and linguistically diverse backgrounds in Australia, its strengths, and its growth opportunities. This approach fills a critical gap in the way evaluation and research with culturally and linguistically diverse communities is traditionally conducted in Australia.

In this presentation we will explain how the Cultural & Indigenous Research Centre Australia (CIRCA) conducts in-culture and in-language evaluation with diverse cohorts of Australians, and how this practice fits within the broader methodological discourse in evaluation and social science more broadly. We will illustrate how our culturally inclusive methodology is put into practice with findings from CIRCA's own internal research into the way cultural considerations shape our data collection process. We will conclude with reflections on how CIRCA might further draw on and leverage standpoint theory and culturally responsive evaluation as this practice is further refined.

Our key argument is that doing culturally inclusive evaluation is a process that requires reflexivity and learning, alongside strong and transparent institutional processes. Combining these approaches creates systemic ways of acknowledging and working within the stratified and unequal social systems inherent to any research. Our findings will advance knowledge within the field of evaluation about how to engage and represent culturally and linguistically diverse community members across Australia.
Speakers
Lena Etuk
Director, Research & Evaluation, Cultural & Indigenous Research Centre Australia
I’m an applied Sociologist with 16+ years of experience in evaluation and social research. At CIRCA I lead an amazing team of research consultants from a huge range of diverse backgrounds. We specialise in qualitative evaluation and research with non-English speaking CALD and Aboriginal...
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Our journey so far: a story of evaluation to support community change in South Australia
Thursday September 19, 2024 11:00am - 11:30am AEST
103
Authors: Penny Baldock (Department of Human Services South Australia), Jessie Sleep (Far West Community Partnerships, AU)

The multi-jurisdictional South Australian Safety and Wellbeing Taskforce is the lead mechanism, and the accountable body, for developing strategies and sustainable, place-based responses that ensure the safety and wellbeing of remote Aboriginal Visitors in Adelaide and other regional centres in the State.

This presentation discusses the challenges of establishing an evaluative learning strategy for the Taskforce that meets the needs of multiple government agencies and stakeholders, multiple regional and remote communities, and multiple nation groups.

In a complex system, this is a learning journey, requiring us to adapt together to seek new ways of understanding and working that truly honour the principles of data sovereignty, community self-determination, and shared decision-making.
As we begin to more truly centre communities as the locus of control, and consider the far-reaching reform that will be necessary to deliver on our commitments under Closing the Gap, this presentation provides an important reflection on the skills, knowledge and expertise that will be required to build evaluation systems and processes that support change.

One of the most exciting developments to date has been the establishment of a multi-agency data sharing agreement, which will enable government data to be shared with Far West Community Partnerships, a community change organisation based in Ceduna, and combined with their community owned data in order to drive and inform the Far West Change Agenda.

We present the story of our journey so far, our successes and our failures, and extend an invitation to be part of the ongoing conversation to support the change required for evaluation success.

Chair
Michael Amon
Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Speakers
Penny Baldock
Department of Human Services
Jessie Sleep
Chief Executive, Far West Community Partnerships
Jessie is an innovative thinker and strategist, emerging as a leader in her field, redefining the role of strategic implementation with monitoring and evaluation. With the fast paced growth of the social impact lens in Australia, Jessie is part of the new generation of strategic leaders...
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Bringing the "human" into measurement: From in-depth inquiry to systemic change
Thursday September 19, 2024 11:30am - 12:00pm AEST
104
Authors: Julia Suh (Tobias)

Humans are complex and diverse. To create social change, what do we need to understand about them?

Their behaviours and mindsets are key, but the broader context and systems they operate within paint a fuller picture of the multiple moving parts that need to change simultaneously for sustained impact. These changes can be mapped, with embedded evaluative thinking, building a pathway for formal evaluation.

In this session, experts in Human-Centred Design and social change share their innovative approaches to thinking beyond project- or program-level goals or organisational-level performance indicators. Examples are drawn from direct experiences working across various transformation projects, from reducing child sexual exploitation and preventing academic misconduct to improving the care economy and elevating patient outcomes. They demonstrate not only how program goals and a social change vision can be realised together, but also how a combination of strategic prioritisation, collaboration, capability building and networks can accelerate the process.
Chair
Su-Ann Drew
Management Consultant (Manager), Grosvenor
Speakers
Julia Suh
Principal, Tobias

Jessica Leefe
Principal, Tobias
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

"Nothing about us, without us": Developing evaluation framework alongside victim-survivors of modern slavery using representative participatory approaches
Thursday September 19, 2024 11:30am - 12:00pm AEST
Authors: Ellie Taylor (The Salvation Army)

Amplifying survivor voices has been the cornerstone of The Salvation Army's work in the anti-slavery realm. How does this translate to the monitoring and evaluation space? How do we truly represent the voices and experiences of those with lived experience of modern slavery in monitoring and evaluation, whilst aligning with key human rights principles?

Our Research Team is exploring how to centre survivor voices in the evaluation space. This session will detail the use of a representative participatory evaluation approach to monitor and evaluate the Lived Experience Engagement Program (LEEP) for survivors of criminal labour exploitation. We will explore the challenges and learnings uncovered through this project.

The LEEP is designed to empower survivors of criminal labour exploitation to share their expertise to make change. Piloted in 2022-2023, and continuing into 2024-2025, the LEEP - and resulting Survivor Advisory Council - provides a forum for survivors to use their lived experience to consult with government - to assist in preventing, identifying and responding to modern slavery.

The key points explored in this session will include:
  • Realities of implementing an adaptive model, including continuous integration of evaluation findings into an iterative survivor engagement model.
  • The importance of stakeholder inclusivity, integrating lived experience voices and amplifying them alongside program facilitators and government representatives.
  • Complexities of evaluation in the modern slavery space, particularly when victim-survivors of forced marriage are included. We will speak to the need for trauma-informed, strengths-based measures and facilitating partnerships with the people the program serves.

Leading the session will be The Salvation Army's project lead, who has a PhD in mental health and over 12 years of experience working with diverse community groups in Australia and internationally, and extensive experience presenting at conferences both domestically and internationally.
Speakers
Ellie Taylor
Senior Research Analyst, The Salvation Army
Ellie has a background in mental health and has spent 12+ years designing and conducting research and evaluation initiatives with diverse communities across Australia and internationally. In this time, she's worked with people from all walks of life, across the lifespan, from infants...
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Navigating complex government reforms: a tool to examine program theory. How complete and relevant is government program theory?
Thursday September 19, 2024 11:30am - 12:00pm AEST
105
Authors: Nerida Rixon

Developing program theory for complex government reforms and charting the 'how' we will reach our destination is not easy. Perhaps you, like me, rely on well-tested templates? Do we challenge them and continually refine them to reflect emerging research and make them more useful for our purposes?

This research is about public policy packages and reforms and the program theories (or reform theories) that explain them. This research emerges from a desire to build program theory better, particularly in the context of whole of government reforms. Better program theory can drive better planning, monitoring and evaluation of performance, and better policy and public good.

Evidence shows Australian governments are not effectively planning, monitoring and evaluating performance of programs and policy packages. Theory can support development of meaningful performance indicators to track progress. Without strong program theory and clear strategy, as the Productivity Commission's recent 'Review of the National Agreement on Closing the Gap' study report suggests, we risk a 'spray and pray' approach to change, prioritisation of the wrong things and siloed policy responses.

A literature-informed checklist to analyse program theory for completeness and relevance to public administration is provided. Policy makers and evaluators are given a tool and lens to build more complete and relevant program theory and to improve existing program theory.

Analysis of program theory in 15 government reform strategies and outcomes frameworks is presented to show governments' strengths and opportunities. Governments are strong at identifying our destination, or the intended outcomes, though not always at identifying the 'how' we will get there. Governments could improve their program theory by making it more explicit, and more complete by articulating the 'when' we expect to see changes from implementing the reforms. Government program theory might be more relevant if potential (unintended) outcomes were referenced.
Chair
Charlie Tulloch
Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Social Impact Measurement & Evaluation – the similarities & differences that complement our journey to more fit-for-purpose destinations.
Thursday September 19, 2024 11:30am - 12:30pm AEST
106
Authors: Laura Glynn (SIMNA)

The measurement space has seen many new actors, terms, approaches and "gold standards" emerge in the last two decades. It has become more difficult than ever to navigate and explore our intended destination in the space of measurement and evaluation. What schools of thought are worth exploring? What value do they offer to an existing evaluation skillset? We are also traversing heightened levels of complexity, amid cost-of-living, environmental and social-fabric crises. In this busy and crowded environment, this Social Impact Measurement Network (SIMNA) led panel will seek to explore the similarities and differences between evaluation and social impact measurement (SIM) as mindsets to help steer us towards our destination.

The panel will involve three speakers from diverse sectoral backgrounds, spanning the government, not-for-profit and private spheres, all commenting (broadly) on the questions: Are evaluation and social impact measurement the same? To what extent do they differ? How can they complement one another? While the questions themselves will be more nuanced than that, the answers will hold broad value for attendees in considering how they can bring complementary approaches and mindsets to navigating the work they do in measurement and evaluation. The panellists will draw on their unique perspectives across different sectoral and practice spaces to discuss this complementarity.
Chair
Mary Ann Wong
Research Specialist, California State University, Sacramento
Speakers
Caitlin Barry
Principal Consultant, Caitlin Barry Consulting
Caitlin has extensive experience in monitoring and evaluation, and holds a Masters of Evaluation (First Class Honours) from the University of Melbourne and an Environmental Science Degree (Honours) from James Cook University. The focus of Caitlin's presentation will be from her work...

Elliott Tester
Board Member & Assistant Director of Strategic Evaluation, SIMNA & NDIA

Gerard Atkinson
Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in: program and policy evaluation; workshop and community facilitation; machine learning and AI; market and social research; financial and operational modelling; and non-profit, government and business strategy. I am also a board member...

Paula Simões dos Santos
Senior Evaluation Advisor, Department of Primary Industries and Regional Development

Sandra Opoku
Senior Manager Evaluation and Social Impact, Relationships Australia Victoria
My role leads impact, evidence and innovation activities at Relationships Australia Victoria. These activities contribute to achieving strategic objectives and improving outcomes for individuals, families and communities. This now also includes oversight of several key prevention...
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia
  Journey

11:30am AEST

A new tool for participatory evaluation: A case study of the process of conducting online workshops with young creators with disabilities to tell stories using virtual reality animation
Thursday September 19, 2024 11:30am - 12:30pm AEST
103
Authors: Samantha Abbato (Visual Insights People), Lisa Stafford (University of Tasmania, AU)

Researchers from fields including public health, community, and disability have been utilising film methods such as participatory video and digital storytelling for decades. Co-creation of film narratives for evaluation can engage participants as unique people with lived experiences connected to social and cultural worlds and relationships, including their connection with the community. Evaluation, however, has been reluctant to adopt participatory film methods.

Virtual Reality (VR) animation presents a distinctly participatory approach to evaluation data collection and a new lens for communicating findings. It places the participant in the driver's seat and the evaluation audience in the passenger seat alongside them. Using VR stories can increase the potential for the intended audience, including decision-makers, to engage deeply with the information communicated, through focused immersion in participant stories and the familiarity of local settings.

We present a case study examining the process of collaborating with young people with disabilities to tell their stories of inclusion in Tasmania, Australia. Three young people participated in online storyboarding and script-writing workshops over twelve months to develop short stories of everyday experiences in their community. An introduction to the participants and their stories, the three completed stories, and a collaborative call to action were made into a set of five connected VR short films. The films were displayed as a report on a website page and for viewing as a complete VR story on a headset.

The presenters examine the process of applying this new VR digital storytelling approach to participatory evaluation. The challenges and benefits of the approach for participants and its impact on the intended audience, including urban planning and design students, are discussed. Using the lessons learned from the case study, recommendations for evaluators considering using participatory digital storytelling and VR animation are made.
Chair
Michael Amon
Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Speakers
Samantha Abbato
Director, Visual Insights People
My twenty-plus years of evaluation experience are built on academic training in qualitative and quantitative disciplines, including mathematics, health science, epidemiology, biostatistics, and medical anthropology. I am passionate about effective communication and evaluation capacity-building...
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Systems evaluation to the rescue!: How do we use systems evaluation to improve societal and planetary wellbeing?
Thursday September 19, 2024 12:00pm - 12:30pm AEST
104
Authors: Kristy Hornby (Grosvenor), Tenille Moselen (First Person Consulting)

Systems evaluation: many might have heard the term, but few have done one. This session shares two case studies of different systems evaluations and the learnings from these, to benefit other evaluators who are conducting or about to begin a systems evaluation.

The session will open with an overview and explanation of what systems evaluation is, in terms of its key features and how it is distinguished from other forms of evaluation. The presenters will then talk through their case studies, one of which centres on the disability justice system in the ACT, while the other takes a sector-wide focus across the whole of Victoria. The co-presenters will share openly and honestly their initial plans for commencing the systems evaluations, how they had to amend those plans in response to real-world conditions, and the tips and tricks and innovations they picked up along the way.
Chair
Su-Ann Drew
Management Consultant (Manager), Grosvenor
Speakers
Kristy Hornby
Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and...

Tenille Moselen
First Person Consulting
https://www.fpconsulting.com.au/our-team.html
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Optimising Evaluations of Wellbeing Programs in Schools
Thursday September 19, 2024 12:00pm - 12:30pm AEST
105
Authors: Tamara Van Der Zant (Australian Council for Educational Research), Katherine Dix (Australian Council for Educational Research, AU)

In this presentation we will discuss the diverse and flexible data collection methods suitable for program evaluation in the context of schools. We will discuss the types of evidence that can be used to evaluate social and emotional learning programs and wellbeing initiatives, specifically, and considerations when working with educators, children and young people. We will invite all to participate in discussions about challenges to the evaluation of these programs in complex, real-world contexts (including data quality, confounding factors, system requirements, etc.) and propose methods we use to navigate these challenges.

Optimising program evaluation methods is important because of the ever-growing number of wellbeing programs being offered to schools. Accordingly, the need for high quality evaluation to guide funding decisions and use of programs and initiatives to support student and educator wellbeing in schools has never been greater.

By drawing on comprehensive experience in undertaking wellbeing program evaluations, this presentation will share our lessons learnt and recommendations that should support evaluators in crafting contextually appropriate evaluations. High quality program evaluations, often a requirement for ongoing funding, address the growing need for meaningful and accessible evidence that is currently being sought by schools, educators, funders, and policy decision makers.
Chair
Charlie Tulloch
Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Speakers
Tamara Van Der Zant
Research Fellow, Australian Council for Educational Research
Tamara is a Research Fellow in the Educational Monitoring and Research Division at ACER. Prior to this role she completed her PhD in emotion research at the University of Queensland. She brings experience in research design, conducting research with diverse populations, broad data...
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Getting to the value add: Timely insights from a realist developmental evaluation
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Authors: Phillip Belling (NSW Department of Education), Liam Downing (NSW Department of Education, AU)

This paper is aimed at early career and experienced evaluators interested in realist evaluation, but with concerns about the time a realist approach might take. The authors respond to this concern with an innovative blending of realist and developmental evaluation. Participants will exit the room with a working understanding of realist developmental evaluation, including its potential for adaptive rigour that meets the needs of policy makers and implementers.

Realist evaluation is theoretically and methodologically robust, delivering crucial insights about how, for whom and why interventions do and don't work (House, 1991; Pawson & Tilley, 1997; Pawson, 2006). It aims to help navigate unfamiliar territory towards our destination by bringing assumptions about how and why change happens out in the open.

But even realism's most enthusiastic practitioners admit it takes time to surface and test program theory (Marchal et al., 2012; van Belle, Westhorp & Marchal, 2021). And evaluation commissioners and other stakeholders have understandable concerns about the timeliness of obtaining actionable findings (Blamey & Mackenzie, 2007; Pedersen & Rieper, 2008).

Developmental evaluation (Patton, 1994, 2011, 2021; Patton, McKegg, & Wehipeihana, 2015) is more about what happens along the way. It appeals because it provides a set of principles for wayfinding in situations of complexity and innovation. Realist and developmental approaches do differ, but do they share some waypoints to reliably unpack perplexing problems of practice?

This paper documents a journey towards coherence and rigour in an evaluation where developmental and realist approaches complement each other, and deliver an evidence base for program or policy decision-making that is not only robust but also timely.

We show that, in complex environments, with programs involving change and social innovation, realist developmental evaluation can meet the needs of an often-varied cast of stakeholders, and can do so at pace, at scale, and economically.
Chair
Vanessa Hood
Associate Director, Rooftop Social
I've been working as a facilitator and evaluator for over 20 years, in a wide range of contexts, including horticulture, sustainability and financial literacy. Duncan Rintoul and I run Rooftop Social, which provides consulting services in evaluation, social research, facilitation...
Speakers
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of...
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Navigating the unfamiliar: Evaluation and sustainable finance
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Authors: Donna Loveridge (Independent Consultant), Ed Hedley (Itad Ltd UK, GB)

The nature and magnitude of global challenges such as climate change, poverty and inequality, biodiversity loss and food insecurity mean that $4 trillion is needed annually to achieve the Sustainable Development Goals by 2030. Government and philanthropic funding is not enough; additional tools include business and sustainable finance. Evaluators may relate to many of the objectives that business and sustainable finance seek to contribute to, but discomfort can arise in the mixing of profit, financial returns, impact and purpose.

Sustainable finance, impact investing, and business for good are growing globally and provide opportunities and challenges for evaluators, evaluation practice and the profession.
This session explores this new landscape and examines:
  • What makes us uncomfortable about dual objectives of purpose and profit, notions of finance and public good, and unfamiliar stakeholders and languages, and what evaluators can do in response.
  • The opportunities for evaluators to contribute to solving interesting and complex problems with current tools and skills, and where there is space for developing evaluation theory and practice.
  • How evaluation practice and evaluators' competencies might expand and deepen so as not to get left behind in these new fields, sustaining evaluation's relevance to addressing complex challenges.

The session draws on experience in Australia and internationally to share some practical navigation maps, tools and tips to help evaluators traverse issues of values and value, working with investors and businesses, and identify opportunities to add value.
Speakers
Donna Loveridge
Impact strategy and evaluation consultant
I work with public sector and not for profit organisations and businesses to design and conduct evaluations and embed evaluative thinking in management systems and processes to strengthen learning and decision-making. Most of my work focuses on inclusive economic growth through impact...
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Man vs. Machine: Reflections on machine-assisted and human-driven approaches used to examine open-text progress reports.
Thursday September 19, 2024 1:30pm - 2:00pm AEST
Authors: Stephanie Quail (ARTD Consultants), Kathleen De Rooy (ARTD Consultants, AU)

Progress reports and case notes contain rich information about program participants' experiences and frequently describe theoretically important risk and protective factors that are not typically recorded in administrative datasets. However, the unstructured narrative nature of these types of data, and often their sheer volume, is a barrier to human-driven qualitative analysis. Often, the data cannot be included in evaluations because it is too time- and resource-intensive to do so.

This paper will describe three approaches to the qualitative analysis of progress reports used to examine within-program trajectories for participants, and the factors important for program success as part of an evaluation of the Queensland Drug and Alcohol Court.

It will explore how we navigated the balance between human and machine-driven qualitative analysis. We will reflect on the benefits and challenges of text mining: how humans and machines stack up against each other when identifying the sentiment and emotion in text, the strengths and challenges of each approach, the lessons we have learned, and considerations for using these types of approaches to analyse datasets of progress reports in future evaluations.
Chair
Emily Saurman
Delegate, University of Sydney - School of Rural Health
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Evaluation by Stealth: Insights from Embedded Evaluation Practice
Thursday September 19, 2024 1:30pm - 2:30pm AEST
104
Authors: Samiha Barkat (Launch Housing), Edgar Daly (Launch Housing)

Embedded evaluation roles challenge conventional boundaries between evaluator, commissioner, strategist, advocate, program manager and critical friend.

Three years into establishing an embedded impact and evaluation team at a large not-for-profit community service organisation, Samiha and Edgar share their insights on operating as internal evaluators to advance their organisation’s mission. They draw from their recent projects, including developing and implementing an award-winning impact measurement framework. Central to their learnings is the importance of valuing relationships with diverse stakeholders and maintaining an adaptive approach.

Their insights are particularly valuable for those seeking to enhance the effectiveness, cost-efficiency, and impact of their evaluation services, including internal evaluation teams, program managers, and executives from both non-government and government organisations.

Samiha and Edgar work for Launch Housing, a secular Melbourne-based community organisation that delivers homelessness services and life-changing housing supports to disadvantaged Victorians. Launch Housing employs over 400 staff and runs more than 50 programs aimed at ending homelessness in Melbourne.
Chair
Kathleen Stacey

Managing Director, beyond…(Kathleen Stacey & Associates)
Kathleen Stacey is the Managing Director and Principal Consultant at beyond... She spent her formative working years within the public sector and academia, before establishing and expanding beyond... into its current form. The company conducts consultancy, evaluation, research and... Read More →
Speakers
Samiha Barkat

Group Manager - Research, Evaluation and Data, Launch Housing
I am a development and social impact professional with over 17 years' experience working for both the private and not-for-profit sectors globally. I am currently leading the Research and Evaluation team at Launch Housing, a large homelessness not-for-profit in Melbourne and a Board... Read More →
Edgar Daly

Monitoring & Evaluation Lead, Launch Housing
I have worked extensively in evaluation, including as a consultant and in embedded roles within government and the community services sector, specialising in developmental and mixed-methods evaluations. Currently I am the Monitoring and Evaluation Lead at Launch Housing, a large homelessness... Read More →
Thursday September 19, 2024 1:30pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Scaling Impact: How Should We Evaluate the Success of a Scaling Journey?
Thursday September 19, 2024 1:30pm - 2:30pm AEST
106
Authors: John Gargani (Gargani + Co)

The world has never faced larger problems—climate change, refugee crises, and COVID-19, to name just three. And organizations have responded by scaling solutions to unprecedented size—sustainable development goals, global refugee policies, and universal vaccination programs. But scaling is a journey to a destination imperfectly imagined at the outset and difficult to recognize upon arrival. At what point is scaling a program, policy, or product successful? Under what conditions should scaling stop? Or "descaling" begin? Robert McLean and I posed these and other questions to innovators in the Global South and shared what we learned in our recent book *Scaling Impact: Innovation for the Public Good*. In this session, we outline the book's four research-based scaling principles—justification, optimal scale, coordination, and dynamic evaluation. Then we discuss how to (1) define success as achieving impact at optimal scale, (2) choose a scaling strategy best suited to achieve success, and (3) judge success with dynamic evaluation. My presentation goes beyond the book, reflecting our most current thinking and research, and I provide participants with access to free resources, including electronic copies of the book.
Speakers
John Gargani

President (former President of the American Evaluation Association), Gargani + Company
Dr John Gargani is an evaluator with 30 years of experience and eclectic interests. He is President of the evaluation consulting firm Gargani + Company, served as President of the American Evaluation Association in 2016, coauthored the book Scaling Impact: Innovation for the Public... Read More →
Thursday September 19, 2024 1:30pm - 2:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

A tool for addressing violence against women: An examination of the creation, benefits, and drawbacks of the Evidence Portal
Thursday September 19, 2024 1:30pm - 2:30pm AEST
103
Authors: Charlotte Bell (Australia's National Research Organisation for Women's Safety (ANROWS)), Lorelei Hine (ANROWS, AU), Elizabeth Watt (ANROWS, AU), Rhiannon Smith (ANROWS, AU)

The first of its kind in Australia, the Evidence Portal is an innovative tool that captures and assesses impact evaluations of interventions from high-income countries that aim to address and end violence against women.

While we know high-quality evaluation evidence is an important component in informing and influencing policy and practice, decision-makers face a variety of potential barriers in accessing this evidence. By providing a curated repository of existing research, evidence portals can support policymakers, practitioners, and evaluators in their decision-making.

Our Evidence Portal consolidates and synthesises impact evaluation evidence via: (1) Evidence and Gap Maps, which provide a big-picture, visual overview of interventions; and (2) Intervention Reviews, which provide a succinct, standardised assessment of interventions in accessible language. Underpinned by a rigorous systematic review methodology, this tool seeks to:
  • Identify existing impact evaluations and gaps in the evidence base, and
  • Promote a collective understanding of the nature and effectiveness of interventions that aim to address violence against women

Key points: This presentation will showcase the creation, benefits, and drawbacks of the Evidence Portal, with a focused discussion on the following areas:
  • What are evidence portals and how are they used to inform policy and practice?
  • Why and how was this evidence portal created?
  • What are the challenges in creating this tool and the learnings to date?
  • What other 'ways of knowing' should be considered?

This presentation begins with an in-depth exploration of the Evidence Portal and the important methodological decisions taken to build this tool. It then offers a reflection on our journey of creating this tool with a focus on significant learnings to date. You will gain an understanding of the Evidence Portal and key considerations for future evaluations of violence against women interventions.
Chair
Prescilla Perera

Principal Monitoring and Evaluation Officer, DFFH
Speakers
Charlotte Louisa Grace Bell

Research Manager, ANROWS
Lauren Hamilton

Evaluation and Partnerships Manager, Australia's National Research Organisation for Women's Safety (ANROWS)
Lauren has over 10 years of experience in the evaluation, design and management of social programs, with a focus on violence against women and children, and women’s health. In her current role, Lauren works directly with frontline services and funders of domestic, family and sexual... Read More →
Thursday September 19, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

The Art of Qualitative Sensemaking: Exploring New Methods
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105
Authors: Alli Burness (Tetra Tech), Sharon Marra-Brown (ARTD, AU), Matt Healey (First Person Consulting, AU), Monica Wabuke (Tetra Tech, FJ)

Sensemaking is the process of making meaning and distilling the signal from the noise in primary research. Inclusive and transparent sensemaking maintains the critical link between evidence and insights, ensuring that evidence is interpreted accurately and that participants' views are understood correctly. Using intentional sensemaking approaches with integrity can ensure transparency and logical rigour in an evaluation or research project.

Despite its critical nature, sensemaking can often be the most opaque step in an evaluation process. While replication is a hallmark of good sensemaking, especially in academia, it is not always feasible in the fast-paced world of evaluation. The time required to do sensemaking well, the importance of applying the correct approaches and engaging the correct parties, and the critical role of a lead facilitator can be overlooked or underestimated. By shining a spotlight on this step in an evaluation, this session will highlight inclusive and accessible sensemaking approaches used across the design and evaluation spectrum and identify new or emergent approaches. It will pay particular attention to sensemaking when working in complex systems.

Panellists bring deep experience in evaluation or design research in Australian or international consulting settings. They will touch on what sensemaking approaches can be used to maintain integrity through a rapid or agile sensemaking process common in large or complex evaluations; popular sensemaking processes for coding data and new or emerging methods; and how insights or recommendations emerge from the sensemaking process. The moderator will start the panel by reflecting on the definitions, understanding and application of sensemaking, with an emphasis on inclusive and accessible aspects. Our presenters will then explore methods through this same lens and with a focus on emergent or new approaches. Methods will be presented so that audience members can learn and apply them.
Chair
Janet Conte

Principal Evaluation Officer, DPIRD
I live in Perth (Boorloo) and have 3 children. I really enjoy being a co-convenor of the WA branch of the AES with Lisette Kaleveld. I'm interested in learning more about systems evaluation and building an evaluation culture.
Speakers
Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have:Co-founded and grown an organisation (First Person Consulting) to a team of 16 people working... Read More →
Sharon Marra-Brown

Director, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence. Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade plus in the public service or how I weave... Read More →
Monica Wabuke

Associate Director - Research, Monitoring and Evaluation Practice, Tetra Tech International Development - Asia Pacific
Monica Wabuke is an Associate Director within Tetra Tech’s Research, Monitoring and Evaluation Practice (RME). She brings 14 years of experience in design, monitoring and evaluation and has provided technical support to DFAT, MFAT, EU, USAID and World Bank-funded projects and programs... Read More →
Alli Burness

Director, Australian Consulting, Tetra Tech
Alli is an Australian strategic designer and researcher with settler heritage, born and living on Bunurong Country. As Director of the Australian Consulting Practice at Tetra Tech International Development, Alli works to address the colonial legacy she has inherited by taking an inclusive... Read More →
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Perspectives on the Appropriate Use of RCTs in Evaluation?
Thursday September 19, 2024 1:30pm - 3:00pm AEST
Authors: Moderator:  Prof Rick Cummings, Emeritus Professor Murdoch University, AES Fellow
Key Speaker: Eleanor Williams, Managing Director, Australian Centre for Evaluation
Panelists: Prof Lisa Cameron, Professional Research Fellow, Melbourne Institute of Applied Economic and Social Research, University of Melbourne
Dr Wendy Jarvie, Adjunct Professor, Public Service Research Group, University of NSW
Bruce Cunningham, Assistant Secretary, Employment Evaluation Branch, Commonwealth Department of Employment and Workplace Relations
Commentators: Prof Patricia Rogers, Former Professor of Public Sector Evaluation, RMIT University, AES Fellow
Scott Bayley, Principal, Scott Bayley Evaluation Service, AES Fellow


The Commonwealth Government has established the Australian Centre for Evaluation (ACE) to put evaluation evidence at the heart of policy design and decision-making by improving the volume, quality, and use of evaluation evidence to support better policy and programs that improve the lives of Australians. This aligns well with the aim of the AES to improve the theory, practice and use of evaluation for people involved in evaluation. The creation of ACE provides an excellent opportunity for the AES and its members to work with a government agency on our common purposes. This collaboration has already commenced through shared activities and, in particular, the involvement of the responsible Minister, Dr Andrew Leigh, as a keynote speaker at the 2023 AES Conference.

An area that has attracted considerable attention is the mandate for ACE to include randomised controlled trials (RCTs) in at least some of its evaluation studies of Commonwealth programs. This issue was the central topic of Minister Leigh's keynote address and generated considerable debate and discussion at the conference, demonstrating its importance to the AES and its members.

The aim of the session is to explore the appropriate use of RCTs in evaluation studies of public policy in Australia. The strategy is to commence a communication process on this key topic between ACE and the evaluation community as represented by the AES. Ideally, this will lead to collaboration between ACE and the AES to improve evaluation practice in Australia.

The Fellows Forum session will commence with a prepared presentation by a senior staff member of ACE explaining its mandate and outlining its approach to including RCTs in evaluation studies. This will be followed by a panel of evaluators with RCT experience, who will explain how they included RCTs in an evaluation study, or where and why they chose not to, and what they learned from this experience to inform their future evaluation practice. Finally, one or two Fellows will act as discussants, responding to the previous presentations with their thoughts on this issue. The session will be moderated by a Fellow, and there will be time for audience members to ask questions of the panel members and discussants.

Chair
Rick Cummings

Emeritus Professor, Murdoch University
Rick Cummings is an Emeritus Professor in Public Policy at Murdoch University. He has 40 years of experience conducting evaluation studies in education, training, health, and crime prevention primarily for the state and commonwealth government agencies and the World Bank. He currently... Read More →
Speakers
Eleanor Williams

Managing Director, Australian Centre for Evaluation
Eleanor Williams is a public policy, research and evaluation professional with 20 years' experience working with the public sector. She is the Managing Director of the Australian Centre for Evaluation and established the Australian Public Sector Evaluation Network in 2019. Eleanor... Read More →
Wendy Jarvie

Adjunct Professor, Public Service Research Group, University of NSW, Canberra
Wendy Jarvie is a former Deputy Secretary in the Australian Public Service (APS) and is now Adjunct Professor at the Public Service Research Group at the University of NSW in Canberra. Over the last 30 years she has conducted and managed evaluations in the APS and for the World Bank... Read More →
Bruce Cunningham

Employment Evaluation Branch Assistant Secretary, Australian Government Department of Employment and Workplace Relations
Bruce Cunningham is the Employment Evaluation Branch Assistant Secretary in the Department of Employment and Workplace Relations. He has an economics background and has worked in a variety of analytical and research roles centred on helping job seekers into work. Bruce established... Read More →
Lisa Cameron

James Riady Professor of Asian Economics and Business, University of Melbourne
Professor Lisa Cameron is the James Riady Chair of Asian Economics and Business and Program Director of the Disadvantage and Wellbeing in the Asia-Pacific group at the Melbourne Institute of Applied Economic and Social Research at the University of Melbourne. She is an empirical micro-economist... Read More →
Patricia Rogers

Co-founder, Footprint Evaluation Initiative
Founder of BetterEvaluation and former Professor of Public Sector Evaluation at RMIT University. Now working as consultant and advisor. My work has focused on supporting appropriate choice and use of evaluation methods and approaches to suit purposes and context. I am currently working... Read More →
Scott Bayley

Managing Director, Scott Bayley Evaluation Services
Scott Bayley manages his own evaluation consultancy business and holds an MA in Public Policy majoring in evaluation and social measurement. He has over 25 years of experience in evaluation and is a Fellow of the Australian Evaluation Society. Prior to having his own consultancy Scott... Read More →
Thursday September 19, 2024 1:30pm - 3:00pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Harnessing AI for Qualitative Data Analysis
Thursday September 19, 2024 2:00pm - 2:30pm AEST
Authors: Ethel Karskens (Clear Horizon)

This presentation covers the strategic integration of Artificial Intelligence (AI) methodologies for qualitative data analysis in evaluation processes. The increasing demand for sophisticated analytical tools necessitates a deep dive into AI's transformative potential in streamlining qualitative analysis. Through practical demonstrations and case studies, this session showcases how AI technologies can efficiently tackle the complexities of analysing qualitative data. Attendees will acquire actionable insights into leveraging AI to augment the efficiency and accuracy of qualitative analysis, empowering them to navigate the evolving landscape of evaluation methodologies.

Additionally, the presentation conducts a comprehensive comparative analysis of major AI models available in the market. By delineating their unique strengths and functionalities, participants will gain invaluable discernment in selecting the most appropriate AI model tailored to their evaluation objectives.

Moreover, the session delves into robust quality assurance (QA) strategies for validating AI-generated outputs, emphasising the essential role of evaluators as integral stakeholders in the analysis process. Attendees will explore techniques for seamlessly integrating human expertise with AI capabilities to refine and validate analysis outcomes, and ways of doing so that respect common data privacy laws and policies. By fostering a symbiotic relationship between AI technologies and human evaluators, this presentation underscores the importance of collaborative synergy in optimising evaluation efficacy.
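
As an illustration of the QA theme, the sketch below shows one possible human-in-the-loop audit step; the data structures, names and the 0.8 threshold are assumptions for the sake of example, not the presenter's method. The idea is to draw a reproducible random sample of AI-coded excerpts for human review and flag themes whose agreement rate falls below a tolerance.

```python
# A minimal sketch of one possible QA step for AI-assisted coding: draw a
# random audit sample of model-labelled excerpts for human review and
# flag themes whose human-agreement rate falls below a threshold.
import random
from collections import defaultdict

ai_coded = [  # (excerpt, theme assigned by the AI model) - illustrative only
    ("Client praised the flexibility of sessions", "service accessibility"),
    ("Transport costs made attendance difficult", "barriers to access"),
    # ... in practice, hundreds or thousands of segments
]

def audit_sample(coded, fraction=0.1, seed=42):
    """Select a reproducible random subset for human verification."""
    rng = random.Random(seed)
    k = max(1, int(len(coded) * fraction))
    return rng.sample(coded, k)

def agreement_by_theme(reviewed):
    """reviewed: list of (theme, human_agrees: bool) from the audit."""
    totals = defaultdict(lambda: [0, 0])  # theme -> [agreed, seen]
    for theme, agrees in reviewed:
        totals[theme][0] += int(agrees)
        totals[theme][1] += 1
    return {t: agreed / seen for t, (agreed, seen) in totals.items()}

sample = audit_sample(ai_coded, fraction=0.5)
# A human reviewer would record agreement for each sampled item:
reviewed = [(theme, True) for _, theme in sample]  # placeholder judgements
flagged = {t: r for t, r in agreement_by_theme(reviewed).items() if r < 0.8}
print("Themes needing re-coding:", flagged or "none")
```
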

In conclusion, this presentation offers a comprehensive exploration of the transformative potential of AI in qualitative data analysis within evaluation contexts. Attendees will depart equipped with actionable strategies and insights to harness AI's power effectively, elevating the quality and efficiency of their evaluation processes to new heights.
Chair
Emily Saurman

Delegate, University of Sydney - School of Rural Health
Speakers
Ethel Karskens

Data and Insights Lead, Clear Horizon
I lead the data and insights strategy of Clear Horizon. This includes dashboard development and other data solutions to create insights for our clients. I am interested in innovation, data for good, and creating a data-driven culture in organisations.
Thursday September 19, 2024 2:00pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Monitoring and Evaluation Journeys: Making footprints, community-based enterprise in Australian First Nations contexts
Thursday September 19, 2024 2:30pm - 3:00pm AEST
104
Authors: Donna-Maree Stephens (Community First Development), Sharon Babyack (Community First Development, AU)

As First Nations' economies grow and develop, wayfinding of monitoring and evaluation frameworks that meaningfully address the holistic outcomes of First Nations' economic independence is a necessity. Culturally responsive monitoring and evaluation frameworks provide footprints for distinct ways of thinking about the holistic and significant contribution that First Nations' economies make to their communities and the broader Australian economic landscape.
Drawing on findings from an organisation with more than 20 years of experience working alongside First Nations' communities and businesses, grounded in collective and community-focused outcomes, this presentation will highlight key monitoring and evaluation learnings from First Nations' enterprises. It is an invitation to explore and rethink notions of success by drawing on experiences and Dreams (long-term goals) for community organisations, businesses and journeys towards positive outcomes, alongside the role of one culturally responsive monitoring and evaluation approach. Our presentation will provide an overview of our work in the community economic development space and key learnings developed through our monitoring and evaluation yarns with First Nations' enterprises across a national First Nations' economic landscape that includes urban, regional and remote illustrations.
Chair
Kathleen Stacey

Managing Director, beyond…(Kathleen Stacey & Associates)
Kathleen Stacey is the Managing Director and Principal Consultant at beyond... She spent her formative working years within the public sector and academia, before establishing and expanding beyond... into its current form. The company conducts consultancy, evaluation, research and... Read More →
Speakers
Sharon Babyack

General Manager Impact & Strategy, Community First Development
My role at Community First Development involves oversight of research, evaluation, communications and effectiveness of the Community Development program. During my time with the organisation I have led teams to deliver major change processes and strategic priorities, have had carriage... Read More →
Thursday September 19, 2024 2:30pm - 3:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

A long road ahead: Evaluating long-term change in complex policy areas. A case study of school active travel programs in the ACT
Thursday September 19, 2024 2:30pm - 3:00pm AEST
106
Authors: Mallory Notting (First Person Consulting)

The ACT Government implemented a suite of programs over the ten-year period between 2012 and 2022 aiming to increase the rate of students actively travelling to and from school. A total of 102 schools in the ACT participated in at least one of the three programs during this time, which targeted well-known barriers to active travel, including parental perceptions of safety and infrastructure around schools. The programs were intended to contribute towards a range of broader priorities, including health, safety, and environmental outcomes.

This short-paper session will share learnings from evaluating long-term behaviour change at a population level, based on the school active travel evaluation. The evaluation represents a unique case study, as the evaluators needed to look retrospectively over ten years of program delivery and assess whether the combination of programs had created changes within the system and had resulted in the achievement of wider goals.

The presenter will illustrate that the path from short-term to long-term outcomes is rarely linear or clear, as is the relationship between individual interventions and whole-of-system change. This will be done by summarising the approach taken for the evaluation and sharing the diversity of information collated for analysis, which included individual program data and attitudinal and infrastructure-level data spanning the whole school environment.

Evaluators are often only able to examine the shorter-term outcomes of an intervention, even in complex policy areas, and must then rely on a theory of change to illustrate the assumed wider impacts. The presenter was able to scrutinise these wider impacts during the active travel evaluation, an opportunity not regularly afforded to evaluators. The lessons from the active travel evaluation are therefore pertinent for other evaluations in complex policy areas and may carry implications for program design as the focus shifts increasingly towards population-level systems change.

Speakers
Mallory Notting

Principal Consultant, First Person Consulting
Mallory is a Principal Consultant at First Person Consulting. She manages and contributes to projects primarily in the area of cultural wellbeing, social inclusion, mental health, and public health and health promotion. In 2023, Mallory was the recipient of the Australian Evaluation... Read More →
Thursday September 19, 2024 2:30pm - 3:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Our new ways: Reforming our approach to impact measurement and learning
Thursday September 19, 2024 2:30pm - 3:00pm AEST
105
Authors: Kaitlyn Scannell (Minderoo Foundation), Adriaan Wolvaardt (Minderoo Foundation, AU), Nicola Johnstone (Minderoo Foundation, AU), Kirsty Kirkwood (Minderoo Foundation, AU)

We have been on a journey to bring awareness, evidence and understanding to the impact of our organisation since inception, and in earnest since 2016. For years, we felt the tension of trying to solve complex problems with measurement and learning approaches that are better suited to solving simple problems.

To change the world, we must first change ourselves. In early 2023 we had the extraordinary opportunity to completely reimagine our approach to impact measurement and learning. What we sought was an approach to measurement and learning that could thrive in complexity, rather than merely tolerate it, or worse, resist it.
We are not alone in our pursuit. Across government and the for-purpose sector, practitioners are exploring and discovering how to measure, learn, manage, and lead in complexity. Those who explore often discover that the first step they need to take is to encourage the repatterning of their own organisational system: a system that, in the words of Donella Meadows, "naturally resists its own transformation."

In this presentation we will delve into two themes that have emerged from our journey so far:
  • Transforming ourselves - We will explore what it takes to embed a systems-led approach to measurement, evaluation and learning in an organisation.
  • Sharing knowledge - We will discuss methods for generating, sharing, and storing knowledge about what works for measuring, evaluating, and learning in complexity.

The purpose of this session is to share what we have learnt with anyone who is grappling with how their organisation might measure and learn in complexity. We have been touched by the generosity of those who have accompanied us on our journey, sharing their experiences and wisdom. This presentation marks our initial effort to pay that generosity forward.
Chair
Janet Conte

Principal Evaluation Officer, DPIRD
I live in Perth (Boorloo) and have 3 children. I really enjoy being a co-convenor of the WA branch of the AES with Lisette Kaleveld. I'm interested in learning more about systems evaluation and building an evaluation culture.
Speakers
Thursday September 19, 2024 2:30pm - 3:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

An update on practical applications of machine learning in evaluation
Thursday September 19, 2024 2:30pm - 3:00pm AEST
Authors: Gerard Atkinson (ARTD Consultants)

Last year saw the rise of large language models, with names like ChatGPT and Bard becoming part of common discussion. The evaluation community was not immune to this trend, and papers were published that looked at just how well machine learning approaches could do against human evaluators on tasks such as qualitative analysis and evaluative judgement. The answer? Not as well as you would think (but you could get wrong answers faster than ever!)

But the designers of these models took on the feedback and created newer and more sophisticated tools. In addition, there have been innovations in hybrid models which combine the best features of different methods while minimising their weaknesses. Coupled with this is the growing field of standalone models that can be run on a desktop computer yet produce responses matching or exceeding those of cloud-based models, and models that can draw on rich contextual information (such as documentation or full interview transcripts) to make decisions.

This presentation provides an update on the state of machine learning in 2024 and presents new findings in relation to the performance of machine learning models on tasks including topic classification and rubric analysis.
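
For readers curious what a desktop-runnable classification task of this kind can look like, below is a hedged sketch using the Hugging Face transformers zero-shot pipeline. The model choice and topic labels are illustrative assumptions, not the models or tasks evaluated in this presentation.

```python
# Illustrative zero-shot topic classification of an open-text excerpt
# using a locally run off-the-shelf model.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="facebook/bart-large-mnli",  # downloads once, then runs on-device
)

excerpt = (
    "The program helped me find stable housing, but I still struggle "
    "to cover medical costs each month."
)
topics = ["housing", "health", "employment", "finances"]  # assumed labels

# multi_label=True scores each topic independently rather than forcing one
result = classifier(excerpt, candidate_labels=topics, multi_label=True)
for label, score in zip(result["labels"], result["scores"]):
    print(f"{label}: {score:.2f}")
```
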


Chair
Emily Saurman

Delegate, University of Sydney - School of Rural Health
Speakers
Gerard Atkinson

Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in:- program and policy evaluation- workshop and community facilitation- machine learning and AI- market and social research- financial and operational modelling- non-profit, government and business strategyI am also a board member... Read More →
Thursday September 19, 2024 2:30pm - 3:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Where next? Evaluation to transformation
Thursday September 19, 2024 2:30pm - 3:00pm AEST
103
Authors: Rachel Wilks (Grosvenor), Kristy Hornby (Grosvenor), Sarika Bhana (Grosvenor)

What is evaluation? BetterEvaluation defines it as "any systematic process to judge merit, worth or significance by combining evidence and values". Many government organisations, and some private and not-for-profit entities, use evaluations as an auditing tool to measure how well their programs are delivering against intended outcomes and impacts and achieving value for money. This lends itself to viewing evaluation as an audit or 'tick-box' exercise when it is really measuring the delivery of an organisation's mandate or strategy (or part thereof). Viewing evaluation more as an audit than as a core part of continuous improvement risks our reports collecting dust.

During this session, we will discuss factors that build a continuous improvement mindset across evaluation teams, as well as across the broader organisation. This will include exploring how to balance independent advice with practical solutions that program owners and other decision-makers can implement readily, as well as how to obtain greater buy-in to evaluation practice. We present the features that evaluations should have to ensure findings and conclusions can be easily translated into clear actions for improvement.

We contend that it is important to consider evaluation within the broader organisational context, considering where this might link to strategy or how it may be utilised to provide evidence to support funding bids. This understanding will help to ensure evaluations are designed and delivered in a way that best supports the wider organisation.

We end by sharing our post-evaluation playbook - a practical tool to help take your evaluations from pesky paperweight to purposeful pathway.

Chair
Prescilla Perera

Principal Monitoring and Evaluation Officer, DFFH
Speakers
Rachel Wilks

Senior Consultant, Grosvenor
Rachel is a management consultant and an emerging evaluator at Grosvenor. She took her first steps into the evaluation world two years ago, and since then has been increasingly interested in how evaluation can be used in and across the public sector and not-for-profit space. Rachel... Read More →
Thursday September 19, 2024 2:30pm - 3:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Constructing a Wisdom Base: A Hands-On Exploration of First Nations Knowledge Systems
Thursday September 19, 2024 3:30pm - 4:30pm AEST
106
Authors: Skye Trudgett (Kowa), Haley Ferguson (Kowa, AU), Tara Beattie (Kowa, AU), Levi McKenzie-Kirkbright (Kowa, AU), Jess Dart (Clear Horizon, AU)

In the pursuit of understanding and honouring the depth of First Nations wisdom, this hands-on session at the AES conference introduces the Ancestral Knowledge Tapestry - a living guide for developing a repository of ancestral knowledge, practices, and philosophies. Participants will actively engage in co-creating a 'Wisdom Base,' a collective endeavour to encapsulate the richness of old and new First Nations knowledges and their application to contemporary evaluative practices.

Through interactive exercises, collaborative dialogue, and reflective practices, attendees will delve into the components of the Ancestral Knowledge Tapestry, exploring the symbiosis between deep knowing, artefacts, deep listening and truth-telling. The session aims to empower participants, particularly those from First Nations communities, to identify, document, and share their unique wisdom in ways that foster self-determination and cultural continuity.
Attendees will emerge from this workshop not only with a deeper appreciation for the intrinsic value of First Nations knowledge systems but also with practical insights into how to cultivate a Wisdom Base that preserves and actively revitalises First Nations wisdom for future generations.

Chair
Sandra Ayoo

Assistant Professor, University of North Carolina Greensboro
Dr. Ayoo is an Assistant Professor of Educational Research Methodology in the Department of Information, Library, and Research Science at the School of Education, University of North Carolina Greensboro. She teaches graduate courses in program evaluation and research methodology... Read More →
Speakers
Skye Trudgett

CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application... Read More →
Levi McKenzie-Kirkbright

Software Engineer, Kowa Collaboration
Software engineer at Kowa investigating how to implement Indigenous data sovereignty principles into software systems.
Tara Beattie

Consultant, Kowa Collaboration
Tara Beattie is a dedicated professional who is passionate about fostering positive change in Community.  As a Consultant at Kowa Collaboration, Tara leads projects designed to empower organisations in First Nations UMEL practices, aligning with Kowa's commitment to amplifying First... Read More →
Jess Dart

Chief Evaluator and Founder, Clear Horizon Consulting
Dr Jess Dart is the founder and Chief Evaluator of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australian Evaluation Society (AES), Jess is a recognised leader with over 25 years of... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Charting the Course: Measuring Organisational Evaluation Capacity Building
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Rochelle Tobin (Curtin University)

Measuring evaluation capacity building is complex, and there are few examples of quantitative measurement tools that enable evaluators to chart progress. WAAC (WA AIDS Council) and Curtin University established a five-year partnership to build evaluation capacity within WAAC. To measure progress, a validated tool for assessing organisational evaluation capacity (Schwarzman et al. 2019) was modified and combined with another partnership-based tool (Tobin et al. in press). The survey was administered to WAAC staff at baseline (n = 17) and again one year after the partnership was established (n = 19). Significant improvements were seen in individual skills for evaluation tasks, tools for evaluation, and evaluation systems and structures. These tools provide a rigorous approach to tracking progress towards organisational evaluation capacity.
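
For illustration only, a baseline-versus-follow-up comparison like the one described could be tested as sketched below. The scores are fabricated placeholders, and because different staff samples responded at each wave (n = 17 vs n = 19), an unpaired test such as Mann-Whitney U is one defensible choice; the authors' actual analysis is not specified here.

```python
# Hypothetical sketch: unpaired comparison of baseline vs one-year
# organisational evaluation capacity ratings on a 1-5 scale.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
baseline = rng.integers(1, 6, size=17)   # fabricated baseline ratings (1-5)
follow_up = rng.integers(2, 6, size=19)  # fabricated follow-up ratings (2-5)

# One-sided test of whether follow-up ratings tend to be higher
stat, p = mannwhitneyu(follow_up, baseline, alternative="greater")
print(f"U = {stat:.1f}, one-sided p = {p:.4f}")
```
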
Chair
Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
Rochelle Tobin

PhD candidate
I am a PhD candidate investigating SiREN's (Sexual Health and Blood-borne Virus Research and Evaluation Network) influence on research and evaluation practices in the Western Australian sexual health and blood-borne virus sector. I also support SiREN's knowledge translation activities... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Journey Mapping: Visualising Competing Needs within Evaluations
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Jolenna Deo (Allen and Clarke Consulting)

Journey mapping acts as a GPS for grasping audience or consumer experience when evaluating policies or programs, highlighting twists, hidden gems, and pitfalls. It can be a useful tool to help evaluators capture disparities and competing needs among intended demographics. This session will discuss the journey mapping method, drawing from an evaluation of a Community Capacity Building Program which used journey mapping to illustrate key consumer personas. It will explore the integration of multiple data sources to provide a comprehensive understanding of complex disparities and the cultural and historical contexts in which they arise.
Chair
Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
Jolenna Deo

Consultant, Allen and Clarke Consulting
Jolénna is a consultant at Allen + Clarke Consulting. She is a proud Mirriam Mer, Pasifika woman with a background in Development studies, Pacific studies and social policy, combining her interests in indigenous methodologies and social justice. She is experienced in community and... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Reflections by a non-analyst on the use of state-wide data sets and modelled data in evaluation
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Gabby Lindsay-Smith 

Linked government data sets provide an opportunity to investigate the impact of state-wide programs and policies but are often out of reach for many evaluators, especially non-analysts. This presentation will detail a non-analyst's experience incorporating state linked data sets into a recent evaluation of a Victoria-wide family services program. The presentation will outline tips and tricks for those who may consider incorporating government-level linked data or simulation models into large program or policy evaluations in the future. It will cover areas such as where to begin, navigating the data, and key tips for working with analysts.
Chair
Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

The evolution of evaluation: Retracing our steps in evaluation theory to prepare for the future
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: James Ong (University of Melbourne)

As new people enter the evaluation field and as evaluation marches forward into the future, it is important to learn from the evaluation theorists who have come before us. My Ignite presentation will argue that modern evaluation is built on evaluation theory, and will call on evaluators of all levels to learn evaluation theory to:
  1. Appreciate how evaluation has evolved;
  2. Strengthen their evaluation practice; and
  3. Navigate themselves around an ever-changing evaluation landscape.
Chair
Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
James Ong

Research Assistant (Evaluations), University of Melbourne
My name is James Ong. I am an Autistic program evaluator working at the University of Melbourne, where I work on evaluation and implementation projects in various public health projects such as the AusPathoGen program and the SPARK initiative. I not only have a strong theoretical... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

From KPIs to systems change: Reimagining organisational learning
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Authors: Katrina Barnes (Clear Horizon), Irene Guijt (Oxfam Great Britain, GB), Chipo Peggah (Oxfam Great Britain, ZW)

Traditional measures of success for international non-governmental organisations (INGOs) have been based on western (and often colonial) theories of change, predefined metrics and ways of knowing, rarely fitting local realities and interests. Projectised, pre-determined understandings of change limit honest reflection on larger transformative change and inhibit meaningful learning and adaptation.

INGOs globally are being challenged to decolonise their knowledge and evaluation processes. Over the past 18 months, Oxfam Great Britain has undergone a journey to redesign how we understand impact, to rebalance and reframe accountability, and to strengthen learning. This new approach focuses on collective storytelling, sensemaking and regular reflection on practice. We are taking a theory-led approach to make meaning out of signals that systems are shifting across a portfolio of work. Drawing on a bricolage of evaluation methodologies (Outcome Harvesting-lite, meta-evaluation and synthesis, evaluative rubrics, and impact evaluations), we are slowly building up a picture over time across the organisation to tell a story of systemic change. We have seen how meaningful and honest evidence and learning processes have enabled a stronger culture of learning.

Although we are far from the end of this journey, we have learnt some critical lessons and face ongoing challenges. We are not the only ones: many foundations, funders, and philanthropic organisations are going through similar processes as they increasingly try to understand their contribution to systems change. These conversations are therefore imperative to the field of evaluation, as organisations navigate new ways to 'evaluate' their own work.

In this presentation, we will start the discussion by sharing Oxfam Great Britain's journey, including key challenges faced and lessons learnt. After this, we will invite a Q&A conversation to harvest insights from others also seeking to reimagine organisational learning that is grounded in decolonising knowledge processes and seeking to understand systems change.
Chair
Elissa Mortimer

Manager & MEL Specialist, Palladium
I have worked in the international development and health sectors for the past 25 years, primarily in nutrition, maternal and child health, HIV, tobacco control, non-communicable diseases and skills development. I have worked on a broad variety of projects, including local community... Read More →
Speakers
Katrina Barnes

Principal Consultant, Clear Horizon
Thursday September 19, 2024 3:30pm - 4:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Growing Australia's future evaluators: Lessons from emerging evaluator networks across the Asia Pacific
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Authors: Amanda Mottershead (Tetra Tech International Development), Qudratullah Jahid (Oxford Policy Management Australia, AU), Eroni Wavu (Pacific Community, FJ)

The sustainability of the evaluation sector requires emerging evaluators to be supported in pursuing high-quality practice. What this support needs to be and how it should be developed is much less certain. What topics should we focus on? How should we deliver it? Who should we deliver it to? How can the broader evaluation community support emerging evaluators?

Global experiences in emerging evaluator support contain a treasure trove of lessons which can fill this knowledge gap and inform effective support here in Australia. Experience shows that fostering a strong evaluation community that includes emerging evaluators can nurture, ignite and shape future evaluation practices. A variety of approaches are being adopted across the region and the globe to foster this sense of community, ranging from formal capacity building to more informal approaches focused on experience sharing.

In this session, we bring together current and former emerging evaluator leaders from across the Asia Pacific region to answer some of these questions and understand what approaches could work best for the Australian context. This will include presentations and discussion on in-demand topics, how to formulate support, how to target emerging evaluators and the best means of delivery. The session will be highly interactive, engaging the audience in a question-and-answer forum on this important topic. All panel members have been engaged with emerging evaluator networks in their countries or regions and bring diverse experiences to facilitate cross learning. The session will provide practical ways forward for the broader evaluation community to grow and support the future of evaluation.
Speakers
Qudratullah Jahid

Senior MEL Consultant, Oxford Policy Management
I am a monitoring, evaluation, research, and learning specialist with a background in bilateral and multilateral development organisations. With expertise in MEL frameworks and systems, I support OPM projects in the Indo-Pacific. My focus areas include MEL frameworks, mixed methods... Read More →
Amanda Mottershead

Consultant - Research, Monitoring and Evaluation, Tetra Tech International Development
I enjoy the breadth of evaluation in international development. I've had the opportunity to work across sectors including economic development, infrastructure, energy, education and inclusion. I enjoy generating evidence that promotes improvements to organisations, policies and programs... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Committed to mentoring
Thursday September 19, 2024 3:30pm - 4:30pm AEST
103
Authors: Julie Elliott (Independent Evaluator), Jill Thomas (J.A Thomas & Associates, AU), Martina Donkers (Independent Evaluator, AU)

Mentors and mentees from the AES Group Mentoring Program share rich experiences of group learning, knowledge sharing, and reflective practice, exploring the Wayfinding skills, knowledge, and expertise they have found through the program and the valuable lessons learned.

AES remains committed to mentoring, and this session provides a unique opportunity to hear perspectives from across the mentoring spectrum, from Fellows to emerging evaluators, and the ways that sharing our professional practice enhances our work. Since 2021, the AES Group Mentoring Program has been a trailblazer in fostering professional growth and competencies for emerging and mid-career evaluators, enabling mentors and peers to help navigate unfamiliar territories using a variety of tools and strategies.

Our dynamic panel will discuss how evaluators have adapted their approaches to mentoring and to evaluation practice with the support of the program. It's a session where personal and professional growth intersect and will offer a unique perspective on the transformative power of mentorship.

This discussion is for evaluators who are passionate about learning - both their own and that of other AES members! Whether you're a seasoned professional eager to contribute to your community, an emerging talent or a mid-career evaluator navigating contemporary evaluation ecosystems, this session is for you. Don't miss this opportunity to hear directly from mentors and mentees who value the shared, continuous journey of social learning and adaptation.

Chair
Laura Holbeck

Monitoring, Evaluation & Learning Manager, Australian Humanitarian Partnership, Alinea International
Speakers
Julie Elliott

Evaluator
Collaborator and evaluation scholar-practitioner committed to acknowledging the complexity inherent in all human settings.
Rick Cummings

Emeritus Professor, Murdoch University
Rick Cummings is an Emeritus Professor in Public Policy at Murdoch University. He has 40 years of experience conducting evaluation studies in education, training, health, and crime prevention primarily for the state and commonwealth government agencies and the World Bank. He currently... Read More →
Martina Donkers

Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured... Read More →
Lydia Phillips

Principal Consultant, Lydia Phillips Consulting
I operate an independent consulting practice, providing evaluation and social policy services to community organisations and government.With a background in law and social policy, I have more than 15 years' experience building and using evidence in order to create positive social... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

The learning journey: competency self-assessment for personal learning and professional development
Thursday September 19, 2024 3:30pm - 4:30pm AEST
105
Authors: Amy Gullickson (University of Melbourne), Taimur Siddiqi (Victorian Legal Services, AU)

AES, in collaboration with learnevaluation.org, offers a competency self-assessment to members. The aim is to help individuals understand their strengths and plan their learning journey, to help the AES continue to tailor its professional development offerings and develop pathways to professionalisation, and to contribute to ongoing research about evaluation learners. In this session, members of the AES Pathways Committee will briefly summarise the findings from the self-assessment and then invite participants into groups by discipline and sector to discuss: Which competencies are really core, and why? Reporting back from the groups will reveal whether the core competencies differ based on the sectors and backgrounds of evaluators. The follow-up discussion will then explore: What do the findings mean for evaluation practice, and for teaching and learning? How do they relate to professionalisation? If we want to increase clarity about what good evaluation practice looks like, what are our next steps related to the competencies?

Participants will benefit from reflecting on their own competency self-assessment in relation to the findings and discussion, and from discovering how the backgrounds of learners influence their ideas about core competencies. The session findings will be shared with the AES Pathways Committee to inform the AES' next steps for the competencies, the self-assessment, and the ongoing discussion of pathways to professionalisation.

Speakers
Amy Gullickson

Associate Professor, The University of Melbourne
I'm an Associate Professor of Evaluation at the University of Melbourne Assessment and Evaluation Research Centre. I'm also a co-founder and current chair of the International Society for Evaluation Education (https://www.isee-evaled.com/), a long-time member of the AES Pathways Committee (and its predecessors), and an architect of the University of Melbourne's fully online, multi-disciplinary Master and Graduate Certificate of Evaluation programs (https://study.unimelb.edu.au/find/courses/graduate/master-of-evaluation/). I practice, teach, and proselytize evaluation... Read More →
Taimur Siddiqi

Evaluation manager, Victorian Legal Services Board+Commissioner
Taimur is an experienced evaluation and impact measurement professional who is currently the evaluation manager at the Victorian Legal Services Board + Commissioner and a member of the AES Board Pathways Committee. He is also a freelance evaluation consultant and was previously the... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia
 