Conference hashtag #aes24MEL
Session type: Short paper
Thursday, September 19
 

10:30am AEST

Evaluating capacity building for sustainability scientists: Pathways for early career researchers
Thursday September 19, 2024 10:30am - 11:00am AEST
104

Authors: Lisa Walker (CSIRO)

In the quest for sustainable solutions to pressing global challenges, transdisciplinary approaches that integrate insights and methods from various fields are increasingly recognised as key to driving change. This presentation will share insights from an evaluation of a five-year, $18 million sustainability science research program seeking not only to drive innovation but also to empower the next cohort of early career researchers (ECRs) to coordinate action across research, policy and practice to address complex sustainability problems.
Our formative, mixed-methods evaluation highlighted on-the-job learning, face-to-face engagement and networking as pivotal in building sustainability science capacity. We also found that targeted recruitment, research team engagement and the provision of support and resources to supervisors are essential, and sometimes overlooked, components. This work contributes to the broader discussion on how evaluation can enhance the development of sustainability science, proposing a framework that emphasises the individual, team and institutional support mechanisms necessary for effective ECR capacity building.
Novelty in our approach lies in the integration of evaluative practices within the capacity-building process, offering a reflective lens on how transdisciplinary endeavours can be optimised to address sustainability challenges. This is particularly relevant for evaluators wanting to build their own skills, or those of others, to engage on complex sustainability issues. The study also underscores the significance of adaptive learning and evaluation in navigating the complexities of sustainability science, inviting a broader conversation on how evaluation can be leveraged to facilitate meaningful contributions to societal and planetary well-being.
Chair
Su-Ann Drew
Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for...
Speakers
Lisa Walker
CSIRO
I am a social scientist with a background in program monitoring, evaluation and sustainable development. I am currently working with CSIRO's Valuing Sustainability Future Science Platform (VS FSP) and manage the Monitoring, Evaluation, Learning and Research project within the VS...
Thursday September 19, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

National impact, regional delivery - Robust M&E for best practice Australian horticulture industry development.
Thursday September 19, 2024 10:30am - 11:00am AEST
Authors: Ossie Lang (RMCG), Donna Lucas (RMCG), Carl Larsen (RMCG), Zarmeen Hassan (AUSVEG), Cherry Emerick (AUSVEG), Olive Hood (Hort Innovation)

How do you align ten regionally delivered projects with differing focus topics to nationally consistent outcomes? Take advantage of this opportunity to explore the journey of building and implementing a robust Monitoring and Evaluation (M&E) program that showcases regional nuances and aligns them with national outcomes, making a significant contribution to the success of this horticultural industry extension project.

Join us for an insightful presentation on how a national vegetable extension project, focused on the adoption of best management practices on-farm, has successfully implemented a dynamic M&E program. Over the two and a half years of project delivery, the national M&E manager, in collaboration with ten regional partners, has crafted a program that demonstrates regional impact consistently on a national scale and adapts to the project's evolving needs.

The presentation will highlight the team's key strategies, including the upskilling of Regional Development Officers in M&E practices. Learn how templates and tools were designed to ensure consistent data collection across approximately 40 topics. The team will share the frameworks utilised to capture quantitative and qualitative monitoring data, providing a holistic view of tracking progress against national and regional outcomes and informing continuous improvement in regional delivery.

Flexibility has been a cornerstone of the M&E program, allowing it to respond to the changing needs of growers, industry, and the funding partner and seamlessly incorporate additional data points. Discover how this adaptability has enhanced the project's overall impact assessment and shaped its delivery strategy.

The presentation will not only delve into the national perspective but also feature a firsthand account from one of the Regional Development Officers. Gain insights into how the M&E program has supported their on-the-ground delivery, instilling confidence in providing data back to the national project manager. This unique perspective offers a real-world understanding of the national program's effectiveness at a regional level.
Chair Speakers
Ossie Lang
Consultant-Regional Development Officer, RMCG
Thursday September 19, 2024 10:30am - 11:00am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Navigating organisational turbulence: An evaluation-based strategic learning model for organisational sustainability
Thursday September 19, 2024 10:30am - 11:00am AEST
103
Authors: Shefton Parker, Monash University; Amanda Sampson, Monash University

Increasingly turbulent and rapidly changing global operating environments are disrupting institutions' implementation of organisational plans and realisation of strategy. The session introduces a novel organisational collaborative strategic learning and effectiveness model, intended to bolster organisational resilience responses amidst such turbulence.
A scarcity of suitable organisational strategic learning systems-thinking models that utilise evaluation methodology in a joined-up way prompted the presenters to develop a model. The model is tailored for strategic implementation in a complex organisational system environment, operating across decentralised portfolios with multiple planning and operational layers. The model amalgamates evaluation methodologies to identify, capture, share and respond to strategic learning in a complex system. It is hypothesised that the model will outperform conventional organisational performance-based reporting systems in terms of organisational responsiveness, agility, adaptability, collaboration, and strategic effectiveness.
The presentation highlights the potential value of integrating and embedding evaluation approaches into an organisation's strategy, governance and operations using a three-pronged approach:
- Sensing: Gathering relevant, useful and timely data (learning);
- Making sense: Analysing and contextualising learning data alongside other relevant data (institutional performance data, emerging trends, policy and legislative reform, etc.); and
- Good sense decisions: Providing timely and relevant evaluative intelligence and insights to support evidence-based decision making.
The presenters advocate for a shift from viewing evaluation use as a 'nice to have' to a 'must have' aspect of organisational growth and sustainability. The model aims to foster a leadership culture where decision makers value the insights that contextualised, holistic organisational intelligence can provide for:

i) Strategic planning: Enhanced planning and strategic alignment across portfolios;

ii) Operational efficiency: Reducing duplication in strategic effort and better collaboration towards strategic outcomes;

iii) Business resilience and sustainability: Improved identification and quicker response to emerging opportunities and challenges; and

iv) Strategic effectiveness: Informing activity adaptation recommendations for strategic goal realisation.
Chair
Michael Amon
Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Speakers
Shefton Parker
Senior Evidence & Evaluation Adviser, Monash University - Institutional Planning
Dr Shefton Parker is an evaluator and researcher with over 15 years of specialist experience in program and systems evaluation within the Vocational and Higher Education sectors. Recently, his evaluations of innovative education programs were referenced as evidence in the University...
Amanda Sampson
Senior Manager, Institutional Planning, Monash University
I am leading the development and implementation of an Institutional Evaluation Model for a complex organisation, to support organisational resilience, strategic adaptation and execution to realise the 10-year organisational strategic objectives. I am interested in learning how to...
Thursday September 19, 2024 10:30am - 11:00am AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Learn, evolve, adapt: Evaluation of climate change and disaster risk reduction programs
Thursday September 19, 2024 11:00am - 11:30am AEST
104
Authors: Justine Smith (Nation Partners)

There is a pressing need to reduce the risks associated with climate change and the disasters that are likely to increase as a result. Along with the need to take action comes the need to show we are making a difference - or, perhaps more importantly, the need to learn and evolve to ensure we are making a difference. However, when operating in an ever-changing, uncertain environment, with layers of complexity and outcomes that may not be realised for some time, or until disaster strikes, evidence of impact is not always easy to collect, nor always a priority.

Drawing on experience developing evaluation frameworks and delivering evaluation projects in the areas of climate change and disaster and emergency management, I will present some of the challenges and opportunities I have observed. In doing so, I propose that there is no 'one way' to do things. Rather, taking the time to understand what we are evaluating and to continually learn, evolve and adjust how we evaluate is key. This includes having clarity on what we really mean when we are talking about reducing risk and increasing resilience. Ideas I will explore include:
  • The concepts of risk reduction and resilience.
  • The difference between evaluation for accountability and for genuine learning and improvement.
  • Balancing an understanding of and progress towards big picture outcomes with project level, time and funding bound outcomes.
  • The challenge and potential benefits of event-based evaluation to learn and improve.

Evaluation has the capacity to contribute positively to action taken to reduce climate change risks and improve our management of disasters and recovery from disasters. As evaluators we too need to be innovative and open-minded in our approaches, to learn from and with those working directly in this space for the benefit of all.
Chair
Su-Ann Drew
Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for...
Speakers
Justine Smith
Principal, Nation Partners
With a background spanning research, government, non-government organisations and consulting, Justine brings technical knowledge and over 10 years of experience to the projects she works on. As a highly experienced program evaluator and strategic thinker, Justine has applied her skills...
Thursday September 19, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Culturally inclusive evaluation with culturally and linguistically diverse communities in Australia
Thursday September 19, 2024 11:00am - 11:30am AEST
Authors: Lena Etuk (CIRCA Research, AU)

In this presentation we will outline an approach to culturally inclusive evaluation with people from culturally and linguistically diverse backgrounds in Australia, its strengths, and its growth opportunities. This approach fills a critical gap in the way evaluation and research with culturally and linguistically diverse communities is traditionally conducted in Australia.

In this presentation we will explain how the Cultural & Indigenous Research Centre Australia (CIRCA) conducts in-culture and in-language evaluation with diverse cohorts of Australians, and how this practice fits within methodological discourse in evaluation and social science more broadly. We will illustrate how our culturally inclusive methodology is put into practice with findings from CIRCA's own internal research into the way cultural considerations shape our data collection process. We will conclude with reflections on how CIRCA might further draw on and leverage standpoint theory and culturally responsive evaluation as this practice is refined.

Our key argument is that doing culturally inclusive evaluation is a process that requires reflexivity and learning, alongside strong and transparent institutional processes. Combining these approaches creates systemic ways of acknowledging and working within stratified and unequal social systems, inherent to any research. Our findings will advance knowledge within the field of evaluation about how to engage and represent culturally and linguistically diverse community members across Australia.
Chair Speakers
Lena Etuk
Director, Research & Evaluation, Cultural & Indigenous Research Centre Australia
I’m an applied Sociologist with 16+ years of experience in evaluation and social research. At CIRCA I lead an amazing team of research consultants from a huge range of diverse backgrounds. We specialise in qualitative evaluation and research with non-English speaking CALD and Aboriginal...
Thursday September 19, 2024 11:00am - 11:30am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Our journey so far: a story of evaluation to support community change in South Australia
Thursday September 19, 2024 11:00am - 11:30am AEST
103
Authors: Penny Baldock (Department of Human Services South Australia), Jessie Sleep (Far West Community Partnerships, AU)

The multi-jurisdictional South Australian Safety and Wellbeing Taskforce is the lead mechanism, and the accountable body, for developing strategies and sustainable, place-based responses that ensure the safety and wellbeing of remote Aboriginal visitors in Adelaide and other regional centres in the State.

This presentation discusses the challenges of establishing an evaluative learning strategy for the Taskforce that meets the needs of multiple government agencies and stakeholders, multiple regional and remote communities, and multiple nation groups.

In a complex system, this is a learning journey, requiring us to adapt together to seek new ways of understanding and working that truly honour the principles of data sovereignty, community self-determination, and shared decision-making.
As we begin to more truly centre communities as the locus of control, and consider the far-reaching reform that will be necessary to deliver on our commitments under Closing the Gap, this presentation provides an important reflection on the skills, knowledge and expertise that will be required to build evaluation systems and processes that support change.

One of the most exciting developments to date has been the establishment of a multi-agency data sharing agreement, which will enable government data to be shared with Far West Community Partnerships, a community change organisation based in Ceduna, and combined with their community owned data in order to drive and inform the Far West Change Agenda.

We present the story of our journey so far, our successes and our failures, and extend an invitation to be part of the ongoing conversation to support the change required for evaluation success.

Chair
Michael Amon
Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Speakers
Penny Baldock
Department of Human Services
Jessie Sleep
Chief Executive, Far West Community Partnerships
Jessie is an innovative thinker and strategist, emerging as a leader in her field, redefining the role of strategic implementation with monitoring and evaluation. With the fast-paced growth of the social impact lens in Australia, Jessie is part of the new generation of strategic leaders...
Thursday September 19, 2024 11:00am - 11:30am AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Bringing the "human" into measurement: From in-depth inquiry to systemic change
Thursday September 19, 2024 11:30am - 12:00pm AEST
104
Authors: Julia Suh (Tobias)

Humans are complex and diverse. To create social change, what do we need to understand about them?

Their behaviours and mindsets are key, but the broader context and systems they operate within paint a fuller picture of the multiple moving parts that need to change simultaneously for sustained impact. These changes can be mapped, with embedded evaluative thinking, building a pathway for formal evaluation.

In this session, experts in Human-Centred Design and social change share their innovative approaches to thinking beyond project- or program-level goals or organisational performance indicators. Examples are drawn from direct experiences working across various transformation projects, from reducing child sexual exploitation and preventing academic misconduct to improving the care economy and elevating patient outcomes. They demonstrate not only how program goals and a social change vision can be realised together, but also how a combination of strategic prioritisation, collaboration, capability building and networks can accelerate the process.
Chair
Su-Ann Drew
Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for...
Speakers
Julia Suh
Principal, Tobias
Jessica Leefe
Principal, Tobias
Thursday September 19, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

"Nothing about us, without us": Developing evaluation framework alongside victim-survivors of modern slavery using representative participatory approaches
Thursday September 19, 2024 11:30am - 12:00pm AEST
Authors: Ellie Taylor (The Salvation Army)

Amplifying survivor voices has been the cornerstone of The Salvation Army's work in the anti-slavery realm. How does this translate to the monitoring and evaluation space? How do we truly represent the voices and experiences of those with lived experience of modern slavery in monitoring and evaluation, whilst aligning with key human rights principles?

Our Research Team are exploring how to centre survivor voices in the evaluation space. This session will detail the use of a representative participatory evaluation approach to monitor and evaluate the Lived Experience Engagement Program (LEEP) for survivors of criminal labour exploitation. In this session we will explore the challenges and learnings uncovered through this project.

The LEEP is designed to empower survivors of criminal labour exploitation to share their expertise to make change. Piloted in 2022-2023, and continuing into 2024-2025, the LEEP - and the resulting Survivor Advisory Council - provides a forum for survivors to use their lived experience to consult with government to assist in preventing, identifying and responding to modern slavery.

The key points explored in this session will include:
  • Realities of implementing an adaptive model, including continuous integration of evaluation findings into an iterative survivor engagement model.
  • The importance of stakeholder inclusivity, integrating lived experience voices and amplifying them alongside program facilitators and government representatives.
  • Complexities of evaluation in the modern slavery space, particularly when victim-survivors of forced marriage are included. We will speak to the need for trauma-informed, strengths-based measures and facilitating partnerships with the people the program serves.

Leading the session will be The Salvation Army's project lead, who has a PhD in mental health and over 12 years of experience working with diverse community groups in Australia and internationally. They have extensive experience presenting at conferences both domestically and internationally.
Chair Speakers
Ellie Taylor
Senior Research Analyst, The Salvation Army
Ellie has a background in mental health and has spent 12+ years designing and conducting research and evaluation initiatives with diverse communities across Australia and internationally. In this time, she's worked with people from all walks of life, across the lifespan, from infants...
Thursday September 19, 2024 11:30am - 12:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Navigating complex government reforms: a tool to examine program theory. How complete and relevant is government program theory?
Thursday September 19, 2024 11:30am - 12:00pm AEST
105
Authors: Nerida Rixon

Developing program theory for complex government reforms and charting the 'how' we will reach our destination is not easy. Perhaps you, like me, rely on well-tested templates? Do we challenge them and continually refine them to reflect emerging research and make them more useful for our purposes?

This research is about public policy packages and reforms and the program theories (or reform theories) that explain them. This research emerges from a desire to build program theory better, particularly in the context of whole of government reforms. Better program theory can drive better planning, monitoring and evaluation of performance, and better policy and public good.

Evidence shows Australian governments are not effectively planning, monitoring and evaluating performance of programs and policy packages. Theory can support development of meaningful performance indicators to track progress. Without strong program theory and clear strategy, as the Productivity Commission's recent 'Review of the National Agreement on Closing the Gap' study report suggests, we risk a 'spray and pray' approach to change, prioritisation of the wrong things and siloed policy responses.

A literature-informed checklist to analyse program theory for completeness and relevance to public administration is provided. Policy makers and evaluators are given a tool and lens to build more complete and relevant program theory and to improve existing program theory.

Analysis of program theory in 15 government reform strategies and outcomes frameworks is presented to show governments' strengths and opportunities. Governments are strong at identifying our destination, or the intended outcomes, though not always in identifying the 'how' we will get there. Governments could improve their program theory by making it more explicit and more complete by articulating 'the when' we expect to see changes from implementing the reforms. Government program theory might be more relevant if potential (non-intended) outcomes are referenced.
Chair
Charlie Tulloch
Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Speakers
Thursday September 19, 2024 11:30am - 12:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Systems evaluation to the rescue!: How do we use systems evaluation to improve societal and planetary wellbeing?
Thursday September 19, 2024 12:00pm - 12:30pm AEST
104
Authors: Kristy Hornby (Grosvenor), Tenille Moselen (First Person Consulting)

Systems evaluation - many might have heard the term, but few have done one. This session shares two case studies of different systems evaluations and the learnings from these to benefit other evaluators who are conducting or about to begin a systems evaluation.

The session will open with an overview and explanation of what systems evaluation is, in terms of its key features and how it is distinguished from other forms of evaluation. The presenters will then talk through their case studies, one of which centres on the disability justice system in the ACT, while the other takes a sector-wide focus across the whole of Victoria. The co-presenters will share openly and honestly their initial plans for commencing the systems evaluations, how they had to amend those plans in response to real-world conditions, and the tips and tricks and innovations they picked up along the way.
Chair
Su-Ann Drew
Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for...
Speakers
Kristy Hornby
Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and...
Tenille Moselen
First Person Consulting
https://www.fpconsulting.com.au/our-team.html
Tenille has qualifications in public health, with experience in mental health and wellbeing, alcohol and drug, and international development. Her passion is creating change through design and bringing stakeholders together to address complex...
Thursday September 19, 2024 12:00pm - 12:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Optimising Evaluations of Wellbeing Programs in Schools
Thursday September 19, 2024 12:00pm - 12:30pm AEST
105
Authors: Tamara Van Der Zant (Australian Council for Educational Research), Katherine Dix (Australian Council for Educational Research, AU)

In this presentation we will discuss the diverse and flexible data collection methods suitable for program evaluation in the context of schools. We will discuss the types of evidence that can be used to evaluate social and emotional learning programs and wellbeing initiatives, specifically, and considerations when working with educators, children and young people. We will invite all to participate in discussions about challenges to the evaluation of these programs in complex, real-world contexts (including data quality, confounding factors, system requirements, etc.) and propose methods we use to navigate these challenges.

Optimising program evaluation methods is important because of the ever-growing number of wellbeing programs being offered to schools. Accordingly, the need for high quality evaluation to guide funding decisions and use of programs and initiatives to support student and educator wellbeing in schools has never been greater.

By drawing on comprehensive experience in undertaking wellbeing program evaluations, this presentation will share our lessons learnt and recommendations that should support evaluators in crafting contextually appropriate evaluations. High quality program evaluations, often a requirement for ongoing funding, address the growing need for meaningful and accessible evidence that is currently being sought by schools, educators, funders, and policy decision makers.
Chair
Charlie Tulloch
Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Speakers
Tamara Van Der Zant
Research Fellow, Australian Council for Educational Research
Tamara is a Research Fellow in the Educational Monitoring and Research Division at ACER. Prior to this role she completed her PhD in emotion research at the University of Queensland. She brings experience in research design, conducting research with diverse populations, broad data...
Dr Katherine Dix
Principal Research Fellow, School and System Improvement, Australian Council for Educational Research
Dr Katherine Dix is a Principal Research Fellow at ACER, with over 20 years as a program evaluator, educational researcher and Project Director. Dr Dix is the National Project Manager for Australia’s participation in OECD TALIS 2024, and is a leading expert in wellbeing and whole-school...
Thursday September 19, 2024 12:00pm - 12:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Getting to the value add: Timely insights from a realist developmental evaluation
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Authors: Phillip Belling (NSW Department of Education), Liam Downing (NSW Department of Education, AU)

This paper is aimed at early career and experienced evaluators interested in realist evaluation, but with concerns about the time a realist approach might take. The authors respond to this concern with an innovative blending of realist and developmental evaluation. Participants will exit the room with a working understanding of realist developmental evaluation, including its potential for adaptive rigour that meets the needs of policy makers and implementers.

Realist evaluation is theoretically and methodologically robust, delivering crucial insights about how, for whom and why interventions do and don't work (House, 1991; Pawson & Tilley, 1997; Pawson, 2006). It aims to help navigate unfamiliar territory towards our destination by bringing assumptions about how and why change happens out in the open.

But even realism's most enthusiastic practitioners admit it takes time to surface and test program theory (Marchal et al., 2012; van Belle, Westhorp & Marchal, 2021). And evaluation commissioners and other stakeholders have understandable concerns about the timeliness of obtaining actionable findings (Blamey & Mackenzie, 2007; Pedersen & Rieper, 2008).

Developmental evaluation (Patton, 1994, 2011, 2021; Patton, McKegg, & Wehipeihana, 2015) is more about what happens along the way. It appeals because it provides a set of principles for wayfinding in situations of complexity and innovation. Realist and developmental approaches do differ, but do they share some waypoints to reliably unpack perplexing problems of practice?

This paper documents a journey towards coherence and rigour in an evaluation where developmental and realist approaches complement each other, and deliver an evidence base for program or policy decision-making that is not only robust but also timely.

We show that, in complex environments, with programs involving change and social innovation, realist developmental evaluation can meet the needs of an often-varied cast of stakeholders, and can do so at pace, at scale, and economically.
Chair
Vanessa Hood
Associate Director, Rooftop Social
I've been working as a facilitator and evaluator for over 20 years, in a wide range of contexts, including horticulture, sustainability and financial literacy. Duncan Rintoul and I run Rooftop Social, which provides consulting services in evaluation, social research, facilitation...
Speakers
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of...
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Navigating the unfamiliar: Evaluation and sustainable finance
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Authors: Donna Loveridge (Independent Consultant), Ed Hedley (Itad Ltd UK, GB)

The nature and magnitude of global challenges, such as climate change, poverty and inequality, biodiversity loss and food insecurity, mean that $4 trillion is needed annually to achieve the Sustainable Development Goals by 2030. Government and philanthropic funding is not enough; additional tools include business and sustainable finance. Evaluators may relate to many of the objectives that business and sustainable finance seek to contribute to, but discomfort can arise in the mixing of profit, financial returns, impact and purpose.

Sustainable finance, impact investing, and business for good are growing globally, providing opportunities and challenges for evaluators, evaluation practice and the profession.
This session explores this new landscape and examines:
  • What makes us uncomfortable about the dual objectives of purpose and profit, notions of finance and public good, and unfamiliar stakeholders and languages, and what evaluators can do in response.
  • The opportunities for evaluators to contribute to solving interesting and complex problems with current tools and skills, and where the space is for developing evaluation theory and practice.
  • How evaluation practice and evaluators' competencies might expand and deepen so they are not left behind in these new fields, sustaining evaluation's relevance to addressing complex challenges.

The session draws on experience in Australia and internationally to share some practical navigation maps, tools and tips to help evaluators traverse issues of values and value, working with investors and businesses, and identify opportunities to add value.
Chair Speakers
Donna Loveridge
Impact strategy and evaluation consultant
I work with public sector and not for profit organisations and businesses to design and conduct evaluations and embed evaluative thinking in management systems and processes to strengthen learning and decision-making. Most of my work focuses on inclusive economic growth through impact...
Thursday September 19, 2024 12:00pm - 12:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Man vs. Machine: Reflections on machine-assisted and human-driven approaches used to examine open-text progress reports.
Thursday September 19, 2024 1:30pm - 2:00pm AEST
Authors: Stephanie Quail (ARTD Consultants), Kathleen De Rooy (ARTD Consultants, AU)

Progress reports and case notes contain rich information about program participants' experiences and frequently describe theoretically important risk and protective factors that are not typically recorded in administrative datasets. However, the unstructured, narrative nature of these data - and, often, their sheer volume - is a barrier to human-driven qualitative analysis. Often, the data cannot be included in evaluations because it is too time- and resource-intensive to do so.

This paper will describe three approaches to the qualitative analysis of progress reports used to examine within-program trajectories for participants, and the factors important for program success as part of an evaluation of the Queensland Drug and Alcohol Court.

It will explore how we navigated the balance between human and machine-driven qualitative analysis. We will reflect on the benefits and challenges of text-mining - how humans and machines stack up against each other when identifying the sentiment and emotion in text, the strengths and challenges of each approach, the lessons we have learned, and considerations for using these types of approaches to analyse datasets of progress reports in future evaluations.
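As a rough illustration of the kind of human-machine comparison described above (a generic sketch, not drawn from the presenters' evaluation), the Python fragment below scores agreement between hypothetical human-assigned and model-assigned sentiment labels for the same progress-report excerpts. The records, label set and use of scikit-learn are assumptions for the example only.

  # Illustrative only: compare hypothetical human vs. machine sentiment labels
  # assigned to the same set of progress-report excerpts.
  from sklearn.metrics import cohen_kappa_score, confusion_matrix

  human = ["positive", "negative", "neutral", "negative", "positive", "neutral"]
  machine = ["positive", "negative", "negative", "negative", "positive", "neutral"]
  labels = ["negative", "neutral", "positive"]

  agreement = sum(h == m for h, m in zip(human, machine)) / len(human)  # raw agreement (5/6 here)
  kappa = cohen_kappa_score(human, machine, labels=labels)              # chance-corrected agreement

  print(f"Raw agreement: {agreement:.0%}")
  print(f"Cohen's kappa: {kappa:.2f}")
  print(confusion_matrix(human, machine, labels=labels))                # shows where the two disagree

In practice, a chance-corrected statistic such as Cohen's kappa is usually read alongside the confusion matrix, so the team can see which sentiment categories the machine handles acceptably and which still warrant human coding.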
Chair
Emily Saurman
Delegate, University of Sydney - School of Rural Health
Speakers
Stephanie Quail
Manager, ARTD Consultants
Thursday September 19, 2024 1:30pm - 2:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Harnessing AI for Qualitative Data Analysis
Thursday September 19, 2024 2:00pm - 2:30pm AEST
Authors: Ethel Karskens (Clear Horizon)

This presentation covers the strategic integration of Artificial Intelligence (AI) methodologies for qualitative data analysis in evaluation processes. The increasing demand for sophisticated analytical tools necessitates a deep dive into AI's transformative potential in streamlining qualitative analysis. Through practical demonstrations and case studies, this session showcases how AI technologies can efficiently tackle the complexities of analysing qualitative data. Attendees will acquire actionable insights into leveraging AI to augment the efficiency and accuracy of qualitative analysis, empowering them to navigate the evolving landscape of evaluation methodologies.

Additionally, the presentation conducts a comprehensive comparative analysis of major AI models available in the market. By delineating their unique strengths and functionalities, participants will gain invaluable discernment in selecting the most appropriate AI model tailored to their evaluation objectives.

Moreover, the session delves into robust quality assurance (QA) strategies for validating AI-generated outputs, emphasising the essential role of evaluators as integral stakeholders in the analysis process. Attendees will explore techniques for seamlessly integrating human expertise with AI capabilities to refine and validate analysis outcomes. We will also explore ways to do this in a way that respects common data privacy laws and policies. By fostering a symbiotic relationship between AI technologies and human evaluators, this presentation underscores the importance of collaborative synergy in optimising evaluation efficacy.

In conclusion, this presentation offers a comprehensive exploration of the transformative potential of AI in qualitative data analysis within evaluation contexts. Attendees will depart equipped with actionable strategies and insights to harness AI's power effectively, elevating the quality and efficiency of their evaluation processes to new heights.
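To make the human-in-the-loop validation idea concrete (this is a generic sketch, not Clear Horizon's workflow), the Python fragment below double-codes a random sample of AI-coded records against human judgements and reports the agreement rate. The record structure, theme labels and human codes are invented for illustration.

  # Illustrative only: a simple QA step in which a random sample of AI-coded
  # records is double-coded by a human analyst and the agreement rate reported.
  import random

  ai_coded = [
      {"id": 1, "text": "Staff felt supported by the new process.", "ai_theme": "support"},
      {"id": 2, "text": "Long wait times caused frustration.", "ai_theme": "access"},
      {"id": 3, "text": "Participants valued the peer network.", "ai_theme": "connection"},
      {"id": 4, "text": "Forms were confusing and hard to complete.", "ai_theme": "access"},
  ]

  # In practice these codes would come from an analyst reviewing the sampled records.
  human_review = {1: "support", 2: "access", 3: "support", 4: "access"}

  sample = random.sample(ai_coded, k=3)  # review a random subset rather than everything
  agreements = sum(1 for r in sample if human_review[r["id"]] == r["ai_theme"])

  print(f"Human-AI agreement on reviewed sample: {agreements}/{len(sample)}")
  for r in sample:
      if human_review[r["id"]] != r["ai_theme"]:
          print("Disagreement:", r["text"], "| AI:", r["ai_theme"], "| Human:", human_review[r["id"]])

A sample-based check of this kind can be scaled up or down depending on how heavily the AI-generated coding will be relied upon in the final analysis.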
Chair
Emily Saurman
Delegate, University of Sydney - School of Rural Health
Speakers
Ethel Karskens
Data and Insights Lead, Clear Horizon
I lead the data and insights strategy of Clear Horizon. This includes dashboard development and other data solutions to create insights for our clients. I am interested in innovation, data for good, and creating a data-driven culture in organisations.
Thursday September 19, 2024 2:00pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Monitoring and Evaluation Journeys: Making footprints, community-based enterprise in Australian First Nations contexts
Thursday September 19, 2024 2:30pm - 3:00pm AEST
104
Authors: Donna-Maree Stephens (Community First Development), Sharon Babyack (Community First Development, AU)

As First Nations' economies grow and develop, wayfinding of monitoring and evaluation frameworks that meaningfully address the holistic outcomes of First Nations' economic independence are a necessity. Culturally responsive monitoring and evaluation frameworks provide footprints for distinct ways of thinking about the holistic and significant contribution that First Nations' economies make to their communities and the broad Australian economic landscape.
Presenting findings from an organisation with more than 20 years of experience working alongside First Nations' communities and businesses grounded in collective and community focused outcomes, this presentation will highlight key learnings of monitoring and evaluation from First Nations' enterprises. It is an invitation to explore and rethink notions of success by drawing on experiences and Dreams (long-term goals) for community organisations, businesses and journeys towards positive outcomes alongside the role of one culturally responsive monitoring and evaluation approach. Our presentation will provide an overview of our work in the community economic development space and key learnings developed through our monitoring and evaluation yarns with First Nations' enterprises across a national First Nations' economic landscape that includes urban, regional and remote illustrations.
Chair
Kathleen Stacey
Managing Director, beyond… (Kathleen Stacey & Associates)
Kathleen Stacey is the Managing Director and Principal Consultant at beyond... She spent her formative working years within the public sector and academia, before establishing and expanding beyond... into its current form. The company conducts consultancy, evaluation, research and...
Speakers
Sharon Babyack
General Manager Impact & Strategy, Community First Development
My role at Community First Development involves oversight of research, evaluation, communications and effectiveness of the Community Development program. During my time with the organisation I have led teams to deliver major change processes and strategic priorities, have had carriage...
Thursday September 19, 2024 2:30pm - 3:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

A long road ahead: Evaluating long-term change in complex policy areas. A case study of school active travel programs in the ACT
Thursday September 19, 2024 2:30pm - 3:00pm AEST
106
Authors: Mallory Notting (First Person Consulting)

The ACT Government implemented a suite of programs over the ten-year period between 2012 and 2022 aiming to increase the rates of students actively travelling to and from school. In that time, 102 schools in the ACT participated in at least one of the three programs, which targeted well-known barriers to active travel, including parental perceptions of safety and infrastructure around schools. The programs were intended to contribute towards a range of broader priorities, including health, safety, and environmental outcomes.

This short-paper session will share learnings from evaluating long-term behaviour change at a population level, based on the school active travel evaluation. The evaluation represents a unique case study, as the evaluators needed to look retrospectively over ten years of program delivery and assess whether the combination of programs had created changes within the system and had resulted in the achievement of wider goals.

The presenter will illustrate that the path from short-term to long-term outcomes is rarely linear or clear, as is the relationship between individual interventions and whole-of-system change. This will be done by summarising the approach taken for the evaluation and sharing the diversity of information collated for analysis, which included individual program data and attitudinal and infrastructure-level data spanning the whole school environment.

Evaluators are often only able to examine the shorter term outcomes of an intervention, even in complex policy areas, and then rely on a theory of change to illustrate the assumed intended wider impacts. The presenter was able to scrutinise these wider impacts during the active travel evaluation, an opportunity not regularly afforded to evaluators. The lessons from the active travel evaluation are therefore pertinent for other evaluations in complex policy areas and may carry implications for program design as the focus shifts increasingly towards population-level, systems change.

Chair
Carolyn Wallace
Manager Research and Impact, VicHealth
Carolyn is an established leader in health and community services with over 22 years of experience across regional Victoria, Melbourne, and Ireland. She has held roles including CEO, executive director, policy officer, and researcher, specialising in community wellbeing and social...
Speakers
Mallory Notting
Principal Consultant, First Person Consulting
Mallory is a Principal Consultant at First Person Consulting. She manages and contributes to projects primarily in the area of cultural wellbeing, social inclusion, mental health, and public health and health promotion. In 2023, Mallory was the recipient of the Australian Evaluation...
Thursday September 19, 2024 2:30pm - 3:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Our new ways: Reforming our approach to impact measurement and learning
Thursday September 19, 2024 2:30pm - 3:00pm AEST
105
Authors: Kaitlyn Scannell (Minderoo Foundation), Adriaan Wolvaardt (Minderoo Foundation, AU), Nicola Johnstone (Minderoo Foundation, AU), Kirsty Kirkwood (Minderoo Foundation, AU)

We have been on a journey to bring awareness, evidence and understanding to the impact of our organisation since inception, and in earnest since 2016. For years, we felt the tension of trying to solve complex problems with measurement and learning approaches that are better suited to solving simple problems.

To change the world, we must first change ourselves. In early 2023 we had the extraordinary opportunity to completely reimagine our approach to impact measurement and learning. What we sought was an approach to measurement and learning that could thrive in complexity, rather than merely tolerate it, or worse, resist it.
We are not alone in our pursuit. Across government and the for-purpose sector, practitioners are exploring and discovering how to measure, learn, manage, and lead in complexity. Those who explore often discover that the first step they need to take is to encourage the repatterning of their own organisational system. A system which, in the words of Donella Meadows, "naturally resists its own transformation."

In this presentation we will delve into two themes that have emerged from our journey so far:
  • Transforming ourselves - We will explore what it takes to embed a systems-led approach to measurement, evaluation and learning in an organisation.
  • Sharing knowledge - We will discuss methods for generating, sharing, and storing knowledge about what works for measuring, evaluating, and learning in complexity.

The purpose of this session is to share what we have learnt with anyone who is grappling with how their organisation might measure and learn in complexity. We have been touched by the generosity of those who have accompanied us on our journey, sharing their experiences and wisdom. This presentation marks our initial effort to pay that generosity forward.
Chair
Janet Conte
Principal Evaluation Officer, DPIRD
I live in Perth (Boorloo) and have 3 children. I really enjoy being a co-convenor of the WA branch of the AES with Lisette Kaleveld. I'm interested in learning more about systems evaluation and building an evaluation culture.
Speakers
Thursday September 19, 2024 2:30pm - 3:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

An update on practical applications of machine learning in evaluation
Thursday September 19, 2024 2:30pm - 3:00pm AEST
Authors: Gerard Atkinson (ARTD Consultants)

Last year saw the rise of large language models, with names like ChatGPT and Bard becoming part of common discussion. The evaluation community was not immune to this trend, and papers were published that looked at just how well machine learning approaches could do against human evaluators on topics such as qualitative analysis and evaluative judgement. The answer? Not as well as you would think (but you could get wrong answers faster than ever!)

But the designers of these models took the feedback on board and created newer and more sophisticated tools. In addition, there have been innovations in hybrid models which combine the best features of different methods while minimising their weaknesses. Coupled with this is the growing field of standalone models that can be run on a desktop computer yet produce responses that match or exceed cloud-based models, and models that can draw on rich contextual information (such as documentation or full interview transcripts) to make decisions.

This presentation provides an update on the state of machine learning in 2024 and presents new findings in relation to the performance of machine learning models on tasks including topic classification and rubric analysis.
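By way of illustration only (not the authors' results), the snippet below shows one common way performance on a topic-classification task might be reported: comparing model-assigned topics against human-assigned topics with per-topic precision, recall and F1. The comments, topic labels and use of scikit-learn are assumptions for the example.

  # Illustrative only: score a model's topic classifications against
  # human-assigned topics for the same hypothetical open-text comments.
  from sklearn.metrics import classification_report

  human_topics = ["funding", "staffing", "access", "access", "funding", "other"]
  model_topics = ["funding", "staffing", "access", "other", "funding", "other"]

  # Per-topic precision, recall and F1 indicate where the model can be trusted
  # and where human review is still needed.
  print(classification_report(human_topics, model_topics, zero_division=0))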


Chair
Emily Saurman
Delegate, University of Sydney - School of Rural Health
Speakers
Gerard Atkinson
Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in program and policy evaluation, workshop and community facilitation, machine learning and AI, market and social research, financial and operational modelling, and non-profit, government and business strategy. I am also a board member...
Thursday September 19, 2024 2:30pm - 3:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Where next? Evaluation to transformation
Thursday September 19, 2024 2:30pm - 3:00pm AEST
103
Authors: Rachel Wilks (Grosvenor), Kristy Hornby (Grosvenor), Sarika Bhana (Grosvenor)

What is evaluation? Better Evaluation defines it as "any systematic process to judge merit, worth or significance by combining evidence and values". Many government organisations and some private and not-for-profit entities use evaluations as an auditing tool to measure how well their programs are delivering against intended outcomes and impacts and achieving value for money. This lends itself to viewing evaluation as an audit or 'tick-box' exercise when it is really measuring the delivery of an organisation's mandate or strategy (or part thereof). Viewing evaluation more as an audit than a core part of continuous improvement presents a risk of our reports collecting dust.

During this session, we will discuss factors that build a continuous improvement mindset across evaluation teams, as well as across the broader organisation. This will include exploring how to balance providing independent advice with offering practical solutions that program owners and other decision-makers can implement more readily, as well as how to obtain greater buy-in to evaluation practice. We present the features that evaluations should have to ensure findings and conclusions can be easily translated into clear actions for improvement.

We contend that it is important to consider evaluation within the broader organisational context, considering where this might link to strategy or how it may be utilised to provide evidence to support funding bids. This understanding will help to ensure evaluations are designed and delivered in a way that best supports the wider organisation.

We end by sharing our post-evaluation playbook - a practical tool to help take your evaluations from pesky paperweight to purposeful pathway.

Chair
Prescilla Perera
Principal Monitoring and Evaluation Officer, DFFH
Speakers
Rachel Wilks
Senior Consultant, Grosvenor
Rachel is a management consultant and an emerging evaluator at Grosvenor. She took her first steps into the evaluation world two years ago, and since then has been increasingly interested in how evaluation can be used in and across the public sector and not-for-profit space. Rachel...
Thursday September 19, 2024 2:30pm - 3:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia