Conference hashtag #aes24MEL
Thursday, September 19
 

10:30am AEST

Evaluating capacity building for sustainability scientists: Pathways for early career researchers
Thursday September 19, 2024 10:30am - 11:00am AEST
104
Authors: Lisa Walker (CSIRO)

In the quest for sustainable solutions to pressing global challenges, transdisciplinary approaches that integrate insights and methods from various fields are increasingly recognised as key to driving change. This presentation will share insights from an evaluation of a five-year, $18 million sustainability science research program seeking not only to drive innovation but also to empower the next cohort of early career researchers (ECRs) to coordinate action across research, policy and practice to address complex sustainability problems.
Our formative, mixed-methods evaluation highlighted on-the-job learning, face-to-face engagement and networking as pivotal in building sustainability science capacity. We also found targeted recruitment, research team engagement and the provision of support and resources to supervisors to be essential, and sometimes overlooked, components. This work contributes to the broader discussion on how evaluation can enhance the development of sustainability science, proposing a framework that emphasises the individual, team and institutional support mechanisms necessary for effective ECR capacity building.
The novelty of our approach lies in the integration of evaluative practices within the capacity-building process, offering a reflective lens on how transdisciplinary endeavours can be optimised to address sustainability challenges. This is particularly relevant for evaluators wanting to build their own skills, or those of others, to engage on complex sustainability issues. The study also underscores the significance of adaptive learning and evaluation in navigating the complexities of sustainability science, inviting a broader conversation on how evaluation can be leveraged to facilitate meaningful contributions to societal and planetary well-being.
Chair
Su-Ann Drew

Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for... Read More →
Speakers
Lisa Walker

CSIRO
I am a social scientist with a background in program monitoring, evaluation and sustainable development. I am currently working with  CSIRO's Valuing Sustainability Future Science Platform (VS FSP) and manage the Monitoring, Evaluation, Learning and Research project within the VS... Read More →
Thursday September 19, 2024 10:30am - 11:00am AEST
104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

An evaluator in need of an evaluation
Thursday September 19, 2024 10:30am - 11:30am AEST
105
Authors: Dana Cross (Grosvenor), Kristy Hornby (Grosvenor)

"If all you have is a hammer, then everything looks like a nail." - Maslow/Kaplan/Unknown

Maslow's Hammer (aka the law of the instrument or golden hammer) and déformation professionnelle are concepts that speak to cognitive biases that can limit our effectiveness. Essentially, they mean that we use what we know, and as evaluators, what we know is evaluation.

How can we as evaluators and commissioners of evaluations avoid cognitive bias and work effectively within (evaluation) policy parameters to ensure we are adding value and not using evaluation as the only tool in our toolbox?

We invite you to join us in a fast-paced interactive session to unpack:
  • the ways in which our expertise can get in our way
  • what it means to stay open to other tools as evaluation professionals and commissioners of evaluation
  • how this challenges us as individuals and as a profession.
Chair
Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member... Read More →
Speakers
Dana Cross

Associate Director, Grosvenor
Dana is a public sector expert, possessing over 17 years of deep experience advising government organisations on program evaluation, organisational review, service optimisation and performance management. She is a member of Grosvenor’s Executive Leadership Team as Head of Strategy... Read More →
Thursday September 19, 2024 10:30am - 11:30am AEST
105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Navigating the maze of causality: Understanding the relationship between carcinogenic betel nut consumption and learning outcomes
Thursday September 19, 2024 10:30am - 11:30am AEST
106
Authors: Kabira Namit (Abt Global), Kathryn Lee (Abt Global, AU)

This hands-on session is designed to strengthen emerging evaluators' intuition about causality in non-experimental evaluations.

In environments where conventional RCTs are unethical or unfeasible, identifying causal factors within the complex weave of societal influences and individual behaviours presents a significant challenge. Centred on a novel research project from Papua New Guinea, this session navigates this maze through real-world research (exploring the intricate relationship between the consumption of carcinogenic betel nut and educational outcomes). By focusing on this specific case study, we provide a concrete context for participants to understand the broader implications of causal explorations in fragile and sensitive settings.

Participants will actively engage in small group discussions in a collaborative learning environment where they can practice and refine their skills in causal evaluation by discussing scenarios that are reflective of real-world complexities.

This session aims to move beyond simply documenting correlations, encouraging a deep dive into the underlying dynamics of causal linkages. Through this exploration, we aim to eventually guide participants to discussions on pathways for targeted interventions and policy formulations which take causal chains into account.

Additionally, we aim to spark dialogue on the ethical dimensions of 'activist research,' exploring how evaluators can navigate moral dilemmas while advocating for meaningful change. This hands-on session not only seeks to build evaluative skills but also to inspire participants to consider the broader implications of their work on societal well-being and ethical research practices.
Chair
Mary Ann Wong

Research Specialist, California State University, Sacramento
Speakers
Thursday September 19, 2024 10:30am - 11:30am AEST
106, 102 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Commissioning evaluations - finding the way from a transactional to a relational approach
Thursday September 19, 2024 10:30am - 12:00pm AEST
Authors: Eleanor Williams (Australian Department of Treasury), Josephine Norman (Victorian Department of Health, AU), Melissa Kaltner (Lumenia, AU), Skye Trudgett (Kowa Collaboration, AU), George Argyrous (Paul Ramsay Foundation, AU), Luke Craven (National Centre for Place-Based Collaboration (Nexus), AU)

Delivering great evaluations requires a strong professional relationship between those commissioning and delivering the evaluation, as well as all relevant stakeholders.

Traditional evaluation commissioning approaches have tended to treat evaluation as a one-off exchange focusing on the completion of pre-defined tasks. However, the evolving landscape of policies and programs tackling complex issues demands a more nuanced and relational approach to get the most out of the journey of evaluation.

This big room panel session brings together speakers who are at the forefront of thinking around collaborative commissioning partnerships from the perspectives of government, not-for-profit and Indigenous-led organisations, and the private sector, which can play the full suite of roles on the commissioning journey. The discussion will delve into the experiences of a range of organisations involved in commissioning that are seeking to build enduring relationships, and in some cases partnerships, between the commissioners, the evaluators and the stakeholders to whom we are accountable.

Drawing on real-world case studies and empirical evidence, the discussion will highlight the challenges and rewards of transitioning from a transactional model to a relational model. It will explore how this paradigm shift can enhance collaboration and ultimately lead to a range of positive outcomes.

Attendees will be invited to engage in dialogue with the panel, bringing their collective wisdom together to consider how the destination of better commissioning relationships would look, the practical obstacles we face on our pathway, and how we can reach our destination. To facilitate this active discussion, attendees will have the opportunity to use Sli.do throughout the session to provide input on key questions, share experiences in real time and ask questions of the expert panel.
Chair
Vanessa Hood

Associate Director, Rooftop Social
I've been working as a facilitator and evaluator for over 20 years, in a wide range of contexts, including horticulture, sustainability and financial literacy. Duncan Rintoul and I run Rooftop Social, which provides consulting services in evaluation, social research, facilitation... Read More →
Speakers
Eleanor Williams

Managing Director, Australian Centre for Evaluation
Eleanor Williams is a public policy, research and evaluation professional with 20 years' experience working with the public sector. She is the Managing Director of the Australian Centre for Evaluation and established the Australian Public Sector Evaluation Network in 2019. Eleanor... Read More →
Skye Trudgett

CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application... Read More →
George Argyrous

Head of Measurement, Evaluation, Research, and Learning, Paul Ramsay Foundation
Josephine Norman

Director, Centre for Evaluation and Research Evidence, Dept of Health/Dept of Families, Fairness and Housing
I run a large internal evaluation unit, directing a team of 30 expert evaluators and analysts to: directly deliver high priority projects; support program area colleagues to make the best use of external evaluators; and, build generalist staff capacity in evaluation principles and... Read More →
Luke Craven

Independent Consultant
Kristy Hornby

Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and... Read More →
Thursday September 19, 2024 10:30am - 12:00pm AEST
Plenary 1, 114 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Bringing the "human" into measurement: From in-depth inquiry to systemic change
Thursday September 19, 2024 11:30am - 12:00pm AEST
104
Authors: Julia Suh (Tobias)

Humans are complex and diverse. To create social change, what do we need to understand about them?

Their behaviours and mindsets are key, but the broader context and systems they operate within paint a fuller picture of the multiple moving parts that need to change simultaneously for sustained impact. These changes can be mapped, with embedded evaluative thinking, building a pathway for formal evaluation.

In this session, experts in Human-Centred Design and social change share their innovative approaches to thinking beyond project- or program-level goals or organisational-level performance indicators. Examples are drawn from direct experiences working across various transformation projects, from reducing child sexual exploitation and preventing academic misconduct to improving the care economy and elevating patient outcomes. They demonstrate not only how program goals and a social change vision can be realised together, but also how a combination of strategic prioritisation, collaboration capability building and networks can accelerate the process.
Chair
Su-Ann Drew

Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for... Read More →
Speakers
Julia Suh

Principal, Tobias
Jessica Leefe

Principal, Tobias
Thursday September 19, 2024 11:30am - 12:00pm AEST
104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Navigating complex government reforms: a tool to examine program theory. How complete and relevant is government program theory?
Thursday September 19, 2024 11:30am - 12:00pm AEST
105
Authors: Nerida Rixon

Developing program theory for complex government reforms and charting the 'how' we will reach our destination is not easy. Perhaps you, like me, rely on well-tested templates? Do we challenge them and continually refine them to reflect emerging research and make them more useful for our purposes?

This research is about public policy packages and reforms and the program theories (or reform theories) that explain them. This research emerges from a desire to build program theory better, particularly in the context of whole of government reforms. Better program theory can drive better planning, monitoring and evaluation of performance, and better policy and public good.

Evidence shows Australian governments are not effectively planning, monitoring and evaluating performance of programs and policy packages. Theory can support development of meaningful performance indicators to track progress. Without strong program theory and clear strategy, as the Productivity Commission's recent 'Review of the National Agreement on Closing the Gap' study report suggests, we risk a 'spray and pray' approach to change, prioritisation of the wrong things and siloed policy responses.

A literature-informed checklist to analyse program theory for completeness and relevance to public administration is provided. Policy makers and evaluators are given a tool and lens to build more complete and relevant program theory and to improve existing program theory.

Analysis of program theory in 15 government reform strategies and outcomes frameworks is presented to show governments' strengths and opportunities. Governments are strong at identifying our destination, or the intended outcomes, though not always at identifying the 'how' we will get there. Governments could improve their program theory by making it more explicit and more complete, articulating the 'when' we expect to see changes from implementing the reforms. Government program theory might also be more relevant if potential unintended outcomes are referenced.
Chair
Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member... Read More →
Speakers
Thursday September 19, 2024 11:30am - 12:00pm AEST
105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

A new tool for participatory evaluation: A case study of the process of conducting online workshops with young creators with disabilities to tell stories using virtual reality animation
Thursday September 19, 2024 11:30am - 12:30pm AEST
103
Authors: Samantha Abbato (Visual Insights People), Lisa Stafford (University of Tasmania, AU)

Researchers from fields including public health, community, and disability have been utilising film methods such as participatory video and digital storytelling for decades. Co-creation of film narratives for evaluation can engage participants as unique people with lived experiences connected to social and cultural worlds and relationships, including their connection with the community. Evaluation has been reluctant to adopt participatory film methods.

Virtual Reality (VR) animation presents a distinctly participatory approach to evaluation data collection and a new lens for communicating findings. It places the participant in the driver's seat and the evaluation audience in the passenger seat alongside them. Using VR stories can increase the potential for the intended audience, including decision-makers, to engage deeply with the information communicated, through focused immersion in participant stories grounded in familiar local settings.

We present a case study examining the process of collaborating with young people with disabilities to tell their stories of inclusion in Tasmania, Australia. Three young people participated in online storyboarding and script-writing workshops over twelve months to develop short stories of everyday experiences in their community. An introduction to the participants and their stories, the three completed stories, and a collaborative call to action were made into a set of five connected VR short films. The films were displayed as a report on a website page and for viewing as a complete VR story on a headset.

The presenters examine the process of applying this new VR digital storytelling approach to participatory evaluation. The challenges and benefits of the approach for participants and its impact on the intended audience, including urban planning and design students, are discussed. Using the lessons learned from the case study, recommendations for evaluators considering using participatory digital storytelling and VR animation are made.
Chair
Michael Amon

Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD... Read More →
Speakers
Samantha Abbato

Director, Visual Insights People
My twenty-plus years of evaluation experience are built on academic training in qualitative and quantitative disciplines, including mathematics, health science, epidemiology, biostatistics, and medical anthropology. I am passionate about effective communication and evaluation capacity-building... Read More →
Thursday September 19, 2024 11:30am - 12:30pm AEST
103, 110 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Systems evaluation to the rescue!: How do we use systems evaluation to improve societal and planetary wellbeing?
Thursday September 19, 2024 12:00pm - 12:30pm AEST
104
Authors: Kristy Hornby (Grosvenor), Tenille Moselen (First Person Consulting)

Systems evaluation - many might have heard the term, but few have done one. This session shares two case studies of different systems evaluations and the learnings from these to benefit other evaluators who are conducting or about to begin a systems evaluation.

The session will open with an overview and explanation of what systems evaluation is, in terms of its key features and how it is distinguished from other forms of evaluation. The presenters will then talk through their case studies, one of which centres on the disability justice system in the ACT, while the other takes a sector-wide focus across the whole of Victoria. The co-presenters will share openly and honestly their initial plans for commencing the systems evaluations, how they had to amend those plans in response to real-world conditions, and the tips and tricks and innovations they picked up along the way.
Chair
Su-Ann Drew

Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for... Read More →
Speakers
Kristy Hornby

Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and... Read More →
Tenille Moselen

First Person Consulting
https://www.fpconsulting.com.au/our-team.html Tenille has qualifications in public health, with experience in mental health and wellbeing, alcohol and drug and international development. Her passion is creating change through design and bringing stakeholders together to address complex... Read More →
Thursday September 19, 2024 12:00pm - 12:30pm AEST
104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Optimising Evaluations of Wellbeing Programs in Schools
Thursday September 19, 2024 12:00pm - 12:30pm AEST
105
Authors: Tamara Van Der Zant (Australian Council for Educational Research), Katherine Dix (Australian Council for Educational Research, AU)

In this presentation we will discuss the diverse and flexible data collection methods suitable for program evaluation in the context of schools. We will discuss the types of evidence that can be used to evaluate social and emotional learning programs and wellbeing initiatives, specifically, and considerations when working with educators, children and young people. We will invite all to participate in discussions about challenges to the evaluation of these programs in complex, real-world contexts (including data quality, confounding factors, system requirements, etc.) and propose methods we use to navigate these challenges.

Optimising program evaluation methods is important because of the ever-growing number of wellbeing programs being offered to schools. Accordingly, the need for high quality evaluation to guide funding decisions and use of programs and initiatives to support student and educator wellbeing in schools has never been greater.

By drawing on comprehensive experience in undertaking wellbeing program evaluations, this presentation will share our lessons learnt and recommendations that should support evaluators in crafting contextually appropriate evaluations. High-quality program evaluations, often a requirement for ongoing funding, address the growing need for meaningful and accessible evidence that is currently being sought by schools, educators, funders, and policy decision makers.
Chair
Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member... Read More →
Speakers
Tamara Van Der Zant

Research Fellow, Australian Council for Educational Research
Tamara is a Research Fellow in the Educational Monitoring and Research Division at ACER. Prior to this role she completed her PhD in emotion research at the University of Queensland. She brings experience in research design, conducting research with diverse populations, broad data... Read More →
Dr Katherine Dix

Principal Research Fellow, School and System Improvement, Australian Council for Educational Research
Dr Katherine Dix is a Principal Research Fellow at ACER, with over 20 years as a program evaluator, educational researcher and Project Director. Dr Dix is the National Project Manager for Australia’s participation in OECD TALIS 2024, and is a leading expert in wellbeing and whole-school... Read More →
Thursday September 19, 2024 12:00pm - 12:30pm AEST
105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

A tool for addressing violence against women: An examination of the creation, benefits, and drawbacks of the Evidence Portal
Thursday September 19, 2024 1:30pm - 2:30pm AEST
103
Authors: Charlotte Bell (Australia's National Research Organisation for Women's Safety (ANROWS)), Lorelei Hine (ANROWS, AU), Elizabeth Watt (ANROWS, AU), Rhiannon Smith (ANROWS, AU)

The first of its kind in Australia, the Evidence Portal is an innovative tool that captures and assesses impact evaluations of interventions from high-income countries that aim to address and end violence against women.

While we know high-quality evaluation evidence is an important component in informing and influencing policy and practice, decision-makers face a variety of potential barriers in accessing this evidence. By providing a curated repository of existing research, evidence portals can support policymakers, practitioners, and evaluators in their decision-making.

Our Evidence Portal consolidates and synthesises impact evaluation evidence via: (1) Evidence and Gap Maps, which provide a big-picture, visual overview of interventions; and (2) Intervention Reviews, which provide a succinct, standardised assessment of interventions in accessible language. Underpinned by a rigorous systematic review methodology, this tool seeks to:
  • identify existing impact evaluations and gaps in the evidence base, and
  • promote a collective understanding of the nature and effectiveness of interventions that aim to address violence against women.

Key points: This presentation will showcase the creation, benefits, and drawbacks of the Evidence Portal, with a focused discussion on the following areas:
  • What are evidence portals and how are they used to inform policy and practice?
  • Why and how was this evidence portal created?
  • What are the challenges in creating this tool and the learnings to date?
  • What other 'ways of knowing' should be considered?

This presentation begins with an in-depth exploration of the Evidence Portal and the important methodological decisions taken to build this tool. It then offers a reflection on our journey of creating this tool with a focus on significant learnings to date. You will gain an understanding of the Evidence Portal and key considerations for future evaluations of violence against women interventions.
Chair
Prescilla Perera

Principal Monitoring and Evaluation Officer, DFFH
Speakers
Charlotte Bell

Research Manager, ANROWS
Charlotte Bell is an experienced researcher who focuses on domestic, family and sexual violence. Charlotte is a Research Manager (Acting) at ANROWS, where she has worked for several years across multiple research projects. With a keen interest in evaluation and impact, and extensive... Read More →
Lauren Hamilton

Evaluation and Partnerships Manager, Australia's National Research Organisation for Women's Safety (ANROWS)
Lauren has over 10 years of experience in the evaluation, design and management of social programs, with a focus on violence against women and children, and women’s health. In her current role, Lauren works directly with frontline services and funders of domestic, family and sexual... Read More →
Thursday September 19, 2024 1:30pm - 2:30pm AEST
103, 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

The Art of Qualitative Sensemaking: Exploring New Methods
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105
Authors: Alli Burness (Tetra Tech), Sharon Marra-Brown (ARTD, AU), Matt Healey (First Person Consulting, AU), Monica Wabuke (Tetra Tech, FJ)

Sensemaking is the process of making meaning and distilling the signal from the noise in primary research. Inclusive and transparent sensemaking ensures the critical link between evidence and insights is maintained, that evidence is interpreted correctly, and that the views of participants are accurately understood. Using intentional sensemaking approaches with integrity can ensure transparency and logical rigour in an evaluation or research project.

Despite its critical nature, sensemaking can often be the most opaque step in an evaluation process. While replication is a hallmark of good sensemaking, especially in academia, this is not always feasible in the fast-paced world of evaluation. The time required to do sensemaking well, the importance of applying the correct approaches and engaging the correct parties, and the critical role of a lead facilitator can be overlooked or underestimated. By shining a spotlight on this step in an evaluation, this session will highlight inclusive and accessible sensemaking approaches used across the design and evaluation spectrum to identify new or emergent approaches. It will pay particular attention to sensemaking when working in complex systems.

Panellists bring deep experience in evaluation or design research in Australian or international consulting settings. They will touch on what sensemaking approaches can be used to maintain integrity through a rapid or agile sensemaking process common in large or complex evaluations; popular sensemaking processes for coding data and new or emerging methods; and how insights or recommendations emerge from the sensemaking process. The moderator will start the panel by reflecting on the definitions, understanding and application of sensemaking, with an emphasis on inclusive and accessible aspects. Our presenters will then explore methods through this same lens and with a focus on emergent or new approaches. Methods will be presented in a manner that audience members can learn from and apply.
Chair
Janet Conte

Principal Evaluation Officer, DPIRD
I live in Perth (Boorloo) and have 3 children. I really enjoy being a co-convenor of the WA branch of the AES with Lisette Kaleveld. I'm interested in learning more about systems evaluation and building an evaluation culture.
Speakers
Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have: co-founded and grown an organisation (First Person Consulting) to a team of 16 people working... Read More →
Sharon Marra-Brown

Director, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence. Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade plus in the public service or how I weave... Read More →
Monica Wabuke

Associate Director - Research, Monitoring and Evaluation Practice, Tetra Tech International Development - Asia Pacific
Monica Wabuke is an Associate Director within Tetra Tech’s Research, Monitoring and Evaluation Practice (RME). She brings 14 years of experience in design, monitoring and evaluation and has provided technical support to DFAT, MFAT, EU, USAID and World Bank-funded projects and programs... Read More →
Alli Burness

Director, Australian Consulting, Tetra Tech
Alli is an Australian strategic designer and researcher with settler heritage, born and living on Bunurong Country. As Director of the Australian Consulting Practice at Tetra Tech International Development, Alli works with a First Nations team to support relational approaches across... Read More →
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Harnessing AI for Qualitative Data Analysis
Thursday September 19, 2024 2:00pm - 2:30pm AEST
Authors: Ethel Karskens (Clear Horizon)

This presentation covers the strategic integration of Artificial Intelligence (AI) methodologies for qualitative data analysis in evaluation processes. The increasing demand for sophisticated analytical tools necessitates a deep dive into AI's transformative potential in streamlining qualitative analysis. Through practical demonstrations and case studies, this session showcases how AI technologies can efficiently tackle the complexities of analysing qualitative data. Attendees will acquire actionable insights into leveraging AI to augment the efficiency and accuracy of qualitative analysis, empowering them to navigate the evolving landscape of evaluation methodologies.

Additionally, the presentation conducts a comprehensive comparative analysis of major AI models available in the market. By delineating their unique strengths and functionalities, participants will gain invaluable discernment in selecting the most appropriate AI model tailored to their evaluation objectives.

Moreover, the session delves into robust quality assurance (QA) strategies for validating AI-generated outputs, emphasising the essential role of evaluators as integral stakeholders in the analysis process. Attendees will explore techniques for seamlessly integrating human expertise with AI capabilities to refine and validate analysis outcomes. We will also explore how to do this in a way that respects common data privacy laws and policies. By fostering a symbiotic relationship between AI technologies and human evaluators, this presentation underscores the importance of collaborative synergy in optimising evaluation efficacy.

In conclusion, this presentation offers a comprehensive exploration of the transformative potential of AI in qualitative data analysis within evaluation contexts. Attendees will depart equipped with actionable strategies and insights to harness AI's power effectively, elevating the quality and efficiency of their evaluation processes to new heights.
Chair
Emily Saurman

Delegate, University of Sydney - School of Rural Health
Speakers
Ethel Karskens

Data and Insights Lead, Clear Horizon
I lead the data and insights strategy of Clear Horizon. This includes dashboard development and other data solutions to create insights for our clients. I am interested in innovation, data for good, and creating a data-driven culture in organisations.
Thursday September 19, 2024 2:00pm - 2:30pm AEST
101-102, 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

An update on practical applications of machine learning in evaluation
Thursday September 19, 2024 2:30pm - 3:00pm AEST
Authors: Gerard Atkinson (ARTD Consultants)

Last year saw the rise of large language models, with names like ChatGPT and Bard becoming part of common discussion. The evaluation community was not immune to this trend, and papers were published that looked at just how well machine learning approaches could do against human evaluators on topics such as qualitative analysis and evaluative judgement. The answer? Not as well as you would think (but you could get wrong answers faster than ever!)

But the designers of these models took this feedback on board and created newer and more sophisticated tools. In addition, there have been innovations in hybrid models which combine the best features of different methods while minimising their weaknesses. Coupled with this is the growing field of standalone models that can be run on a desktop computer yet produce responses that match or exceed those of cloud-based models, and models that can draw on rich contextual information (such as documentation or full interview transcripts) to make decisions.

This presentation provides an update on the state of machine learning in 2024 and presents new findings in relation to the performance of machine learning models on tasks including topic classification and rubric analysis.


Chair
Emily Saurman

Delegate, University of Sydney - School of Rural Health
Speakers
Gerard Atkinson

Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in: program and policy evaluation; workshop and community facilitation; machine learning and AI; market and social research; financial and operational modelling; and non-profit, government and business strategy. I am also a board member... Read More →
Thursday September 19, 2024 2:30pm - 3:00pm AEST
101-102, 105 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Journey Mapping: Visualising Competing Needs within Evaluations
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Jolenna Deo (Allen and Clarke Consulting)

Journey mapping acts as a GPS for grasping audience or consumer experience when evaluating policies or programs, highlighting twists, hidden gems, and pitfalls. It can be a useful tool to help evaluators capture disparities and competing needs among intended demographics. This session will discuss the journey mapping method, drawing from an evaluation of a Community Capacity Building Program which used journey mapping to illustrate key consumer personas. It will explore the integration of multiple data sources to provide a comprehensive understanding of complex disparities and the cultural and historical contexts in which these arise.
Chair
Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
Jolenna Deo

Consultant, Allen and Clarke Consulting
Jolénna is a consultant at Allen + Clarke Consulting. She is a proud Mirriam Mer, Pasifika woman with a background in Development studies, Pacific studies and social policy, combining her interests in indigenous methodologies and social justice. She is experienced in community and... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Reflections by a non-analyst on the use of state-wide data sets and modelled data in evaluation
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Gabby Lindsay-Smith 

Linked government data sets provide an opportunity to investigate the impact of state-wide programs and policies, but they are often out of reach for many evaluators, especially non-analysts. This presentation will detail a non-analyst's experience incorporating linked state data sets into a recent evaluation of a Victoria-wide family services program. The presentation will outline tips and tricks for those who may consider incorporating government-level linked data or simulation models into large program or policy evaluations in the future. It will cover areas such as: where to begin, navigating the data and key tips for working with analysts.
Chair
Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

The evolution of evaluation: Retracing our steps in evaluation theory to prepare for the future
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: James Ong (University of Melbourne)

As new people enter the evaluation field and as evaluation marches forward into the future, it is important to learn from the evaluation theorists who have come before us. My Ignite presentation will argue that modern evaluation is built on evaluation theory, and make the call for evaluators of all levels to learn evaluation theory to:
  1. Appreciate how evaluation has evolved;
  2. Strengthen their evaluation practice; and
  3. Navigate an ever-changing evaluation landscape.
Chair
Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
James Ong

Research Assistant (Evaluations), University of Melbourne
My name is James Ong. I am an Autistic program evaluator working at the University of Melbourne, where I work on evaluation and implementation in various public health projects such as the AusPathoGen program and the SPARK initiative. I not only have a strong theoretical... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Growing Australia's future evaluators: Lessons from emerging evaluator networks across the Asia Pacific
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Authors: Amanda Mottershead (Tetra Tech International Development), Qudratullah Jahid (Oxford Policy Management Australia, AU), Eroni Wavu (Pacific Community, FJ)

The sustainability of the evaluation sector requires emerging evaluators to be supported in pursuing high-quality practice. What this support needs to be and how it should be developed is much less certain. What topics should we focus on? How should we deliver it? Who should we deliver it to? How can the broader evaluation community support emerging evaluators?

Global experiences in emerging evaluator support contain a treasure trove of lessons which can fill this knowledge gap and inform effective support here in Australia. Experiences show that fostering a strong evaluation community that includes emerging evaluators can nurture, ignite and shape future evaluation practices. A variety of approaches are being adopted across the region, and the globe, to foster this sense of community, ranging from formal capacity building to more informal approaches that focus on experience sharing.

In this session, we bring together current and former emerging evaluator leaders from across the Asia Pacific region to answer some of these questions and understand what approaches could work best for the Australian context. This will include presentations and discussion on in-demand topics, how to formulate support, how to target emerging evaluators and the best means of delivery. The session will be highly interactive, engaging the audience in a question-and-answer forum on this important topic. All panel members have been engaged with emerging evaluator networks in their countries or regions and bring diverse experiences to facilitate cross learning. The session will provide practical ways forward for the broader evaluation community to grow and support the future of evaluation.
Chair
Speakers
Qudratullah Jahid

Senior MEL Consultant, Oxford Policy Management
I am a monitoring, evaluation, research, and learning specialist with a background in bilateral and multilateral development organisations. With expertise in MEL frameworks and systems, I support OPM projects in the Indo-Pacific. My focus areas include MEL frameworks, mixed methods... Read More →
Amanda Mottershead

Consultant - Research, Monitoring and Evaluation, Tetra Tech International Development
I enjoy the breadth of evaluation in international development. I've had the opportunity to work across sectors including economic development, infrastructure, energy, education and inclusion. I enjoy generating evidence that promotes improvements to organisations, policies and programs... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Plenary 1, 114 Convention Centre Pl, South Wharf VIC 3006, Australia
 