Conference hashtag #aes24MEL
Wednesday, September 18
 

11:00am AEST

Innovating Value for Money: Finding Our Way to Greater Value for All
Wednesday September 18, 2024 11:00am - 12:00pm AEST
105
Authors: John Gargani (Gargani + Co), Julian King (Julian King & Associates, NZ)

In this participatory session, we pose the question, "How should evaluators innovate the practice of value-for-money assessment to meet the needs of an expanding set of actors that include governments, philanthropists, impact investors, social entrepreneurs, program designers, and Indigenous and First Nations communities?" We begin by framing value for money as an evaluative question about an economic problem. How well are we using resources, and are we using them well enough to justify their use? Then we suggest new methods intended to help innovate the practice of value for money based on our body of published and current research spanning over 10 years.
These include new methods that (1) produce "holistic" assessments of value for money, (2) reflect rather than hide multiple value perspectives even when values conflict, (3) estimate social benefit-cost ratios without monetizing benefits or costs, and (4) adjust monetary and nonmonetary value for risk using Bayesian methods. Along the way, we facilitate discussions with participants, asking them to consider if, how, and by whom these innovations should be pursued, and what other innovations may be needed. We provide participants with access to a collection of our published and draft papers, and invite them to comment and continue our discussion after the conference.
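
For intuition only, here is a minimal sketch of what a simulation-based risk adjustment of a non-monetised value ratio might look like. The 0-100 value scale, the distributions and all parameters below are hypothetical illustrations, not the authors' published method:

```python
# Illustrative only: risk-adjusting a benefit-cost-style value ratio via
# Monte Carlo simulation. The value scale and parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Benefits and costs scored on a common 0-100 value scale (no monetisation),
# with uncertainty expressed as probability distributions.
benefits = rng.normal(loc=70, scale=15, size=N)  # elicited value of outcomes
costs = rng.normal(loc=50, scale=8, size=N)      # elicited value of resources used

ratio = benefits / np.clip(costs, 1e-9, None)    # guard against division by ~0

print(f"Median value ratio: {np.median(ratio):.2f}")
print(f"P(ratio > 1):       {(ratio > 1).mean():.1%}")  # risk-adjusted verdict
```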
Chair
Jade Maloney
Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research...
Speakers
Julian King
Director, Julian King & Associates
I’m an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I’m affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/...
John Gargani
President, Gargani + Company (former President of the American Evaluation Association)
Dr John Gargani is an evaluator with 30 years of experience and eclectic interests. He is President of the evaluation consulting firm Gargani + Company, served as President of the American Evaluation Association in 2016, coauthored the book Scaling Impact: Innovation for the Public...
Farida Fleming
Evaluation Principal, Assai
I'm an evaluator with over 25 years of experience in international development. I'm currently one of a core team supporting DFAT to implement its Evaluation Improvement Strategy.
Wednesday September 18, 2024 11:00am - 12:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Enhancing Stakeholder Engagement Through Culturally Sensitive Approaches: A Focus on Aboriginal and Torres Strait Islander Communities
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
105
Authors: Mark Power (Murawin), Carol Vale (Murawin, AU)

This presentation explores the paramount importance of culturally sensitive engagement methodologies in ensuring meaningful contributions from Aboriginal and Torres Strait Islander communities to mission programs. Murawin, an Aboriginal-led consultancy, has developed a robust Indigenous Engagement Strategy Framework grounded in the principles of reciprocity; free, prior and informed consent; mutual understanding; accountability; power sharing; and respect for Indigenous knowledge systems. Our session aims to share insights into the necessity of prioritising Aboriginal and Torres Strait Islander voices in engagement, co-design, and research, highlighting the significance of cultural competence in fostering mutual respect and understanding.
We will discuss three key messages: the imperative of deep knowledge and understanding of Aboriginal and Torres Strait Islander cultures in engagement practices; the success of co-design processes in facilitating genuine and respectful engagement; and the strategic partnership with CSIRO to enhance cultural competence and inclusivity in addressing Indigenous aspirations and challenges. These points underscore the critical role of acknowledging cultural interactions and ensuring cultural sensitivity in building strong, respectful, productive relationships with Indigenous communities.
To achieve our session's objectives, we have designed an interactive format that blends informative presentations with the analysis of case studies, complemented by engaging intercultural discussions. This approach is intended to equip participants with actionable insights drawn from real-world examples of our collaborative ventures and co-designed projects. Through this comprehensive exploration, we aim to enrich participants' understanding of successful strategies for engaging Aboriginal and Torres Strait Islander communities, ultimately contributing to the achievement of more inclusive and impactful outcomes in mission programs and beyond.


Chair
Jade Maloney
Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research...
Speakers
Carol Vale
CEO & Co-founder, Murawin
Carol Vale is a Dunghutti entrepreneur, businesswoman, CEO and co-founder of Murawin, whose passion, determination and commitment have driven her impressive 40-year career as a specialist in intercultural consultation, facilitation, and participatory engagement, and an empathetic...
Mark Power
Director, Evaluation & Research, Murawin
Mark is a researcher with more than 20 years of experience in Australia and the Pacific. Mark manages Murawin’s evaluation and research practice and leads multiple evaluations for a variety of clients. Mark has overseen more than 30 high-profile, complex projects funded...
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Valuing First Nations Cultures in Cost-Benefit Analysis
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
103
Authors: Laura Faulkner (NSW Treasury)

This paper presents the key findings from research and engagement on how cost-benefit analysis (CBA) has been applied to First Nations initiatives to date. CBA is an important tool used by governments to help prioritise budget funding decisions. It assesses the potential impacts of an initiative - economic, social, environmental, and cultural - to determine whether it will deliver value for money.
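
As background to how a CBA verdict is typically reached, the sketch below computes a benefit-cost ratio and net present value from discounted annual flows. The figures and the 7% discount rate are hypothetical illustrations, not NSW Treasury parameters:

```python
# Illustrative only: a stylised cost-benefit calculation.

def present_value(flows, rate):
    """Discount a list of annual flows (year 0 first) to present value."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

benefits = [0, 40, 60, 80]   # $m per year (hypothetical)
costs = [100, 10, 10, 10]    # $m per year (hypothetical)
rate = 0.07                  # illustrative discount rate

bcr = present_value(benefits, rate) / present_value(costs, rate)
npv = present_value(benefits, rate) - present_value(costs, rate)
print(f"BCR = {bcr:.2f}, NPV = ${npv:.1f}m")  # BCR > 1 suggests value for money
```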

The paper explores the ways in which the value of First Nations cultures has been incorporated into CBAs, along with the associated challenges and opportunities to improve current practice. The findings have informed the development of an investment framework for the design and evaluation of initiatives that affect First Nations people and communities. The framework focuses on the key principles for embedding First Nations perspectives and ensuring culturally informed evaluative thinking.


Chair
Christina Kadmos
Principal, Kalico Consulting
Speakers
Laura Faulkner
Senior Analyst, First Nations Economic Wellbeing, NSW Treasury
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

When speed is of the essence: How to make sure the rubber hits the road
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
103
Authors: Kristy Hornby (Grosvenor)

There is a lot of interest in rapid M&E planning and rapid evaluations at present, born partly of the fast-moving COVID-19 policy context and partly of shrinking appetites for the time and money spent on evaluations. It is unlikely this trend will reverse in the short term, so how do we acquit our responsibilities as evaluators, ethically and appropriately, in a rapid context? This session sets out a step-by-step approach to conducting a rapid evaluation, inviting attendees to follow along with their own program in mind so they come away from the session with a pathway for conducting their own rapid evaluation. The session uses a fictional case study to move the rapid evaluation approach forward, describing throughout how you can use literature reviews, qualitative and quantitative data collection and analysis techniques, and report-writing approaches innovatively to save time without compromising rigour.

We contend it is possible to do a rapid evaluation ethically and appropriately, but the backbone of doing so is good planning and execution. This session shares practical tips and approaches for doing so through each key phase of an evaluation, so attendees are well-equipped for their next rapid evaluation.

To consolidate the learning, attendees will be provided with a framework so they come away from the session with a high-level plan for conducting their own rapid evaluation, increasing their chance of success.

Speakers
Kristy Hornby
Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and...
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Developing a Tool for Measuring Evaluation Maturity at a Federal Agency
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
105
Authors: Eleanor Kerdo (Attorney-General's Department), Claudia Oke (Attorney-General's Department, AU), Michael Amon (Attorney-General's Department, AU), Anthony Alindogan (Attorney-General's Department, AU)

To embed a culture of evaluation across the Australian Public Service (Commonwealth of Australia, 2021), we must first have an accurate understanding of the current state of evaluation capability and priorities across Commonwealth agencies. This paper shares tools on how to build an effective measurement framework for evaluation culture, and discusses how to use these for evaluation capability uplift.
We explore quantitative and qualitative methods to gather and analyse data to measure an organisation's readiness to change its culture towards evaluation. This includes assessing staff attitudes towards evaluation, the level of opportunity for staff to conduct and use evaluation, and confidence in their knowledge of evaluation.
We discuss the development of a staff evaluation culture survey based on Preskill & Boyle's ROLE and how behavioural insight tools can be utilised to boost engagement. The paper discusses the utility of holding focus groups with senior leaders to understand authorising environments for evaluation and key leverage points. Also discussed are the challenges encountered throughout the assessment process and the innovative solutions they prompted.
This paper will be valuable for those who work in, or with, any government agency with an interest in evaluation capacity building and driving an evaluation culture within organisations. The paper explains each stage of measurement design, data analysis and results, and discusses opportunities for action.
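
As a hedged illustration of the kind of scoring such a survey might involve (the dimension names and items below are hypothetical stand-ins, not the Department's instrument):

```python
# Minimal sketch: scoring a staff evaluation-culture survey by dimension.
# Dimensions loosely echo the attitudes/opportunity/confidence areas above.
import pandas as pd

responses = pd.DataFrame({
    "attitudes_q1": [4, 5, 3, 2],    # 1-5 Likert items, one row per respondent
    "attitudes_q2": [4, 4, 3, 3],
    "opportunity_q1": [2, 3, 2, 1],
    "confidence_q1": [3, 4, 2, 2],
})

dimensions = {
    "attitudes": ["attitudes_q1", "attitudes_q2"],
    "opportunity": ["opportunity_q1"],
    "confidence": ["confidence_q1"],
}

# Mean item score per dimension, averaged across respondents.
scores = {dim: responses[items].mean(axis=1).mean()
          for dim, items in dimensions.items()}
print(scores)  # low dimensions flag where capability uplift could focus
```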
1. Preskill, H., & Boyle, S. (2008). A Multidisciplinary Model of Evaluation Capacity Building. American Journal of Evaluation, 29(4), 443-459. https://journals.sagepub.com/doi/10.1177/1098214008324182

2. Michie, S., Atkins, L., & West, R. (2014). The Behaviour Change Wheel: A Guide to Designing Interventions. London: Silverback Publishing. www.behaviourchangewheel.com

3. Lahey, R. (2009). A Framework for Developing an Effective Monitoring and Evaluation System in the Public Sector: Key Considerations from International Experience.
Chair
Marwan El Hassan
Director, Future Drought Fund Program Evaluation and Support, Department of Agriculture, Fisheries and Forestry
I am the director of the Program Evaluation and Support team at the Future Drought Fund (FDF). My team is responsible for supporting the FDF's program areas in their monitoring, evaluation and learning work, and for ensuring alignment of our MEL work with other areas around the department...
Speakers
Michael Amon
Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Anthony Alindogan
Evaluation Lead, Attorney-General's Department
Anthony is an experienced evaluator with a particular interest in outcomes measurement and value-for-money. He completed his Master of Evaluation at the University of Melbourne. Anthony is an enthusiastic writer and has publications in various journals including the Evaluation...
Claudia Oke
Project Officer / Data Analyst, Australian Public Service Commission
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Uncovering Hidden STEM Footprints: Leveraging Output Data from Questacon’s Outreach Programs
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
104
Authors: Jake Clark (Questacon - The National Science and Technology Centre), Jenny Booth (Questacon - The National Science and Technology Centre, AU), Sharon Smith (Questacon - The National Science and Technology Centre, AU), Nick Phillis (Questacon - The National Science and Technology Centre, AU)

Join our Monitoring and Evaluation team on an exciting journey as we delve into the untapped potential of output data in evaluation and how to reach beyond the 'low-hanging fruit'.

Notwithstanding the importance of evaluating outcomes to measure program success, monitoring the implementation and reach of initiatives is fundamental to good program management and evaluation. Output data on activity reach, target groups and participants often hold hidden gems of potential that are frequently overlooked. In this presentation we shine a spotlight on their significance and share actionable tips to elevate monitoring data.

Our objective is to make this exploration enjoyable and enlightening, especially for foundational to intermediate level evaluators. We offer practical and universally applicable strategies for making the most of output data to enhance program insights.

KEY MESSAGES

Using existing tools and tapping into open-source data sets, you can create powerful visualisations and draw deeper inferences about your program's reach and participants.

I. Understanding equity and inclusion
• A better understanding of who is and isn't involved in your initiative.
• Looking for patterns using socio-demographic variables.
• Benchmarking your initiative against relevant population data.

II. Connecting outputs to outcomes
• Analysing participant characteristics and program journey to illuminate differences in outcomes.
• Uncovering program and policy questions that need further exploration.

Design of the Session: Drawing tangible examples from the education and informal learning STEM sector, we bridge the gap between theory and practice. Real-world strategies are shared to encourage active participation along with useful resource links.
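
To make the benchmarking key message concrete, here is a minimal sketch of comparing participant counts against population data by region. The regions and figures are invented, and a real analysis might draw on ABS census tables rather than the hand-typed frame below:

```python
# Sketch: benchmark program reach against population data by region.
import pandas as pd

participants = pd.DataFrame({
    "region": ["Remote NT", "Regional QLD", "Metro VIC"],
    "attendees": [120, 450, 2300],          # hypothetical output data
})
population = pd.DataFrame({
    "region": ["Remote NT", "Regional QLD", "Metro VIC"],
    "school_age_pop": [8_000, 60_000, 420_000],  # hypothetical benchmark
})

reach = participants.merge(population, on="region")
reach["per_1000"] = 1000 * reach["attendees"] / reach["school_age_pop"]
print(reach.sort_values("per_1000", ascending=False))  # who is under-served?
```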
Speakers
Jake Clark
Senior Monitoring and Evaluation Officer, Questacon - National Science and Technology Centre
What value does STEM outreach bring to an individual? How does it change someone's attitude/behaviour/disposition around STEM? And how do you quantify such probing questions? These are the types of queries I'm answering in my Senior Evaluation Officer role at Australia's National...
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

5:00pm AEST

Failing your way to better practice: How to tread carefully when things aren't going as planned
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
105
Authors: Stephanie White (Victoria Department of Education )

Evaluators can fail in many ways. The consequences of these failures can be relatively contained or wide ranging within the evaluation and can also flow on to program operations. But failure is a part of life and can be a useful catalyst for professional growth. What happens when you find yourself failing and can see the risks ahead? How do you keep going?

The session focuses on the experiences of an emerging evaluator who failed while leading a large-scale education evaluation. When some elements of the evaluation became untenable, they struggled to find the right path forward and could foresee the risks materialising if the situation wasn’t addressed. On the other side of it, they reflect on how they drew on tools in every evaluator’s toolkit to start remedying their previous inaction and missteps to get the evaluation back on track…and improve their practice along the way!

This session is relevant to any evaluator who grapples with the messiness of expectations and reality in their practice.


Chair
Marwan El Hassan
Director, Future Drought Fund Program Evaluation and Support, Department of Agriculture, Fisheries and Forestry
I am the director of the Program Evaluation and Support team at the Future Drought Fund (FDF). My team is responsible for supporting the FDF's program areas in their monitoring, evaluation and learning work, and for ensuring alignment of our MEL work with other areas around the department...
Speakers
Stephanie White
Victoria Department of Education
I found my way to evaluation to help me answer questions about education program quality and success. Professionally, I have diverse experiences in education and evaluation, from delivering playgroups under trees in the NT to reports on educator resources to senior education bureaucrats...
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Thursday, September 19
 

10:30am AEST

An evaluator in need of an evaluation
Thursday September 19, 2024 10:30am - 11:30am AEST
105
Authors: Dana Cross (Grosvenor ),Kristy Hornby (Grosvenor )

"If all you have is a hammer, then everything looks like a nail." - Maslow/Kaplan/Unknown

Maslow's Hammer (aka the law of the instrument or golden hammer) and déformation professionnelle are concepts that speak to cognitive biases that can limit our effectiveness. Essentially, they mean that we use what we know; as evaluators, that is evaluation.

How can we as evaluators and commissioners of evaluations avoid cognitive bias and work effectively within (evaluation) policy parameters to ensure we are adding value and not using evaluation as the only tool in our toolbox?

We invite you to join us in a fast-paced interactive session to:
  • unpack the ways in which our expertise can get in our way
  • explore what it means to stay open to other tools as evaluation professionals and commissioners of evaluation
  • consider how this challenges us as individuals and as a profession.
Chair
Charlie Tulloch
Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Speakers
Dana Cross
Associate Director, Grosvenor
Dana is a public sector expert, possessing over 17 years of deep experience advising government organisations on program evaluation, organisational review, service optimisation and performance management. She is a member of Grosvenor’s Executive Leadership Team as Head of Strategy...
Thursday September 19, 2024 10:30am - 11:30am AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Navigating the maze of causality: Understanding the relationship between carcinogenic betel nut consumption and learning outcomes
Thursday September 19, 2024 10:30am - 11:30am AEST
106
Authors: Kabira Namit (Abt Global), Kathryn Lee (Abt Global, AU)

This hands-on session is designed to strengthen emerging evaluators' intuition about causality in non-experimental evaluations.

In environments where conventional RCTs are unethical or unfeasible, identifying causal factors within the complex weave of societal factors and individual behaviours presents a significant challenge. Centred on a novel research project from Papua New Guinea, this session navigates this maze through real-world research exploring the intricate relationship between consumption of carcinogenic betel nut and educational outcomes. By focusing on this specific case study, we provide a concrete context for participants to understand the broader implications of causal explorations in fragile and sensitive settings.

Participants will actively engage in small group discussions in a collaborative learning environment where they can practice and refine their skills in causal evaluation by discussing scenarios that are reflective of real-world complexities.

This session aims to move beyond simply documenting correlations, encouraging a deep dive into the underlying dynamics of causal linkages. Through this exploration, we aim to eventually guide participants to discussions on pathways for targeted interventions and policy formulations which take causal chains into account.

Additionally, we aim to spark dialogue on the ethical dimensions of 'activist research,' exploring how evaluators can navigate moral dilemmas while advocating for meaningful change. This hands-on session not only seeks to build evaluative skills but also to inspire participants to consider the broader implications of their work on societal well-being and ethical research practices.
Chair
Mary Ann Wong
Research Specialist, California State University, Sacramento
Speakers
Thursday September 19, 2024 10:30am - 11:30am AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Navigating complex government reforms: a tool to examine program theory. How complete and relevant is government program theory?
Thursday September 19, 2024 11:30am - 12:00pm AEST
105
Authors: Nerida Rixon

Developing program theory for complex government reforms, and charting how we will reach our destination, is not easy. Perhaps you, like me, rely on well-tested templates? Do we challenge them and continually refine them to reflect emerging research and make them more useful for our purposes?

This research is about public policy packages and reforms and the program theories (or reform theories) that explain them. This research emerges from a desire to build program theory better, particularly in the context of whole of government reforms. Better program theory can drive better planning, monitoring and evaluation of performance, and better policy and public good.

Evidence shows Australian governments are not effectively planning, monitoring and evaluating performance of programs and policy packages. Theory can support development of meaningful performance indicators to track progress. Without strong program theory and clear strategy, as the Productivity Commission's recent 'Review of the National Agreement on Closing the Gap' study report suggests, we risk a 'spray and pray' approach to change, prioritisation of the wrong things and siloed policy responses.

A literature informed checklist to analyse program theory for completeness and relevance to public administration is provided. Policy makers and evaluators are given a tool and lens to build more complete and relevant program theory and to improve existing program theory.

Analysis of program theory in 15 government reform strategies and outcomes frameworks is presented to show governments' strengths and opportunities. Governments are strong at identifying the destination, or intended outcomes, though not always at identifying how we will get there. Governments could improve their program theory by making it more explicit and more complete, articulating when we expect to see changes from implementing the reforms. Government program theory might also be more relevant if potential (unintended) outcomes were referenced.
Chair
Charlie Tulloch
Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Speakers
Thursday September 19, 2024 11:30am - 12:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

A new tool for participatory evaluation: A case study of the process of conducting online workshops with young creators with disabilities to tell stories using virtual reality animation
Thursday September 19, 2024 11:30am - 12:30pm AEST
103
Authors: Samantha Abbato (Visual Insights People), Lisa Stafford (University of Tasmania, AU)

Researchers from fields including public health, community development, and disability have been utilising film methods such as participatory video and digital storytelling for decades. Co-creation of film narratives for evaluation can engage participants as unique people with lived experiences connected to social and cultural worlds and relationships, including their connection with the community. Evaluation, however, has been slow to adopt participatory film methods.

Virtual Reality (VR) animation presents a distinctly participatory approach to evaluation data collection and a new lens for communicating findings. It places the participant in the driver's seat and the evaluation audience in the passenger seat alongside them. VR stories can increase the potential for the intended audience, including decision-makers, to engage deeply with the information communicated, through focused immersion in participant stories and the familiarity of local settings.

We present a case study examining the process of collaborating with young people with disabilities to tell their stories of inclusion in Tasmania, Australia. Three young people participated in online storyboarding and script-writing workshops over twelve months to develop short stories of everyday experiences in their community. An introduction to the participants and their stories, the three completed stories, and a collaborative call to action were made into a set of five connected VR short films. The films were displayed as a report on a website page and for viewing as a complete VR story on a headset.

The presenters examine the process of applying this new VR digital storytelling approach to participatory evaluation. The challenges and benefits of the approach for participants and its impact on the intended audience, including urban planning and design students, are discussed. Using the lessons learned from the case study, recommendations for evaluators considering using participatory digital storytelling and VR animation are made.
Chair
Michael Amon
Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Speakers
Samantha Abbato
Director, Visual Insights People
My twenty-plus years of evaluation experience are built on academic training in qualitative and quantitative disciplines, including mathematics, health science, epidemiology, biostatistics, and medical anthropology. I am passionate about effective communication and evaluation capacity-building...
Thursday September 19, 2024 11:30am - 12:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Man vs. Machine: Reflections on machine-assisted and human-driven approaches used to examine open-text progress reports.
Thursday September 19, 2024 1:30pm - 2:00pm AEST
Authors: Stephanie Quail (ARTD Consultants), Kathleen De Rooy (ARTD Consultants, AU)

Progress reports and case notes contain rich information about program participants' experiences and frequently describe theoretically important risk and protective factors that are not typically recorded in administrative datasets. However, the unstructured, narrative nature of these data - and, often, their sheer volume - is a barrier to human-driven qualitative analysis. Often, the data cannot be included in evaluations because doing so is too time- and resource-intensive.

This paper will describe three approaches to the qualitative analysis of progress reports used to examine within-program trajectories for participants, and the factors important for program success as part of an evaluation of the Queensland Drug and Alcohol Court.

It will explore how we navigated the balance between human and machine-driven qualitative analysis. We will reflect on the benefits and challenges of text-mining - how humans and machines stack up against each other when identifying the sentiment and emotion in text, the strengths and challenges of each approach, the lessons we have learned, and considerations for using these types of approaches to analyse datasets of progress reports in future evaluations.
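
For readers new to machine-assisted text analysis, here is a minimal sketch of the kind of sentiment pass that might be compared against human coding. The notes below are synthetic, and this is not the authors' pipeline; real case notes would require de-identification and careful governance:

```python
# Sketch: scoring sentiment of (synthetic) progress-note excerpts with VADER.
import nltk
nltk.download("vader_lexicon", quiet=True)  # one-off lexicon fetch
from nltk.sentiment import SentimentIntensityAnalyzer

notes = [
    "Participant attended all sessions and reports stable housing.",
    "Missed two appointments; family conflict escalating.",
]

sia = SentimentIntensityAnalyzer()
for note in notes:
    score = sia.polarity_scores(note)["compound"]  # -1 (negative) to +1 (positive)
    print(f"{score:+.2f}  {note}")
```

A human coder would still need to review such scores: lexicon-based tools routinely misread clinical shorthand, negation and context, which is precisely the human-versus-machine gap the paper examines.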
Chair
Emily Saurman
Delegate, University of Sydney - School of Rural Health
Speakers
Stephanie Quail
Manager, ARTD Consultants
Thursday September 19, 2024 1:30pm - 2:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

A tool for addressing violence against women: An examination of the creation, benefits, and drawbacks of the Evidence Portal
Thursday September 19, 2024 1:30pm - 2:30pm AEST
103
Authors: Charlotte Bell (Australia's National Research Organisation for Women's Safety (ANROWS)), Lorelei Hine (ANROWS, AU), Elizabeth Watt (ANROWS, AU), Rhiannon Smith (ANROWS, AU)

The first of its kind in Australia, the Evidence Portal is an innovative tool that captures and assesses impact evaluations of interventions from high-income countries that aim to address and end violence against women.

While we know high-quality evaluation evidence is an important component in informing and influencing policy and practice, decision-makers face a variety of potential barriers in accessing this evidence. By providing a curated repository of existing research, evidence portals can support policymakers, practitioners, and evaluators in their decision-making.

Our Evidence Portal consolidates and synthesises impact evaluation evidence via: (1) Evidence and Gap Maps, which provide a big-picture, visual overview of interventions; and (2) Intervention Reviews, which provide a succinct, standardised assessment of interventions in accessible language. Underpinned by a rigorous systematic review methodology, this tool seeks to:
  • identify existing impact evaluations and gaps in the evidence base, and
  • promote a collective understanding of the nature and effectiveness of interventions that aim to address violence against women.

Key points: This presentation will showcase the creation, benefits, and drawbacks of the Evidence Portal, with a focused discussion on the following areas:
  • What are evidence portals and how are they used to inform policy and practice?
  • Why and how was this evidence portal created?
  • What are the challenges in creating this tool and the learnings to date?
  • What other 'ways of knowing' should be considered?

This presentation begins with an in-depth exploration of the Evidence Portal and the important methodological decisions taken to build this tool. It then offers a reflection on our journey of creating this tool with a focus on significant learnings to date. You will gain an understanding of the Evidence Portal and key considerations for future evaluations of violence against women interventions.
Chair
Prescilla Perera
Principal Monitoring and Evaluation Officer, DFFH
Speakers
Charlotte Bell
Research Manager, ANROWS
Charlotte Bell is an experienced researcher who focuses on domestic, family and sexual violence. Charlotte is a Research Manager (Acting) at ANROWS, where she has worked for several years across multiple research projects. With a keen interest in evaluation and impact, and extensive...
Lauren Hamilton
Evaluation and Partnerships Manager, Australia's National Research Organisation for Women's Safety (ANROWS)
Lauren has over 10 years of experience in the evaluation, design and management of social programs, with a focus on violence against women and children, and women’s health. In her current role, Lauren works directly with frontline services and funders of domestic, family and sexual...
Thursday September 19, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

The Art of Qualitative Sensemaking: Exploring New Methods
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105
Authors: Alli Burness (Tetra Tech), Sharon Marra-Brown (ARTD, AU), Matt Healey (First Person Consulting, AU), Monica Wabuke (Tetra Tech, FJ)

Sensemaking is the process of making meaning: distilling the signal from the noise in primary research. Inclusive and transparent sensemaking maintains the critical link between evidence and insights, ensures evidence is interpreted correctly, and ensures participants' views are understood as intended. Using intentional sensemaking approaches with integrity can ensure transparency and logical rigour in an evaluation or research project.

Despite its critical nature, sensemaking can often be the most opaque step in an evaluation process. While replication is a hallmark of good sensemaking, especially in academia, it is not always feasible in the fast-paced world of evaluation. The time required to do sensemaking well, the importance of applying the correct approaches and engaging the correct parties, and the critical role of a lead facilitator can be overlooked or underestimated. By shining a spotlight on this step, this session will highlight inclusive and accessible sensemaking approaches used across the design and evaluation spectrum to identify new or emergent approaches. It will pay particular attention to sensemaking when working in complex systems.

Panellists bring deep experience in evaluation or design research in Australian or international consulting settings. They will touch on what sensemaking approaches can be used to maintain integrity through a rapid or agile sensemaking process common in large or complex evaluations; popular sensemaking processes for coding data and new or emerging methods; and how insights or recommendations emerge from the sensemaking process. The moderator will start the panel by reflecting on the definitions, understanding and application of sensemaking, with an emphasis on inclusive and accessible aspects. Our presenters will then explore methods through this same lens and with a focus on emergent or new approaches. Methods will be presented in a manner that audience members can learn and apply.
Chair
Janet Conte
Principal Evaluation Officer, DPIRD
I live in Perth (Boorloo) and have 3 children. I really enjoy being a co-convenor of the WA branch of the AES with Lisette Kaleveld. I'm interested in learning more about systems evaluation and building an evaluation culture.
Speakers
Matt Healey
Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have co-founded and grown an organisation (First Person Consulting) to a team of 16 people working...
Sharon Marra-Brown
Director, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence. Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade plus in the public service or how I weave...
Monica Wabuke
Associate Director - Research, Monitoring and Evaluation Practice, Tetra Tech International Development - Asia Pacific
Monica Wabuke is an Associate Director within Tetra Tech’s Research, Monitoring and Evaluation Practice (RME). She brings 14 years of experience in design, monitoring and evaluation and has provided technical support to DFAT, MFAT, EU, USAID and World Bank-funded projects and programs...
Alli Burness
Director, Australian Consulting, Tetra Tech
Alli is an Australian strategic designer and researcher with settler heritage, born and living on Bunurong Country. As Director of the Australian Consulting Practice at Tetra Tech International Development, Alli works with a First Nations team to support relational approaches across...
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Harnessing AI for Qualitative Data Analysis
Thursday September 19, 2024 2:00pm - 2:30pm AEST
Authors: Ethel Karskens (Clear Horizon)

This presentation covers the strategic integration of Artificial Intelligence (AI) methodologies for qualitative data analysis in evaluation processes. The increasing demand for sophisticated analytical tools necessitates a deep dive into AI's transformative potential in streamlining qualitative analysis. Through practical demonstrations and case studies, this session showcases how AI technologies can efficiently tackle the complexities of analysing qualitative data. Attendees will acquire actionable insights into leveraging AI to augment the efficiency and accuracy of qualitative analysis, empowering them to navigate the evolving landscape of evaluation methodologies.

Additionally, the presentation conducts a comprehensive comparative analysis of major AI models available in the market. By delineating their unique strengths and functionalities, participants will gain invaluable discernment in selecting the most appropriate AI model tailored to their evaluation objectives.

Moreover, the session delves into robust quality assurance (QA) strategies for validating AI-generated outputs, emphasising the essential role of evaluators as integral stakeholders in the analysis process. Attendees will explore techniques for seamlessly integrating human expertise with AI capabilities to refine and validate analysis outcomes, including how to do so in a way that respects common data privacy laws and policies. By fostering a symbiotic relationship between AI technologies and human evaluators, this presentation underscores the importance of collaborative synergy in optimising evaluation efficacy.

In conclusion, this presentation offers a comprehensive exploration of the transformative potential of AI in qualitative data analysis within evaluation contexts. Attendees will depart equipped with actionable strategies and insights to harness AI's power effectively, elevating the quality and efficiency of their evaluation processes to new heights.
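
One common human-in-the-loop pattern of the sort discussed above can be sketched as follows. The model name, prompt and theme list are placeholders, and the sketch assumes the OpenAI Python SDK with an API key configured; it is an illustration of the pattern, not the presenter's tooling:

```python
# Sketch: an LLM proposes a thematic code; a human evaluator confirms or
# corrects it before the code is accepted (the QA step discussed above).
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
THEMES = ["access barriers", "staff support", "program design", "other"]

def propose_code(excerpt: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[{"role": "user",
                   "content": f"Classify into one of {THEMES}: {excerpt!r}. "
                              "Reply with the theme only."}],
    )
    return resp.choices[0].message.content.strip()

excerpt = "The forms were only online and many clients had no internet."
suggestion = propose_code(excerpt)
verdict = input(f"LLM suggests '{suggestion}'. Accept? [y/n] ")  # human QA
final = suggestion if verdict.lower() == "y" else input("Correct theme: ")
print("Final code:", final)
```

Note that sending real qualitative data to a cloud model raises exactly the privacy questions the session flags; a locally hosted model or de-identified excerpts are common mitigations.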
Chair
Emily Saurman
Delegate, University of Sydney - School of Rural Health
Speakers
Ethel Karskens
Data and Insights Lead, Clear Horizon
I lead the data and insights strategy of Clear Horizon. This includes dashboard development and other data solutions to create insights for our clients. I am interested in innovation, data for good, and creating a data-driven culture in organisations.
Thursday September 19, 2024 2:00pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

An update on practical applications of machine learning in evaluation
Thursday September 19, 2024 2:30pm - 3:00pm AEST
Authors: Gerard Atkinson (ARTD Consultants)

Last year saw the rise of large language models, with names like ChatGPT and Bard becoming part of common discussion. The evaluation community was not immune to this trend, and papers were published that looked at just how well machine learning approaches could do against human evaluators on tasks such as qualitative analysis and evaluative judgement. The answer? Not as well as you would think (but you could get wrong answers faster than ever!)

But the designers of these models took on the feedback and created newer and more sophisticated tools. In addition, there have been innovations in hybrid models that combine the best features of different methods while minimising their weaknesses. Coupled with this is the growing field of standalone models that can be run on a desktop computer yet produce responses that match or exceed cloud-based models, and models that can draw on rich contextual information (such as documentation or full interview transcripts) to make decisions.

This presentation provides an update on the state of machine learning in 2024 and presents new findings in relation to the performance of machine learning models on tasks including topic classification and rubric analysis.
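
As an illustration of one such task, here is a zero-shot topic classification sketch using a model small enough to run locally. The model choice and labels are illustrative assumptions, not the presenter's benchmark setup:

```python
# Sketch: topic classification of an open-text response with a locally
# runnable zero-shot model (Hugging Face transformers).
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")  # runs on a desktop

topics = ["service quality", "wait times", "staff conduct", "facilities"]
response = "We waited three hours and nobody told us what was happening."

result = classifier(response, candidate_labels=topics)
print(result["labels"][0], round(result["scores"][0], 2))  # top topic + score
```

Benchmarking this kind of output against human coders, as the presentation does, is what tells you whether the speed gain is worth the error rate.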


Chair
Emily Saurman
Delegate, University of Sydney - School of Rural Health
Speakers
Gerard Atkinson
Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in program and policy evaluation, workshop and community facilitation, machine learning and AI, market and social research, financial and operational modelling, and non-profit, government and business strategy. I am also a board member...
Thursday September 19, 2024 2:30pm - 3:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Where next? Evaluation to transformation
Thursday September 19, 2024 2:30pm - 3:00pm AEST
103
Authors: Rachel Wilks (Grosvenor), Kristy Hornby (Grosvenor), Sarika Bhana (Grosvenor)

What is evaluation? BetterEvaluation defines it as "any systematic process to judge merit, worth or significance by combining evidence and values". Many government organisations, and some private and not-for-profit entities, use evaluations as an auditing tool to measure how well their programs are delivering against intended outcomes and impacts and achieving value for money. This lends itself to viewing evaluation as an audit or 'tick-box' exercise when it is really measuring the delivery of an organisation's mandate or strategy (or part thereof). Viewing evaluation more as an audit than as a core part of continuous improvement risks our reports collecting dust.

During this session, we will discuss factors that build a continuous improvement mindset across evaluation teams, as well as across the broader organisation. This will include exploring how to manage the balance between providing independent advice with practical solutions that program owners and other decision-makers can implement more readily, as well as how to obtain greater buy-in to evaluation practice. We present the features that evaluations should have to ensure findings and conclusions can be easily translated into clear actions for improvement.

We contend that it is important to consider evaluation within the broader organisational context, considering where this might link to strategy or how it may be utilised to provide evidence to support funding bids. This understanding will help to ensure evaluations are designed and delivered in a way that best supports the wider organisation.

We end by sharing our post-evaluation playbook - a practical tool to help take your evaluations from pesky paperweight to purposeful pathway.

Chair
Prescilla Perera
Principal Monitoring and Evaluation Officer, DFFH
Speakers
Rachel Wilks
Senior Consultant, Grosvenor
Rachel is a management consultant and an emerging evaluator at Grosvenor. She took her first steps into the evaluation world two years ago, and since then has been increasingly interested in how evaluation can be used in and across the public sector and not-for-profit space. Rachel...
Thursday September 19, 2024 2:30pm - 3:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Friday, September 20
 

10:30am AEST

Following the (matched) data to understand impact: adventures in quasi-experimental evaluation
Friday September 20, 2024 10:30am - 11:00am AEST
Authors: Mohib Iqbal (Department of Education), Kira Duggan (Department of Education, AU), Ben McNally (Department of Education, AU)

This presentation will showcase quasi-experimental impact evaluation and a relatively new data linkage capability within the Victorian public sector.
Impact evaluation provides important evidence on program effectiveness and helps to inform government investment decisions. Quasi-experimental design identifies a comparison group that is similar to the treatment group (program participants) in terms of baseline or pre-intervention characteristics. Statistical methods such as propensity score matching and regression discontinuity can create valid comparison groups with a reduced risk of bias (White & Sabarwal, 2014).

However, the implementation of this method faces significant technical, data availability, and other challenges.
The Evaluation and Program Impact (EPI) branch at the Victorian Department of Education (DE) used quasi-experimental assessment as part of six different education program evaluations spanning issues from teacher supply to support for vulnerable students. This approach was used to evaluate impact/effectiveness and the economic evaluation of interventions to measure avoided costs. The presentation will outline the process of design, methodology and implementation of quasi-experimental methods used as part of these six evaluations.

Key enablers of the use of quasi-experimental designs are data availability and expertise in undertaking advanced quantitative impact evaluations. This presentation will give an overview of the types of departmental data used (such as regularly administered student, parent/carer, teacher and school leader surveys, assessment results such as NAPLAN and administrative data) as well as the relatively new analytical capability available through linked service use data from the Victorian Social Investment Integrated Data Resource (VSIIDR) and Centre for Victorian Data Linkage (CVDL).
The presentation also contextualises quasi-experimental impact evaluations as one component of mixed-method approaches that were staged after evaluation of appropriateness, design and fidelity. Decisions on intervention effectiveness were made using a broader array of evidence, with quasi-experimental impact evaluation one of many sources.
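
For readers unfamiliar with the technique, here is a textbook propensity-score-matching sketch on simulated data. It is not the Department's implementation, which drew on the linked administrative and survey data described above:

```python
# Sketch: propensity score matching on simulated data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 2_000
X = rng.normal(size=(n, 3))                             # baseline covariates
treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))   # selection on X
y = X @ np.array([0.5, 0.2, 0.1]) + 0.3 * treated + rng.normal(size=n)

# 1. Estimate propensity scores from baseline characteristics.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Match each treated unit to the control with the nearest propensity.
ctrl_idx = np.where(treated == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[ctrl_idx].reshape(-1, 1))
_, j = nn.kneighbors(ps[treated == 1].reshape(-1, 1))
matched_controls = ctrl_idx[j.ravel()]

# 3. Average treatment effect on the treated (true effect here is 0.3).
att = y[treated == 1].mean() - y[matched_controls].mean()
print(f"Estimated ATT: {att:.2f}")
```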
Chair
Allison Clarke
Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre...
Speakers
Kira Duggan
Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development...
Mohib Iqbal
Senior Evaluation Officer, Department of Education
I am a multi-disciplinary evaluator and researcher with 15 years of experience across education, health, international development, social protection, and migration sectors. I currently work for the Department of Education in Victoria and have previously worked with the World Bank...
Ben McNally
Manager, Evaluation and Research, Department of Education, Victoria
I have worked on evaluation and social research projects in consultancy and public sector settings. This has included evaluating reform programs in social services, employment, and school education. Talk to me about: evaluation practice in the Victorian Public Sector; in-house evaluation...
Friday September 20, 2024 10:30am - 11:00am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

To tinker, tailor, or craft from scratch? The tension in using validated tools in novel contexts
Friday September 20, 2024 10:30am - 11:00am AEST
104
Authors: Matt Healey (First Person Consulting), Alicia McCoy (First Person Consulting, AU), Tenille Moselen (First Person Consulting, AU)

In the dynamic realm of evaluation methodology, the discourse surrounding the use of validated tools versus the customisation of evaluation metrics remains a contentious issue. This session aims to delve into the inherent risks associated with both approaches. The issue is often compounded when those in positions of power prefer validated tools over context-specific data collection questions or approaches. This tension is only increasing when evaluating digital interventions for which there is no direct tool to draw upon, leaving evaluators to navigate uncharted territory.

Moreover, there is an ever-increasing range of validated tools available, but little direction for evaluators - particularly emerging and early career evaluators - to assist in deciding between them. This session presents experiences from a range of digital and in-person projects, exploring scenarios where there was no 'obvious solution'. It will be of particular relevance to those undertaking evaluations of digital and novel programs.

Through candid dialogue and shared anecdotes, participants will reflect on their experiences in navigating decisions to adopt, adapt, or reject validated tools, and the learning that resulted. Embracing controversy, this session encourages attendees to challenge conventional wisdom and critically examine the balance between the reliability of validated tools, the importance of fitting data collection to context, and most importantly what 'good' looks like.

Join the conversation as we navigate the complex landscape of evaluation methodology, exploring the tensions between established practices and the pursuit of innovation in evaluation processes.

Chair
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of...
Speakers
Matt Healey
Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have co-founded and grown an organisation (First Person Consulting) to a team of 16 people working...
Tenille Moselen
First Person Consulting
https://www.fpconsulting.com.au/our-team.html
Tenille has qualifications in public health, with experience in mental health and wellbeing, alcohol and drug and international development. Her passion is creating change through design and bringing stakeholders together to address complex...
Alicia McCoy
Principal Consultant, First Person Consulting
Alicia has 15 years of experience leading research and evaluation teams in the not-for-profit sector and is passionate about the role that research and evaluation plays in creating lasting change for individuals, families and communities. Alicia’s areas of interest include evaluation...
Friday September 20, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Participatory Analysis Workshops: a novel method for identifying important factors across diverse projects
Friday September 20, 2024 11:00am - 11:30am AEST
104
Authors: Martina Donkers (Martina Donkers), Ellen Wong (ARTD, AU), Jade Maloney (ARTD, AU)

Some programs comprise a range of diverse projects striving towards a common goal - for example grant programs, where a wide range of grantees carry out projects with very different activities in pursuit of the grant program objectives.

These can be a challenge to evaluate - with so many different activities, outputs cannot be easily aggregated, and each project may be responding to its local context in unique but important ways. These programs need efficient ways to identify common factors affecting implementation and outcomes that reflect the richness of the activities undertaken, but do not place undue burden on organisations, particularly those receiving smaller grants.

We developed a novel method that uses participatory workshops to explore commonalities in implementation across projects and the various ways they seek to achieve common program outcomes. The theory-driven method builds on the Most Significant Change technique and Qualitative Comparative Analysis, and combines data collection with collaborative participatory data analysis to build a rich qualitative understanding of projects in a relatively short timeframe with fewer resources. Active participation from project leaders (e.g. grant recipients) builds cohesion across the program, and helps project leaders feel more connected and supported.

This paper outlines the theory, approach and uses of Participatory Analysis Workshops, including strengths and limitations and the types of data and insights the method can yield. We use our work with the NSW Reconstruction Authority, evaluating the Covid Community Connection and Wellbeing Program with this method, to illustrate what we've learnt about how the method works and in what circumstances, and then identify other potential use cases. Participants will have an opportunity to ask questions to help inform future uses of this method. This information will equip evaluators with tools to navigate varying territory together to understand progress toward program outcomes.

Chair
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of...
Speakers
Martina Donkers
Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured...
Ellen Wong

Consultant, ARTD Consultants
I'm a consultant at ARTD with a background in human geography and environmental studies. I bring this lens to the work I do and am particularly passionate about the intersection between people and the environment. My portfolio spans environmental policy, disaster recovery and community... Read More →
Jade Maloney

Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →
Friday September 20, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Impact evaluation: bringing together quantitative methods and program theory in mixed method evaluations
Friday September 20, 2024 11:00am - 12:00pm AEST
Authors: Harry Greenwell (Australian Centre for Evaluation), Peter Bowers (Australian Centre for Evaluation), Vera Newman (Australian Centre for Evaluation)

This session will provide an overview of some of the main quantitative methods for identifying the causal impacts of programs and policies, while emphasising the importance of mixed methods that also incorporate program theory and qualitative research. It is intended for people unfamiliar with quantitative evaluation methods who would like to develop their understanding of these methods in order to better contribute to theory-based, mixed-method impact evaluations.

The session will cover three of the most common quantitative approaches to separating causality from correlation: i) mixed-method RCTs, ii) regression discontinuity designs, and iii) matching. Each method will be explained with real examples. The session will also cover the benefits and limitations of each method, and considerations for determining when such methods might be suitable on their own or as a complement to other evaluation methods or approaches.
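By way of illustration for readers new to these estimators - this is not drawn from the session materials - the sketch below shows, in Python with NumPy and on simulated data, the difference-in-means estimator that underpins a simple RCT analysis (the first of the three methods above).

  # A minimal sketch, assuming Python with NumPy; all data are simulated.
  # It illustrates the difference-in-means estimator behind a simple RCT
  # analysis; none of this code comes from the session itself.
  import numpy as np

  rng = np.random.default_rng(2024)

  n = 1_000
  treated = rng.integers(0, 2, size=n)              # random assignment
  outcome = 50 + 5 * treated + rng.normal(0, 5, n)  # true effect = 5

  # Under randomisation, the difference in group means is an unbiased
  # estimate of the average treatment effect (ATE).
  t, c = outcome[treated == 1], outcome[treated == 0]
  ate = t.mean() - c.mean()

  # Rough standard error for the difference in means.
  se = (t.var(ddof=1) / t.size + c.var(ddof=1) / c.size) ** 0.5

  print(f"Estimated ATE: {ate:.2f} (SE {se:.2f})")  # close to 5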

Special attention will be given to the ethical considerations inherent in the choice of impact evaluation method, including issues related to consent, fairness, vulnerability, and potential harm.

After attending this session, participants will have a better understanding of how program theory can inform the design of quantitative impact evaluations, including through mixed-method impact evaluations, and how to identify when particular quantitative impact evaluation methods may be suitable for an evaluation.
Chair
Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre... Read More →
Speakers
Peter Bowers

Assistant Director, Australian Centre for Evaluation (ACE)
I am part of the Australian Centre for Evaluation in Commonwealth Treasury that was set up to increase the volume, quality and use of evaluation across the Commonwealth government. I have a particular interest in RCTs. Come and speak to me if you would like to run an RCT in your... Read More →
Vera Newman

Assistant Director
Dr Vera Newman is an Assistant Director in the Impact Evaluation Unit at the Australian Centre for Evaluation. She has many years' experience conducting impact evaluations in the private and public sectors, and is dedicated to applying credible methods to public policy for generating... Read More →
Friday September 20, 2024 11:00am - 12:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Gamified, flexible, and creative tools for evaluating a support program for palliative children and their families
Friday September 20, 2024 11:30am - 12:00pm AEST
104
Authors: Claire Treadgold (Starlight Children's Foundation Australia), Erika Fortunati (Starlight Children's Foundation, AU)

Our program creates personalised experiences of fun, joy, and happiness for families with a palliative child, aiming to foster family connections and celebrate the simple joys of childhood during these challenging circumstances. Evaluating the program is of utmost importance to ensure that it meets the needs of the families involved. Equally, due to the program's sensitivity and deeply personal nature, a low-pressure, flexible evaluation approach is necessary.
In our session, we will showcase our response to this need and share our highly engaging, low-burden tools for gathering participant feedback, which leverage concepts of gamification and accessibility to boost evaluation responses and reduce participant burden. In particular, we will focus on our innovative “activity book”, which evaluates the program through artistic expression. By emphasising creativity and flexibility, our tools aim to enrich the evaluation process and respect the diverse preferences and abilities of the participating families.
The core argument will focus on our innovative evaluation methodology, how it aligns with best practices in the literature, and our key learnings. Key points include the considerations needed for evaluating programs involving palliative children, empowering children and young people through their active involvement in the evaluation process, and how gamification and creativity boost participation and engagement.
Outline of the session:
  • Introduction to the palliative care program and the need for flexible, creative, and respectful evaluation methods
  • What the literature tells us about evaluation methods for programs involving palliative children and their families
  • A presentation of our evaluation protocol
  • Case studies illustrating the feedback collected and its impact
  • Our learnings and their implications for theory and practice
Chair
Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of... Read More →
Speakers
Erika Fortunati

Research and Evaluation Manager, Starlight Children's Foundation Australia
Erika is the Research and Evaluation Manager at Starlight Children's Foundation, an Australian not-for-profit organisation dedicated to brightening the lives of seriously ill children. In her current role, Erika manages research projects and program evaluations to ensure that programs... Read More →
Friday September 20, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

A sprint, not a marathon: Rapid Evaluation as an approach for generating fast evidence and insights
Friday September 20, 2024 12:00pm - 12:30pm AEST
104
Authors: Marnie Carter (Allen + Clarke Consulting)

Increasingly, evaluators are called upon to quickly equip decision makers with evidence from which to take action. A program may be imminently approaching the end of a funding cycle; a critical event may have taken place and leadership needs to understand the causes and learnings; or a new program of work is being designed for which it is important to ensure that finite resources are being directed to the most effective interventions. For such circumstances, Rapid Evaluation can be a useful tool.

Rapid Evaluation is not simply doing an evaluation quickly. It requires a deliberate, interlinked and iterative approach to gathering evidence to generate fast insights. What makes Rapid Evaluation different is that the evaluation design needs to be especially flexible, constantly adapting to the context. Data collection and analysis do not proceed in a linear manner; rather, they iterate back and forth throughout the evaluation. Rapid Evaluation is often conducted in response to specific circumstances that have arisen, and evaluators therefore need to manage a high level of scrutiny.

This presentation will provide an overview of how to conduct a rapid evaluation, illustrated by practical examples including rapid evaluations of a fund to support children who have been exposed to family violence, and a quickly-established employment program delivered during the COVID-19 pandemic. It will discuss the methodological approach to conducting a Rapid Evaluation, share lessons on how to manage the evolving nature of data collection as the evaluation progresses, and discuss how to maintain robustness while evaluating at pace.


Chair
Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of... Read More →
Speakers
Marnie Carter

Evaluation and Research Practice Lead, Allen + Clarke Consulting
Marnie is the Evaluation and Research Practice Lead for Allen + Clarke Consulting. She is experienced in program and policy evaluation, monitoring, strategy development, training and facilitation. Marnie is particularly skilled in qualitative research methods. She is an expert at... Read More →
Friday September 20, 2024 12:00pm - 12:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Finding ways to empower multicultural survivors of violence through evaluation: strategies, learnings and reflections
Friday September 20, 2024 1:30pm - 2:00pm AEST
104
Authors: Lydia Phillips (Lydia Phillips Consulting), Jo Farmer (Jo Farmer Consulting)

As evaluators, we often work with people who have experienced trauma and/or marginalisation (whether we realise it or not!). We're also seeing increased recognition in government and community organisations of the importance of lived experience and cultural safety in program design, implementation and evaluation.

Beginning an evaluation with a clear plan for how you'll engage and empower people from diverse cultural backgrounds and people who have experienced trauma can help to ensure success - of your project and of participants' experience.

So how can you design an evaluation framework to recognise diverse cultural backgrounds and empower survivors of violence?

And how can evaluators who don't have lived experience, or who don't identify with those cultural backgrounds, best navigate the design process?

This session will share strategies, learnings and reflections from a project working with a multicultural family violence service to develop a culturally-safe, trauma-informed evaluation framework for a two-year program.

It will:
  • explore what worked well and what was challenging in the project
  • discuss similarities and differences in the concepts of culturally-safe and trauma-informed practice, drawing on current literature; and
  • pose questions and provide suggestions for evaluators who want to develop their skills in culturally safe and trauma-informed evaluation practice.

The session will offer key tips and strategies that are translatable to other contexts and conclude with reflective questions for attendees.

Chair
Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development... Read More →
Speakers
Lydia Phillips

Principal Consultant, Lydia Phillips Consulting
I operate an independent consulting practice, providing evaluation and social policy services to community organisations and government.With a background in law and social policy, I have more than 15 years' experience building and using evidence in order to create positive social... Read More →
Friday September 20, 2024 1:30pm - 2:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Introducing a trauma-informed AI assessment tool for evaluators of AI (artificial intelligence) assisted programs and services
Friday September 20, 2024 2:00pm - 2:30pm AEST
104
Authors: Lyndal Sleep (Central Queensland University)

AI and other digital technologies are increasingly being used in program and service delivery. They promise increased efficiency, accuracy and objectivity; however, these technologies can also cause significant harm and trauma, as seen in Robodebt. It is vital for evaluators to be aware of the key questions to ask to prevent AI from having unintended impacts on program processes, outputs and outcomes, or causing harm to service users.

Objective
This session aims to support evaluation of AI assisted programs and services by introducing evaluators to a new and innovative trauma informed AI assessment tool.

Core argument
(1) AI is increasingly being used in programs and services, and understanding the resulting risks is essential for evaluators to assess whether services are meeting intended outcomes.
(2) Many evaluators are unaware of what types of risks to look for when assessing AI-assisted services, or what questions to ask - especially when conducting trauma-informed evaluations.
(3) A practical trauma-informed AI assessment tool has been developed by researchers from [Universities omitted], with funding from [omitted], to address this need; it will be briefly introduced.

A short paper session will:
(1) Highlight the problem: AI is increasingly being used to assist program and service delivery, but many evaluators are unaware of the main risks to consider when evaluating these services.
(2) Suggest a solution: a practical tool that addresses these risks with technological knowledge and within a trauma-informed framework, and that can be employed by evaluators.
(3) Introduce the trauma-informed AI assessment tool, the method used to develop it, and its intended practical use by evaluators (both internal and external to organisations).

There will be 10 minutes for questions and discussion at the end of the presentation.

Chair
Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development... Read More →
Speakers
Friday September 20, 2024 2:00pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia
 