Conference hashtag #aes24MEL
Friday, September 20
 

10:30am AEST

Following the (matched) data to understand impact: adventures in quasi-experimental evaluation
Friday September 20, 2024 10:30am - 11:00am AEST
Authors: Mohib Iqbal (Department of Education), Kira Duggan (Department of Education, AU), Ben McNally (Department of Education, AU)

This presentation will showcase the use of quasi-experimental impact evaluation and the use of a relatively new data linkage capability within the Victorian public sector.
Impact evaluation provides important evidence on program effectiveness and helps to inform government investment decisions. Quasi-experimental design identifies a comparison group that is similar to the treatment group/program participants in terms of baseline or pre-intervention characteristics. Statistical methods such as propensity score matching and regression discontinuity can create valid comparison groups with a reduced risk of bias (White & Sabarwal, 2014).
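As a rough illustration of the matching idea named above, the sketch below pairs each treated unit with the control unit whose (pre-computed) propensity score is closest, then averages the outcome differences. All scores and outcomes here are fabricated; in a real evaluation the scores would be estimated from baseline covariates (e.g. via logistic regression), and balance and overlap would be checked before estimating anything.

```python
# Minimal nearest-neighbour propensity score matching sketch.
# Scores are assumed pre-estimated; all data are hypothetical.

def match_and_estimate(treated, control):
    """treated/control: lists of (propensity_score, outcome) pairs.
    Matches each treated unit to the control unit with the closest
    score (with replacement) and returns the average outcome
    difference: a crude estimate of the effect on the treated."""
    diffs = []
    for p_t, y_t in treated:
        # Nearest-score control for this treated unit.
        _, y_c = min(control, key=lambda c: abs(c[0] - p_t))
        diffs.append(y_t - y_c)
    return sum(diffs) / len(diffs)

treated = [(0.80, 12.0), (0.60, 10.0), (0.70, 11.0)]
control = [(0.78, 9.0), (0.58, 8.0), (0.20, 5.0), (0.67, 9.5)]
print(round(match_and_estimate(treated, control), 3))
```

The mechanics are the easy part; in practice the quality of the matches (covariate balance, common support) drives the credibility of the estimate, as the White & Sabarwal (2014) reference cited in the abstract discusses.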

However, the implementation of this method faces significant technical, data availability, and other challenges.
The Evaluation and Program Impact (EPI) branch at the Victorian Department of Education (DE) used quasi-experimental assessment in six different education program evaluations, spanning issues from teacher supply to support for vulnerable students. The approach was used both to evaluate impact/effectiveness and in economic evaluations of interventions to measure avoided costs. The presentation will outline the design, methodology and implementation of the quasi-experimental methods used in these six evaluations.

Key enablers of the use of quasi-experimental designs are data availability and expertise in undertaking advanced quantitative impact evaluations. This presentation will give an overview of the types of departmental data used (such as regularly administered student, parent/carer, teacher and school leader surveys, assessment results such as NAPLAN and administrative data) as well as the relatively new analytical capability available through linked service use data from the Victorian Social Investment Integrated Data Resource (VSIIDR) and Centre for Victorian Data Linkage (CVDL).
The presentation also contextualises quasi-experimental impact evaluation as one component of mixed-method approaches that were staged after evaluation of appropriateness, design and fidelity. Decisions on intervention effectiveness were made using a broader array of evidence, with quasi-experimental impact evaluation as one of many sources.
Chair
Allison Clarke
Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre…
Speakers
Kira Duggan
Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development…
Mohib Iqbal
Senior Evaluation Officer, Department of Education
I am a multi-disciplinary evaluator and researcher with 15 years of experience across the education, health, international development, social protection, and migration sectors. I currently work for the Department of Education in Victoria and have previously worked with the World Bank…
Ben McNally
Manager, Evaluation and Research, Department of Education, Victoria
I have worked on evaluation and social research projects in consultancy and public sector settings. This has included evaluating reform programs in social services, employment, and school education. Talk to me about: evaluation practice in the Victorian Public Sector; in-house evaluation…
Friday September 20, 2024 10:30am - 11:00am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

To tinker, tailor, or craft from scratch? The tension in using validated tools in novel contexts
Friday September 20, 2024 10:30am - 11:00am AEST
104
Authors: Matt Healey (First Person Consulting), Alicia McCoy (First Person Consulting, AU), Tenille Moselen (First Person Consulting, AU)

In the dynamic realm of evaluation methodology, the discourse surrounding the use of validated tools versus the customisation of evaluation metrics remains contentious. This session aims to delve into the inherent risks associated with both approaches. These risks are often compounded when those in positions of power prefer validated tools over fit-for-context data collection questions or approaches. The tension this elicits is only increasing when evaluating digital interventions for which there is no direct tool to draw upon, leaving evaluators to navigate uncharted territory.

Moreover, there is an ever-increasing range of validated tools available, but little direction for evaluators (particularly emerging and early-career evaluators) to assist in deciding between them. This session presents experiences from a range of digital and in-person projects, and explores scenarios where there was no 'obvious solution'. It will be of particular relevance to those undertaking evaluations of digital and novel programs.

Through candid dialogue and shared anecdotes, participants will reflect on their experiences in navigating decisions to adopt, adapt, or reject validated tools, and the learning that resulted. Embracing controversy, this session encourages attendees to challenge conventional wisdom and critically examine the balance between the reliability of validated tools, the importance of fitting data collection to context, and most importantly what 'good' looks like.

Join the conversation as we navigate the complex landscape of evaluation methodology, exploring the tensions between established practices and the pursuit of innovation in evaluation processes.

Chair
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of…
Speakers
Matt Healey
Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have co-founded and grown an organisation (First Person Consulting) to a team of 16 people working…
Tenille Moselen
First Person Consulting
Tenille has qualifications in public health, with experience in mental health and wellbeing, alcohol and other drug services, and international development. Her passion is creating change through design and bringing stakeholders together to address complex…
Alicia McCoy
Principal Consultant, First Person Consulting
Alicia has 15 years of experience leading research and evaluation teams in the not-for-profit sector and is passionate about the role that research and evaluation play in creating lasting change for individuals, families and communities. Alicia's areas of interest include evaluation…
Friday September 20, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Participatory Analysis Workshops: a novel method for identifying important factors across diverse projects
Friday September 20, 2024 11:00am - 11:30am AEST
104
Authors: Martina Donkers (Martina Donkers), Ellen Wong (ARTD, AU), Jade Maloney (ARTD, AU)

Some programs comprise a range of diverse projects striving towards a common goal - for example grant programs, where a wide range of different grantees are carrying out different projects with very different activities in pursuit of the grant program objectives.

These can be a challenge to evaluate - with so many different activities, outputs cannot be easily aggregated, and each project may be responding to its local context in unique but important ways. These programs need efficient ways to identify common factors affecting implementation and outcomes that reflect the richness of the activities undertaken, but do not place undue burden on organisations, particularly those receiving smaller grants.

We developed a novel method that uses participatory workshops to explore commonalities in implementation across projects and the various ways they seek to achieve common program outcomes. This theory-driven method builds on Most Significant Change and Qualitative Comparative Analysis, and combines data collection with collaborative participatory data analysis to build a rich qualitative understanding of projects in a relatively short timeframe with fewer resources. Active participation from project leaders (e.g. grant recipients) builds cohesion across the program, and helps project leaders feel more connected and supported.

This paper outlines the theory, approach and uses of Participatory Analysis Workshops, including their strengths and limitations and the types of data and insights the method can yield. We use our work with the NSW Reconstruction Authority to evaluate the Covid Community Connection and Wellbeing Program to illustrate what we've learnt about how the method works and in what circumstances, and then identify other potential use cases. Participants will have an opportunity to ask questions to help inform future uses of this method. This information will equip evaluators with tools to navigate varied territory together to understand progress toward program outcomes.

Chair
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Speakers
Martina Donkers
Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed-methods evaluation with a complexity and systems lens. I like rubrics, semi-structured…
Ellen Wong
Consultant, ARTD Consultants
I'm a consultant at ARTD with a background in human geography and environmental studies. I bring this lens to the work I do and am particularly passionate about the intersection between people and the environment. My portfolio spans environmental policy, disaster recovery and community…
Jade Maloney
Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research…
Friday September 20, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Impact evaluation: bringing together quantitative methods and program theory in mixed method evaluations
Friday September 20, 2024 11:00am - 12:00pm AEST
Authors: Harry Greenwell (Australian Treasury), To be determined (Australian Treasury, AU)

This session will provide an overview of some of the main quantitative methods for identifying the causal impacts of programs and policies, while emphasising the importance of mixed methods that also incorporate program theory and qualitative research. It is intended for people unfamiliar with quantitative evaluation methods who would like to develop their understanding in order to better contribute to theory-based, mixed-method impact evaluations.

The session will cover three of the most common quantitative approaches to separating causality from correlation: i) mixed-method RCTs, ii) discontinuity designs, and iii) matching. Each method will be explained with real examples. The session will also cover the benefits and limitations of each method, and considerations for determining when such methods might be suitable either on their own or as a complement to other evaluation methods or approaches.
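As a hedged sketch of the second approach listed above, a sharp regression discontinuity fits a trend on each side of a cutoff in a running variable (e.g. a test score that determines program eligibility) and reads the treatment effect off the jump at the cutoff. The data below are fabricated purely for illustration, with a built-in jump of about 2.

```python
# Sharp regression discontinuity sketch on fabricated data.

def fit_line(points):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    b = (sum((x - mx) * (y - my) for x, y in points)
         / sum((x - mx) ** 2 for x, _ in points))
    return my - b * mx, b

def rdd_effect(data, cutoff):
    """Estimated jump in the outcome at the cutoff: the gap
    between the right- and left-side fitted lines there."""
    left = [(x, y) for x, y in data if x < cutoff]
    right = [(x, y) for x, y in data if x >= cutoff]
    a_l, b_l = fit_line(left)
    a_r, b_r = fit_line(right)
    return (a_r + b_r * cutoff) - (a_l + b_l * cutoff)

# Units with a running variable >= 50 receive the program;
# outcomes jump by ~2 at that threshold in this toy data.
data = [(x, 0.1 * x + (2.0 if x >= 50 else 0.0)) for x in range(40, 61)]
print(round(rdd_effect(data, 50), 3))
```

Real applications restrict the fit to a bandwidth around the cutoff and test for manipulation of the running variable; this sketch only shows the core comparison the method rests on.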

Special attention will be given to the ethical considerations inherent in the choice of impact evaluation method, including issues related to consent, fairness, vulnerability, and potential harm.

After attending this session, participants will have a better understanding of: how program theory can inform the design of quantitative impact evaluations, including through mixed-method impact evaluations; and how to identify when certain quantitative impact evaluation methods may be suitable for an evaluation.
Chair
Allison Clarke
Evaluator
Speakers
Harry Greenwell
Senior Adviser, Australian Treasury
Harry Greenwell is Director of the Impact Evaluation Unit at the Australian Centre for Evaluation (ACE) in the Australian Treasury. He previously worked for five years at the Behavioural Economics Team of the Australian Government (BETA). Before that, he worked for many years in the…
Vera Newman
Assistant Director
Dr Vera Newman is an Assistant Director in the Impact Evaluation Unit at the Australian Centre for Evaluation. She has many years' experience conducting impact evaluations in the private and public sectors, and is dedicated to applying credible methods to public policy for generating…
Friday September 20, 2024 11:00am - 12:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Gamified, flexible, and creative tools for evaluating a support program for palliative children and their families
Friday September 20, 2024 11:30am - 12:00pm AEST
104
Authors: Claire Treadgold (Starlight Children's Foundation Australia), Erika Fortunati (Starlight Children's Foundation, AU)

Our program creates personalised experiences of fun, joy, and happiness for families with a palliative child, aiming to foster family connections and celebrate the simple joys of childhood during this challenging circumstance. Evaluating the program is of utmost importance to ensure that it meets the needs of the families involved. Equally, due to the program's sensitivity and deeply personal nature, a low-pressure, flexible evaluation approach is necessary.
In our session, we will showcase our response to this need and share our highly engaging, low-burden tools for gathering participant feedback, which leverage concepts of gamification and accessibility to boost evaluation responses and reduce participant burden. In particular, we will focus on our innovative “activity book”, which evaluates the program through artistic expression. By emphasising creativity and flexibility, our tools aim to enrich the evaluation process and respect the diverse preferences and abilities of the participating families.
The core argument will focus on our innovative evaluation methodology, how it aligns with best practices in the literature, and our key learnings. Key points include the considerations needed for evaluating programs involving palliative children, empowering children and young people through their active involvement in the evaluation process, and how gamification and creativity boost participation and engagement.
Outline of the session:
  • Introduction to the palliative care program and the need for flexible, creative, and respectful evaluation methods
  • What the literature tells us about evaluation methods for programs involving palliative children and their families
  • A presentation of our evaluation protocol
  • Case studies illustrating the feedback collected and its impact
  • Our learnings and their implications for theory and practice
Chair
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Speakers
Erika Fortunati
Research and Evaluation Manager, Starlight Children's Foundation Australia
Erika is the Research and Evaluation Manager at Starlight Children's Foundation, an Australian not-for-profit organisation dedicated to brightening the lives of seriously ill children. In her current role, Erika manages research projects and program evaluations to ensure that programs…
Friday September 20, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

A sprint, not a marathon: Rapid Evaluation as an approach for generating fast evidence and insights
Friday September 20, 2024 12:00pm - 12:30pm AEST
104
Authors: Marnie Carter (Allen + Clarke Consulting)

Increasingly, evaluators are called upon to quickly equip decision makers with evidence from which to take action. A program may be imminently approaching the end of a funding cycle; a critical event may have taken place and leadership needs to understand the causes and learnings; or a new program of work is being designed for which it is important to ensure that finite resources are being directed to the most effective interventions. For such circumstances, Rapid Evaluation can be a useful tool.

Rapid Evaluation is not simply doing an evaluation quickly. It requires a deliberate, interlinked and iterative approach to gathering evidence to generate fast insights. What makes Rapid Evaluation different is that the evaluation design needs to be especially flexible, constantly adapting to the context. Data collection and analysis tend not to proceed in a linear fashion, but rather iterate back and forth during the evaluation. Rapid Evaluation is often conducted in response to specific circumstances that have arisen, and evaluators therefore need to manage a high level of scrutiny.

This presentation will provide an overview of how to conduct a rapid evaluation, illustrated by practical examples including rapid evaluations of a fund to support children who have been exposed to family violence, and a quickly-established employment program delivered during the COVID-19 pandemic. It will discuss the methodological approach to conducting a Rapid Evaluation, share lessons on how to manage the evolving nature of data collection as the evaluation progresses, and discuss how to maintain robustness while evaluating at pace.


Chair
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Speakers
Marnie Carter
Evaluation and Research Practice Lead, Allen + Clarke Consulting
Marnie is the Evaluation and Research Practice Lead for Allen + Clarke Consulting. She is experienced in program and policy evaluation, monitoring, strategy development, training and facilitation. Marnie is particularly skilled in qualitative research methods. She is an expert at…
Friday September 20, 2024 12:00pm - 12:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Finding ways to empower multicultural survivors of violence through evaluation: strategies, learnings and reflections
Friday September 20, 2024 1:30pm - 2:00pm AEST
104
Authors: Lydia Phillips (Lydia Phillips Consulting), Jo Farmer (Jo Farmer Consulting)

As evaluators, we often work with people who have experienced trauma and/or marginalisation (whether we realise or not!). We're also seeing increased recognition in government and community organisations of the importance of lived experience and cultural safety in program design, implementation and evaluation.

Beginning an evaluation with a clear plan for how you'll engage and empower people from diverse cultural backgrounds and people who have experienced trauma can help to ensure success - of your project and of participants' experience.

So how can you design an evaluation framework to recognise diverse cultural backgrounds and empower survivors of violence?

And how can evaluators who don't have lived experience or identify with those cultural backgrounds best navigate the design process?

This session will share strategies, learnings and reflections from a project working with a multicultural family violence service to develop a culturally-safe, trauma-informed evaluation framework for a two-year program.

It will:
  • explore what worked well and what was challenging in the project
  • discuss similarities and differences in the concepts of culturally-safe and trauma-informed practice, drawing on current literature; and
  • pose questions and provide suggestions for evaluators who want to develop their skills in culturally safe and trauma-informed evaluation practice.

The session will offer key tips and strategies that are translatable to other contexts and conclude with reflective questions for attendees.

Chair
Kira Duggan
Research Director, Systems and Services, Australian Institute of Family Studies
Speakers
Lydia Phillips
Principal Consultant, Lydia Phillips Consulting
I operate an independent consulting practice, providing evaluation and social policy services to community organisations and government. With a background in law and social policy, I have more than 15 years' experience building and using evidence in order to create positive social…
Friday September 20, 2024 1:30pm - 2:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Introducing a trauma-informed AI assessment tool for evaluators of AI (artificial intelligence) assisted programs and services
Friday September 20, 2024 2:00pm - 2:30pm AEST
104
Authors: Lyndal Sleep (Central Queensland University)

AI and other forms of digital technology are increasingly used in program and service delivery. They promise increased efficiency, accuracy and objectivity; however, these technologies can also cause significant harm and trauma, as seen in Robodebt. It is vital for evaluators to be aware of the key questions to ask to prevent AI having unintended impacts on program processes, outputs and outcomes, or causing harm to service users.

Objective
This session aims to support evaluation of AI assisted programs and services by introducing evaluators to a new and innovative trauma informed AI assessment tool.

Core argument
(1) AI is increasingly being used in programs and services, and understanding the resulting risks is essential for evaluators to assess whether services are meeting intended outcomes.
(2) Many evaluators are unaware of what types of risks to look for when assessing AI-assisted services, or what questions to ask, especially when conducting trauma-informed evaluations.
(3) A practical trauma-informed AI assessment tool has been developed by researchers from [Universities omitted], with funding from [omitted], to address this need, and will be briefly introduced.

A short paper session will:
(1) Highlight the problem that AI is increasingly being used to assist program and service delivery, but many evaluators are unaware of the main risks to consider when evaluating these services.
(2) Suggest the solution of a practical tool that considers these risks, with technological knowledge and within a trauma-informed framework, and that can be employed by evaluators.
(3) Introduce the trauma-informed AI assessment tool, the method used to develop it, and its intended practical use by evaluators (both internal and external to organisations).

There will be 10 minutes for questions and discussion at the end of the presentation.

Chair
Kira Duggan
Research Director, Systems and Services, Australian Institute of Family Studies
Speakers
Friday September 20, 2024 2:00pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia
 