Conference hashtag #aes24MEL
Intermediate
Friday, September 20
 

10:30am AEST

Following the (matched) data to understand impact: adventures in quasi-experimental evaluation
Friday September 20, 2024 10:30am - 11:00am AEST
Authors: Mohib Iqbal (Department of Education), Kira Duggan (Department of Education, AU), Ben McNally (Department of Education, AU)

This presentation will showcase quasi-experimental impact evaluation and a relatively new data linkage capability within the Victorian public sector.
Impact evaluation provides important evidence on program effectiveness and helps to inform government investment decisions. Quasi-experimental designs identify a comparison group that is similar to the treatment group (program participants) in terms of baseline or pre-intervention characteristics. Statistical methods such as propensity score matching and regression discontinuity can create valid comparison groups with a reduced risk of bias (White & Sabarwal, 2014).
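For readers unfamiliar with the technique named above, the sketch below shows the general shape of propensity score matching: fit a model of treatment assignment on baseline covariates, match each treated unit to its nearest-scoring control, and compare outcomes across the matched groups. This is a minimal illustration only, assuming a hypothetical dataset with a binary "treated" flag and an "outcome" column; it is not the Department's implementation and omits diagnostics such as balance checks and common-support trimming.

```python
# Minimal propensity score matching sketch (illustrative only, not the
# presenters' actual pipeline). Assumes a pandas DataFrame with a binary
# 'treated' flag, baseline covariate columns, and an 'outcome' column.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_and_estimate(df: pd.DataFrame, covariates: list[str]) -> float:
    """Return a naive matched estimate of the average effect on the treated."""
    # 1. Estimate propensity scores: P(treated | baseline covariates).
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]

    # 2. One-to-one nearest-neighbour matching on the propensity score.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched_control = control.iloc[idx.ravel()]

    # 3. Compare mean outcomes between treated units and their matched controls.
    return treated["outcome"].mean() - matched_control["outcome"].mean()
```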

However, implementing these methods involves significant technical, data-availability and other challenges.
The Evaluation and Program Impact (EPI) branch at the Victorian Department of Education (DE) used quasi-experimental assessment as part of six different education program evaluations, spanning issues from teacher supply to support for vulnerable students. The approach was used to evaluate impact and effectiveness, and in economic evaluations of interventions to measure avoided costs. The presentation will outline the design, methodology and implementation of the quasi-experimental methods used across these six evaluations.

Key enablers of the use of quasi-experimental designs are data availability and expertise in undertaking advanced quantitative impact evaluations. This presentation will give an overview of the types of departmental data used (such as regularly administered student, parent/carer, teacher and school leader surveys, assessment results such as NAPLAN and administrative data) as well as the relatively new analytical capability available through linked service use data from the Victorian Social Investment Integrated Data Resource (VSIIDR) and Centre for Victorian Data Linkage (CVDL).
The presentation also contextualises quasi-experimental impact evaluation as one component of mixed-methods approaches that were staged after evaluations of appropriateness, design and fidelity. Decisions on intervention effectiveness were made using a broader array of evidence, with quasi-experimental impact evaluation as one of many sources.
Chair
avatar for Allison Clarke

Allison Clarke

Evaluator
- Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre... Read More →
Speakers
avatar for Kira Duggan

Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development... Read More →
avatar for Mohib Iqbal

Mohib Iqbal

Senior Evaluation Officer, Department of Education
I am a multi-disciplinary evaluator and researcher with 15 years of experience across education, health, international development, social protection, and migration sectors. I currently work for the Department of Education in Victoria and have previously worked with the World Bank... Read More →
avatar for Ben McNally

Ben McNally

Manager, Evaluation and Research, Department of Education, Victoria
I have worked on evaluation and social research projects in consultancy and public sector settings. This has included evaluating reform programs in social services, employment, and school education. Talk to me about: - Evaluation practice in the Victorian Public Sector - In-house evaluation... Read More →
Friday September 20, 2024 10:30am - 11:00am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Participatory Analysis Workshops: a novel method for identifying important factors across diverse projects
Friday September 20, 2024 11:00am - 11:30am AEST
104
Authors: Martina Donkers (Martina Donkers), Ellen Wong (ARTD, AU), Jade Maloney (ARTD, AU)

Some programs comprise a range of diverse projects striving towards a common goal - for example, grant programs in which grantees carry out very different activities in pursuit of the program's objectives.

These can be a challenge to evaluate - with so many different activities, outputs cannot be easily aggregated, and each project may be responding to its local context in unique but important ways. These programs need efficient ways to identify common factors affecting implementation and outcomes that reflect the richness of the activities undertaken, but do not place undue burden on organisations, particularly those receiving smaller grants.

We developed a novel method that uses participatory workshops to explore commonalities in implementation across projects and the various ways they seek to achieve common program outcomes. This theory-driven method builds on Most Significant Change and Qualitative Comparative Analysis, and combines data collection with collaborative participatory data analysis to build a rich qualitative understanding of projects in a relatively short timeframe with fewer resources. Active participation from project leaders (e.g. grant recipients) builds cohesion across the program, and helps project leaders feel more connected and supported.

This paper outlines the theory, approach and uses of Participatory Analysis Workshops, including their strengths and limitations and the types of data and insights the method can yield. We use our work with the NSW Reconstruction Authority to evaluate the Covid Community Connection and Wellbeing Program to illustrate what we've learnt about how the method works and in what circumstances, and to identify other potential use cases. Participants will have an opportunity to ask questions to help inform future uses of this method. This information will equip evaluators with tools to navigate varying territory together to understand progress toward program outcomes.

Chair
avatar for Phillip Belling

Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of... Read More →
Speakers
avatar for Martina Donkers

Martina Donkers

Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured... Read More →
avatar for Ellen Wong

Ellen Wong

Consultant, ARTD Consultants
I'm a consultant at ARTD with a background in human geography and environmental studies. I bring this lens to the work I do and am particularly passionate about the intersection between people and the environment. My portfolio spans environmental policy, disaster recovery and community... Read More →
avatar for Jade Maloney

Jade Maloney

Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →
Friday September 20, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

If the destination is improvement, recommendations are the signpost
Friday September 20, 2024 12:00pm - 12:30pm AEST
106
Authors: Laura Baker (ACIL Allen), Larissa Brisbane (Department of Climate Change, Energy, the Environment and Water NSW, AU)

Recommendations are the sharp end of evaluation, connecting evidence and insights to the improvement we aim to achieve. Many evaluation theories focus on framing and conducting evaluations, rather than developing recommendations or the associated organisational change required to complete the journey.

Recommendations point the way beyond an evaluation report, as the journey doesn't end when the report is produced. This presentation tells the story of recommendation wayfinding. We will share an evaluation practitioner and a commissioner's journey on navigating the challenge of developing actionable recommendations to promote impact beyond program close and into future decisions.

Evaluators need ways to integrate diverse evidence sources and generate actionable insights. The consultant will share perspectives on where these insights and the associated recommendations "come from": how different data come together to inform insights, the process for developing recommendations (balancing independence and engagement from commissioners), and how to design recommendations for the program and beyond.

Commissioners need recommendations that make sense in their context. The commissioners will share what makes a recommendation useful, and how they used this evaluation journey to leverage learning, skill-building and improvement opportunities. They will also discuss the evaluation audience and how ambitious recommendations can be.

This work over a number of years has helped build the evaluation knowledge base within our organisations. We will close with our recommendations to you - the top ideas that we plan to take with us on our next evaluation journey.
Chair
avatar for Rachel George

Rachel George

Director, Research, Monitoring and Evaluation Practice, Tetra Tech International Development
Speakers
avatar for Larissa Brisbane

Larissa Brisbane

Team Leader, Strategic Evaluation, Dept of Climate Change, Energy, the Environment and Water NSW
It was a short step from studying environmental science, and working on cross-disciplinary problem-solving, to evaluation where I still ask 'why' and 'how do you know that'. I love hearing stories of what you've done and learned, especially in energy, climate change, environment and... Read More →
avatar for Laura Baker

Laura Baker

Principal, ACIL Allen
Friday September 20, 2024 12:00pm - 12:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Place-based evaluation: collaborating to navigate learning in complex and dynamic contexts
Friday September 20, 2024 12:00pm - 12:30pm AEST
105
Authors: Sandra Opoku (Relationships Australia Victoria), Kate Matthies-Brown (Relationships Australia Victoria, AU)

Yarra Communities That Care (CTC) is a network of 24 local partner agencies who share a commitment to support the healthy development of young people in the City of Yarra. One of the key initiatives of Yarra CTC is the collaborative delivery of evidence-based social and emotional messaging to families by a centrally coordinated Facilitator Network involving multiple partner agencies. Building on positive feedback and program achievements from 2017-2022, we led an evaluation of the collaborative and place-based approach of the Yarra CTC Facilitator Network to better understand its contribution to systemic change and apply learnings to future place-based approaches for our respective organisations. The evaluation project team adopted the 'Place-Based Evaluation Framework' and was informed by a comprehensive theory of change. This provided an anchor in an otherwise complex and dynamic environment and unfamiliar territory.
There is an increased focus on collaborative place-based approaches at the federal, state and local levels as a promising approach to addressing complex social problems. Previous evaluations and literature identify successful collaboration and a strong support entity or backbone as key enabling factors that make place-based approaches successful. The collaborative place-based approach to strengthening family relationships in Yarra provides a local example of this.

Consistent with systems change frameworks, this evaluation provided evidence of structural changes. These changes, manifested in improved practices and dedicated resources and supports, ultimately led to effective collaborative and transformative change for the community.

This presentation will share the journey, key insights, and learnings of the evaluation project team over a two-year period to collaboratively gather evidence to inform ongoing program development and contribute to future place-based approaches. The Yarra CTC Facilitator Network serves as a valuable template for implementing best practices for place-based coalitions due to its focus on collaboration and fostering a sense of community.

Chair
Speakers
avatar for Sandra Opoku

Sandra Opoku

Senior Manager Evaluation and Social Impact, Relationships Australia Victoria
My role leads impact, evidence and innovation activities at Relationships Australia Victoria. These activities contribute to achieving strategic objectives and improving outcomes for individuals, families and communities. This now also includes oversight of several key prevention... Read More →
avatar for Kate Matthies-Brown

Kate Matthies-Brown

Since 2022, Kate has supported RAV’s evaluation and social impact activities, including program evaluation, practice development, and evidence reviews. She is a qualified social worker with experience in family services, youth mental health and academia. Kate has experience with... Read More →
Friday September 20, 2024 12:00pm - 12:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

A sprint, not a marathon: Rapid Evaluation as an approach for generating fast evidence and insights
Friday September 20, 2024 12:00pm - 12:30pm AEST
104
Authors: Marnie Carter (Allen + Clarke Consulting)

Increasingly, evaluators are called upon to quickly equip decision makers with evidence from which to take action. A program may be imminently approaching the end of a funding cycle; a critical event may have taken place and leadership needs to understand the causes and learnings; or a new program of work is being designed for which it is important to ensure that finite resources are being directed to the most effective interventions. For such circumstances, Rapid Evaluation can be a useful tool.

Rapid Evaluation is not simply doing an evaluation quickly. It requires a deliberate, interlinked and iterative approach to gathering evidence to generate fast insights. What makes Rapid Evaluation different is that the evaluation design needs to be especially flexible, constantly adapting to the context. Data collection and analysis tend not to proceed in a linear fashion, but rather iterate back and forth during the evaluation. Rapid Evaluation is often conducted in response to specific circumstances that have arisen, and evaluators therefore need to manage a high level of scrutiny.

This presentation will provide an overview of how to conduct a rapid evaluation, illustrated by practical examples including rapid evaluations of a fund to support children who have been exposed to family violence, and a quickly-established employment program delivered during the COVID-19 pandemic. It will discuss the methodological approach to conducting a Rapid Evaluation, share lessons on how to manage the evolving nature of data collection as the evaluation progresses, and discuss how to maintain robustness while evaluating at pace.


Chair
avatar for Phillip Belling

Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of... Read More →
Speakers
avatar for Marnie Carter

Marnie Carter

Evaluation and Research Practice Lead, Allen + Clarke Consulting
Marnie is the Evaluation and Research Practice Lead for Allen + Clarke Consulting. She is experienced in program and policy evaluation, monitoring, strategy development, training and facilitation. Marnie is particularly skilled in qualitative research methods. She is an expert at... Read More →
Friday September 20, 2024 12:00pm - 12:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Learning from the past: Reflections and opportunities for embedding measurement and evaluation in the national agenda to end Violence against Women and Children
Friday September 20, 2024 1:30pm - 2:30pm AEST
106
Authors: Lucy Macmillan (ANROWS), Micaela Cronin (Domestic, Family and Sexual Violence Commission, AU), Tessa Boyd-Caine (ANROWS, AU), Tiffiny Lewin (Lived Experience Advisory Council Member) (National Lived Experience Advisory Council, AU)

As evaluators, we are often asked to examine complex, systems-change initiatives. Domestic, family and sexual violence is a national crisis. In late 2022, the Commonwealth Government, alongside all state and territory governments, released the second National Plan to End Violence against Women and Children 2022-2032. The plan provides an overarching national policy framework to guide actions across all parts of society, including governments, businesses, media, educational institutions and communities, to achieve a shared vision of ending gender-based violence in one generation.

After 12 years of implementation under the first National Plan, assessing whether our efforts had made a meaningful difference towards ending violence against women was a difficult task. We ask: As we embark on setting up measurement and evaluation systems against the second National Plan, how do we avoid making the same mistakes again?

The Domestic, Family and Sexual Violence Commission was established in 2022 to focus on practical and meaningful ways to measure progress towards the objectives outlined in the National Plan. This session will discuss:
  1. the current plans, opportunities and challenges in monitoring progress, and evaluating the impact of this national framework, and
  2. the role of lived-experience in evaluation and how large publicly-funded institutions can balance their monitoring and sensemaking roles at the national-level with accountability to victim-survivors.

The panel will explore common challenges faced when seeking to monitor and evaluate complex national policy initiatives, including data capture, consistency and capacity, and explore some of the opportunities ahead.

The audience will have the opportunity to contribute their insights and expertise on how we, as evaluators, approach the evaluation of complex systems-change at a national scale, and over extended durations, while also prioritising the voices of those most affected. How do we collectively contribute to understanding if these national policy agendas will make a difference?


Chair
avatar for Milena Gongora

Milena Gongora

Associate Director - Water Quality, Great Barrier Reef Foundation
Milena’s area of interest is nature conservation. With over 14 years of experience, her work ranges from managing the Mekong River to enhancing the resilience of the Great Barrier Reef. Over most of this time, her roles have involved evaluating the success of conservation initiatives... Read More →
Speakers
avatar for Lucy Macmillan

Lucy Macmillan

Dir Evaluation & Impact, ANROWS
Lucy has more than 20 years of monitoring and evaluation experience in both the Australian and international contexts. She is trained in trauma-informed and culturally safe approaches, and committed to ensuring that the voices of people with lived experience are respected and heard... Read More →
avatar for Tessa Boyd-Caine

Tessa Boyd-Caine

CEO, ANROWS
Tessa was born and grew up on unceded Gadigal land (Sydney), where she lives again after living overseas including in England, China and India. Prior to joining ANROWS in 2024, Tessa was the founding CEO of Health Justice Australia, the national centre for health justice partners... Read More →
avatar for Micaela Cronin

Micaela Cronin

Domestic Family and Sexual Violence Commissioner, Domestic Family and Sexual Violence Commission
Micaela Cronin began her career as a social worker in family violence and sexual assault services. Since then, she has held leadership roles across the social service sector in Australia and internationally, including as President of the Australian Council of Social Services.    Micaela... Read More →
TL

Tiffiny Lewin

Tiffiny is a lived-experience advocate and survivor of childhood sexual abuse, family violence and sexual assault. Her 30-year career spanning industry sectors across Australia and Japan informs her deep understanding of leading transformational change in diverse cultural, regulatory... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Navigating ethics dilemmas when evaluating for government: The good, the bad and the ugly
Friday September 20, 2024 1:30pm - 2:30pm AEST
Authors: Kristy Hornby (Grosvenor), Eleanor Williams (Australian Centre for Evaluation), Mandy Charman (Outcomes, Practice and Evidence Network)

Navigating ethics is an essential part of any evaluation journey. As evaluators we often encounter complex situations that require thoughtful consideration of ethical principles and practice, far beyond the formal ethics process itself.

This session will explore real-world scenarios and provide attendees with actionable strategies to enhance ethical decision-making in their evaluation practice. The panel will speak to questions of managing commissioners' expectations, how to speak frankly to program areas where under-performance is found, issues of confidentiality, ensuring culturally sensitive practice, and ensuring power imbalances are acknowledged and addressed.

The panel presentation will take attendees through the journey of ethical practice and will consider:
- The overall significance of ethical thinking in evaluation
- Common ethical challenges faced by evaluators
- Practical tools and frameworks that empower evaluators to uphold their ethical standards and deliver meaningful results that can withstand scrutiny
- From an internal evaluator perspective, the balancing act of managing these tensions successfully
- Case studies that can illustrate the application of practical ethics in evaluation
- Takeaways and recommendations.

Eleanor Williams, Managing Director of the Australian Centre for Evaluation; Mandy Charman, Project Manager for the Outcome, Performance and Evidence Network in the Centre for Excellence in Child and Family Welfare; and Kristy Hornby, Victorian Program Evaluation Practice Lead at Grosvenor, will be the panellists. Our expert panellists will share de-identified war stories from their current and previous roles to set out exactly what kinds of challenges evaluators can face in the conduct of their work, so attendees can learn from the panellists' hands-on experience of what to do about them. Attendees will be encouraged to participate in a dynamic dialogue with the panellists and with each other, sharing their own experiences and strategies for addressing ethical concerns and building on the content shared through the session.
Chair
avatar for Sally Clifford

Sally Clifford

General Manager, Matrix on Board
Having graduated from QUT in Brisbane with a Bachelor of Arts in Drama (Hons) (1992) and then a Master of Arts (CCD in Healthcare settings) (1997), I worked for 6 years as a freelance community cultural development artist across Brisbane and SE Qld. In 1998 I was invited to develop... Read More →
Speakers
MC

Mandy Charman

Project Manager, Outcome Practice and Evidence Network, Centre for Excellence in Child and Family Welfare
Dr Mandy Charman is the Project Manager for the Outcome, Performance and Evidence Network (OPEN) in the Centre for Excellence in Child and Family Welfare. OPEN, which represents a sector–government–research collaboration, has been developed to strengthen the evidence base of the... Read More →
avatar for Kristy Hornby

Kristy Hornby

Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and... Read More →
avatar for Eleanor Williams

Eleanor Williams

Managing Director, Australian Centre for Evaluation
Eleanor Williams is a public policy, research and evaluation professional with 20 years' experience working with the public sector. She is the Managing Director of the Australian Centre for Evaluation and established the Australian Public Sector Evaluation Network in 2019. Eleanor... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia
  Footprints

2:00pm AEST

A practical approach to designing and implementing outcome measures in psychosocial support services.
Friday September 20, 2024 2:00pm - 2:30pm AEST
Authors: Lauren Gibson (Mind Australia), Dr. Edith Botchway (Mind Australia, AU), Dr. Laura Hayes (Mind Australia, AU)

Outcome measurement in mental health services is recommended as best practice and provides an opportunity for clients and staff to track progress and navigate the complex road to recovery together. However, there are many barriers to embedding outcome measures in mental health services, including time constraints, low perceived value among staff and clients, and a lack of regular feedback on outcomes. To overcome these challenges, a national not-for-profit provider of residential and non-residential psychosocial support services created an innovative approach for designing and implementing outcome measures. The objective of our presentation is to describe this approach, which has resulted in average outcome measure completion rates of over 80% across 73 services in Australia.

Design
We believe the key to achieving these completion rates is through understanding the needs of outcome measures end-users, including clients, carers, service providers, centralised support teams, and funding bodies. In this presentation we will share how we:
  • "Begin with the end in mind" through working with stakeholders to create user personas and program logics to identify meaningful outcomes and survey instruments.
  • Design easy-to-use digital tools to record quality data and provide stakeholders with dashboards to review their outcomes in real time, visualising data at the individual client and service level.

Implementation
Also key to embedding outcome measures is having a structured, multi-stage approach for implementation, with tailored support provided to:
  • Prepare services (e.g., Training)
  • Install and embed outcome measures in routine practice (e.g., Service champions)
  • Maintain fidelity over time (e.g., Performance monitoring)

The presentation will highlight the salient barriers and enablers identified during each design and implementation stage.

Overall, the presentation will provide a practical example of how to design and implement outcome measures in mental health services to ensure they are adding value for relevant stakeholders and enabling efficient and meaningful evaluation.

Chair
avatar for James Copestake

James Copestake

Professor, International Development, University of Bath, UK
James Copestake is Professor of International Development at the University of Bath in the UK, where he is also Director of Studies for the Doctorate in Policy Research and Practice at the Institute of Policy Research. His publications range broadly across international development... Read More →
Speakers
avatar for Lauren Gibson

Lauren Gibson

Researcher, Mind Australia
Dr. Lauren Gibson’s research focuses on understanding the prevalence and impact of initiatives aimed at improving physical and mental health outcomes among mental health service users. She has been a researcher within the Research and Evaluation team at Mind Australia for over two... Read More →
Friday September 20, 2024 2:00pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia
 