Conference hashtag #aes24MEL
Filter: Public administration and safety
Friday, September 20
 

11:00am AEST

Participatory Analysis Workshops: a novel method for identifying important factors across diverse projects
Friday September 20, 2024 11:00am - 11:30am AEST
104
Authors: Martina Donkers (Independent Evaluator), Ellen Wong (ARTD, AU), Jade Maloney (ARTD, AU)

Some programs comprise a range of diverse projects striving towards a common goal - for example, grant programs in which a wide range of grantees carry out very different activities in pursuit of the grant program's objectives.

These can be a challenge to evaluate - with so many different activities, outputs cannot be easily aggregated, and each project may be responding to its local context in unique but important ways. These programs need efficient ways to identify common factors affecting implementation and outcomes that reflect the richness of the activities undertaken, but do not place undue burden on organisations, particularly those receiving smaller grants.

We developed a novel method that uses participatory workshops to explore commonalities in implementation across projects and the various ways they seek to achieve common program outcomes. The theory-driven method builds on Most Significant Change and Qualitative Comparative Analysis, and combines data collection with collaborative, participatory data analysis to build a rich qualitative understanding of projects in a relatively short timeframe and with fewer resources. Active participation from project leaders (e.g. grant recipients) builds cohesion across the program and helps project leaders feel more connected and supported.

This paper outlines the theory, approach and uses of Participatory Analysis Workshops, including their strengths and limitations and the types of data and insights the method can yield. We use our work with the NSW Reconstruction Authority to evaluate the Covid Community Connection and Wellbeing Program to illustrate what we've learnt about how the method works and in what circumstances, and then identify other potential use cases. Participants will have an opportunity to ask questions to help inform future uses of this method. This information will equip evaluators with tools to navigate varying territory together to understand progress toward program outcomes.

Chair
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of...
Speakers
Martina Donkers
Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured...
Ellen Wong
Consultant, ARTD Consultants
I'm a consultant at ARTD with a background in human geography and environmental studies. I bring this lens to the work I do and am particularly passionate about the intersection between people and the environment. My portfolio spans environmental policy, disaster recovery and community...
Jade Maloney
Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research...
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Impact evaluation: bringing together quantitative methods and program theory in mixed method evaluations
Friday September 20, 2024 11:00am - 12:00pm AEST
Authors: Harry Greenwell (Australian Treasury), To be determined (Australian Treasury, AU)

This session will provide an overview of some of the main quantitative methods for identifying the causal impacts of programs and policies, while emphasising the importance of mixed methods that also incorporate program theory and qualitative research. It is intended for people unfamiliar with quantitative evaluation methods who would like to develop their understanding of these methods in order to better contribute to theory-based, mixed-method impact evaluations.

The session will cover three of the most common quantitative approaches to separating causality from correlation: i) mixed-method RCTs, ii) discontinuity design, and iii) matching. Each method will be explained with real examples. The session will also cover the benefits and limitations of each method, and considerations for determining when such methods might be suitable, either on their own or as a complement to other evaluation methods or approaches.

Special attention will be given to the ethical considerations inherent in the choice of impact evaluation method, including issues related to consent, fairness, vulnerability, and potential harm.

After attending this session, participants will have a better understanding of: how program theory can inform the design of quantitative impact evaluations, including through mixed-method impact evaluations; and how to identify when certain quantitative impact evaluation methods may be suitable for an evaluation.
Chair
Allison Clarke
Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre...
Speakers
Harry Greenwell
Senior Adviser, Australian Treasury
Harry Greenwell is Director of the Impact Evaluation Unit at the Australian Centre for Evaluation (ACE) in the Australian Treasury. He previously worked for five years at the Behavioural Economics Team of the Australian Government (BETA). Before that, he worked for many years in the...
Vera Newman
Assistant Director
Dr Vera Newman is an Assistant Director in the Impact Evaluation Unit at the Australian Centre for Evaluation. She has many years' experience conducting impact evaluations in the private and public sector, and is dedicated to applying credible methods to public policy for generating...
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

The ACT Evidence and Evaluation Academy 2021-24: Lessons learned from a sustained whole-of-government ECB effort
Friday September 20, 2024 11:30am - 12:00pm AEST
105
Authors: Duncan Rintoul (UTS Institute for Public Policy and Governance (IPPG)), George Argyrous (UTS Institute for Public Policy and Governance (IPPG), AU), Tish Creenaune (UTS Institute for Public Policy and Governance (IPPG), AU), Narina Dahms (ACT Government: Chief Minister, Treasury and Economic Development Directorate, AU), Peter Robinson (ACT Government: Chief Minister, Treasury and Economic Development Directorate, AU), Robert Gotts (ACT Government: Chief Minister, Treasury and Economic Development Directorate, AU)

The ACT Evidence and Evaluation Academy is a prominent and promising example of sustained central agency investment in evaluation capability building (ECB).

The Academy was launched in 2021 as a new initiative to improve the practice and culture of evidence-based decision-making in the ACT public sector. Its features include:
  • a competitive application process, requiring executive support and financial co-contribution
  • a series of in-person professional learning workshops where participants learn alongside colleagues from other Directorates
  • a workplace project, through which participants apply their learning, receive 1-1 coaching, solve an evaluation-related challenge in their work and share their insights back to the group
  • executive-level professional learning and practice sharing, for nominated evaluation champions in each Directorate
  • sharing of resources and development of evaluation communities of practice in the Directorates
  • an annual masterclass, which brings current participants together with alumni and executive champions.

Four years and over 100 participants later, the Academy is still going strong. There has been an ongoing process of evaluation and fine-tuning from one cohort to the next, with encouraging evidence of impact. This impact is seen not only for those individuals who have taken part but also for others in their work groups, including in policy areas where evaluation has not historically enjoyed much of a foothold.

The learning design of the Academy brings into focus a number of useful strategies - pedagogical, structural and otherwise - that other central agencies and line agencies may like to consider as part of their own ECB efforts.

The Academy story also highlights some of the exciting opportunities for positioning evaluation at the heart of innovation in the public sector, particularly in the context of whole-of-government wellbeing frameworks, cross-agency collaboration and strategic linkage of data sets to support place-based outcome measurement.

Chair & Speakers
Duncan Rintoul
Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, runs a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
George Argyrous
Head of Measurement, Evaluation, Research, and Learning, Paul Ramsay Foundation
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Designing baseline research for impact: The SKALA experience
Friday September 20, 2024 12:00pm - 12:30pm AEST
Authors: Johannes Prio Sambodho (SKALA), Ratna Fitriani (SKALA, ID)

SKALA (Sinergi dan Kolaborasi untuk Akselerasi Layanan Dasar - Synergy and Collaboration for Service Delivery Acceleration) is a significant Australian-Indonesian cooperation that focuses on enhancing parts of Indonesia's extensive, decentralised government system to accelerate better service delivery in underdeveloped regions. As part of its End of Program Outcome for greater participation, representation, and influence for women, people with disabilities, and vulnerable groups, SKALA is commissioning baseline research focused on understanding multi-stakeholder collaboration for mainstreaming Gender Equality, Disability, and Social Inclusion (GEDSI) in Indonesia. The program has designed a mixed-method study consisting of qualitative methods to assess the challenges and capacity gaps of GEDSI civil society organisations (CSOs) in actively participating in and contributing to the subnational planning and budgeting process, coupled with a quantitative survey to measure trust and confidence between the same CSOs and the local governments with whom they engage.

The paper first discusses the baseline study's design and its alignment with SKALA's strategic goals, and considers how the research might itself contribute to improved working relationships in planning and budgeting at the subnational level. Second, the paper discusses approaches taken by the SKALA team to design a robust programmatic baseline that is also clearly useful in program implementation. These include: a) adopting an adaptive approach that translates key emerging issues from grassroots consultations and the broader governmental agenda into research objectives; b) locating the study within a broader empirical literature to balance practical baseline needs with academic rigour; and c) fostering collaboration with the program implementation team to ensure the study serves both evaluation and programmatic needs.

Lastly, based on SKALA's experience, the paper will argue for closer integration of research and implementation teams within programs, which can support systems-informed methodologies, and will consider ways in which this can be practically accomplished.
Chair
Allison Clarke
Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre...
Speakers
Johannes Prio Sambodho
Research Lead, SKALA
Dr. Johannes Prio Sambodho is the Research Lead for SKALA, a significant Australian-Indonesian development program partnership aimed at improving basic service governance in Indonesia. He is also a former lecturer in the Department of Sociology at the University of Indonesia. His...
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

If the destination is improvement, recommendations are the signpost
Friday September 20, 2024 12:00pm - 12:30pm AEST
106
Authors: Laura Baker (ACIL Allen), Larissa Brisbane (Department of Climate Change, Energy, the Environment and Water NSW, AU)

Recommendations are the sharp end of evaluation, connecting evidence and insights to the improvement we aim to achieve. Many evaluation theories focus on framing and conducting evaluations, rather than developing recommendations or the associated organisational change required to complete the journey.

Recommendations point the way beyond an evaluation report, as the journey doesn't end when the report is produced. This presentation tells the story of recommendation wayfinding. We will share an evaluation practitioner's and a commissioner's journey in navigating the challenge of developing actionable recommendations to promote impact beyond program close and into future decisions.

Evaluators need ways to integrate diverse evidence sources and generate actionable insights. The consultant will share perspectives on where these insights and the associated recommendations "come from": how different data come together to inform insights, the process for developing recommendations (balancing independence and engagement from commissioners), and how to design recommendations for the program and beyond.

Commissioners need recommendations that make sense in their context. The commissioners will share considerations in what makes a recommendation useful, and how we use this evaluation journey to leverage learning, skill-building, and improvement opportunities. They will also discuss the evaluation audience and how ambitious you can get with recommendations.

This work over a number of years has helped build the evaluation knowledge base within our organisations. We will close with our recommendations to you - the top ideas that we plan to take with us on our next evaluation journey.
Chair
Rachel George
Director, Research, Monitoring and Evaluation Practice, Tetra Tech International Development
Speakers
Larissa Brisbane
Team Leader, Strategic Evaluation, Dept of Climate Change, Energy, the Environment and Water NSW
It was a short step from studying environmental science, and working on cross-disciplinary problem-solving, to evaluation where I still ask 'why' and 'how do you know that'. I love hearing stories of what you've done and learned, especially in energy, climate change, environment and...
Laura Baker
Principal, ACIL Allen
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Finding ways to empower multicultural survivors of violence through evaluation: strategies, learnings and reflections
Friday September 20, 2024 1:30pm - 2:00pm AEST
104
Authors: Lydia Phillips (Lydia Phillips Consulting), Jo Farmer (Jo Farmer Consulting)

As evaluators, we often work with people who have experienced trauma and/or marginalisation (whether we realise it or not!). We're also seeing increased recognition in government and community organisations of the importance of lived experience and cultural safety in program design, implementation and evaluation.

Beginning an evaluation with a clear plan for how you'll engage and empower people from diverse cultural backgrounds and people who have experienced trauma can help to ensure success - of your project and of participants' experience.

So how can you design an evaluation framework to recognise diverse cultural backgrounds and empower survivors of violence?

And how can evaluators who don't have lived experience or come from those cultural backgrounds best navigate the design process?

This session will share strategies, learnings and reflections from a project working with a multicultural family violence service to develop a culturally-safe, trauma-informed evaluation framework for a two-year program.

It will:
  • explore what worked well and what was challenging in the project
  • discuss similarities and differences in the concepts of culturally-safe and trauma-informed practice, drawing on current literature; and
  • pose questions and provide suggestions for evaluators who want to develop their skills in culturally safe and trauma-informed evaluation practice.

The session will offer key tips and strategies that are translatable to other contexts and conclude with reflective questions for attendees.

Chair
Kira Duggan
Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development...
Speakers
Lydia Phillips
Principal Consultant, Lydia Phillips Consulting
I operate an independent consulting practice, providing evaluation and social policy services to community organisations and government. With a background in law and social policy, I have more than 15 years' experience building and using evidence in order to create positive social...
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Learning from the past: Reflections and opportunities for embedding measurement and evaluation in the national agenda to end Violence against Women and Children
Friday September 20, 2024 1:30pm - 2:30pm AEST
106
Authors: Lucy Macmillan (ANROWS), Micaela Cronin (Domestic, Family and Sexual Violence Commission, AU), Tessa Boyd-Caine (ANROWS, AU), Tiffiny Lewin (Lived Experience Advisory Council Member, National Lived Experience Advisory Council, AU)

As evaluators, we are often asked to examine complex systems-change initiatives. Domestic, family and sexual violence is a national crisis. In late 2022, the Commonwealth Government, alongside all state and territory governments, released the second National Plan to End Violence against Women and Children 2022-2032. The plan provides an overarching national policy framework to guide action across all parts of society, including governments, businesses, media, educational institutions and communities, to achieve a shared vision of ending gender-based violence in one generation.

After 12 years of implementation under the first National Plan, assessing whether our efforts had made a meaningful difference towards ending violence against women was a difficult task. We ask: As we embark on setting up measurement and evaluation systems against the second National Plan, how do we avoid making the same mistakes again?

The Domestic, Family and Sexual Violence Commission was established in 2022 to focus on practical and meaningful ways to measure progress towards the objectives outlined in the National Plan. This session will discuss:
  1. the current plans, opportunities and challenges in monitoring progress, and evaluating the impact of this national framework, and
  2. the role of lived experience in evaluation, and how large publicly funded institutions can balance their monitoring and sensemaking roles at the national level with accountability to victim-survivors.

The panel will explore common challenges faced when seeking to monitor and evaluate complex national policy initiatives, including data capture, consistency and capacity, and explore some of the opportunities ahead.

The audience will have the opportunity to contribute their insights and expertise on how we, as evaluators, approach the evaluation of complex systems-change at a national scale, and over extended durations, while also prioritising the voices of those most affected. How do we collectively contribute to understanding if these national policy agendas will make a difference?


Chair
Milena Gongora
Associate Director - Water Quality, Great Barrier Reef Foundation
Milena’s area of interest is nature conservation. With over 14 years of experience, her work ranges from managing the Mekong River to enhancing the resilience of the Great Barrier Reef. Over most of this time, her roles have involved evaluating the success of conservation initiatives...
Speakers
Lucy Macmillan
Director, Evaluation & Impact, ANROWS
Lucy has more than 20 years of monitoring and evaluation experience in both the Australian and international contexts. She is trained in trauma-informed and culturally safe approaches, and committed to ensuring that the voices of people with lived experience are respected and heard...
Tessa Boyd-Caine
CEO, ANROWS
Tessa was born and grew up on unceded Gadigal land (Sydney), where she lives again after living overseas, including in England, China and India. Prior to joining ANROWS in 2024, Tessa was the founding CEO of Health Justice Australia, the national centre for health justice partners...
Micaela Cronin
Domestic, Family and Sexual Violence Commissioner, Domestic, Family and Sexual Violence Commission
Micaela Cronin began her career as a social worker in family violence and sexual assault services. Since then, she has held leadership roles across the social service sector in Australia and internationally, including as President of the Australian Council of Social Service. Micaela...
Tiffiny Lewin
Tiffiny is a lived-experience advocate and survivor of childhood sexual abuse, family violence and sexual assault. Her 30-year career spanning industry sectors across Australia and Japan informs her deep understanding of leading transformational change in diverse cultural, regulatory...
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Investment logic mapping or evaluation logic modelling? Similarities and differences.
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Charlie TULLOCH (Policy Performance)

Evaluation logic modelling is a frequently used technique, looking at such things as inputs, activities, outputs and outcomes.
Since the early 2000s, the Department of Treasury and Finance (Victoria) has used an adapted logic modelling format called Investment Logic Mapping (ILM). It is now used nation-wide and internationally to support resource allocation planning, along with stakeholder engagement.

ILMs and evaluation logic modelling have many similarities, but some major differences.

This ignite presentation will compare and contrast these two tools, and describe when and why to use each.

Attendees will very quickly understand the main similarities and differences, and the advantages and drawbacks of each.
Chair
Carina Calzoni
Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and...
Speakers
Charlie Tulloch
Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Program Evaluation Fundamentals in the NSW Department of Planning, Housing and Infrastructure: An eLearning course on evaluation
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Anabelle (Pin-Ju) Chen (NSW Department of Planning, Housing and Infrastructure)

Introducing Program Evaluation Fundamentals (PEF) in the NSW Department of Planning, Housing and Infrastructure, an eLearning course designed to facilitate a coherent journey of learning within the department. With learning and adapting together in mind, the design of PEF empowers individuals at all levels to navigate the fundamentals of evaluation. Through interactive modules, learners will understand key evaluation concepts and cultivate best practices. PEF promotes transformative growth by emphasising foundational evaluation knowledge. By leveraging PEF, we can shift our approach, embrace innovation, and advance the field of evaluation across the public sector, fostering a supportive community of forward-thinking evaluators.
Chair
Carina Calzoni
Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and...
Speakers
Anabelle (Pin-Ju) Chen
Senior Analyst, Evidence and Evaluation, NSW Department of Planning, Housing and Infrastructure
Anabelle (Pin-Ju) Chen is a distinguished senior analyst hailing from Taiwan, with a global perspective on evaluation, data analysis, and project management. Having studied in Taiwan, the United Kingdom, and Australia, Anabelle brings a diverse range of experiences and insights to...
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Squaring up with rubrics
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Josh Duyker (Centre for Evaluation and Research Evidence, Victorian Department of Health)

Much like Felix the Cat, evaluators have a bag of tricks to get us out of sticky situations. But when you are staring face to face with a complex evaluand, juggling tricky stakeholders whilst sat on a mountain of data, it's not always clear what 'trick' you need! One twisted potential solution is the colourful, yet humble rubric. In this reflective practice ignite presentation, I will guide you through our journey of using rubrics as a tool to wayfind through an evaluation, and our key takeaways on how rubrics can support evaluators to make comprehensive and balanced evaluative judgements.
Chair
Carina Calzoni
Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and...
Speakers
Josh Duyker
Evaluation and Research Officer, Centre for Evaluation and Research Evidence
I am an emerging evaluator, currently working at the Centre for Evaluation and Research Evidence in the Victorian Department of Health. I've completed a Master of Public Health and am embarking on a Master of Evaluation. Through roles in the not-for-profit sector and my studies...
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia
 