Conference hashtag #aes24MEL
Short paper
Wednesday, September 18
 

11:00am AEST

Evaluation that adds value for People and Planet: Perspectives, Challenges, and Opportunities for Indigenous Knowledge Systems in Africa.
Wednesday September 18, 2024 11:00am - 11:30am AEST
104
Authors: Awuor Ponge (African Policy Centre (APC))

Indigenous Knowledge Systems (IKS) in Africa have long been marginalized and undervalued, despite their potential to offer sustainable solutions to pressing challenges faced by communities across the continent. This presentation explores the perspectives, challenges, and opportunities for incorporating IKS into evaluation practices that create value for both people and the planet.

From a people-centric perspective, IKS offer a holistic and culturally relevant approach to understanding local contexts, priorities, and value systems. By embracing these knowledge systems, evaluations can better capture the multidimensional nature of well-being, including spiritual, social, and environmental aspects that are often overlooked in conventional evaluation frameworks. However, challenges arise in reconciling IKS with dominant Western paradigms and navigating power dynamics that have historically suppressed indigenous voices.

From a planetary perspective, IKS offer invaluable insights into sustainable resource management, biodiversity conservation, and climate change adaptation strategies that have been honed over generations of lived experiences. Integrating these knowledge systems into evaluation can shed light on the intricate relationships between human activities and ecosystem health, enabling more informed decision-making for environmental sustainability. Nonetheless, challenges exist in bridging the divide between traditional and scientific knowledge systems, as well as addressing concerns around intellectual property rights and benefit-sharing.

This presentation will explore innovative approaches to overcoming these challenges, such as participatory and community-based evaluation methodologies, capacity-building initiatives, and cross-cultural dialogue platforms. By fostering a deeper appreciation and understanding of IKS, evaluation practices can become more inclusive, relevant, and effective in creating value for both people and the planet in the African context.


Chair
Alice Muller
Senior Monitoring & Evaluation Advisor: FMNR Scale Up, World Vision Australia
An environmental scientist, working in international development, interested in evaluation and learning about all things community, trees, ecosystem restoration, climate action, scaling and systems transformation. I also really like coffee and chatting about gardening, travel and...
Speakers
Awuor Ponge
Senior Associate Research Fellow, African Policy Centre (APC)
Dr. Awuor Ponge is a Senior Associate Fellow in charge of Research, Policy and Evaluation at the African Policy Centre (APC). He is also the Vice-President of the African Evaluation Association (AfrEA). He holds a Doctor of Philosophy (PhD) Degree in Gender and Development Studies...
Wednesday September 18, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Culturally Responsive Initiatives: Introducing the First Nations Investment Framework
Wednesday September 18, 2024 11:30am - 12:00pm AEST
104
Authors: Eugenia Marembo

Representatives of First Nations communities have been advocating for changes in the way initiatives are planned, prioritised, and assessed. This includes greater visibility on where funding is going, more partnerships on designing initiatives and more evaluation on the outcomes being achieved, to inform government decision making.

This paper presents key insights on what constitutes good practice when designing and appraising initiatives that affect First Nations people and communities. The National Agreement on Closing the Gap is built around four new Priority Reforms that will change the way governments work with Aboriginal and Torres Strait Islander people and communities. Priority Reform Three is about transforming government institutions and organisations. As part of this Priority Reform, parties commit to systemic and structural transformation of mainstream government organisations to improve accountability, and to respond to the needs of First Nations people.

The findings presented in this paper draw on insights from consultations with various First Nations community representatives and government stakeholders in New South Wales, and the subsequent process of developing a government department's First Nations investment framework, which seeks to strengthen the evidence on what works to improve outcomes for First Nations people. The framework also aims to improve practice across government processes and better inform how initiatives are designed, prioritised and funded.
Chair
Alice Muller
Senior Monitoring & Evaluation Advisor: FMNR Scale Up, World Vision Australia
An environmental scientist, working in international development, interested in evaluation and learning about all things community, trees, ecosystem restoration, climate action, scaling and systems transformation. I also really like coffee and chatting about gardening, travel and...
Speakers
Steven Legg
Associate Director, NSW Treasury
Eugenia Marembo
Senior Analyst, First Nations Economic Wellbeing, NSW Treasury
Wednesday September 18, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

From bottlenecks to breakthroughs: Insights from a teacher workforce initiative evaluation
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
104
Authors: Rhiannon Birch (Victorian Department of Education), Hayden Jose (Urbis, AU), Joanna Petkowski (Victorian Department of Education, AU), Ekin Masters (Victorian Department of Education, AU)

How can an evaluation balance the need to generate pragmatic insights while meeting central agency requirements for rigorous measurement of outcomes? What ingredients can facilitate the effective evaluation of a government initiative and achieve improved outcomes? This paper explores the essential ingredients for evaluating a large-scale government program using an example of a statewide initiative aimed at attracting and retaining suitably qualified teachers in hard-to-staff positions in Victorian government schools.

We showcase how an adaptive and evidence-led method of enquiry helped identify program implementation bottlenecks and probe potential unintended program outcomes over a three-year evaluation. We discuss enablers for the integration of evaluation recommendations into program implementation and future policy direction, particularly participatory action approaches and deep relationships with policy and implementation teams. We will also present the robust and varied methodology, particularly the novel use of system data to facilitate a quasi-experimental design that aligned with central agency requirements and met stakeholder needs.
This presentation will benefit policymakers, program evaluators, and others interested in evaluating government programs, by sharing key learnings on how evaluations can balance pragmatic insights with central agency requirements and identifying the key elements for influencing such programs and achieving improved outcomes.
Chair
Alice Muller
Senior Monitoring & Evaluation Advisor: FMNR Scale Up, World Vision Australia
An environmental scientist, working in international development, interested in evaluation and learning about all things community, trees, ecosystem restoration, climate action, scaling and systems transformation. I also really like coffee and chatting about gardening, travel and...
Speakers
Rhiannon Birch
Senior Evaluation and Research Officer, Department of Education
Rhiannon is a dedicated research and evaluation specialist committed to enhancing health, social, education, and environmental outcomes for people and the planet. With over 10 years of experience in evaluation, she has worked extensively across emergency services, public health, and...
Hayden Jose
Associate Director, Urbis
Hayden brings 13 years’ experience as an evaluator, applied researcher and policy practitioner with extensive work in complex evaluations in government and not-for-profit settings. Across his roles, he has worked to consider complex system problems and translate evidence effectively...
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Warlpiri ways of assessing impact - How an Aboriginal community is defining, assessing and taking action for a good life in their community.
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
Authors: Emily Lapinski (Central Land Council), Marlkirdi Napaljarri Rose (Institute for Human Security and Social Change, La Trobe University, AU), Glenda Napaljarri Wayne (Central Land Council, AU), Geoffrey Jungarrayi Barnes (Central Land Council, AU), Alex Gyles (Institute for Human Security and Social Change, La Trobe University, AU)

For evaluation to support transformational change, research suggests strategies must focus on localised Indigenous values, beliefs and worldviews. Decolonising evaluation involves identifying and addressing power and considering what is being evaluated, by whom and how. In this paper we argue that these developments are necessary but insufficient and suggest a possible way forward for further decolonising the field of evaluation. To support change for Indigenous Australians the emphasis needs to move from simple evaluation of individual programs to more critical examination of their combined impact on communities from local perspectives.

This paper explores how Warlpiri and non-Indigenous allies are collaborating to create and use their own community-level impact assessment tool. The 5-year Good Community Life Project is supporting Warlpiri residents of Lajamanu in the Northern Territory to define, assess and take action for a 'good community life'. Warlpiri will explain how they created the approach for assessing wellbeing in Lajamanu, and how they are using emerging results to give voice to their interests and advocate for the life they envision for future generations.

The project involves collaboration between Warlpiri community members, land council staff and university researchers, drawing on Indigenous concepts of 'two-way' seeing and working, relationality, and centring Indigenous voice and values. Applying these concepts in practice is challenging, particularly for non-Indigenous allies who must constantly reflect and use their privilege to challenge traditional views on 'robust' evaluation methodology.

Warlpiri and the land council see potential for this work to improve life in Lajamanu and as an approach that could be applied across Central Australian communities. Going beyond co-designed and participatory evaluation to critical examination of impact is the next step in supporting change with Indigenous communities. This paper will focus on Warlpiri perspectives, plus brief reflections from non-Indigenous allies, with time for the audience to discuss broader implications.
Speakers
Allison Clarke
Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre...
Emily Lapinski
Monitoring, Evaluation and Learning Coordinator, Central Land Council
Alex Gyles
Research Fellow - Monitoring and Evaluation, Institute for Human Security and Social Change, La Trobe University
Alex Gyles is a Research Fellow working in Monitoring, Evaluation and Learning (MEL) at the Institute for Human Security and Social Change, La Trobe University. He works closely with Marlkirdi Rose Napaljarri on the YWPP project and finds fieldwork with the YWPP team an exciting learning...
Glenda Napaljarri Wayne
Glenda Wayne Napaljarri is a community researcher on the YWPP project from Yuendumu. She has developed her practice working as an adult literacy tutor in Yuendumu's Community Learning Centre. In addition to conducting research in her home community of Yuendumu, Glenda has travelled...
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Navigating collaborative practice - the role of evaluation in brokering shared outcomes
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
106
Authors: Caroline Crothers (Allen + Clarke Consulting)

A collaborative effort between community organisations and Victoria Police has demonstrated significant impact in addressing youth offending in Victoria's northwest metropolitan region. This initiative brought together 12 partner organisations from various sectors, including police, legal services, and youth support services, around the shared goal of reducing youth offending. By diverting young offenders from the criminal justice system, the initiative seeks to enhance overall justice, health, and wellbeing outcomes for vulnerable youth.

Allen + Clarke was commissioned to evaluate the success of this initiative during its inaugural year. In this presentation, we share key lessons learned from the evaluation, including how minimally resourced and small-scale interventions can have an outsized impact on organisational culture and practice. We also reflect on the journey embarked upon and explore how the evaluation process itself serves as a tool for navigating complex challenges and adapting to changes encountered along the way. Through critical reflection, the presentation delves into the differing perspectives of the delivery partners involved, highlighting how the evaluation journey facilitates a shared understanding of the path forward and shapes future strategies and interventions.
Chair
Anthea Rutter
Research Fellow, Centre for Program Evaluation, The University of Melbourne
Anthea Rutter is a Senior Research Fellow in the Assessment and Evaluation Research Centre (formerly the Centre for Program Evaluation) at The University of Melbourne. She has extensive experience working with a wide range of community, state and national organisations. She is particularly...
Speakers
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Enhancing Stakeholder Engagement Through Culturally Sensitive Approaches: A Focus on Aboriginal and Torres Strait Islander Communities
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
105
Authors: Mark Power (Murawin ),Carol Vale (Murawin, AU)

This presentation explores the paramount importance of culturally sensitive engagement methodologies in ensuring meaningful contributions from Aboriginal and Torres Strait Islander communities to mission programs. Murawin, an Aboriginal-led consultancy, has developed a robust Indigenous Engagement Strategy Framework grounded in the principles of reciprocity, free, prior and informed consent, mutual understanding, accountability, power sharing, and respect for Indigenous knowledge systems. Our session aims to share insights into the necessity of prioritising Aboriginal and Torres Strait Islander voices in engagement, co-design, and research, highlighting the significance of cultural competence in fostering mutual respect and understanding.
We will discuss three key messages: the imperative of deep knowledge and understanding of Aboriginal and Torres Strait Islander cultures in engagement practices; the success of co-design processes in facilitating genuine and respectful engagement; and the strategic partnership with CSIRO to enhance cultural competence and inclusivity in addressing Indigenous aspirations and challenges. These points underscore the critical role of acknowledging cultural interactions and ensuring cultural sensitivity in building strong, respectful, productive relationships with Indigenous communities.
To achieve our session's objectives, we have designed an interactive format that blends informative presentations with the analysis of case studies, complemented by engaging intercultural discussions. This approach is intended to equip participants with actionable insights drawn from real-world examples of our collaborative ventures and co-designed projects. Through this comprehensive exploration, we aim to enrich participants' understanding of successful strategies for engaging Aboriginal and Torres Strait Islander communities, ultimately contributing to the achievement of more inclusive and impactful outcomes in mission programs and beyond.


Chair
Jade Maloney
Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research...
Speakers
Carol Vale
CEO & Co-founder, Murawin
Carol Vale is a Dunghutti entrepreneur, businesswoman, CEO and co-founder of Murawin, whose passion, determination and commitment have driven her impressive 40-year career as a specialist in intercultural consultation, facilitation, and participatory engagement, and an empathetic...
Mark Power
Director, Evaluation & Research, Murawin
Mark is an experienced researcher with more than 20 years of experience in Australia and the Pacific. Mark manages Murawin’s evaluation and research practice and leads multiple evaluations for a variety of clients. Mark has overseen more than 30 high-profile, complex projects funded...
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Valuing First Nations Cultures in Cost-Benefit Analysis
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
103
Authors: Laura Faulkner (NSW Treasury)

This paper presents the key findings from research and engagement on how cost-benefit analysis (CBA) has been applied to First Nations initiatives to date. CBA is an important tool used by governments to help prioritise budget funding decisions. It assesses the potential impacts of an initiative - economic, social, environmental, and cultural - to determine whether it will deliver value for money.
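For readers less familiar with the technique, the standard decision rule can be stated compactly (a textbook formulation offered here for context, not notation drawn from this paper): discount each year's benefits B_t and costs C_t at rate r over horizon T, and the initiative represents value for money when the result is positive:

\mathrm{NPV} = \sum_{t=0}^{T} \frac{B_t - C_t}{(1+r)^t} > 0

The difficulty the paper addresses sits inside B_t: cultural impacts valued by First Nations communities rarely carry market prices, so whether and how they are quantified shapes the result.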

The paper explores the methods in which the value of First Nations cultures has been incorporated into CBAs, along with the associated challenges and opportunities to improve current practice. The findings have informed the development of an investment framework for the design and evaluation of initiatives that affect First Nations people and communities. The framework focuses on the key principles for embedding First Nations perspectives and ensuring culturally informed evaluative thinking.


Chair
Christina Kadmos
Principal, Kalico Consulting
Speakers
Laura Faulkner
Senior Analyst, First Nations Economic Wellbeing, NSW Treasury
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Enhancing evaluation value for small community organisations: A case example
Wednesday September 18, 2024 1:30pm - 2:00pm AEST
104
Authors: Stephanie Button (Assessment and Evaluation Research Centre, University of Melbourne), Allison Clarke (Assessment and Evaluation Research Centre, University of Melbourne, AU), Carolyn McSporran (Blue Light Victoria, AU), Elissa Scott (Blue Light Victoria, AU)

This presentation aims to provide a case example of how two small-scale, standard process/outcomes evaluations for a low-budget community organisation increased value for the organisation by identifying and seizing opportunities for evaluation capacity building. Formal evaluations represent a significant financial commitment for low-budget community organisations. By maximising the value provided by such evaluations, evaluators can contribute more to these organisations' mission and ultimately to social betterment.

There are numerous evaluation capacity building models and frameworks, many of which are quite complex (for example: Volkov & King, 2007; Preskill & Boyle, 2008). Many emphasise planning, documentation, and other resource-intensive components as part of any evaluation capacity building effort. This session provides a case example of intentional but light-touch and opportunistic evaluation capacity building. Through such an approach, evaluators may need to do only minimal additional work to provide extra value to an organisation. Reflection-in-action during the evaluation process is as important as the final reporting (Schwandt & Gates, 2021). The session emphasises, though, that a critical enabler is the organisation's leadership, culture, and willingness to seize the opportunity offered by a formal evaluation. The session is co-presented by two members of the evaluation team and the Head of Strategy, Insights, and Impact of the client organisation.
Chair
Duncan Rintoul
Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
Speakers
Allison Clarke
Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre...
Stephanie Button
Research Associate & Evaluator, Assessment & Evaluation Research Centre
Stephanie has worked as a policy manager, analyst, strategist, researcher, and evaluator across the social policy spectrum in the public and non-profit sector for over 12 years. She is passionate about evidence-based policy, pragmatic evaluation, and combining rigour with equitable...
Carolyn McSporran
Head of Strategy, Insights and Impact, Blue Light Victoria
Passionate about social inclusion, Carolyn's work has spanned diverse portfolios across the justice and social services sectors. With a fervent belief in the power of preventative and early intervention strategies, she is committed to unlocking the full potential of individuals and...
Wednesday September 18, 2024 1:30pm - 2:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Evaluation for whom? Shifting evaluation to increase its value for local actors
Wednesday September 18, 2024 2:00pm - 2:30pm AEST
104
Authors: Linda Kelly (Praxis Consultants), Mary Raori (UNDP Pacific, FJ)

This presentation outlines an approach to program assessment of a long-term governance program working across the Pacific, the UNDP Governance for Resilience program. It tells the story of the program's maturing evaluation approach, which has shifted from serving the information needs of those with money and power to focus more particularly on the values and interests of local participants and partners.
Despite the well-documented limitations of single-methodology evaluation approaches for complex programs, many international development donors and corresponding international and regional organisations continue to require program assessment that serves their needs and values. Typically, this includes narrowing evaluation to assessment against quantitative indicators. Notwithstanding the extensive limitations of this approach, it serves the (usually short-term) needs of international donors and other large bureaucracies. It generates simple information that can be communicated and showcased in uncritical forms. It provides numbers that are easily aggregated and used for concise reporting to senior and political masters.
Such approaches risk crowding out attention to the information needs of other participants and undermine attempts to support more locally led processes. This presentation will explain how this long-term and large-scale program has shifted, making use of a values-based evaluative approach to better serve the interests of partners and participants in the Pacific. This has involved both a methodological and political shift: broadening the range of data collection and analysis methodologies and approaches, increasing resourcing to accommodate different types of data and data collection, and internal and external advocacy. This one program's experience echoes wider views across the Pacific about the limitations of externally imposed measures and the lack of attention to what is valued by Pacific countries and people.


Chair
Duncan Rintoul
Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
Speakers
Linda Kelly
Director, Praxis Consultants
Lisa Buggy
Strategy, Learning and Innovation Specialist, UNDP Pacific Office
Ms. Lisa Buggy commenced with the UNDP Pacific Office in Fiji in January 2021 and has recently transitioned into the role of Strategy, Learning and Innovation Specialist with the Governance for Resilient Development in the Pacific project. Her current role focuses on influencing systems...
Linda Vaike
Programme Adviser - Climate Risk Finance and Governance, Pacific Islands Forum Secretariat
Wednesday September 18, 2024 2:00pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Navigating a path to system impact: designing a strategic impact evaluation of education programs
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
104
Authors: Amanda Reeves (Victorian Department of Education), Rhiannon Birch (Victorian Department of Education, AU), Eunice Sotelo (Victorian Department of Education, AU)

To provide insight to complex policy problems, evaluations need to adopt a systems perspective and examine the structures, relationships and contexts that influence program outcomes.

This paper outlines the design of a 4-year strategic evaluation that seeks to understand how a portfolio of over 25 education programs is interacting and collectively contributing to system-level outcomes. In this context, policy makers require evaluation to look beyond the boundaries of individual programs and assess the holistic impact of this investment, to inform where and how resources can be directed to maximise system outcomes.

The strategic evaluation presented is theory-based and multi-layered, using logic modelling to identify outcomes at the program, cluster and system level and draw linkages to develop a causal pathway to impact. The strategic evaluation and the evaluations of individual education programs are being designed together to build in common measures that enable meta-analysis and synthesis of evidence to assess system-level outcomes. The design process has been broad and encompassing, considering a diverse range of methods to understand impact, including quantitative scenario modelling and value for money analysis.

The authors will describe how the strategic evaluation has been designed to respond to system complexity and add value. The evaluation adopts an approach that is:
• interdisciplinary, drawing on a range of theory and methods to examine underlying drivers, system structures, contextual factors and program impacts
• collaborative, using expertise of both internal and external evaluators, to design evaluations that are aligned and can tell a story of impact at the system-level
• exploratory, embracing a learning mindset to test and adapt evaluation activities over time.

This paper will be valuable for anyone who is interested in approaches to evaluating the relative and collective contribution of multiple programs and detecting their effects at the system level to inform strategic decision-making.
Chair
Duncan Rintoul
Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
Speakers
Amanda Reeves
Principal Evaluation Officer, Victorian Department of Education
Amanda is an evaluation specialist with over 12 years' experience leading evaluation projects in government, not-for-profit organisations and as a private consultant. She has worked across a range of issues and sectors including in education, youth mental health, industry policy and...
Eunice Sotelo
Senior Evaluation & Research Officer, Department of Education (Victoria)
I'm here for evaluation but passionate about so many other things - education (as a former classroom teacher); language, neuroscience and early years development (recently became a mom so my theory reading at the moment is on these topics); outdoors and travel. Workwise, I'm wrangling...
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

If treaty is like a marriage, state evaluation needs sustained deep work: Evaluation and Victoria's First Peoples Treaty
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
105
Authors: Kate Nichols (Department of Jobs, Skills, Industry and Regions - Victoria), Milbert Gawaya (Department of Jobs, Skills, Industry and Regions, AU)

First Peoples/settler state treaties have been likened to marriage - an evolving and changeable (political) relationship, not an endpoint or divorce (Blackburn, 2007). But what does this look like in practice, given marriage's chequered associations, from power imbalance and violence through to romance and deep, trusting companionship?

Contemporary colonial 'settlerism' (after Aunty/Dr Lilla Watson, in Watego, 2021) is undergoing structural change in Victoria, with Victoria's First Peoples sitting down with the Victorian State Government in 2024 to commence statewide treaty negotiations. Treaty is an acknowledgement that British sovereignty did not extinguish Aboriginal sovereignty, opening up a "third space of sovereignty" (after Bruyneel, 2007) where co-existing sovereigns can further contest the "sovereignty impasse" (ibid.), while Indigenous people control their own affairs.

Treaty is expected to reshape how the Victorian state government operates, challenging state laws, institutions, policies, programs and processes, which together, have contributed to First Nations disadvantage and suffering. Government evaluation practices will need their own shake-up.

How can public sector evaluators help establish an equal, strong and nourishing treaty marriage? This short paper shares emerging ally insights into how local practices are evolving to support Victoria's Treaty and self-determination. It shares reflections from a recent evaluation of Traditional Owner grant programs, conducted in partnership between key Aboriginal and non-Aboriginal public sector staff. It is a story of both-ways practice and the time, trust and bravery required to achieve deep change. It also highlights the role of lifelong cultural learning and behaviour change for ally evaluators. Culturally responsive evaluation, Indigenous research practices, restorative justice and the AES First Nations Cultural Safety Framework provide useful framing. Although focused on the Victorian treaty context, the paper may be transferable to other jurisdictions and evaluations involving or impacting Aboriginal and Torres Strait Islander peoples in support of their sovereignty and self-determination.
Chair
Martina Donkers
Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured...
Speakers
Kate Nichols
Senior Evaluator, Department of Economic Development, Jobs, Transport & Resources
I've been a practising evaluator since Missy Elliot released 'Work it' which a) reveals a bit too much about my age, but b) gives you a sense of how much I'm into this stuff. I've recently returned to an evaluation role in the Victorian public sector after working in a private sector...
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

When "parachuting in" is not an option: Exploring value with integrity across languages, continents and time zones
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
106
Authors: Julian King (Julian King & Associates), Adrian Field (Dovetail)

The rapid growth of video-conferencing technology has increased the ability for evaluations to be conducted across multiple countries and time zones. People are increasingly used to meeting and working entirely online, and evaluations can in principle be designed and delivered without the need for face-to-face engagement. AI translation software can even break through language barriers, providing further efficiencies and enabling evaluation funds to be directed more towards design, data gathering and analysis.

Yet the efficiency of delivery should not compromise the integrity with which an evaluation is conducted. This is particularly true in situations where different dimensions of equity come into question, and in evaluations conducted across two or more languages, where design and delivery must be meaningful and accessible to all participants, not just the funder.

The growth of remote evaluation presents a very real, and arguably even more pressing, danger of the consultant "parachuting in" and offering solutions that have little or no relevance to the communities at the centre of the evaluation process.

In this presentation we explore the wayfinding process in designing and implementing a Value for Investment evaluation of an urban initiative focused on the developmental needs of young children in Jundiaí, Brazil. We discuss the challenges and opportunities presented by a largely (but ultimately not entirely) online format, in leading a rigorously collaborative evaluation process and gathering data in a way that ensures all stakeholder perspectives are appropriately reflected. We discuss the trade-offs involved in this process, the reflections of evaluation participants, and the value of ensuring that underlying principles of collaborative and cross-cultural engagement are adhered to.

Chair
Melinda Mann
Academic Lead Jilbay First Nations RHD Academy, CQUniversity
Melinda Mann is a Darumbal and South Sea Islander woman based in Rockhampton, Qld. Her work focuses on Indigenous Nation building, Pacific sovereignties, and regional and rural communities. Melinda has a background in student services, learning design, school and tertiary education...
Speakers
Julian King
Director, Julian King & Associates
I’m an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I’m affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/...
Adrian Field
Director, Dovetail
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social...
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Elevating evaluation: practical insights for supporting systems transformation
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
Authors: Kathryn Erskine (Cube Group), Michael Maher (Cube Group, AU)

The session is intended to provide a practical example of how a traditional program evaluation was re-orientated to allow the application of findings to inform broader system transformation. This echoes current discourse in evaluation - particularly since the global pandemic - whereby traditional notions about how the field of evaluation is viewed and developed are being challenged (Ofir, 2021). Specifically, there are calls to rethink and elevate evaluation practice to actively contribute to and support systems transformation (see Dart, 2023; Norman, 2021), beyond a narrow programmatic focus.

This session will illuminate this discussion by examining a mental health program evaluation in the context of significant service reform across the Victorian mental health system. The presentation will outline insights and techniques for lifting and reconfiguring a tightly defined program evaluation into one with broader application to the system ecosphere. It outlines how and why the pivot was made, the changes we made to the methodology, and the key benefits that arose from taking an expansive view of the sector in which the program operated.

The design of the session will be a presentation format supported by a PowerPoint slide deck, comprising:
•    Introduction and purpose of session
•    Overview of the program we evaluated
•    Key challenges which required an evaluation 'pivot' - and how we worked with our client
•    Key changes made to the methodology
•    Key benefits from elevating from a programmatic to systems focus.


Chair
Nick Field
Director (Public Sector), Urbis
Nick has twenty years of public sector consulting experience, backed more recently by six years as a Chief Operating Officer in the Victorian Public Sector. A specialist generalist in a broad range of professional advisory services, Nick has expertise in the implementation of state-wide...
Speakers
Kathryn Erskine
Director, Cube Group
Combining academic rigour with a practical ‘can-do’ approach, Kathryn is committed to delivering evidence-based change that improves the lives of Australians. Kathryn brings a depth and breadth of experience in the public, private and not-for-profit sectors, specialising in program...
Michael Maher
Partner & Evaluation Lead, Cube Group
Leading Cube Group’s Evaluation and Review practice, Michael brings over 30 years of experience in the public, private and not-for-profit sectors. Michael’s work spans all areas of social policy with particular expertise in early childhood, education, justice, human services...
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

When speed is of the essence: How to make sure the rubber hits the road
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
103
Authors: Kristy Hornby (Grosvenor)

There is a lot of interest in rapid M&E planning and rapid evaluations at present, born partly of the rapid-response demands of the COVID-19 policy context and partly of shrinking appetites for the time and money spent on evaluations. This trend is unlikely to reverse in the short term, so what do we do to acquit our responsibilities as evaluators, ethically and appropriately, in a rapid context? This session sets out a step-by-step approach to conducting a rapid evaluation, inviting attendees to follow along with their own program in mind and come away from the session with a pathway for conducting their own rapid evaluation. The session uses a fictional case study as the construct to move the rapid evaluation approach forward, describing throughout how you can use literature reviews, qualitative and quantitative data collection and analysis techniques, and report writing approaches innovatively to save time without compromising rigour.

We contend it is possible to do a rapid evaluation ethically and appropriately, but the backbone of doing so is good planning and execution. This session shares practical tips and approaches for doing so through each key phase of an evaluation, so attendees are well-equipped for their next rapid evaluation.

To consolidate the learning, attendees will be provided with a framework so they can come away from the session with a high-level plan for conducting their own rapid evaluation, increasing their chance of success.

Speakers
Kristy Hornby
Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and...
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Wayfinding for integrating social justice and culturally responsive and equitable evaluation practices in meta-evaluation: Learning from the UN evaluation quality assessments.
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
Authors: Sandra Ayoo (University of North Carolina Greensboro)

With interventions that improve societal and planetary wellbeing as the desired destination of evaluation, it is imperative that evaluators reflect on the meaning of quality and on methods to assess whether evaluation is achieving it. Meta-evaluation, coined by Michael Scriven in 1969, evaluates evaluations and aids in understanding how evaluations contribute to addressing structural and systemic problems in interventions and evaluation practice. Meta-evaluation has evolved over the past five decades and is included in the program standards of major professional associations. While the field of evaluation is confronted with major concerns regarding the centrality of social justice, there are currently no one-size-fits-all guidelines for meta-evaluation or for addressing social justice in evaluations.

To address this, we reviewed the meta-evaluation literature, mapped the American Evaluation Association's foundational documents against the United Nations Evaluation Group's Norms and Standards to explore their intersections on social justice, and analyzed 62 United Nations Population Fund evaluation reports alongside their management responses. The study findings indicated that meta-evaluation is contingent on context rather than established standards. Thus, it is crucial for evaluators to intentionally prioritize social justice in evaluation design and implementation, and to select quality assurance tools that match the evaluation context and professional association guidelines to improve the quality of the intervention.

I will share key characteristics of the United Nations Evaluation Group's Norms and Standards on social justice to stimulate discussions on evaluators' efforts to address systemic issues. Collectively, participants will benefit from discussing and reflecting on their own practice by responding to questions like: (a) what are examples of their work in collaborative and systems-informed ways to intentionally include social justice in their evaluations, and (b) what should the field of evaluation do to ensure that evaluations add value for people and planet?
Chair
Carlos Rodriguez
Senior Manager Strategy & Evaluation, Department of Energy, Environment and Climate Action
Speakers
Sandra Ayoo
Assistant Professor, University of North Carolina Greensboro
Dr. Ayoo is an Assistant Professor of Educational Research Methodology in the Department of Information, Library, and Research Science at the School of Education, University of North Carolina Greensboro. She teaches graduate courses in program evaluation and research methodology...
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Developing a Tool for Measuring Evaluation Maturity at a Federal Agency
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
105
Authors: Eleanor Kerdo (Attorney-General's Department), Claudia Oke (Attorney-General's Department, AU), Michael Amon (Attorney-General's Department, AU), Anthony Alindogan (Attorney-General's Department, AU)

To embed a culture of evaluation across the Australian Public Service (Commonwealth of Australia, 2021), we must first have an accurate understanding of the current state of evaluation capability and priorities across Commonwealth agencies. This paper shares tools on how to build an effective measurement framework for evaluation culture, and discusses how to use these for evaluation capability uplift.
We explore quantitative and qualitative methods to gather and analyse data to measure an organisation's readiness to change its culture towards evaluation. This includes assessing staff attitudes towards evaluation, the level of opportunity for staff to conduct and use evaluation, and confidence in their knowledge of evaluation.
We discuss the development of a staff evaluation culture survey based on Preskill & Boyle's ROLE and how behavioural insight tools can be utilised to boost engagement. The paper discusses the utility of holding focus groups with senior leaders to understand authorising environments for evaluation and key leverage points. Also discussed are challenges and innovative solutions encountered throughout the assessment process.
This paper will be valuable for those who work in, or with, any government agency with an interest in evaluation capacity building and driving an evaluation culture within organisations. It explains each stage of measurement design, data analysis and results, and discusses opportunities for action.
1. Preskill, H., & Boyle, S. (2008). A Multidisciplinary Model of Evaluation Capacity Building. American Journal of Evaluation, 29(4), 443-459. https://journals.sagepub.com/doi/10.1177/1098214008324182

2. Michie, S., Atkins, L., & West, R. (2014). The Behaviour Change Wheel: A Guide to Designing Interventions. London: Silverback Publishing. www.behaviourchangewheel.com

3. Lahey, R. (2009). A Framework for Developing an Effective Monitoring and Evaluation System in the Public Sector: Key Considerations from International Experience.
Chair
Marwan El Hassan
Director, Future Drought Fund Program Evaluation and Support, Department of Agriculture, Fisheries and Forestry
I am the director of the Program Evaluation and Support team at the Future Drought Fund (FDF). My team is responsible for supporting the FDF's program areas in their monitoring, evaluation and learning work, and for ensuring alignment of our MEL work with other areas around the department...
Speakers
Michael Amon
Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Anthony Alindogan
Evaluation Lead, Attorney-General's Department
Anthony is an experienced evaluator with a particular interest in outcomes measurement and value-for-money. He completed his Master of Evaluation degree from the University of Melbourne. Anthony is an enthusiastic writer and has publications in various journals including the Evaluation...
Claudia Oke
Project Officer / Data Analyst, Australian Public Service Commission
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Uncovering Hidden STEM Footprints: Leveraging Output Data from Questacon’s Outreach Programs
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
104
Authors: Jake Clark (Questacon - The National Science and Technology Centre), Jenny Booth (Questacon - The National Science and Technology Centre, AU), Sharon Smith (Questacon - The National Science and Technology Centre, AU), Nick Phillis (Questacon - The National Science and Technology Centre, AU)

Join our Monitoring and Evaluation team on an exciting journey as we delve into the untapped potential of output data in evaluation and how to reach beyond the 'low-hanging fruit'.

Notwithstanding the importance of evaluating outcomes to measure program success, monitoring the implementation and reach of initiatives is fundamental to good program management and evaluation. Output data on activity reach, target groups and participants often hold hidden gems of potential that are frequently overlooked. In this presentation we shine a spotlight on their significance and share actionable tips to elevate monitoring data.

Our objective is to make this exploration enjoyable and enlightening, especially for foundational to intermediate level evaluators. We offer practical and universally applicable strategies for making the most of output data to enhance program insights.

KEY MESSAGES

Using existing tools and tapping into open-source data sets, you can create powerful visualisations and draw deeper inferences about your program's reach and participants (see the illustrative sketch after the lists below).

I. Understanding equity and inclusion
• A better understanding of who is and isn't involved in your initiative.
• Looking for patterns using socio-demographic variables.
• Benchmarking your initiative against relevant population data.

II. Connecting outputs to outcomes
• Analysing participant characteristics and program journeys to illuminate differences in outcomes.
• Uncovering program and policy questions that need further exploration.
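
A purely illustrative sketch of the benchmarking idea above (our example, not a tool from the presentation; the file names, column names and pandas approach are all assumptions): join participant counts from program output data to an open population data set to see reach per 1,000 residents and surface under-reached regions.

import pandas as pd

# Hypothetical inputs: participants.csv holds one row per participant with a
# 'region' column (program output data); population.csv holds 'region' and
# 'population' columns, e.g. from published regional population estimates.
participants = pd.read_csv("participants.csv")
population = pd.read_csv("population.csv")

# Count participants per region, then join the counts to the benchmark.
reach = participants.groupby("region").size().rename("participants")
merged = population.set_index("region").join(reach).fillna(0)

# Participation per 1,000 residents; sorting surfaces under-served regions.
merged["per_1000"] = merged["participants"] / merged["population"] * 1000
print(merged.sort_values("per_1000"))

The same pattern extends to any socio-demographic variable captured in output data, which is the benchmarking point in the first list.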

Design of the session: Drawing on tangible examples from the education and informal STEM learning sector, we bridge the gap between theory and practice. Real-world strategies are shared to encourage active participation, along with useful resource links.
Speakers
Jake Clark
Senior Monitoring and Evaluation Officer, Questacon - National Science and Technology Centre
What value does STEM outreach bring to an individual? How does it change someone's attitude/behaviour/disposition around STEM? And how do you quantify such probing questions? These are the types of queries I'm answering in my Senior Evaluation Officer role at Australia's National...
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

5:00pm AEST

Trigger warnings - do they just trigger people more?
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
104
Authors: Kizzy Gandy (Verian, formerly Kantar Public)

As evaluators, one of our key ethical responsibilities is not to cause psychological harm or distress through our methods. We often start workshops or interviews with a warning that the topic may be upsetting, and provide participants with contact information for mental health services, under the assumption that this is the most ethical practice.

Trigger warnings are used with good intentions and are often recommended in evaluation ethics guidelines. However, what do we know about their impact? Is there a risk they actually trigger people more?

This talk examines the evidence on whether trigger warnings are an effective strategy for reducing the risk of trauma and re-traumatisation when discussing topics such as sexual assault, mental health, violence, drug use, and other sensitive issues. It also touches on new evidence from neuroscience about how emotions are understood differently now than in the past.

This session will not provide a definitive answer on when or how to use trigger warnings but aims to challenge the audience to think critically about whether trigger warnings are useful in their own work.
Speakers
Kizzy Gandy
National Director, Program Evaluation, Verian
Dr Kizzy Gandy is Verian's National Director of Program Evaluation. She leads a team of expert methodologists and provides quality assurance. With 20 years’ experience in consultancy, federal and state government, and academia, Kizzy has overseen the design and evaluation of over...
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

5:00pm AEST

Development and implementation of a culturally grounded evaluation framework: Learnings from an Aboriginal and Torres Strait Islander Peak.
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
Authors: Candice Butler (Queensland Aboriginal and Torres Strait Islander Child Protection Peak), Michelle McIntyre (Queensland Aboriginal and Torres Strait Islander Child Protection Peak, AU), John Prince (JKP Consulting, AU)

There is increasing recognition that evaluations of Aboriginal and Torres Strait Islander programs must be culturally safe and appropriate, and represent the worldviews, priorities, and perspectives of Aboriginal and Torres Strait Islander communities. Aboriginal and Torres Strait Islander peoples have the cultural knowledge and cultural authority to design appropriate evaluations that are safe, and that tell the true story of the impacts of our ways of working.

As a peak body for Aboriginal and Torres Strait Islander community-controlled organisations we wanted to ensure that the worldviews and perspectives of our members and communities are embedded in any evaluations of services delivered by our member organisations. This is a necessary step towards building an evidence base for our ways of working, developed by and for Aboriginal and Torres Strait Islander people. To that end we developed an evaluation framework to enable self-determination and data sovereignty in evaluation, and to build capacity among our member organisations to undertake and/or commission culturally grounded evaluations. Culturally grounded evaluations are led by Aboriginal and Torres Strait Islander people and guided by our worldviews and knowledge systems - our ways of knowing, being and doing.

This paper reports on the development and implementation process used in the project and describes the standards and principles which underpin the framework. An example of how the framework is being applied in practice is also outlined. Our principles for evaluation describe the core values which underpin culturally grounded and safe evaluation including self-determination; cultural authority; truth-telling; two-way learning; and holistic approaches. The evaluation standards and associated elements operationalise our principles and embed them in evaluative practice.
Chair
Carlos Rodriguez
Senior Manager Strategy & Evaluation, Department of Energy, Environment and Climate Action
Speakers
Candice Butler
Executive Director, Centre of Excellence, Queensland Aboriginal and Torres Strait Islander Child Protection Peak
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

5:00pm AEST

Failing your way to better practice: How to tread carefully when things aren't going as planned
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
105
Authors: Stephanie White (Victoria Department of Education)

Evaluators can fail in many ways. The consequences of these failures can be relatively contained or wide ranging within the evaluation and can also flow on to program operations. But failure is a part of life and can be a useful catalyst for professional growth. What happens when you find yourself failing and can see the risks ahead? How do you keep going?

The session focuses on the experiences of an emerging evaluator who failed while leading a large-scale education evaluation. When some elements of the evaluation became untenable, they struggled to find the right path forward and could foresee the risks materialising if the situation wasn’t addressed. On the other side of it, they reflect on how they drew on tools in every evaluator’s toolkit to start remedying their previous inaction and missteps to get the evaluation back on track…and improve their practice along the way!

This session is relevant to any evaluator who grapples with the messiness of expectations and reality in their practice.


Chair
Marwan El Hassan
Director, Future Drought Fund Program Evaluation and Support, Department of Agriculture, Fisheries and Forestry
I am the director of the Program Evaluation and Support team at the Future Drought Fund (FDF). My team is responsible for supporting the FDF's program areas in their monitoring, evaluation and learning work, and for ensuring alignment of our MEL work with other areas around the department...
Speakers
Stephanie White
Victoria Department of Education
I found my way to evaluation to help me answer questions about education program quality and success. Professionally, I have diverse experiences in education and evaluation, from delivering playgroups under trees in the NT to reports on educator resources to senior education bureaucrats...
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Thursday, September 19
 

10:30am AEST

Evaluating capacity building for sustainability scientists: Pathways for early career researchers
Thursday September 19, 2024 10:30am - 11:00am AEST
104
Authors: Lisa Walker (CSIRO)

In the quest for sustainable solutions to pressing global challenges, transdisciplinary approaches that integrate insights and methods from various fields are increasingly being recognised as key to driving change. This presentation will share insights from an evaluation of a five-year, $18 million sustainability science research program seeking not only to drive innovation but also to empower the next cohort of early career researchers (ECRs) to coordinate action across research, policy and practice to address complex sustainability problems.
Our formative, mixed-methods evaluation highlighted on-the-job learning, face-to-face engagement and networking as pivotal in building sustainability science capacity. We also found targeted recruitment, research team engagement and the provision of support and resources to supervisors to be essential, and sometimes overlooked, components. This work contributes to the broader discussion on how evaluation can enhance the development of sustainability science, proposing a framework that emphasises the individual, team and institutional support mechanisms necessary for effective ECR capacity building.
Novelty in our approach lies in the integration of evaluative practices within the capacity-building process, offering a reflective lens on how transdisciplinary endeavours can be optimised to address sustainability challenges. This is particularly relevant for evaluators wanting to build their own skills, or those of others, to engage on complex sustainability issues. The study also underscores the significance of adaptive learning and evaluation in navigating the complexities of sustainability science, inviting a broader conversation on how evaluation can be leveraged to facilitate meaningful contributions to societal and planetary well-being.
Chair
Su-Ann Drew
Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for...
Speakers
Lisa Walker
CSIRO
I am a social scientist with a background in program monitoring, evaluation and sustainable development. I am currently working with CSIRO's Valuing Sustainability Future Science Platform (VS FSP) and manage the Monitoring, Evaluation, Learning and Research project within the VS...
Thursday September 19, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

National impact, regional delivery - Robust M&E for best practice Australian horticulture industry development.
Thursday September 19, 2024 10:30am - 11:00am AEST
Authors: Ossie Lang (RMCG), Donna Lucas (RMCG), Carl Larsen (RMCG), Zarmeen Hassan (AUSVEG), Cherry Emerick (AUSVEG), Olive Hood (Hort Innovation)

How do you align ten regionally delivered projects with differing focus topics to nationally consistent outcomes? Take advantage of this opportunity to explore the journey of building and implementing a robust Monitoring and Evaluation (M&E) program that showcases regional nuances and aligns national outcomes, making a significant contribution to the success of this horticultural industry extension project.

Join us for an insightful presentation on how a national vegetable extension project, focused on on-farm adoption of best management practices, has successfully implemented a dynamic M&E program. Over the two and a half years of project delivery, the national M&E manager, in collaboration with ten regional partners, has crafted a program that demonstrates regional impact consistently on a national scale and adapts to the project's evolving needs.

The presentation will highlight the team's key strategies, including the upskilling of Regional Development Officers in M&E practices. Learn how templates and tools were designed to ensure consistent data collection across approximately 40 topics. The team will share the frameworks utilised to capture quantitative and qualitative monitoring data, providing a holistic view of tracking progress against national and regional outcomes and informing continuous improvement in regional delivery.

Flexibility has been a cornerstone of the M&E program, allowing it to respond to the changing needs of growers, industry, and the funding partner and seamlessly incorporate additional data points. Discover how this adaptability has enhanced the project's overall impact assessment and shaped its delivery strategy.

The presentation will not only delve into the national perspective but also feature a firsthand account from one of the Regional Development Officers. Gain insights into how the M&E program has supported their on-the-ground delivery, instilling confidence in providing data back to the national project manager. This unique perspective offers a real-world understanding of the national program's effectiveness at a regional level.
Speakers
Ossie Lang
Consultant-Regional Development Officer, RMCG
Thursday September 19, 2024 10:30am - 11:00am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Navigating organisational turbulence: An evaluation-based strategic learning model for organisational sustainability
Thursday September 19, 2024 10:30am - 11:00am AEST
103
Authors: Shefton Parker (Monash University), Amanda Sampson (Monash University)

Increasingly turbulent and rapidly changing global operating environments are disrupting institutions' plan implementation and strategy realisation. The session introduces a novel organisational collaborative strategic learning and effectiveness model, intended to bolster organisational resilience responses amidst such turbulence.
A scarcity of suitable organisational strategic learning systems-thinking models that utilise evaluation methodology in a joined-up way prompted the presenters to develop their own. The model is tailored for strategic implementation in a complex organisational system environment, operating across decentralised portfolios with multiple planning and operational layers. It amalgamates evaluation methodologies to identify, capture, share and respond to strategic learning in a complex system. It is hypothesised that the model will outperform conventional organisational performance-based reporting systems in terms of organisational responsiveness, agility, adaptability, collaboration, and strategic effectiveness.
The presentation highlights the potential value of integrating and embedding evaluation approaches into an organisation's strategy, governance and operations using a three-pronged approach:
- Sensing: Gathering relevant, useful, timely data (learning);
- Making sense: Analysing and contextualising learning data alongside other relevant data (institutional performance data, emerging trends, policy and legislative reform, etc.); and
- Good sense decisions: Providing timely and relevant evaluative intelligence and insights to support evidence-based decision making.
The presenters advocate for a shift from viewing evaluation use as a 'nice to have' to a 'must have' aspect of organisational growth and sustainability. The model aims to foster a leadership culture where decision makers value the insights that contextualised, holistic organisational intelligence can provide for:

i) Strategic planning: Enhanced planning and strategic alignment across portfolios;

ii) Operational efficiency: Reduced duplication in strategic effort and better collaboration towards strategic outcomes;

iii) Business resilience and sustainability: Improved identification and quicker response to emerging opportunities and challenges; and

iv) Strategic effectiveness: Informing activity adaptation recommendations for strategic goal realisation.
Chair
Michael Amon
Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Speakers
Shefton Parker
Senior Evidence & Evaluation Adviser, Monash University - Institutional Planning
Dr Shefton Parker is an evaluator and researcher with over 15 years of specialist experience in program and systems evaluation within the Vocational and Higher Education sectors. Recently, his evaluations of innovative education programs were referenced as evidence in the University...
Amanda Sampson
Senior Manager, Institutional Planning, Monash University
I am leading the development and implementation of an Institutional Evaluation Model for a complex organisation to support organisational resilience, strategic adaptation and execution to realise the 10-year organisational strategic objectives. I am interested in learning how to...
Thursday September 19, 2024 10:30am - 11:00am AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Learn, evolve, adapt: Evaluation of climate change and disaster risk reduction programs
Thursday September 19, 2024 11:00am - 11:30am AEST
104
Authors: Justine Smith (Nation Partners )

There is a pressing need to reduce the risks associated with climate change and the disasters that are likely to increase as a result. Along with the need to take action comes the need to show we are making a difference - or, perhaps more importantly, the need to learn and evolve to ensure we are making a difference. However, when operating in an ever-changing, uncertain environment, with layers of complexity and outcomes that may not be realised for some time, or until disaster strikes, evidence of impact is not always easy to collect, nor is it always a priority.

Drawing on experience developing evaluation frameworks and delivering evaluation projects in the areas of climate change and disaster and emergency management, I will present some of the challenges and opportunities I have observed. In doing so, I propose that there is no 'one way' to do things. Rather, taking the time to understand what we are evaluating and to continually learn, evolve and adjust how we evaluate is key. This includes having clarity on what we really mean when we are talking about reducing risk and increasing resilience. Ideas I will explore include:
  • The concepts of risk reduction and resilience.
  • The difference between evaluation for accountability and for genuine learning and improvement.
  • Balancing an understanding of and progress towards big picture outcomes with project level, time and funding bound outcomes.
  • The challenge and potential benefits of event-based evaluation to learn and improve.

Evaluation has the capacity to contribute positively to action taken to reduce climate change risks and improve our management of disasters and recovery from disasters. As evaluators we too need to be innovative and open-minded in our approaches, to learn from and with those working directly in this space for the benefit of all.
Chair
Su-Ann Drew
Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for...
Speakers
Justine Smith
Principal, Nation Partners
With a background spanning research, government, non-government organisations and consulting, Justine brings technical knowledge and over 10 years of experience to the projects she works on. As a highly experienced program evaluator and strategic thinker, Justine has applied her skills...
Thursday September 19, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Culturally inclusive evaluation with culturally and linguistically diverse communities in Australia
Thursday September 19, 2024 11:00am - 11:30am AEST
Authors: Lena Etuk (CIRCA Research, AU)

In this presentation we will outline an approach to culturally inclusive evaluation with people from culturally and linguistically diverse backgrounds in Australia, its strengths, and its growth opportunities. This approach fills a critical gap in the way evaluation and research with culturally and linguistically diverse communities is traditionally conducted in Australia.

In this presentation we will explain how the Cultural & Indigenous Research Centre Australia (CIRCA) conducts in-culture and in-language evaluation with diverse cohorts of Australians, and how this practice fits within the broader methodological discourse in evaluation and social science more broadly. We will illustrate how our culturally inclusive methodology is put into practice with findings from CIRCA's own internal research into the way cultural considerations shape our data collection process. We will conclude with reflections on how CIRCA might further draw on and leverage standpoint theory and culturally responsive evaluation as this practice is further refined.

Our key argument is that doing culturally inclusive evaluation is a process that requires reflexivity and learning, alongside strong and transparent institutional processes. Combining these approaches creates systemic ways of acknowledging and working within the stratified and unequal social systems inherent to any research. Our findings will advance knowledge within the field of evaluation about how to engage and represent culturally and linguistically diverse community members across Australia.
Speakers
Lena Etuk
Director, Research & Evaluation, Culturally Inclusive Research Centre Australia
I’m an applied Sociologist with 18+ years of experience in evaluation and social research. At CIRCA I lead an amazing team of research consultants from a huge range of diverse backgrounds. We specialise in qualitative evaluation and research with non-English speaking CALD and Aboriginal...
Thursday September 19, 2024 11:00am - 11:30am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Our journey so far: a story of evaluation to support community change in South Australia
Thursday September 19, 2024 11:00am - 11:30am AEST
103
Authors: Penny Baldock (Department of Human Services South Australia), Jessie Sleep (Far West Community Partnerships, AU)

The multi-jurisdictional South Australian Safety and Wellbeing Taskforce is the lead mechanism and accountable body for developing strategies and sustainable, place-based responses that ensure the safety and wellbeing of remote Aboriginal Visitors in Adelaide and other regional centres in the State.

This presentation discusses the challenges of establishing an evaluative learning strategy for the Taskforce that meets the needs of multiple government agencies and stakeholders, multiple regional and remote communities, and multiple nation groups.

In a complex system, this is a learning journey, requiring us to adapt together to seek new ways of understanding and working that truly honour the principles of data sovereignty, community self-determination, and shared decision-making.

As we begin to more truly centre communities as the locus of control, and consider the far-reaching reform that will be necessary to deliver on our commitments under Closing the Gap, this presentation provides an important reflection on the skills, knowledge and expertise that will be required to build evaluation systems and processes that support change.

One of the most exciting developments to date has been the establishment of a multi-agency data sharing agreement, which will enable government data to be shared with Far West Community Partnerships, a community change organisation based in Ceduna, and combined with their community owned data in order to drive and inform the Far West Change Agenda.

We present the story of our journey so far - our successes and our failures - and extend an invitation to be part of the ongoing conversation to support the change required for evaluation success.

Chair
Michael Amon
Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Speakers
Penny Baldock
Department of Human Services
Jessie Sleep
Chief Executive, Far West Community Partnerships
Jessie is an innovative thinker and strategist, emerging as a leader in her field, redefining the role of strategic implementation with monitoring and evaluation. With the fast-paced growth of the social impact lens in Australia, Jessie is part of the new generation of strategic leaders...
Thursday September 19, 2024 11:00am - 11:30am AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Bringing the "human" into measurement: From in-depth inquiry to systemic change
Thursday September 19, 2024 11:30am - 12:00pm AEST
104
Authors: Julia Suh (Tobias)

Humans are complex and diverse. To create social change, what do we need to understand about them?

Their behaviours and mindsets are key, but the broader context and systems they operate within paint a fuller picture of the multiple moving parts that need to change simultaneously for sustained impact. These changes can be mapped, with embedded evaluative thinking, building a pathway for formal evaluation.

In this session, experts in Human-Centred Design and social change share their innovative approaches to thinking beyond project- or program-level goals or organisational-level performance indicators. Examples are drawn from direct experiences working across various transformation projects, from reducing child sexual exploitation and preventing academic misconduct to improving the care economy and elevating patient outcomes. They demonstrate not only how program goals and a social change vision can be realised together, but also how a combination of strategic prioritisation, collaboration, capability building and networks can accelerate the process.
Chair
Su-Ann Drew
Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for...
Speakers
Julia Suh
Principal, Tobias
Jessica Leefe
Principal, Tobias
Thursday September 19, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

"Nothing about us, without us": Developing evaluation framework alongside victim-survivors of modern slavery using representative participatory approaches
Thursday September 19, 2024 11:30am - 12:00pm AEST
Authors: Ellie Taylor (The Salvation Army)

Amplifying survivor voices has been the cornerstone of The Salvation Army's work in the anti-slavery realm. How does this translate to the monitoring and evaluation space? How do we truly represent the voices and experiences of those with lived experience of modern slavery in monitoring and evaluation, whilst aligning with key human rights principles?

Our Research Team is exploring how to centre survivor voices in the evaluation space. This session will detail the use of a representative participatory evaluation approach to monitor and evaluate the Lived Experience Engagement Program (LEEP) for survivors of criminal labour exploitation. In this session we will explore the challenges and learnings uncovered through this project.

The LEEP is designed to empower survivors of criminal labour exploitation to share their expertise to make change. Piloted in 2022-2023, and continuing into 2024-2025, the LEEP - and the resulting Survivor Advisory Council - provides a forum for survivors to use their lived experience to consult with government, assisting in preventing, identifying and responding to modern slavery.

The key points explored in this session will include:
  • Realities of implementing an adaptive model, including continuous integration of evaluation findings into an iterative survivor engagement model.
  • The importance of stakeholder inclusivity, integrating lived experience voices and amplifying them alongside program facilitators and government representatives.
  • Complexities of evaluation in the modern slavery space, particularly when victim-survivors of forced marriage are included. We will speak to the need for trauma-informed, strengths-based measures and facilitating partnerships with the people the program serves.

Leading the session will be The Salvation Army's project lead, who has a PhD in mental health and over 12 years of experience working with diverse community groups in Australia and internationally. They have extensive experience presenting at conferences both domestically and internationally.
Speakers
Ellie Taylor
Senior Research Analyst, The Salvation Army
Ellie has a background in mental health and has spent 12+ years designing and conducting research and evaluation initiatives with diverse communities across Australia and internationally. In this time, she's worked with people from all walks of life, across the lifespan, from infants...
Thursday September 19, 2024 11:30am - 12:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Navigating complex government reforms: a tool to examine program theory. How complete and relevant is government program theory?
Thursday September 19, 2024 11:30am - 12:00pm AEST
105
Authors: Nerida Rixon

Developing program theory for complex government reforms and charting the 'how' we will reach our destination is not easy. Perhaps you, like me, rely on well-tested templates? Do we challenge them and continually refine them to reflect emerging research and make them more useful for our purposes?

This research is about public policy packages and reforms and the program theories (or reform theories) that explain them. This research emerges from a desire to build program theory better, particularly in the context of whole of government reforms. Better program theory can drive better planning, monitoring and evaluation of performance, and better policy and public good.

Evidence shows Australian governments are not effectively planning, monitoring and evaluating performance of programs and policy packages. Theory can support development of meaningful performance indicators to track progress. Without strong program theory and clear strategy, as the Productivity Commission's recent 'Review of the National Agreement on Closing the Gap' study report suggests, we risk a 'spray and pray' approach to change, prioritisation of the wrong things and siloed policy responses.

A literature-informed checklist for analysing program theory for completeness and relevance to public administration is provided. Policy makers and evaluators are given a tool and lens to build more complete and relevant program theory and to improve existing program theory.

Analysis of program theory in 15 government reform strategies and outcomes frameworks is presented to show governments' strengths and opportunities. Governments are strong at identifying our destination, or the intended outcomes, though not always in identifying the 'how' we will get there. Governments could improve their program theory by making it more explicit and more complete by articulating 'the when' we expect to see changes from implementing the reforms. Government program theory might be more relevant if potential (non-intended) outcomes are referenced.
Chair
Charlie Tulloch
Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Thursday September 19, 2024 11:30am - 12:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Systems evaluation to the rescue!: How do we use systems evaluation to improve societal and planetary wellbeing?
Thursday September 19, 2024 12:00pm - 12:30pm AEST
104
Authors: Kristy Hornby (Grosvenor), Tenille Moselen (First Person Consulting)

Systems evaluation - many might have heard the term, but few have done one. This session shares two case studies of different systems evaluations and the learnings from these to benefit other evaluators who are conducting or about to begin a systems evaluation.

The session will open with an overview and explanation of what systems evaluation is, in terms of its key features and how it is distinguished from other forms of evaluation. The presenters will then talk through their case studies, one of which centres on the disability justice system in the ACT, while the other takes a sector-wide focus across the whole of Victoria. The co-presenters will share openly and honestly their initial plans for commencing the systems evaluations, how they had to amend those plans in response to real-world conditions, and the tips and tricks and innovations they picked up along the way.
Chair
Su-Ann Drew
Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for...
Speakers
Kristy Hornby
Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and...
Tenille Moselen
First Person Consulting
https://www.fpconsulting.com.au/our-team.html
Tenille has qualifications in public health, with experience in mental health and wellbeing, alcohol and drug and international development. Her passion is creating change through design and bringing stakeholders together to address complex...
Thursday September 19, 2024 12:00pm - 12:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Optimising Evaluations of Wellbeing Programs in Schools
Thursday September 19, 2024 12:00pm - 12:30pm AEST
105
Authors: Tamara Van Der Zant (Australian Council for Educational Research), Katherine Dix (Australian Council for Educational Research, AU)

In this presentation we will discuss the diverse and flexible data collection methods suitable for program evaluation in the context of schools. We will discuss the types of evidence that can be used to evaluate social and emotional learning programs and wellbeing initiatives, specifically, and considerations when working with educators, children and young people. We will invite all to participate in discussions about challenges to the evaluation of these programs in complex, real-world contexts (including data quality, confounding factors, system requirements, etc.) and propose methods we use to navigate these challenges.

Optimising program evaluation methods is important because of the ever-growing number of wellbeing programs being offered to schools. Accordingly, the need for high quality evaluation to guide funding decisions and use of programs and initiatives to support student and educator wellbeing in schools has never been greater.

By drawing on comprehensive experience in undertaking wellbeing program evaluations, this presentation will share our lessons learnt and recommendations to support evaluators in crafting contextually appropriate evaluations. High quality program evaluations, often a requirement for ongoing funding, address the growing need for meaningful and accessible evidence that is currently being sought by schools, educators, funders, and policy decision makers.
Chair
Charlie Tulloch
Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Speakers
Tamara Van Der Zant
Research Fellow, Australian Council for Educational Research
Tamara is a Research Fellow in the Educational Monitoring and Research Division at ACER. Prior to this role she completed her PhD in emotion research at the University of Queensland. She brings experience in research design, conducting research with diverse populations, broad data...
Dr Katherine Dix
Principal Research Fellow, School and System Improvement, Australian Council for Educational Research
Dr Katherine Dix is a Principal Research Fellow at ACER, with over 20 years as a program evaluator, educational researcher and Project Director. Dr Dix is the National Project Manager for Australia’s participation in OECD TALIS 2024, and is a leading expert in wellbeing and whole-school...
Thursday September 19, 2024 12:00pm - 12:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Getting to the value add: Timely insights from a realist developmental evaluation
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Authors: Phillip Belling (NSW Department of Education), Liam Downing (NSW Department of Education, AU)

This paper is aimed at early career and experienced evaluators interested in realist evaluation, but with concerns about the time a realist approach might take. The authors respond to this concern with an innovative blending of realist and developmental evaluation. Participants will exit the room with a working understanding of realist developmental evaluation, including its potential for adaptive rigour that meets the needs of policy makers and implementers.

Realist evaluation is theoretically and methodologically robust, delivering crucial insights about how, for whom and why interventions do and don't work (House, 1991; Pawson & Tilley, 1997; Pawson, 2006). It aims to help navigate unfamiliar territory towards our destination by bringing assumptions about how and why change happens out in the open.

But even realism's most enthusiastic practitioners admit it takes time to surface and test program theory (Marchal et al., 2012; van Belle, Westhorp & Marchal, 2021). And evaluation commissioners and other stakeholders have understandable concerns about the timeliness of obtaining actionable findings (Blamey & Mackenzie, 2007; Pedersen & Rieper, 2008).

Developmental evaluation (Patton, 1994, 2011, 2021; Patton, McKegg, & Wehipeihana, 2015) is more about what happens along the way. It appeals because it provides a set of principles for wayfinding in situations of complexity and innovation. Realist and developmental approaches do differ, but do they share some waypoints to reliably unpack perplexing problems of practice?

This paper documents a journey towards coherence and rigour in an evaluation where developmental and realist approaches complement each other, and deliver an evidence base for program or policy decision-making that is not only robust but also timely.

We show that, in complex environments, with programs involving change and social innovation, realist developmental evaluation can meet the needs of an often-varied cast of stakeholders, and can do so at pace, at scale, and economically.
Chair
Vanessa Hood
Associate Director, Rooftop Social
I've been working as a facilitator and evaluator for over 20 years, in a wide range of contexts, including horticulture, sustainability and financial literacy. Duncan Rintoul and I run Rooftop Social, which provides consulting services in evaluation, social research, facilitation...
Speakers
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of...
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Navigating the unfamiliar: Evaluation and sustainable finance
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Authors: Donna Loveridge (Independent Consultant), Ed Hedley (Itad Ltd UK, GB)

The nature and magnitude of global challenges, such as climate change, poverty and inequality, biodiversity loss, food insecurity and so on, mean that $4 trillion is needed annually to achieve the Sustainable Development Goals by 2030. Government and philanthropic funding is not enough; additional tools include business and sustainable finance. Evaluators may relate to many of the objectives that business and sustainable finance seek to contribute to, but discomfort can arise in the mixing of profit, financial returns, impact and purpose.

Sustainable finance, impact investing, and business for good are growing globally, providing opportunities and challenges for evaluators, evaluation practice and the profession.
This session explores this new landscape and examines:
  • What makes us uncomfortable about the dual objectives of purpose and profit, notions of finance and public good, and unfamiliar stakeholders and languages - and what evaluators can do in response.
  • The opportunities for evaluators to contribute to solving interesting and complex problems with current tools and skills, and where the space is for developing evaluation theory and practice.
  • How evaluation practice and evaluators' competencies might expand and deepen so as not to get left behind in these new fields, while sustaining evaluation's relevance to addressing complex challenges.

The session draws on experience in Australia and internationally to share some practical navigation maps, tools and tips to help evaluators traverse issues of values and value, working with investors and businesses, and identify opportunities to add value.
Thursday September 19, 2024 12:00pm - 12:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Man vs. Machine: Reflections on machine-assisted and human-driven approaches used to examine open-text progress reports.
Thursday September 19, 2024 1:30pm - 2:00pm AEST
Authors: Stephanie Quail (ARTD Consultants), Kathleen De Rooy (ARTD Consultants, AU)

Progress reports and case notes contain rich information about program participants' experiences and frequently describe theoretically important risk and protective factors that are not typically recorded in administrative datasets. However, the unstructured narrative nature of these types of data - and, often, the sheer volume of it - is a barrier to human-driven qualitative analysis. Often, the data cannot be included in evaluations because it is too time- and resource-intensive to do so.

This paper will describe three approaches to the qualitative analysis of progress reports used to examine within-program trajectories for participants, and the factors important for program success as part of an evaluation of the Queensland Drug and Alcohol Court.

It will explore how we navigated the balance between human and machine-driven qualitative analysis. We will reflect on the benefits and challenges of text-mining - how humans and machines stack up against each other when identifying the sentiment and emotion in text, the strengths and challenges of each approach, the lessons we have learned, and considerations for using these types of approaches to analyse datasets of progress reports in future evaluations.
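By way of illustration only - this is not the authors' actual pipeline - a machine-driven sentiment pass of the kind compared in this paper can be sketched with NLTK's off-the-shelf VADER analyser. The report excerpts, human codes and thresholds below are invented for the example.

```python
# Minimal sketch: machine-scored sentiment on progress-report text, compared
# against human coding. Illustrative only; data and codes are invented.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download

# Hypothetical excerpts from de-identified progress reports
reports = [
    "Participant attended all sessions this week and reports feeling hopeful.",
    "Missed two appointments; housing situation has become unstable.",
]
human_codes = ["positive", "negative"]  # codes assigned by a human analyst

analyzer = SentimentIntensityAnalyzer()
for text, human in zip(reports, human_codes):
    compound = analyzer.polarity_scores(text)["compound"]  # ranges -1..1
    machine = ("positive" if compound >= 0.05
               else "negative" if compound <= -0.05 else "neutral")
    flag = "agree" if machine == human else "disagree"
    print(f"{flag}: machine={machine} (score={compound:+.2f}), human={human}")
```

Agreement rates from a comparison like this are one simple way to quantify how humans and machines "stack up" before deciding how much weight machine coding can carry in an evaluation.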
Chair
Emily Saurman
Delegate, University of Sydney - School of Rural Health
Speakers
Stephanie Quail
Manager, ARTD Consultants
Thursday September 19, 2024 1:30pm - 2:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Harnessing AI for Qualitative Data Analysis
Thursday September 19, 2024 2:00pm - 2:30pm AEST
Authors: Ethel Karskens (Clear Horizon)

This presentation covers the strategic integration of Artificial Intelligence (AI) methodologies for qualitative data analysis in evaluation processes. The increasing demand for sophisticated analytical tools necessitates a deep dive into AI's transformative potential in streamlining qualitative analysis. Through practical demonstrations and case studies, this session showcases how AI technologies can efficiently tackle the complexities of analysing qualitative data. Attendees will acquire actionable insights into leveraging AI to augment the efficiency and accuracy of qualitative analysis, empowering them to navigate the evolving landscape of evaluation methodologies.

Additionally, the presentation conducts a comprehensive comparative analysis of major AI models available in the market. By delineating their unique strengths and functionalities, participants will gain invaluable discernment in selecting the most appropriate AI model tailored to their evaluation objectives.

Moreover, the session delves into robust quality assurance (QA) strategies for validating AI-generated outputs, emphasising the essential role of evaluators as integral stakeholders in the analysis process. Attendees will explore techniques for seamlessly integrating human expertise with AI capabilities to refine and validate analysis outcomes. We will also explore how to do this in a way that respects common data privacy laws and policies. By fostering a symbiotic relationship between AI technologies and human evaluators, this presentation underscores the importance of collaborative synergy in optimising evaluation efficacy.
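As a rough sketch of the human-in-the-loop pattern described here - assuming an OpenAI-compatible client, with the model name, codebook and excerpt as placeholders rather than Clear Horizon's method - the validation step might look like this:

```python
# Minimal sketch: LLM-assisted thematic coding with a human validation step.
# Assumes the openai client and an API key; themes and prompts are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
THEMES = ["access barriers", "staff support", "program outcomes"]

def suggest_theme(excerpt: str) -> str:
    """Ask the model to assign exactly one theme from a fixed codebook."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": f"Code the excerpt with exactly one of: {', '.join(THEMES)}."},
            {"role": "user", "content": excerpt},
        ],
    )
    return response.choices[0].message.content.strip()

excerpt = "The service was great but I couldn't get transport to the sessions."
machine_code = suggest_theme(excerpt)
human_code = "access barriers"  # the evaluator's own judgement, recorded separately
print("flag for review" if machine_code != human_code else "codes agree")
```

Constraining the model to a fixed codebook and routing disagreements to a human reviewer is one simple way to keep the evaluator, not the model, as the final arbiter of the analysis.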

In conclusion, this presentation offers a comprehensive exploration of the transformative potential of AI in qualitative data analysis within evaluation contexts. Attendees will depart equipped with actionable strategies and insights to harness AI's power effectively, elevating the quality and efficiency of their evaluation processes to new heights.
Chair
Emily Saurman
Delegate, University of Sydney - School of Rural Health
Speakers
Ethel Karskens
Data and Insights Lead, Clear Horizon
I lead the data and insights strategy of Clear Horizon. This includes dashboard development and other data solutions to create insights for our clients. I am interested in innovation, data for good, and creating a data-driven culture in organisations.
Thursday September 19, 2024 2:00pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Monitoring and Evaluation Journeys: Making footprints, community-based enterprise in Australian First Nations contexts
Thursday September 19, 2024 2:30pm - 3:00pm AEST
104
Authors: Donna-Maree Stephens (Community First Development ),Sharon Babyack (Community First Development, AU)

As First Nations' economies grow and develop, wayfinding of monitoring and evaluation frameworks that meaningfully address the holistic outcomes of First Nations' economic independence is a necessity. Culturally responsive monitoring and evaluation frameworks provide footprints for distinct ways of thinking about the holistic and significant contribution that First Nations' economies make to their communities and the broad Australian economic landscape.
Presenting findings from an organisation with more than 20 years of experience working alongside First Nations' communities and businesses grounded in collective and community focused outcomes, this presentation will highlight key learnings of monitoring and evaluation from First Nations' enterprises. It is an invitation to explore and rethink notions of success by drawing on experiences and Dreams (long-term goals) for community organisations, businesses and journeys towards positive outcomes alongside the role of one culturally responsive monitoring and evaluation approach. Our presentation will provide an overview of our work in the community economic development space and key learnings developed through our monitoring and evaluation yarns with First Nations' enterprises across a national First Nations' economic landscape that includes urban, regional and remote illustrations.
Chair
Kathleen Stacey
Managing Director, beyond…(Kathleen Stacey & Associates)
Kathleen Stacey is the Managing Director and Principal Consultant at beyond... She spent her formative working years within the public sector and academia, before establishing and expanding beyond... into its current form. The company conducts consultancy, evaluation, research and...
Speakers
Sharon Babyack
General Manager Impact & Strategy, Community First Development
My role at Community First Development involves oversight of research, evaluation, communications and effectiveness of the Community Development program. During my time with the organisation I have led teams to deliver major change processes and strategic priorities, have had carriage...
Thursday September 19, 2024 2:30pm - 3:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

A long road ahead: Evaluating long-term change in complex policy areas. A case study of school active travel programs in the ACT
Thursday September 19, 2024 2:30pm - 3:00pm AEST
106
Authors: Mallory Notting (First Person Consulting)

The ACT Government implemented a suite of programs over the ten year period between 2012 and 2022 aiming to increase the rates of students actively travelling to and from school. 102 schools in the ACT participated in at least one of the three programs during this time which targeted well-known barriers to active travel, including parental perceptions of safety and infrastructure around school. The programs were intended to contribute towards a range of broader priorities, including health, safety, and environmental outcomes.

This short-paper session will share learnings from evaluating long-term behaviour change at a population level, based on the school active travel evaluation. The evaluation represents a unique case study, as the evaluators needed to look retrospectively over ten years of program delivery and assess whether the combination of programs had created changes within the system and had resulted in the achievement of wider goals.

The presenter will illustrate that the line between short-term and long-term outcomes is rarely linear or clear, as is the relationship between individual interventions and whole of system change. This will be done by summarising the approach taken for the evaluation and sharing the diversity of information collated for analysis, which included individual program data and attitudinal and infrastructure-level data spanning the whole school environment.

Evaluators are often only able to examine the shorter term outcomes of an intervention, even in complex policy areas, and then rely on a theory of change to illustrate the assumed intended wider impacts. The presenter was able to scrutinise these wider impacts during the active travel evaluation, an opportunity not regularly afforded to evaluators. The lessons from the active travel evaluation are therefore pertinent for other evaluations in complex policy areas and may carry implications for program design as the focus shifts increasingly towards population-level, systems change.

Chair
Carolyn Wallace
Manager Research and Impact, VicHealth
Carolyn is an established leader in health and community services with over 22 years of experience across regional Victoria, Melbourne, and Ireland. She has held roles including CEO, executive director, policy officer, and researcher, specialising in community wellbeing and social...
Speakers
Mallory Notting
Principal Consultant, First Person Consulting
Mallory is a Principal Consultant at First Person Consulting. She manages and contributes to projects primarily in the area of cultural wellbeing, social inclusion, mental health, and public health and health promotion. In 2023, Mallory was the recipient of the Australian Evaluation...
Thursday September 19, 2024 2:30pm - 3:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Our new ways: Reforming our approach to impact measurement and learning
Thursday September 19, 2024 2:30pm - 3:00pm AEST
105
Authors: Kaitlyn Scannell (Minderoo Foundation), Adriaan Wolvaardt (Minderoo Foundation, AU), Nicola Johnstone (Minderoo Foundation, AU), Kirsty Kirkwood (Minderoo Foundation, AU)

We have been on a journey to bring awareness, evidence and understanding to the impact of our organisation since inception, and in earnest since 2016. For years, we felt the tension of trying to solve complex problems with measurement and learning approaches that are better suited to solving simple problems.

To change the world, we must first change ourselves. In early 2023 we had the extraordinary opportunity to completely reimagine our approach to impact measurement and learning. What we sought was an approach to measurement and learning that could thrive in complexity, rather than merely tolerate it, or worse, resist it.
We are not alone in our pursuit. Across government and the for-purpose sector, practitioners are exploring and discovering how to measure, learn, manage, and lead in complexity. Those who explore often discover that the first step they need to take is to encourage the repatterning of their own organisational system. A system which, in the words of Donella Meadows, "naturally resists its own transformation."

In this presentation we will delve into two themes that have emerged from our journey so far:
  • Transforming ourselves - We will explore what it takes to embed a systems-led approach to measurement, evaluation and learning in an organisation.
  • Sharing knowledge - We will discuss methods for generating, sharing, and storing knowledge about what works for measuring, evaluating, and learning in complexity.

The purpose of this session is to share what we have learnt with anyone who is grappling with how their organisation might measure and learn in complexity. We have been touched by the generosity of those who have accompanied us on our journey, sharing their experiences and wisdom. This presentation marks our initial effort to pay that generosity forward.
Chair
Janet Conte
Principal Evaluation Officer, DPIRD
I live in Perth (Boorloo) and have 3 children. I really enjoy being a co-convenor of the WA branch of the AES with Lisette Kaleveld. I'm interested in learning more about systems evaluation and building an evaluation culture.
Thursday September 19, 2024 2:30pm - 3:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

An update on practical applications of machine learning in evaluation
Thursday September 19, 2024 2:30pm - 3:00pm AEST
Authors: Gerard Atkinson (ARTD Consultants)

Last year saw the rise of large language models, with names like ChatGPT and Bard becoming part of common discussion. The evaluation community was not immune to this trend, and papers were published that looked at just how well machine learning approaches could do against human evaluators on tasks such as qualitative analysis and evaluative judgement. The answer? Not as well as you would think (but you could get wrong answers faster than ever!)

But the designers of these models took on the feedback and created newer and more sophisticated tools. In addition, there have been innovations in hybrid models which combine the best features of different methods while minimising their weaknesses. Coupled with this is the growing field of standalone models that can be run on a desktop computer yet produce responses that match or exceed cloud-based models, and models that can draw on rich contextual information (such as documentation or full interview transcripts) to make decisions.

This presentation provides an update on the state of machine learning in 2024 and presents new findings in relation to the performance of machine learning models on tasks including topic classification and rubric analysis.
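For readers wanting a concrete reference point, a classical supervised baseline of the sort these newer models are often benchmarked against can be sketched as follows; the texts, topics and pipeline choices are invented for illustration, not the models evaluated in the paper.

```python
# Minimal sketch: a classical topic-classification baseline (TF-IDF features
# plus logistic regression) against which LLM output can be benchmarked.
# Training texts and topic labels are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "Waiting lists for the clinic remain long",         # health
    "New bus routes improved school attendance",        # transport
    "Nurses reported better triage after training",     # health
    "Cycling paths were extended to the outer suburbs", # transport
]
topics = ["health", "transport", "health", "transport"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, topics)  # learn word-weight patterns per topic

print(model.predict(["The hospital upgrade reduced waiting times"]))  # -> ['health']
```

Scoring a held-out, human-labelled sample with both this kind of baseline and an LLM is one way to make "how well do the machines do?" an empirical question rather than a matter of impression.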


Chair
Emily Saurman
Delegate, University of Sydney - School of Rural Health
Speakers
Gerard Atkinson
Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in:
- program and policy evaluation
- workshop and community facilitation
- machine learning and AI
- market and social research
- financial and operational modelling
- non-profit, government and business strategy
I am also a board member...
Thursday September 19, 2024 2:30pm - 3:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Where next? Evaluation to transformation
Thursday September 19, 2024 2:30pm - 3:00pm AEST
103
Authors: Rachel Wilks (Grosvenor), Kristy Hornby (Grosvenor), Sarika Bhana (Grosvenor)

What is evaluation? Better Evaluation defines it as "any systematic process to judge merit, worth or significance by combining evidence and values". Many government organisations and some private and not-for-profit entities use evaluations as an auditing tool to measure how well their programs are delivering against intended outcomes and impacts and achieving value for money. This lends itself to viewing evaluation as an audit or 'tick-box' exercise when it is really measuring the delivery of an organisation's mandate or strategy (or part thereof). Viewing evaluation more as an audit than a core part of continuous improvement presents a risk of our reports collecting dust.

During this session, we will discuss factors that build a continuous improvement mindset across evaluation teams, as well as across the broader organisation. This will include exploring how to manage the balance between providing independent advice with practical solutions that program owners and other decision-makers can implement more readily, as well as how to obtain greater buy-in to evaluation practice. We present the features that evaluations should have to ensure findings and conclusions can be easily translated into clear actions for improvement.

We contend that it is important to consider evaluation within the broader organisational context, including where it links to strategy or how it may be used to provide evidence to support funding bids. This understanding will help to ensure evaluations are designed and delivered in a way that best supports the wider organisation.

We end by sharing our post-evaluation playbook - a practical tool to help take your evaluations from pesky paperweight to purposeful pathway.

Chair
Prescilla Perera
Principal Monitoring and Evaluation Officer, DFFH
Speakers
Rachel Wilks
Senior Consultant, Grosvenor
Rachel is a management consultant and an emerging evaluator at Grosvenor. She took her first steps into the evaluation world two years ago, and since then has been increasingly interested in how evaluation can be used in and across the public sector and not-for-profit space. Rachel...
Thursday September 19, 2024 2:30pm - 3:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Friday, September 20
 

10:30am AEST

Involving children and young people in evaluations: Equity through active participation
Friday September 20, 2024 10:30am - 11:00am AEST
Authors: Sharon Marra-Brown (ARTD Consultants), Moya Johansson (ARTD Consultants, AU)

Think it's important to enable children and young people to have a voice in evaluations, but find it challenging? This paper presents tried and tested strategies for ensuring ethical engagement with children and young people and encouraging meaningful participation.

Involving children and young people in evaluation is critical to ensure that we arrive at evaluations that accurately reflect their experiences and capture the outcomes they consider most important. Children and young people have the right to have a say about their experiences, and evaluations that avoid their involvement risk perpetuating ongoing inequities.

However, involving children and young people in evaluations can prompt ethical concerns in relation to their comprehension of research, capacity to provide consent, potential coercion by parents, and the potential conflicting values and interests between parents and children. Depending on the subject, it can also create concerns about safety and readiness.

Based on our experience successfully achieving ethics approval for multiple evaluations of services for children and young people across Australia, which include interviews with children and young people who have accessed these services, we will talk through considerations for ensuring the voice of children and young people in evaluation while safeguarding them from unnecessary risks.

We will then take you through how we've overcome challenges engaging children and young people in evaluations with innovative, youth-centred solutions, including carefully considering the language we use and how we reach out. We will demonstrate the developmental benefits of meaningful participation of children and young people once ethical issues have been carefully considered and navigated.

Finally, we will take you through our tips for ensuring meaningful and safe engagement with children and young people, and point you to guidelines and practice guides for involving young people in research and evaluation in a safe and meaningful way.

The presenters are evaluators with extensive experience in designing, delivering and reporting on evaluations that include data collection with children and young people. This includes recently achieving ethics approval and commencing interviews with children as young as seven who are accessing a suicide aftercare service.

While much attention is devoted to ensuring safe and inclusive data collection with various demographics, specific considerations for engaging children and young people remain relatively uncommon. Recognising the unique needs of this population, coupled with the understandably cautious stance of ethics committees, underscores the necessity for a thoughtful and deliberate approach to evaluations involving children and young people.

Given the additional complexities and ethical considerations involved, the default tendency can be to exclude children and young people from evaluation processes. However, it is important that children and young people are able to have a say in the programs, policies and services that they use. Participation in evaluations can be a positive experience, if risks are managed and the process is designed to be empowering.

This session will provide valuable insights, actionable strategies, and an opportunity for participants to reflect on their own practices, fostering a culture of inclusivity and responsiveness in evaluation.
Chair
Laura Bird
MERL Associate, Paul Ramsay Foundation
Speakers
Sharon Marra-Brown
Director, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence. Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade plus in the public service or how I weave...
Mitchell Rice-Brading
ARTD Consultants
I started with ARTD in early 2022 after completing my Bachelor of Psychological Science (Honours) in 2021. This, in combination with experience as a Psychology research assistant, helped me develop strong research skills, namely the ability to synthesise and critically evaluate qualitative...
Friday September 20, 2024 10:30am - 11:00am AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Following the (matched) data to understand impact: adventures in quasi-experimental evaluation
Friday September 20, 2024 10:30am - 11:00am AEST
Authors: Mohib Iqbal (Department of Education), Kira Duggan (Department of Education, AU), Ben McNally (Department of Education, AU)

This presentation will showcase the use of quasi-experimental impact evaluation and of a relatively new data linkage capability within the Victorian public sector.
Impact evaluation provides important evidence on program effectiveness and helps to inform government investment decisions. Quasi-experimental design identifies a comparison group that is similar to the treatment group (program participants) in terms of baseline or pre-intervention characteristics. Statistical methods such as propensity score matching and regression discontinuity can create valid comparison groups with a reduced risk of bias (White & Sabarwal, 2014).
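
As a purely illustrative aside (not drawn from the evaluations described in this abstract), the sketch below shows the basic mechanics of one-to-one nearest-neighbour propensity score matching on synthetic data, using Python with pandas and scikit-learn. All variable names, values and the simulated effect size are invented for the example.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    # Synthetic data: treatment uptake depends on a baseline characteristic,
    # so a naive treated-vs-untreated comparison would be biased.
    rng = np.random.default_rng(0)
    n = 1000
    df = pd.DataFrame({
        "baseline_score": rng.normal(50, 10, n),
        "attendance": rng.uniform(0.6, 1.0, n),
    })
    p_treat = 1 / (1 + np.exp(-(df["baseline_score"] - 50) / 10))
    df["treated"] = rng.binomial(1, p_treat)
    df["outcome"] = df["baseline_score"] + 5 * df["treated"] + rng.normal(0, 5, n)

    # 1. Estimate propensity scores from baseline covariates.
    X = df[["baseline_score", "attendance"]]
    df["pscore"] = LogisticRegression().fit(X, df["treated"]).predict_proba(X)[:, 1]

    # 2. Match each treated unit to its nearest untreated neighbour on the score.
    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched = control.iloc[idx.ravel()]

    # 3. Compare outcomes across the matched groups (effect on the treated).
    att = treated["outcome"].mean() - matched["outcome"].mean()
    print(f"Estimated effect on the treated: {att:.2f} (true simulated effect: 5)")

In practice the matching step would also involve checking covariate balance and common support, which this toy sketch omits.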

However, implementing this method faces significant technical, data-availability and other challenges.

The Evaluation and Program Impact (EPI) branch at the Victorian Department of Education (DE) used quasi-experimental assessment as part of six education program evaluations, spanning issues from teacher supply to support for vulnerable students. The approach was used both to evaluate impact and effectiveness and, in economic evaluations, to measure avoided costs. The presentation will outline the design, methodology and implementation of the quasi-experimental methods used in these six evaluations.

Key enablers of the use of quasi-experimental designs are data availability and expertise in undertaking advanced quantitative impact evaluations. This presentation will give an overview of the types of departmental data used (such as regularly administered student, parent/carer, teacher and school leader surveys, assessment results such as NAPLAN and administrative data) as well as the relatively new analytical capability available through linked service use data from the Victorian Social Investment Integrated Data Resource (VSIIDR) and Centre for Victorian Data Linkage (CVDL).
The presentation also contextualises quasi-experimental impact evaluation as one component of mixed-method approaches that were staged after evaluations of appropriateness, design and fidelity. Decisions on intervention effectiveness were made using a broader array of evidence, with quasi-experimental impact evaluation as one of many sources.
Chair
Allison Clarke
Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre...
Speakers
Kira Duggan
Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development...
Mohib Iqbal
Senior Evaluation Officer, Department of Education
I am a multi-disciplinary evaluator and researcher with 15 years of experience across education, health, international development, social protection, and migration sectors. I currently work for the Department of Education in Victoria and have previously worked with the World Bank...
Ben McNally
Manager, Evaluation and Research, Department of Education, Victoria
I have worked on evaluation and social research projects in consultancy and public sector settings. This has included evaluating reform programs in social services, employment, and school education. Talk to me about: evaluation practice in the Victorian Public Sector; in-house evaluation...
Friday September 20, 2024 10:30am - 11:00am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

To tinker, tailor, or craft from scratch? The tension in using validated tools in novel contexts
Friday September 20, 2024 10:30am - 11:00am AEST
104
Authors: Matt Healey (First Person Consulting), Alicia McCoy (First Person Consulting, AU), Tenille Moselen (First Person Consulting, AU)

In the dynamic realm of evaluation methodology, the discourse surrounding the use of validated tools versus the customisation of evaluation metrics remains contentious. This session delves into the risks inherent in both approaches. These risks are often compounded when those in positions of power prefer validated tools over context-specific data collection questions or approaches. The resulting tension is only increasing as evaluators assess digital interventions for which there is no direct tool to draw upon, leaving them to navigate uncharted territory.

Moreover, an ever-increasing range of validated tools is available, but there is little direction to assist evaluators - particularly emerging and early-career evaluators - in deciding among them. This session presents experiences from a range of digital and in-person projects and explores scenarios where there was no 'obvious solution'. It will be of particular relevance to those undertaking evaluations of digital and novel programs.

Through candid dialogue and shared anecdotes, participants will reflect on their experiences in navigating decisions to adopt, adapt, or reject validated tools, and the learning that resulted. Embracing controversy, this session encourages attendees to challenge conventional wisdom and critically examine the balance between the reliability of validated tools, the importance of fitting data collection to context, and most importantly what 'good' looks like.

Join the conversation as we navigate the complex landscape of evaluation methodology, exploring the tensions between established practices and the pursuit of innovation in evaluation processes.

Chair
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of...
Speakers
Matt Healey
Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have co-founded and grown an organisation (First Person Consulting) to a team of 16 people working...
Tenille Moselen
First Person Consulting
https://www.fpconsulting.com.au/our-team.html
Tenille has qualifications in public health, with experience in mental health and wellbeing, alcohol and drug and international development. Her passion is creating change through design and bringing stakeholders together to address complex...
Alicia McCoy
Principal Consultant, First Person Consulting
Alicia has 15 years of experience leading research and evaluation teams in the not-for-profit sector and is passionate about the role that research and evaluation plays in creating lasting change for individuals, families and communities. Alicia's areas of interest include evaluation...
Friday September 20, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Participatory Analysis Workshops: a novel method for identifying important factors across diverse projects
Friday September 20, 2024 11:00am - 11:30am AEST
104
Authors: Martina Donkers (Martina Donkers), Ellen Wong (ARTD, AU), Jade Maloney (ARTD, AU)

Some programs comprise a range of diverse projects striving towards a common goal - for example, grant programs, where grantees carry out projects with very different activities in pursuit of the program's objectives.

These can be a challenge to evaluate - with so many different activities, outputs cannot be easily aggregated, and each project may be responding to its local context in unique but important ways. These programs need efficient ways to identify common factors affecting implementation and outcomes that reflect the richness of the activities undertaken, but do not place undue burden on organisations, particularly those receiving smaller grants.

We developed a novel method that uses participatory workshops to explore commonalities in implementation across projects and the various ways they seek to achieve common program outcomes. The theory-driven method builds on Most Significant Change and Qualitative Comparative Analysis, and combines data collection with collaborative, participatory data analysis to build a rich qualitative understanding of projects in a relatively short timeframe with fewer resources. Active participation from project leaders (e.g. grant recipients) builds cohesion across the program and helps project leaders feel more connected and supported.

This paper outlines the theory, approach and uses of Participatory Analysis Workshops, including their strengths and limitations and the types of data and insights the method can yield. We draw on our work with the NSW Reconstruction Authority evaluating the Covid Community Connection and Wellbeing Program to illustrate what we've learnt about how the method works and in what circumstances, and then identify other potential use cases. Participants will have an opportunity to ask questions to help inform future uses of this method. This information will equip evaluators with tools to navigate varying territory together and understand progress toward program outcomes.

Chair
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of...
Speakers
Martina Donkers
Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured...
Ellen Wong
Consultant, ARTD Consultants
I'm a consultant at ARTD with a background in human geography and environmental studies. I bring this lens to the work I do and am particularly passionate about the intersection between people and the environment. My portfolio spans environmental policy, disaster recovery and community...
Jade Maloney
Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research...
Friday September 20, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Our five guiding waypoints: Y Victoria's journey and learning from applying organisation-wide social impact measurement
Friday September 20, 2024 11:30am - 12:00pm AEST
103
Authors: Caitlin Barry (Y Victoria), Eugene Liston (Clear Horizon Consulting, AU)

The demand for organisations to measure impact seems to be ever increasing. However, impact measurement looks different depending on the level at which you measure it (program, organisation-wide, ecosystem, etc.). While many organisations focus on measuring social impact at the program level, the jump to effective measurement at an organisation-wide level appears to be less commonly achieved.

The literature providing guidance on how to implement organisation-wide social impact measurement makes it seem straightforward, like a Roman highway - all straight lines. But what is it really like in practice? How does it differ from program-level impact measurement? How can it be done? What resources does it take? And what are the pitfalls?

The Y Victoria has spent the last three years on a journey to embed organisation-wide social impact measurement under the guidance of our evaluation partner. The Y Victoria is a large and diverse organisation covering seven sectors/service lines, over 5,500 staff and over 180 centres, and delivering services to all ages of the community. This presented quite a challenge for measuring organisation-wide impact in a meaningful way.

While the journey wasn't straightforward, we've learnt a lot from navigating it. This presentation will discuss the approach taken; tell the story of the challenges faced, trade-offs and lessons learnt (from both the client's and consultant's perspectives); and explain how we have adapted along the way.

Chair
Kate O'Malley
Consultant
I provide targeted policy, advocacy and evaluation support on refugee and migration matters drawing on a lengthy career in the United Nations and the Australian Public Service and post-graduate studies in evaluation and diplomatic practice.
Speakers
Jess Boyden
Senior Social Impact Manager - Recreation, YMCA Victoria
Hello! I'm Jess and I bring 20 years of experience in program design, strategy and social impact measurement within international aid and local community development settings. I specialise in creating practical and meaningful approaches to measuring social impact, using the power...
Caitlin Barry
Principal Consultant, Caitlin Barry Consulting
Caitlin has extensive experience in monitoring and evaluation and holds a Masters of Evaluation (First Class Honours) from the University of Melbourne and an Environmental Science Degree (Honours) from James Cook University. The focus of Caitlin's presentation will be from her work...
Friday September 20, 2024 11:30am - 12:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Reflections on a Developmental Evaluation of a traditional healing service model for the Kimberley region of Western Australia
Friday September 20, 2024 11:30am - 12:00pm AEST
106
Authors: Gillian Kennedy (The University of Notre Dame Australia), Tammy Solonec (Kimberley Aboriginal Law and Culture Centre, AU)

Traditional Healers, known in the Kimberley as mabarn (medicine men) and parnany parnany warnti (group of women healers), have been practising their craft for millennia; however, cultural forms of healing are not funded or incorporated into health services in Western Australia. In 2019 a Kimberley cultural organisation was funded to develop and trial a service delivery model of traditional healing. The trial ended in November 2023.

This presentation will reflect on a Developmental Evaluation (DE) undertaken throughout the development and trial of this traditional healing service, using a multi-method approach incorporating participant observation, semi-structured interviews, small group discussions and a client survey. Data was collated into a 'checklist matrix', using a traffic light system to show how each element of the model was tracking according to different stakeholder perspectives. This information was then fed back to the healing team iteratively to incorporate into the model design.
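
As an illustrative aside, one simple way to assemble this kind of traffic-light checklist matrix is sketched below (Python with pandas). The element names, perspectives and ratings are invented for the example; the actual matrix in this evaluation was built from qualitative stakeholder data.

    import pandas as pd

    # Each record: a model element, a stakeholder perspective, and a 1-3 rating.
    ratings = pd.DataFrame([
        {"element": "Element A", "perspective": "Staff",   "score": 3},
        {"element": "Element A", "perspective": "Clients", "score": 2},
        {"element": "Element B", "perspective": "Staff",   "score": 1},
        {"element": "Element B", "perspective": "Clients", "score": 2},
    ])

    # Map numeric ratings to traffic-light statuses.
    rag = {1: "Red", 2: "Amber", 3: "Green"}

    # Pivot to a checklist matrix: rows are model elements, columns are
    # stakeholder perspectives, cells are traffic-light statuses.
    matrix = ratings.pivot(index="element", columns="perspective", values="score")
    print(matrix.replace(rag))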

The DE team acted as a 'critical friend' to the project. Two Aboriginal research assistants (one male and one female) were able to provide valuable cultural interpreting for the project to ensure that cultural sensitivities around the healing practices were carefully managed. The DE team also helped the healing team to develop a set of guiding principles and a Theory of Change to help the project stay true to their underpinning cultural values.

The DE process helped to inform a culturally governed and owned clinic model, working with both men and women healers, that is unique to the Kimberley. DE puts the evaluation team inside the project. This relational element reflects Aboriginal worldviews but may challenge the perceptions of objectivity championed in traditional forms of evaluation. We argue that the evaluator's role as a trusted, critical friend was ultimately part of the success of the healing project.


Chair
Rachel George
Director, Research, Monitoring and Evaluation Practice, Tetra Tech International Development
Speakers
Tammy Solonec
Jalngangurru Healing Coordinator, Kimberley Aboriginal Law and Cultural Centre (KALACC)
Tammy Solonec is a Nyikina woman from Derby in the Kimberley of Western Australia. Since late 2020, Tammy has been engaged by KALACC as Project Coordinator for Jalngangurru Healing, formerly known as the Traditional Healing Practices Pilot (THPP). Prior to that from 2014 Tammy was...
Gillian Kennedy
Translational Research Fellow, The University of Notre Dame Australia
Gillian Kennedy is a Translational Research Fellow with Nulungu Research Institute at The University of Notre Dame, Broome campus and has 20 years' experience as an educator and facilitator. Her research focus is on program and impact evaluation within the justice, education, and...
Eva Nargoodah
Cultural advisor and healer, Jalngangurru Healing, Kimberley Aboriginal Law and Culture Centre
Eva Nargoodah is a senior Walmajarri woman who was born on Christmas Creek Station in the Kimberley region of Western Australia. As a child she lived at Christmas Creek Station, GoGo Station and Cherrabun Station. Eva completed her schooling in Derby and worked as a teacher. She has...
Friday September 20, 2024 11:30am - 12:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

The ACT Evidence and Evaluation Academy 2021-24: Lessons learned from a sustained whole-of-government ECB effort
Friday September 20, 2024 11:30am - 12:00pm AEST
105
Authors: Duncan Rintoul (UTS Institute for Public Policy and Governance (IPPG)), George Argyrous (UTS Institute for Public Policy and Governance (IPPG), AU), Tish Creenaune (UTS Institute for Public Policy and Governance (IPPG), AU), Narina Dahms (ACT Government: Chief Ministers, Treasury and Economic Development Directorate, AU), Peter Robinson (ACT Government: Chief Ministers, Treasury and Economic Development Directorate, AU), Robert Gotts (ACT Government: Chief Ministers, Treasury and Economic Development Directorate, AU)

The ACT Evidence and Evaluation Academy is a prominent and promising example of sustained central agency investment in evaluation capability building (ECB).

The Academy was launched in 2021 as a new initiative to improve the practice and culture of evidence-based decision-making in the ACT public sector. Its features include:
  • a competitive application process, requiring executive support and financial co-contribution
  • a series of in-person professional learning workshops where participants learn alongside colleagues from other Directorates
  • a workplace project, through which participants apply their learning, receive 1-1 coaching, solve an evaluation-related challenge in their work and share their insights back to the group
  • executive-level professional learning and practice sharing, for nominated evaluation champions in each Directorate
  • sharing of resources and development of evaluation communities of practice in the Directorates
  • an annual masterclass, which brings current participants together with alumni and executive champions.

Four years and over 100 participants later, the Academy is still going strong. There has been an ongoing process of evaluation and fine tuning from one cohort to the next, with encouraging evidence of impact. This impact is seen not only for those individuals who have taken part but also for others in their work groups, including in policy areas where evaluation has not historically enjoyed much of a foothold.

The learning design of the Academy brings into focus a number of useful strategies - pedagogical, structural and otherwise - that other central agencies and line agencies may like to consider as part of their own ECB efforts.

The Academy story also highlights some of the exciting opportunities for positioning evaluation at the heart of innovation in the public sector, particularly in the context of whole-of-government wellbeing frameworks, cross-agency collaboration and strategic linkage of data sets to support place-based outcome measurement.

Speakers
Duncan Rintoul
Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
George Argyrous
Head of Measurement, Evaluation, Research, and Learning, Paul Ramsay Foundation
Friday September 20, 2024 11:30am - 12:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Gamified, flexible, and creative tools for evaluating a support program for palliative children and their families
Friday September 20, 2024 11:30am - 12:00pm AEST
104
Authors: Claire Treadgold (Starlight Children's Foundation Australia), Erika Fortunati (Starlight Children's Foundation, AU)

Our program creates personalised experiences of fun, joy, and happiness for families with a palliative child, aiming to foster family connections and celebrate the simple joys of childhood during these challenging circumstances. Evaluating the program is of utmost importance to ensure that it meets the needs of the families involved. Equally, due to the program's sensitivity and deeply personal nature, a low-pressure, flexible evaluation approach is necessary.
In our session, we will showcase our response to this need and share our highly engaging, low-burden tools for gathering participant feedback, which leverage concepts of gamification and accessibility to boost evaluation responses and reduce participant burden. In particular, we will focus on our innovative "activity book", which evaluates the program through artistic expression. By emphasising creativity and flexibility, our tools aim to enrich the evaluation process and respect the diverse preferences and abilities of the participating families.
The core argument will focus on our innovative evaluation methodology, how it aligns with best practices in the literature, and our key learnings. Key points include the considerations needed for evaluating programs involving palliative children, empowering children and young people through their active involvement in the evaluation process, and how gamification and creativity boost participation and engagement.
Outline of the session:
  • Introduction to the palliative care program and the need for flexible, creative, and respectful evaluation methods
  • What the literature tells us about evaluation methods for programs involving palliative children and their families
  • A presentation of our evaluation protocol
  • Case studies illustrating the feedback collected and its impact
  • Our learnings and their implications for theory and practice
Chair
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of...
Speakers
Erika Fortunati
Research and Evaluation Manager, Starlight Children's Foundation Australia
Erika is the Research and Evaluation Manager at Starlight Children's Foundation, an Australian not-for-profit organisation dedicated to brightening the lives of seriously ill children. In her current role, Erika manages research projects and program evaluations to ensure that programs...
Friday September 20, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Designing baseline research for impact: The SKALA experience
Friday September 20, 2024 12:00pm - 12:30pm AEST
Authors: Johannes Prio Sambodho (SKALA), Ratna Fitriani (SKALA, ID)

SKALA (Sinergi dan Kolaborasi untuk Akselerasi Layanan Dasar - Synergy and Collaboration for Service Delivery Acceleration) is a significant Australian-Indonesian cooperation that focuses on enhancing parts of Indonesia's extensive, decentralised government system to accelerate better service delivery in underdeveloped regions. As part of its End of Program Outcome of greater participation, representation and influence for women, people with disabilities and vulnerable groups, SKALA is commissioning baseline research on multi-stakeholder collaboration for mainstreaming Gender Equality, Disability, and Social Inclusion (GEDSI) in Indonesia. The program has designed a mixed-method study: qualitative methods to assess the challenges and capacity gaps of GEDSI civil society organisations (CSOs) in actively participating in and contributing to the subnational planning and budgeting process, coupled with a quantitative survey to measure trust and confidence between the same CSOs and the local governments with whom they engage.

The paper first discusses the baseline study's design and its alignment with SKALA's strategic goals, and considers how the research might itself contribute to improved working relationships in planning and budgeting at the subnational level. Second, it discusses the approaches taken by the SKALA team to design a robust programmatic baseline that is also clearly useful in program implementation. These include: a) adopting an adaptive approach that turns key emerging issues from grassroots consultations and the broader governmental agenda into research objectives; b) locating the study within a broader empirical literature to balance practical baseline needs with academic rigour; and c) fostering collaboration with the program implementation team to ensure the study serves both evaluation and programmatic needs.

Lastly, based on the SKALA experience, the paper will argue for closer integration of research and implementation teams within programs, which can support systems-informed methodologies, and will consider ways in which this can be practically accomplished.
Chair
Allison Clarke
Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre...
Speakers
Johannes Prio Sambodho
Research Lead, SKALA
Dr. Johannes Prio Sambodho is the Research Lead for SKALA, a significant Australian-Indonesian development program partnership aimed at improving basic service governance in Indonesia. He is also a former lecturer in the Department of Sociology at the University of Indonesia. His...
Friday September 20, 2024 12:00pm - 12:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Embracing the L in "MEL": A Journey Towards Participatory Evaluation in Government Programs
Friday September 20, 2024 12:00pm - 12:30pm AEST
103
Authors: Milena Gongora (Great Barrier Reef Foundation)

Best practice in evaluation encompasses a crucial step of learning, yet this step often receives inadequate emphasis, particularly within government-funded initiatives. Our paper documents the journey of transforming a top-down, prescriptive evaluation process within a government-funded program into an inclusive, consultative approach aligned with Monitoring, Evaluation, and Learning (MEL) principles.

Funded by the Australian Government and managed by the Great Barrier Reef Foundation, the Reef Trust Partnership (RTP) was launched in 2018 to enhance the resilience of the Great Barrier Reef. Within it, a $200 million portfolio aims to improve water quality by working with the agricultural industry. A framework for impact evaluation was developed in the Partnership's early days. Whilst appropriate, it was top-down in nature due to the need to comply with broader government requirements.

Four years into implementation, the Foundation was ready to synthesise, interpret and report on the program's impact. The Foundation could have simply reported "up" to government. However, we acknowledged that in doing so we risked missing critical context, oversimplifying findings, misinterpreting information and presenting yet another tokenistic, meaningless report.

Interested in doing things better, we instead circled back with our stakeholders in a participatory reflection process. Through a series of carefully planned workshops, we invited on-ground program practitioners to ground-truth our findings, share contextual nuances, and collectively strategise for future improvements.

Despite initial reservations, participants embraced the opportunity, fostering an atmosphere of open dialogue and knowledge exchange. This reflective process not only enriched our understanding of program impact but also enhanced collaboration, strengthening overall program outcomes.

Our experience highlights the importance of transcending tokenistic evaluation practices, particularly in environments where top-down directives prevail. Participatory approaches can be implemented at any scale, contributing to a culture of continuous improvement and strategic learning, ultimately enhancing the impact and relevance of evaluation efforts.

Chair
Kate O'Malley
Consultant
I provide targeted policy, advocacy and evaluation support on refugee and migration matters drawing on a lengthy career in the United Nations and the Australian Public Service and post-graduate studies in evaluation and diplomatic practice.
Speakers
Milena Gongora
Associate Director - Water Quality, Great Barrier Reef Foundation
Milena’s area of interest is nature conservation. With over 14 years of experience, her work ranges from managing the Mekong River to enhancing the resilience of the Great Barrier Reef. Over most of this time, her roles have involved evaluating the success of conservation initiatives...
Friday September 20, 2024 12:00pm - 12:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

If the destination is improvement, recommendations are the signpost
Friday September 20, 2024 12:00pm - 12:30pm AEST
106
Authors: Laura Baker (ACIL Allen), Larissa Brisbane (Department of Climate Change, Energy, the Environment and Water NSW, AU)

Recommendations are the sharp end of evaluation, connecting evidence and insights to the improvement we aim to achieve. Many evaluation theories focus on framing and conducting evaluations, rather than developing recommendations or the associated organisational change required to complete the journey.

Recommendations point the way beyond an evaluation report, as the journey doesn't end when the report is produced. This presentation tells the story of recommendation wayfinding. We will share an evaluation practitioner and a commissioner's journey on navigating the challenge of developing actionable recommendations to promote impact beyond program close and into future decisions.

Evaluators need ways to integrate diverse evidence sources and generate actionable insights. The consultant will share perspectives on where these insights and the associated recommendations "come from": how different data come together to inform insights, the process for developing recommendations (balancing independence with engagement from commissioners), and how to design recommendations for the program and beyond.

Commissioners need recommendations that make sense in their context. The commissioners will share what makes a recommendation useful, and how we used this evaluation journey to leverage learning, skill-building and improvement opportunities. They will also discuss the evaluation audience and how ambitious recommendations can be.

This work over a number of years has helped build the evaluation knowledge base within our organisations. The presenters have developed evaluations for multiple end users, each with their own needs, and will share the research and engagement approaches and tools which have been useful in different situations, as well as what was useful specifically for this project. We will close with our recommendations to you - the top ideas that we plan to take with us on our next evaluation journey.
Chair
Rachel George
Director, Research, Monitoring and Evaluation Practice, Tetra Tech International Development
Speakers
Larissa Brisbane
Team Leader, Strategic Evaluation, Dept of Climate Change, Energy, the Environment and Water NSW
It was a short step from studying environmental science, and working on cross-disciplinary problem-solving, to evaluation where I still ask 'why' and 'how do you know that'. I love hearing stories of what you've done and learned, especially in energy, climate change, environment and...
Laura Baker
Principal, ACIL Allen
Friday September 20, 2024 12:00pm - 12:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Place-based evaluation: collaborating to navigate learning in complex and dynamic contexts
Friday September 20, 2024 12:00pm - 12:30pm AEST
105
Authors: Sandra Opoku (Relationships Australia Victoria), Kate Matthies-Brown (Relationships Australia Victoria, AU)

Yarra Communities That Care (CTC) is a network of 24 local partner agencies who share a commitment to supporting the healthy development of young people in the City of Yarra. One of its key initiatives is the collaborative delivery of evidence-based social and emotional messaging to families by a centrally coordinated Facilitator Network involving multiple partner agencies. Building on positive feedback and program achievements from 2017-2022, we led an evaluation of the Yarra CTC Facilitator Network's collaborative and place-based approach to better understand its contribution to systemic change and to apply learnings to future place-based approaches in our respective organisations. The evaluation project team adopted the Place-Based Evaluation Framework, informed by a comprehensive theory of change. This provided an anchor in an otherwise complex, dynamic environment and unfamiliar territory.
There is an increasing focus on collaborative place-based approaches at federal, state and local levels as a promising way to address complex social problems. Previous evaluations and the literature identify successful collaboration and a strong support entity or "backbone" as key factors that make place-based approaches successful. The collaborative place-based approach to strengthening family relationships in Yarra provides a local example of this.

Consistent with systems change frameworks, this evaluation provided evidence of structural changes. These changes manifested as improved practices and dedicated resources and supports, ultimately leading to effective, collaborative and transformative change for the community.

This presentation will share the journey, key insights, and learnings of the evaluation project team over a two-year period to collaboratively gather evidence to inform ongoing program development and contribute to future place-based approaches. The Yarra CTC Facilitator Network serves as a valuable template for implementing best practices for place-based coalitions due to its focus on collaboration and fostering a sense of community.

Speakers
Sandra Opoku
Senior Manager Evaluation and Social Impact, Relationships Australia Victoria
My role leads impact, evidence and innovation activities at Relationships Australia Victoria. These activities contribute to achieving strategic objectives and improving outcomes for individuals, families and communities. This now also includes oversight of several key prevention...
Kate Matthies-Brown
Since 2022, Kate has supported RAV’s evaluation and social impact activities, including program evaluation, practice development, and evidence reviews. She is a qualified social worker with experience in family services, youth mental health and academia. Kate has experience with...
Friday September 20, 2024 12:00pm - 12:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

A sprint, not a marathon: Rapid Evaluation as an approach for generating fast evidence and insights
Friday September 20, 2024 12:00pm - 12:30pm AEST
104
Authors: Marnie Carter (Allen + Clarke Consulting)

Increasingly, evaluators are called upon to quickly equip decision makers with evidence from which to take action. A program may be imminently approaching the end of a funding cycle; a critical event may have taken place and leadership needs to understand the causes and learnings; or a new program of work is being designed for which it is important to ensure that finite resources are being directed to the most effective interventions. For such circumstances, Rapid Evaluation can be a useful tool.

Rapid Evaluation is not simply doing an evaluation quickly. It requires a deliberate, interlinked and iterative approach to gathering evidence to generate fast insights. What makes Rapid Evaluation different is that the evaluation design needs to be especially flexible, constantly adapting to the context. Data collection and analysis tend not to follow a linear path, but rather iterate back and forth during the evaluation. Rapid Evaluation is often conducted in response to specific circumstances, and evaluators therefore need to manage a high level of scrutiny.

This presentation will provide an overview of how to conduct a rapid evaluation, illustrated by practical examples including rapid evaluations of a fund to support children who have been exposed to family violence, and a quickly-established employment program delivered during the COVID-19 pandemic. It will discuss the methodological approach to conducting a Rapid Evaluation, share lessons on how to manage the evolving nature of data collection as the evaluation progresses, and discuss how to maintain robustness while evaluating at pace.


Chair
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of...
Speakers
Marnie Carter
Evaluation and Research Practice Lead, Allen + Clarke Consulting
Marnie is the Evaluation and Research Practice Lead for Allen + Clarke Consulting. She is experienced in program and policy evaluation, monitoring, strategy development, training and facilitation. Marnie is particularly skilled in qualitative research methods. She is an expert at...
Friday September 20, 2024 12:00pm - 12:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

From evaluation to impact-practical steps in a qualitative impact study
Friday September 20, 2024 1:30pm - 2:00pm AEST
Authors: Linda Kelly (Praxis Consultants), Elizabeth Jackson (La Trobe University, AU)

This presentation focuses on a multi-year, Australian-funded program that aims to empower people marginalised by gender, disability and other factors. Like similar programs, the work is subject to regular monitoring and evaluation - testing the effectiveness of program activities largely from the perspective of the Australian and partner country governments.
But what of the views of the people served by the program? Is the impact of the various activities sufficient to empower them beyond their current condition? How significant are the changes introduced by the program, given the structural, economic, social and other disadvantages experienced by the marginalised individuals and groups?
Drawing on feminist theory and qualitative research methods, and managed with local research and communication experts, this presentation outlines a study focused on the long-term impact of the program.

The presentation will outline the methodology and practical considerations in developing the approach and data collection methods. It will highlight the value of exploring impact from a qualitative perspective, while outlining the considerable management and conceptual challenges involved in designing, introducing and supporting such an approach. It will consider some of the implications of shifting from traditional evaluation methods to more open-ended enquiry, and ask whose values are best served through evaluation versus impact assessment.


Chair
James Copestake
Professor, International Development, University of Bath, UK
James Copestake is Professor of International Development at the University of Bath in the UK, where he is also Director of Studies for the Doctorate in Policy Research and Practice at the Institute of Policy Research. His publications range broadly across international development...
Speakers
Linda Kelly
Director, Praxis Consultants
Elisabeth Jackson
Senior Research Fellow, Centre for Human Security and Social Change, La Trobe University
Dr Elisabeth Jackson is a Senior Research Fellow at the Centre for Human Security and Social Change where she conducts research and evaluation in Southeast Asia and the Pacific. She is currently co-leading an impact evaluation of a program working with diverse marginalised groups...
Friday September 20, 2024 1:30pm - 2:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Fidelity to context: A realist perspective on implementation science
Friday September 20, 2024 1:30pm - 2:00pm AEST
105
Authors: Andrew McLachlan (NSW Department of Education)

At first glance, realist methodology seems ideally suited to investigating implementation problems (Dalkin et al., 2021). It is versatile in that it draws on theories from diverse fields of social inquiry. It is pragmatic in that the theories it adopts are deemed useful only in so far as they offer explanatory insight. And it is transferable; realist methodology is less concerned with generalising findings than in understanding how programs work under different conditions and circumstances.

As for implementation science, its founding aim is purpose built for realist work; it seeks to improve the uptake of evidence-based practices by investigating the barriers and facilitators to implementation. Yet despite the affinity between realist methodology and implementation science, so far there have been few attempts to formalise the relationship (Sarkies et al., 2022).

This paper offers insights into how evaluators can harness realist methodology to better understand challenges of program implementation. It demonstrates how implementation concepts like fidelity (the degree to which a program is delivered as intended), adaptation (the process of modifying a program to achieve better fit), and translation (the ability to transfer knowledge across organisational borders) can be combined with realist concepts to develop a more active understanding of context.

In showing how to construct program theories that are responsive to changing conditions, the paper promises to equip evaluators with tools that can help them navigate the complexities of program implementation in their own work.



Speakers
Andrew McLachlan
Evaluation Lead - Strategy, NSW Department of Education
Andrew McLachlan is an Evaluation Lead for the NSW Department of Education. Before becoming an evaluator, Andrew had over 10 years of experience as a teacher, working in settings as diverse as far North Queensland and Bangladesh. Since 2021, Andrew has worked as an embedded evaluator...
Friday September 20, 2024 1:30pm - 2:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Finding ways to empower multicultural survivors of violence through evaluation: strategies, learnings and reflections
Friday September 20, 2024 1:30pm - 2:00pm AEST
104
Authors: Lydia Phillips (Lydia Phillips Consulting), Jo Farmer (Jo Farmer Consulting)

As evaluators, we often work with people who have experienced trauma and/or marginalisation (whether we realise it or not!). We're also seeing increased recognition in government and community organisations of the importance of lived experience and cultural safety in program design, implementation and evaluation.

Beginning an evaluation with a clear plan for how you'll engage and empower people from diverse cultural backgrounds and people who have experienced trauma can help to ensure success - of your project and of participants' experience.

So how can you design an evaluation framework to recognise diverse cultural backgrounds and empower survivors of violence?

And how can evaluators who don't have lived experience or identify from those cultural backgrounds best navigate the design process?

This session will share strategies, learnings and reflections from a project working with a multicultural family violence service to develop a culturally-safe, trauma-informed evaluation framework for a two-year program.

It will:
  • explore what worked well and what was challenging in the project
  • discuss similarities and differences in the concepts of culturally-safe and trauma-informed practice, drawing on current literature; and
  • pose questions and provide suggestions for evaluators who want to develop their skills in culturally safe and trauma-informed evaluation practice.

The session will offer key tips and strategies that are translatable to other contexts and conclude with reflective questions for attendees.

Chair
Kira Duggan
Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development...
Speakers
Lydia Phillips
Principal Consultant, Lydia Phillips Consulting
I operate an independent consulting practice, providing evaluation and social policy services to community organisations and government. With a background in law and social policy, I have more than 15 years' experience building and using evidence in order to create positive social...
Friday September 20, 2024 1:30pm - 2:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

A practical approach to designing and implementing outcome measures in psychosocial support services.
Friday September 20, 2024 2:00pm - 2:30pm AEST
Authors: Lauren Gibson (Mind Australia), Dr. Edith Botchway (Mind Australia, AU), Dr. Laura Hayes (Mind Australia, AU)

Outcome measurement in mental health services is recommended as best practice and provides an opportunity for clients and staff to track progress and navigate the complex road to recovery together. However, there are many barriers to embedding outcome measures in mental health services, including time constraints, low perceived value among staff and clients, and infrequent feedback on outcomes. To overcome these challenges, a national not-for-profit provider of residential and non-residential psychosocial support services created an innovative approach to designing and implementing outcome measures. The objective of our presentation is to describe this approach, which has resulted in average outcome measure completion rates of over 80% across 73 services in Australia.

Design
We believe the key to achieving these completion rates is understanding the needs of outcome measure end-users, including clients, carers, service providers, centralised support teams and funding bodies. In this presentation we will share how we:
  • "Begin with the end in mind" through working with stakeholders to create user personas and program logics to identify meaningful outcomes and survey instruments.
  • Design easy-to-use digital tools to record quality data, and provide stakeholders with dashboards that visualise outcomes in real time at both the individual client and service levels.

Implementation
Also key to embedding outcome measures is having a structured, multi-stage approach for implementation, with tailored support provided to:
  • Prepare services (e.g., Training)
  • Install and embed outcome measures in routine practice (e.g., Service champions)
  • Maintain fidelity over time (e.g., Performance monitoring)

The presentation will highlight the salient barriers and enablers identified during each design and implementation stage.

Overall, the presentation will provide a practical example of how to design and implement outcome measures in mental health services to ensure they are adding value for relevant stakeholders and enabling efficient and meaningful evaluation.

Chair
James Copestake
Professor, International Development, University of Bath, UK
James Copestake is Professor of International Development at the University of Bath in the UK, where he is also Director of Studies for the Doctorate in Policy Research and Practice at the Institute of Policy Research. His publications range broadly across international development...
Speakers
Lauren Gibson
Researcher, Mind Australia
Dr. Lauren Gibson’s research focuses on understanding the prevalence and impact of initiatives aimed at improving physical and mental health outcomes among mental health service users. She has been a researcher within the Research and Evaluation team at Mind Australia for over two...
Friday September 20, 2024 2:00pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

In the spotlight: An arts industry-led approach to evaluation
Friday September 20, 2024 2:00pm - 2:30pm AEST
105
Authors: Kirstin Clements (Arts Centre Melbourne)

How does a creative institution develop an effective evaluation framework that honours the artistic process while meeting rigorous research standards?

At Arts Centre Melbourne we asked ourselves, 'what if?'... What if we integrated the economic story into a fit-for-purpose value proposition? What if we see the emotive, subjective nature of the arts as an asset, rather than a challenge in our evaluation design? What if we tried to embed systems thinking and extend our approach beyond individual projects?

Like many purpose-driven industries, the arts face an increasingly competitive funding landscape and heightened expectations from stakeholders for evidence-based reporting on the value generated by initiatives. Historically, in the arts such reporting has been responsive to external demands and formats. One of our core goals has been to equip the organisation with the capability and capacity to pro-actively drive its own public value narrative through a transparent, consistent approach.

In this presentation, we spotlight Arts Centre Melbourne's innovative approach to building appetite for evaluation and to designing a fit-for-purpose organisational impact model and evaluation function. We offer insights into the conceptual and methodological approaches we've adopted to achieve our objectives: supporting effective advocacy for the public value of the arts, enhancing accountability to stakeholders, and fostering a culture of continuous learning.

In sharing how we have creatively navigated challenges and opportunities at Arts Centre Melbourne, we aim to provide valuable advice and inspiration for evaluators and supporting professionals, particularly those working in sectors where evaluation is yet to be understood as 'business-as-usual' activity.

Speakers
Kirstin Clements
Partner, Impact and Evaluation, Arts Centre Melbourne
Friday September 20, 2024 2:00pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Introducing a trauma informed AI assessment tool for evaluators of AI (artificial intelligence) assisted programs and services.
Friday September 20, 2024 2:00pm - 2:30pm AEST
104
Authors: Lyndal Sleep (Central Queensland University)

AI and other digital technologies are increasingly used in program and service delivery. They promise greater efficiency, accuracy and objectivity; however, these technologies can also cause significant harm and trauma, as seen in Robodebt. It is vital for evaluators to know the key questions to ask to prevent AI from having unintended impacts on program processes, outputs and outcomes, or from causing harm to service users.

Objective
This session aims to support evaluation of AI assisted programs and services by introducing evaluators to a new and innovative trauma informed AI assessment tool.

Core argument
(1) AI is increasingly being used in programs and services, and understanding the resulting risks is essential for evaluators to assess whether services are meeting intended outcomes.
(2) Many evaluators are unaware of what types of risks to look for when assessing AI-assisted services, or what questions to ask - especially when conducting trauma-informed evaluations.
(3) A practical trauma-informed AI assessment tool has been developed by researchers from [Universities omitted], with funding from [omitted], to address this need, and will be briefly introduced.

A short paper session will:
(1) Highlight the problem that AI is increasingly being used to assist program and service delivery, yet many evaluators are unaware of the main risks to consider when evaluating these services.
(2) Suggest the solution of a practical tool that considers these risks, with technological knowledge and within a trauma-informed framework, and that can be employed by evaluators.
(3) Introduce the trauma-informed AI assessment tool, the method used to develop it, and its intended practical use by evaluators (both internal and external to organisations).

There will be 10 minutes for questions and discussion at the end of the presentation.

Chair
Kira Duggan
Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development...
Speakers
Friday September 20, 2024 2:00pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia
 