Conference hashtag #aes24MEL
Education and training
Wednesday, September 18
 

11:00am AEST

Evaluation that adds value for People and Planet: Perspectives, Challenges, and Opportunities for Indigenous Knowledge Systems in Africa.
Wednesday September 18, 2024 11:00am - 11:30am AEST
104
Authors: Awuor PONGE (African Policy Centre (APC))

Indigenous Knowledge Systems (IKS) in Africa have long been marginalized and undervalued, despite their potential to offer sustainable solutions to pressing challenges faced by communities across the continent. This presentation explores the perspectives, challenges, and opportunities for incorporating IKS into evaluation practices that create value for both people and the planet.

From a people-centric perspective, IKS offer a holistic and culturally relevant approach to understanding local contexts, priorities, and value systems. By embracing these knowledge systems, evaluations can better capture the multidimensional nature of well-being, including spiritual, social, and environmental aspects that are often overlooked in conventional evaluation frameworks. However, challenges arise in reconciling IKS with dominant Western paradigms and navigating power dynamics that have historically suppressed indigenous voices.

From a planetary perspective, IKS offer invaluable insights into sustainable resource management, biodiversity conservation, and climate change adaptation strategies that have been honed over generations of lived experiences. Integrating these knowledge systems into evaluation can shed light on the intricate relationships between human activities and ecosystem health, enabling more informed decision-making for environmental sustainability. Nonetheless, challenges exist in bridging the divide between traditional and scientific knowledge systems, as well as addressing concerns around intellectual property rights and benefit-sharing.

This presentation will explore innovative approaches to overcoming these challenges, such as participatory and community-based evaluation methodologies, capacity-building initiatives, and cross-cultural dialogue platforms. By fostering a deeper appreciation and understanding of IKS, evaluation practices can become more inclusive, relevant, and effective in creating value for both people and the planet in the African context.


Chair

Alice Muller

Senior Monitoring & Evaluation Advisor: FMNR Scale Up, World Vision Australia
An environmental scientist, working in international development, interested in evaluation and learning about all things community, trees, ecosystem restoration, climate action, scaling and systems transformation.  I also really like coffee and chatting about gardening, travel and... Read More →
Speakers

Awuor Ponge

Senior Associate Research Fellow, African Policy Centre (APC)
Dr. Awuor Ponge is a Senior Associate Fellow in charge of Research, Policy and Evaluation at the African Policy Centre (APC). He is also the Vice-President of the African Evaluation Association (AfrEA). He holds a Doctor of Philosophy (PhD) Degree in Gender and Development Studies... Read More →
Wednesday September 18, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

The psychology of evaluation capacity building: Finding the way with the rider, elephant and the pathway
Wednesday September 18, 2024 11:00am - 12:00pm AEST
106
Authors: Samantha Abbato (Visual Insights People)

The psychology of evaluation capacity building: Finding the way with the rider, elephant and the pathway
Evaluation capacity building is becoming a core part of evaluation practice and a critical part of incorporating evaluation into the everyday activity of organisations (Preskill & Boyle, 2008; White, Percy & Small, 2018). Reaching the point where evaluation becomes the way of doing business requires a change of knowledge, skills, and attitudes.

Changes need to happen at the level of individuals, teams, organisations, and partnerships. This journey requires supporting and managing change to systematic enquiry processes as much as it requires evaluation expertise. In this skill-building session, we introduce Jonathan Haidt's 'rider, elephant and pathway' metaphor as a framework to support change and strengthen evaluation capacity (Haidt, 2018).

Haidt's metaphor for change includes the rider (our rational thinking side) atop an elephant (our emotional side). Behaviour change for individuals and collectives requires steps that (1) support the rider, such as giving clear directions, (2) motivate the elephant by tapping into emotions, and (3) shape a pathway to change, including clearing obstacles. In this interactive session, the facilitator will provide case studies spanning two decades that apply Haidt's metaphor. These examples demonstrate the power of this framework to support evaluation capacity building. Examples include using Haidt's framework for:
1. Building a Monitoring, Evaluation and Learning (MEL) system with a medium-sized community organisation;
2. Increasing the maturity of MEL in an existing large organisation; and
3. Increasing the impact of evaluation partnerships.

The active skill-building component incorporates:
  • Cartoon elephant, rider and pathway flashcards;
  • A 'snakes and ladders' style game; and
  • Evaluation-specific examples.

The combination of examples and activities is designed to support participant learning. The session will encourage discussion of barriers, enablers and actions to build evaluation capacity relevant to different situations and contexts.

Learning objectives include:
  • Knowledge of a sound and memorable psychological framework for supporting evaluation capacity building;
  • Ability to apply Haidt's metaphor
Chair

Anthea Rutter

Research Fellow, Centre for Program Evaluation, The University of Melbourne
Anthea Rutter is a Senior Research Fellow in the Assessment and Evaluation Research Centre (formerly the Centre for Program Evaluation) at The University of Melbourne. She has extensive experience working with a wide range of community, state and national organisations. She is particularly... Read More →
Speakers

Samantha Abbato

Director, Visual Insights People
My twenty-plus years of evaluation experience are built on academic training in qualitative and quantitative disciplines, including mathematics, health science, epidemiology, biostatistics, and medical anthropology. I am passionate about effective communication and evaluation capacity-building... Read More →
Wednesday September 18, 2024 11:00am - 12:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

From bottlenecks to breakthroughs: Insights from a teacher workforce initiative evaluation
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
104
Authors: Rhiannon Birch (Victorian Department of Education), Hayden Jose (Urbis, AU), Joanna Petkowksi (Victorian Department of Education, AU), Ekin Masters (Victorian Department of Education, AU)

How can an evaluation balance the need to generate pragmatic insights while meeting central agency requirements for rigorous measurement of outcomes? What ingredients can facilitate the effective evaluation of a government initiative and achieve improved outcomes? This paper explores the essential ingredients for evaluating a large-scale government program using an example of a statewide initiative aimed at attracting and retaining suitably qualified teachers in hard-to-staff positions in Victorian government schools.

We showcase how an adaptive and evidence-led method of enquiry helped identify program implementation bottlenecks and probe potentially unintended program outcomes over a three-year evaluation. We discuss enablers for the integration of evaluation recommendations into program implementation and future policy direction, particularly participatory action approaches and deep relationships with policy and implementation teams. We will also present the robust and varied methodology, particularly the novel use of system data to facilitate a quasi-experimental design that aligned with central agency requirements and met stakeholder needs.
This presentation will benefit policymakers, program evaluators, and others interested in evaluating government programs, by sharing key learnings on how evaluations can balance pragmatic insights with central agency requirements and identifying the key elements for influencing such programs and achieving improved outcomes.
Chair

Alice Muller

Senior Monitoring & Evaluation Advisor: FMNR Scale Up, World Vision Australia
An environmental scientist, working in international development, interested in evaluation and learning about all things community, trees, ecosystem restoration, climate action, scaling and systems transformation.  I also really like coffee and chatting about gardening, travel and... Read More →
Speakers

Rhiannon Birch

Senior Evaluation and Research Officer, Department of Education
Rhiannon is a dedicated research and evaluation specialist committed to enhancing health, social, education, and environmental outcomes for people and the planet. With over 10 years of experience in evaluation, she has worked extensively across emergency services, public health, and... Read More →

Hayden Jose

Associate Director, Urbis
Hayden brings 13 years’ experience as an evaluator, applied researcher and policy practitioner with extensive work in complex evaluations in government and not-for-profit settings. Across his roles, he has worked to consider complex system problems and translate evidence effectively... Read More →
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Warlpiri ways of assessing impact - How an Aboriginal community is defining, assessing and taking action for a good life in their community.
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
Authors: Emily Lapinski (Central Land Council), Malkirdi Napaljarri Rose (Centre For Human Security and Social Change, La Trobe University, AU), Glenda Napaljarri Wayne (Central Land Council, AU), Geoffrey Jungarrayi Barnes (Central Land Council, AU), Alex Gyles (Centre For Human Security and Social Change, La Trobe University, AU)

For evaluation to support transformational change, research suggests strategies must focus on localised Indigenous values, beliefs and worldviews. Decolonising evaluation involves identifying and addressing power and considering what is being evaluated, by whom and how. In this paper we argue that these developments are necessary but insufficient and suggest a possible way forward for further decolonising the field of evaluation. To support change for Indigenous Australians the emphasis needs to move from simple evaluation of individual programs to more critical examination of their combined impact on communities from local perspectives.

This paper explores how Warlpiri and non-Indigenous allies are collaborating to create and use their own community-level impact assessment tool. The 5-year Good Community Life Project is supporting Warlpiri residents of Lajamanu in the Northern Territory to define, assess and take action for a 'good community life'. Warlpiri will explain how they created the approach for assessing wellbeing in Lajamanu, and how they are using emerging results to give voice to their interests and advocate for the life they envision for future generations.

The project involves collaboration between Warlpiri community members, land council staff and university researchers, drawing on Indigenous concepts of 'two-way' seeing and working, relationality, and centring Indigenous voice and values. Applying these concepts in practice is challenging, particularly for non-Indigenous allies who must constantly reflect and use their privilege to challenge traditional views on 'robust' evaluation methodology.

Warlpiri and the land council see potential for this work to improve life in Lajamanu and as an approach that could be applied across Central Australian communities. Going beyond co-designed and participatory evaluation to critical examination of impact is the next step in supporting change with Indigenous communities. This paper will focus on Warlpiri perspectives, plus brief reflections from non-Indigenous allies, with time for the audience to discuss broader implications.
Speakers

Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre... Read More →

Emily Lapinski

Monitoring, Evaluation and Learning Coordinator, Central Land Council

Alex Gyles

Research Fellow - Monitoring and Evaluation, Institute for Human Security and Social Change, La Trobe University
Alex Gyles is a Research Fellow working in Monitoring, Evaluation and Learning (MEL) at the Institute for Human Security and Social Change, La Trobe University. He works closely with Marlkirdi Rose Napaljarri on the YWPP project and finds fieldwork with the YWPP team an exciting learning... Read More →

Glenda Napaljarri Wayne

Glenda Wayne Napaljarri is a community researcher on the YWPP project from Yuendumu. She has developed her practice working as an adult literacy tutor in Yuendumu’s Community Learning Centre. In addition to conducting research in her home community of Yuendumu, Glenda has travelled... Read More →
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Enhancing Stakeholder Engagement Through Culturally Sensitive Approaches: A Focus on Aboriginal and Torres Strait Islander Communities
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
105
Authors: Mark Power (Murawin), Carol Vale (Murawin, AU)

This presentation explores the paramount importance of culturally sensitive engagement methodologies in ensuring meaningful contributions from Aboriginal and Torres Strait Islander communities to mission programs. Murawin, an Aboriginal-led consultancy, has developed a robust Indigenous Engagement Strategy Framework grounded in the principles of reciprocity; free, prior and informed consent; mutual understanding; accountability; power sharing; and respect for Indigenous knowledge systems. Our session aims to share insights into the necessity of prioritising Aboriginal and Torres Strait Islander voices in engagement, co-design, and research, highlighting the significance of cultural competence in fostering mutual respect and understanding.
We will discuss three key messages: the imperative of deep knowledge and understanding of Aboriginal and Torres Strait Islander cultures in engagement practices; the success of co-design processes in facilitating genuine and respectful engagement; and the strategic partnership with CSIRO to enhance cultural competence and inclusivity in addressing Indigenous aspirations and challenges. These points underscore the critical role of acknowledging cultural interactions and ensuring cultural sensitivity in building strong, respectful, productive relationships with Indigenous communities.
To achieve our session's objectives, we have designed an interactive format that blends informative presentations with the analysis of case studies, complemented by engaging intercultural discussions. This approach is intended to equip participants with actionable insights drawn from real-world examples of our collaborative ventures and co-designed projects. Through this comprehensive exploration, we aim to enrich participants' understanding of successful strategies for engaging Aboriginal and Torres Strait Islander communities, ultimately contributing to the achievement of more inclusive and impactful outcomes in mission programs and beyond.


Chair

Jade Maloney

Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →
Speakers

Carol Vale

CEO & Co-founder, Murawin
Carol Vale is a Dunghutti entrepreneur, businesswoman, CEO and co-founder of Murawin, whose passion, determination and commitment have driven her impressive 40-year career as a specialist in intercultural consultation, facilitation, and participatory engagement, and an empathetic... Read More →

Mark Power

Director, Evaluation & Research, Murawin
Mark is an experienced researcher with more than 20 years of experience in Australia and the Pacific. Mark manages Murawin’s evaluation and research practice and leads multiple evaluations for a variety of clients. Mark has overseen more than 30 high-profile, complex projects funded... Read More →
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Enhancing evaluation value for small community organisations: A case example
Wednesday September 18, 2024 1:30pm - 2:00pm AEST
104
Authors: Stephanie Button (Assessment and Evaluation Research Centre, University of Melbourne), Allison Clarke, Carolyn McSporran (Blue Light Victoria, AU)

This presentation aims to provide a case example of how two small-scale, standard process/outcomes evaluations for a low-budget community organisation increased value for the organisation by identifying and seizing opportunities for evaluation capacity building. Formal evaluations represent a significant financial commitment for low-budget community organisations. By maximising the value provided by such evaluations, evaluators can contribute more to these organisations' mission and ultimately to social betterment.

There are numerous different evaluation capacity building models and frameworks, many of which appear to be quite complex (for example: Volkov & King, 2007; Preskill & Boyle, 2008). Many emphasise planning, documentation, and other resource intensive components as part of any evaluation capacity building effort. This session provides a case example of intentional but light-touch and opportunistic evaluation capacity building. Through such an approach, evaluators may need to do only minimal additional activities to provide extra value to an organisation. Reflection-in-action during the evaluation process is as important as the final reporting (Schwandt & Gates, 2021). The session emphasises, though, that a critical enabler will be the organisation's leadership and culture, and willingness to seize the opportunity offered by a formal evaluation. The session is co-presented by two members of the evaluation team and the Head of Strategy, Insights, and Impact of the client organisation.
Chair

Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health... Read More →
Speakers

Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre... Read More →

Stephanie Button

Research Associate & Evaluator, Assessment & Evaluation Research Centre
Stephanie has worked as a policy manager, analyst, strategist, researcher, and evaluator across the social policy spectrum in the public and non-profit sector for over 12 years. She is passionate about evidence-based policy, pragmatic evaluation, and combining rigour with equitable... Read More →

Carolyn McSporran

Head of Strategy, Insights and Impact, Blue Light Victoria
Passionate about social inclusion, Carolyn's work has spanned diverse portfolios across the justice and social services sectors. With a fervent belief in the power of preventative and early intervention strategies, she is committed to unlocking the full potential of individuals and... Read More →
Wednesday September 18, 2024 1:30pm - 2:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Envisioning and Encountering Relational Aboriginal and Pacific Research Futures
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
105
Authors: Alli Burness (Tetra Tech), Lisa Faerua (Vanuatu), Nathan Sentance (Museum of Applied Arts and Sciences, AU), David Lakisa (Talanoa Consultancy, AU)

In the inaugural ANU Coral Bell Lecture on Indigenous Diplomacy, Dr Mary Graham outlined a powerful legacy of Aboriginal and Torres Strait Islander relational methods that have operated across a spectacular time scale. She envisioned a compelling future for its renewed application and spoke of these practices as a type of "thinking in formation, a type of slow, collective, and emergent process".

Inspired by Dr Graham's vision, this panel explores synergies, distinctions, and complementarities in local and Indigenous research methods across Australia and the Pacific. The panel features Wiradjuri, Samoan (Polynesian), Ni-Vanuatu (Melanesian) and settler-background (Australian) researchers from a range of fields, who will explore, engage and showcase locally specific methodologies that connect Australia and the Pacific as ways of knowing, doing, and relating with the land, the moana (ocean) and the air.

This session frames evaluation and research approaches as reflecting their contextual political order. While the panel will critique the legacies of individualist and survivalist research methods, it will focus on exploring the futures that relational research methods could realize. How do we evolve current institutional approaches to become more commensurate with Indigenous methods? Would institutionalizing these methods resolve the legacy, structure, and form of colonialist political approaches? Panelists will speak to their experience in working to evolve institutions in this way and the research and evaluation methodologies used within them.

The session also situates evaluation within a canon of contextualizing evidence-based practices (such as political economy analysis, GEDSI analysis or feasibility studies).
Chair

Martina Donkers

Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured... Read More →
Speakers

Lisa Faerua

Lisa Faerua is a Pacific Freelance Consultant. She brings 17 years of experience in international and community development in the areas of leadership, design, monitoring and evaluation. Lisa has provided technical support to DFAT, MFAT, and Non-Government Organisations such as Oxfam... Read More →

Nathan Sentance

Nathan “mudyi” Sentance is a cis Wiradjuri librarian and museum collections worker who grew up on Darkinjung Country. Nathan currently works at the Powerhouse Museum as Head of Collections, First Nations and writes about history, critical librarianship and critical museology from... Read More →

David Lakisa

Managing Director, Talanoa Consultancy
Dr David Lakisa specialises in Pacific training and development, educational leadership and diversity management. He is of Samoan (Polynesian) ancestry and completed his PhD on 'Pacific Diversity Management' at the University of Technology Sydney (UTS) Business School.

Alli Burness

Director, Australian Consulting, Tetra Tech
Alli is an Australian strategic designer and researcher with settler heritage, born and living on Bunurong Country. As Director of the Australian Consulting Practice at Tetra Tech International Development, Alli works with a First Nations team to support relational approaches across... Read More →
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Revitalising Survey Engagement: Strategies to Tackle Low Response Rates
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Kizzy Gandy

Surveys are an excellent data collection tool when they reach their target response rate, but low response rates hinder the generalisability and reliability of the findings.

This Ignite presentation will discuss techniques Verian evaluators have applied to increase survey response rates while also assessing the efficacy and efficiency of these techniques. We will also explore other evidence-based strategies for boosting response rates and the value of drawing on other data sources if your response rates are still low.
Speakers

Hannah Nguyen

Economist, Verian Group
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Cultivating Equity: A Roadmap for New and Student Evaluators' Journeys
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
Authors: Ayesha Boyce (Arizona State University), Aileen Reid (UNC Greensboro, US)

Evaluation can be positioned as a social, cultural, and political force to address issues of inequity. We co-direct a 'lab' that provides new evaluators with hands-on applied research and evaluation experience to support their professional development. We are proud of our social justice commitments, and they show up in all aspects of our work. We believe the next generation of evaluators must be trained and mentored in high-quality technical, strengths-based, interpersonal, contextual, social justice-oriented, and values-engaged evaluation. We have found that novice evaluators are able to engage with culturally responsive approaches to evaluation at the conceptual level but have difficulty translating theoretical constructs into practice.

This paper presentation builds upon our experiences and previous work introducing a framework for teaching culturally responsive approaches to evaluation (Boyce & Chouinard, 2017) and a non-course-based, real-world-focused, adaptable training model (Reid, Boyce, et al., 2023). We will discuss how we have taught new evaluators three formal and informal methodologies that have helped them align their values with praxis. Drawing from our work across multiple United States National Science Foundation-funded projects, we will overview how the incorporation of photovoice methodology, just-in-time feedback, and reflective practice has supported our commitments to meaningfully and respectfully attend to issues of culture, race, diversity, power, inclusion, and equity in evaluation.

We will also discuss our thoughts on the implications of globalization, Artificial Intelligence, and shifting politics on evaluation capacity building and the training of new evaluators.

Chair

Nick Field

Director (Public Sector), Urbis
Nick has twenty years of public sector consulting experience, backed more recently by six years as a Chief Operating Officer in the Victorian Public Sector. A specialist generalist in a broad range of professional advisory services, Nick has expertise in the implementation of state-wide... Read More →
Speakers

Ayesha Boyce

Associate Professor, Arizona State University
Ayesha Boyce is an associate professor in the Division of Educational Leadership and Innovation at Arizona State University. Her research career began with earning a B.S. in psychology from Arizona State University, an M.A. in research psychology from California State University... Read More →

Aileen M. Reid

Assistant Professor, UNC Greensboro
Dr. Aileen Reid is an Assistant Professor of Educational Research Methodology in the Information, Library and Research Sciences department and a Senior Fellow in the Office of Assessment, Evaluation, and Research Services (OAERS) at UNC Greensboro. Dr. Reid has expertise in culturally... Read More →
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Navigating a path to system impact: designing a strategic impact evaluation of education programs
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
104
Authors: Amanda Reeves (Victorian Department of Education), Rhiannon Birch (Victorian Department of Education, AU), Eunice Sotelo (Victorian Department of Education, AU)

To provide insight to complex policy problems, evaluations need to adopt a systems perspective and examine the structures, relationships and contexts that influence program outcomes.

This paper outlines the design of a 4-year strategic evaluation that seeks to understand how a portfolio of over 25 education programs is interacting and collectively contributing to system-level outcomes. In this context, policy makers require evaluation to look beyond the boundaries of individual programs and assess the holistic impact of this investment, to inform where and how resources can be directed to maximise system outcomes.

The strategic evaluation presented is theory-based and multi-layered, using logic modelling to identify outcomes at the program, cluster and system level and draw linkages to develop a causal pathway to impact. The strategic evaluation and the evaluations of individual education programs are being designed together to build in common measures to enable meta-analysis and synthesis of evidence to assess system-level outcomes. The design process has been broad and encompassing, considering a diverse range of methods to understand impact, including quantitative scenario modelling and value for money analysis.

The authors will describe how the strategic evaluation has been designed to respond to system complexity and add value. The evaluation adopts an approach that is:
• interdisciplinary, drawing on a range of theory and methods to examine underlying drivers, system structures, contextual factors and program impacts
• collaborative, using expertise of both internal and external evaluators, to design evaluations that are aligned and can tell a story of impact at the system-level
• exploratory, embracing a learning mindset to test and adapt evaluation activities over time.

This paper will be valuable for anyone who is interested in approaches to evaluating the relative and collective contribution of multiple programs and detecting their effects at the system level to inform strategic decision-making.
Chair

Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health... Read More →
Speakers
Amanda Reeves

Principal Evaluation Officer, Victorian Department of Education
Amanda is an evaluation specialist with over 12 years' experience leading evaluation projects in government, not-for-profit organisations and as a private consultant. She has worked across a range of issues and sectors including in education, youth mental health, industry policy and...
Eunice Sotelo

Senior Evaluation & Research Officer, Department of Education (Victoria)
I'm here for evaluation but passionate about so many other things - education (as a former classroom teacher); language, neuroscience and early years development (recently became a mom so my theory reading at the moment is on these topics); outdoors and travel. Workwise, I'm wrangling...
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

When "parachuting in" is not an option: Exploring value with integrity across languages, continents and time zones
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
106
Authors: Julian King (Julian King & Associates), Adrian Field (Dovetail)

The rapid growth of video-conferencing technology has increased the ability for evaluations to be conducted across multiple countries and time zones. People are increasingly used to meeting and working entirely online, and evaluations can in principle be designed and delivered without the need for face-to-face engagement. AI translation software can even break through language barriers, providing further efficiencies and enabling evaluation funds to be directed more to design, data gathering and analysis.

Yet the efficiency of delivery should not compromise the integrity with which an evaluation is conducted. This is particularly true where different dimensions of equity come into question, and where an evaluation is conducted in two or more languages: design and delivery must be meaningful and accessible to all participants, not just the funder.

The growth of remote evaluation work presents a very real, and arguably even more pressing, danger of the consultant "parachuting in" and offering solutions that have little or no relevance to the communities at the centre of the evaluation process.

In this presentation we explore the wayfinding process in designing and implementing a Value for Investment evaluation of an urban initiative focusing on the developmental needs of young children in Jundiaí, Brazil. We discuss the challenges and opportunities presented by a largely (but ultimately not entirely) online format, in leading a rigorously collaborative evaluation process and gathering data in a way that ensures all stakeholder perspectives are appropriately reflected. We discuss the trade-offs involved in this process, the reflections of evaluation participants, and the value of ensuring that underlying principles of collaborative and cross-cultural engagement are adhered to.

Chair
Melinda Mann

Academic Lead Jilbay First Nations RHD Academy, CQUniversity
Melinda Mann is a Darumbal and South Sea Islander woman based in Rockhampton, Qld. Her work focuses on Indigenous Nation building, Pacific sovereignties, and regional and rural communities. Melinda has a background in student services, learning design, school and tertiary education...
Speakers
Julian King

Director, Julian King & Associates
I’m an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I’m affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/...
Adrian Field

Director, Dovetail
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social...
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Wayfinding for integrating social justice and culturally responsive and equitable evaluation practices in meta-evaluation: Learning from the UN evaluation quality assessments.
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
Authors: Sandra Ayoo (University of North Carolina Greensboro)

With quality, in the sense of improving interventions that enhance societal and planetary wellbeing, as the desired destination of evaluation, it is imperative that evaluators reflect on the meaning of quality and on methods to assess whether evaluation is achieving it. Meta-evaluation, a term coined by Michael Scriven in 1969, is the evaluation of evaluations; it aids in understanding how evaluations contribute to addressing structural and systemic problems in interventions and evaluation practice. Meta-evaluation has evolved over the past five decades and is included in the program standards of major professional associations. While the field of evaluation is confronted with major concerns regarding the centrality of social justice, there are currently no one-size-fits-all guidelines for meta-evaluation or for addressing social justice in evaluations. To address this, we reviewed the meta-evaluation literature, mapped the American Evaluation Association's foundational documents against the United Nations Evaluation Group's Norms and Standards to explore their intersection on social justice, and analyzed 62 United Nations Population Fund evaluation reports alongside their management responses. The study findings indicated that meta-evaluation is contingent on context rather than established standards. Thus, it is crucial for evaluators to intentionally prioritize social justice in evaluation design and implementation and to select quality assurance tools that match the evaluation context and professional association guidelines to improve the quality of the intervention. I will share key characteristics of the United Nations Evaluation Group's Norms and Standards on social justice to stimulate discussions on evaluators' efforts to address systemic issues.
Collectively, participants will benefit from discussing and reflecting on their own practice by responding to questions such as: (a) what are examples of their work in collaborative and systems-informed ways to intentionally include social justice in their evaluations, and (b) what should the field of evaluation do to ensure that evaluations add value for people and planet?
Chair
Carlos Rodriguez

Senior Manager Strategy & Evaluation, Department of Energy Environment and Climate Action
Speakers
Sandra Ayoo

Assistant Professor, University of North Carolina Greensboro
Dr. Ayoo is an Assistant Professor of Educational Research Methodology in the Department of Information, Library, and Research Science at the School of Education, University of North Carolina Greensboro. She teaches graduate courses in program evaluation and research methodology...
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Uncovering Hidden STEM Footprints: Leveraging Output Data from Questacon’s Outreach Programs
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
104
Authors: Jake Clark (Questacon - The National Science and Technology Centre), Jenny Booth (Questacon - The National Science and Technology Centre, AU), Sharon Smith (Questacon - The National Science and Technology Centre, AU), Nick Phillis (Questacon - The National Science and Technology Centre, AU)

Join our Monitoring and Evaluation team on an exciting journey as we delve into the untapped potential of output data in evaluation and how to reach beyond the 'low-hanging fruit'.

Notwithstanding the importance of evaluating outcomes to measure program success, monitoring the implementation and reach of initiatives is fundamental to good program management and evaluation. Output data on activity reach, target groups and participants often hold hidden gems of potential that are frequently overlooked. In this presentation we shine a spotlight on their significance and offer actionable tips to elevate monitoring data.

Our objective is to make this exploration enjoyable and enlightening, especially for foundational to intermediate level evaluators. We offer practical and universally applicable strategies for making the most of output data to enhance program insights.

KEY MESSAGES

Using existing tools and tapping into open-source data sets, you can create powerful visualisations and draw deeper inferences about your program's reach and participants.

I. Understanding equity and inclusion
• A better understanding of who is and isn't involved in your initiative.
• Looking for patterns using socio-demographic variables.
• Benchmarking your initiative against relevant population data.

II. Connecting outputs to outcomes
• Analysing participant characteristics and program journeys to illuminate differences in outcomes.
• Uncovering program and policy questions that need further exploration.

Design of the Session: Drawing tangible examples from the education and informal learning STEM sector, we bridge the gap between theory and practice. Real-world strategies are shared to encourage active participation along with useful resource links.
Chair Speakers
Jake Clark

Senior Monitoring and Evaluation Officer, Questacon - National Science and Technology Centre
What value does STEM outreach bring to an individual? How does it change someone's attitude/behaviour/disposition around STEM? And how do you quantify such probing questions? These are the types of queries I'm answering in my Senior Evaluation Officer role at Australia's National...
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

5:00pm AEST

Failing your way to better practice: How to tread carefully when things aren't going as planned
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
105
Authors: Stephanie White (Victoria Department of Education )

Evaluators can fail in many ways. The consequences of these failures can be relatively contained or wide-ranging within the evaluation, and can also flow on to program operations. But failure is a part of life and can be a useful catalyst for professional growth. What happens when you find yourself failing and can see the risks ahead? How do you keep going?

The session focuses on the experiences of an emerging evaluator who failed while leading a large-scale education evaluation. When some elements of the evaluation became untenable, they struggled to find the right path forward and could foresee the risks materialising if the situation wasn’t addressed. On the other side of it, they reflect on how they drew on tools in every evaluator’s toolkit to start remedying their previous inaction and missteps to get the evaluation back on track…and improve their practice along the way!

This session is relevant to any evaluator who grapples with the messiness of expectations and reality in their practice.


Chair
Marwan El Hassan

Director, Future Drought Fund Program Evaluation and Support, Department of Agriculture, Fisheries and Forestry
I am the director of the Program Evaluation and Support team at the Future Drought Fund (FDF). My team is responsible for supporting the FDF's program areas in their monitoring, evaluation and learning work, and for ensuring alignment of our MEL work with other areas around the department...
Speakers
Stephanie White

Victoria Department of Education
I found my way to evaluation to help me answer questions about education program quality and success. Professionally, I have diverse experiences in education and evaluation, from delivering playgroups under trees in the NT to reports on educator resources to senior education bureaucrats...
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Thursday, September 19
 

10:30am AEST

Evaluating capacity building for sustainability scientists: Pathways for early career researchers
Thursday September 19, 2024 10:30am - 11:00am AEST
104
Title: Evaluating sustainability science capacity building: Pathways for early career researchers

Author/s: Lisa Walker (CSIRO)

In the quest for sustainable solutions to pressing global challenges, transdisciplinary approaches that integrate insights and methods from various fields are increasingly being recognised as key to driving change. This presentation will share insights from an evaluation of a five-year, $18 million sustainability science research program seeking not only to drive innovation but also to empower the next cohort of early career researchers (ECRs) to coordinate action across research, policy and practice to address complex sustainability problems.
Our formative, mixed-methods evaluation highlighted on-the-job learning, face-to-face engagement and networking as pivotal in building sustainability science capacity. We also found targeted recruitment, research team engagement and the provision of support and resources to supervisors to be essential, and sometimes overlooked, components. This work contributes to the broader discussion on how evaluation can enhance the development of sustainability science, proposing a framework that emphasises the individual, team and institutional support mechanisms necessary for effective ECR capacity building.
Novelty in our approach lies in the integration of evaluative practices within the capacity-building process, offering a reflective lens on how transdisciplinary endeavours can be optimised to address sustainability challenges. This is particularly relevant for evaluators wanting to build their own skills, or those of others, to engage on complex sustainability issues. The study also underscores the significance of adaptive learning and evaluation in navigating the complexities of sustainability science, inviting a broader conversation on how evaluation can be leveraged to facilitate meaningful contributions to societal and planetary well-being.
Chair
Su-Ann Drew

Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for...
Speakers
Lisa Walker

CSIRO
I am a social scientist with a background in program monitoring, evaluation and sustainable development. I am currently working with CSIRO's Valuing Sustainability Future Science Platform (VS FSP) and manage the Monitoring, Evaluation, Learning and Research project within the VS...
Thursday September 19, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Navigating organisational turbulence: An evaluation-based strategic learning model for organisational sustainability
Thursday September 19, 2024 10:30am - 11:00am AEST
103
Authors: Shefton Parker, Monash University; Amanda Sampson, Monash University

Increasingly turbulent and rapidly changing global operating environments are disrupting institutions' plan implementation and strategy realisation. The session introduces a novel organisational collaborative strategic learning and effectiveness model, intended to bolster organisational resilience amidst such turbulence.
A scarcity of suitable organisational strategic learning systems-thinking models that utilise evaluation methodology in a joined-up way prompted the presenters to develop a model. The model is tailored for strategic implementation in a complex organisational system environment, operating across decentralised portfolios with multiple planning and operational layers. The model amalgamates evaluation methodologies to identify, capture, share and respond to strategic learning in a complex system. It is hypothesised that the model will outperform conventional organisational performance-based reporting systems in terms of organisational responsiveness, agility, adaptability, collaboration, and strategic effectiveness.
The presentation highlights the potential value of integrating and embedding evaluation approaches into an organisation's strategy, governance and operations using a three-pronged approach:
- Sensing: Gathering relevant, useful timely data (learning);
- Making sense: Analysing and contextualising learning data alongside other relevant data (institutional performance data, emerging trends, policy, and legislative reform etc); and
- Good sense decisions: Providing timely and relevant evaluative intelligence and insights to support evidence based good decision making.
The presenters advocate for a shift from viewing evaluation use as a 'nice to have' to a 'must have' aspect of organisational growth and sustainability. The model aims to foster a leadership culture where decision makers value the insights that contextualised holistic organisational intelligence can provide for:

i) Strategic planning: Enhanced planning and strategic alignment across portfolios;

ii) Operational efficiency: Reducing duplication in strategic effort and better collaboration towards strategic outcomes;

iii) Business resilience and sustainability: Improved identification and quicker response to emerging opportunities and challenges; and

iv) Strategic effectiveness: Informing activity adaptation recommendations for strategic goal realisation.
Chair
Michael Amon

Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Speakers
Shefton Parker

Senior Evidence & Evaluation Adviser, Monash University - Institutional Planning
Dr Shefton Parker is an evaluator and researcher with over 15 years of specialist experience in program and systems evaluation within the Vocational and Higher Education sectors. Recently, his evaluations of innovative education programs were referenced as evidence in the University...
Amanda Sampson

Senior Manager, Institutional Planning, Monash University
I am leading the development and implementation of an Institutional Evaluation Model for a complex organisation, supporting organisational resilience, strategic adaptation and execution to realise the 10-year organisational strategic objectives. I am interested in learning how to...
Thursday September 19, 2024 10:30am - 11:00am AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Navigating the maze of causality: Understanding the relationship between carcinogenic betel nut consumption and learning outcomes
Thursday September 19, 2024 10:30am - 11:30am AEST
106
Authors: Kabira Namit (Abt Global), Kathryn Lee (Abt Global, AU)

This hands-on session is designed to strengthen emerging evaluators' intuition about causality in non-experimental evaluations.

In environments where conventional RCTs are unethical or unfeasible, identifying causal factors within the complex weave of societal factors and individual behaviours presents a significant challenge. Centred on a novel research project from Papua New Guinea, this session navigates this maze through real-world research (exploring the intricate relationship between the consumption of carcinogenic betel nut and its impact on educational outcomes). By focusing on this specific case study, we provide a concrete context for participants to understand the broader implications of causal explorations in fragile and sensitive settings.

Participants will actively engage in small group discussions in a collaborative learning environment where they can practice and refine their skills in causal evaluation by discussing scenarios that are reflective of real-world complexities.

This session aims to move beyond simply documenting correlations, encouraging a deep dive into the underlying dynamics of causal linkages. Through this exploration, we aim to eventually guide participants to discussions on pathways for targeted interventions and policy formulations which take causal chains into account.

Additionally, we aim to spark dialogue on the ethical dimensions of 'activist research,' exploring how evaluators can navigate moral dilemmas while advocating for meaningful change. This hands-on session not only seeks to build evaluative skills but also to inspire participants to consider the broader implications of their work on societal well-being and ethical research practices.
Chair
Mary Ann Wong

Research Specialist, California State University, Sacramento
Speakers
Thursday September 19, 2024 10:30am - 11:30am AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Culturally inclusive evaluation with culturally and linguistically diverse communities in Australia
Thursday September 19, 2024 11:00am - 11:30am AEST
Authors: Lena Etuk (CIRCA Research, AU)

In this presentation we will outline an approach to culturally inclusive evaluation with people from culturally and linguistically diverse backgrounds in Australia, its strengths, and its growth opportunities. This approach fills a critical gap in the way evaluation and research with culturally and linguistically diverse communities is traditionally conducted in Australia.

In this presentation we will explain how the Cultural & Indigenous Research Centre Australia (CIRCA) conducts in-culture and in-language evaluation with diverse cohorts of Australians, and how this practice fits within the broader methodological discourse in evaluation and social science more broadly. We will illustrate how our culturally inclusive methodology is put into practice with findings from CIRCA's own internal research into the way cultural considerations shape our data collection process. We will conclude with reflections on how CIRCA might further draw on and leverage standpoint theory and culturally responsive evaluation as this practice is further refined.

Our key argument is that doing culturally inclusive evaluation is a process that requires reflexivity and learning, alongside strong and transparent institutional processes. Combining these approaches creates systemic ways of acknowledging and working within stratified and unequal social systems, inherent to any research. Our findings will advance knowledge within the field of evaluation about how to engage and represent culturally and linguistically diverse community members across Australia.
Chair Speakers
Lena Etuk

Director, Research & Evaluation, Culturally Inclusive Research Centre Australia
I’m an applied Sociologist with 18+ years of experience in evaluation and social research. At CIRCA I lead an amazing team of research consultants from a huge range of diverse backgrounds. We specialise in qualitative evaluation and research with non-English speaking CALD and Aboriginal...
Thursday September 19, 2024 11:00am - 11:30am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Bringing the "human" into measurement: From in-depth inquiry to systemic change
Thursday September 19, 2024 11:30am - 12:00pm AEST
104
Authors: Julia Suh (Tobias)

Humans are complex and diverse. To create social change, what do we need to understand about them?

Their behaviours and mindsets are key, but the broader context and systems they operate within paint a fuller picture of the multiple moving parts that need to change simultaneously for sustained impact. These changes can be mapped, with embedded evaluative thinking, building a pathway for formal evaluation.

In this session, experts in Human-Centred Design and social change share their innovative approaches to thinking beyond project- or program-level goals and organisational-level performance indicators. Examples are drawn from direct experience working across various transformation projects, from reducing child sexual exploitation and preventing academic misconduct to improving the care economy and elevating patient outcomes. They demonstrate not only how program goals and a social change vision can be realised together, but also how a combination of strategic prioritisation, collaboration capability building and networks can accelerate the process.
Chair
Su-Ann Drew

Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for...
Speakers
Julia Suh

Principal, Tobias

Jessica Leefe

Principal, Tobias
Thursday September 19, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

A new tool for participatory evaluation: A case study of the process of conducting online workshops with young creators with disabilities to tell stories using virtual reality animation
Thursday September 19, 2024 11:30am - 12:30pm AEST
103
Authors: Samantha Abbato (Visual Insights People), Lisa Stafford (University of Tasmania, AU)

Researchers from fields including public health, community, and disability have been utilising film methods such as participatory video and digital storytelling for decades. Co-creation of film narratives for evaluation can engage participants as unique people with lived experiences connected to social and cultural worlds and relationships, including their connection with the community. Evaluation, however, has been reluctant to adopt participatory film methods.

Virtual Reality (VR) animation presents a distinctly participatory approach to evaluation data collection and a new lens for communicating findings. It places the participant in the driver's seat and the evaluation audience in the passenger seat alongside them. VR stories can increase the potential for the intended audience, including decision-makers, to engage deeply with the information communicated, through focused immersion in participant stories and the familiarity of local settings.

We present a case study examining the process of collaborating with young people with disabilities to tell their stories of inclusion in Tasmania, Australia. Three young people participated in online storyboarding and script-writing workshops over twelve months to develop short stories of everyday experiences in their community. An introduction to the participants and their stories, the three completed stories, and a collaborative call to action were made into a set of five connected VR short films. The films were displayed on a website page as a report, and made available for viewing as a complete VR story on a headset.

The presenters examine the process of applying this new VR digital storytelling approach to participatory evaluation. The challenges and benefits of the approach for participants and its impact on the intended audience, including urban planning and design students, are discussed. Using the lessons learned from the case study, recommendations for evaluators considering using participatory digital storytelling and VR animation are made.
Chair
Michael Amon

Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Speakers
Samantha Abbato

Director, Visual Insights People
My twenty-plus years of evaluation experience are built on academic training in qualitative and quantitative disciplines, including mathematics, health science, epidemiology, biostatistics, and medical anthropology. I am passionate about effective communication and evaluation capacity-building...
Thursday September 19, 2024 11:30am - 12:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Optimising Evaluations of Wellbeing Programs in Schools
Thursday September 19, 2024 12:00pm - 12:30pm AEST
105
Authors: Tamara Van Der Zant (Australian Council for Educational Research), Katherine Dix (Australian Council for Educational Research, AU)

In this presentation we will discuss the diverse and flexible data collection methods suitable for program evaluation in the context of schools. We will discuss the types of evidence that can be used to evaluate social and emotional learning programs and wellbeing initiatives, specifically, and considerations when working with educators, children and young people. We will invite all to participate in discussions about challenges to the evaluation of these programs in complex, real-world contexts (including data quality, confounding factors, system requirements, etc.) and propose methods we use to navigate these challenges.

Optimising program evaluation methods is important because of the ever-growing number of wellbeing programs being offered to schools. Accordingly, the need for high quality evaluation to guide funding decisions and use of programs and initiatives to support student and educator wellbeing in schools has never been greater.

By drawing on comprehensive experience in undertaking wellbeing program evaluations, this presentation will share our lessons learnt and recommendations to support evaluators in crafting contextually appropriate evaluations. High-quality program evaluations, often a requirement for ongoing funding, address the growing need for meaningful and accessible evidence currently being sought by schools, educators, funders, and policy decision makers.
Chair
Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Speakers
Tamara Van Der Zant

Research Fellow, Australian Council for Educational Research
Tamara is a Research Fellow in the Educational Monitoring and Research Division at ACER. Prior to this role she completed her PhD in emotion research at the University of Queensland. She brings experience in research design, conducting research with diverse populations, broad data...
Dr Katherine Dix

Principal Research Fellow, School and System Improvement, Australian Council for Educational Research
Dr Katherine Dix is a Principal Research Fellow at ACER, with over 20 years as a program evaluator, educational researcher and Project Director. Dr Dix is the National Project Manager for Australia’s participation in OECD TALIS 2024, and is a leading expert in wellbeing and whole-school...
Thursday September 19, 2024 12:00pm - 12:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Getting to the value add: Timely insights from a realist developmental evaluation
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Authors: Phillip Belling (NSW Department of Education), Liam Downing (NSW Department of Education, AU)

This paper is aimed at early career and experienced evaluators interested in realist evaluation, but with concerns about the time a realist approach might take. The authors respond to this concern with an innovative blending of realist and developmental evaluation. Participants will exit the room with a working understanding of realist developmental evaluation, including its potential for adaptive rigour that meets the needs of policy makers and implementers.

Realist evaluation is theoretically and methodologically robust, delivering crucial insights about how, for whom and why interventions do and don't work (House, 1991; Pawson & Tilley, 1997; Pawson, 2006). It aims to help navigate unfamiliar territory towards our destination by bringing assumptions about how and why change happens out in the open.

But even realism's most enthusiastic practitioners admit it takes time to surface and test program theory (Marchal et al., 2012; van Belle, Westhorp & Marchal, 2021). And evaluation commissioners and other stakeholders have understandable concerns about the timeliness of obtaining actionable findings (Blamey & Mackenzie, 2007; Pedersen & Rieper, 2008).

Developmental evaluation (Patton, 1994, 2011, 2021; Patton, McKegg, & Wehipeihana, 2015) is more about what happens along the way. It appeals because it provides a set of principles for wayfinding in situations of complexity and innovation. Realist and developmental approaches do differ, but do they share some waypoints to reliably unpack perplexing problems of practice?

This paper documents a journey towards coherence and rigour in an evaluation where developmental and realist approaches complement each other, and deliver an evidence base for program or policy decision-making that is not only robust but also timely.

We show that, in complex environments, with programs involving change and social innovation, realist developmental evaluation can meet the needs of an often-varied cast of stakeholders, and can do so at pace, at scale, and economically.
Chair
Vanessa Hood

Associate Director, Rooftop Social
I've been working as a facilitator and evaluator for over 20 years, in a wide range of contexts, including horticulture, sustainability and financial literacy. Duncan Rintoul and I run Rooftop Social, which provides consulting services in evaluation, social research, facilitation... Read More →
Speakers
Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of... Read More →
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Scaling Impact: How Should We Evaluate the Success of a Scaling Journey?
Thursday September 19, 2024 1:30pm - 2:30pm AEST
106
Authors: John Gargani (Gargani + Co)

The world has never faced larger problems—climate change, refugee crises, and COVID-19, to name just three. And organizations have responded by scaling solutions to unprecedented size—sustainable development goals, global refugee policies, and universal vaccination programs. But scaling is a journey to a destination imperfectly imagined at the outset and difficult to recognize upon arrival. At what point is scaling a program, policy, or product successful? Under what conditions should scaling stop? Or "descaling" begin? Robert McLean and I posed these and other questions to innovators in the Global South and shared what we learned in our recent book *Scaling Impact: Innovation for the Public Good*. In this session, we outline the book's four research-based scaling principles—justification, optimal scale, coordination, and dynamic evaluation. Then we discuss how to (1) define success as achieving impact at optimal scale, (2) choose a scaling strategy best suited to achieve success, and (3) judge success with dynamic evaluation. My presentation goes beyond the book, reflecting our most current thinking and research, and I provide participants with access to free resources, including electronic copies of the book.
Chair
Carolyn Wallace

Manager Research and Impact, VicHealth
Carolyn is an established leader in health and community services with over 22 years of experience across regional Victoria, Melbourne, and Ireland. She has held roles including CEO, executive director, policy officer, and researcher, specialising in community wellbeing and social... Read More →
Speakers
John Gargani

President (former President of the American Evaluation Association), Gargani + Company
Dr John Gargani is an evaluator with 30 years of experience and eclectic interests. He is President of the evaluation consulting firm Gargani + Company, served as President of the American Evaluation Association in 2016, coauthored the book Scaling Impact: Innovation for the Public... Read More →
Thursday September 19, 2024 1:30pm - 2:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

The Art of Qualitative Sensemaking: Exploring New Methods
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105
Authors: Alli Burness (Tetra Tech), Sharon Marra-Brown (ARTD, AU), Matt Healey (First Person Consulting, AU), Monica Wabuke (Tetra Tech, FJ)

Sensemaking is the process of making meaning and distilling the signal from the noise in primary research. Inclusive and transparent sensemaking maintains the critical link between evidence and insights, ensures evidence is interpreted correctly, and ensures the views of participants are accurately represented. Using intentional sensemaking approaches with integrity can ensure transparency and logical rigour in an evaluation or research project.

Despite its critical nature, sensemaking can often be the most opaque step in an evaluation process. While replication is a hallmark of good sensemaking, especially in academia, this is not always feasible in the fast-paced world of evaluation. The time required to do sensemaking well, the importance of applying the correct approaches and engaging the correct parties, and the critical role of a lead facilitator can be overlooked or underestimated. By shining a spotlight on this step in an evaluation, this session will highlight inclusive and accessible sensemaking approaches used across the design and evaluation spectrum to identify new or emergent approaches. It will pay particular attention to sensemaking when working in complex systems.

Panellists bring deep experience in evaluation or design research in Australian or international consulting settings. They will touch on what sensemaking approaches can be used to maintain integrity through a rapid or agile sensemaking process common in large or complex evaluations; popular sensemaking processes for coding data and new or emerging methods; and how insights or recommendations emerge from the sensemaking process. The moderator will start the panel by reflecting on the definitions, understanding and application of sensemaking, with an emphasis on inclusive and accessible aspects. Our presenters will then explore methods through this same lens and with a focus on emergent or new approaches. Methods will be presented in a manner that audience members can learn and apply.
Chair
Janet Conte

Principal Evaluation Officer, DPIRD
I live in Perth (Boorloo) and have 3 children. I really enjoy being a co-convenor of the WA branch of the AES with Lisette Kaleveld. I'm interested in learning more about systems evaluation and building an evaluation culture.
Speakers
Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have: Co-founded and grown an organisation (First Person Consulting) to a team of 16 people working... Read More →
Sharon Marra-Brown

Director, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence. Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade plus in the public service or how I weave... Read More →
Monica Wabuke

Associate Director - Research, Monitoring and Evaluation Practice, Tetra Tech International Development - Asia Pacific
Monica Wabuke is an Associate Director within Tetra Tech’s Research, Monitoring and Evaluation Practice (RME). She brings 14 years of experience in design, monitoring and evaluation and has provided technical support to DFAT, MFAT, EU, USAID and World Bank-funded projects and programs... Read More →
Alli Burness

Director, Australian Consulting, Tetra Tech
Alli is an Australian strategic designer and researcher with settler heritage, born and living on Bunurong Country. As Director of the Australian Consulting Practice at Tetra Tech International Development, Alli works with a First Nations team to support relational approaches across... Read More →
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

A long road ahead: Evaluating long-term change in complex policy areas. A case study of school active travel programs in the ACT
Thursday September 19, 2024 2:30pm - 3:00pm AEST
106
Authors: Mallory Notting (First Person Consulting)

The ACT Government implemented a suite of programs over the ten-year period between 2012 and 2022 aiming to increase the rates of students actively travelling to and from school. 102 schools in the ACT participated in at least one of the three programs during this time. The programs targeted well-known barriers to active travel, including parental perceptions of safety and of infrastructure around schools, and were intended to contribute towards a range of broader priorities, including health, safety, and environmental outcomes.

This short-paper session will share learnings from evaluating long-term behaviour change at a population level, based on the school active travel evaluation. The evaluation represents a unique case study, as the evaluators needed to look retrospectively over ten years of program delivery and assess whether the combination of programs had created changes within the system and had resulted in the achievement of wider goals.

The presenter will illustrate that the line between short-term and long-term outcomes is rarely linear or clear, as is the relationship between individual interventions and whole of system change. This will be done by summarising the approach taken for the evaluation and sharing the diversity of information collated for analysis, which included individual program data and attitudinal and infrastructure-level data spanning the whole school environment.

Evaluators are often only able to examine the shorter term outcomes of an intervention, even in complex policy areas, and then rely on a theory of change to illustrate the assumed intended wider impacts. The presenter was able to scrutinise these wider impacts during the active travel evaluation, an opportunity not regularly afforded to evaluators. The lessons from the active travel evaluation are therefore pertinent for other evaluations in complex policy areas and may carry implications for program design as the focus shifts increasingly towards population-level, systems change.

Chair
Carolyn Wallace

Manager Research and Impact, VicHealth
Carolyn is an established leader in health and community services with over 22 years of experience across regional Victoria, Melbourne, and Ireland. She has held roles including CEO, executive director, policy officer, and researcher, specialising in community wellbeing and social... Read More →
Speakers
Mallory Notting

Principal Consultant, First Person Consulting
Mallory is a Principal Consultant at First Person Consulting. She manages and contributes to projects primarily in the area of cultural wellbeing, social inclusion, mental health, and public health and health promotion. In 2023, Mallory was the recipient of the Australian Evaluation... Read More →
Thursday September 19, 2024 2:30pm - 3:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Charting the Course: Measuring Organisational Evaluation Capacity Building
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Rochelle Tobin (Curtin University)

Measuring evaluation capacity building is complex, and there are few examples of quantitative measurement tools to enable evaluators to chart progress. WAAC (WA AIDS Council) and Curtin established a five-year partnership to build evaluation capacity within WAAC. To measure progress, a validated tool (Schwarzman et al. 2019) to assess organisational evaluation capacity was modified and combined with another partnership-based tool (Tobin et al. in press). The survey was administered to WAAC staff at baseline (n = 17) and then one year after the partnership was established (n = 19). Significant improvements were seen in individual skills for evaluation tasks, tools for evaluation and evaluation systems and structures. These tools provide a rigorous approach to tracking progress towards organisational evaluation capacity.
Chair
Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
Rochelle Tobin

PhD candidate
I am a PhD candidate investigating SiREN's (Sexual Health and Blood-borne Virus Research and Evaluation Network) influence on research and evaluation practices in the Western Australian sexual health and blood-borne virus sector. I also support SiREN's knowledge translation activities... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Journey Mapping: Visualising Competing Needs within Evaluations
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Jolenna Deo (Allen and Clarke Consulting)

Journey mapping acts as a GPS for understanding audience or consumer experience when evaluating policies or programs, highlighting twists, hidden gems, and pitfalls. It can be a useful tool to help evaluators capture disparities and competing needs among intended demographics. This session will discuss the journey mapping method, drawing from an evaluation of a Community Capacity Building Program which used journey mapping to illustrate key consumer personas. It will explore the integration of multiple data sources to provide a comprehensive understanding of complex disparities and the cultural and historical contexts in which these arise.
Chair
Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
Jolenna Deo

Consultant, Allen and Clarke Consulting
Jolénna is a consultant at Allen + Clarke Consulting. She is a proud Mirriam Mer, Pasifika woman with a background in Development studies, Pacific studies and social policy, combining her interests in indigenous methodologies and social justice. She is experienced in community and... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Reflections by a non-analyst on the use of state-wide data sets and modelled data in evaluation
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Gabby Lindsay-Smith 

Using linked government data sets provides an opportunity to investigate the impact of state-wide programs and policies, but these data are often out of reach for many evaluators, especially non-analysts. This presentation will detail a non-analyst's experience incorporating state linked data sets into a recent evaluation of a Victoria-wide family services program. The presentation will outline tips and tricks for those who may consider incorporating government-level linked data or simulation models into large program or policy evaluations in the future. It will cover areas such as where to begin, navigating the data, and key tips for working with analysts.
Chair
Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

The evolution of evaluation: Retracing our steps in evaluation theory to prepare for the future
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: James Ong (University of Melbourne)

As new people enter the evaluation field and as evaluation marches forward into the future, it is important to learn from the evaluation theorists who have come before us. My Ignite presentation will argue that modern evaluation is built on evaluation theory, and make the call for evaluators of all levels to learn evaluation theory to:
  1. Appreciate how evaluation has evolved;
  2. Strengthen their evaluation practice; and
  3. Navigate themselves around an ever-changing evaluation landscape.
Chair
Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers
James Ong

Research Assistant (Evaluations), University of Melbourne
My name is James Ong. I am an Autistic program evaluator working at the University of Melbourne, where I work on evaluation and implementation projects in various public health projects such as the AusPathoGen program and the SPARK initiative. I not only have a strong theoretical... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Friday, September 20
 

10:30am AEST

Following the (matched) data to understand impact: adventures in quasi-experimental evaluation
Friday September 20, 2024 10:30am - 11:00am AEST
Authors: Mohib Iqbal (Department of Education), Kira Duggan (Department of Education, AU), Ben McNally (Department of Education, AU)

This presentation will showcase the use of quasi-experimental impact evaluation and of a relatively new data linkage capability within the Victorian public sector.
Impact evaluation provides important evidence on program effectiveness and helps to inform government investment decisions. Quasi-experimental design identifies a comparison group that is similar to the treatment group/program participants in terms of baseline or pre-intervention characteristics. Statistical methods such as propensity score matching and regression discontinuity can create valid comparison groups with a reduced risk of bias (White & Sabarwal, 2014).
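The matching idea behind such designs can be sketched in a few lines. This is a minimal, hypothetical illustration only — all names and numbers are invented, and real pipelines typically estimate a propensity score (e.g. via logistic regression) and match on that score rather than matching directly on raw covariates as done here:

```python
# Illustrative sketch (not the Department's actual pipeline): build a
# comparison group by nearest-neighbour matching on baseline characteristics.

def nearest_neighbour_match(treated, pool, keys):
    """For each treated unit, pick the most similar untreated unit
    (Euclidean distance on the listed baseline keys), without replacement."""
    available = list(pool)
    matches = []
    for t in treated:
        best = min(available,
                   key=lambda c: sum((t[k] - c[k]) ** 2 for k in keys))
        matches.append((t, best))
        available.remove(best)  # match without replacement
    return matches

# Hypothetical baseline data: prior attainment score and attendance rate.
treated = [{"id": 1, "score": 62, "attend": 0.90},
           {"id": 2, "score": 48, "attend": 0.75}]
pool = [{"id": 10, "score": 80, "attend": 0.95},
        {"id": 11, "score": 60, "attend": 0.88},
        {"id": 12, "score": 50, "attend": 0.74}]

pairs = nearest_neighbour_match(treated, pool, ["score", "attend"])
for t, c in pairs:
    print(t["id"], "matched to", c["id"])  # → 1 matched to 11; 2 matched to 12
```

Outcomes for the matched comparison group then stand in for what would have happened to participants without the program, which is why the quality of the baseline variables used for matching matters so much.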

However, the implementation of this method faces significant technical, data availability, and other challenges.
The Evaluation and Program Impact (EPI) branch at the Victorian Department of Education (DE) used quasi-experimental assessment as part of six different education program evaluations spanning issues from teacher supply to support for vulnerable students. This approach was used to evaluate impact/effectiveness and, through economic evaluation of interventions, to measure avoided costs. The presentation will outline the process of design, methodology and implementation of quasi-experimental methods used as part of these six evaluations.

Key enablers of the use of quasi-experimental designs are data availability and expertise in undertaking advanced quantitative impact evaluations. This presentation will give an overview of the types of departmental data used (such as regularly administered student, parent/carer, teacher and school leader surveys, assessment results such as NAPLAN and administrative data) as well as the relatively new analytical capability available through linked service use data from the Victorian Social Investment Integrated Data Resource (VSIIDR) and Centre for Victorian Data Linkage (CVDL).
The presentation also contextualises quasi-experimental impact evaluations as one component of mixed-method approaches that were staged after evaluation of appropriateness, design and fidelity. Decisions on intervention effectiveness were made using a broader array of evidence, with quasi-experimental impact evaluation as one of many sources.
Chair
Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre... Read More →
Speakers
Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development... Read More →
Mohib Iqbal

Senior Evaluation Officer, Department of Education
I am a multi-disciplinary evaluator and researcher with 15 years of experience across education, health, international development, social protection, and migration sectors. I currently work for the Department of Education in Victoria and have previously worked with the World Bank... Read More →
Ben McNally

Manager, Evaluation and Research, Department of Education, Victoria
I have worked on evaluation and social research projects in consultancy and public sector settings. This has included evaluating reform programs in social services, employment, and school education.Talk to me about:- Evaluation practice in the Victorian Public Sector- In-house evaluation... Read More →
Friday September 20, 2024 10:30am - 11:00am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Walking together: First Nations participation, partnerships and co-creation in Evaluation.
Friday September 20, 2024 10:30am - 11:30am AEST
106
Authors: Tony Kiessler (First Nations Connect), Alice Tamang (First Nations Connect, AU)

Effective First Nations engagement is integral in the design and delivery of culturally safe evaluations. The AES' First Nations Cultural Safety Framework discusses 10 principles for culturally safe evaluation and describes the journey of engagement. However, the question of how to engage effectively can be the first and most significant challenge faced by evaluators. There is little clarity on how to create opportunities for First Nations leadership and voices in our evaluations, how to engage appropriately, and who we should engage with. There is also the challenge of managing tight timeframes, client expectations and capabilities that can limit the focus on meaningful First Nations participation, partnership and co-creation.

This is a unique offering that enables practitioners and First Nations facilitators to walk together, explore shared challenges and identify opportunities to improve First Nations engagement. The session will explore the potential for partnerships in informing and implementing evaluations, opportunities to increase First Nations participation, privilege their experience and knowledge, and how evaluation practitioners can draw on these strengths through co-creation to amplify First Nations voices and leadership in evaluation practice.

This session aims to:
  • Explore a principles-based approach to First Nations engagement;
  • Discuss shared experiences on successful approaches to enhance First Nations partnership, participation and co-creation; and
  • Develop a shared understanding of how to take this knowledge forward through culturally safe evaluation commissioning, practice and reporting.

Discussion will draw on the collective experience of both the attendees and the facilitators, walking together. The sharing of ideas will be encouraged in a safe space that engages the audience in a collaborative dialogue with First Nations practitioners. This dialogue will explore current knowledge, capabilities and gaps, as well as the challenges (and how they can be overcome), as part of the broader journey to culturally safe evaluation practice.


Chair
Rachel George

Director, Research, Monitoring and Evaluation Practice, Tetra Tech International Development
Speakers
Tony Kiessler

Co-Convener, First Nations Connect
Tony is a Central Arrernte man, consultant and researcher living and working on beautiful Gundungurra Country in the NSW Southern Highlands. He is an evaluation, strategic planning and research consultant with a particular interest in health, human development and social inclusion... Read More →
Alice Tamang

Consultant, First Nations Connect
Alice is a Dharug woman based on Wurundjeri Country. She is a consultant and advisor, with a focus on facilitating connections between cultures, empowering individuals and communities to share knowledge and enhance cultural understanding. Alice primarily works on DFAT funded programs... Read More →
Nicole Tujague

Founder and Director, The Seedling Group
Nicole Tujague: Bachelor of Indigenous Studies (Trauma and Healing/Managing Organisations); 1st Class Honours, Indigenous Research; PhD in Indigenous-led Evaluation, Gnibi College, Southern Cross University. Nicole is a descendant of the Kabi Kabi nation from Mt Bauple, Queensland and the... Read More →
Friday September 20, 2024 10:30am - 11:30am AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Reviewing and writing for the Evaluation Journal of Australasia
Friday September 20, 2024 10:30am - 11:30am AEST
103
Authors: John Guenther (Batchelor Institute Of Indigenous Tertiary Education), Anthea Rutter (University of Melbourne, AU), Yvonne Zurynski (Macquarie University, AU)

The Evaluation Journal of Australasia (EJA) supports evaluators who wish to share their knowledge and practical experiences in a peer-reviewed article. Documenting evidence, including for programs which do not achieve expected results, is critical for improving evaluation practice, building the evidence base, and advancing evaluation methodologies that are rigorous and ethical.

The EJA depends on volunteer reviewers who can offer critical feedback on articles that are submitted. Reviewers help to improve the quality of manuscripts the Journal receives.

The focus of this presentation is on how to write a good review: how to be academically critical, while at the same time providing constructive feedback that will benefit authors and readers. The presenters will offer step-by-step advice on what to look for, how to judge the quality of a manuscript, and how to make constructive suggestions for authors to consider.

The presentation will also explain how reviewing fits within the publication process, from submission to production. It will be most helpful to potential authors and current and potential reviewers. Authors will learn how to prepare their articles so they receive a favourable review, and reviewers will receive clear guidance on presenting their review feedback to authors.
Chair
Kate O'Malley

Consultant
I provide targeted policy, advocacy and evaluation support on refugee and migration matters drawing on a lengthy career in the United Nations and the Australian Public Service and post-graduate studies in evaluation and diplomatic practice.
Speakers
John Guenther

Research Leader, Education and Training, Batchelor Institute of Indigenous Tertiary Education
John Guenther is a senior researcher and evaluator with the Batchelor Institute of Indigenous Tertiary Education, based in Darwin. Much of his work has been based in the field of education. He has worked extensively with community-based researchers in many remote parts of the Northern... Read More →
Anthea Rutter

Research Fellow, Centre for Program Evaluation. The University of Melbourne
Anthea Rutter is a Senior Research Fellow in the Assessment and Evaluation Research Centre (formerly the Centre for Program Evaluation) at The University of Melbourne. She has extensive experience working with a wide range of community, state and national organisations. She is particularly... Read More →
Jeff Adams

Managing Editor | Senior Lecturer, Evaluation Journal of Australasia | Eastern Institute of Technology
I am the Managing Editor of the Evaluation Journal of Australasia - talk to me about publishing in, or reviewing for the journal. I also teach postgraduate Health Sciences at Eastern Institute of Technology, Auckland.
Friday September 20, 2024 10:30am - 11:30am AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Participatory Analysis Workshops: a novel method for identifying important factors across diverse projects
Friday September 20, 2024 11:00am - 11:30am AEST
104
Authors: Martina Donkers (Martina Donkers), Ellen Wong (ARTD, AU), Jade Maloney (ARTD, AU)

Some programs comprise a range of diverse projects striving towards a common goal: for example, grant programs, where a wide range of different grantees are carrying out different projects with very different activities in pursuit of the grant program objectives.

These can be a challenge to evaluate - with so many different activities, outputs cannot be easily aggregated, and each project may be responding to its local context in unique but important ways. These programs need efficient ways to identify common factors affecting implementation and outcomes that reflect the richness of the activities undertaken, but do not place undue burden on organisations, particularly those receiving smaller grants.

We developed a novel method that uses participatory workshops to explore commonalities in implementation across projects and the various ways they seek to achieve common program outcomes. The theory-driven method builds on Most Significant Change and Qualitative Comparative Analysis, and combines data collection with collaborative participatory data analysis to build a rich qualitative understanding of projects in a relatively short timeframe with fewer resources. Active participation from project leaders (e.g. grant recipients) builds cohesion across the program, and helps project leaders feel more connected and supported.

This paper outlines the theory, approach and uses of Participatory Analysis Workshops, including strengths and limitations, and the types of data and insights the method can yield. We use our work with the NSW Reconstruction Authority to evaluate the Covid Community Connection and Wellbeing Program to illustrate what we've learnt about how the method works and in what circumstances, and then identify other potential use cases. Participants will have an opportunity to ask questions to help inform future uses of this method. This information will equip evaluators with tools to navigate varying territory together to understand progress toward program outcomes.

Chair
avatar for Phillip Belling

Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of... Read More →
Speakers
avatar for Martina Donkers

Martina Donkers

Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured... Read More →
avatar for Ellen Wong

Ellen Wong

Consultant, ARTD Consultants
I'm a consultant at ARTD with a background in human geography and environmental studies. I bring this lens to the work I do and am particularly passionate about the intersection between people and the environment. My portfolio spans environmental policy, disaster recovery and community... Read More →
avatar for Jade Maloney

Jade Maloney

Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →
Friday September 20, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Value Propositions: Clearing the path from theory of change to rubrics
Friday September 20, 2024 11:00am - 12:30pm AEST
Authors: Julian King (Julian King & Associates Limited), Adrian Field (Dovetail Consulting Limited, NZ)

Evaluation rubrics are increasingly used to help make evaluative reasoning explicit. Rubrics can also be used as wayfinding tools to help stakeholders understand and participate meaningfully in evaluation. Developing rubrics is conceptually challenging work and the search is on for additional navigation tools and models that might help ease the cognitive load.

As a preliminary step toward rubric development it is often helpful to co-create a theory of change, proposing a chain of causality from actions to impacts, documenting a shared understanding of a program, and providing a point of reference for scoping a logical, coherent set of criteria.

However, it's easy to become disoriented when getting from a theory of change to a set of criteria, because the former deals with impact and the latter with value. Implicitly, a theory of change may focus on activities and impacts that people value, but this cannot be taken for granted - and we argue that value should be made more explicit in program theories.

Specifying a program's value proposition can improve wayfinding between a theory of change and a set of criteria, addressing the aspects of performance and value that matter to stakeholders. Defining a value proposition prompts us to think differently about a program. For example, in addition to what's already in the theory of change, we need to consider to whom the program is valuable, in what ways it is valuable, and how the value is created.

In this presentation, we will share what we've learnt about developing and using value propositions. We'll share a simple framework for developing a value proposition and, using roving microphones, engage participants in co-developing a value proposition in real time. We'll conclude the session by sharing some examples of value propositions from recent evaluations.

Chair
LB

Laura Bird

MERL Associate, Paul Ramsay Foundation
Speakers
avatar for Julian King

Julian King

Director, Julian King & Associates
I’m an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I’m affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/... Read More →
avatar for Adrian Field

Adrian Field

Director, Dovetail
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social... Read More →
Friday September 20, 2024 11:00am - 12:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Fidelity to context: A realist perspective on implementation science
Friday September 20, 2024 1:30pm - 2:00pm AEST
105
Authors: Andrew McLachlan (NSW Department of Education)

At first glance, realist methodology seems ideally suited to investigating implementation problems (Dalkin et al., 2021). It is versatile in that it draws on theories from diverse fields of social inquiry. It is pragmatic in that the theories it adopts are deemed useful only in so far as they offer explanatory insight. And it is transferable; realist methodology is less concerned with generalising findings than in understanding how programs work under different conditions and circumstances.

As for implementation science, its founding aim is purpose built for realist work; it seeks to improve the uptake of evidence-based practices by investigating the barriers and facilitators to implementation. Yet despite the affinity between realist methodology and implementation science, so far there have been few attempts to formalise the relationship (Sarkies et al., 2022).

This paper offers insights into how evaluators can harness realist methodology to better understand challenges of program implementation. It demonstrates how implementation concepts like fidelity (the degree to which a program is delivered as intended), adaptation (the process of modifying a program to achieve better fit), and translation (the ability to transfer knowledge across organisational borders) can be combined with realist concepts to develop a more active understanding of context.

In showing how to construct program theories that are responsive to changing conditions, the paper promises to equip evaluators with tools that can help them navigate the complexities of program implementation in their own work.



Speakers
avatar for Andrew McLachlan

Andrew McLachlan

Evaluation Lead - Strategy, NSW Department of Education
Andrew McLachlan is an Evaluation Lead for the NSW Department of Education. Before becoming an evaluator, Andrew had over 10 years of experience as a teacher, working in settings as diverse as far North Queensland and Bangladesh. Since 2021, Andrew has worked as an embedded evaluator... Read More →
Friday September 20, 2024 1:30pm - 2:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Program Evaluation Fundamentals in the NSW Department of Planning, Housing and Infrastructure: An eLearning course on evaluation
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Anabelle (Pin-Ju) Chen (NSW Department of Planning, Housing and Infrastructure)

Introducing Program Evaluation Fundamentals (PEF) in the NSW Department of Planning, Housing and Infrastructure, an eLearning course designed to facilitate a coherent journey of learning within the department. With learning and adapting together in mind, the design of PEF empowers individuals at all levels to navigate the fundamentals of evaluation. Through interactive modules, learners will understand key evaluation concepts and cultivate best practices. PEF promotes transformative growth by emphasising foundational evaluation knowledge. By leveraging PEF, we can shift our approach, embrace innovation, and advance the field of evaluation across the public sector, fostering a supportive community of forward-thinking evaluators.
Chair
avatar for Carina Calzoni

Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and... Read More →
Speakers
avatar for Anabelle (Pin-Ju) Chen

Anabelle (Pin-Ju) Chen

Senior Analyst, Evidence and Evaluation, NSW Department of Planning, Housing and Infrastructure
Anabelle (Pin-Ju) Chen is a distinguished senior analyst hailing from Taiwan, with a global perspective on evaluation, data analysis, and project management. Having studied in Taiwan, the United Kingdom, and Australia, Anabelle brings a diverse range of experiences and insights to... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Putting values on the evaluation journey map
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Samantha Abbato (Visual Insights People)

Values guide all evaluation processes, methods and judgements. Although evaluators are often not aware of the values shaping their work and can't readily name them, they know when they are straying off their values path through the experience of conflict or unease. Reflecting on the evaluation literature and two decades of evaluation practice using a 'values' perspective, it is argued that there has never been a more important time to build values literacy. This presentation demonstrates how values literacy can guide conversations with yourself, your team and others and provide signposting and illumination of a more rigorous and ethical evaluation journey.
Chair
avatar for Carina Calzoni

Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and... Read More →
Speakers
avatar for Samantha Abbato

Samantha Abbato

Director, Visual Insights People
My twenty-plus years of evaluation experience are built on academic training in qualitative and quantitative disciplines, including mathematics, health science, epidemiology, biostatistics, and medical anthropology. I am passionate about effective communication and evaluation capacity-building... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Squaring up with rubrics
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Josh Duyker (Centre for Evaluation and Research Evidence, Victorian Department of Health)

Much like Felix the Cat, evaluators have a bag of tricks to get us out of sticky situations. But when you are staring face to face with a complex evaluand, juggling tricky stakeholders whilst sat on a mountain of data, it's not always clear what 'trick' you need! One twisted potential solution is the colourful, yet humble rubric. In this reflective practice ignite presentation, I will guide you through our journey of using rubrics as a tool to wayfind through an evaluation, and our key takeaways on how rubrics can support evaluators to make comprehensive and balanced evaluative judgements.
Chair
avatar for Carina Calzoni

Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and... Read More →
Speakers
JD

Josh Duyker

Evaluation and Research Officer, Centre for Evaluation and Research Evidence
I am an emerging evaluator, currently working at the Centre for Evaluation and Research Evidence in the Victorian Department of Health. I've completed a Master of Public Health and am embarking on a Masters of Evaluation. Through roles in the not-for-profit sector and my studies... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Voices of the Future: Elevating First Nations Leadership in the Evolution of Educational Excellence
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Skye Trudgett (Kowa), Sharmay Brierley (Kowa, AU)

This ignite presentation will delve into the pioneering evaluation within the education sector, where a series of education initiatives were designed and implemented by Aboriginal Community Controlled Organisations (ACCO's) and mainstream Education partners to uplift and support young First Nations peoples. We will uncover how the initiative's evaluation framework was revolutionarily constructed with First Nations communities at its heart, applying the reimagining evaluation framework, utilising diverse data collection methods and producing Community Reports that reflect First Nations experiences and voices.

Attendees will be guided through the evaluative journey, showcasing the incorporation of wisdom to demonstrate the profound value of community-delivered initiatives that contribute to change. The session will highlight the success stories and learnings, emphasising how this approach not only benefits the current generation but also lays the groundwork for the prosperity of future generations.
Chair
avatar for Carina Calzoni

Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and... Read More →
Speakers
avatar for Sharmay Brierley

Sharmay Brierley

Consultant, Kowa Collaboration
Sharmay is a proud Yuin woman and project lead at Kowa with prior experience supporting First Nations peoples across human services sectors. As a proud First Nations woman, and through lived experience, Sharmay has a strong understanding of the many challenges faced by First Nations... Read More →
avatar for Skye Trudgett

Skye Trudgett

CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

In the spotlight: An arts industry-led approach to evaluation
Friday September 20, 2024 2:00pm - 2:30pm AEST
105
Authors: Kirstin Clements (Arts Centre Melbourne)

How does a creative institution develop an effective evaluation framework that honours the artistic process while meeting rigorous research standards?

At Arts Centre Melbourne we asked ourselves, 'what if?'... What if we integrated the economic story into a fit-for-purpose value proposition? What if we see the emotive, subjective nature of the arts as an asset, rather than a challenge in our evaluation design? What if we tried to embed systems thinking and extend our approach beyond individual projects?

Like many purpose-driven industries, the arts face an increasingly competitive funding landscape and heightened expectations from stakeholders for evidence-based reporting on the value generated by initiatives. Historically, in the arts such reporting has been responsive to external demands and formats. One of our core goals has been to equip the organisation with the capability and capacity to pro-actively drive its own public value narrative through a transparent, consistent approach.

In this presentation, we spotlight Arts Centre Melbourne's innovative approach to building appetite for evaluation and to designing a fit-for-purpose organisational impact model and evaluation function. We offer insights into the conceptual and methodological approaches we've adopted to achieve our objectives: supporting effective advocacy for the public value of the arts, enhancing accountability to stakeholders, and fostering a culture of continuous learning.

In sharing how we have creatively navigated challenges and opportunities at Arts Centre Melbourne, we aim to provide valuable advice and inspiration for evaluators and supporting professionals, particularly those working in sectors where evaluation is yet to be understood as 'business-as-usual' activity.

Speakers
avatar for Kirstin Clements

Kirstin Clements

Partner, Impact and Evaluation, Arts Centre Melbourne
Friday September 20, 2024 2:00pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Introducing a trauma informed AI assessment tool for evaluators of AI (artificial intelligence) assisted programs and services.
Friday September 20, 2024 2:00pm - 2:30pm AEST
104
Authors: Lyndal Sleep (Central Queensland University)

AI and other forms of digital technologies are increasingly used in program and service delivery. They promise increased efficiency, accuracy and objectivity; however, these technologies can also cause significant harm and trauma, as seen in Robodebt. It is vital for evaluators to be aware of key questions to ask to prevent AI having unintended impacts on program processes, outputs and outcomes, or causing harm to service users.

Objective
This session aims to support evaluation of AI assisted programs and services by introducing evaluators to a new and innovative trauma informed AI assessment tool.

Core argument
(1) AI is increasingly being used in programs and services, and understanding the resulting risks is essential for evaluators to assess whether services are meeting intended outcomes.
(2) Many evaluators are unaware of what types of risks to look for when assessing AI-assisted services, or what questions to ask - especially when conducting trauma informed evaluations.
(3) A practical trauma informed AI assessment tool has been developed by researchers from [Universities omitted], with funding from [omitted], to address this need, and will be briefly introduced.

A short paper session will:
(1) Highlight the problem that AI is increasingly being used to assist program and services delivery, but many evaluators are unaware of the main risks to consider when evaluating these services.
(2) Suggest the solution of a practical tool which considers these risks, with technological knowledge and within a trauma informed framework, that can be employed by evaluators.
(3) Introduce a trauma informed AI assessment tool, the method used to develop it, as well as its intended practical use by evaluators (both internal and external to organisations).

There will be 10 minutes for questions and discussion at the end of the presentation.

Chair
avatar for Kira Duggan

Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development... Read More →
Speakers
Friday September 20, 2024 2:00pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia