Conference hashtag #aes24MEL
Education and training
Wednesday, September 18
 

11:00am AEST

Evaluation that adds value for People and Planet: Perspectives, Challenges, and Opportunities for Indigenous Knowledge Systems in Africa.
Wednesday September 18, 2024 11:00am - 11:30am AEST
104
Authors: Awuor Ponge (African Policy Centre (APC))

Indigenous Knowledge Systems (IKS) in Africa have long been marginalized and undervalued, despite their potential to offer sustainable solutions to pressing challenges faced by communities across the continent. This presentation explores the perspectives, challenges, and opportunities for incorporating IKS into evaluation practices that create value for both people and the planet.

From a people-centric perspective, IKS offer a holistic and culturally relevant approach to understanding local contexts, priorities, and value systems. By embracing these knowledge systems, evaluations can better capture the multidimensional nature of well-being, including spiritual, social, and environmental aspects that are often overlooked in conventional evaluation frameworks. However, challenges arise in reconciling IKS with dominant Western paradigms and navigating power dynamics that have historically suppressed indigenous voices.

From a planetary perspective, IKS offer invaluable insights into sustainable resource management, biodiversity conservation, and climate change adaptation strategies that have been honed over generations of lived experiences. Integrating these knowledge systems into evaluation can shed light on the intricate relationships between human activities and ecosystem health, enabling more informed decision-making for environmental sustainability. Nonetheless, challenges exist in bridging the divide between traditional and scientific knowledge systems, as well as addressing concerns around intellectual property rights and benefit-sharing.

This presentation will explore innovative approaches to overcoming these challenges, such as participatory and community-based evaluation methodologies, capacity-building initiatives, and cross-cultural dialogue platforms. By fostering a deeper appreciation and understanding of IKS, evaluation practices can become more inclusive, relevant, and effective in creating value for both people and the planet in the African context.


Chair

Alice Muller

Senior Monitoring & Evaluation Advisor: FMNR Scale Up, World Vision Australia
An environmental scientist, working in international development, interested in evaluation and learning about all things community, trees, ecosystem restoration, climate action, scaling and systems transformation.  I also really like coffee and chatting about gardening, travel and... Read More →
Speakers

Awuor Ponge

Senior Associate Research Fellow, African Policy Centre (APC)
Dr. Awuor Ponge is a Senior Associate Fellow in charge of Research, Policy and Evaluation at the African Policy Centre (APC). He is also the Vice-President of the African Evaluation Association (AfrEA). He holds a Doctor of Philosophy (PhD) Degree in Gender and Development Studies... Read More →
Wednesday September 18, 2024 11:00am - 11:30am AEST
Room 104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

The psychology of evaluation capacity building: Finding the way with the rider, elephant and the pathway
Wednesday September 18, 2024 11:00am - 12:00pm AEST
106
Authors: Samantha Abbato (Visual Insights People)

Evaluation capacity building is increasingly becoming a core part of evaluation practice and a critical part of incorporating evaluation into the everyday activity of organisations (Preskill and Boyle, 2008; White, Percy and Small, 2018). Reaching the point where evaluation becomes the way of doing business requires a change of knowledge, skills, and attitudes.

Changes need to happen at the level of individuals, teams, organisations, and partnerships. This journey requires supporting and managing change to systematic enquiry processes as much as it requires evaluation expertise. In this skill-building session, we introduce Jonathan Haidt's 'rider, elephant and pathway' metaphor as a framework to support change and strengthen evaluation capacity (Haidt, 2018).

Haidt's metaphor for change includes the rider (our rational thinking side) atop an elephant (our emotional side). Behaviour change for individuals and collectives requires steps that (1) support the rider, such as giving clear directions, (2) motivate the elephant by tapping into emotions, and (3) shape a pathway to change, including clearing obstacles. In this interactive session, the facilitator will provide case studies applying Haidt's metaphor, spanning two decades. Through these examples, the power of this framework to support evaluation capacity building is demonstrated. Examples include using Haidt's framework for:
1. Building a Monitoring, Evaluation and Learning (MEL) system with a medium-sized community organisation;
2. Increasing the maturity of MEL in an existing large organisation; and
3. Increasing the impact of evaluation partnerships.

The active skill-building component incorporates:
  • Cartoon elephant, rider and pathway flashcards;
  • A 'snakes and ladders' style game; and
  • Evaluation-specific examples.

The combination of examples and activities is designed to support participant learning. The session will encourage discussion of barriers, enablers and actions to build evaluation capacity relevant to different situations and contexts.

Learning objectives include:
  • Knowledge of a sound and memorable psychological framework for supporting evaluation capacity building;
  • Ability to apply Haidt's metaphor
Chair

Anthea Rutter

Research Fellow, Centre for Program Evaluation, The University of Melbourne
Anthea Rutter is a Senior Research Fellow in the Assessment and Evaluation Research Centre (formerly the Centre for Program Evaluation) at The University of Melbourne. She has extensive experience working with a wide range of community, state and national organisations. She is particularly... Read More →
Speakers

Samantha Abbato

Director, Visual Insights People
My twenty-plus years of evaluation experience are built on academic training in qualitative and quantitative disciplines, including mathematics, health science, epidemiology, biostatistics, and medical anthropology. I am passionate about effective communication and evaluation capacity-building... Read More →
Wednesday September 18, 2024 11:00am - 12:00pm AEST
Room 106, 102 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

From bottlenecks to breakthroughs: Insights from a teacher workforce initiative evaluation
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
104
Authors: Rhiannon Birch (Victorian Department of Education), Hayden Jose (Urbis, AU), Joanna Petkowksi (Victorian Department of Education, AU), Ekin Masters (Victorian Department of Education, AU)

How can an evaluation balance the need to generate pragmatic insights while meeting central agency requirements for rigorous measurement of outcomes? What ingredients can facilitate the effective evaluation of a government initiative and achieve improved outcomes? This paper explores the essential ingredients for evaluating a large-scale government program using an example of a statewide initiative aimed at attracting and retaining suitably qualified teachers in hard-to-staff positions in Victorian government schools.

We showcase how an adaptive and evidence-led method of enquiry helped identify program implementation bottlenecks and probe potentially unintended program outcomes over a three-year evaluation. We discuss enablers for the integration of evaluation recommendations into program implementation and future policy direction, particularly the role of participatory action approaches and deep relationships with policy and implementation teams. We will also present the robust and varied methodology, particularly the novel use of system data to facilitate a quasi-experimental design that aligned with central agency requirements and met stakeholder needs.
This presentation will benefit policymakers, program evaluators, and others interested in evaluating government programs, by sharing key learnings on how evaluations can balance pragmatic insights with central agency requirements and identifying the key elements for influencing such programs and achieving improved outcomes.
Chair

Alice Muller

Senior Monitoring & Evaluation Advisor: FMNR Scale Up, World Vision Australia
An environmental scientist, working in international development, interested in evaluation and learning about all things community, trees, ecosystem restoration, climate action, scaling and systems transformation.  I also really like coffee and chatting about gardening, travel and... Read More →
Speakers

Rhiannon Birch

Senior Evaluation and Research Officer, Department of Education
Rhiannon is a dedicated research and evaluation specialist committed to enhancing health, social, education, and environmental outcomes for people and the planet. With over 10 years of experience in evaluation, she has worked extensively across emergency services, public health, and... Read More →

Hayden Jose

Associate Director, Urbis
Hayden brings 13 years’ experience as an evaluator, applied researcher and policy practitioner with extensive work in complex evaluations in government and not-for-profit settings. Across his roles, he has worked to consider complex system problems and translate evidence effectively... Read More →
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
Room 104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Warlpiri ways of assessing impact - How an Aboriginal community is defining, assessing and taking action for a good life in their community.
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
Authors: Emily Lapinski (Central Land Council), Malkirdi Napaljarri Rose (Centre for Human Security and Social Change, La Trobe University, AU), Glenda Napaljarri Wayne (Central Land Council, AU), Geoffrey Jungarrayi Barnes (Central Land Council, AU), Alex Gyles (Centre for Human Security and Social Change, La Trobe University, AU)

For evaluation to support transformational change, research suggests strategies must focus on localised Indigenous values, beliefs and worldviews. Decolonising evaluation involves identifying and addressing power and considering what is being evaluated, by whom and how. In this paper we argue that these developments are necessary but insufficient and suggest a possible way forward for further decolonising the field of evaluation. To support change for Indigenous Australians the emphasis needs to move from simple evaluation of individual programs to more critical examination of their combined impact on communities from local perspectives.

This paper explores how Warlpiri and non-Indigenous allies are collaborating to create and use their own community-level impact assessment tool. The 5-year Good Community Life Project is supporting Warlpiri residents of Lajamanu in the Northern Territory to define, assess and take action for a 'good community life'. Warlpiri will explain how they created the approach for assessing wellbeing in Lajamanu, and how they are using emerging results to give voice to their interests and advocate for the life they envision for future generations.

The project involves collaboration between Warlpiri community members, land council staff and university researchers, drawing on Indigenous concepts of 'two-way' seeing and working, relationality, and centring Indigenous voice and values. Applying these concepts in practice is challenging, particularly for non-Indigenous allies who must constantly reflect and use their privilege to challenge traditional views on 'robust' evaluation methodology.

Warlpiri and the land council see potential for this work to improve life in Lajamanu and as an approach that could be applied across Central Australian communities. Going beyond co-designed and participatory evaluation to critical examination of impact is the next step in supporting change with Indigenous communities. This paper will focus on Warlpiri perspectives, plus brief reflections from non-Indigenous allies, with time for the audience to discuss broader implications.
Speakers

Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre... Read More →

Emily Lapinski

Monitoring, Evaluation and Learning Coordinator, Central Land Council

Alex Gyles

Research Fellow - Monitoring and Evaluation, Institute for Human Security and Social Change, La Trobe University
Alex Gyles is a Research Fellow working in Monitoring, Evaluation and Learning (MEL) at the Institute for Human Security and Social Change, La Trobe University. He works closely with Marlkirdi Rose Napaljarri on the YWPP project and finds fieldwork with the YWPP team an exciting learning... Read More →

Glenda Napaljarri Wayne

Glenda Wayne Napaljarri is a community researcher on the YWPP project from Yuendumu. She has developed her practice working as an adult literacy tutor in Yuendumu's Community Learning Centre. In addition to conducting research in her home community of Yuendumu, Glenda has travelled... Read More →
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
Rooms 101-102, 105 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Enhancing Stakeholder Engagement Through Culturally Sensitive Approaches: A Focus on Aboriginal and Torres Strait Islander Communities
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
105
Authors: Mark Power (Murawin), Carol Vale (Murawin, AU)

This presentation explores the paramount importance of culturally sensitive engagement methodologies in ensuring meaningful contributions from Aboriginal and Torres Strait Islander communities to mission programs. Murawin, an Aboriginal-led consultancy, has developed a robust Indigenous Engagement Strategy Framework grounded in the principles of reciprocity, free, prior and informed consent, mutual understanding, accountability, power sharing, and respect for Indigenous knowledge systems. Our session aims to share insights into the necessity of prioritising Aboriginal and Torres Strait Islander voices in engagement, co-design, and research, highlighting the significance of cultural competence in fostering mutual respect and understanding.
We will discuss three key messages: the imperative of deep knowledge and understanding of Aboriginal and Torres Strait Islander cultures in engagement practices; the success of co-design processes in facilitating genuine and respectful engagement; and the strategic partnership with CSIRO to enhance cultural competence and inclusivity in addressing Indigenous aspirations and challenges. These points underscore the critical role of acknowledging cultural interactions and ensuring cultural sensitivity in building strong, respectful, productive relationships with Indigenous communities.
To achieve our session's objectives, we have designed an interactive format that blends informative presentations with the analysis of case studies, complemented by engaging intercultural discussions. This approach is intended to equip participants with actionable insights drawn from real-world examples of our collaborative ventures and co-designed projects. Through this comprehensive exploration, we aim to enrich participants' understanding of successful strategies for engaging Aboriginal and Torres Strait Islander communities, ultimately contributing to the achievement of more inclusive and impactful outcomes in mission programs and beyond.


Chair

Jade Maloney

Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →
Speakers

Carol Vale

CEO & Co-founder, Murawin
Carol Vale is a Dunghutti entrepreneur, businesswoman, CEO and co-founder of Murawin, whose passion, determination and commitment have driven her impressive 40-year career as a specialist in intercultural consultation, facilitation, and participatory engagement, and an empathetic... Read More →

Mark Power

Director, Evaluation & Research, Murawin
Mark is an experienced researcher with more than 20 years of experience in Australia and the Pacific. Mark manages Murawin’s evaluation and research practice and leads multiple evaluations for a variety of clients. Mark has overseen more than 30 high-profile, complex projects funded... Read More →
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
Room 105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Enhancing evaluation value for small community organisations: A case example
Wednesday September 18, 2024 1:30pm - 2:00pm AEST
104
Authors: Stephanie Button (Assessment and Evaluation Research Centre, University of Melbourne), Allison Clarke (Assessment and Evaluation Research Centre, University of Melbourne, AU), Carolyn McSporran (Blue Light Victoria, AU), Elissa Scott (Blue Light Victoria, AU)

This presentation aims to provide a case example of how two small-scale, standard process/outcomes evaluations for a low-budget community organisation increased value for the organisation by identifying and seizing opportunities for evaluation capacity building. Formal evaluations represent a significant financial commitment for low-budget community organisations. By maximising the value provided by such evaluations, evaluators can contribute more to these organisations' mission and ultimately to social betterment.

There are numerous evaluation capacity building models and frameworks, many of which appear to be quite complex (for example: Volkov & King, 2007; Preskill & Boyle, 2008). Many emphasise planning, documentation, and other resource-intensive components as part of any evaluation capacity building effort. This session provides a case example of intentional but light-touch and opportunistic evaluation capacity building. Through such an approach, evaluators may need to do only minimal additional activities to provide extra value to an organisation. Reflection-in-action during the evaluation process is as important as the final reporting (Schwandt & Gates, 2021). The session emphasises, though, that a critical enabler will be the organisation's leadership, culture, and willingness to seize the opportunity offered by a formal evaluation. The session is co-presented by two members of the evaluation team and the Head of Strategy, Insights, and Impact of the client organisation.
Chair

Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health... Read More →
Speakers

Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre... Read More →

Stephanie Button

Research Associate & Evaluator, Assessment & Evaluation Research Centre
Stephanie has worked as a policy manager, analyst, strategist, researcher, and evaluator across the social policy spectrum in the public and non-profit sector for over 12 years. She is passionate about evidence-based policy, pragmatic evaluation, and combining rigour with equitable... Read More →

Carolyn McSporran

Head of Strategy, Insights and Impact, Blue Light Victoria
Passionate about social inclusion, Carolyn's work has spanned diverse portfolios across the justice and social services sectors. With a fervent belief in the power of preventative and early intervention strategies, she is committed to unlocking the full potential of individuals and... Read More →
Wednesday September 18, 2024 1:30pm - 2:00pm AEST
Room 104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Envisioning and Encountering Relational Aboriginal and Pacific Research Futures
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
105
Authors: Alli Burness (Tetra Tech), Lisa Faerua (Vanuatu), Nathan Sentance (Museum of Applied Arts and Sciences, AU), David Lakisa (Talanoa Consultancy, AU)

In the inaugural ANU Coral Bell Lecture on Indigenous Diplomacy, Dr Mary Graham outlined a powerful legacy of Aboriginal and Torres Strait Islander relational methods that have operated across a spectacular time scale. She envisioned a compelling future for its renewed application and spoke of these practices as a type of "thinking in formation, a type of slow, collective, and emergent process".

Inspired by Dr Graham's vision, this panel explores synergies, distinctions, and complementarities in local and Indigenous research methods across Australia and the Pacific. The panel features Wiradjuri, Samoan (Polynesian), Ni-Vanuatu (Melanesian) and settler-background (Australian) researchers from a range of fields who will explore, engage and showcase locally specific methodologies that connect across Australia and the Pacific, as ways of knowing, doing, and relating with the land, the moana (ocean) and air.

This session frames evaluation and research approaches as reflecting their contextual political order. While the panel will critique the legacies of individualist and survivalist research methods, it will focus on exploring the futures that relational research methods could realize. How do we evolve current institutional approaches to become more commensurate with Indigenous methods? Would institutionalizing these methods resolve the legacy, structure, and form of colonialist political approaches? Panelists will speak to their experience in working to evolve institutions in this way and the research and evaluation methodologies used within them.

The session also situates evaluation within a canon of contextualizing evidence-based practices (such as political economy analysis, GEDSI analysis or feasibility).
Chair

Martina Donkers

Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured... Read More →
Speakers

Lisa Faerua

Lisa Faerua is a Pacific Freelance Consultant. She brings 17 years of experience in international and community development in the areas of leadership, design, monitoring and evaluation. Lisa has provided technical support to DFAT, MFAT, and Non-Government Organisations such as Oxfam... Read More →

Nathan Sentance

Nathan “mudyi” Sentance is a cis Wiradjuri librarian and museum collections worker who grew up on Darkinjung Country. Nathan currently works at the Powerhouse Museum as Head of Collections, First Nations and writes about history, critical librarianship and critical museology from... Read More →

David Lakisa

Managing Director, Talanoa Consultancy
Dr David Lakisa specialises in Pacific training and development, educational leadership and diversity management. He is of Samoan (Polynesian) ancestry and completed his PhD on 'Pacific Diversity Management' at the University of Technology Sydney (UTS) Business School.

Alli Burness

Director, Australian Consulting, Tetra Tech
Alli is an Australian strategic designer and researcher with settler heritage, born and living on Bunurong Country. As Director of the Australian Consulting Practice at Tetra Tech International Development, Alli works with a First Nations team to support relational approaches across... Read More →
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
Room 105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Revitalising Survey Engagement: Strategies to Tackle Low Response Rates
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Kizzy Gandy

Surveys are an excellent data collection tool when they reach their target response rate, but low response rates hinder the generalisability and reliability of the findings.

This Ignite presentation will discuss techniques Verian evaluators have applied to increase survey response rates while also assessing the efficacy and efficiency of these techniques. We will also explore other evidence-based strategies for boosting response rates and the value of drawing on other data sources if your response rates are still low.
Speakers

Hannah Nguyen

Analyst, Verian Group
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
Room 103, 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Cultivating Equity: A Roadmap for New and Student Evaluators' Journeys
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
Authors: Ayesha Boyce (Arizona State University), Aileen Reid (UNC Greensboro, US)

Evaluation can be positioned as a social, cultural, and political force to address issues of inequity. We co-direct a 'lab' that provides new evaluators with hands-on applied research and evaluation experience to support their professional development. We are proud of our social justice commitments, and they show up in all aspects of our work. We believe the next generation of evaluators must be trained and mentored in high-quality technical, strengths-based, interpersonal, contextual, social justice-oriented, and values-engaged evaluation. We have found that novice evaluators are able to engage with culturally responsive approaches to evaluation at the conceptual level, but have difficulty translating theoretical constructs into practice. This paper presentation builds upon our experiences and previous work of introducing a framework for teaching culturally responsive approaches to evaluation (Boyce & Chouinard, 2017) and a non-course-based, real-world-focused, adaptable training model (Reid, Boyce, et al., 2023). We will discuss how we have taught new evaluators three formal and informal methodologies that have helped them align their values with praxis. Drawing from our work across multiple United States National Science Foundation-funded projects, we will outline how the incorporation of photovoice methodology, just-in-time feedback, and reflective practice has supported our commitments to meaningfully and respectfully attend to issues of culture, race, diversity, power, inclusion, and equity in evaluation. We will also discuss our thoughts on the implications of globalization, Artificial Intelligence, and shifting politics on evaluation capacity building and training of new evaluators.

Chair

Nick Field

Director (Public Sector), Urbis
Nick has twenty years of public sector consulting experience, backed more recently by six years as a Chief Operating Officer in the Victorian Public Sector. A specialist generalist in a broad range of professional advisory services, Nick has expertise in the implementation of state-wide... Read More →
Speakers

Ayesha Boyce

Associate Professor, Arizona State University
Ayesha Boyce is an associate professor in the Division of Educational Leadership and Innovation at Arizona State University. Her research career began with earning a B.S. in psychology from Arizona State University, an M.A. in research psychology from California State University... Read More →

Aileen M. Reid

Assistant Professor, UNC Greensboro
Dr. Aileen Reid is an Assistant Professor of Educational Research Methodology in the Information, Library and Research Sciences department and a Senior Fellow in the Office of Assessment, Evaluation, and Research Services (OAERS) at UNC Greensboro. Dr. Reid has expertise in culturally... Read More →
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
Rooms 101-102, 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Navigating a path to system impact: designing a strategic impact evaluation of education programs
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
104
Authors: Amanda Reeves (Victorian Department of Education), Rhiannon Birch (Victorian Department of Education, AU), Eunice Sotelo (Victorian Department of Education, AU)

To provide insight into complex policy problems, evaluations need to adopt a systems perspective and examine the structures, relationships and contexts that influence program outcomes.

This paper outlines the design of a 4-year strategic evaluation that seeks to understand how a portfolio of over 25 education programs are interacting and collectively contributing to system-level outcomes. In this context, policy makers require evaluation to look beyond the boundaries of individual programs and assess the holistic impact of this investment to inform where and how resources can be directed to maximise system outcomes.

The strategic evaluation presented is theory-based and multi-layered, using logic modelling to identify outcomes at the program, cluster and system level and draw linkages to develop a causal pathway to impact. The strategic evaluation and the evaluations of individual education programs are being designed together to build in common measures to enable meta-analysis and synthesis of evidence to assess system-level outcomes. The design process has been broad and encompassing, considering a diverse range of methods to understand impact, including quantitative scenario modelling and value for money analysis.

The authors will describe how the strategic evaluation has been designed to respond to system complexity and add value. The evaluation adopts an approach that is:
• interdisciplinary, drawing on a range of theory and methods to examine underlying drivers, system structures, contextual factors and program impacts
• collaborative, using expertise of both internal and external evaluators, to design evaluations that are aligned and can tell a story of impact at the system-level
• exploratory, embracing a learning mindset to test and adapt evaluation activities over time.

This paper will be valuable for anyone who is interested in approaches to evaluating the relative and collective contribution of multiple programs and detecting their effects at the system level to inform strategic decision-making.
Chair

Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health... Read More →
Speakers

Amanda Reeves

Principal Evaluation Officer, Victorian Department of Education
Amanda is an evaluation specialist with over 12 years' experience leading evaluation projects in government, not-for-profit organisations and as a private consultant. She has worked across a range of issues and sectors including in education, youth mental health, industry policy and... Read More →

Eunice Sotelo

Senior Evaluation & Research Officer, Department of Education (Victoria)
I'm here for evaluation but passionate about so many other things - education (as a former classroom teacher); language, neuroscience and early years development (recently became a mom so my theory reading at the moment is on these topics); outdoors and travel. Workwise, I'm wrangling... Read More →
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
Room 104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

When "parachuting in" is not an option: Exploring value with integrity across languages, continents and time zones
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
106
Authors: Julian King (Julian King & Associates), Adrian Field (Dovetail)

The rapid growth of video-conferencing technology has increased the ability for evaluations to be conducted across multiple countries and time zones. People are increasingly used to meeting and working entirely online, and evaluations can in principle be designed and delivered without the need for face-to-face engagement. Translational AI software is even able to break through language barriers, providing further efficiencies and enabling evaluation funds to be directed more to design, data gathering and analysis.

Yet the efficiency of delivery should not compromise the integrity with which an evaluation is conducted. This is particularly true in situations where different dimensions of equity come into question, and in evaluations conducted in two or more languages, where the design and delivery must be meaningful and accessible to all participants, not just the funder.

The growth of remote evaluation work presents a very real, and arguably even more pressing, danger of the consultant "parachuting in" and offering solutions that have little or no relevance to the communities at the centre of the evaluation process.

In this presentation we explore the wayfinding process in designing and implementing a Value for Investment evaluation of an urban initiative focusing on the developmental needs of young children in Jundiaí, Brazil. We discuss the challenges and opportunities presented by a largely (but ultimately not entirely) online format, in leading a rigorously collaborative evaluation process and gathering data in a way that ensures all stakeholder perspectives are appropriately reflected. We discuss the trade-offs involved in this process, the reflections of evaluation participants, and the value of ensuring that underlying principles of collaborative and cross-cultural engagement are adhered to.

Chair

Melinda Mann

Academic Lead Jilbay First Nations RHD Academy, CQUniversity
Melinda Mann is a Darumbal and South Sea Islander woman based in Rockhampton, Qld. Her work focuses on Indigenous Nation building, Pacific sovereignties, and regional and rural communities. Melinda has a background in student services, learning design, school and tertiary education... Read More →
Speakers

Julian King

Director, Julian King & Associates
I’m an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I’m affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/... Read More →

Adrian Field

Director, Dovetail
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social... Read More →
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
Room 106, 102 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Wayfinding for integrating social justice and culturally responsive and equitable evaluation practices in meta-evaluation: Learning from the UN evaluation quality assessments.
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
Authors: Sandra Ayoo (University of North Carolina Greensboro)

If the desired destination of evaluation is quality in interventions that improve societal and planetary wellbeing, it is imperative that evaluators reflect on the meaning of quality and on methods to assess whether evaluation is achieving it. Meta-evaluation, a term coined by Michael Scriven in 1969, evaluates evaluations and aids in understanding how evaluations contribute to addressing structural and systemic problems in interventions and evaluation practice. Meta-evaluation has evolved over the past five decades and is included in the program standards of major professional associations. While the field of evaluation is confronted with major concerns regarding the centrality of social justice, there are currently no one-size-fits-all guidelines for meta-evaluation or for addressing social justice in evaluations. To address this, we reviewed the meta-evaluation literature, mapped the American Evaluation Association's foundational documents against the United Nations Evaluation Group's Norms and Standards to explore their intersectionality on social justice, and analyzed 62 United Nations Population Fund evaluation reports alongside their management responses. The study findings indicated that meta-evaluation is contingent on context rather than established standards. Thus, it is crucial for evaluators to intentionally prioritize social justice in evaluation design and implementation and to select quality assurance tools that match the evaluation context and professional association guidelines to improve the quality of the intervention. I will share key characteristics of the United Nations Evaluation Group's Norms and Standards on social justice to stimulate discussions on evaluators' efforts to address systemic issues. Collectively, participants will benefit from discussing and reflecting on their own practice by responding to questions such as: (a) what are examples of their work in collaborative and systems-informed ways to intentionally include social justice in their evaluations, and (b) what should the field of evaluation do to ensure that evaluations add value for people and planet?
Chair

Carlos Rodriguez

Senior Manager Strategy & Evaluation, Department of Energy Environment and Climate Action
Speakers

Sandra Ayoo

Assistant Professor, University of North Carolina Greensboro
Dr. Ayoo is an Assistant Professor of Educational Research Methodology in the Department of Information, Library, and Research Science at the School of Education, University of North Carolina Greensboro. She teaches graduate courses in program evaluation and research methodology... Read More →
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
Rooms 101-102, 105 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Uncovering Hidden STEM Footprints: Leveraging Output Data from Questacon’s Outreach Programs
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
104
Authors: Jake Clark (Questacon - The National Science and Technology Centre), Jenny Booth (Questacon - The National Science and Technology Centre, AU), Sharon Smith (Questacon - The National Science and Technology Centre, AU), Nick Phillis (Questacon - The National Science and Technology Centre, AU)

Join our Monitoring and Evaluation team on an exciting journey as we delve into the untapped potential of output data in evaluation and how to reach beyond the 'low-hanging fruit'.

Notwithstanding the importance of evaluating outcomes to measure program success, monitoring the implementation and reach of initiatives is fundamental to good program management and evaluation. Output data on activity reach, target groups and participants often hold hidden gems of potential that are frequently overlooked. In this presentation we shine a spotlight on their significance and offer actionable tips to elevate monitoring data.

Our objective is to make this exploration enjoyable and enlightening, especially for foundational to intermediate level evaluators. We offer practical and universally applicable strategies for making the most of output data to enhance program insights.

KEY MESSAGES

Using existing tools and tapping into open-source data sets, you can create powerful visualisations and draw deeper inferences about your program's reach and participants.

I. Understanding equity and inclusion
• A better understanding of who is and isn't involved in your initiative.
• Looking for patterns using socio-demographic variables.
• Benchmarking your initiative against relevant population data.

II. Connecting outputs to outcomes
• Analysing participant characteristics and program journey to illuminate differences in outcomes.
• Uncovering program and policy questions that need further exploration.

Design of the Session: Drawing tangible examples from the education and informal learning STEM sector, we bridge the gap between theory and practice. Real-world strategies are shared to encourage active participation along with useful resource links.
Speakers

Jake Clark

Senior Monitoring and Evaluation Officer, Questacon - National Science and Technology Centre
What value does STEM outreach bring to an individual? How does it change someone's attitude/behaviour/disposition around STEM? And how do you quantify such probing questions? These are the types of queries I'm answering in my Senior Evaluation Officer role at Australia's National... Read More →
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
Room 104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

5:00pm AEST

Failing your way to better practice: How to tread carefully when things aren't going as planned
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
105
Authors: Stephanie White (Victoria Department of Education)

Evaluators can fail in many ways. The consequences of these failures can be relatively contained or wide ranging within the evaluation and can also flow on to program operations. But failure is a part of life and can be a useful catalyst for professional growth. What happens when you find yourself failing and can see the risks ahead? How do you keep going?

The session focuses on the experiences of an emerging evaluator who failed while leading a large-scale education evaluation. When some elements of the evaluation became untenable, they struggled to find the right path forward and could foresee the risks materialising if the situation wasn’t addressed. On the other side of it, they reflect on how they drew on tools in every evaluator’s toolkit to start remedying their previous inaction and missteps to get the evaluation back on track…and improve their practice along the way!

This session is relevant to any evaluator who grapples with the messiness of expectations and reality in their practice.


Chair

Marwan El Hassan

Director, Future Drought Fund Program Evaluation and Support, Department of Agriculture, Fisheries and Forestry
I am the director of the Program Evaluation and Support team at the Future Drought Fund (FDF). My team is responsible for supporting the FDF's program areas in their monitoring, evaluation and learning work, and for ensuring alignment of our MEL work with other areas around the department... Read More →
Speakers

Stephanie White

Victoria Department of Education
I found my way to evaluation to help me answer questions about education program quality and success. Professionally, I have diverse experiences in education and evaluation, from delivering playgroups under trees in the NT to reports on educator resources to senior education bureaucrats... Read More →
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
Room 105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia
 