Conference hashtag #aes24MEL
Friday, September 20
 

10:30am AEST

To tinker, tailor, or craft from scratch? The tension in using validated tools in novel contexts
Friday September 20, 2024 10:30am - 11:00am AEST
104
Authors: Matt Healey (First Person Consulting), Alicia McCoy (First Person Consulting, AU), Tenille Moselen (First Person Consulting, AU)

In the dynamic realm of evaluation methodology, the discourse surrounding the use of validated tools versus the customisation of evaluation metrics remains contentious. This session delves into the inherent risks of both approaches. These risks are often compounded when those in positions of power prefer validated tools over context-specific data collection questions or approaches. The tension this elicits is only increasing when evaluating digital interventions for which there is no direct tool to draw upon, leaving evaluators to navigate uncharted territory.

Moreover, there is an ever-increasing range of validated tools available, but little direction for evaluators - particularly emerging and early-career evaluators - to assist in deciding between them. This session presents experiences from a range of digital and in-person projects, and explores scenarios where there was no 'obvious solution'. It will be of particular relevance to those undertaking evaluations of digital and novel programs.

Through candid dialogue and shared anecdotes, participants will reflect on their experiences in navigating decisions to adopt, adapt, or reject validated tools, and the learning that resulted. Embracing controversy, this session encourages attendees to challenge conventional wisdom and critically examine the balance between the reliability of validated tools, the importance of fitting data collection to context, and most importantly what 'good' looks like.

Join the conversation as we navigate the complex landscape of evaluation methodology, exploring the tensions between established practices and the pursuit of innovation in evaluation processes.

Chair
Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of...
Speakers
Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have: co-founded and grown an organisation (First Person Consulting) to a team of 16 people working...
Tenille Moselen

First Person Consulting
https://www.fpconsulting.com.au/our-team.html
Tenille has qualifications in public health, with experience in mental health and wellbeing, alcohol and drug, and international development. Her passion is creating change through design and bringing stakeholders together to address complex...
Alicia McCoy

Principal Consultant, First Person Consulting
Alicia has 15 years of experience leading research and evaluation teams in the not-for-profit sector and is passionate about the role that research and evaluation plays in creating lasting change for individuals, families and communities. Alicia’s areas of interest include evaluation...
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Walking together: First Nations participation, partnerships and co-creation in Evaluation.
Friday September 20, 2024 10:30am - 11:30am AEST
106
Authors: Tony Kiessler (First Nations Connect), Alice Tamang (First Nations Connect, AU)

Effective First Nations engagement is integral in the design and delivery of culturally safe evaluations. The AES' First Nations Cultural Safety Framework discusses 10 principles for culturally safe evaluation and describes the journey of engagement. However, the question of how to engage effectively can be the first and most significant challenge faced by evaluators. There is little clarity on how to create opportunities for First Nations leadership and voices in our evaluations, how to engage appropriately, and who we should engage with. There is also the challenge of managing tight timeframes, client expectations and capabilities that can limit the focus on meaningful First Nations participation, partnership and co-creation.

This is a unique offering that enables practitioners and First Nations facilitators to walk together, explore shared challenges and identify opportunities to improve First Nations engagement. The session will explore the potential for partnerships in informing and implementing evaluations, opportunities to increase First Nations participation, privilege their experience and knowledge, and how evaluation practitioners can draw on these strengths through co-creation to amplify First Nations voices and leadership in evaluation practice.

This session aims to:
  • Explore a principles-based approach to First Nations engagement;
  • Discuss shared experiences on successful approaches to enhance First Nations partnership, participation and co-creation; and
  • Develop a shared understanding of how to take this knowledge forward through culturally safe evaluation commissioning, practice and reporting.

Discussion will draw on the collective experience of both the attendees and the facilitators, walking together. The sharing of ideas will be encouraged in a safe space that engages the audience in a collaborative dialogue with First Nations practitioners. This dialogue will explore current knowledge, capabilities and gaps, as well as the challenges (and how they can be overcome), as part of the broader journey to culturally safe evaluation practice.


Chair
Rachel George

Director, Research, Monitoring and Evaluation Practice, Tetra Tech International Development
Speakers
Tony Kiessler

Co-Convener, First Nations Connect
Tony is a Central Arrernte man, consultant and researcher living and working on beautiful Gundungurra Country in the NSW Southern Highlands. He is an evaluation, strategic planning and research consultant with a particular interest in health, human development and social inclusion...
Alice Tamang

Consultant, First Nations Connect
Alice is a Dharug woman based on Wurundjeri Country. She is a consultant and advisor, with a focus on facilitating connections between cultures, empowering individuals and communities to share knowledge and enhance cultural understanding. Alice primarily works on DFAT funded programs...
Nicole Tujague

Founder and Director, The Seedling Group
Bachelor of Indigenous Studies (Trauma and Healing/Managing Organisations); 1st Class Honours, Indigenous Research; PhD in Indigenous-led Evaluation, Gnibi College, Southern Cross University. Nicole is a descendant of the Kabi Kabi nation from Mt Bauple, Queensland and the...
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Participatory Analysis Workshops: a novel method for identifying important factors across diverse projects
Friday September 20, 2024 11:00am - 11:30am AEST
104
Authors: Martina Donkers (Martina Donkers), Ellen Wong (ARTD, AU), Jade Maloney (ARTD, AU)

Some programs comprise a range of diverse projects striving towards a common goal - for example grant programs, where a wide range of different grantees are carrying out different projects with very different activities in pursuit of the grant program objectives.

These can be a challenge to evaluate - with so many different activities, outputs cannot be easily aggregated, and each project may be responding to its local context in unique but important ways. These programs need efficient ways to identify common factors affecting implementation and outcomes that reflect the richness of the activities undertaken, but do not place undue burden on organisations, particularly those receiving smaller grants.

We developed a novel method that uses participatory workshops to explore commonalities in implementation across projects and the various ways they seek to achieve common program outcomes. The theory-driven method builds on Most Significant Change and Qualitative Comparative Analysis, and combines data collection with collaborative participatory data analysis to build a rich qualitative understanding of projects in a relatively short timeframe with fewer resources. Active participation from project leaders (e.g. grant recipients) builds cohesion across the program, and helps project leaders feel more connected and supported.

This paper outlines the theory, approach and uses of Participatory Analysis Workshops, including their strengths and limitations and the types of data and insights the method can yield. We use our work with the NSW Reconstruction Authority to evaluate the Covid Community Connection and Wellbeing Program to illustrate what we've learnt about how the method works and in what circumstances, and then identify other potential use cases. Participants will have an opportunity to ask questions to help inform future uses of this method. This information will equip evaluators with tools to navigate varying territory together to understand progress toward program outcomes.

Chair
Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of...
Speakers
Martina Donkers

Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured...
Ellen Wong

Consultant, ARTD Consultants
I'm a consultant at ARTD with a background in human geography and environmental studies. I bring this lens to the work I do and am particularly passionate about the intersection between people and the environment. My portfolio spans environmental policy, disaster recovery and community...
Jade Maloney

Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research...
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Value Propositions: Clearing the path from theory of change to rubrics
Friday September 20, 2024 11:00am - 12:30pm AEST
Authors: Julian King (Julian King & Associates Limited), Adrian Field (Dovetail Consulting Limited, NZ)

Evaluation rubrics are increasingly used to help make evaluative reasoning explicit. Rubrics can also be used as wayfinding tools to help stakeholders understand and participate meaningfully in evaluation. Developing rubrics is conceptually challenging work and the search is on for additional navigation tools and models that might help ease the cognitive load.

As a preliminary step toward rubric development it is often helpful to co-create a theory of change, proposing a chain of causality from actions to impacts, documenting a shared understanding of a program, and providing a point of reference for scoping a logical, coherent set of criteria.

However, it's easy to become disoriented when getting from a theory of change to a set of criteria, because the former deals with impact and the latter with value. Implicitly, a theory of change may focus on activities and impacts that people value, but this cannot be taken for granted - and we argue that value should be made more explicit in program theories.

Specifying a program's value proposition can improve wayfinding between a theory of change and a set of criteria, addressing the aspects of performance and value that matter to stakeholders. Defining a value proposition prompts us to think differently about a program. For example, in addition to what's already in the theory of change, we need to consider to whom the program is valuable, in what ways it is valuable, and how the value is created.

In this presentation, we will share what we've learnt about developing and using value propositions. We'll share a simple framework for developing a value proposition and, using roving microphones, engage participants in co-developing a value proposition in real time. We'll conclude the session by sharing some examples of value propositions from recent evaluations.

Chair
Laura Bird

MERL Associate, Paul Ramsay Foundation
Speakers
Julian King

Director, Julian King & Associates
I’m an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I’m affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/...
Adrian Field

Director, Dovetail
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social...
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Designing baseline research for impact: The SKALA experience
Friday September 20, 2024 12:00pm - 12:30pm AEST
Authors: Johannes Prio Sambodho (SKALA), Ratna Fitriani (SKALA, ID)

SKALA (Sinergi dan Kolaborasi untuk Akselerasi Layanan Dasar - Synergy and Collaboration for Service Delivery Acceleration) is a significant Australian-Indonesian cooperation that focuses on enhancing parts of Indonesia's extensive, decentralised government system to accelerate better service delivery in underdeveloped regions. As part of its End of Program Outcome for greater participation, representation, and influence for women, people with disabilities, and vulnerable groups, SKALA is commissioning baseline research focused on understanding multi-stakeholder collaboration for mainstreaming Gender Equality, Disability, and Social Inclusion (GEDSI) in Indonesia. The program has designed a mixed-method study: qualitative methods to assess the challenges and capacity gaps of GEDSI civil society organisations (CSOs) in actively participating in and contributing to the subnational planning and budgeting process, coupled with a quantitative survey to measure trust and confidence between the same CSOs and the local governments with whom they engage.

The paper first discusses the baseline study's design and its alignment with SKALA's strategic goals, and considers how the research might itself contribute to improved working relationships in planning and budgeting at the subnational level. Second, the paper discusses approaches taken by the SKALA team to design a robust programmatic baseline that is also clearly useful in program implementation. These include: a) adopting an adaptive approach that translates key emerging issues, identified through grassroots consultations and the broader governmental agenda, into research objectives; b) locating the study within a broader empirical literature to balance practical baseline needs with academic rigour; and c) fostering collaboration with the program implementation team to ensure the study serves both evaluation and programmatic needs.

Lastly, based on the SKALA experience, the paper will argue for closer integration of research and implementation teams within programs, which can support systems-informed methodologies, and will consider ways in which this can be practically accomplished.
Chair
Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre...
Speakers
Johannes Prio Sambodho

Research Lead, SKALA
Dr. Johannes Prio Sambodho is the Research Lead for SKALA, a significant Australian-Indonesian development program partnership aimed at improving basic service governance in Indonesia. He is also a former lecturer in the Department of Sociology at the University of Indonesia. His...
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

From evaluation to impact: practical steps in a qualitative impact study
Friday September 20, 2024 1:30pm - 2:00pm AEST
Authors: Linda Kelly (Praxis Consultants), Elisabeth Jackson (La Trobe University, AU)

This presentation focuses on a multi-year program funded by Australia that aims to empower people marginalised by gender, disability and other factors. Like similar programs, the work is subject to regular monitoring and evaluation - testing the effectiveness of program activities largely from the perspective of the Australian and partner-country governments.

But what of the views of the people served by the program? Is the impact of the various activities sufficient to empower them beyond their current condition? How significant are the changes introduced by the program, given the structural, economic, social and other disadvantages experienced by the marginalised individuals and groups?

Drawing on feminist theory and qualitative research methods, and managed with local research and communication experts, this presentation outlines a study focused on the long-term impact of the program.

The presentation will outline the methodology and practical considerations in the development of the approach and data collection methodologies. It will highlight the value of exploring impact from a qualitative perspective, while outlining the considerable management and conceptual challenges involved in designing, introducing and supporting such an approach. It will consider some of the implications of shifting from traditional evaluation methods to more open-ended enquiry, and consider whose values are best served by evaluation versus impact assessment.


Chair
James Copestake

Professor, International Development, University of Bath, UK
James Copestake is Professor of International Development at the University of Bath in the UK, where he is also Director of Studies for the Doctorate in Policy Research and Practice at the Institute of Policy Research. His publications range broadly across international development...
Speakers
Linda Kelly

Director, Praxis Consultants
Elisabeth Jackson

Senior Research Fellow, Centre for Human Security and Social Change, La Trobe University
Dr Elisabeth Jackson is a Senior Research Fellow at the Centre for Human Security and Social Change where she conducts research and evaluation in Southeast Asia and the Pacific. She is currently co-leading an impact evaluation of a program working with diverse marginalised groups...
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Squaring up with rubrics
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Josh Duyker (Centre for Evaluation and Research Evidence, Victorian Department of Health)

Much like Felix the Cat, evaluators have a bag of tricks to get us out of sticky situations. But when you are staring face to face with a complex evaluand, juggling tricky stakeholders whilst sitting on a mountain of data, it's not always clear what 'trick' you need! One twisted potential solution is the colourful, yet humble rubric. In this reflective practice ignite presentation, I will guide you through our journey of using rubrics as a tool to wayfind through an evaluation, and our key takeaways on how rubrics can support evaluators to make comprehensive and balanced evaluative judgements.
Chair
Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and...
Speakers
Josh Duyker

Evaluation and Research Officer, Centre for Evaluation and Research Evidence
I am an emerging evaluator, currently working at the Centre for Evaluation and Research Evidence in the Victorian Department of Health. I've completed a Master of Public Health and am embarking on a Master of Evaluation. Through roles in the not-for-profit sector and my studies...
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Introducing a trauma informed AI assessment tool for evaluators of AI (artificial intelligence) assisted programs and services
Friday September 20, 2024 2:00pm - 2:30pm AEST
104
Authors: Lyndal Sleep (Central Queensland University)

AI and other forms of digital technologies are increasingly used in program and service delivery. They promise increased efficiency, accuracy and objectivity; however, these technologies can also cause significant harm and trauma, as seen in Robodebt. It is vital for evaluators to be aware of key questions to ask to prevent AI from having unintended impacts on program processes, outputs and outcomes, or causing harm to service users.

Objective
This session aims to support evaluation of AI assisted programs and services by introducing evaluators to a new and innovative trauma informed AI assessment tool.

Core argument
(1) AI is increasingly being used in programs and services, and understanding the resulting risks is essential for evaluators to assess whether services are meeting intended outcomes.
(2) Many evaluators are unaware of what types of risks to look for when assessing AI assisted services, or what questions to ask - especially when conducting trauma informed evaluations.
(3) A practical trauma informed AI assessment tool has been developed by researchers from [Universities omitted], with funding from [omitted], to address this need, and will be briefly introduced.

A short paper session will:
(1) Highlight the problem that AI is increasingly being used to assist program and services delivery, but many evaluators are unaware of the main risks to consider when evaluating these services.
(2) Suggest the solution of a practical tool which considers these risks, with technological knowledge and within a trauma informed framework, that can be employed by evaluators.
(3) Introduce a trauma informed AI assessment tool, the method used to develop it, as well as its intended practical use by evaluators (both internal and external to organisations).

There will be 10 minutes for questions and discussion at the end of the presentation.

Chair
Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development...
Speakers
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia
 