Conference hashtag #aes24MEL
Filter: International development
Wednesday, September 18
 

11:00am AEST

Innovating Value for Money: Finding Our Way to Greater Value for All
Wednesday September 18, 2024 11:00am - 12:00pm AEST
105
Authors: John Gargani (Gargani + Co), Julian King (Julian King & Associates, NZ)

In this participatory session, we pose the question, "How should evaluators innovate the practice of value-for-money assessment to meet the needs of an expanding set of actors that include governments, philanthropists, impact investors, social entrepreneurs, program designers, and Indigenous and First Nations communities?" We begin by framing value for money as an evaluative question about an economic problem. How well are we using resources, and are we using them well enough to justify their use? Then we suggest new methods intended to help innovate the practice of value for money based on our body of published and current research spanning over 10 years.
These include new methods that (1) produce "holistic" assessments of value for money, (2) reflect rather than hide multiple value perspectives even when values conflict, (3) estimate social benefit-cost ratios without monetizing benefits or costs, and (4) adjust monetary and nonmonetary value for risk using Bayesian methods. Along the way, we facilitate discussions with participants, asking them to consider if, how, and by whom these innovations should be pursued, and what other innovations may be needed. We provide participants with access to a collection of our published and draft papers, and invite them to comment and continue our discussion after the conference.
Chair
Jade Maloney
Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research...
Speakers
Julian King
Director, Julian King & Associates
I’m an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I’m affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/...
John Gargani
President, Gargani + Company (former President of the American Evaluation Association)
Dr John Gargani is an evaluator with 30 years of experience and eclectic interests. He is President of the evaluation consulting firm Gargani + Company, served as President of the American Evaluation Association in 2016, coauthored the book Scaling Impact: Innovation for the Public...
Farida Fleming
Evaluation Principal, Assai
I'm an evaluator with over 25 years of experience in international development. I'm currently one of a core team supporting DFAT to implement its Evaluation Improvement Strategy.
Wednesday September 18, 2024 11:00am - 12:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Envisioning and Encountering Relational Aboriginal and Pacific Research Futures
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
105
Authors: Alli Burness (Tetra Tech), Lisa Faerua (Vanuatu), Nathan Sentance (Museum of Applied Arts and Sciences, AU), David Lakisa (Talanoa Consultancy, AU)

In the inaugural ANU Coral Bell Lecture on Indigenous Diplomacy, Dr Mary Graham outlined a powerful legacy of Aboriginal and Torres Strait Islander relational methods that have operated across a spectacular time scale. She envisioned a compelling future for its renewed application and spoke of these practices as a type of "thinking in formation, a type of slow, collective, and emergent process".

Inspired by Dr Graham's vision, this panel explores synergies, distinctions, and complementarities in local and Indigenous research methods across Australia and the Pacific. The panel features Wiradjuri, Samoan (Polynesian), Ni-Vanuatu (Melanesian) and settler-background (Australian) researchers from a range of fields who will explore, engage and showcase locally specific methodologies that connect across Australia and the Pacific continents, as ways of knowing, doing, and relating with the land, the moana (ocean) and air.

This session frames evaluation and research approaches as reflecting their contextual political order. While the panel will critique the legacies of individualist and survivalist research methods, it will focus on exploring the futures that relational research methods could realize. How do we evolve current institutional approaches to become more commensurate with Indigenous methods? Would institutionalizing these methods resolve the legacy, structure, and form of colonialist political approaches? Panelists will speak to their experience in working to evolve institutions in this way and the research and evaluation methodologies used within them.

The session also situates evaluation within a canon of contextualizing evidence-based practices (such as political economy analysis, GEDSI analysis or feasibility studies).
Chair
Martina Donkers
Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured...
Speakers
Lisa Faerua
Lisa Faerua is a Pacific Freelance Consultant. She brings 17 years of experience in international and community development in the areas of leadership, design, monitoring and evaluation. Lisa has provided technical support to DFAT, MFAT, and Non-Government Organisations such as Oxfam...
Nathan Sentance
Nathan “mudyi” Sentance is a cis Wiradjuri librarian and museum collections worker who grew up on Darkinjung Country. Nathan currently works at the Powerhouse Museum as Head of Collections, First Nations and writes about history, critical librarianship and critical museology from...
David Lakisa
Managing Director, Talanoa Consultancy
Dr David Lakisa specialises in Pacific training and development, educational leadership and diversity management. He is of Samoan (Polynesian) ancestry and completed his PhD on 'Pacific Diversity Management' at the University of Technology Sydney (UTS) Business School.
Alli Burness
Director, Australian Consulting, Tetra Tech
Alli is an Australian strategic designer and researcher with settler heritage, born and living on Bunurong Country. As Director of the Australian Consulting Practice at Tetra Tech International Development, Alli works with a First Nations team to support relational approaches across...
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Learning from failure at an NFP - pitfalls and pointers
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Victoria Pilbeam (WWF-Australia)

Across social and environmental movements, we are often reticent to talk about failure. But as innovation and learning gain greater emphasis across the sector, not-for-profits are finding new ways to share and learn from their failures (e.g. Engineers Without Borders' failure reports, Save the Children's Fail Fest). In this presentation, I will both share insights from the available research and reflect on my own journey developing failure programming at WWF-Australia. The presentation will provide practical guidance to evaluators and organisations navigating the challenging terrain of learning from failure.
Chair & Speaker
Victoria Pilbeam
MEL Adviser, The Pacific Community (SPC)
At the Pacific Community, I support MEL for fisheries, aquaculture and marine ecosystems across the region. Previously, I worked for WWF-Australia and in consulting with a range of not-for-profit, government, and philanthropic partners. I love MEL that is approachable, equitable...
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Evaluation for whom? Shifting evaluation to increase its value for local actors
Wednesday September 18, 2024 2:00pm - 2:30pm AEST
104
Authors: Linda Kelly (Praxis Consultants), Mary Raori (UNDP Pacific, FJ)

This presentation outlines an approach to program assessment of a long-term governance program working across the Pacific, the UNDP Governance for Resilience program. It tells the story of the program’s maturing evaluation approach, which has shifted from serving the information needs of those with money and power to focusing more particularly on the values and interests of local participants and partners.
Despite the well-documented limitations of single-methodology evaluation approaches for complex programs, many international development donors and corresponding international and regional organisations continue to require program assessment that serves their needs and values. Typically, this includes narrowing evaluation to assessment against quantitative indicators. Notwithstanding the extensive limitations of this approach, it serves the (usually short-term) needs of international donors and other large bureaucracies. It generates simple information that can be communicated and showcased in uncritical forms. It provides numbers that are easily aggregated and used for concise reporting to senior and political masters.
Such approaches risk crowding out attention to the information needs of other participants and undermine attempts to support more locally led processes. This presentation will explain how this long-term and large-scale program has shifted, making use of a values-based evaluative approach to better serve the interests of partners and participants in the Pacific. This has involved both a methodological and a political shift: broadening the range of data collection and analysis methodologies and approaches, increasing resourcing to accommodate different types of data and data collection, and internal and external advocacy. This one program's experience echoes wider views across the Pacific about the limitations of externally imposed measures and the lack of attention to what is valued by Pacific countries and people.


Chair
Duncan Rintoul
Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
Speakers
Linda Kelly
Director, Praxis Consultants
Lisa Buggy
Strategy, Learning and Innovation Specialist, UNDP Pacific Office
Ms. Lisa Buggy commenced with the UNDP Pacific Office in Fiji in January 2021 and has recently transitioned into the role of Strategy, Learning and Innovation Specialist with the Governance for Resilient Development in the Pacific project. Her current role focuses on influencing systems...
Linda Vaike
Programme Adviser - Climate Risk Finance and Governance, Pacific Islands Forum Secretariat
Wednesday September 18, 2024 2:00pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

When "parachuting in" is not an option: Exploring value with integrity across languages, continents and time zones
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
106
Authors: Julian King (Julian King & Associates), Adrian Field (Dovetail)

The rapid growth of video-conferencing technology has increased the ability to conduct evaluations across multiple countries and time zones. People are increasingly used to meeting and working entirely online, and evaluations can in principle be designed and delivered without the need for face-to-face engagement. Translational AI software can even break through language barriers, providing further efficiencies and enabling evaluation funds to be directed more towards design, data gathering and analysis.

Yet the efficiency of delivery should not compromise the integrity with which an evaluation is conducted. This is particularly true where different dimensions of equity come into question, and in evaluations where two or more languages are used, where the design and delivery must be meaningful and accessible to all participants, not just the funder.

The growth of remote evaluation work presents a very real, and perhaps even more pressing, danger of the consultant "parachuting in" and offering solutions that have little or no relevance to the communities at the centre of the evaluation process.

In this presentation we explore the wayfinding process in designing and implementing a Value for Investment evaluation of an urban initiative focusing on the developmental needs of young children in Jundiaí, Brazil. We discuss the challenges and opportunities presented by a largely (but ultimately not entirely) online format, in leading a rigorously collaborative evaluation process and gathering data in a way that ensures all stakeholder perspectives are appropriately reflected. We discuss the trade-offs involved in this process, the reflections of evaluation participants, and the value of ensuring that underlying principles of collaborative and cross-cultural engagement are adhered to.

Chair
Melinda Mann
Academic Lead Jilbay First Nations RHD Academy, CQUniversity
Melinda Mann is a Darumbal and South Sea Islander woman based in Rockhampton, Qld. Her work focuses on Indigenous Nation building, Pacific sovereignties, and regional and rural communities. Melinda has a background in student services, learning design, school and tertiary education...
Speakers
Julian King
Director, Julian King & Associates
I’m an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I’m affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/...
Adrian Field
Director, Dovetail
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social...
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Wayfinding for integrating social justice and culturally responsive and equitable evaluation practices in meta-evaluation: Learning from the UN evaluation quality assessments.
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
Authors: Sandra Ayoo (University of North Carolina Greensboro)

If improving interventions for societal and planetary wellbeing is the desired destination of evaluation, it is imperative that evaluators reflect on what quality means and on methods for assessing whether evaluation is achieving it. Meta-evaluation, a term coined by Michael Scriven in 1969, evaluates evaluations and aids in understanding how evaluations contribute to addressing structural and systemic problems in interventions and evaluation practice. Meta-evaluation has evolved over the past five decades and is included in the program standards of major professional associations. While the field of evaluation is confronted with major concerns regarding the centrality of social justice, there are currently no one-size-fits-all guidelines for meta-evaluation or for addressing social justice in evaluations. To address this, we reviewed the meta-evaluation literature, mapped the American Evaluation Association's foundational documents against the United Nations Evaluation Group's Norms and Standards to explore their intersectionality on social justice, and analyzed 62 United Nations Population Fund evaluation reports alongside their management responses. The study findings indicated that meta-evaluation is contingent on context rather than established standards. Thus, it is crucial for evaluators to intentionally prioritize social justice in evaluation design and implementation and to select quality assurance tools that match the evaluation context and professional association guidelines to improve the quality of the intervention. I will share key characteristics of the United Nations Evaluation Group's Norms and Standards on social justice to stimulate discussions on evaluators' efforts to address systemic issues.
Collectively, participants will benefit from discussing and reflecting on their own practice by responding to questions like (a) what are examples of their work in collaborative and systems-informed ways to intentionally include social justice in their evaluations, and (b) what should the field of evaluation do to ensure that evaluations add value for people and planet?
Chair
Carlos Rodriguez
Senior Manager Strategy & Evaluation, Department of Energy, Environment and Climate Action
Speakers
Sandra Ayoo
Assistant Professor, University of North Carolina Greensboro
Dr. Ayoo is an Assistant Professor of Educational Research Methodology in the Department of Information, Library, and Research Science at the School of Education, University of North Carolina Greensboro. She teaches graduate courses in program evaluation and research methodology...
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Navigating the choppy waters of the evaluation landscape in the Pacific
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
106
Authors: Allan Mua Illingworth (Mua'akia Consulting and Insight Pasifika), Fiona Fandim (Pacific Community (SPC), FJ), Eroni Wavu (MEL Officer for Pacific Women Lead at Pacific Community (SPC) and cofounder of the Fiji Monitoring, Evaluation & Learning Community), Mereani Rokotuibau (Balance of Power Program, FJ) and Chris Roche (La Trobe University)

In recent years there have been a number of Pacific-driven initiatives designed to promote monitoring and evaluation practice that is culturally and contextually appropriate. These have occurred within projects and programs as well as at national and regional levels. At the same time, geo-political interest in the Pacific region has resulted in an increased number of bi- and multilateral donor agencies becoming present in the region and/or funding development programs, local organisations, national governments and regional bodies. This has in turn led to an evaluation landscape where notions of 'international best practice', as well as donor policies and practices and associated international research and consulting companies, risk crowding out emergent Pacific-led evaluation initiatives.

This panel will bring together key participants who are leading four examples of these Pacific experiences: the Rebbilib process initiated by the Pacific Community (SPC); Insight Pasifika (an emerging Pacific-led and owned collective focused on evaluation in the first instance); the Fiji Monitoring, Evaluation & Learning Community; and the Balance of Power program (a Pacific-led initiative, supported by the Australian Government, focused on improving the political, social and economic opportunities for women and girls), each of which is seeking to create space for processes of monitoring, evaluation and learning that are consistent with Pacific ways of knowing and being. They will share their experience, the challenges they face and ideas about what forms of support from international donors, consultants and advisors would be enabling rather than undermining.

Moderated by Prof. Chris Roche, the panel and audience will also draw out lessons from these four cases about what might contribute to more systemic change in the evaluation landscape more generally.
Chair
Charlie Tulloch
Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Speakers
Allan Mua Illingworth
Adjunct Research Fellow, La Trobe University
Allan Mua Illingworth is a Monitoring and Evaluation specialist of Pacific Island heritage with a long career of international development experience and an extensive network of contacts who have worked to support development regionally and across many Pacific Island countries over...
Chris Roche
Professor of Development Practice, La Trobe University
I am Professor of Development Practice with the Centre for Human Security and Social Change at La Trobe University - (https://www.latrobe.edu.au/socialchange) - and former Deputy Director of the Developmental Leadership Program (www.dlprog.org) and member of the intellectual leadership...
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Thursday, September 19
 

10:30am AEST

Navigating the maze of causality: Understanding the relationship between carcinogenic betel nut consumption and learning outcomes
Thursday September 19, 2024 10:30am - 11:30am AEST
106
Authors: Kabira Namit (Abt Global), Kathryn Lee (Abt Global, AU)

This hands-on session is designed to strengthen emerging evaluators' intuition about causality in non-experimental evaluations.

In environments where conventional RCTs are unethical or unfeasible, identifying causal factors within the complex weave of societal factors and individual behaviours presents a significant challenge. Centred on a novel research project from Papua New Guinea, this session navigates this maze through real-world research (exploring the intricate relationship between the consumption of carcinogenic betel nut and its impact on educational outcomes). By focusing on this specific case study, we provide a concrete context for participants to understand the broader implications of causal explorations in fragile and sensitive settings.

Participants will actively engage in small group discussions in a collaborative learning environment where they can practice and refine their skills in causal evaluation by discussing scenarios that are reflective of real-world complexities.

This session aims to move beyond simply documenting correlations, encouraging a deep dive into the underlying dynamics of causal linkages. Through this exploration, we aim to eventually guide participants to discussions on pathways for targeted interventions and policy formulations which take causal chains into account.

Additionally, we aim to spark dialogue on the ethical dimensions of 'activist research,' exploring how evaluators can navigate moral dilemmas while advocating for meaningful change. This hands-on session not only seeks to build evaluative skills but also to inspire participants to consider the broader implications of their work on societal well-being and ethical research practices.
Chair
Mary Ann Wong
Research Specialist, California State University, Sacramento
Speakers
Thursday September 19, 2024 10:30am - 11:30am AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Navigating the unfamiliar: Evaluation and sustainable finance
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Authors: Donna Loveridge (Independent Consultant), Ed Hedley (Itad Ltd UK, GB)

The nature and magnitude of global challenges, such as climate change, poverty and inequality, biodiversity loss and food insecurity, mean that $4 trillion is needed annually to achieve the Sustainable Development Goals by 2030. Government and philanthropic funding alone is not enough; additional tools include business and sustainable finance. Evaluators may relate to many of the objectives that business and sustainable finance seek to contribute to, but discomfort can arise in the mixing of profit, financial returns, impact and purpose.

Sustainable finance, impact investing, and business for good are growing globally and provide opportunities and challenges for evaluators, evaluation practice and the profession.
This session explores this new landscape and examines:
  • What makes us uncomfortable about the dual objectives of purpose and profit, notions of finance and public good, and unfamiliar stakeholders and languages, and what evaluators can do in response.
  • The opportunities for evaluators to contribute to solving interesting and complex problems with current tools and skills, and where there is space for developing evaluation theory and practice.
  • How evaluation practice and evaluators' competencies might expand and deepen so as not to be left behind in these new fields, while sustaining evaluation's relevance to addressing complex challenges.

The session draws on experience in Australia and internationally to share some practical navigation maps, tools and tips to help evaluators traverse issues of values and value, working with investors and businesses, and identify opportunities to add value.
Chair
Thursday September 19, 2024 12:00pm - 12:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Scaling Impact: How Should We Evaluate the Success of a Scaling Journey?
Thursday September 19, 2024 1:30pm - 2:30pm AEST
106
Authors: John Gargani (Gargani + Co)

The world has never faced larger problems—climate change, refugee crises, and COVID-19, to name just three. And organizations have responded by scaling solutions to unprecedented size—sustainable development goals, global refugee policies, and universal vaccination programs. But scaling is a journey to a destination imperfectly imagined at the outset and difficult to recognize upon arrival. At what point is scaling a program, policy, or product successful? Under what conditions should scaling stop? Or "descaling" begin? Robert McLean and I posed these and other questions to innovators in the Global South and shared what we learned in our recent book *Scaling Impact: Innovation for the Public Good*. In this session, we outline the book's four research-based scaling principles—justification, optimal scale, coordination, and dynamic evaluation. Then we discuss how to (1) define success as achieving impact at optimal scale, (2) choose a scaling strategy best suited to achieve success, and (3) judge success with dynamic evaluation. My presentation goes beyond the book, reflecting our most current thinking and research, and I provide participants with access to free resources, including electronic copies of the book.
Chair
Carolyn Wallace
Manager Research and Impact, VicHealth
Carolyn is an established leader in health and community services with over 22 years of experience across regional Victoria, Melbourne, and Ireland. She has held roles including CEO, executive director, policy officer, and researcher, specialising in community wellbeing and social...
Speakers
John Gargani
President, Gargani + Company (former President of the American Evaluation Association)
Dr John Gargani is an evaluator with 30 years of experience and eclectic interests. He is President of the evaluation consulting firm Gargani + Company, served as President of the American Evaluation Association in 2016, coauthored the book Scaling Impact: Innovation for the Public...
Thursday September 19, 2024 1:30pm - 2:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

The Art of Qualitative Sensemaking: Exploring New Methods
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105
Authors: Alli Burness (Tetra Tech), Sharon Marra-Brown (ARTD, AU), Matt Healey (First Person Consulting, AU), Monica Wabuke (Tetra Tech, FJ)

Sensemaking is the process of making meaning and distilling the signal from the noise in primary research. Inclusive and transparent sensemaking ensures the critical link is maintained between evidence and insights, that evidence is interpreted correctly, and the views of participants are understood correctly. Using intentional sensemaking approaches with integrity can ensure transparency and logical rigor in an evaluation or research project.

Despite its critical nature, sensemaking can often be the most opaque step in an evaluation process. While replication is a hallmark of good sensemaking, especially in academia, this is not always feasible in the fast-paced world of evaluation. The time required to do sensemaking well, the importance of applying the correct approaches and engaging the correct parties, and the critical role of a lead facilitator can be overlooked or underestimated. By shining a spotlight on this step in an evaluation, this session will highlight inclusive and accessible sensemaking approaches used across the design and evaluation spectrum to identify new or emergent approaches. It will pay particular attention to sensemaking when working in complex systems.

Panellists bring deep experience in evaluation or design research in Australian or international consulting settings. They will touch on what sensemaking approaches can be used to maintain integrity through a rapid or agile sensemaking process common in large or complex evaluations; popular sensemaking processes for coding data and new or emerging methods; and how insights or recommendations emerge from the sensemaking process. The moderator will start the panel by reflecting on the definitions, understanding and application of sensemaking, with an emphasis on inclusive and accessible aspects. Our presenters will then explore methods through this same lens and with a focus on emergent or new approaches. Methods will be presented in a manner that audience members can learn and apply.
Chair
Janet Conte
Principal Evaluation Officer, DPIRD
I live in Perth (Boorloo) and have 3 children. I really enjoy being a co-convenor of the WA branch of the AES with Lisette Kaleveld. I'm interested in learning more about systems evaluation and building an evaluation culture.
Speakers
Matt Healey
Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have: Co-founded and grown an organisation (First Person Consulting) to a team of 16 people working...
Sharon Marra-Brown
Director, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence. Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade plus in the public service or how I weave...
Monica Wabuke
Associate Director - Research, Monitoring and Evaluation Practice, Tetra Tech International Development - Asia Pacific
Monica Wabuke is an Associate Director within Tetra Tech’s Research, Monitoring and Evaluation Practice (RME). She brings 14 years of experience in design, monitoring and evaluation and has provided technical support to DFAT, MFAT, EU, USAID and World Bank-funded projects and programs...

Alli Burness

Director, Australian Consulting, Tetra Tech
Alli is an Australian strategic designer and researcher with settler heritage, born and living on Bunurong Country. As Director of the Australian Consulting Practice at Tetra Tech International Development, Alli works with a First Nations team to support relational approaches across... Read More →
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

The evolution of evaluation: Retracing our steps in evaluation theory to prepare for the future
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: James Ong (University of Melbourne)

As new people enter the evaluation field and evaluation marches forward into the future, it is important to learn from the evaluation theorists who have come before us. My Ignite presentation will argue that modern evaluation is built on evaluation theory, and will call on evaluators of all levels to learn evaluation theory to:
  1. Appreciate how evaluation has evolved;
  2. Strengthen their evaluation practice; and
  3. Navigate an ever-changing evaluation landscape.
Chair

Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →
Speakers

James Ong

Research Assistant (Evaluations), University of Melbourne
My name is James Ong. I am an Autistic program evaluator at the University of Melbourne, where I work on evaluation and implementation projects in various public health initiatives such as the AusPathoGen program and the SPARK initiative. I not only have a strong theoretical... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

From KPIs to systems change: Reimagining organisational learning
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Authors: Katrina Barnes (Clear Horizon), Irene Guijt (Oxfam Great Britain, GB), Chipo Peggah (Oxfam Great Britain, ZW)

Traditional measures of success for international non-governmental organizations (INGOs) have been based on western (and often colonial) theories of change, the use of predefined metrics, and ways of knowing that rarely fit local realities and interests. Projectised, pre-determined understandings of change limit honest reflection on larger transformative change, and inhibit meaningful learning and adaptation.

INGOs globally are being challenged to decolonise their knowledge and evaluation processes. Over the past 18 months, Oxfam Great Britain has undergone a journey to redesign how we understand impact, to rebalance and reframe accountability, and to strengthen learning. This new approach focuses on collective storytelling, sensemaking and regular reflection on practice. We are taking a theory-led approach to make meaning out of signals that systems are shifting across a portfolio of work. Drawing on a bricolage of evaluation methodologies (Outcome Harvesting-lite, meta-evaluation and synthesis, evaluative rubrics, and impact evaluations), we are slowly building up a picture over time across the organisation, to tell a story of systemic change. We have seen how meaningful and honest evidence and learning processes have enabled a stronger culture of learning.

Although we are far from the end of this journey, we have learnt some critical lessons and face ongoing challenges. We are not the only ones: many foundations, funders and philanthropic organisations are going through similar processes as they increasingly try to understand their contribution to systems change. These conversations are therefore imperative for the field of evaluation, as organisations navigate new ways to 'evaluate' their own work.

In this presentation, we will start the discussion by sharing Oxfam Great Britain's journey, with key challenges faced and lessons learnt. We will then invite a Q&A conversation to harvest insights from others also seeking to reimagine organisational learning that is grounded in decolonising knowledge processes and in understanding systems change.
Chair

Elissa Mortimer

Manager & MEL Specialist, Palladium
I have worked in the international development and health sectors for the past 25 years, primarily in nutrition, maternal and child health, HIV, tobacco control, non-communicable diseases and skills development. I have worked on a broad variety of projects, including local community... Read More →
Speakers

Katrina Barnes

Principal Consultant, Clear Horizon
Thursday September 19, 2024 3:30pm - 4:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Friday, September 20
 

10:30am AEST

To tinker, tailor, or craft from scratch? The tension in using validated tools in novel contexts
Friday September 20, 2024 10:30am - 11:00am AEST
104
Authors: Matt Healey (First Person Consulting), Alicia McCoy (First Person Consulting, AU), Tenille Moselen (First Person Consulting, AU)

In the dynamic realm of evaluation methodology, the discourse surrounding the use of validated tools versus the customization of evaluation metrics remains contentious. This session delves into the risks inherent in both approaches. These risks are often compounded when those in positions of power prefer validated tools over context-specific data collection questions or approaches. The tension this elicits is only increasing as evaluators assess digital interventions for which there is no direct tool to draw upon, leaving them to navigate uncharted territory.

Moreover, there is an ever-increasing range of validated tools available, but little direction for evaluators - particularly emerging and early-career evaluators - to assist in deciding among them. This session presents experiences from a range of digital and in-person projects, and explores scenarios where there was no 'obvious solution'. It will be of particular relevance to those undertaking evaluations of digital and novel programs.

Through candid dialogue and shared anecdotes, participants will reflect on their experiences in navigating decisions to adopt, adapt, or reject validated tools, and the learning that resulted. Embracing controversy, this session encourages attendees to challenge conventional wisdom and critically examine the balance between the reliability of validated tools, the importance of fitting data collection to context, and most importantly what 'good' looks like.

Join the conversation as we navigate the complex landscape of evaluation methodology, exploring the tensions between established practices and the pursuit of innovation in evaluation processes.

Chair

Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of... Read More →
Speakers

Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have: co-founded and grown an organisation (First Person Consulting) to a team of 16 people working... Read More →

Tenille Moselen

First Person Consulting
https://www.fpconsulting.com.au/our-team.html
Tenille has qualifications in public health, with experience in mental health and wellbeing, alcohol and drug, and international development. Her passion is creating change through design and bringing stakeholders together to address complex... Read More →

Alicia McCoy

Principal Consultant, First Person Consulting
Alicia has 15 years of experience leading research and evaluation teams in the not-for-profit sector and is passionate about the role that research and evaluation plays in creating lasting change for individuals, families and communities. Alicia’s areas of interest include evaluation... Read More →
Friday September 20, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Walking together: First Nations participation, partnerships and co-creation in Evaluation.
Friday September 20, 2024 10:30am - 11:30am AEST
106
Authors: Tony Kiessler (First Nations Connect), Alice Tamang (First Nations Connect, AU)

Effective First Nations engagement is integral to the design and delivery of culturally safe evaluations. The AES' First Nations Cultural Safety Framework discusses 10 principles for culturally safe evaluation and describes the journey of engagement. However, the question of how to engage effectively can be the first and most significant challenge evaluators face. There is little clarity on how to create opportunities for First Nations leadership and voices in our evaluations, how to engage appropriately, and whom we should engage with. There is also the challenge of managing tight timeframes, client expectations and capabilities, which can limit the focus on meaningful First Nations participation, partnership and co-creation.

This unique offering enables practitioners and First Nations facilitators to walk together, explore shared challenges and identify opportunities to improve First Nations engagement. The session will explore the potential for partnerships in informing and implementing evaluations, opportunities to increase First Nations participation and privilege First Nations experience and knowledge, and how evaluation practitioners can draw on these strengths through co-creation to amplify First Nations voices and leadership in evaluation practice.

This session aims to:
  • Explore a principles-based approach to First Nations engagement;
  • Discuss shared experiences on successful approaches to enhance First Nations partnership, participation and co-creation; and
  • Develop a shared understanding of how to take this knowledge forward through culturally safe evaluation commissioning, practice and reporting.

Discussion will draw on the collective experience of both the attendees and the facilitators, walking together. The sharing of ideas will be encouraged in a safe space that engages the audience in a collaborative dialogue with First Nations practitioners. This dialogue will explore current knowledge, capabilities and gaps, as well as the challenges (and how they can be overcome), as part of the broader journey to culturally safe evaluation practice.


Chair

Rachel George

Director, Research, Monitoring and Evaluation Practice, Tetra Tech International Development
Speakers

Tony Kiessler

Co-Convener, First Nations Connect
Tony is a Central Arrernte man, consultant and researcher living and working on beautiful Gundungurra Country in the NSW Southern Highlands. He is an evaluation, strategic planning and research consultant with a particular interest in health, human development and social inclusion... Read More →

Alice Tamang

Consultant, First Nations Connect
Alice is a Dharug woman based on Wurundjeri Country. She is a consultant and advisor, with a focus on facilitating connections between cultures, empowering individuals and communities to share knowledge and enhance cultural understanding. Alice primarily works on DFAT funded programs... Read More →

Nicole Tujague

Founder and Director, The Seedling Group
Nicole Tujague
Bachelor of Indigenous Studies (Trauma and Healing/Managing Organisations); 1st Class Honours, Indigenous Research; PhD in Indigenous-led Evaluation, Gnibi College, Southern Cross University
Nicole is a descendant of the Kabi Kabi nation from Mt Bauple, Queensland and the... Read More →
Friday September 20, 2024 10:30am - 11:30am AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Participatory Analysis Workshops: a novel method for identifying important factors across diverse projects
Friday September 20, 2024 11:00am - 11:30am AEST
104
Authors: Martina Donkers (Martina Donkers), Ellen Wong (ARTD, AU), Jade Maloney (ARTD, AU)

Some programs comprise a range of diverse projects striving towards a common goal - for example grant programs, where a wide range of different grantees are carrying out different projects with very different activities in pursuit of the grant program objectives.

These can be a challenge to evaluate - with so many different activities, outputs cannot be easily aggregated, and each project may be responding to its local context in unique but important ways. These programs need efficient ways to identify common factors affecting implementation and outcomes that reflect the richness of the activities undertaken, but do not place undue burden on organisations, particularly those receiving smaller grants.

We developed a novel method that uses participatory workshops to explore commonalities in implementation across projects and the various ways they seek to achieve common program outcomes. The theory-driven method builds on Most Significant Change and Qualitative Comparative Analysis, and combines data collection with collaborative participatory analysis to build a rich qualitative understanding of projects in a relatively short timeframe with fewer resources. Active participation from project leaders (e.g. grant recipients) builds cohesion across the program, and helps project leaders feel more connected and supported.

This paper outlines the theory, approach and uses of Participatory Analysis Workshops, including their strengths and limitations and the types of data and insights the method can yield. We use our work with the NSW Reconstruction Authority to evaluate the Covid Community Connection and Wellbeing Program to illustrate what we've learnt about how the method works and in what circumstances, and then identify other potential use cases. Participants will have an opportunity to ask questions to help inform future uses of this method. This information will equip evaluators with tools to navigate varying territory together to understand progress toward program outcomes.

Chair

Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of... Read More →
Speakers

Martina Donkers

Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured... Read More →

Ellen Wong

Consultant, ARTD Consultants
I'm a consultant at ARTD with a background in human geography and environmental studies. I bring this lens to the work I do and am particularly passionate about the intersection between people and the environment. My portfolio spans environmental policy, disaster recovery and community... Read More →

Jade Maloney

Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →
Friday September 20, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Value Propositions: Clearing the path from theory of change to rubrics
Friday September 20, 2024 11:00am - 12:30pm AEST
Authors: Julian King (Julian King & Associates Limited), Adrian Field (Dovetail Consulting Limited, NZ)

Evaluation rubrics are increasingly used to help make evaluative reasoning explicit. Rubrics can also be used as wayfinding tools to help stakeholders understand and participate meaningfully in evaluation. Developing rubrics is conceptually challenging work and the search is on for additional navigation tools and models that might help ease the cognitive load.

As a preliminary step toward rubric development it is often helpful to co-create a theory of change, proposing a chain of causality from actions to impacts, documenting a shared understanding of a program, and providing a point of reference for scoping a logical, coherent set of criteria.

However, it's easy to become disoriented when getting from a theory of change to a set of criteria, because the former deals with impact and the latter with value. Implicitly, a theory of change may focus on activities and impacts that people value, but this cannot be taken for granted - and we argue that value should be made more explicit in program theories.

Specifying a program's value proposition can improve wayfinding between a theory of change and a set of criteria, addressing the aspects of performance and value that matter to stakeholders. Defining a value proposition prompts us to think differently about a program. For example, in addition to what's already in the theory of change, we need to consider to whom the program is valuable, in what ways it is valuable, and how the value is created.

In this presentation, we will share what we've learnt about developing and using value propositions. We'll share a simple framework for developing a value proposition and, using roving microphones, engage participants in co-developing a value proposition in real time. We'll conclude the session by sharing some examples of value propositions from recent evaluations.

Chair

Laura Bird

MERL Associate, Paul Ramsay Foundation
Speakers

Julian King

Director, Julian King & Associates
I’m an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I’m affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/... Read More →

Adrian Field

Director, Dovetail
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social... Read More →
Friday September 20, 2024 11:00am - 12:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Designing baseline research for impact: The SKALA experience
Friday September 20, 2024 12:00pm - 12:30pm AEST
Authors: Johannes Prio Sambodho (SKALA), Ratna Fitriani (SKALA, ID)

SKALA (Sinergi dan Kolaborasi untuk Akselerasi Layanan Dasar - Synergy and Collaboration for Service Delivery Acceleration) is a significant Australian-Indonesian cooperation that focuses on enhancing parts of Indonesia's extensive, decentralized government system to accelerate better service delivery in underdeveloped regions. As part of its End of Program Outcome of greater participation, representation and influence for women, people with disabilities, and vulnerable groups, SKALA is commissioning baseline research on multi-stakeholder collaboration for mainstreaming Gender Equality, Disability, and Social Inclusion (GEDSI) in Indonesia. The program has designed a mixed-method study: qualitative methods to assess the challenges and capacity gaps GEDSI civil society organizations (CSOs) face in actively participating in and contributing to the subnational planning and budgeting process, coupled with a quantitative survey to measure trust and confidence between the same CSOs and the local governments with whom they engage. The paper first discusses the baseline study's design and its alignment with SKALA's strategic goals, and considers how the research might itself contribute to improved working relationships in planning and budgeting at the subnational level. Second, the paper discusses approaches taken by the SKALA team to design a robust programmatic baseline that is also clearly useful in program implementation. These include: a) adopting an adaptive approach that distils key emerging issues from grassroots consultations and the broader governmental agenda into a research objective; b) locating the study within a broader empirical literature to balance practical baseline needs with academic rigor; and c) fostering collaboration with the program implementation team to ensure the study serves both evaluation and programmatic needs.
Lastly, based on SKALA's experience, the paper will argue for closer integration of research and implementation teams within programs as a way to support systems-informed methodologies, and will consider ways in which this can be practically accomplished.
Chair

Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre... Read More →
Speakers

Johannes Prio Sambodho

Research Lead, SKALA
Dr. Johannes Prio Sambodho is the Research Lead for SKALA, a significant Australian-Indonesian development program partnership aimed at improving basic service governance in Indonesia. He is also a former lecturer in the Department of Sociology at the University of Indonesia. His... Read More →
Friday September 20, 2024 12:00pm - 12:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

From evaluation to impact: practical steps in a qualitative impact study
Friday September 20, 2024 1:30pm - 2:00pm AEST
Authors: Linda Kelly (Praxis Consultants), Elisabeth Jackson (La Trobe University, AU)

This presentation focuses on a multi-year program funded by Australia that aims to empower people marginalised by gender, disability and other factors. Like similar programs, the work is subject to regular monitoring and evaluation - testing the effectiveness of program activities largely from the perspective of the Australian and partner-country governments.
But what of the views of the people served by the program? Is the impact of the various activities sufficient to empower them beyond their current condition? How significant are the changes introduced by the program, given the structural, economic, social and other disadvantages experienced by the marginalised individuals and groups?
Drawing on feminist theory and qualitative research methods, and managed with local research and communication experts, this presentation outlines a study focused on the long-term impact of the program.

The presentation will outline the methodology and the practical considerations in developing the approach and data collection methods. It will highlight the value of exploring impact from a qualitative perspective, while outlining the considerable management and conceptual challenges involved in designing, introducing and supporting such an approach. It will consider some of the implications of shifting from traditional evaluation methods to more open-ended enquiry, and ask whose values are best served through evaluation versus impact assessment.


Chair

James Copestake

Professor, International Development, University of Bath, UK
James Copestake is Professor of International Development at the University of Bath in the UK, where he is also Director of Studies for the Doctorate in Policy Research and Practice at the Institute of Policy Research.His publications range broadly across international development... Read More →
Speakers

Linda Kelly

Director, Praxis Consultants

Elisabeth Jackson

Senior Research Fellow, Centre for Human Security and Social Change, La Trobe University
Dr Elisabeth Jackson is a Senior Research Fellow at the Centre for Human Security and Social Change where she conducts research and evaluation in Southeast Asia and the Pacific. She is currently co-leading an impact evaluation of a program working with diverse marginalised groups... Read More →
Friday September 20, 2024 1:30pm - 2:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Squaring up with rubrics
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Josh Duyker (Centre for Evaluation and Research Evidence, Victorian Department of Health)

Much like Felix the Cat, evaluators have a bag of tricks to get us out of sticky situations. But when you are staring face to face with a complex evaluand, juggling tricky stakeholders whilst sat on a mountain of data, it's not always clear which 'trick' you need! One potential solution is the colourful yet humble rubric. In this reflective practice Ignite presentation, I will guide you through our journey of using rubrics as a tool to wayfind through an evaluation, and our key takeaways on how rubrics can support evaluators to make comprehensive and balanced evaluative judgements.
Chair

Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and... Read More →
Speakers

Josh Duyker

Evaluation and Research Officer, Centre for Evaluation and Research Evidence
I am an emerging evaluator, currently working at the Centre for Evaluation and Research Evidence in the Victorian Department of Health. I've completed a Master of Public Health and am embarking on a Masters of Evaluation. Through roles in the not-for-profit sector and my studies... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Introducing a trauma informed AI assessment tool for evaluators of AI (artificial intelligence) assisted programs and services.
Friday September 20, 2024 2:00pm - 2:30pm AEST
104
Authors: Lyndal Sleep (Central Queensland University)

AI and other forms of digital technology are increasingly being used in program and service delivery. They promise increased efficiency, accuracy and objectivity; however, these technologies can also cause significant harm and trauma, as seen in Robodebt. It is vital for evaluators to be aware of the key questions to ask to prevent AI having unintended impacts on program processes, outputs and outcomes, or causing harm to service users.

Objective
This session aims to support evaluation of AI assisted programs and services by introducing evaluators to a new and innovative trauma informed AI assessment tool.

Core argument
(1) AI is increasingly being used in programs and services, and understanding the resulting risks is essential for evaluators to assess whether services are meeting intended outcomes.
(2) Many evaluators are unaware of what types of risks to look for when assessing AI assisted services, or what questions to ask - especially when conducting trauma informed evaluations.
(3) A practical trauma informed AI assessment tool has been developed by researchers from [Universities omitted], with funding from [omitted], to address this need, and will be briefly introduced.

A short paper session will:
(1) Highlight the problem that AI is increasingly being used to assist program and services delivery, but many evaluators are unaware of the main risks to consider when evaluating these services.
(2) Suggest as a solution a practical tool that considers these risks, drawing on technological knowledge within a trauma informed framework, and that can be employed by evaluators.
(3) Introduce a trauma informed AI assessment tool, the method used to develop it, as well as its intended practical use by evaluators (both internal and external to organisations).

There will be 10 minutes for questions and discussion at the end of the presentation.

Chair

Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development... Read More →
Speakers
Friday September 20, 2024 2:00pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia
 