Ensure that your profile is set to public (not private). If your profile is private, you will be unable to participate fully and will miss important conference announcements.
The online program is currently invite-only. Over the coming weeks we will grant access to presenters and delegates so they can create a profile in the online app and choose the sessions they’d like to attend. Please be patient; if you are a registered delegate or presenter, we will send instructions when we are ready to invite you.
Register at: https://conference2024.aes.asn.au
Wednesday, September 18
 

9:00am AEST

Opening plenary: Welcome to Country followed by June Oscar "Wiyi Yani U Thangani - re-imagining evaluation with a gender justice lens"
Wednesday September 18, 2024 9:00am - 10:30am AEST
Welcome to Country & Smoking Ceremony

Opening address: Kiri Parata, President, Australian Evaluation Society

Keynote address: "Wiyi Yani U Thangani - re-imagining evaluation with a gender justice lens"

June Oscar AO

As the first woman to be the Aboriginal and Torres Strait Islander Social Justice Commissioner, June led the ground-breaking national project, Wiyi Yani U Thangani (Women’s Voices). In March 2024, the Wiyi Yani U Thangani Institute for First Nations Gender Justice was launched at the Australian National University to carry the legacy of the project—which has gathered a powerful evidence base of the rights, issues and aspirations of thousands of First Nations women and girls.
The Institute is developing an applied measurement, evaluation and learning approach formed by the voices and cultures of First Nations women and girls. This approach will guarantee that women can own their own evidence and identify areas of research which respond to their strengths and priorities. June’s address will explore this unique approach and how evaluation methods shaped by women’s voices can lead to systemic transformation benefiting all Australians.



Speakers

June Oscar AO

Inaugural Chair, The Wiyi Yani U Thangani Institute for First Nations Gender Justice
June Oscar AO is a proud Bunuba woman from the remote town of Fitzroy Crossing in Western Australia’s Kimberley region. She is a strong advocate for Indigenous Australian languages, social justice, women’s issues, and has worked tirelessly to reduce Fetal Alcohol Spectrum Disorder...
Wednesday September 18, 2024 9:00am - 10:30am AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Evaluation that adds value for People and Planet: Perspectives, Challenges, and Opportunities for Indigenous Knowledge Systems in Africa.
Wednesday September 18, 2024 11:00am - 11:30am AEST
104
Authors: Awuor Ponge (African Policy Centre (APC))

Indigenous Knowledge Systems (IKS) in Africa have long been marginalized and undervalued, despite their potential to offer sustainable solutions to pressing challenges faced by communities across the continent. This presentation explores the perspectives, challenges, and opportunities for incorporating IKS into evaluation practices that create value for both people and the planet.

From a people-centric perspective, IKS offer a holistic and culturally relevant approach to understanding local contexts, priorities, and value systems. By embracing these knowledge systems, evaluations can better capture the multidimensional nature of well-being, including spiritual, social, and environmental aspects that are often overlooked in conventional evaluation frameworks. However, challenges arise in reconciling IKS with dominant Western paradigms and navigating power dynamics that have historically suppressed Indigenous voices.

From a planetary perspective, IKS offer invaluable insights into sustainable resource management, biodiversity conservation, and climate change adaptation strategies that have been honed over generations of lived experiences. Integrating these knowledge systems into evaluation can shed light on the intricate relationships between human activities and ecosystem health, enabling more informed decision-making for environmental sustainability. Nonetheless, challenges exist in bridging the divide between traditional and scientific knowledge systems, as well as addressing concerns around intellectual property rights and benefit-sharing.

This presentation will explore innovative approaches to overcoming these challenges, such as participatory and community-based evaluation methodologies, capacity-building initiatives, and cross-cultural dialogue platforms. By fostering a deeper appreciation and understanding of IKS, evaluation practices can become more inclusive, relevant, and effective in creating value for both people and the planet in the African context.


Speakers

Awuor Ponge

Senior Associate Research Fellow, African Policy Centre (APC)
Dr. Awuor Ponge is a Senior Associate Fellow in charge of Research, Policy and Evaluation at the African Policy Centre (APC). He is also the Vice-President of the African Evaluation Association (AfrEA). He holds a Doctor of Philosophy (PhD) Degree in Gender and Development Studies...
Wednesday September 18, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Appreciating First Nations voices: Using appreciative inquiry and participation in the evaluation of Community Justice Groups
Wednesday September 18, 2024 11:00am - 12:00pm AEST
Authors: Michael Limerick (Myuma Pty Ltd), Melinda Mann (Myuma Pty Ltd, AU), Melissa Osborn (Myuma Pty Ltd, AU)

Emerging best practice principles for Indigenous evaluations encourage evaluators to find new ways of conducting evaluations of programs delivered in First Nations settings. The impetus for this work is a growing awareness that evaluation activity carries the risk of perpetuating colonising impacts on First Nations people, especially in relation to sovereignty over knowledge and data, the level of consent and self-determination in the process, the level of appreciation of cultural insights and community strengths, and the sharing of the benefits of evaluation activity.

For the evaluation of the Community Justice Group (CJG) Program in Queensland, the Department of Justice and Attorney-General engaged our organisation, an Aboriginal social enterprise from Queensland, to deliver an evaluation guided by best practice Indigenous evaluation principles. Encouraged by the Department's evaluation brief, our organisation assembled a team of predominantly Indigenous people with deep community connections to facilitate a strengths-based and collaborative approach that would put First Nations voices and perspectives at the centre of the evaluation. Over three years, the team worked with CJG staff and members to co-design and deliver place-based 'local evaluations' in 25 locations as the central feature of the statewide program evaluation. The goal was to 'walk alongside' CJGs to respect their agency and afford them growth opportunities, and to seek out stories of success rather than evidence of deficit.

Working in partnership, our organisation and the Department learned much on this journey. Fully implementing Indigenous ethical evaluation principles was not without its challenges - for example, meaningful participation can only occur through relationship-building that takes time and stretches evaluation budgets, and principles such as Indigenous data sovereignty can be difficult to implement in government contexts. However, the value of the approach is evident in, firstly, the way that many CJGs embraced the local evaluations and, secondly, the powerful qualitative evidence of program success yielded by the Appreciative Inquiry-inspired storytelling methods.
Speakers

Michael Limerick

Lead Consultant, Myuma
Dr Michael Limerick is a Brisbane-based consultant and lawyer specialising in Indigenous governance and policy. He is Lead Consultant for the research and evaluation arm of Aboriginal social enterprise, Myuma Pty Ltd, and an Adjunct Associate Professor at the Institute for Social...
Wednesday September 18, 2024 11:00am - 12:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Design-stage evaluative thinking: helping NGOs and grant makers learn to love evaluation from the start
Wednesday September 18, 2024 11:00am - 12:00pm AEST
103
Authors: Claire Grealy (Rooftop Social), Duncan Rintoul (Rooftop Social, AU), Virginia Poggio (Paul Ramsay Foundation, AU), Luciana Campello (NSW Department of Communities and Justice, AU), Kirsty Burow (NSW Department of Communities and Justice, AU), Jacqueline Webb (National Association for the Prevention of Child Abuse and Neglect (NAPCAN), AU)

The evaluation of grant programs has long frustrated grantees and perplexed fund managers.
Evaluators often arrive at the end, and may find a strong narrative about the funded activity (assuming the project staff are still in place) but less of the documentation and data that demonstrates the impact or learning, or links each project to the fund objectives.

Fund managers have often had to be content with the limited results available to them, sometimes as basic as acquittals on activity and expenditure. This limits funders' ability to capture learning, feed into new fund designs, mount budget bids, or tell a compelling story about the work grant holders are doing.

This panel brings together a cross-section of key players and beneficiaries from a variety of contexts:
* a state government fund manager in the human services sector
* an evaluation lead from a large national philanthropic organisation
* an experienced project manager from a national NGO that receives grants from various sources
* two evaluation specialists who have deep experience working in this space, developing and delivering this kind of support.

Drawing on case studies from practice, this panel will share some innovative approaches from their work, which bring the right mix of expectation and support to the design stage of grant-based projects, from the time of submitting an EOI through to the point of evaluation readiness.

The fruit that hangs off this tree includes:
* strengthening the 'evaluability' of each project and the overall fund
* testing each project's assumptions and ambitions
* deep conversations between grant makers and grant holders about outcome alignment
* building the evaluative thinking and capability of project teams and organisations, activating the 'ripple effect' as participants share their newfound commitment and skills with their colleagues.
"You couldn't drag me to program logic workshop before this. And now look at me - I took that process you did with us and yesterday I ran it with my team on another project."
Speakers

Duncan Rintoul

Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, still have heaps to learn. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health and you name it. Looking forward to catching up with...

Virginia Poggio

MERL Associate, Paul Ramsay Foundation
As a Measurement, Evaluation, Research, and Learning (MERL) Associate at the Paul Ramsay Foundation, I lead teams to deliver evidence-based advice to inform the Foundation’s strategic initiatives. My role involves commissioning, supporting, and managing independent evaluations of...
Wednesday September 18, 2024 11:00am - 12:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

The psychology of evaluation capacity building: Finding the way with the rider, elephant and the pathway
Wednesday September 18, 2024 11:00am - 12:00pm AEST
106
Authors: Samantha Abbato (Visual Insights People)

Evaluation capacity building is increasingly becoming a core part of evaluation practice and a critical part of incorporating evaluation into the everyday activity of organisations (Preskill & Boyle, 2008; White, Percy & Small, 2018). Reaching the point where evaluation becomes the way of doing business requires a change of knowledge, skills, and attitudes.

Changes need to happen at the level of individuals, teams, organisations, and partnerships. This journey requires supporting and managing change to systematic enquiry processes as much as it requires evaluation expertise. In this skill-building session, we introduce Jonathan Haidt's 'rider, elephant and pathway' metaphor as a framework to support change and strengthen evaluation capacity (Haidt, 2018).

Haidt's metaphor for change includes the rider (our rational thinking side) atop an elephant (our emotional side). Behaviour change for individuals and collectives requires steps that (1) support the rider, such as giving clear directions, (2) motivate the elephant by tapping into emotions, and (3) shape a pathway to change, including clearing obstacles. In this interactive session, the facilitator will provide case studies applying Haidt's metaphor, spanning two decades. Through these examples, the power of this framework to support evaluation capacity building is demonstrated. Examples include using Haidt's framework for:
1. Building a Monitoring, Evaluation and Learning (MEL) system with a medium-sized community organisation;
2. Increasing the maturity of MEL in an existing large organisation; and
3. Increasing the impact of evaluation partnerships.

The active skill-building component incorporates:
  • Cartoon elephant, rider and pathway flashcards;
  • A 'snakes and ladders' style game; and
  • Evaluation-specific examples.

The combination of examples and activities is designed to support participant learning. The session will encourage discussion of barriers, enablers and actions to build evaluation capacity relevant to different situations and contexts.

Learning objectives include:
  • Knowledge of a sound and memorable psychological framework for supporting evaluation capacity building; and
  • Ability to apply Haidt's metaphor.
Speakers
Wednesday September 18, 2024 11:00am - 12:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Innovating Value for Money: Finding Our Way to Greater Value for All
Wednesday September 18, 2024 11:00am - 12:00pm AEST
105
Authors: John Gargani (Gargani + Co), Julian King (Julian King & Associates, NZ)

In this participatory session, we pose the question, "How should evaluators innovate the practice of value-for-money assessment to meet the needs of an expanding set of actors that include governments, philanthropists, impact investors, social entrepreneurs, program designers, and Indigenous and First Nations communities?" We begin by framing value for money as an evaluative question about an economic problem. How well are we using resources, and are we using them well enough to justify their use? Then we suggest new methods intended to help innovate the practice of value for money based on our body of published and current research spanning over 10 years.
These include new methods that (1) produce "holistic" assessments of value for money, (2) reflect rather than hide multiple value perspectives even when values conflict, (3) estimate social benefit-cost ratios without monetizing benefits or costs, and (4) adjust monetary and nonmonetary value for risk using Bayesian methods. Along the way, we facilitate discussions with participants, asking them to consider if, how, and by whom these innovations should be pursued, and what other innovations may be needed. We provide participants with access to a collection of our published and draft papers, and invite them to comment and continue our discussion after the conference.
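For readers unfamiliar with the arithmetic behind ideas (3) and (4), the sketch below is purely illustrative and is not the presenters' published method: it simulates a benefit-cost ratio under uncertainty, treating the rate at which benefits are realised as a Beta-distributed quantity in the Bayesian spirit the abstract describes. All numbers and distributions are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo draws

# Hypothetical priors (invented for illustration):
# uncertain annual benefits and costs, in dollars.
benefits = rng.lognormal(mean=np.log(1.2e6), sigma=0.3, size=N)
costs = rng.lognormal(mean=np.log(1.0e6), sigma=0.1, size=N)

# Risk adjustment: benefits are only partly realised. The realisation
# rate gets a Beta posterior, e.g. a uniform prior updated with 8
# successes out of 10 hypothetical pilot observations.
realisation = rng.beta(1 + 8, 1 + 2, size=N)

bcr = (benefits * realisation) / costs
print(f"mean BCR: {bcr.mean():.2f}")
print(f"P(BCR > 1): {(bcr > 1).mean():.1%}")
print(f"90% interval: {np.percentile(bcr, [5, 95]).round(2)}")
```

Reading the output as a distribution over value for money, rather than a single point estimate, is what lets risk be expressed directly in the assessment.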
Speakers

Julian King

Director, Julian King & Associates
I’m an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I’m affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/...

John Gargani

President (former President of the American Evaluation Association), Gargani + Company
Dr John Gargani is an evaluator with 30 years of experience and eclectic interests. He is President of the evaluation consulting firm Gargani + Company, served as President of the American Evaluation Association in 2016, coauthored the book Scaling Impact: Innovation for the Public...
Wednesday September 18, 2024 11:00am - 12:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Planning your conference voyage: Key evaluation concepts for novice sailors
Wednesday September 18, 2024 11:00am - 12:30pm AEST
Authors: Charlie Tulloch (Policy Performance)

Many people will be attending their first AES conference - welcome!
This workshop is targeted at new and emerging evaluators who are seeking to build their familiarity with the key concepts and language used in evaluation and the main elements of evaluation project delivery.
This will provide you with a plotted history of the evaluation field so you feel more comfortable engaging in deeper-dive topics over coming days. The session is grounded in theory, drawing on leading thinkers and methods that are central to our practice.
Key concepts to be covered include: what is evaluation? why should we evaluate? what can be evaluated? when to evaluate? how to evaluate? where can I learn more about evaluation?
The forum will help many brave young and emerging evaluators to navigate through often choppy evaluation waters so they don't feel 'all at sea' over coming days.
Speakers

Charlie Tulloch

Director, Policy Performance
Policy Performance is a conference sponsor! Evaluation, training and all aspects of excellence in public sector service delivery. Helping those new to evaluation to thrive.
Wednesday September 18, 2024 11:00am - 12:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Culturally Responsive Initiatives: Lessons in effective Initiative Design and Evaluation of initiatives that affect First Nations people
Wednesday September 18, 2024 11:30am - 12:00pm AEST
104
Authors: Eugenia Marembo

Representatives of First Nations communities have been advocating for changes in the way initiatives are planned, prioritised, and assessed. This includes greater visibility of where funding is going, more partnership in designing initiatives, and more evaluation of the outcomes being achieved, to inform government decision-making.

This paper presents key insights on what constitutes good practice when designing and appraising initiatives that affect First Nations people and communities. The National Agreement on Closing the Gap is built around four new Priority Reforms that will change the way governments work with Aboriginal and Torres Strait Islander people and communities. Priority Reform Three is about transforming government institutions and organisations. As part of this Priority Reform, parties commit to systemic and structural transformation of mainstream government organisations to improve accountability, and to respond to the needs of First Nations people.

The findings presented in this paper draw on insights from consultations with various First Nations community representatives and government stakeholders in New South Wales, and the subsequent process of developing a government department's First Nations investment framework, which seeks to strengthen the evidence on what works to improve outcomes for First Nations people. The framework also aims to improve practice across government processes and better inform how initiatives are designed, prioritised and funded.
Speakers

Eugenia Marembo

Senior Analyst, First Nations Economic Wellbeing
Wednesday September 18, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

From bottlenecks to breakthroughs: Insights from a teacher workforce initiative evaluation
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
104
Authors: Rhiannon Birch (Victorian Department of Education), Hayden Jose (Urbis, AU), Joanna Petkowski (Victorian Department of Education, AU), Ekin Masters (Victorian Department of Education, AU)

How can an evaluation balance the need to generate pragmatic insights with central agency requirements for rigorous measurement of outcomes? What ingredients can facilitate the effective evaluation of a government initiative and achieve improved outcomes? This paper explores the essential ingredients for evaluating a large-scale government program, using the example of a statewide initiative aimed at attracting and retaining suitably qualified teachers in hard-to-staff positions in Victorian government schools.

We showcase how an adaptive and evidence-led method of enquiry helped identify program implementation bottlenecks and probe potentially unintended program outcomes over a three-year evaluation. We discuss enablers for the integration of evaluation recommendations into program implementation and future policy direction, particularly participatory action approaches and deep relationships with policy and implementation teams. We will also present the robust and varied methodology, particularly the novel use of system data to facilitate a quasi-experimental design that aligned with central agency requirements and met stakeholder needs.
This presentation will benefit policymakers, program evaluators, and others interested in evaluating government programs, by sharing key learnings on how evaluations can balance pragmatic insights with central agency requirements and identifying the key elements for influencing such programs and achieving improved outcomes.
Speakers

Rhiannon Birch

Senior Evaluation and Research Officer, Department of Education
Rhiannon is a dedicated research and evaluation specialist committed to enhancing health, social, education, and environmental outcomes for people and the planet. With over 10 years of experience in evaluation, she has worked extensively across emergency services, public health, and...
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Monitoring and Evaluation Journeys: Making footprints, community-based enterprise in Australian First Nations contexts
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
Authors: Donna-Maree Stephens (Community First Development), Sharon Babyack (Community First Development, AU)

As First Nations' economies grow and develop, wayfinding of monitoring and evaluation frameworks that meaningfully address the holistic outcomes of First Nations' economic independence is a necessity. Culturally responsive monitoring and evaluation frameworks provide footprints for distinct ways of thinking about the holistic and significant contribution that First Nations' economies make to their communities and the broad Australian economic landscape.
Presenting findings from an organisation with more than 20 years of experience working alongside First Nations' communities and businesses grounded in collective and community focused outcomes, this presentation will highlight key learnings of monitoring and evaluation from First Nations' enterprises. It is an invitation to explore and rethink notions of success by drawing on experiences and Dreams (long-term goals) for community organisations, businesses and journeys towards positive outcomes alongside the role of one culturally responsive monitoring and evaluation approach. Our presentation will provide an overview of our work in the community economic development space and key learnings developed through our monitoring and evaluation yarns with First Nations' enterprises across a national First Nations' economic landscape that includes urban, regional and remote illustrations.
Speakers

Sharon Babyack

General Manager Impact & Strategy, Community First Development
My role at Community First Development involves oversight of research, evaluation, communications and effectiveness of the Community Development program. During my time with the organisation I have led teams to deliver major change processes and strategic priorities, have had carriage...
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Navigating collaborative practice - the role of evaluation in brokering shared outcomes
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
106
Authors: Caroline Crothers (Allen + Clarke Consulting)

A collaborative effort between community organisations and Victoria Police has demonstrated significant impact in addressing youth offending in Victoria's northwest metropolitan region. This initiative brought together 12 partner organisations from various sectors, including police, legal services, and youth support services, around the shared goal of reducing youth offending. By diverting young offenders from the criminal justice system, the initiative seeks to enhance overall justice, health, and wellbeing outcomes for vulnerable youth. Allen + Clarke was commissioned to evaluate the success of this initiative during its inaugural year. In this presentation, we share key lessons learned from the evaluation, including how minimally resourced and small-scale interventions can have an outsized impact on organisational culture and practice. We also reflect on the journey embarked upon and explore how the evaluation process itself serves as a tool for navigating complex challenges and adapting to changes encountered along the way. Through critical reflection, the presentation delves into the differing perspectives of the delivery partners involved, highlighting how the evaluation journey facilitates a shared understanding of the path forward and shapes future strategies and interventions.
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Enhancing Stakeholder Engagement Through Culturally Sensitive Approaches: A Focus on Aboriginal and Torres Strait Islander Communities
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
105
Authors: Mark Power (Murawin), Carol Vale (Murawin, AU)

This presentation explores the paramount importance of culturally sensitive engagement methodologies in ensuring meaningful contributions from Aboriginal and Torres Strait Islander communities to mission programs. Murawin, an Aboriginal-led consultancy, has developed a robust Indigenous Engagement Strategy Framework grounded in the principles of reciprocity, free, prior and informed consent, mutual understanding, accountability, power sharing, and respect for Indigenous knowledge systems. Our session aims to share insights into the necessity of prioritising Aboriginal and Torres Strait Islander voices in engagement, co-design, and research, highlighting the significance of cultural competence in fostering mutual respect and understanding.
We will discuss three key messages: the imperative of deep knowledge and understanding of Aboriginal and Torres Strait Islander cultures in engagement practices; the success of co-design processes in facilitating genuine and respectful engagement; and the strategic partnership with CSIRO to enhance cultural competence and inclusivity in addressing Indigenous aspirations and challenges. These points underscore the critical role of acknowledging cultural interactions and ensuring cultural sensitivity in building strong, respectful and productive relationships with Indigenous communities.
To achieve our session's objectives, we have designed an interactive format that blends informative presentations with the analysis of case studies, complemented by engaging intercultural discussions. This approach is intended to equip participants with actionable insights drawn from real-world examples of our collaborative ventures and co-designed projects. Through this comprehensive exploration, we aim to enrich participants' understanding of successful strategies for engaging Aboriginal and Torres Strait Islander communities, ultimately contributing to the achievement of more inclusive and impactful outcomes in mission programs and beyond.


Speakers

Carol Vale

Managing Director, Murawin
I am a Dunghutti woman with an extensive career in public sector management and service delivery in the realm of Aboriginal Affairs. My academic background is primarily in the social sciences and leadership development particularly as they relate to overcoming disadvantage. What should...

Mark Power

Director, Evaluation & Research, Murawin
Mark is an experienced researcher with more than 20 years of experience in Australia and the Pacific. Mark manages Murawin’s evaluation and research practice and leads multiple evaluations for a variety of clients. Mark has overseen more than 30 high-profile, complex projects funded...
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Valuing First Nations Cultures in Cost-Benefit Analysis
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
103
Authors: Laura Faulkner (NSW Treasury)

This paper presents the key findings from research and engagement on how cost-benefit analysis (CBA) has been applied to First Nations initiatives to date. CBA is an important tool used by governments to help prioritise budget funding decisions. It assesses the potential impacts of an initiative - economic, social, environmental, and cultural - to determine whether it will deliver value for money.

The paper explores the ways in which the value of First Nations cultures has been incorporated into CBAs, along with the associated challenges and opportunities to improve current practice. The findings have informed the development of an investment framework for the design and evaluation of initiatives that affect First Nations people and communities. The framework focuses on the key principles for embedding First Nations perspectives and ensuring culturally informed evaluative thinking.
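As background for readers new to CBA, a minimal sketch of the core arithmetic follows. The figures, discount rate and five-year horizon are invented for illustration and do not reflect NSW Treasury's actual parameters.

```python
# Illustrative only: discount annual cost and benefit streams to present
# value, then form the net present value (NPV) and benefit-cost ratio (BCR).
rate = 0.05  # hypothetical real discount rate

costs = [2.0, 0.5, 0.5, 0.5, 0.5]      # $m per year, year 0 first
benefits = [0.0, 0.8, 1.0, 1.2, 1.4]   # $m per year

def pv(flows, rate):
    """Present value of a series of annual cash flows."""
    return sum(x / (1 + rate) ** t for t, x in enumerate(flows))

pv_costs, pv_benefits = pv(costs, rate), pv(benefits, rate)
print(f"PV costs ${pv_costs:.2f}m, PV benefits ${pv_benefits:.2f}m")
print(f"NPV ${pv_benefits - pv_costs:.2f}m, BCR {pv_benefits / pv_costs:.2f}")
```

The paper's point is that cultural impacts rarely fit neatly into the benefit stream above, which is why methods for valuing them, or presenting them alongside the monetised results, matter.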


Speakers

Laura Faulkner

Senior Analyst, First Nations Economic Wellbeing, NSW Treasury
Wednesday September 18, 2024 12:00pm - 12:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Enhancing evaluation value for small community organisations: A case example
Wednesday September 18, 2024 1:30pm - 2:00pm AEST
104
Authors: Stephanie Button (Assessment and Evaluation Research Centre, University of Melbourne), Allison Clarke (Assessment and Evaluation Research Centre, University of Melbourne, AU), Carolyn McSporran (Blue Light Victoria, AU), Elissa Scott (Blue Light Victoria, AU)

This presentation aims to provide a case example of how two small-scale, standard process/outcomes evaluations for a low-budget community organisation increased value for the organisation by identifying and seizing opportunities for evaluation capacity building. Formal evaluations represent a significant financial commitment for low-budget community organisations. By maximising the value provided by such evaluations, evaluators can contribute more to these organisations' mission and ultimately to social betterment.

There are numerous evaluation capacity building models and frameworks, many of which appear to be quite complex (for example: Volkov & King, 2007; Preskill & Boyle, 2008). Many emphasise planning, documentation, and other resource-intensive components as part of any evaluation capacity building effort. This session provides a case example of intentional but light-touch and opportunistic evaluation capacity building. Through such an approach, evaluators may need to do only minimal additional activities to provide extra value to an organisation. Reflection-in-action during the evaluation process is as important as the final reporting (Schwandt & Gates, 2021). The session emphasises, though, that a critical enabler will be the organisation's leadership and culture, and willingness to seize the opportunity offered by a formal evaluation. The session is co-presented by two members of the evaluation team and the Head of Strategy, Insights, and Impact of the client organisation.
Speakers

Allison Clarke

Evaluator, Centre for Program Evaluation
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She recently completed her Master of Evaluation...

Stephanie Button

Research Associate & Evaluator, Assessment & Evaluation Research Centre
Stephanie has worked as a policy manager, analyst, strategist, researcher, and evaluator across the social policy spectrum in the public and non-profit sector for over 12 years. She is passionate about evidence-based policy, pragmatic evaluation, and combining rigour with equitable...

Carolyn McSporran

Head of Strategy, Insights and Impact, Blue Light Victoria
Passionate about social inclusion, Carolyn's work has spanned diverse portfolios across the justice and social services sectors. With a fervent belief in the power of preventative and early intervention strategies, she is committed to unlocking the full potential of individuals and...
Wednesday September 18, 2024 1:30pm - 2:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Envisioning and Encountering Relational Aboriginal and Pacific Research Futures
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
105
Authors: Alli Burness (Tetra Tech), Seema Naidu (Tetra Tech, FJ), Nathan Sentance (Museum of Applied Arts and Sciences, AU), David Lakisa (Talanoa Consultancy, AU)

In the inaugural ANU Coral Bell Lecture on Indigenous Diplomacy, Dr Mary Graham outlined a powerful legacy of Aboriginal and Torres Strait Islander relational methods that have operated across a spectacular time scale. She envisioned a compelling future for its renewed application and spoke of these practices as a type of "thinking in formation, a type of slow, collective, and emergent process".

Inspired by Dr Graham's vision, this panel explores synergies, distinctions, and complementarities in local and Indigenous knowledge research methods across Australia and the Pacific. The panel features Wiradjuri, Samoan (Polynesian) and Fijian (Melanesian) research specialists from a range of fields, who showcase locally specific methodologies that connect across Australia and the Pacific as ways of knowing, doing, and relating with the land, the moana (ocean) and air.

This session frames evaluation and research approaches as reflecting their contextual political order. While the panel will critique the legacies of individualist and survivalist research methods, it will focus on exploring the futures that relational research methods could realize. How do we evolve current institutional approaches to become more commensurate with Indigenous methods? Would institutionalizing these methods resolve the legacy, structure, and form of colonialist political approaches? Panelists will speak to their experience in working to evolve institutions in this way and the research and evaluation methodologies used within them.

The session also situates evaluation within a canon of contextualizing evidence-based practices (such as political economy analysis, GEDSI analysis or feasibility studies). How might we recast current evaluation and contextualizing or evidence-based approaches to become commensurate with Indigenous, intersectional feminist and local research methods?
Speakers

Lisa Faerua

Lisa Faerua is a Pacific Freelance Consultant. She brings 17 years of experience in international and community development in the areas of leadership, design, monitoring and evaluation. Lisa has provided technical support to DFAT, MFAT, and Non-Government Organisations such as Oxfam...
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Don't wait for the final data! The power of using synthetic data in developmental and participatory evaluation contexts.
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Gemma Kerrison (NSW Department of Education)

In participatory and developmental evaluation contexts, evaluators and program stakeholders work together throughout the evaluation cycle to adapt evaluation and implementation activities to rapidly changing environments (Fetterman, 2018; Patton, 2015). This presentation will share reflections from an evaluation project where sharing and discussing synthetic data (also known as fake data) with program stakeholders created a shared understanding of data validity, strengthened stakeholder engagement in analysis and reporting processes, and ultimately led to early learnings which were used to strengthen the rigour and utilisation of evaluation findings.
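As a concrete illustration of the idea (a sketch only; the dataset, column names and effect sizes are invented, not the NSW Department of Education's), synthetic data with the same shape as the eventual collection can be generated in a few lines:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 200  # hypothetical number of participants

# Fabricated records shaped like the real collection, so stakeholders can
# rehearse analysis, reporting and data-validity discussions before the
# final data arrive.
synthetic = pd.DataFrame({
    "school_id": rng.integers(1, 21, size=n),
    "pre_score": rng.normal(60, 12, size=n).round(1),
})
synthetic["post_score"] = (synthetic["pre_score"] + rng.normal(5, 8, size=n)).round(1)
synthetic["completed_program"] = rng.random(n) < 0.8

print(synthetic.head())
print("mean gain:", round((synthetic["post_score"] - synthetic["pre_score"]).mean(), 2))
```

Walking stakeholders through mock tables like this surfaces disagreements about definitions and validity while they are still cheap to fix.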
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Learning from failure at a NFP - pitfalls and pointers
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Victoria Pilbeam (WWF-Australia)

Across social and environmental movements, we are often reticent to talk about failure. But as innovation and learning gain greater emphasis across the sector, not-for-profits are finding new ways to share and learn from their failures (e.g. Engineers Without Borders' failure reports, Save the Children's Fail Fest). In this presentation, I will share insights from the available research and reflect on my own journey developing failure programming at WWF-Australia. It will provide practical guidance to evaluators and organisations navigating the challenging terrain of learning from failure.
Speakers
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Measuring (in)equity for people with disability: How to do it, and why it is rarely done well
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Asahel Bush (CBM Australia / CBM Global Inclusion Advisory Group)

Meaningfully evaluating equity for people with disability is challenging. In low and middle-income countries (LMICs), existing data, tools, expertise and political will are often lacking. Disability monitoring data can rarely measure substantive equality, because comparable population data is not available.
This presentation will explore these challenges, based on a review of evidence on disability data and eye health in LMICs. It will propose some ways forward, including standardising disability measurement tools and pushing for more and better population-based evidence generation. Transforming practice on measuring disability equity is challenging, but possible; it is essential to avoid compounding structural inequities.
Speakers
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

No help? No worries! How to find your way on your own
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Rachel Wilks (Grosvenor), Kristy Hornby (Grosvenor)

As we learn the practice of evaluation, we often have experienced evaluators around us. They provide us guidance and guardrails within which to learn and grow our capabilities - but what happens when we no longer have this support? How do we stop ourselves from getting lost?

This session will provide tips and tricks for evaluators going out into the world on their own. Attendees will leave with practical advice and strategies to help keep them on track while they build their autonomy.
Speakers

Rachel Wilks

Senior Consultant, Grosvenor
Rachel is a management consultant and an emerging evaluator at Grosvenor. She took her first steps into the evaluation world a little over a year ago, and since then has been increasingly interested in how evaluation can be used in and across the public sector. Rachel is passionate...
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Revitalising Survey Engagement: Strategies to Tackle Low Response Rates
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Kizzy Gandy

Surveys are an excellent data collection tool when they reach their target response rate, but low response rates hinder the generalisability and reliability of the findings.

This Ignite presentation will discuss techniques Verian evaluators have applied to increase survey response rates while also assessing the efficacy and efficiency of these techniques. We will also explore other evidence-based strategies for boosting response rates and the value of drawing on other data sources if your response rates are still low.
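To make the stakes concrete, a small, illustrative calculation (assumed figures, not Verian's data) shows how a response shortfall widens uncertainty even before non-response bias is considered:

```python
import math

invited, responded = 2000, 260          # hypothetical survey
rate = responded / invited

# Worst-case (p = 0.5) margin of error at 95% confidence for a simple
# random sample; fewer responses mean wider intervals.
moe = 1.96 * math.sqrt(0.25 / responded)

print(f"response rate {rate:.0%}, margin of error ±{moe:.1%}")
# At a target of, say, 800 responses the margin would shrink to about ±3.5%.
```

And no margin-of-error formula repairs the deeper problem the presentation targets: the people who answer may differ systematically from those who do not.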
Speakers

Kizzy Gandy

National Director, Program Evaluation, Verian
Dr Kizzy Gandy is Verian's National Director of Program Evaluation. She leads a team of expert methodologists and provides quality assurance. With 20 years’ experience in consultancy, federal and state government, and academia, Kizzy has overseen the design and evaluation of over...
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Sign here: Supporting Deaf participation in evaluation
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Susie Fletcher (Australian Healthcare Associates)

Auslan is a visual, signed language that was developed by and for the Australian Deaf community. People who use Auslan as their primary or preferred language are not necessarily fluent in English. Our team was engaged to review access to interpreter services for Auslan users, a population group that is often underrepresented in evaluation. In this presentation we will highlight some of the issues evaluators need to consider when working with this marginalised community, and share practical skills and techniques for making their evaluations more accessible.
Speakers

Susie Fletcher

Senior consultant, Australian Healthcare Associates
Dr Susie Fletcher is an experienced health services researcher with over 50 peer reviewed journal articles and 3 book chapters in mental health and primary care. She is passionate about improving health outcomes through integrating services across sectors; her recent work has focused...
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Beyond Numbers: Weaving Stories, Sculpting Change and Signal Spotting through Collaborative Impact Yarns
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
106
Authors: Skye Trudgett (Kowa), Haley Ferguson (Kowa, AU), June Oscar (Australian Human Rights Commission, AU), Katie Stubley (Griffith University, AU), Terri Reid (Australian Human Rights Commission, AU), Chloe Wegener (Australian Human Rights Commission, AU), Banok Rind (Australian Human Rights Commission, AU), Kimberly Hunter (Australian Human Rights Commission, AU)

'Measurement and data does not need to be all about numbers—it is about our heart and spirits, it is about voice, story, emotion—it is about truth. Numbers can tell us all sorts of lies. Wiyi Yani U Thangani is and always has been about your voice—what you are saying about your lives, how you see your future and what matters to you.' (June Oscar AO, Aboriginal and Torres Strait Islander Social Justice Commissioner)
Quantitative data often dominate measurement and evaluation, yet true understanding requires tapping into the heart and spirit of communities. As June Oscar AO poignantly reminds us, it is the voice, story, emotion, and truth that bring depth to data. This session at the AES conference offers a hands-on experience that transcends traditional data collection, engaging participants in co-creating collaborative Impact Yarns through deep listening, yarning, and creative expression.
Join us in a dynamic workshop where weaving, sculpture, and artwork become powerful tools for storytelling and knowledge sharing. Participants will learn to capture the nuanced experiences of First Nations communities, reflecting on how these creative practices can reveal the interconnectedness of our lives and contribute to systemic change. By integrating Indigenous methodologies, we will collectively explore the rich, qualitative data that emerges from individuals' lived realities and aspirations.
As we craft and shape our narratives, we will reflect on how these stories can inform and transform policies and initiatives. This immersive session is not just about creating art; it is about embodying the principles of gender justice and equality, respecting cultural heritage, and acknowledging the diverse ways communities envision their future.
Experience the power of collaborative creation, where every thread woven and every form sculpted enriches our collective understanding of impact. This workshop is an invitation to step away from the spreadsheet and into a space where every voice contributes to a tapestry of change. Come, let us shape a more empathetic and embracing approach to measurement—one that values the stories and truths of all peoples.
Speakers

Skye Trudgett

CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application...
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Cultivating Equity: A Roadmap for New and Student Evaluators' Journeys
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
Authors: Ayesha Boyce (Arizona State University), Aileen Reid (UNC Greensboro, US)

Evaluation can be positioned as a social, cultural, and political force to address issues of inequity. We co-direct a 'lab' that provides new evaluators with hands-on applied research and evaluation experience to support their professional development. We are proud of our social justice commitments, and they show up in all aspects of our work. We believe the next generation of evaluators must be trained and mentored in high-quality technical, strengths-based, interpersonal, contextual, social justice-oriented, and values-engaged evaluation. We have found that novice evaluators are able to engage culturally responsive approaches to evaluation at the conceptual level, but have difficulty translating theoretical constructs into practice. This paper presentation builds upon our experiences and previous work of introducing a framework for teaching culturally responsive approaches to evaluation (Boyce & Chouinard, 2017) and a non-course-based, real-world-focused, adaptable training model (Reid, Boyce, et al., 2023). We will discuss how we have taught new evaluators three formal and informal methodologies that have helped them align their values with praxis. Drawing from our work across multiple United States National Science Foundation-funded projects we will overview how the incorporation of photovoice methodology, just-in-time feedback, and reflective practice have supported our commitments to meaningfully, and respectfully attend to issues of culture, race, diversity, power, inclusion, and equity in evaluation. We will also discuss our thoughts on the implications of globalization, Artificial Intelligence, and shifting politics on evaluation capacity building and training of new evaluators.

Speakers

Ayesha Boyce

Associate Professor, Arizona State University
Ayesha Boyce is an associate professor in the Division of Educational Leadership and Innovation at Arizona State University. Her research career began with earning a B.S. in psychology from Arizona State University, an M.A. in research psychology from California State University...

Aileen Reid

Assistant Professor, UNC Greensboro
Dr. Aileen Reid is an Assistant Professor of Educational Research Methodology in the Information, Library and Research Sciences department and a Senior Fellow in the Office of Assessment, Evaluation, and Research Services (OAERS) at UNC Greensboro. Dr. Reid has expertise in culturally...
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Link-Up Services and wayfinding: Co-creating and navigating a culturally safe national monitoring and evaluation strategy
Wednesday September 18, 2024 1:30pm - 3:00pm AEST
Authors: Kathleen Stacey (beyond...Kathleen Stacey & Associates Pty Ltd), Cheryl Augustsson (Yorgum Healing Services), Raelene Rosas (NT Stolen Generation Aboriginal Corporation), Pat Thompson (Link-Up (Qld) Aboriginal Corporation), Jamie Sampson (Link-Up (NSW) Aboriginal Corporation)

Link-Up Services support Aboriginal and/or Torres Strait Islander people who were forcibly removed, fostered or adopted from their families as children, and their descendants who live with the ongoing impact of forcible removal policies, to reconnect with family, community, culture and Country. Wayfinding is at the core of our work - navigating unfamiliar territory with clients towards a hoped-for destination of a greater sense of 'home', wherever this is possible, in a culturally safe, appropriate and trauma-informed manner.

In 2019, the National Indigenous Australians Agency funded development of a national Link-Up monitoring and evaluation strategy with the eight Link-Up Services that operate across six jurisdictions. Each Link-Up Service is either a stand-alone Aboriginal community-controlled organisation or based within an Aboriginal community-controlled organisation.

This interactive workshop invites participants into our collective experiences of co-creating and implementing the M&E Strategy on a national basis, presented from the voices and position of Link-Up Services. We believe our experiences and learnings will be instructive for monitoring and evaluation activity with other Aboriginal and Torres Strait Islander organisations and programs.

Travel with us in reflecting on our monitoring and evaluation wayfinding journey over three phases of work. Pause with us at key points throughout the session to exercise your critical self-reflection and analysis skills, share your ideas, and learn what has worked well or presented challenges for us, and why, in creating, navigating and implementing a culturally safe monitoring and evaluation strategy in a complex and demanding service context.
Speakers

Kathleen Stacey

Managing Director, beyond…(Kathleen Stacey & Associates)
Kathleen Stacey is the Managing Director and Principal Consultant at beyond... She spent her formative working years within the public sector and academia, before establishing and expanding beyond... into its current form. The company conducts consultancy, evaluation, research and...

Raelene Rosas

Interim CEO, Northern Territory Stolen Generations Corporation

Patricia Thompson AM

CEO, Link-Up Queensland
Wednesday September 18, 2024 1:30pm - 3:00pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Evaluation for whom? Shifting evaluation to increase its value for local actors
Wednesday September 18, 2024 2:00pm - 2:30pm AEST
104
Authors: Linda Kelly (Praxis Consultants), Mary Raori (UNDP Pacific, FJ)

This presentation outlines an approach to program assessment of a long-term governance program working across the Pacific, the UNDP Governance for Resilience program. It tells the story of the program’s maturing evaluation approach, which has shifted from serving the information needs of those with money and power to focus more particularly on the values and interests of local participants and partners.
Despite the well-documented limitations of single-methodology evaluation approaches for complex programs, many international development donors and corresponding international and regional organisations continue to require program assessment that serves their needs and values. Typically, this includes narrowing evaluation to assessment against quantitative indicators. Notwithstanding the extensive limitations of this approach, it serves the (usually short-term) needs of international donors and other large bureaucracies. It generates simple information that can be communicated and showcased in uncritical forms. It provides numbers that are easily aggregated and used for concise reporting to senior and political masters.
Such approaches risk crowding out attention to the information needs of other participants and undermine attempts to support more locally led processes. This presentation will explain how this long-term and large-scale program has shifted, making use of a values-based evaluative approach to better serve the interests of partners and participants in the Pacific. This has involved both a methodological and political shift: broadening the range of data collection and analysis methodologies and approaches, increasing resourcing to accommodate different types of data and data collection, and internal and external advocacy. This one program experience echoes wider views across the Pacific about the limitations of externally imposed measures and the lack of attention to what is valued by Pacific countries and people.


Speakers

Linda Kelly

Director, Praxis Consultants

Linda Vaike

Programme Adviser - Climate Risk Finance and Governance, Pacific Islands Forum Secretariat
Wednesday September 18, 2024 2:00pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Navigating a path to system impact: designing a strategic impact evaluation of education programs
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
104
Authors: Amanda Reeves (Victorian Department of Education), Rhiannon Birch (Victorian Department of Education, AU), Eunice Sotelo (Victorian Department of Education, AU)

To provide insight to complex policy problems, evaluations need to adopt a systems perspective and examine the structures, relationships and contexts that influence program outcomes.

This paper outlines the design of a 4-year strategic evaluation that seeks to understand how a portfolio of over 25 education programs interacts and collectively contributes to system-level outcomes. In this context, policy makers require evaluation to look beyond the boundaries of individual programs and assess the holistic impact of this investment, to inform where and how resources can be directed to maximise system outcomes.

The strategic evaluation presented is theory-based and multi-layered, using logic modelling to identify outcomes at the program, cluster and system level and draw linkages to develop a causal pathway to impact. The strategic evaluation and the evaluations of individual education programs are being designed together to build in common measures, enabling meta-analysis and synthesis of evidence to assess system-level outcomes. The design process has been broad and encompassing, considering a diverse range of methods to understand impact, including quantitative scenario modelling and value-for-money analysis.

The authors will describe how the strategic evaluation has been designed to respond to system complexity and add value. The evaluation adopts an approach that is:
• interdisciplinary, drawing on a range of theory and methods to examine underlying drivers, system structures, contextual factors and program impacts
• collaborative, using expertise of both internal and external evaluators, to design evaluations that are aligned and can tell a story of impact at the system-level
• exploratory, embracing a learning mindset to test and adapt evaluation activities over time.

This paper will be valuable for anyone who is interested in approaches to evaluating the relative and collective contribution of multiple programs and detecting their effects at the system level to inform strategic decision-making.
Speakers
Amanda Reeves

Principal Evaluation Officer, Victorian Department of Education
Amanda is an evaluation specialist with over 12 years' experience leading evaluation projects in government, not-for-profit organisations and as a private consultant. She has worked across a range of issues and sectors including in education, youth mental health, industry policy and... Read More →
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

If treaty is like a marriage, state evaluation needs sustained deep work: Evaluation and Victoria's First Peoples Treaty
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
105
Authors: Kate Nichols (Department of Jobs, Skills, Industry and Regions - Victoria), Milbert Gawaya (Department of Jobs, Skills, Industry and Regions, AU)

First Peoples/settler state treaties have been likened to marriage - an evolving and changeable (political) relationship, not an endpoint or divorce (Blackburn, 2007). But what does this look like in practice, given marriage's chequered associations, from power imbalance and violence through to romance and deep, trusting companionship?

Contemporary colonial 'settlerism' (after Aunty/Dr Lilla Watson, in Watego, 2021) is undergoing structural change in Victoria, with Victoria's First Peoples sitting down with the Victorian State Government in 2024 to commence statewide treaty negotiations. Treaty is an acknowledgement that British sovereignty did not extinguish Aboriginal sovereignty, opening up a "third space of sovereignty" (after Bruyneel, 2007) where co-existing sovereigns can further contest the "sovereignty impasse" (ibid.), while Indigenous people control their own affairs.

Treaty is expected to reshape how the Victorian state government operates, challenging state laws, institutions, policies, programs and processes which together have contributed to First Nations disadvantage and suffering. Government evaluation practices will need their own shake-up.

How can public sector evaluators help establish an equal, strong and nourishing treaty marriage? This short paper shares emerging ally insights into how local practices are evolving to support Victoria's Treaty and self-determination. It shares reflections from a recent evaluation of Traditional Owner grant programs, conducted in partnership between key Aboriginal and non-Aboriginal public sector staff. It is a story of both-ways practice and the time, trust and bravery required to achieve deep change. It also highlights the role of lifelong cultural learning and behaviour change for ally evaluators. Culturally responsive evaluation, Indigenous research practices, restorative justice and the AES First Nations Cultural Safety Framework provide useful framing. Although focused on the Victorian treaty context, the paper may be transferable to other jurisdictions and evaluations involving or impacting Aboriginal and Torres Strait Islander peoples in support of their sovereignty and self-determination.
Speakers
Kate Nichols

Senior Evaluator, Department of Economic Development, Jobs, Transport & Resources
I've been a practising evaluator since Missy Elliot released 'Work it' which a) reveals a bit too much about my age, but b) gives you a sense of how much I'm into this stuff. I've recently returned to an evaluation role in the Victorian public sector after working in a private sector... Read More →
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

When "parachuting in" is not an option: Exploring value with integrity across languages, continents and time zones
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
106
Authors: Julian King (Julian King & Associates), Adrian Field (Dovetail)

Ethics in Pacific Evaluation: Voices of the Rebbilib is a research project under the Strengthening Pacific MEL initiative of the Pacific Community. Ethical practice is among the most consistently documented norms of evaluation societies and associations internationally, on the basis that a clear set of ethical guidelines gives evaluators of all skill levels an understanding of their roles and responsibilities and of how to conduct themselves in practice. Similarly, codifying ethical principles within a guideline or manual assumes that this is an effective way of ensuring evaluators adhere to a high standard of ethical practice. This presentation offers a provocation of these assumed norms through the sharing of experiences from the Pacific.
The panel will consider whether normative ethics actually improve evaluation practice in the Pacific. A literature review of evaluation principles, guidelines and manuals identified these normative ethics, including the principle of best judgement in the application of ethical principles and values. This principle asks evaluators to confront their positionality within an evaluative context and to confront their own power in evaluation. The presentation will illuminate what this looks like for experienced evaluators working in the Pacific.
This provocation invites a new ethics-oriented entry point to help us seek solutions that accelerate action towards localising evaluation practice in the Pacific. The panel discussion will include key findings from a broader study on this topic involving several leading Pacific evaluators. The findings suggest that the principle of exercising one's best judgement requires greater appreciation of culture and context, with recommendations about how to achieve this. Building on the Rebbilib, the Pacific Monitoring, Evaluation and Learning Capacity Strengthening wayfinding tool, this research calls on evaluators and evaluation commissioners to demand more from evaluative principles and values.

Speakers
Julian King

Director, Julian King & Associates
I’m an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I’m affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/... Read More →
Adrian Field

Director, Dovetail
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social... Read More →
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Elevating evaluation: practical insights for supporting systems transformation
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
Authors: Kathryn Erskine (Cube Group), Michael Maher (Cube Group, AU)

This session provides a practical example of how a traditional program evaluation was re-oriented so that its findings could inform broader system transformation. This echoes current discourse in evaluation - particularly since the global pandemic - whereby traditional notions about how the field of evaluation is viewed and developed are being challenged (Ofir 2021). Specifically, there are calls to rethink and elevate evaluation practice to actively contribute to and support systems transformation (see Dart 2023; Norman 2021), beyond a narrow programmatic focus.

This session will illuminate this discussion by examining a mental health program evaluation in the context of significant service reform across the Victorian mental health system. The presentation will outline insights and techniques for lifting and reconfiguring a tightly defined program evaluation into one with broader application across the system. It outlines how and why the pivot was made, the changes we made to the methodology, and the key benefits that arose from taking an expansive view of the sector in which the program operated.

The design of the session will be a presentation format supported by a PowerPoint slide deck, comprising:
•    Introduction and purpose of session
•    Overview of the program we evaluated
•    Key challenges which required an evaluation 'pivot' - and how we worked with our client
•    Key changes made to the methodology
•    Key benefits from elevating from a programmatic to systems focus.


Speakers
Kathryn Erskine

Director, Cube Group
Combining academic rigour with a practical ‘can-do’ approach, Kathryn is committed to delivering evidence-based change that improves the lives of Australians. Kathryn brings a depth and breadth of experience in the public, private and not-for-profit sectors, specialising in program... Read More →
Michael Maher

Partner & Evaluation Lead, Cube Group
Leading Cube Group’s Evaluation and Review practice, Michael brings over 30 years of experience in the public, private and not-for-profit sectors. Michael’s work spans all areas of social policy with particular expertise in early childhood, education, justice, human services... Read More →
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

When speed is of the essence: How to make sure the rubber hits the road
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
103
Authors: Kristy Hornby (Grosvenor)

There is a lot of interest in rapid M&E planning and rapid evaluations at present, partly born of the rapid-response demands of the COVID-19 policy environment and partly of shrinking appetites for the time and money spent on evaluations. This trend is unlikely to reverse in the short term, so how do we acquit our responsibilities as evaluators, ethically and appropriately, in a rapid context? This session sets out a step-by-step approach to conducting a rapid evaluation, inviting attendees to follow along with their own program in mind so they come away with a pathway for conducting their own rapid evaluation. The session uses a fictional case study to move the rapid evaluation approach forward, describing throughout how literature reviews, qualitative and quantitative data collection and analysis techniques, and report-writing approaches can be used innovatively to save time without compromising rigour.

We contend it is possible to do a rapid evaluation ethically and appropriately, but the backbone of doing so is good planning and execution. This session shares practical tips and approaches for doing so through each key phase of an evaluation, so attendees are well-equipped for their next rapid evaluation.

To consolidate the learning, attendees will be provided with a framework so they leave the session with a high-level plan for conducting their own rapid evaluation, increasing their chance of success.

Speakers
Kristy Hornby

Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and... Read More →
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Plenary: James Copestake "What next? From evaluating to anticipating"
Wednesday September 18, 2024 3:30pm - 4:30pm AEST
Professor, International Development, University of Bath, UK

As evaluators, we are often asked to find out ‘what caused what’ in the past, but also resist being labelled solely as hired evidence collectors. I argue for further effort to move beyond ‘what worked?’ to ‘what next?’ - from outcome-activity links in the past to scenario-action possibilities and options in the future. Anticipatory evaluation can enhance the relevance and usefulness of our work, but also accentuates the challenges of appropriate framing, causal analysis, normative deliberation, and influencing. It increases uncertainty, recasts stakeholder relationships, and requires use of a wider range of complexity-informed methods. I explore these issues by reflecting on the scope for anticipatory evaluative practice in four diverse fields, each at a different level: planning doctoral research (interpersonal), appraising impact investment (organisational), mainstreaming social policy initiatives (national), and rethinking development (global). Becoming more future-oriented makes new demands on each of us as evaluation professionals, but also requires collective action to build stronger bridges with communities of practice in anticipatory action, appraisal, foresight, and future thinking.
Speakers
James Copestake

Professor, International Development, University of Bath, UK
James Copestake is Professor of International Development at the University of Bath in the UK, where he is also Director of Studies for the Doctorate in Policy Research and Practice at the Institute of Policy Research. His publications range broadly across international development... Read More →
Wednesday September 18, 2024 3:30pm - 4:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Wayfinding for integrating social justice and culturally responsive and equitable evaluation practices in meta-evaluation: Learning from the UN evaluation quality assessments.
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
Authors: Sandra Ayoo (University of North Carolina Greensboro)

With improving interventions for societal and planetary wellbeing as the desired destination of evaluation, it is imperative that evaluators reflect on the meaning of quality and on methods to assess whether evaluation is achieving it. Meta-evaluation, coined by Michael Scriven in 1969, evaluates evaluations and aids in understanding how evaluations contribute to addressing structural and systemic problems in interventions and evaluation practice. Meta-evaluation has evolved over the past five decades and is included in the program standards of major professional associations. Yet while the field of evaluation is confronted with major concerns regarding the centrality of social justice, there are currently no one-size-fits-all guidelines for meta-evaluation or for addressing social justice in evaluations.

To address this, we reviewed the meta-evaluation literature, mapped the American Evaluation Association's foundational documents against the United Nations Evaluation Group's Norms and Standards to explore their intersection on social justice, and analyzed 62 United Nations Population Fund evaluation reports alongside their management responses. The study findings indicated that meta-evaluation is contingent on context rather than established standards. Thus, it's crucial for evaluators to intentionally prioritize social justice in evaluation design and implementation, and to select quality assurance tools that match the evaluation context and professional association guidelines to improve the quality of the intervention.

I will share key characteristics of the United Nations Evaluation Group's Norms and Standards on social justice to stimulate discussion of evaluators' efforts to address systemic issues. Collectively, participants will benefit from discussing and reflecting on their own practice by responding to questions such as: (a) what are examples of their work in collaborative and systems-informed ways to intentionally include social justice in their evaluations, and (b) what should the field of evaluation do to ensure that evaluations add value for people and planet?
Speakers
Sandra Ayoo

Assistant Professor, University of North Carolina Greensboro
Dr. Ayoo is an Assistant Professor of Educational Research Methodology in the Department of Information, Library, and Research Science at the School of Education, University of North Carolina Greensboro. She teaches graduate courses in program evaluation and research methodology... Read More →
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Developing a Tool for Measuring Evaluation Maturity at a Federal Agency
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
105
Authors: Eleanor Kerdo (Attorney-General's Department), Claudia Oke (Attorney-General's Department, AU), Michael Amon (Attorney-General's Department, AU), Anthony Alindogan (Attorney-General's Department, AU)

To embed a culture of evaluation across the Australian Public Service (Commonwealth of Australia, 2021), we must first have an accurate understanding of the current state of evaluation capability and priorities across Commonwealth agencies. This paper shares tools for building an effective measurement framework for evaluation culture and discusses how to use these tools for evaluation capability uplift.
We explore quantitative and qualitative methods to gather and analyse data to measure an organisation's readiness to change its culture towards evaluation. This includes assessing staff attitudes towards evaluation, the level of opportunity for staff to conduct and use evaluation, and confidence in their knowledge of evaluation.
We discuss the development of a staff evaluation culture survey based on Preskill & Boyle's ROLE and how behavioural insight tools can be used to boost engagement. The paper discusses the utility of holding focus groups with senior leaders to understand authorising environments for evaluation and key leverage points. Also discussed are the challenges encountered throughout the assessment process and the innovative solutions developed in response.
This paper will be valuable for those who work in, or with, any government agency with an interest in evaluation capacity building and driving an evaluation culture within organisations. It explains each stage of measurement design, data analysis and results, and discusses opportunities for action.
1. Preskill, H., & Boyle, S. (2008). A Multidisciplinary Model of Evaluation Capacity Building. American Journal of Evaluation, 29(4), 443-459. https://journals.sagepub.com/doi/10.1177/1098214008324182

2. Michie, S., Atkins, L., & West, R. (2014). The Behaviour Change Wheel: A Guide to Designing Interventions. London: Silverback Publishing. www.behaviourchangewheel.com

3. Lahey, R. (2009). A Framework for Developing an Effective Monitoring and Evaluation System in the Public Sector: Key Considerations from International Experience.
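As a minimal sketch of the kind of survey scoring the paper describes, the snippet below averages 1-5 Likert items within each dimension of evaluation culture. The dimensions, items and responses are invented for illustration and are not the department's actual instrument.

# Hypothetical evaluation-culture survey: average Likert items (1-5)
# within each dimension, then across respondents.
from statistics import mean

dimensions = {
    "attitudes": ["q1", "q2", "q3"],
    "opportunity": ["q4", "q5"],
    "confidence": ["q6", "q7", "q8"],
}
responses = [
    {"q1": 4, "q2": 5, "q3": 3, "q4": 2, "q5": 3, "q6": 4, "q7": 4, "q8": 5},
    {"q1": 3, "q2": 3, "q3": 2, "q4": 4, "q5": 4, "q6": 3, "q7": 2, "q8": 3},
]

# Profile the organisation: a low-scoring dimension flags where
# capability uplift effort might be directed first.
for dim, items in dimensions.items():
    score = mean(mean(r[item] for item in items) for r in responses)
    print(f"{dim}: {score:.2f} / 5")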
Speakers
Michael Amon

Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD... Read More →
Eleanor Kerdo

Eleanor is passionate about social justice, science, and access to safe and high quality health, human services and education. Eleanor is an experienced evaluator specialising in participatory realist approaches and has experience in both biomedical research and consumer-led research... Read More →
Anthony Alindogan

Evaluation Officer, Attorney-General's Department
Anthony is an experienced evaluator with a particular interest in outcomes measurement and value-for-money. He completed his Master of Evaluation degree from the University of Melbourne. Anthony is an enthusiastic writer and has publications in various journals including the Evaluation... Read More →
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Uncovering Hidden STEM Footprints: Leveraging Output Data from Questacon’s Outreach Programs
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
104
Authors: Jake Clark (Questacon - The National Science and Technology Centre), Jenny Booth (Questacon - The National Science and Technology Centre, AU), Sharon Smith (Questacon - The National Science and Technology Centre, AU), Nick Phillis (Questacon - The National Science and Technology Centre, AU)

Join our Monitoring and Evaluation team on an exciting journey as we delve into the untapped potential of output data in evaluation and how to reach beyond the 'low-hanging fruit'.

Notwithstanding the importance of evaluating outcomes to measure program success, monitoring the implementation and reach of initiatives is fundamental to good program management and evaluation. Output data on activity reach, target groups and participants often hold hidden gems of potential that are frequently overlooked. In this presentation we shine a spotlight on their significance and offer actionable tips to elevate monitoring data.

Our objective is to make this exploration enjoyable and enlightening, especially for foundational to intermediate-level evaluators. We offer practical and universally applicable strategies for making the most of output data to enhance program insights.

KEY MESSAGES

Using existing tools and tapping into open-source data sets, you can create powerful visualisations and draw deeper inferences about your program's reach and participants (a worked sketch follows the key messages below).

I. Understanding equity and inclusion
• A better understanding of who is and isn't involved in your initiative.
• Looking for patterns using socio-demographic variables.
• Benchmarking your initiative against relevant population data.

II. Connecting outputs to outcomes
• Analysing participant characteristics and program journeys to illuminate differences in outcomes.
• Uncovering program and policy questions that need further exploration.
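As a hedged illustration of the benchmarking idea above - the region names and figures are invented, not Questacon data - a few lines of pandas can compare program reach against published population tables:

# Benchmark participation against population to see who is missing out.
import pandas as pd

# Participants per region, e.g. counted from registration records.
participants = pd.DataFrame({
    "region": ["Inner metro", "Outer metro", "Regional", "Remote"],
    "participants": [1200, 800, 350, 40],
})

# Youth population per region, e.g. from published census tables.
population = pd.DataFrame({
    "region": ["Inner metro", "Outer metro", "Regional", "Remote"],
    "youth_population": [60000, 75000, 40000, 8000],
})

# Reach per 1,000 young people highlights under-served regions.
reach = participants.merge(population, on="region")
reach["per_1000"] = (reach["participants"] / reach["youth_population"] * 1000).round(1)
print(reach.sort_values("per_1000"))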

Design of the Session: Drawing on tangible examples from the education and informal learning STEM sector, we bridge the gap between theory and practice. Real-world strategies are shared to encourage active participation, along with useful resource links.
Speakers
Dr Jake Clark

Senior Monitoring and Evaluation Officer, Questacon - National Science and Technology Centre
What value does STEM outreach bring to an individual? How does it change someone's attitude/behaviour/disposition around STEM? And how do you quantify such probing questions? These are the types of queries I'm answering in my Senior Evaluation Officer role at Australia's National... Read More →
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Evaluation Lab: Using design to solve evaluation challenges
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
Authors: Matt Healey (First Person Consulting)

The Design and Evaluation Special Interest Group (DESIG) was established in 2017. Its primary aim has been to explore the intersection of evaluation and design, and that aim has been interpreted in different ways over time. In 2024, the DESIG identified an opportunity to take the SIG model in a slightly different direction, embarking on an innovative venture with the launch of the Evaluation Lab: an initiative aimed at turning talk into action by taking evaluators through a design process to address evaluation challenges.
Drawing inspiration from the concept of 'living labs,' which serve as real-world testing grounds, the Evaluation Lab created a space where evaluation professionals could come together. Employing a design-thinking process, the Lab guided participants through a structured expedition of defining, ideating, and prototyping solutions to tackle nominated challenges. Participants also learned pitch skills to communicate their solutions.
This Big Room Session provides an opportunity for the DESIG to outline the Evaluation Lab model, capped off with participants presenting their solutions through rapid-fire pitches, either live or pre-recorded, akin to explorers sharing tales of new lands discovered. The session's innovative twist lies in the audience's role, acting as both audience and judges. The audience will vote on their favourite solution and be involved in crowning the first AES Evaluation Lab winner.
By blending lecture-style content with dynamic team presentations and active audience engagement, the Big Room Session not only highlights the critical role of design in navigating evaluation challenges but also demonstrates the practical application of these methodologies in charting a course through real-world problems.

Speakers
Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have: co-founded and grown an organisation (First Person Consulting) to a team of 16 people working... Read More →
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Navigating the choppy waters of the evaluation landscape in the Pacific
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
106
Authors: Allan Mua Illingworth (Mua'akia Consulting), Chris Roche (La Trobe University), Kaita Sem (Pacific Community (SPC), FJ), Mereoni Chung Chung (Talanoa Consulting Fiji, FJ), Epeli Tinivata (Balance of Power Program, FJ)

In recent years there have been a number of Pacific-driven initiatives designed to promote monitoring and evaluation practice that is culturally and contextually appropriate. These have occurred within projects and programs as well as at national and regional levels. At the same time, geo-political interest in the Pacific region has resulted in an increased number of bilateral and multilateral donor agencies becoming present in the region and/or funding development programs, local organisations, national governments and regional bodies. This has in turn led to an evaluation landscape where notions of 'international best practice', donor policies and practices, and the associated international research and consulting companies risk crowding out emergent Pacific-led evaluation initiatives.

This panel will bring together key participants who are leading three examples of these Pacific experiences: the Rebbilib process initiated by the Pacific Community (SPC); Insight Pacific, an emerging Pacific-led and owned collective focused on evaluation in the first instance; and the Balance of Power program, a Pacific-led initiative, supported by the Australian Government, focused on improving the political, social and economic opportunities of women and girls. Each is seeking to create space for processes of monitoring, evaluation and learning that are consistent with Pacific ways of knowing and being. Panellists will share their experience, the challenges they face and ideas about which forms of support from international donors, consultants and advisors would be enabling rather than undermining.

Moderated by Prof. Chris Roche, the panel and audience will also draw out lessons from these three cases about what might contribute to more systemic change in the evaluation landscape more generally.
Speakers
Kaita Sem

Relationship and Learning Adviser, The Pacific Community
I am a Pacific professional emerging in the MEL space with a background in policy. As a relatively new MEL practitioner, I am really open to chat about anything or talanoa about how MEL can enable transformation globally and particularly in the Pacific. I look forward to meeting... Read More →
Chris Roche

Professor of Development Practice, La Trobe University
I am Professor of Development Practice with the Centre for Human Security and Social Change at La Trobe University (https://www.latrobe.edu.au/socialchange), former Deputy Director of the Developmental Leadership Program (www.dlprog.org) and member of the intellectual leadership... Read More →
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Rebalancing Power Dynamics: Philanthropy Through the Lens of First Nations Community-Driven MEL
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
103
Authors: Skye Trudgett (Kowa), Rachel Kerry (CAGES Foundation, AU)

The philanthropic sector has long grappled with power imbalances inherent in funding relationships, particularly in the context of support for First Nations communities. This panel session explores a groundbreaking Monitoring, Evaluation, and Learning (MEL) approach that inverts traditional power structures, placing First Nations communities in the driver's seat to assess the adherence of a leading philanthropic organisation to their stated values and principles.
Drawing from the collaborative efforts of one foundation and its MEL partner, this session showcases a MEL model that exemplifies shared power and mutual accountability. The panel will consist of thought leaders from the philanthropic sector, First Nations community representatives, and MEL experts who have been at the forefront of developing and implementing this innovative approach.
Through a facilitated discussion, panellists will delve into the process of co-creating a MEL framework that empowers communities to evaluate the performance of philanthropists against a set of mutually agreed-upon criteria. This approach ensures that philanthropic actions align with community expectations, cultural protocols, and contribute to genuine and sustainable impact.
Attendees will gain insights into the challenges and successes of operationalising this community-centric MEL method. The session aims to inspire other philanthropic entities to reflect on their practices and adopt similar approaches that truly shift power to First Nations communities.
Speakers
Skye Trudgett

CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application... Read More →
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

5:00pm AEST

Trigger warnings - do they just trigger people more?
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
104
Authors: Kizzy Gandy (Verian, formerly Kantar Public)

As evaluators, one of our key ethical responsibilities is not to cause psychological harm or distress through our methods. We often start workshops or interviews with a warning that the topic may be upsetting, and provide participants with contact information for mental health services, on the assumption that this is the most ethical practice.

Trigger warnings are used with good intentions and are often recommended in evaluation ethics guidelines. However, what do we know about their impact? Is there a risk they actually trigger people more?

This talk examines the evidence on whether trigger warnings are an effective strategy for reducing the risk of trauma and re-traumatisation when discussing topics such as sexual assault, mental health, violence, drug use, and other sensitive issues. It also touches on new evidence from neuroscience about how emotions are understood differently now compared with the past.

This session will not provide a definitive answer on when or how to use trigger warnings but aims to challenge the audience to think critically about whether trigger warnings are useful in their own work.
Speakers
Kizzy Gandy

National Director, Program Evaluation, Verian
Dr Kizzy Gandy is Verian's National Director of Program Evaluation. She leads a team of expert methodologists and provides quality assurance. With 20 years’ experience in consultancy, federal and state government, and academia, Kizzy has overseen the design and evaluation of over... Read More →
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

5:00pm AEST

Development and implementation of a culturally grounded evaluation Framework: Learnings from an Aboriginal and Torres Strait Islander Peak.
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
Authors: Candice Butler (Queensland Aboriginal and Torres Strait Islander Child Protection Peak), Michelle McIntyre (Queensland Aboriginal and Torres Strait Islander Child Protection Peak, AU), John Prince (JKP Consulting, AU)

There is increasing recognition that evaluations of Aboriginal and Torres Strait Islander programs must be culturally safe and appropriate, and represent the worldviews, priorities, and perspectives of Aboriginal and Torres Strait Islander communities. Aboriginal and Torres Strait Islander peoples have the cultural knowledge and cultural authority to design appropriate evaluations that are safe, and that tell the true story of the impacts of our ways of working.

As a peak body for Aboriginal and Torres Strait Islander community-controlled organisations we wanted to ensure that the worldviews and perspectives of our members and communities are embedded in any evaluations of services delivered by our member organisations. This is a necessary step towards building an evidence base for our ways of working, developed by and for Aboriginal and Torres Strait Islander people. To that end we developed an evaluation framework to enable self-determination and data sovereignty in evaluation, and to build capacity among our member organisations to undertake and/or commission culturally grounded evaluations. Culturally grounded evaluations are led by Aboriginal and Torres Strait Islander people and guided by our worldviews and knowledge systems - our ways of knowing, being and doing.

This paper reports on the development and implementation process used in the project and describes the standards and principles which underpin the framework. An example of how the framework is being applied in practice is also outlined. Our principles for evaluation describe the core values which underpin culturally grounded and safe evaluation including self-determination; cultural authority; truth-telling; two-way learning; and holistic approaches. The evaluation standards and associated elements operationalise our principles and embed them in evaluative practice.
Speakers
Candice Butler

Executive Director, Centre of Excellence, Queensland Aboriginal and Torres Strait Islander Child Protection Peak
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

5:00pm AEST

Failing your way to better practice: How to tread carefully when things aren't going as planned
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
105
Authors: Stephanie White (Victoria Department of Education)

Evaluators can fail in many ways. The consequences of these failures can be relatively contained or wide-ranging within the evaluation, and can also flow on to program operations. But failure is a part of life and can be a useful catalyst for professional growth. What happens when you find yourself failing and can see the risks ahead? How do you keep going?

The session focuses on the experiences of an emerging evaluator who failed while leading a large-scale education evaluation. When some elements of the evaluation became untenable, they struggled to find the right path forward and could foresee the risks materialising if the situation wasn’t addressed. On the other side of it, they reflect on how they drew on tools in every evaluator’s toolkit to start remedying their previous inaction and missteps to get the evaluation back on track…and improve their practice along the way!

This session is relevant to any evaluator who grapples with the messiness of expectations and reality in their practice.


Speakers
Stephanie White

Victoria Department of Education
I found my way to evaluation in the last few years after mulling over questions of education program quality and success for years. Now working as a Senior Evaluation and Research Officer at the Victoria Department of Education, most of my career has been in the Northern Territory... Read More →
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

5:30pm AEST

AES 2024 Annual General Meeting
Wednesday September 18, 2024 5:30pm - 6:15pm AEST
Join the Australian Evaluation Society (AES) Board as we celebrate another year’s achievements by members of the AES, and introduce newly elected Board members.

Speakers
Kiri Parata

President, Australian Evaluation Society
Wednesday September 18, 2024 5:30pm - 6:15pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Thursday, September 19
 

8:30am AEST

Plenary: Elizabeth Hoffecker "Wayfinding tools for learning and evaluation in complex systems"
Thursday September 19, 2024 8:30am - 10:00am AEST
Lead Research Scientist, Local Innovation Group, Massachusetts Institute of Technology (MIT), USA

What does a wayfinding approach look like when seeking to learn from and evaluate interventions into complex systems? 

Many of the most intractable challenges facing communities around the world are system challenges requiring system-level responses. Development-focused donors and implementers at various levels are recognizing this and funding system-strengthening and systems-change work across a variety of systems. Monitoring, evaluation, and learning work, however, has traditionally been focused at the project level, not the level of the dynamic local systems in which projects operate. A new kind of evaluation is needed for this work and is in the early stages of being developed, tested, and improved through learning-by-doing.

In forums such as the UNDP’s M&E Sandbox and the BMGF-funded Systems Monitoring, Learning, and Evaluation initiative, development donors, implementers, and evaluators are asking questions such as: what evaluation designs and approaches are most suitable for learning from and evaluating system- and portfolio-level interventions? And how do we know if we are making progress, generating results, and contributing to positive change in a complex system?
Drawing on experience implementing “complexity-aware” evaluations of system-change interventions in Northern India and Guatemala, this session develops and explores responses to these questions. The presentation shares an evaluation approach and six related tools that are being used to evaluate, learn, and implement adaptively in these two very different system contexts. The tools - while humble and likely familiar - can become powerful wayfinding instruments for navigating complexity when combined with a systems-informed evaluation design. This session introduces this approach through a keynote presentation and then further develops it through an interactive panel with systems-informed evaluators working both internationally and domestically in Australia.
Speakers
Elizabeth Hoffecker

Lead Research Scientist, Local Innovation Group, Massachusetts Institute of Technology (MIT), USA
Elizabeth Hoffecker is a social scientist who researches and evaluates processes of local innovation and systems change in the context of addressing global development challenges. She directs the MIT Local Innovation Group, an interdisciplinary research group housed at the Sociotechnical... Read More →
Thursday September 19, 2024 8:30am - 10:00am AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Evaluating capacity building for sustainability scientists: Pathways for early career researchers
Thursday September 19, 2024 10:30am - 11:00am AEST
104
Authors: Lisa Walker (CSIRO)

In the quest for sustainable solutions to pressing global challenges, transdisciplinary approaches that integrate insights and methods from various fields are increasingly recognised as key to driving change. This presentation will share insights from an evaluation of a five-year, $18 million sustainability science research program seeking not only to drive innovation but also to empower the next cohort of early career researchers (ECRs) to coordinate action across research, policy and practice to address complex sustainability problems.
Our formative, mixed-methods evaluation highlighted on-the-job learning, face-to-face engagement and networking as pivotal in building sustainability science capacity. We also found targeted recruitment, research team engagement and the provision of support and resources to supervisors to be essential, and sometimes overlooked, components. This work contributes to the broader discussion on how evaluation can enhance the development of sustainability science, proposing a framework that emphasises the individual, team and institutional support mechanisms necessary for effective ECR capacity building.
Novelty in our approach lies in the integration of evaluative practices within the capacity-building process, offering a reflective lens on how transdisciplinary endeavours can be optimised to address sustainability challenges. This is particularly relevant for evaluators wanting to build their own skills, or those of others, to engage on complex sustainability issues. The study also underscores the significance of adaptive learning and evaluation in navigating the complexities of sustainability science, inviting a broader conversation on how evaluation can be leveraged to facilitate meaningful contributions to societal and planetary well-being.
Speakers
Lisa Walker

CSIRO
I am a social scientist with a background in program monitoring, evaluation and sustainable development. I am currently working with  CSIRO's Valuing Sustainability Future Science Platform (VS FSP) and manage the Monitoring, Evaluation, Learning and Research project within the VS... Read More →
Thursday September 19, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Applying outcome harvesting methodology in the media development sector
Thursday September 19, 2024 10:30am - 11:00am AEST
103
Authors: Seila Sar (Internews Network)

Interventions seeking to influence policy and practice are notoriously difficult to evaluate due to the confluence of factors that impact any policy-making process. Media is commonly considered to have significant influence over policy-making; however, the connection between cause and effect is complex, making it difficult to attribute policy outcomes to media interventions. To overcome this challenge, the author employs the outcome harvesting methodology adapted from Ricardo Wilson-Grau (2018) to capture the impact of a media intervention on environment-related policy and practice. Started in 2017, this seven-year intervention aims to improve environment-related governance and accountability in the Asia Pacific region through a strengthened information ecosystem for informed decision-making and action by citizens, political leaders, and other key decision makers.

The objective of this session is to share experience and lessons learned from implementing outcome harvesting in the media development sector. The session will highlight how the methodology is used to collect evidence of impact and unpack the complexity of the policy environment. This approach generates rigorous data on the extent to which quality information influences discourse and environmental practices, prompting actions by local governments to improve environment-related governance and accountability in the region.

The presentation will also lay out challenges and mitigation strategies in applying this methodology in the media sector, and will conclude with audience questions. The session will be valuable for evaluation practitioners interested in integrating outcome harvesting into their evaluation work; participants will be able to use the insights gained to refine their own outcome harvesting process for better results.
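For readers new to the method, the sketch below shows one possible way to structure harvested outcome records for substantiation and simple tallying. The fields and examples are invented for illustration; they are not drawn from the Internews program or from Wilson-Grau's text.

# One way to hold harvested outcomes for verification and analysis.
from dataclasses import dataclass

@dataclass
class Outcome:
    description: str    # observable change in a social actor
    actor: str          # who changed
    contribution: str   # how the intervention plausibly contributed
    substantiated: bool # independently verified with a knowledgeable source

harvest = [
    Outcome("Provincial office publishes logging permits online",
            "provincial government", "journalist training and data requests", True),
    Outcome("Radio station adds a weekly environment segment",
            "local media", "story grants and mentoring", False),
]

# Simple substantiation tally: how much of the harvest is verified?
verified = sum(o.substantiated for o in harvest)
print(f"{verified}/{len(harvest)} outcomes substantiated")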
Speakers
Thursday September 19, 2024 10:30am - 11:00am AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

National impact, regional delivery - Robust M&E for best practice Australian horticulture industry development.
Thursday September 19, 2024 10:30am - 11:00am AEST
Authors: Ossie Lang (RMCG), Donna Lucas (RMCG), Carl Larsen (RMCG), Zarmeen Hassan (AUSVEG), Cherry Emerick (AUSVEG), Olive Hood (Hort Innovation)

How do you align ten regionally delivered projects with differing focus topics to nationally consistent outcomes? Take advantage of this opportunity to explore the journey of building and implementing a robust Monitoring and Evaluation (M&E) program that showcases regional nuances and aligns national outcomes, making a significant contribution to the success of this horticultural industry extension project.

Join us for an insightful presentation on how a national vegetable extension project focused on adoption of best management practices on-farm, has successfully implemented a dynamic M&E program. Over the two and a half years of project delivery, the national M&E manager, in collaboration with ten regional partners, has crafted a program that demonstrates regional impact consistently on a national scale and adapts to the project's evolving needs.

The presentation will highlight the team's key strategies, including the upskilling of Regional Development Officers in M&E practices. Learn how templates and tools were designed to ensure consistent data collection across approximately 40 topics. The team will share the frameworks utilised to capture quantitative and qualitative monitoring data, providing a holistic view of tracking progress against national and regional outcomes and informing continuous improvement in regional delivery.

Flexibility has been a cornerstone of the M&E program, allowing it to respond to the changing needs of growers, industry, and the funding partner and seamlessly incorporate additional data points. Discover how this adaptability has enhanced the project's overall impact assessment and shaped its delivery strategy.

The presentation will not only delve into the national perspective but also feature a firsthand account from one of the Regional Development Officers. Gain insights into how the M&E program has supported their on-the-ground delivery, instilling confidence in providing data back to the national project manager. This unique perspective offers a real-world understanding of the national program's effectiveness at a regional level.
Thursday September 19, 2024 10:30am - 11:00am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

An evaluator in need of an evaluation
Thursday September 19, 2024 10:30am - 11:30am AEST
105
Authors: Dana Cross (Grosvenor), Kristy Hornby (Grosvenor)

"If all you have is a hammer, then everything looks like a nail." - Maslow/Kaplan/Unknown

Maslow's Hammer (aka the law of the instrument or golden hammer) and déformation professionnelle are concepts that speak to cognitive biases that can limit our effectiveness. Essentially, they mean that we use what we know and as evaluators, that is evaluation.

How can we as evaluators and commissioners of evaluations avoid cognitive bias and work effectively within (evaluation) policy parameters to ensure we are adding value and not using evaluation as the only tool in our toolbox?

We invite you to join us in a fast-paced interactive session to unpack:
  • the ways in which our expertise can get in our way
  • what it means to stay open to other tools as evaluation professionals and commissioners of evaluation
  • how this challenges us as individuals and as a profession.
Speakers
Dana Cross

Associate Director, Grosvenor
Dana is a public sector expert, possessing over 17 years of deep experience advising government organisations on program evaluation, organisational review, service optimisation and performance management. She is a member of Grosvenor’s Executive Leadership Team as Head of Strategy... Read More →
Thursday September 19, 2024 10:30am - 11:30am AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Navigating the maze of causality: Understanding the relationship between carcinogenic betel nut consumption and learning outcomes
Thursday September 19, 2024 10:30am - 11:30am AEST
106
Authors: Kabira Namit (Abt Global), Kathryn Lee (Abt Global, AU)

This hands-on session is designed to strengthen emerging evaluators' intuition about causality in non-experimental evaluations.

In environments where conventional RCTs are unethical or infeasible, identifying causal factors within the complex weave of societal influences and individual behaviours presents a significant challenge. Centred on a novel research project from Papua New Guinea, this session navigates the maze through real-world research exploring the intricate relationship between consumption of the carcinogenic betel nut and educational outcomes. By focusing on this specific case study, we provide a concrete context for participants to understand the broader implications of causal exploration in fragile and sensitive settings.

Participants will actively engage in small group discussions in a collaborative learning environment where they can practice and refine their skills in causal evaluation by discussing scenarios that are reflective of real-world complexities.

This session aims to move beyond simply documenting correlations, encouraging a deep dive into the underlying dynamics of causal linkages. Through this exploration, we aim to guide participants towards discussions of pathways for targeted interventions and policy formulations that take causal chains into account.
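As a generic illustration of the kind of non-experimental reasoning the session practises - not the authors' method, and with all data simulated - the sketch below shows how adjusting for a single confounder changes an estimated exposure effect:

# Simulated data: a confounder (SES) drives both exposure and outcome.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
ses = rng.normal(size=n)                     # confounder, e.g. household SES
exposure = (rng.normal(size=n) - 0.5 * ses > 0).astype(float)  # likelier if SES is low
score = 50 - 3.0 * exposure + 4.0 * ses + rng.normal(scale=5, size=n)

# Naive difference in means is biased: exposed pupils also have lower SES.
naive = score[exposure == 1].mean() - score[exposure == 0].mean()

# Ordinary least squares with the confounder included recovers an
# estimate close to the true effect of -3.
X = np.column_stack([np.ones(n), exposure, ses])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

print(f"naive difference: {naive:.2f}, adjusted estimate: {beta[1]:.2f}")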

Additionally, we aim to spark dialogue on the ethical dimensions of 'activist research,' exploring how evaluators can navigate moral dilemmas while advocating for meaningful change. This hands-on session not only seeks to build evaluative skills but also to inspire participants to consider the broader implications of their work on societal well-being and ethical research practices.
Thursday September 19, 2024 10:30am - 11:30am AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Commissioning evaluations - finding the way from a transactional to a relational approach
Thursday September 19, 2024 10:30am - 12:00pm AEST
Authors: Eleanor Williams (Australian Department of Treasury), Josephine Norman (Victorian Department of Health, AU), Melissa Kaltner (Lumenia, AU), Skye Trudgett (Kowa Collaboration, AU), George Argyrous (Paul Ramsay Foundation, AU), Luke Craven (National Centre for Place-Based Collaboration (Nexus), AU)

Delivering great evaluations requires a strong professional relationship between those commissioning and delivering the evaluation, as well as all relevant stakeholders.

Traditional evaluation commissioning approaches have tended to treat evaluation as a one-off exchange focusing on the completion of pre-defined tasks. However, the evolving landscape of policies and programs tackling complex issues demands a more nuanced and relational approach to get the most out of the journey of evaluation.

This big room panel session brings together speakers at the forefront of thinking on collaborative commissioning partnerships, from the perspectives of government, not-for-profit and Indigenous-led organisations, and the private sector, who between them can play the full suite of roles on the commissioning journey. The discussion will delve into the experiences of a range of organisations involved in commissioning who are seeking to build enduring relationships, and in some cases partnerships, between the commissioners, the evaluators and the stakeholders to whom we are accountable.

Drawing on real-world case studies and empirical evidence, the discussion will highlight the challenges and rewards of transitioning from a transactional model to a relational model. It will explore how this paradigm shift can enhance collaboration and ultimately lead to a range of positive outcomes.

Attendees will be invited to engage in dialogue with the panel, bringing the collective wisdom of attendees together to consider what the destination of better commissioning relationships would look like, the practical obstacles we face on the pathway, and how we can reach that destination. To facilitate this active discussion, attendees will have the opportunity to use Sli.do throughout the session to provide input on key questions, share experiences in real time and ask questions of the expert panel.
Speakers
Eleanor Williams

Managing Director, Australian Centre for Evaluation
Eleanor Williams is a public policy, research and evaluation professional with 20 years' experience working with the public sector. She is the Managing Director of the Australian Centre for Evaluation and established the Australian Public Sector Evaluation Network in 2019. Eleanor... Read More →
Skye Trudgett

CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application... Read More →
George Argyrous

Head of Measurement, Evaluation, Research, and Learning, Paul Ramsay Foundation
Josephine Norman

Director, Centre for Evaluation and Research Evidence, Dept of Health/Dept of Families, Fairness and Housing
I run a large internal evaluation unit, directing a team of 30 expert evaluators and analysts to: directly deliver high priority projects; support program area colleagues to make the best use of external evaluators; and, build generalist staff capacity in evaluation principles and... Read More →
Luke Craven

Independent Consultant
Thursday September 19, 2024 10:30am - 12:00pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Learn, evolve, adapt: Evaluation of climate change and disaster risk reduction programs
Thursday September 19, 2024 11:00am - 11:30am AEST
104
Authors: Justine Smith (Nation Partners )

There is a pressing need to reduce the risks associated with climate change and the disasters that are likely to increase as a result. Along with the need to take action comes the need to show we are making a difference - or, perhaps more importantly, the need to learn and evolve to ensure we are making a difference. However, when operating in an ever-changing, uncertain environment, with layers of complexity and outcomes that may not be realised for some time, or until disaster strikes, evidence of impact is not always easy to collect, nor always a priority.

Drawing on experience developing evaluation frameworks and delivering evaluation projects in climate change and in disaster and emergency management, I will present some of the challenges and opportunities I have observed. In doing so, I propose that there is no 'one way' to do things. Rather, taking the time to understand what we are evaluating, and to continually learn, evolve and adjust how we evaluate, is key. This includes having clarity on what we really mean when we talk about reducing risk and increasing resilience. Ideas I will explore include:
  • The concepts of risk reduction and resilience.
  • The difference between evaluation for accountability and for genuine learning and improvement.
  • Balancing an understanding of and progress towards big picture outcomes with project level, time and funding bound outcomes.
  • The challenge and potential benefits of event-based evaluation to learn and improve.

Evaluation has the capacity to contribute positively to action taken to reduce climate change risks and improve our management of disasters and recovery from disasters. As evaluators we too need to be innovative and open-minded in our approaches, to learn from and with those working directly in this space for the benefit of all.
Speakers

Justine Smith

Principal, Nation Partners
With a background spanning research, government, non-government organisations and consulting, Justine brings technical knowledge and over 10 years of experience to the projects she works on. As a highly experienced program evaluator and strategic thinker, Justine has applied her skills... Read More →
Thursday September 19, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Culturally inclusive evaluation with culturally and linguistically diverse communities in Australia
Thursday September 19, 2024 11:00am - 11:30am AEST
Authors: Thushara Dibley (CIRCA Research), Lena Etuk (CIRCA Research, AU)

In this presentation we will outline an approach to culturally inclusive evaluation with people from culturally and linguistically diverse backgrounds in Australia, its strengths, and its growth opportunities. This approach fills a critical gap in the way evaluation and research with culturally and linguistically diverse communities is traditionally conducted in Australia.

In this presentation we will explain how the Cultural & Indigenous Research Centre Australia (CIRCA) conducts in-culture and in-language evaluation with diverse cohorts of Australians, and how this practice fits within the broader methodological discourse in evaluation and social science more broadly. We will illustrate how our culturally inclusive methodology is put into practice with findings from CIRCA's own internal research into the way cultural considerations shape our data collection process. We will conclude with reflections on how CIRCA might further draw on and leverage standpoint theory and culturally responsive evaluation as this practice is further refined.

Our key argument is that doing culturally inclusive evaluation is a process that requires reflexivity and learning, alongside strong and transparent institutional processes. Combining these approaches creates systemic ways of acknowledging and working within stratified and unequal social systems, inherent to any research. Our findings will advance knowledge within the field of evaluation about how to engage and represent culturally and linguistically diverse community members across Australia.
Speakers

Thushara Dibley

CIRCA Research

Lena Etuk

Director, Research & Evaluation, Cultural & Indigenous Research Centre Australia
I’m an applied Sociologist with 16+ years of experience in evaluation and social research. At CIRCA I lead an amazing team of research consultants from a huge range of diverse backgrounds. We specialise in qualitative evaluation and research with non-English speaking CALD and Aboriginal... Read More →
Thursday September 19, 2024 11:00am - 11:30am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Our journey so far: a story of evaluation to support community change in South Australia
Thursday September 19, 2024 11:00am - 11:30am AEST
103
Authors: Penny Baldock (Department of Human Services South Australia), Jessie Sleep (Far West Community Partnerships, AU)

The multi-jurisdictional South Australian Safety and Wellbeing Taskforce is the lead mechanism, and the accountable body, for developing strategies and sustainable, place-based responses that ensure the safety and wellbeing of remote Aboriginal Visitors in Adelaide and other regional centres in the State.

This presentation discusses the challenges of establishing an evaluative learning strategy for the Taskforce that meets the needs of multiple government agencies and stakeholders, multiple regional and remote communities, and multiple nation groups.

In a complex system, this is a learning journey, requiring us to adapt together to seek new ways of understanding and working that truly honour the principles of data sovereignty, community self-determination, and shared decision-making.
As we begin to more truly centre communities as the locus of control, and consider the far-reaching reform that will be necessary to deliver on our commitments under Closing the Gap, this presentation provides an important reflection on the skills, knowledge and expertise that will be required to build evaluation systems and processes that support change.

One of the most exciting developments to date has been the establishment of a multi-agency data sharing agreement, which will enable government data to be shared with Far West Community Partnerships, a community change organisation based in Ceduna, and combined with their community owned data in order to drive and inform the Far West Change Agenda.

We present the story of our journey so far, our successes and our failures, and extend an invitation to be part of the ongoing conversation to support the change required for evaluation success.

Speakers

Penny Baldock

Department of Human Services
Thursday September 19, 2024 11:00am - 11:30am AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Bringing the "human" into measurement: From in-depth inquiry to systemic change
Thursday September 19, 2024 11:30am - 12:00pm AEST
104
Authors: Julia Suh (Tobias)

Humans are complex and diverse. To create social change, what do we need to understand about them?

Their behaviours and mindsets are key, but the broader context and systems they operate within paint a fuller picture of the multiple moving parts that need to change simultaneously for sustained impact. These changes can be mapped, with embedded evaluative thinking, building a pathway for formal evaluation.

In this session, experts in Human-Centred Design and social change share their innovative approaches to thinking beyond project- or program-level goals or organisational-level performance indicators. Examples are drawn from direct experience working across various transformation projects, from reducing child sexual exploitation and preventing academic misconduct to improving the care economy and elevating patient outcomes. They demonstrate not only how program goals and a social change vision can be realised together, but also how a combination of strategic prioritisation, collaboration, capability building and networks can accelerate the process.
Speakers

Julia Suh

Lead strategic designer, Tobias
Thursday September 19, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

"Nothing about us, without us": Developing evaluation framework alongside victim-survivors of modern slavery using representative participatory approaches
Thursday September 19, 2024 11:30am - 12:00pm AEST
Authors: Ellie Taylor (The Salvation Army)

Amplifying survivor voices has been the cornerstone of The Salvation Army's work in the anti-slavery realm. How does this translate to the monitoring and evaluation space? How do we truly represent the voices and experiences of those with lived experience of modern slavery in monitoring and evaluation, whilst aligning with key human rights principles?

Our Research Team are exploring how to centre survivor voices in the evaluation space. This session will detail the use of a representative participatory evaluation approach to monitor and evaluate the Lived Experience Engagement Program (LEEP) for survivors of criminal labour exploitation. In this session we will explore the challenges and learnings uncovered through this project.

The LEEP is designed to empower survivors of criminal labour exploitation to share their expertise to make change. Piloted in 2022-2023, and continuing into 2024-2025, the LEEP - and resulting Survivor Advisory Council - provides a forum for survivors to use their lived experience to consult with government - to assist in preventing, identifying and responding to modern slavery.

The key points explored in this session will include:
  • Realities of implementing an adaptive model, including continuous integration of evaluation findings into an iterative survivor engagement model.
  • The importance of stakeholder inclusivity, integrating lived experience voices and amplifying them alongside program facilitators and government representatives.
  • Complexities of evaluation in the modern slavery space, particularly when victim-survivors of forced marriage are included. We will speak to the need for trauma-informed, strengths-based measures and facilitating partnerships with the people the program serves.

Leading the session will be The Salvation Army's project lead, who holds a PhD in mental health and has over 12 years of experience working with diverse community groups in Australia and internationally. They have extensive experience presenting at conferences both domestically and internationally.
Speakers

Ellie Taylor

Senior Research Analyst, The Salvation Army
Ellie has a background in mental health and has spent 12+ years designing and conducting research and evaluation initiatives with diverse communities across Australia and internationally. In this time, she's worked with people from all walks of life, across the lifespan, from infants... Read More →
Thursday September 19, 2024 11:30am - 12:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Navigating complex government reforms: a tool to examine program theory. How complete and relevant is government program theory?
Thursday September 19, 2024 11:30am - 12:00pm AEST
105
Authors: Nerida Rixon

Developing program theory for complex government reforms and charting the 'how' we will reach our destination is not easy. Perhaps you, like me, rely on well-tested templates? Do we challenge them and continually refine them to reflect emerging research and make them more useful for our purposes?

This research is about public policy packages and reforms and the program theories (or reform theories) that explain them. This research emerges from a desire to build program theory better, particularly in the context of whole of government reforms. Better program theory can drive better planning, monitoring and evaluation of performance, and better policy and public good.

Evidence shows Australian governments are not effectively planning, monitoring and evaluating performance of programs and policy packages. Theory can support development of meaningful performance indicators to track progress. Without strong program theory and clear strategy, as the Productivity Commission's recent 'Review of the National Agreement on Closing the Gap' study report suggests, we risk a 'spray and pray' approach to change, prioritisation of the wrong things and siloed policy responses.

A literature-informed checklist for analysing program theory for completeness and relevance to public administration is provided. Policy makers and evaluators are given a tool and lens to build more complete and relevant program theory and to improve existing program theory.

Analysis of program theory in 15 government reform strategies and outcomes frameworks is presented to show governments' strengths and opportunities. Governments are strong at identifying our destination, or the intended outcomes, though not always in identifying the 'how' we will get there. Governments could improve their program theory by making it more explicit and more complete by articulating 'the when' we expect to see changes from implementing the reforms. Government program theory might be more relevant if potential (non-intended) outcomes are referenced.
Thursday September 19, 2024 11:30am - 12:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Social Impact Measurement & Evaluation – the similarities & differences that complement our journey to more fit-for-purpose destinations.
Thursday September 19, 2024 11:30am - 12:30pm AEST
106
Authors: Laura Glynn (SIMNA)

The measurement space has seen many new actors, terms, approaches and "gold standards" emerge in the last two decades. It has become more difficult than ever to navigate and explore our intended destination in the space of measurement and evaluation. What schools of thought are worth exploring? What value do they offer to an existing evaluation skillset? We are also traversing heightened levels of complexity, with cost-of-living, environmental and social-fabric crises. In this busy and crowded environment, the Social Impact Measurement Network (SIMNA)-led panel will seek to explore the similarities and differences between evaluation and social impact measurement (SIM) as mindsets to help steer us towards our destination.

The panel will involve 3 speakers from diverse sectoral backgrounds – government, not-for-profit, and private spheres, all commenting (broadly) on the questions: Are evaluation and social impact measurement the same? To what extent do they differ? How can they complement one another? While the questions themselves will be more nuanced than that, the answers will hold broad value for attendees in considering how they can bring complementary approaches and mindsets to navigating the work they do in measurement and evaluation. The panellists will draw on their unique perspectives across different sectoral and practice spaces to discuss this complementarity.
Thursday September 19, 2024 11:30am - 12:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia
  Journey

11:30am AEST

A new tool for participatory evaluation: A case study of the process of conducting online workshops with young creators with disabilities to tell stories using virtual reality animation
Thursday September 19, 2024 11:30am - 12:30pm AEST
103
Authors: Samantha Abbato (Visual Insights People), Lisa Stafford (University of Tasmania, AU)

Researchers from fields including public health, community, and disability have been utilising film methods such as participatory video and digital storytelling for decades. Co-creation of film narratives for evaluation can engage participants as unique people with lived experiences connected to social and cultural worlds and relationships, including their connection with the community. The evaluation field, however, has been slow to adopt participatory film methods.

Virtual Reality (VR) animation presents a distinctly participatory approach to evaluation data collection and a new lens for communicating findings. It places the participant in the driver's seat and the evaluation audience in the passenger seat alongside them. Using VR stories can increase the potential for the intended audience, including decision-makers, to engage deeply with the information communicated, through focused immersion in participant stories and the familiarity of local settings.

We present a case study examining the process of collaborating with young people with disabilities to tell their stories of inclusion in Tasmania, Australia. Three young people participated in online storyboarding and script-writing workshops over twelve months to develop short stories of everyday experiences in their community. An introduction to the participants and their stories, the three completed stories, and a collaborative call to action were made into a set of five connected VR short films. The films were displayed as a report on a website page and for viewing as a complete VR story on a headset.

The presenters examine the process of applying this new VR digital storytelling approach to participatory evaluation. The challenges and benefits of the approach for participants and its impact on the intended audience, including urban planning and design students, are discussed. Using the lessons learned from the case study, recommendations for evaluators considering using participatory digital storytelling and VR animation are made.
Thursday September 19, 2024 11:30am - 12:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Systems evaluation to the rescue!: How do we use systems evaluation to improve societal and planetary wellbeing?
Thursday September 19, 2024 12:00pm - 12:30pm AEST
104
Authors: Kristy Hornby (Grosvenor), Dan Healy (First Person Consulting)

Systems evaluation - many might have heard the term, but few have done one. This session shares two case studies of different systems evaluations and the learnings from these to benefit other evaluators who are conducting or about to begin a systems evaluation.

The session will open with an overview and explanation of what systems evaluation is, in terms of its key features and how it is distinguished from other forms of evaluation. The presenters will then talk through their case studies, one of which centres on a regionally based health initiative while the other takes a sector-wide focus across the whole of Victoria. The co-presenters will share openly and honestly their initial plans for commencing the systems evaluations, how they had to amend those plans in response to real-world conditions, and the tips and tricks and innovations they picked up along the way.
Speakers

Kristy Hornby

Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and... Read More →

Tenille Moselen

First Person Consulting
https://www.fpconsulting.com.au/our-team.html
Thursday September 19, 2024 12:00pm - 12:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Optimising Evaluations of Wellbeing Programs in Schools
Thursday September 19, 2024 12:00pm - 12:30pm AEST
105
Authors: Tamara Van Der Zant (Australian Council for Educational Research), Katherine Dix (Australian Council for Educational Research, AU)

In this presentation we will discuss the diverse and flexible data collection methods suitable for program evaluation in the context of schools. We will discuss the types of evidence that can be used to evaluate social and emotional learning programs and wellbeing initiatives, specifically, and considerations when working with educators, children and young people. We will invite all to participate in discussions about challenges to the evaluation of these programs in complex, real-world contexts (including data quality, confounding factors, system requirements, etc.) and propose methods we use to navigate these challenges.

Optimising program evaluation methods is important because of the ever-growing number of wellbeing programs being offered to schools. Accordingly, the need for high quality evaluation to guide funding decisions and use of programs and initiatives to support student and educator wellbeing in schools has never been greater.

By drawing on comprehensive experience in undertaking wellbeing program evaluations, this presentation will share our lessons learnt and recommendations to support evaluators in crafting contextually appropriate evaluations. High-quality program evaluations, often a requirement for ongoing funding, address the growing need for meaningful and accessible evidence currently being sought by schools, educators, funders, and policy decision makers.
Speakers

Tamara Van Der Zant

Research Fellow, Australian Council for Educational Research
Tamara is a Research Fellow in the Educational Monitoring and Research Division at ACER. Prior to this role she completed her PhD in emotion research at the University of Queensland. She brings experience in research design, conducting research with diverse populations, broad data... Read More →
Thursday September 19, 2024 12:00pm - 12:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Getting to the value add: Timely insights from a realist developmental evaluation
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Authors: Phillip Belling (NSW Department of Education), Liam Downing (NSW Department of Education, AU)

This paper is aimed at early career and experienced evaluators interested in realist evaluation, but with concerns about the time a realist approach might take. The authors respond to this concern with an innovative blending of realist and developmental evaluation. Participants will exit the room with a working understanding of realist developmental evaluation, including its potential for adaptive rigour that meets the needs of policy makers and implementers.

Realist evaluation is theoretically and methodologically robust, delivering crucial insights about how, for whom and why interventions do and don't work (House, 1991; Pawson & Tilley, 1997; Pawson, 2006). It aims to help navigate unfamiliar territory towards our destination by bringing assumptions about how and why change happens out in the open.

But even realism's most enthusiastic practitioners admit it takes time to surface and test program theory (Marchal et al., 2012; van Belle, Westhorp & Marchal, 2021). And evaluation commissioners and other stakeholders have understandable concerns about the timeliness of obtaining actionable findings (Blamey & Mackenzie, 2007; Pedersen & Rieper, 2008).

Developmental evaluation (Patton, 1994, 2011, 2021; Patton, McKegg, & Wehipeihana, 2015) is more about what happens along the way. It appeals because it provides a set of principles for wayfinding in situations of complexity and innovation. Realist and developmental approaches do differ, but do they share some waypoints to reliably unpack perplexing problems of practice?

This paper documents a journey towards coherence and rigour in an evaluation where developmental and realist approaches complement each other, and deliver an evidence base for program or policy decision-making that is not only robust but also timely.

We show that, in complex environments, with programs involving change and social innovation, realist developmental evaluation can meet the needs of an often-varied cast of stakeholders, and can do so at pace, at scale, and economically.
Speakers

Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of... Read More →

Liam Downing

Manager, Evaluation and Data, Quality Teaching Practice, NSW Department of Education
Liam is an experienced and impactful evaluation leader, with 18+ years of experience. He is focused on ensuring that evaluation is rigorous in its design, meaningful in informing next steps, and driven by building the capacity of as many people as possible to engage deeply in evaluative... Read More →
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Navigating the unfamiliar: Evaluation and sustainable finance
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Authors: Donna Loveridge (Independent Consultant), Ed Hedley (Itad Ltd UK, GB)

The nature and magnitude of global challenges, such as climate change, poverty and inequality, biodiversity loss, food insecurity and so on, mean that $4 trillion is needed annually to achieve the Sustainable Development Goals by 2030. Government and philanthropic funding is not enough; additional tools include business and sustainable finance. Evaluators may relate to many of the objectives that business and sustainable finance seek to contribute to, but discomfort can arise in the mixing of profit, financial returns, impact and purpose.

Sustainable finance, impact investing, and business for good are growing globally, providing opportunities and challenges for evaluators, evaluation practice and the profession.
This session explores this new landscape and examines:
  • What makes us uncomfortable about dual objectives of purpose and profit, notions of finance and public good, and unfamiliar stakeholders and languages, and what evaluators can do in response.
  • The opportunities for evaluators to contribute to solving interesting and complex problems with current tools and skills, and where there is space for developing evaluation theory and practice.
  • How evaluation practice and evaluators' competencies might expand and deepen so as not to get left behind in these new fields, while sustaining evaluation's relevance to addressing complex challenges.

The session draws on experience in Australia and internationally to share practical navigation maps, tools and tips to help evaluators traverse issues of values and value, work with investors and businesses, and identify opportunities to add value.
Speakers

Donna Loveridge

Impact strategy and evaluation consultant
I work with public sector and not for profit organisations and businesses to design and conduct evaluations and embed evaluative thinking in management systems and processes to strengthen learning and decision-making. Most of my work focuses on inclusive economic growth through impact... Read More →
Thursday September 19, 2024 12:00pm - 12:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Man vs. Machine: Reflections on machine-assisted and human-driven approaches used to examine open-text progress reports.
Thursday September 19, 2024 1:30pm - 2:00pm AEST
Authors: Stephanie Quail (ARTD Consultants), Kathleen De Rooy (ARTD Consultants, AU)

Progress reports and case notes contain rich information about program participants' experiences and frequently describe theoretically important risk and protective factors that are not typically recorded in administrative datasets. However, the unstructured narrative nature of these data - and, often, their sheer volume - is a barrier to human-driven qualitative analysis. Often, the data cannot be included in evaluations because it is too time- and resource-intensive to do so.

This paper will describe three approaches to the qualitative analysis of progress reports used to examine within-program trajectories for participants, and the factors important for program success as part of an evaluation of the Queensland Drug and Alcohol Court.

It will explore how we navigated the balance between human and machine-driven qualitative analysis. We will reflect on the benefits and challenges of text-mining - how humans and machines stack up against each other when identifying the sentiment and emotion in text, the strengths and challenges of each approach, the lessons we have learned, and considerations for using these types of approaches to analyse datasets of progress reports in future evaluations.
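
To make the human-versus-machine comparison concrete, here is a minimal Python sketch of one common machine-assisted step: labelling the sentiment of progress-note excerpts with an off-the-shelf model and checking agreement with a human coder. This is illustrative only, not the authors' actual pipeline; the excerpts, labels and score thresholds are hypothetical.

import nltk
from nltk.sentiment import SentimentIntensityAnalyzer
from sklearn.metrics import cohen_kappa_score

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download

# Hypothetical progress-note excerpts and a human coder's labels.
excerpts = [
    "Participant attended all sessions and reports feeling more confident.",
    "Missed two appointments; housing situation remains unstable.",
]
human_labels = ["positive", "negative"]

analyzer = SentimentIntensityAnalyzer()

def machine_label(text):
    # Map VADER's compound score onto the analyst's coding frame.
    score = analyzer.polarity_scores(text)["compound"]
    if score >= 0.05:
        return "positive"
    if score <= -0.05:
        return "negative"
    return "neutral"

machine_labels = [machine_label(t) for t in excerpts]
print("Machine labels:", machine_labels)
print("Cohen's kappa vs human coder:", cohen_kappa_score(human_labels, machine_labels))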
Thursday September 19, 2024 1:30pm - 2:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Digging up the evaluation theory tree: reconsidering our foundations in the assertion of Indigenous approaches and knowledge systems in the evaluation field
Thursday September 19, 2024 1:30pm - 2:30pm AEST
104
Authors: Aneta Cram (Victoria University of Wellington)

In this presentation, the presenter will speak to a developing model entitled 'The Decolonized Garden.' This model presents a metaphor for developing Indigenous evaluation frameworks and shifting the foundations of the field to allow Indigenous evaluation approaches to develop and thrive. The presenter will share how this model was developed, its content and how it can be used.

This model was created as a result of a doctoral research project exploring Indigenous evaluation frameworks. It draws together findings from the development process of existing Indigenous evaluation frameworks, the context they were developed in and the impact that they are having for the Indigenous peoples that they were developed for. These frameworks include the 'Evaluation with Aloha' framework (Queen Lili'Uokalani Trust, 2019), the 'Ngaa-bi-nya' framework (Williams, 2018), the 'Na-gah mo Waasbishkizi Ojijaak Bimise Keetwaatino: Singing White Crane Flying North' bundle (Rowe & Kirkpatrick, 2018) and 'Te Korekoreka' (Tokona te Raki, 2021).

This is a tool both for Indigenous communities, to develop their own unique frameworks, and for the evaluation field, to consider how it is working towards, or hindering, an environment in which Indigenous evaluation approaches can thrive.
Thursday September 19, 2024 1:30pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Scaling Impact: How Should We Evaluate the Success of a Scaling Journey?
Thursday September 19, 2024 1:30pm - 2:30pm AEST
106
Authors: John Gargani (Gargani + Co)

The world has never faced larger problems - climate change, refugee crises, and COVID-19, to name just three. And organizations have responded by scaling solutions to unprecedented size - sustainable development goals, global refugee policies, and universal vaccination programs. But scaling is a journey to a destination imperfectly imagined at the outset and difficult to recognize upon arrival. At what point is scaling a program, policy, or product successful? Under what conditions should scaling stop? Or "descaling" begin? Robert McLean and I posed these and other questions to innovators in the Global South and shared what we learned in our recent book Scaling Impact: Innovation for the Public Good. In this session, we outline the book's four research-based scaling principles - justification, optimal scale, coordination, and dynamic evaluation. Then we discuss how to (1) define success as achieving impact at optimal scale, (2) choose a scaling strategy best suited to achieve success, and (3) judge success with dynamic evaluation. My presentation goes beyond the book, reflecting our most current thinking and research, and I provide participants with access to free resources, including electronic copies of the book.
Speakers

John Gargani

President (former President of the American Evaluation Association), Gargani + Company
Dr John Gargani is an evaluator with 30 years of experience and eclectic interests. He is President of the evaluation consulting firm Gargani + Company, served as President of the American Evaluation Association in 2016, coauthored the book Scaling Impact: Innovation for the Public... Read More →
Thursday September 19, 2024 1:30pm - 2:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

A tool for addressing violence against women: An examination of the creation, benefits, and drawbacks of the Evidence Portal
Thursday September 19, 2024 1:30pm - 2:30pm AEST
103
Authors: Charlotte Bell (Australia's National Research Organisation for Women's Safety (ANROWS)), Lorelei Hine (ANROWS, AU), Elizabeth Watt (ANROWS, AU), Rhiannon Smith (ANROWS, AU)

The first of its kind in Australia, the Evidence Portal is an innovative tool that captures and assesses impact evaluations of interventions from high-income countries that aim to address and end violence against women.

While we know high-quality evaluation evidence is an important component in informing and influencing policy and practice, decision-makers face a variety of potential barriers in accessing this evidence. By providing a curated repository of existing research, evidence portals can support policymakers, practitioners, and evaluators in their decision-making.

Our Evidence Portal consolidates and synthesises impact evaluation evidence via: (1) Evidence and Gap Maps, which provide a big-picture, visual overview of interventions; and (2) Intervention Reviews, which provide a succinct, standardised assessment of interventions in accessible language. Underpinned by a rigorous systematic review methodology, this tool seeks to:
  • Identify existing impact evaluations and gaps in the evidence base, and
  • Promote a collective understanding of the nature and effectiveness of interventions that aim to address violence against women.

Key points: This presentation will showcase the creation, benefits, and drawbacks of the Evidence Portal, with a focused discussion on the following areas:
  • What are evidence portals and how are they used to inform policy and practice?
  • Why and how was this evidence portal created?
  • What are the challenges in creating this tool and the learnings to date?
  • What other 'ways of knowing' should be considered?

This presentation begins with an in-depth exploration of the Evidence Portal and the important methodological decisions taken to build this tool. It then offers a reflection on our journey of creating this tool with a focus on significant learnings to date. You will gain an understanding of the Evidence Portal and key considerations for future evaluations of violence against women interventions.
Thursday September 19, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

The Art of Qualitative Sensemaking: Exploring New Methods
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105
Authors: Alli Burness (Tetra Tech), Sharon Marra-Brown (ARTD, AU), Matt Healey (First Person Consulting, AU), Monica Wabuke (Tetra Tech, FJ)

Sensemaking is the process of making meaning and distilling the signal from the noise in primary research. Inclusive and transparent sensemaking ensures the critical link is maintained between evidence and insights, that evidence is interpreted correctly, and that the views of participants are understood. Using intentional sensemaking approaches with integrity can ensure transparency and logical rigour in an evaluation or research project.

Despite its critical nature, sensemaking can often be the most opaque step in an evaluation process. While replication is a hallmark of good sensemaking, especially in academia, this is not always feasible in the fast-paced world of evaluation. The time required to do sensemaking well, the importance of applying the correct approaches and engaging the correct parties, and the critical role of a lead facilitator can be overlooked or underestimated. By shining a spotlight on this step in an evaluation, this session will highlight inclusive and accessible sensemaking approaches used across the design and evaluation spectrum to identify new or emergent approaches. It will pay particular attention to sensemaking when working in complex systems.

Panellists bring deep experience in evaluation or design research in Australian or international consulting settings. They will touch on what sensemaking approaches can be used to maintain integrity through a rapid or agile sensemaking process common in large or complex evaluations; popular sensemaking processes for coding data and new or emerging methods; and how insights or recommendations emerge from the sensemaking process. The moderator will start the panel by reflecting on the definitions, understanding and application of sensemaking, with an emphasis on inclusive and accessible aspects. Our presenters will then explore methods through this same lens and with a focus on emergent or new approaches. Methods will be presented so that audience members can learn and apply them.
Speakers

Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have:Co-founded and grown an organisation (First Person Consulting) to a team of 16 people working... Read More →

Sharon Marra-Brown

Senior manager, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence.Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade plus in the public service or how I weave... Read More →

Monica Wabuke

Associate Director - Research, Monitoring and Evaluation Practice, Tetra Tech International Development - Asia Pacific
Monica Wabuke is an Associate Director within Tetra Tech’s Research, Monitoring and Evaluation Practice (RME). She brings 14 years of experience in design, monitoring and evaluation and has provided technical support to DFAT, MFAT, EU, USAID and World Bank-funded projects and programs... Read More →
Thursday September 19, 2024 1:30pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Perspectives on the Appropriate Use of RCTs in Evaluation?
Thursday September 19, 2024 1:30pm - 3:00pm AEST
Authors: Moderator: Prof Rick Cummings, Emeritus Professor, Murdoch University, AES Fellow
Key Speaker: Eleanor Williams, Managing Director, Australian Centre for Evaluation
Panelists: Prof Lisa Cameron, Professional Research Fellow, Melbourne Institute of Applied Economic and Social Research, University of Melbourne
Dr Wendy Jarvie, Adjunct Professor, Public Service Research Group, University of NSW
Bruce Cunningham, Assistant Secretary, Employment Evaluation Branch, Commonwealth Department of Employment and Workplace Relations
Commentators: Prof Patricia Rogers, Former Professor of Public Sector Evaluation, RMIT University, AES Fellow
Scott Bayley, Principal, Scott Bayley Evaluation Service, AES Fellow


The Commonwealth Government has established the Australian Centre for Evaluation (ACE) to put evaluation evidence at the heart of policy design and decision-making by improving the volume, quality, and use of evaluation evidence to support better policy and programs that improve the lives of Australians. This aligns well with the aim of the AES to improve the theory, practice and use of evaluation for people involved in evaluation. The creation of ACE provides an excellent opportunity for the AES and its members to work with a government agency on our common purposes. This collaboration has already commenced through shared activities and, in particular, the involvement of the responsible Minister, Dr Andrew Leigh, as a keynote speaker at the 2023 AES Conference.

An area that has attracted considerable attention is the mandate for ACE to include randomised controlled trials (RCTs) in at least some of its evaluation studies of Commonwealth programs. This issue was the central topic of Minister Leigh's keynote address and created considerable debate and discussion at the conference. This demonstrates that this is a topic of importance for the AES and its members.

The aim of the session is to explore the appropriate use of RCTs in evaluation studies of public policy in Australia. The strategy is to commence a communication process on this key topic between ACE and the evaluation community as represented by the AES. Ideally, this will lead to collaboration between ACE and the AES to improve evaluation practice in Australia.

The Fellows Forum session will commence with a prepared presentation by a senior staff member of ACE explaining its mandate and outlining its approach to including RCTs in evaluation studies. This will be followed by a panel of evaluators who have experience with RCTs to explain how they included RCTs in an evaluation study or where they chose not to include an RCT and the reasons why. They will also explore what they learned from this experience to inform their future evaluation practice. Finally, one or two Fellows will act as discussants, responding to the previous presentations with their thoughts on this issue. The session will be moderated by a Fellow and there will be time for audience members to ask questions of the panel members and discussants.

Chair

Rick Cummings

Emeritus Professor, Murdoch University
Rick Cummings is an Emeritus Professor in Public Policy at Murdoch University. He has 40 years of experience conducting evaluation studies in education, training, health, and crime prevention primarily for the state and commonwealth government agencies and the World Bank. He currently... Read More →
Speakers

Eleanor Williams

Managing Director, Australian Centre for Evaluation
Eleanor Williams is a public policy, research and evaluation professional with 20 years' experience working with the public sector. She is the Managing Director of the Australian Centre for Evaluation and established the Australian Public Sector Evaluation Network in 2019. Eleanor... Read More →

Patricia Rogers

Evaluator and researcher, Footprint Evaluation Initiative
Founder of BetterEvaluation and former Professor of Public Sector Evaluation at RMIT University. Now working as consultant and advisor. My work has focused on supporting appropriate choice and use of evaluation methods and approaches to suit purposes and context. I am currently working... Read More →

Scott Bayley

Managing Director, Scott Bayley Evaluation Services
Scott Bayley manages his own evaluation consultancy business and holds a MA in Public Policy majoring in evaluation and social measurement. He has over 25 years of experience in evaluation and is a Fellow of the Australian Evaluation Society. Prior to having his own consultancy Scott... Read More →
Thursday September 19, 2024 1:30pm - 3:00pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Harnessing AI for Qualitative Data Analysis
Thursday September 19, 2024 2:00pm - 2:30pm AEST
Authors: Ethel Karskens (Clear Horizon)

This presentation covers the strategic integration of Artificial Intelligence (AI) methodologies for qualitative data analysis in evaluation processes. The increasing demand for sophisticated analytical tools necessitates a deep dive into AI's transformative potential in streamlining qualitative analysis. Through practical demonstrations and case studies, this session showcases how AI technologies can efficiently tackle the complexities of analysing qualitative data. Attendees will acquire actionable insights into leveraging AI to augment the efficiency and accuracy of qualitative analysis, empowering them to navigate the evolving landscape of evaluation methodologies.

Additionally, the presentation conducts a comprehensive comparative analysis of major AI models available in the market. By delineating their unique strengths and functionalities, participants will gain invaluable discernment in selecting the most appropriate AI model tailored to their evaluation objectives.

Moreover, the session delves into robust quality assurance (QA) strategies for validating AI-generated outputs, emphasising the essential role of evaluators as integral stakeholders in the analysis process. Attendees will explore techniques for seamlessly integrating human expertise with AI capabilities to refine and validate analysis outcomes. We will also explore how to do this in a way that respects common data privacy laws and policies. By fostering a symbiotic relationship between AI technologies and human evaluators, this presentation underscores the importance of collaborative synergy in optimising evaluation efficacy.

In conclusion, this presentation offers a comprehensive exploration of the transformative potential of AI in qualitative data analysis within evaluation contexts. Attendees will depart equipped with actionable strategies and insights to harness AI's power effectively, elevating the quality and efficiency of their evaluation processes to new heights.
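
As one concrete illustration of the human-in-the-loop QA the session describes, the Python sketch below samples AI-coded text segments for blind human review and tracks an agreement rate before the machine's analysis is accepted. This is a hypothetical workflow, not Clear Horizon's actual tooling; the data, the 20% sample fraction and the 0.8 quality gate are all assumptions.

import random

# Hypothetical AI-coded segments; real projects would have hundreds.
ai_coded = [
    {"segment": "The service felt welcoming.", "ai_code": "access"},
    {"segment": "Waitlists were far too long.", "ai_code": "timeliness"},
    {"segment": "Staff explained my options clearly.", "ai_code": "communication"},
]

def sample_for_review(records, fraction=0.2, seed=42):
    # Draw a reproducible random sample of AI-coded segments for human QA.
    rng = random.Random(seed)
    k = max(1, int(len(records) * fraction))
    return rng.sample(records, k)

def agreement_rate(reviewed):
    # Share of sampled segments where the human confirmed the AI's code.
    confirmed = sum(1 for r in reviewed if r["human_code"] == r["ai_code"])
    return confirmed / len(reviewed)

sample = sample_for_review(ai_coded)
for record in sample:
    # In practice an analyst codes each sampled segment blind;
    # here we stub their judgement so the sketch runs end to end.
    record["human_code"] = record["ai_code"]

rate = agreement_rate(sample)
print(f"Human-AI agreement on sample: {rate:.0%}")
if rate < 0.8:  # hypothetical quality gate
    print("Below threshold: revise prompts or fall back to manual coding.")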
Speakers

Ethel Karskens

Data and Insights Lead, Clear Horizon
I lead the data and insights strategy of Clear Horizon. This includes dashboard development and other data solutions to create insights for our clients.I am interested in innovation, data for good, and creating a data-driven culture in organisations.
Thursday September 19, 2024 2:00pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Warlpiri ways of assessing impact - How an Aboriginal community is defining, assessing and taking action for a good life in their community.
Thursday September 19, 2024 2:30pm - 3:00pm AEST
104
Authors: Emily Lapinski (Central Land Council), Malkirdi Napaljarri Rose (Centre For Human Security and Social Change, La Trobe University, AU), Glenda Napaljarri Wayne (Central Land Council, AU), Geoffrey Jungarrayi Barnes (Central Land Council, AU), Alex Gyles (Centre For Human Security and Social Change, La Trobe University, AU)

For evaluation to support transformational change, research suggests strategies must focus on localised Indigenous values, beliefs and worldviews. Decolonising evaluation involves identifying and addressing power and considering what is being evaluated, by whom and how. In this paper we argue that these developments are necessary but insufficient and suggest a possible way forward for further decolonising the field of evaluation. To support change for Indigenous Australians the emphasis needs to move from simple evaluation of individual programs to more critical examination of their combined impact on communities from local perspectives.

This paper explores how Warlpiri and non-Indigenous allies are collaborating to create and use their own community-level impact assessment tool. The 5-year Good Community Life Project is supporting Warlpiri residents of Lajamanu in the Northern Territory to define, assess and take action for a 'good community life'. Warlpiri will explain how they created the approach for assessing wellbeing in Lajamanu, and how they are using emerging results to give voice to their interests and advocate for the life they envision for future generations.

The project involves collaboration between Warlpiri community members, land council staff and university researchers, drawing on Indigenous concepts of 'two-way' seeing and working, relationality, and centring Indigenous voice and values. Applying these concepts in practice is challenging, particularly for non-Indigenous allies who must constantly reflect and use their privilege to challenge traditional views on 'robust' evaluation methodology.

Warlpiri and the land council see potential for this work to improve life in Lajamanu and as an approach that could be applied across Central Australian communities. Going beyond co-designed and participatory evaluation to critical examination of impact is the next step in supporting change with Indigenous communities. This paper will focus on Warlpiri perspectives, plus brief reflections from non-Indigenous allies, with time for the audience to discuss broader implications.
Speakers

Emily Lapinski

Monitoring, Evaluation and Learning Coordinator, Central Land Council

Alex Gyles

Research Fellow - Monitoring and Evaluation, Institute for Human Security and Social Change, La Trobe University
Alex Gyles is a Research Fellow working in Monitoring, Evaluation and Learning (MEL) at the Institute for Human Security and Social Change, La Trobe University. He works closely with Marlkirdi Rose Napaljarri on the YWPP project and finds fieldwork with the YWPP team an exciting learning... Read More →
Thursday September 19, 2024 2:30pm - 3:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

A long road ahead: Evaluating long-term change in complex policy areas. A case study of school active travel programs in the ACT
Thursday September 19, 2024 2:30pm - 3:00pm AEST
106
Authors: Mallory Notting (First Person Consulting)

The ACT Government implemented a suite of programs over the ten-year period between 2012 and 2022, aiming to increase the rate of students actively travelling to and from school. During this time, 102 ACT schools participated in at least one of the three programs, which targeted well-known barriers to active travel, including parental perceptions of safety and infrastructure around schools. The programs were intended to contribute towards a range of broader priorities, including health, safety, and environmental outcomes.

This short-paper session will share learnings from evaluating long-term behaviour change at a population level, based on the school active travel evaluation. The evaluation represents a unique case study, as the evaluators needed to look retrospectively over ten years of program delivery and assess whether the combination of programs had created changes within the system and had resulted in the achievement of wider goals.

The presenter will illustrate that the line between short-term and long-term outcomes is rarely linear or clear, as is the relationship between individual interventions and whole of system change. This will be done by summarising the approach taken for the evaluation and sharing the diversity of information collated for analysis, which included individual program data and attitudinal and infrastructure-level data spanning the whole school environment.

Evaluators are often only able to examine the shorter-term outcomes of an intervention, even in complex policy areas, and then rely on a theory of change to illustrate the assumed intended wider impacts. The presenter was able to scrutinise these wider impacts during the active travel evaluation, an opportunity not regularly afforded to evaluators. The lessons from the active travel evaluation are therefore pertinent for other evaluations in complex policy areas and may carry implications for program design as the focus shifts increasingly towards population-level systems change.

Speakers

Mallory Notting

Principal Consultant, First Person Consulting
Mallory is a Principal Consultant at First Person Consulting. She manages and contributes to projects primarily in the area of cultural wellbeing, social inclusion, mental health, and public health and health promotion. In 2023, Mallory was the recipient of the Australian Evaluation... Read More →
Thursday September 19, 2024 2:30pm - 3:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Our new ways: Reforming our approach to impact measurement and learning
Thursday September 19, 2024 2:30pm - 3:00pm AEST
105
Authors: Kaitlyn Scannell (Minderoo Foundation), Adriaan Wolvaardt (Minderoo Foundation, AU), Nicola Johnstone (Minderoo Foundation, AU), Kirsty Kirkwood (Minderoo Foundation, AU)

We have been on a journey to bring awareness, evidence and understanding to the impact of our organisation since inception, and in earnest since 2016. For years, we felt the tension of trying to solve complex problems with measurement and learning approaches that are better suited to solving simple problems.

To change the world, we must first change ourselves. In early 2023 we had the extraordinary opportunity to completely reimagine our approach to impact measurement and learning. What we sought was an approach to measurement and learning that could thrive in complexity, rather than merely tolerate it, or worse, resist it.
We are not alone in our pursuit. Across government and the for-purpose sector, practitioners are exploring and discovering how to measure, learn, manage, and lead in complexity. Those who explore often discover that the first step they need to take is to encourage the repatterning of their own organisational system. A system which, in the words of Donella Meadows, "naturally resists its own transformation."

In this presentation we will delve into two themes that have emerged from our journey so far:
  • Transforming ourselves - We will explore what it takes to embed a systems-led approach to measurement, evaluation and learning in an organisation.
  • Sharing knowledge - We will discuss methods for generating, sharing, and storing knowledge about what works for measuring, evaluating, and learning in complexity.

The purpose of this session is to share what we have learnt with anyone who is grappling with how their organisation might measure and learn in complexity. We have been touched by the generosity of those who have accompanied us on our journey, sharing their experiences and wisdom. This presentation marks our initial effort to pay that generosity forward.
Thursday September 19, 2024 2:30pm - 3:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

An update on practical applications of machine learning in evaluation
Thursday September 19, 2024 2:30pm - 3:00pm AEST
Authors: Gerard Atkinson (ARTD Consultants)

Last year saw the rise of large language models, with names like ChatGPT and Bard becoming part of common discussion. The evaluation community was not immune to this trend, and papers were published that looked at just how well machine learning approaches could do against human evaluators on topics such as qualitative analysis and evaluative judgement. The answer? Not as well as you would think (but you could get wrong answers faster than ever!)

But the designers of these models took on the feedback and created newer and more sophisticated tools. In addition, there have been innovations in hybrid models which combine the best features of different methods while minimising their weaknesses. Coupled with this is the growing field of standalone models that can be run on a desktop computer yet produce responses that match or exceed those of cloud-based models, and models that can draw on rich contextual information (such as documentation or full interview transcripts) to make decisions.

This presentation provides an update on the state of machine learning in 2024 and presents new findings in relation to the performance of machine learning models on tasks including topic classification and rubric analysis.
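
For readers wanting to try a standalone, locally run model on a task like topic classification, a minimal sketch using the Hugging Face transformers zero-shot pipeline follows. This is an assumed setup for illustration, not the approach benchmarked in the presentation; the response text and topic labels are hypothetical, and the model weights download on first use.

from transformers import pipeline

# Open-weights NLI model run locally; no cloud API required.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

response = "The program helped me find stable housing and reconnect with family."
topics = ["housing", "employment", "family relationships", "health"]  # hypothetical coding frame

# multi_label=True scores each topic independently, since a response
# can speak to more than one theme at once.
result = classifier(response, candidate_labels=topics, multi_label=True)
for label, score in zip(result["labels"], result["scores"]):
    print(f"{label}: {score:.2f}")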


Speakers

Gerard Atkinson

Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in:- program and policy evaluation- workshop and community facilitation- machine learning and AI- market and social research- financial and operational modelling- non-profit, government and business strategyI am also a board member... Read More →
Thursday September 19, 2024 2:30pm - 3:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Where next? Evaluation to transformation
Thursday September 19, 2024 2:30pm - 3:00pm AEST
103
Authors: Rachel Wilks (Grosvenor), Kristy Hornby (Grosvenor), Sarika Bhana (Grosvenor)

What is evaluation? BetterEvaluation defines it as "any systematic process to judge merit, worth or significance by combining evidence and values". Many government organisations, and some private and not-for-profit entities, use evaluations as an auditing tool to measure how well their programs are delivering against intended outcomes and impacts and achieving value for money. This lends itself to viewing evaluation as an audit or 'tick-box' exercise when it is really measuring the delivery of an organisation's mandate or strategy (or part thereof). Viewing evaluation more as an audit than as a core part of continuous improvement presents a risk of our reports collecting dust.

During this session, we will discuss factors that build a continuous improvement mindset across evaluation teams, as well as across the broader organisation. This will include exploring how to balance providing independent advice with offering practical solutions that program owners and other decision-makers can implement readily, as well as how to obtain greater buy-in to evaluation practice. We present the features that evaluations should have to ensure findings and conclusions can be easily translated into clear actions for improvement.

We contend that it is important to consider evaluation within the broader organisational context, considering where this might link to strategy or how it may be utilised to provide evidence to support funding bids. This understanding will help to ensure evaluations are designed and delivered in a way that best supports the wider organisation.

We end by sharing our post-evaluation playbook - a practical tool to help take your evaluations from pesky paperweight to purposeful pathway.

Speakers
Rachel Wilks
Senior Consultant, Grosvenor
Rachel is a management consultant and an emerging evaluator at Grosvenor. She took her first steps into the evaluation world a little over a year ago, and since then has been increasingly interested in how evaluation can be used in and across the public sector. Rachel is passionate...
Thursday September 19, 2024 2:30pm - 3:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia
  Tools

3:30pm AEST

Constructing a Wisdom Base: A Hands-On Exploration of First Nations Knowledge Systems
Thursday September 19, 2024 3:30pm - 4:30pm AEST
106
Authors: Skye Trudgett (Kowa), Haley Ferguson (Kowa, AU), Tara Beattie (Kowa, AU), Levi McKenzie-Kirkbright (Kowa, AU), Jess Dart (Clear Horizon, AU)

In the pursuit of understanding and honouring the depth of First Nations wisdom, this hands-on session at the AES conference introduces the Ancestral Knowledge Tapestry - a living guide for developing a repository of ancestral knowledge, practices, and philosophies. Participants will actively engage in co-creating a 'Wisdom Base,' a collective endeavour to encapsulate the richness of old and new First Nations knowledges and their application to contemporary evaluative practices.

Through interactive exercises, collaborative dialogue, and reflective practices, attendees will delve into the components of the Ancestral Knowledge Tapestry, exploring the symbiosis between deep knowing, artefacts, deep listening and truth-telling. The session aims to empower participants, particularly those from First Nations communities, to identify, document, and share their unique wisdom in ways that foster self-determination and cultural continuity.

Attendees will emerge from this workshop with a deeper appreciation for the intrinsic value of First Nations knowledge systems, along with practical insights into how to cultivate a Wisdom Base that not only preserves but actively revitalises First Nations wisdom for future generations.

Speakers
Skye Trudgett
CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application...
Levi McKenzie-Kirkbright
Software Engineer, Kowa Collaboration
Software engineer at Kowa investigating how to implement Indigenous data sovereignty principles into software systems.
Jess Dart
Chief Evaluator and Founder, Clear Horizon Consulting
Dr Jess Dart is the founder and Chief Evaluator of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australian Evaluation Society (AES), Jess is a recognised leader with over 25 years of...
Thursday September 19, 2024 3:30pm - 4:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Charting the Course: Measuring Organisational Evaluation Capacity Building
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Rochelle Tobin (Curtin University)

Measuring evaluation capacity building is complex, and there are few examples of quantitative measurement tools to enable evaluators to chart progress. WAAC (WA AIDS Council) and Curtin established a five-year partnership to build evaluation capacity within WAAC. To measure progress, a validated tool (Schwarzman et al. 2019) for assessing organisational evaluation capacity was modified and combined with another partnership-based tool (Tobin et al. in press). The survey was administered to WAAC staff at baseline (n = 17) and then one year after the partnership was established (n = 19). Significant improvements were seen in individual skills for evaluation tasks, tools for evaluation, and evaluation systems and structures. These tools provide a rigorous approach to tracking progress towards organisational evaluation capacity.
Speakers
Rochelle Tobin
PhD candidate
I am a PhD candidate investigating SiREN's (Sexual Health and Blood-borne Virus Research and Evaluation Network) influence on research and evaluation practices in the Western Australian sexual health and blood-borne virus sector. I also support SiREN's knowledge translation activities...
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Frank conversations: A direct path to evaluations that lead to change
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Author: Hadeel Al-Nawab (Allen and Clarke Consulting)

We all want evaluations that deliver on their purpose, but 'messiness' can emerge as the evaluation progresses. It can be hard to distinguish stakeholder needs from stakeholder wants, and the evaluation purpose can get lost as unanticipated project changes arise.

Frank conversations can offer a way to navigate through these challenges. This presentation uses a 2022 evaluation of a health information platform as an example to provide tips on getting comfortable with uncomfortable questions, answers, and hard truths to get the evaluation back on track.


Speakers
Dee Al-Nawab
Senior Consultant, Allen and Clarke Consulting
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Journey Mapping: Visualising Competing Needs within Evaluations
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Jolenna Deo (Allen and Clarke Consulting)

Journey mapping acts as a GPS for grasping audience or consumer experience when evaluating policies or programs, highlighting twists, hidden gems, and pitfalls. It can be a useful tool to help evaluators capture disparities and competing needs among intended demographics. This session will discuss the journey mapping method, drawing from an evaluation of a Community Capacity Building Program which used journey mapping to illustrate key consumer personas. It will explore the integration of multiple data sources to provide a comprehensive understanding of complex disparities and the cultural and historical contexts in which these arise.
Speakers
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Reflections by a novice on the use of a powerful linked-data simulation model (the Victorian Social Investment Model; VicSIM) in evaluation.
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Gabby Lindsay-Smith (Victorian Department of Health)

Using linked government data sets provides an opportunity to investigate the impact of state-wide programs and policies, but these data are often out of reach for many evaluators, especially non-analysts. This presentation will detail a non-analyst's experience using the Victorian Social Investment Model (VicSIM) in a recent evaluation of a Victoria-wide family services program. The presentation will outline tips and tricks for those who may consider incorporating government-level linked data or simulation models into large program or policy evaluations in the future. It will cover questions such as: where to begin, how to navigate the data, and 'now what?'
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

The evolution of evaluation: Retracing our steps in evaluation theory to prepare for the future
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: James Ong (University of Melbourne)

As new people enter the evaluation field and as evaluation marches into the future, it is important to learn from the evaluation theorists who have come before us. My Ignite presentation will argue that modern evaluation is built on evaluation theory, and will call on evaluators of all levels to learn evaluation theory to:
  1. Appreciate how evaluation has evolved;
  2. Strengthen their evaluation practice; and
  3. Navigate themselves around an ever-changing evaluation landscape.
Speakers
James Ong
Research Assistant (Evaluations), University of Melbourne
My name is James Ong. I am an Autistic program evaluator working at the University of Melbourne, where I work on evaluation and implementation projects in various public health projects such as the AusPathoGen program and the SPARK initiative. I not only have a strong theoretical...
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

There and back again: a digital journey from outcomes to dashboards (and back!)
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Nikki Sloan (Evaluator)

Over the past few years we have seen an explosion in the opportunities presented by digital products, including AI and dashboards. However, for someone who is not a digital and data person, this can be an intimidating space, one that often seems to miss the most important part - the people!

This ignite session will draw on my experience working as a go-between and bridge between two different and highly technical niches - evaluation and information technology. The session will present reflections, lessons, and most importantly practical takeaways that audience members can use in their own roles.
Speakers
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

From KPIs to systems change: Reimagining organisational learning
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Authors: Katrina Barnes (Clear Horizon), Irene Guijt (Oxfam Great Britain, GB), Chipo Peggah (Oxfam Great Britain, ZW)

Traditional measures of success for international non-governmental organisations (INGOs) have been based on Western (and often colonial) theories of change, predefined metrics and ways of knowing that rarely fit local realities and interests. Projectised, pre-determined understandings of change limit honest reflection on larger transformative change and inhibit meaningful learning and adaptation.

INGOs globally are being challenged to decolonise their knowledge and evaluation processes. Over the past 18 months, Oxfam Great Britain has undergone a journey to redesign how we understand impact, to rebalance and reframe accountability, and to strengthen learning. This new approach focuses on collective storytelling, sensemaking and regular reflection on practice. We are taking a theory-led approach to make meaning out of signals that systems are shifting across a portfolio of work. Drawing on a bricolage of evaluation methodologies (Outcome Harvesting-lite, meta-evaluation and synthesis, evaluative rubrics, and impact evaluations), we are slowly building up a picture over time across the organisation to tell a story of systemic change. We have seen how meaningful and honest evidence and learning processes have enabled a stronger culture of learning.

Although we are far from the end of this journey, we have learnt some critical lessons and face ongoing challenges. We are not the only ones: many foundations, funders and philanthropic organisations are going through similar processes as they increasingly try to understand their contribution to systems change. These conversations are therefore imperative for the field of evaluation as organisations navigate new ways to 'evaluate' their own work.

In this presentation, we will start the discussion by sharing Oxfam Great Britain's journey, including key challenges faced and lessons learnt. We will then invite a Q&A conversation to harvest insights from others also seeking to reimagine organisational learning that is grounded in decolonising knowledge processes and seeking to understand systems change.
Speakers
Thursday September 19, 2024 3:30pm - 4:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Growing Australia's future evaluators: Lessons from emerging evaluator networks across the Asia Pacific
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Authors: Amanda Mottershead (Tetra Tech International Development), Qudratullah Jahid (Oxford Policy Management Australia, AU), Eroni Wavu (Pacific Community, FJ)

The sustainability of the evaluation sector requires emerging evaluators to be supported in pursuing high-quality practice. What this support needs to be and how it should be developed is much less certain. What topics should we focus on? How should we deliver it? Who should we deliver it to? How can the broader evaluation community support emerging evaluators?

Global experiences in supporting emerging evaluators contain a treasure trove of lessons that can fill this knowledge gap and inform effective support here in Australia. Experience shows that fostering a strong evaluation community that includes emerging evaluators can nurture, ignite and shape future evaluation practices. A variety of approaches are being adopted across the region and the globe to foster this sense of community, ranging from formal capacity building to more informal experience sharing.

In this session, we bring together current and former emerging evaluator leaders from across the Asia Pacific region to answer some of these questions and understand what approaches could work best for the Australian context. This will include presentations and discussion on in-demand topics, how to formulate support, how to target emerging evaluators and the best means of delivery. The session will be highly interactive, engaging the audience in a question-and-answer forum on this important topic. All panel members have been engaged with emerging evaluator networks in their countries or regions and bring diverse experiences to facilitate cross learning. The session will provide practical ways forward for the broader evaluation community to grow and support the future of evaluation.
Speakers
Qudratullah Jahid
Evaluator/MEL Specialist, Oxford Policy Management
Fulbright scholar. Over a decade in evaluation. Focusing on developing countries.
Amanda Mottershead
Consultant - Research, Monitoring and Evaluation, Tetra Tech International Development
I enjoy the breadth of evaluation in international development. I've had the opportunity to work across sectors including economic development, infrastructure, energy, education and inclusion. I enjoy generating evidence that promotes improvements to organisations, policies and programs...
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Committed to mentoring
Thursday September 19, 2024 3:30pm - 4:30pm AEST
103
Authors: Julie Elliott (Independent Evaluator), Jill Thomas (J.A Thomas & Associates, AU), Martina Donkers (Independent Evaluator, AU)

Mentors and mentees from the AES Group Mentoring Program share rich experiences of group learning, knowledge sharing, and reflective practice, exploring the Wayfinding skills, knowledge, and expertise they have found through the program and the valuable lessons learned.

AES remains committed to mentoring, and this session provides a unique opportunity to hear perspectives from across the mentoring spectrum, from Fellows to emerging evaluators, and the ways that sharing our professional practice enhances our work. Since 2021, the AES Group Mentoring Program has been a trailblazer in fostering professional growth and competencies for emerging and mid-career evaluators, enabling mentors and peers to help one another navigate unfamiliar territory with various tools and strategies.

Our dynamic panel will discuss how evaluators have adapted their approaches to mentoring and to evaluation practice with the support of the program. It's a session where personal and professional growth intersect and will offer a unique perspective on the transformative power of mentorship.

This discussion is for evaluators who are passionate about learning - both their own and that of other AES members! Whether you're a seasoned professional eager to contribute to your community, an emerging talent or a mid-career evaluator navigating contemporary evaluation ecosystems, this session is for you. Don't miss this opportunity to hear directly from mentors and mentees who value the shared, continuous journey of social learning and adaptation.




Speakers
Julie Elliott
Evaluator
Collaborator and evaluation scholar-practitioner committed to acknowledging the complexity inherent in all human settings.
Rick Cummings
Emeritus Professor, Murdoch University
Rick Cummings is an Emeritus Professor in Public Policy at Murdoch University. He has 40 years of experience conducting evaluation studies in education, training, health, and crime prevention, primarily for state and commonwealth government agencies and the World Bank. He currently...
Martina Donkers
Independent Evaluator
I'm an independent evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured interviews...
Lydia Phillips
Principal Consultant, Lydia Phillips Consulting
I operate an independent consulting practice, providing evaluation and social policy services to community organisations and government. With a background in law and social policy, I have more than 15 years' experience building and using evidence in order to create positive social...
Thursday September 19, 2024 3:30pm - 4:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

The learning journey: competency self-assessment for personal learning and professional development
Thursday September 19, 2024 3:30pm - 4:30pm AEST
105
Authors: Amy Gullickson (University of Melbourne), Taimur Siddiqi (Victorian Legal Services, AU)

The AES, in collaboration with learnevaluation.org, offers a competency self-assessment to members. The aim is to help individuals understand their strengths and plan their learning journey, to help the AES continue to tailor its professional development offerings and develop pathways to professionalisation, and to contribute to ongoing research about evaluation learners. In this session, members of the AES Pathways Committee will briefly summarise the findings from the self-assessment and then invite participants into groups by discipline and sector to discuss: Which competencies are really core, and why? Reporting back from the groups will reveal whether the core competencies differ based on the sectors and backgrounds of the evaluators. The follow-up discussion will then explore: What do the findings mean for evaluation practice, and teaching and learning? How do they relate to professionalisation? If we want to increase clarity about what good evaluation practice looks like, what are our next steps related to the competencies?

Participants will benefit from reflecting on their own competency self-assessment in relation to the findings and discussion, and from discovering how the backgrounds of learners influence their ideas about core competencies. The session findings will be shared with the AES Pathways Committee to inform the AES' next steps for the competencies, the self-assessment, and the ongoing discussion of pathways to professionalisation.

Speakers
Amy Gullickson
Associate Professor, Director, Centre for Program Evaluation, The University of Melbourne
Associate Professor Amy Gullickson is the Director of the University of Melbourne Centre for Program Evaluation, which has been delivering evaluation and research services, thought leadership, and qualifications for more than 30 years. She is also a co-founder and current chair of...
Taimur Siddiqi
Evaluation Manager, Victorian Legal Services Board + Commissioner
Taimur is an experienced evaluation and impact measurement professional who is currently the evaluation manager at the Victorian Legal Services Board + Commissioner and a member of the AES Board Pathways Committee. He is also a freelance evaluation consultant and was previously the...
Thursday September 19, 2024 3:30pm - 4:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Friday, September 20
 

9:00am AEST

Plenary: Indy Johar keynote address
Friday September 20, 2024 9:00am - 10:00am AEST
Indy Johar, RIBA registered architect, serial social entrepreneur, and Good Growth Advisor to the Mayor of London, UK

Abstract to follow.
Speakers
Indy Johar
RIBA registered architect, serial social entrepreneur, and Good Growth Advisor to the Mayor of London, UK
Indy Johar is an RIBA registered architect, serial social entrepreneur, and Good Growth Advisor to the Mayor of London. Indy was born in Acton, West London & is a lifelong Londoner. He is focused on the strategic design of new super scale civic assets for transition – specifically...
Friday September 20, 2024 9:00am - 10:00am AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Involving children and young people in evaluations: Equity through active participation
Friday September 20, 2024 10:30am - 11:00am AEST
Authors: Sharon Marra-Brown (ARTD Consultants), Moya Johansson (ARTD Consultants, AU)

Think it's important to enable children and young people to have a voice in evaluations, but find it challenging? This paper presents tried and tested strategies for ensuring ethical engagement with children and young people and encouraging meaningful participation.

Involving children and young people in evaluation is critical to ensure that we arrive at evaluations that accurately reflect their experiences and capture the outcomes they consider most important. Children and young people have the right to have a say about their experiences, and evaluations that avoid their involvement risk perpetuating ongoing inequities.

However, involving children and young people in evaluations can prompt ethical concerns in relation to their comprehension of research, their capacity to provide consent, potential coercion by parents, and potentially conflicting values and interests between parents and children. Depending on the subject, it can also create concerns about safety and readiness.

Based on our experience successfully achieving ethics approval for multiple evaluations of services for children and young people across Australia, which include interviews with children and young people who have accessed these services, we will talk through considerations for ensuring the voice of children and young people in evaluation while safeguarding them from unnecessary risks.

We will then take you through how we've overcome challenges in engaging children and young people in evaluations with innovative, youth-centred solutions, including carefully considering the language we use and how we reach out. We will demonstrate the developmental benefits of meaningful participation for children and young people once ethical considerations have been carefully considered and navigated.

Finally, we will take you through our tips for ensuring meaningful and safe engagement with children and young people, and point you in the direction of guidelines and practice guides for involving young people in research and evaluation in a safe and meaningful way.

The presenters are evaluators with extensive experience in designing, delivering and reporting on evaluations that include data collection with children and young people. This includes recently achieving ethics approval and commencing interviews with children as young as seven who are accessing a suicide aftercare service.

While much attention is devoted to ensuring safe and inclusive data collection with various demographics, specific considerations for engaging children and young people remain relatively uncommon. Recognising the unique needs of this population, coupled with the understandably cautious stance of ethics committees, underscores the necessity for a thoughtful and deliberate approach to evaluations involving children and young people.

Given the additional complexities and ethical considerations involved, the default tendency can be to exclude children and young people from evaluation processes. However, it is important that children and young people are able to have a say in the programs, policies and services that they use. Participation in evaluations can be a positive experience, if risks are managed and the process is designed to be empowering.

This session will provide valuable insights, actionable strategies, and an opportunity for participants to reflect on their own practices, fostering a culture of inclusivity and responsiveness in evaluation.
Speakers
Sharon Marra-Brown
Senior Manager, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence. Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade plus in the public service or how I weave...
Mitchell Rice-Brading
ARTD Consultants
I started with ARTD in early 2022 after completing my Bachelor of Psychological Science (Honours) in 2021. This, in combination with experience as a Psychology research assistant, helped me develop strong research skills, namely the ability to synthesise and critically evaluate qualitative...
Friday September 20, 2024 10:30am - 11:00am AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Following the (matched) data to understand impact: adventures in quasi-experimental evaluation
Friday September 20, 2024 10:30am - 11:00am AEST
Authors: Mohib Iqbal (Department of Education), Kira Duggan (Department of Education, AU), Ben McNally (Department of Education, AU)

This presentation will showcase the use of quasi-experimental impact evaluation and a relatively new data linkage capability within the Victorian public sector.

Impact evaluation provides important evidence on program effectiveness and helps to inform government investment decisions. Quasi-experimental design identifies a comparison group that is similar to the treatment group/program participants in terms of baseline or pre-intervention characteristics. Statistical methods such as propensity score matching and regression discontinuity can create valid comparison groups with a reduced risk of bias (White & Sabarwal, 2014).
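
To make the matching step concrete, here is a minimal propensity score matching sketch in Python. It is not from the presentation; the file name, the "treated" and "outcome" columns, and the covariate list are all hypothetical.

# Minimal propensity score matching sketch (illustrative only).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("program_data.csv")           # hypothetical analysis file
covariates = ["baseline_score", "age", "ses"]  # pre-intervention characteristics

# 1. Estimate propensity scores: probability of treatment given covariates.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

treated = df[df["treated"] == 1]
control = df[df["treated"] == 0]

# 2. Match each treated unit to its nearest control on the propensity score.
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_control = control.iloc[idx.ravel()]

# 3. Estimate the average treatment effect on the treated (ATT).
att = treated["outcome"].mean() - matched_control["outcome"].mean()
print(f"ATT estimate: {att:.3f}")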

However, the implementation of this method faces significant technical, data-availability and other challenges.

The Evaluation and Program Impact (EPI) branch at the Victorian Department of Education (DE) used quasi-experimental assessment in six different education program evaluations, spanning issues from teacher supply to support for vulnerable students. The approach was used both to evaluate impact/effectiveness and, in economic evaluations of interventions, to measure avoided costs. The presentation will outline the design, methodology and implementation of the quasi-experimental methods used across these six evaluations.

Key enablers of the use of quasi-experimental designs are data availability and expertise in undertaking advanced quantitative impact evaluations. This presentation will give an overview of the types of departmental data used (such as regularly administered student, parent/carer, teacher and school leader surveys, assessment results such as NAPLAN and administrative data) as well as the relatively new analytical capability available through linked service use data from the Victorian Social Investment Integrated Data Resource (VSIIDR) and Centre for Victorian Data Linkage (CVDL).
The presentation also contextualises quasi-experimental impact evaluations as one component of mixed-method approaches that were staged after evaluation of appropriateness, design and fidelity. Decisions on intervention effectiveness were made using a broader array of evidence, with quasi-experimental impact evaluation as one of many sources.
Speakers
Kira Duggan
Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development...
Mohib Iqbal
Senior Evaluation Officer, Department of Education
I am a multi-disciplinary evaluator and researcher with 15 years of experience across education, health, international development, social protection, and migration sectors. I currently work for the Department of Education in Victoria and have previously worked with the World Bank...
Ben McNally
I have worked on evaluation and social research projects in consultancy and public sector settings. This has included evaluating reform programs in social services, employment, and school education. Talk to me about:
  • Evaluation practice in the Victorian Public Sector
  • In-house evaluation...
Friday September 20, 2024 10:30am - 11:00am AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

To tinker, tailor, or craft from scratch? The tension in using validated tools in novel contexts
Friday September 20, 2024 10:30am - 11:00am AEST
104
Authors: Matt Healey (First Person Consulting), Alicia McCoy (First Person Consulting, AU), Tenille Moselen (First Person Consulting, AU)

In the dynamic realm of evaluation methodology, the discourse surrounding the use of validated tools versus the customisation of evaluation metrics remains contentious. This session aims to delve into the inherent risks associated with both approaches. These risks are often compounded when those in positions of power prefer validated tools over data collection questions or approaches designed for the context. The tension this elicits is only increasing as evaluators assess digital interventions for which there is no direct tool to draw upon, leaving them to navigate uncharted territory.

Moreover, there is an ever-increasing range of validated tools available, but little direction to assist evaluators - particularly emerging and early-career evaluators - in deciding among them. This session draws on experiences from a range of digital and in-person projects, exploring scenarios where there was no 'obvious solution'. It will be of particular relevance to those undertaking evaluations of digital and novel programs.

Through candid dialogue and shared anecdotes, participants will reflect on their experiences in navigating decisions to adopt, adapt, or reject validated tools, and the learning that resulted. Embracing controversy, this session encourages attendees to challenge conventional wisdom and critically examine the balance between the reliability of validated tools, the importance of fitting data collection to context, and most importantly what 'good' looks like.

Join the conversation as we navigate the complex landscape of evaluation methodology, exploring the tensions between established practices and the pursuit of innovation in evaluation processes.

Speakers
Matt Healey
Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have:
  • Co-founded and grown an organisation (First Person Consulting) to a team of 16 people working...
Tenille Moselen
First Person Consulting
https://www.fpconsulting.com.au/our-team.html
Alicia McCoy
Principal Consultant, First Person Consulting
Alicia has 15 years of experience leading research and evaluation teams in the not-for-profit sector and is passionate about the role that research and evaluation plays in creating lasting change for individuals, families and communities. Alicia’s areas of interest include evaluation...
Friday September 20, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

The Road Home - an evaluation journey to discover and demonstrate a new and exciting way to deliver a crisis housing response.
Friday September 20, 2024 10:30am - 11:30am AEST
105
Authors: Anne Smyth (LDC Group), Lesley Thornton (LDC Group, AU), Kym Coupe (First Step, AU), Caroline Lynch (Launch Housing, AU)

As all Wayfinders would understand, when we embarked on a developmental evaluation of the Road Home, we really had no idea how the program or evaluation would play out in practice. We did know however that the usual way of delivering crisis housing services was not working well for either clients or staff. Something needed to change. We needed to change. So, we did - we being the Road Home team working with the evaluators.

Road Home centres on a strong and engaged multidisciplinary team to deliver mental health, medical, legal and housing services to people in crisis accommodation, where they are, and when they need it the most. This integrated way of working is in stark contrast to the conventional, single discipline outreach and in-reach approaches that characterise service delivery in the community sector - its impact has been significant.

This panel will bring leading representatives of the Road Home team and the evaluators together to explore with our audience what we have learned: what it takes to do this well; the benefits to clients, staff and participating organisations; the pitfalls and challenges; and the value of developmental evaluation and its methods.

We now have a much better idea of what Road Home looks like, what it takes to support and enable it, to achieve valued outcomes and to meaningfully evaluate it. The role of evaluators and the project manager in holding the uncertain and evolving space characteristic of developmental evaluation and wayfinding is central - it has taken clarity, alignment of purpose, a lot of patience and much persistence, not to mention flexibility. It has been and remains quite the journey!
Friday September 20, 2024 10:30am - 11:30am AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Walking together: First Nations participation, partnerships and co-creation in Evaluation.
Friday September 20, 2024 10:30am - 11:30am AEST
106
Authors: Tony Kiessler (First Nations Connect), Alice Tamang (First Nations Connect, AU)

Effective First Nations engagement is integral to the design and delivery of culturally safe evaluations. The AES' First Nations Cultural Safety Framework discusses 10 principles for culturally safe evaluation and describes the journey of engagement. However, the question of how to engage effectively can be the first and most significant challenge faced by evaluators. There is little clarity on how to create opportunities for First Nations leadership and voices in our evaluations, how to engage appropriately, and who we should engage with. There is also the challenge of managing tight timeframes, client expectations and capabilities, which can limit the focus on meaningful First Nations participation, partnership and co-creation.

This is a unique offering that enables practitioners and First Nations facilitators to walk together, explore shared challenges and identify opportunities to improve First Nations engagement. The session will explore the potential for partnerships in informing and implementing evaluations, opportunities to increase First Nations participation, privilege their experience and knowledge, and how evaluation practitioners can draw on these strengths through co-creation to amplify First Nations voices and leadership in evaluation practice.

This session aims to:
  • Explore a principles-based approach to First Nations engagement;
  • Discuss shared experiences on successful approaches to enhance First Nations partnership, participation and co-creation; and
  • Develop a shared understanding of how to take this knowledge forward through culturally safe evaluation commissioning, practice and reporting.

Discussion will draw on the collective experience of both the attendees and the facilitators, walking together. The sharing of ideas will be encouraged in a safe space that engages the audience in a collaborative dialogue with First Nations practitioners. This dialogue will explore current knowledge, capabilities and gaps, as well as the challenges (and how they can be overcome), as part of the broader journey to culturally safe evaluation practice.


Speakers
Alice Tamang
Consultant, First Nations Connect
Alice is a Dharug woman based on Wurundjeri Country. She is a consultant and advisor, with a focus on facilitating connections between cultures, empowering individuals and communities to share knowledge and enhance cultural understanding. Alice primarily works on DFAT funded programs...
Tony Kiessler
Co-Convener, First Nations Connect
Tony is a Central Arrernte man, consultant and researcher living and working on beautiful Gundungurra Country in the NSW Southern Highlands. He is an evaluation, strategic planning and research consultant with a particular interest in health, human development and social inclusion...
Friday September 20, 2024 10:30am - 11:30am AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Reviewing and writing for the Evaluation Journal of Australasia
Friday September 20, 2024 10:30am - 11:30am AEST
103
Authors: John Guenther (Batchelor Institute Of Indigenous Tertiary Education), Anthea Rutter (University of Melbourne, AU), Yvonne Zurynski (Macquarie Univesity, AU)

The Evaluation Journal of Australasia (EJA) supports evaluators who wish to share their knowledge and practical experiences in a peer-reviewed article. Documenting evidence, including for programs which do not achieve expected results, is critical for improving evaluation practice, building the evidence base, and advancing evaluation methodologies that are rigorous and ethical.

The EJA depends on volunteer reviewers who can offer critical feedback on articles that are submitted. Reviewers help to improve the quality of manuscripts the Journal receives.

The focus of this presentation is on how to write a good review: how to be academically critical, while at the same time providing constructive feedback that will benefit authors and readers. The presenters will offer step-by-step advice on what to look for, how to judge the quality of a manuscript, and how to make constructive suggestions for authors to consider.

The presentation will also explain how reviewing fits within the publication process, from submission to production. It will be most helpful to potential authors and current and potential reviewers. Authors will learn how to prepare their articles so they receive a favourable review, and reviewers will receive clear guidance on presenting their review feedback to authors.
Speakers
John Guenther
Research Leader, Education and Training, Batchelor Institute of Indigenous Tertiary Education
John Guenther is a senior researcher and evaluator with the Batchelor Institute of Indigenous Tertiary Education, based in Darwin. Much of his work has been based in the field of education. He has worked extensively with community-based researchers in many remote parts of the Northern...
Anthea Rutter
Research Fellow, Centre for Program Evaluation, The University of Melbourne
Anthea Rutter is a Senior Research Fellow in the Assessment and Evaluation Research Centre (formerly the Centre for Program Evaluation) at The University of Melbourne. She has extensive experience working with a wide range of community, state and national organisations. She is particularly...
Jeff Adams
Eastern Institute of Technology | Evaluation Journal of Australasia
I am the Managing Editor of the Evaluation Journal of Australasia.
Friday September 20, 2024 10:30am - 11:30am AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Participatory Analysis Workshops: a novel method for identifying important factors across diverse projects
Friday September 20, 2024 11:00am - 11:30am AEST
104
Authors: Martina Donkers (Martina Donkers), Ellen Wong (ARTD, AU), Jade Maloney (ARTD, AU)

Some programs comprise a range of diverse projects striving towards a common goal - for example grant programs, where a wide range of grantees carry out projects with very different activities in pursuit of the grant program objectives.

These can be a challenge to evaluate - with so many different activities, outputs cannot be easily aggregated, and each project may be responding to its local context in unique but important ways. These programs need efficient ways to identify common factors affecting implementation and outcomes that reflect the richness of the activities undertaken, but do not place undue burden on organisations, particularly those receiving smaller grants.

We developed a novel method that uses participatory workshops to explore commonalities in implementation across projects and the various ways they seek to achieve common program outcomes. The theory-driven method builds on Most Significant Change and Qualitative Comparative Analysis, and combines data collection with collaborative participatory data analysis to build a rich qualitative understanding of projects in a relatively short timeframe with fewer resources. Active participation from project leaders (e.g. grant recipients) builds cohesion across the program and helps project leaders feel more connected and supported.

This paper outlines the theory, approach and uses of Participatory Analysis Workshops, including strengths and limitations and the types of data and insights the method can yield. We draw on our work with the NSW Reconstruction Authority, which used this method to evaluate the COVID Community Connection and Wellbeing Program, to illustrate what we've learnt about how the method works and in what circumstances, and we identify other potential use cases. Participants will have an opportunity to ask questions to help inform future uses of this method. This information will equip evaluators with tools to navigate varying territory together to understand progress toward program outcomes.

Speakers
Jade Maloney
Partner & Managing Director, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... 
Martina Donkers
Independent Evaluator
I'm an independent evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured interviews...
Ellen Wong
Consultant, ARTD Consultants
I'm a consultant at ARTD with a background in human geography and environmental studies. I bring this lens to the work I do and am particularly passionate about the intersection between people and the environment. My portfolio spans environmental policy, disaster recovery and community...
Friday September 20, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Impact evaluation: bringing together quantitative methods and program theory in mixed method evaluations
Friday September 20, 2024 11:00am - 12:00pm AEST
Authors: Harry Greenwell (Australian Treasury), To be determined (Australian Treasury, AU)

This session will provide an overview of some of the main quantitative methods for identifying the causal impacts of programs and policies, while emphasising the importance of mixed-methods that also incorporate program theory and qualitative research. It is intended for people unfamiliar with quantitative evaluation methods who would like to develop their understanding of these methods in order to better contribute to theory-based, mixed method impact evaluations.

The session will cover three of the most common quantitative approaches to separating causality from correlation: i) mixed-method RCTs, ii) discontinuity design, and iii) matching. Each method will be explained with real examples. The session will also cover the benefits and limitations of each method, and considerations for determining when such methods might be suitable, either on their own or as a complement to other evaluation methods or approaches.
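
As an illustration of the flavour of these methods, here is a minimal sharp regression discontinuity sketch in Python. It is not drawn from the session materials; the running variable, cutoff, bandwidth and simulated data are all hypothetical.

# Minimal sharp regression discontinuity sketch (illustrative only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n, cutoff = 1000, 50.0
score = rng.uniform(0, 100, n)                 # hypothetical running variable
treated = (score >= cutoff).astype(int)        # sharp assignment rule
outcome = 2.0 + 0.05 * score + 1.5 * treated + rng.normal(0, 1, n)
df = pd.DataFrame({"score": score - cutoff, "treated": treated,
                   "outcome": outcome})

# Local linear regression with separate slopes on each side of the cutoff;
# the coefficient on "treated" estimates the jump (the effect) at the cutoff.
bandwidth = 10.0
local = df[df["score"].abs() <= bandwidth]
model = smf.ols("outcome ~ treated + score + treated:score", data=local).fit()
print(model.params["treated"])  # should sit near the simulated effect of 1.5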

Special attention will be given to the ethical considerations inherent in the choice of impact evaluation method, including issues related to consent, fairness, vulnerability, and potential harm.

After attending this session, participants will have a better understanding of: how program theory can inform the design of quantitative impact evaluations, including through mixed-method impact evaluations; and how to identify when certain quantitative impact evaluation methods may be suitable for an evaluation.
Speakers
Harry Greenwell
Senior Adviser, Australian Treasury
Harry Greenwell is Director of the Impact Evaluation Unit at the Australian Centre for Evaluation (ACE) in the Australian Treasury. He previously worked for five years at the Behavioural Economics Team of the Australian Government (BETA). Before that, he worked for many years in the...
Vera Newman
Assistant Director
Dr Vera Newman is an Assistant Director in the Impact Evaluation Unit at the Australian Centre for Evaluation. She has many years' experience conducting impact evaluations in the private and public sector, and is dedicated to applying credible methods to public policy for generating...
Friday September 20, 2024 11:00am - 12:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Value Propositions: Clearing the path from theory of change to rubrics
Friday September 20, 2024 11:00am - 12:30pm AEST
Authors: Julian King (Julian King & Associates Limited), Adrian Field (Dovetail Consulting Limited, NZ)

Evaluation rubrics are increasingly used to help make evaluative reasoning explicit. Rubrics can also be used as wayfinding tools to help stakeholders understand and participate meaningfully in evaluation. Developing rubrics is conceptually challenging work and the search is on for additional navigation tools and models that might help ease the cognitive load.

As a preliminary step toward rubric development it is often helpful to co-create a theory of change, proposing a chain of causality from actions to impacts, documenting a shared understanding of a program, and providing a point of reference for scoping a logical, coherent set of criteria.

However, it's easy to become disoriented when getting from a theory of change to a set of criteria, because the former deals with impact and the latter with value. Implicitly, a theory of change may focus on activities and impacts that people value, but this cannot be taken for granted - and we argue that value should be made more explicit in program theories.

Specifying a program's value proposition can improve wayfinding between a theory of change and a set of criteria, addressing the aspects of performance and value that matter to stakeholders. Defining a value proposition prompts us to think differently about a program. For example, in addition to what's already in the theory of change, we need to consider to whom the program is valuable, in what ways it is valuable, and how the value is created.

In this presentation, we will share what we've learnt about developing and using value propositions. We'll share a simple framework for developing a value proposition and, using roving microphones, engage participants in co-developing a value proposition in real time. We'll conclude the session by sharing some examples of value propositions from recent evaluations.

Speakers
Julian King
Director, Julian King & Associates
I’m an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I’m affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/...
Adrian Field
Director, Dovetail
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social...
Friday September 20, 2024 11:00am - 12:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Our five guiding waypoints: Y Victoria's journey and learning from applying organisation-wide social impact measurement
Friday September 20, 2024 11:30am - 12:00pm AEST
103
Authors: Caitlin Barry (Y Victoria), Eugene Liston (Clear Horizon Consulting, AU)

The demand for organisations to measure impact seems to be ever increasing. However, impact measurement looks different depending on the level at which you are measuring it (program level, organisation-wide, ecosystem level, etc.). While many organisations focus on measuring social impact at the program level, what appears to be less commonly achieved is the jump to effective measurement of impact at an organisation-wide level.

The literature providing guidance on how to implement org-wide social impact measurement makes it seem so straight-forward, like a Roman highway - all straight lines. But what is it really like in practice? How does it differ from program-level impact measurement? How can it be done? What resources does it take? And, what are the pitfalls?

The Y Victoria has spent the last three years on a journey to embed org-wide social impact measurement under the guidance of our evaluation partner. The Y Victoria is a large and diverse organisation covering seven different sectors/service lines, with over 5,500 staff and more than 180 centres, delivering services to all ages of the community. This presented quite a challenge for measuring organisation-wide impact in a meaningful way.

While the journey wasn't 'straightforward', we've learnt a lot from navigating it. This presentation will discuss the approach taken; tell the story of the challenges faced, the trade-offs and the lessons learnt (from both the client's and the consultant's perspectives); and describe how we have adapted along the way.

Speakers
Caitlin Barry
Principal Consultant, Caitlin Barry Consulting
Caitlin has extensive experience in monitoring and evaluation, and holds a Masters of Evaluation (First Class Honours) from the University of Melbourne and an Environmental Science Degree (Honours) from James Cook University. Caitlin is passionate about building people’s knowledge...
Ian Boorman
Executive Manager, Social Impact, YMCA Victoria
Ian Boorman is the Executive Manager for Impact & Evaluation at the Y in Victoria. Ian is responsible for developing and implementing a cross-organisation level Social impact framework in partnership with Clear Horizon. Ian is also the Co-chair of Social Impact Measurements Network...
Friday September 20, 2024 11:30am - 12:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Reflections on a Developmental Evaluation of a traditional healing service model for the Kimberley region of Western Australia
Friday September 20, 2024 11:30am - 12:00pm AEST
106
Authors: Gillian Kennedy (The University of Notre Dame Australia), Tammy Solonec (Kimberley Aboriginal Law and Culture Centre, AU)

Traditional Healers, known in the Kimberley as mabarn (medicine men) and parnany parnany warnti (group of women healers), have been practising their craft for millennia; however, cultural forms of healing are not funded or incorporated into health services in Western Australia. In 2019 a Kimberley cultural organisation was funded to develop and trial a service delivery model of traditional healing. The trial ended in November 2023.

This presentation will reflect on a Developmental Evaluation (DE) that was undertaken throughout the model development and trial of this traditional healing service using a multi-method approach, incorporating participant observation; semi-structured interviews; small group discussions; and a client survey. Data was collated into a 'checklist matrix', using a traffic light system to show how each element of the model was tracking according to different stakeholder perspectives. This information was then provided back to the healing team iteratively to incorporate further into the model design.

The DE team acted as a 'critical friend' to the project. Two Aboriginal research assistants (one male and one female) were able to provide valuable cultural interpreting for the project to ensure that cultural sensitivities around the healing practices were carefully managed. The DE team also helped the healing team to develop a set of guiding principles and a Theory of Change to help the project stay true to their underpinning cultural values.

The DE process helped to inform a culturally-governed and owned clinic model, working with both men and women healers, that is unique to the Kimberley. DE puts the evaluation team inside the project. This relational element is reflective of Aboriginal worldviews but may bring challenges for perceptions of objectivity that are championed in traditional forms of evaluation. We argue that the evaluator as a trusted, critical friend was ultimately part of the success of the healing project.


Speakers
Tammy Solonec
Jalngangurru Healing Coordinator, Kimberley Aboriginal Law and Cultural Centre (KALACC)
Tammy Solonec is a Nyikina woman from Derby in the Kimberley of Western Australia. Since late 2020, Tammy has been engaged by KALACC as Project Coordinator for Jalngangurru Healing, formerly known as the Traditional Healing Practices Pilot (THPP). Prior to that, from 2014 Tammy was...
Gillian Kennedy
Translational Research Fellow, The University of Notre Dame Australia
Gillian Kennedy is a Translational Research Fellow with Nulungu Research Institute at The University of Notre Dame, Broome campus, and has 20 years’ experience as an educator and facilitator. Her research focus is on program and impact evaluation within the justice, education, and...
Friday September 20, 2024 11:30am - 12:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

The ACT Evidence and Evaluation Academy 2021-24: Lessons learned from a sustained whole-of-government ECB effort
Friday September 20, 2024 11:30am - 12:00pm AEST
105
Authors: Duncan Rintoul (UTS Institute for Public Policy and Governance (IPPG)), George Argyrous (UTS Institute for Public Policy and Governance (IPPG), AU), Tish Creenaune (UTS Institute for Public Policy and Governance (IPPG), AU), Narina Dahms (ACT Government: Chief Ministers, Treasury and Economic Development Directorate, AU), Peter Robinson (ACT Government: Chief Ministers, Treasury and Economic Development Directorate, AU), Robert Gotts (ACT Government: Chief Ministers, Treasury and Economic Development Directorate, AU)

The ACT Evidence and Evaluation Academy is a prominent and promising example of sustained central agency investment in evaluation capability building (ECB).

The Academy was launched in 2021 as a new initiative to improve the practice and culture of evidence-based decision-making in the ACT public sector. Its features include:
  • a competitive application process, requiring executive support and financial co-contribution
  • a series of in-person professional learning workshops where participants learn alongside colleagues from other Directorates
  • a workplace project, through which participants apply their learning, receive 1-1 coaching, solve an evaluation-related challenge in their work and share their insights back to the group
  • executive-level professional learning and practice sharing, for nominated evaluation champions in each Directorate
  • sharing of resources and development of evaluation communities of practice in the Directorates
  • an annual masterclass, which brings current participants together with alumni and executive champions.

Four years and over 100 participants later, the Academy is still going strong. There has been an ongoing process of evaluation and fine tuning from one cohort to the next, with encouraging evidence of impact. This impact is seen not only for those individuals who have taken part but also for others in their work groups, including in policy areas where evaluation has not historically enjoyed much of a foothold.

The learning design of the Academy brings into focus a number of useful strategies - pedagogical, structural and otherwise - that other central agencies and line agencies may like to consider as part of their own ECB efforts.

The Academy story also highlights some of the exciting opportunities for positioning evaluation at the heart of innovation in the public sector, particularly in the context of whole-of-government wellbeing frameworks, cross-agency collaboration and strategic linkage of data sets to support place-based outcome measurement.

Speakers
Duncan Rintoul
Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, still have heaps to learn. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health and you name it. Looking forward to catching up with... Read More →
George Argyrous
Head of Measurement, Evaluation, Research, and Learning, Paul Ramsay Foundation
Friday September 20, 2024 11:30am - 12:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Gamified, flexible, and creative tools for evaluating a support program for palliative children and their families
Friday September 20, 2024 11:30am - 12:00pm AEST
104
Authors: Claire Treadgold (Starlight Children's Foundation Australia), Erika Fortunati (Starlight Children's Foundation, AU)

Our program creates personalised experiences of fun, joy, and happiness for families with a palliative child, aiming to foster family connections and celebrate the simple joys of childhood during this challenging circumstance. Evaluating the program is of utmost importance to ensure that it meets the needs of the families involved. Equally, due to the program's sensitivity and deeply personal nature, a low-pressure, flexible evaluation approach is necessary.

In our session, we will showcase our response to this need and share our highly engaging, low-burden tools for gathering participant feedback, which leverage concepts of gamification and accessibility to boost evaluation responses and reduce participant burden. In particular, we will focus on our innovative "activity book", which evaluates the program through artistic expression. By emphasising creativity and flexibility, our tools aim to enrich the evaluation process and respect the diverse preferences and abilities of the participating families.

The core argument will focus on our innovative evaluation methodology, how it aligns with best practices in the literature, and our key learnings. Key points include the considerations needed for evaluating programs involving palliative children, empowering children and young people through their active involvement in the evaluation process, and how gamification and creativity boost participation and engagement.
Outline of the session:
  • Introduction to the palliative care program and the need for flexible, creative, and respectful evaluation methods
  • What the literature tells us about evaluation methods for programs involving palliative children and their families
  • A presentation of our evaluation protocol
  • Case studies illustrating the feedback collected and its impact
  • Our learnings and their implications for theory and practice
Speakers
Erika Fortunati
Research and Evaluation Manager, Starlight Children's Foundation Australia
Erika is the Research and Evaluation Manager at Starlight Children's Foundation, an Australian not-for-profit organisation dedicated to brightening the lives of seriously ill children. In her current role, Erika manages research projects and program evaluations to ensure that programs... Read More →
Friday September 20, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Designing baseline research for impact: The SKALA experience
Friday September 20, 2024 12:00pm - 12:30pm AEST
Authors: Johannes Prio Sambodho (SKALA), Ratna Fitriani (SKALA, ID)

SKALA (Sinergi dan Kolaborasi untuk Akselerasi Layanan Dasar - Synergy and Collaboration for Service Delivery Acceleration) is a significant Australian-Indonesian cooperation program that focuses on enhancing parts of Indonesia's extensive, decentralised government system to accelerate better service delivery in underdeveloped regions. As part of its End of Program Outcome for greater participation, representation, and influence for women, people with disabilities, and vulnerable groups, SKALA is commissioning baseline research focused on understanding multi-stakeholder collaboration for mainstreaming Gender Equality, Disability, and Social Inclusion (GEDSI) in Indonesia. The program has designed a mixed-method study consisting of qualitative methods to assess the challenges and capacity gaps of GEDSI civil society organisations (CSOs) in actively participating and contributing to the subnational planning and budgeting process, coupled with a quantitative survey to measure trust and confidence between the same CSOs and the local governments with whom they engage.

The paper first discusses the baseline study's design and its alignment with SKALA's strategic goals, and considers how the research might itself contribute to improved working relationships in planning and budgeting at the subnational level. Second, the paper discusses approaches taken by the SKALA team to design a robust programmatic baseline that is also clearly useful in program implementation. These include: a) adopting an adaptive approach that translates key emerging issues from grassroots consultations and the broader governmental agenda into research objectives; b) locating the study within a broader empirical literature to balance practical baseline needs with academic rigour; and c) fostering collaboration with the program implementation team to ensure the study serves both evaluation and programmatic needs. Lastly, based on the SKALA experience, the paper will argue for closer integration of research and implementation teams within programs, which can support systems-informed methodologies, and will consider ways in which this can be practically accomplished.
Speakers
Johannes Prio Sambodho
Research Lead, SKALA
Dr. Johannes Prio Sambodho is the Research Lead for SKALA, a significant Australian-Indonesian development program partnership aimed at improving basic service governance in Indonesia. He is also a former lecturer in the Department of Sociology at the University of Indonesia. His... Read More →
Friday September 20, 2024 12:00pm - 12:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Embracing the L in "MEL": A Journey Towards Participatory Evaluation in Government Programs
Friday September 20, 2024 12:00pm - 12:30pm AEST
103
Authors: Milena Gongora (Great Barrier Reef Foundation)

Best practice in evaluation encompasses a crucial step of learning, yet it often receives inadequate emphasis, particularly within government-funded initiatives. Our paper documents the journey of transforming a top-down, prescriptive evaluation process within a government-funded program into an inclusive, consultative approach aligned with Monitoring, Evaluation, and Learning (MEL) principles.

Funded by the Australian Government and managed by the Great Barrier Reef Foundation, the Reef Trust Partnership (RTP) was launched in 2018 to enhance the resilience of the Great Barrier Reef. Within it, a $200 million portfolio aims to improve water quality by working with the agricultural industry. A framework for impact evaluation was developed in the Partnership's early days. Whilst appropriate, it was top-down in nature due to the need to comply with broader government requirements.

Four years into implementation, the Foundation was ready to synthesise, interpret and report on the program's impact. The Foundation could have simply reported "up" to government. However, we acknowledged that in doing so we risked missing critical context, oversimplifying findings, misinterpreting information and presenting yet another tokenistic, meaningless report.

Interested in doing things better, we instead circled back with our stakeholders in a participatory reflection process. Through a series of carefully planned workshops, we invited on-ground program practitioners to ground-truth our findings, share contextual nuances, and collectively strategise for future improvements.

Despite initial reservations, participants embraced the opportunity, fostering an atmosphere of open dialogue and knowledge exchange. This reflective process not only enriched our understanding of program impact but also enhanced collaboration, strengthening overall program outcomes.

Our experience highlights the importance of transcending tokenistic evaluation practices, particularly in environments where top-down directives prevail. Participatory approaches can be implemented at any scale, contributing to a culture of continuous improvement and strategic learning, ultimately enhancing the impact and relevance of evaluation efforts.

Speakers
Milena Gongora
Associate Director - Water Quality, Great Barrier Reef Foundation
Milena’s area of interest is nature conservation. With over 14 years of experience, her experience ranges from managing the Mekong River to enhancing the resilience of the Great Barrier Reef. Over most of this time, her roles have involved evaluating the success of conservation... Read More →
Friday September 20, 2024 12:00pm - 12:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

If the destination is improvement, recommendations are the signpost
Friday September 20, 2024 12:00pm - 12:30pm AEST
106
Authors: Laura Baker (ACIL Allen), Larissa Brisbane (Department of Climate Change, Energy, the Environment and Water NSW, AU)

Recommendations are the sharp end of evaluation, connecting evidence and insights to the improvement we aim to achieve. Many evaluation theories focus on framing and conducting evaluations, rather than developing recommendations or the associated organisational change required to complete the journey.

Recommendations point the way beyond an evaluation report, as the journey doesn't end when the report is produced. This presentation tells the story of recommendation wayfinding. We will share an evaluation practitioner's and a commissioner's journey of navigating the challenge of developing actionable recommendations that promote impact beyond program close and into future decisions.

Evaluators need ways to integrate diverse evidence sources and generate actionable insights. The consultant will share perspectives on where these insights and the associated recommendations "come from": how different data come together to inform insights, the process for developing recommendations (balancing independence and engagement from commissioners), and how to design recommendations for the program and beyond.

Commissioners need recommendations that make sense in their context. The commissioners will share considerations on what makes a recommendation useful, and how we use this evaluation journey to leverage learning, skill building, and improvement opportunities. They will also discuss the evaluation audience and how ambitious you can get with recommendations.

This work over a number of years has helped build the evaluation knowledge base within our organisations. The presenters have developed evaluations for multiple end users, each with their own needs. They'll share the research and engagement approaches and tools which have been useful in different situations, as well as what was useful specifically for this project. We will close with our recommendations to you - the top ideas that we plan to take with us on our next evaluation journey.
Speakers
Larissa Brisbane
Team Leader, Strategic Evaluation, Dept of Planning and Environment NSW
It was only a short step from training in environmental science, and a background in cross-disciplinary problem-solving, to evaluation where I still ask 'why' and 'how do you know that'. I love hearing your stories of what you've done and what you've learned, especially in the areas... Read More →
Laura Baker
Principal, ACIL Allen
Friday September 20, 2024 12:00pm - 12:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Place-based evaluation: collaborating to navigate learning in complex and dynamic contexts
Friday September 20, 2024 12:00pm - 12:30pm AEST
105
Authors: Sandra Opoku (Relationships Australia Victoria), Kate Matthies-Brown (Relationships Australia Victoria, AU)

Yarra Communities That Care (CTC) is a network of 24 local partner agencies who share a commitment to support the healthy development of young people in the City of Yarra. One of the key initiatives of Yarra CTC is the collaborative delivery of evidence-based social and emotional messaging to families by a centrally coordinated Facilitator Network involving multiple partner agencies. Building on positive feedback and program achievements from 2017-2022, we led an evaluation of the collaborative and place-based approach of the Yarra CTC Facilitator Network to better understand its contribution to systemic change and apply learnings to future place-based approaches for our respective organisations. The evaluation project team adopted the 'Place-Based Evaluation Framework' and was informed by a comprehensive theory of change. This provided an anchor in an otherwise complex and dynamic environment and unfamiliar territory.
There is an increased focus on collaborative place-based approaches at the federal, state and local levels as a promising way of addressing complex social problems. Previous evaluations and literature identify successful collaboration and a strong support entity or backbone as key factors that enable place-based approaches to succeed. The collaborative place-based approach to strengthening family relationships in Yarra provides a local example of this.

Consistent with systems change frameworks, this evaluation provided evidence of structural changes. These changes, manifested in the form of improved practices and dedicated resources and supports, ultimately led to effective collaborative and transformative change for the community.

This presentation will share the journey, key insights, and learnings of the evaluation project team over a two-year period to collaboratively gather evidence to inform ongoing program development and contribute to future place-based approaches. The Yarra CTC Facilitator Network serves as a valuable template for implementing best practices for place-based coalitions due to its focus on collaboration and fostering a sense of community.

Speakers
Sandra Opoku
Senior Manager Evaluation and Social Impact, Relationships Australia Victoria
My role leads impact, evidence and innovation activities at Relationships Australia Victoria. These activities contribute to achieving strategic objectives and improving outcomes for individuals, families and communities. This now also includes oversight of several key prevention... Read More →
Friday September 20, 2024 12:00pm - 12:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

A sprint, not a marathon: Rapid Evaluation as an approach for generating fast evidence and insights
Friday September 20, 2024 12:00pm - 12:30pm AEST
104
Authors: Marnie Carter (Allen + Clarke Consulting)

Increasingly, evaluators are called upon to quickly equip decision makers with evidence from which to take action. A program may be imminently approaching the end of a funding cycle; a critical event may have taken place and leadership needs to understand the causes and learnings; or a new program of work is being designed for which it is important to ensure that finite resources are being directed to the most effective interventions. For such circumstances, Rapid Evaluation can be a useful tool.

Rapid Evaluation is not simply doing an evaluation quickly. It requires a deliberate, interlinked and iterative approach to gathering evidence to generate fast insights. What makes Rapid Evaluation different is that the evaluation design needs to be especially flexible, constantly adapting to the context. Data collection and analysis tend not to follow a linear sequence, but rather iterate back and forth throughout the evaluation. Rapid Evaluation is often conducted in response to specific circumstances that have arisen, and evaluators therefore need to manage a high level of scrutiny.

This presentation will provide an overview of how to conduct a rapid evaluation, illustrated by practical examples including rapid evaluations of a fund to support children who have been exposed to family violence, and a quickly-established employment program delivered during the COVID-19 pandemic. It will discuss the methodological approach to conducting a Rapid Evaluation, share lessons on how to manage the evolving nature of data collection as the evaluation progresses, and discuss how to maintain robustness while evaluating at pace.


Speakers
Marnie Carter
Evaluation and Research Practice Lead, Allen + Clarke Consulting
Marnie is the Evaluation and Research Practice Lead for Allen + Clarke Consulting. She is experienced in program and policy evaluation, monitoring, strategy development, training and facilitation. Marnie is particularly skilled in qualitative research methods. She is an expert at... Read More →
Friday September 20, 2024 12:00pm - 12:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

From evaluation to impact: practical steps in a qualitative impact study
Friday September 20, 2024 1:30pm - 2:00pm AEST
Authors: Linda Kelly (Praxis Consultants), Elisabeth Jackson (La Trobe University, AU)

This presentation focuses on a multi-year program funded by Australia that aims to empower people marginalised by gender, disability and other factors. Like similar programs, the work is subject to regular monitoring and evaluation - testing the effectiveness of program activities largely from the perspective of the Australian and partner-country governments.

But what of the views of the people served by the program? Is the impact of the various activities sufficient to empower them beyond their current condition? How significant are the changes introduced by the program, given the structural, economic, social and other disadvantages experienced by the marginalised individuals and groups?

Drawing on feminist theory and qualitative research methods, and managed with local research and communication experts, the study outlined in this presentation focuses on the long-term impact of the program.

The presentation will outline the methodology and practical considerations in the development of the approach and data collection methodologies. It will highlight the value of exploring impact from a qualitative perspective, while outlining the considerable management and conceptual challenges involved in designing, introducing and supporting such an approach. It will consider some of the implications of shifting from traditional evaluation methods to more open-ended enquiry, and ask whose values are best served through evaluation versus impact assessment.


Speakers
Linda Kelly
Director, Praxis Consultants
Elisabeth Jackson
Senior Research Fellow, Centre for Human Security and Social Change, La Trobe University
Dr Elisabeth Jackson is a Senior Research Fellow at the Centre for Human Security and Social Change where she conducts research and evaluation in Southeast Asia and the Pacific. She is currently co-leading an impact evaluation of a program working with diverse marginalised groups... Read More →
Friday September 20, 2024 1:30pm - 2:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Fidelity to context: A realist perspective on implementation science
Friday September 20, 2024 1:30pm - 2:00pm AEST
105
Authors: Andrew McLachlan (NSW Department of Education)

This paper offers insights into how evaluators can harness realist methodology to better understand challenges of program implementation. Realist methodology is ideally suited to investigating a range of implementation problems (Dalkin et al., 2021). It is versatile in that it draws on theories from diverse fields of social inquiry. It is pragmatic in that the theories it adopts are good only in so far as they offer explanatory insight. And it is transferable; realist methodology seeks coherence not by generalising findings but by adhering closely to real-world conditions.

As for implementation science, its founding question seems purpose-built for realist work; it aims to improve the uptake of evidence-based practices by investigating how and why program implementation is sustained. Yet despite the obvious affinity between realist methodology and implementation science, so far there have been few attempts to formalise the relationship (Sarkies et al., 2022). Part of the reason may lie with how implementation scientists understand context (Nilsen & Bernhardsson, 2019). Where realist practitioners develop contextually sensitive causal theories, implementation science research tends to rely on static variables like setting and determinants, which lack the contextual sensitivity needed to transfer insights easily across locations.

This paper offers a practical solution. It shows how implementation concepts like fidelity (the degree of adherence to the original program design), adaptation (the process of modifying a program to achieve better fit), and translation (the ability to apply knowledge in real-world settings) can be creatively repurposed within a realist framework to investigate the implementation of action learning in schools (Aubusson et al., 2012). In demonstrating how to construct program theories that are responsive to changing contexts, the paper promises to equip evaluators with tools that can help them navigate the complexities of program implementation in their own work.

Speakers
Andrew McLachlan
Evaluator, NSW Department of Education
Friday September 20, 2024 1:30pm - 2:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Finding ways to empower multicultural survivors of violence through evaluation: strategies, learnings and reflections
Friday September 20, 2024 1:30pm - 2:00pm AEST
104
Authors: Lydia Phillips (Lydia Phillips Consulting ), Jo Farmer (Jo Farmer Consulting )

As evaluators, we often work with people who have experienced trauma and/or marginalisation (whether we realise it or not!). We're also seeing increased recognition in government and community organisations of the importance of lived experience and cultural safety in program design, implementation and evaluation.

Beginning an evaluation with a clear plan for how you'll engage and empower people from diverse cultural backgrounds and people who have experienced trauma can help to ensure success - of your project and of participants' experience.

So how can you design an evaluation framework to recognise diverse cultural backgrounds and empower survivors of violence?

And how can evaluators who don't have lived experience or come from those cultural backgrounds best navigate the design process?

This session will share strategies, learnings and reflections from a project working with a multicultural family violence service to develop a culturally-safe, trauma-informed evaluation framework for a two-year program.

It will:
  • explore what worked well and what was challenging in the project
  • discuss similarities and differences in the concepts of culturally-safe and trauma-informed practice, drawing on current literature; and
  • pose questions and provide suggestions for evaluators who want to develop their skills in culturally safe and trauma-informed evaluation practice.

The session will offer key tips and strategies that are translatable to other contexts and conclude with reflective questions for attendees.

Speakers
Lydia Phillips
Principal Consultant, Lydia Phillips Consulting
I operate an independent consulting practice, providing evaluation and social policy services to community organisations and government. With a background in law and social policy, I have more than 15 years' experience building and using evidence in order to create positive social... Read More →
Friday September 20, 2024 1:30pm - 2:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Learning from the past: Reflections and opportunities for embedding measurement and evaluation in the national agenda to end Violence against Women and Children
Friday September 20, 2024 1:30pm - 2:30pm AEST
106
Authors: Lucy Macmillan (ANROWS), Micaela Cronin (Domestic and Family Violence Commission, AU), Tessa Boyd-Caine (ANROWS, AU), TBC TBC (National Lived Experience Advisory Council, AU)

As evaluators, we are often asked to examine complex, systems-change initiatives. Domestic, family and sexual violence is a national crisis. In late 2022, the Commonwealth Government, alongside all state and territory governments, released the second National Plan to End Violence against Women and Children 2022-2032. The plan provides an overarching national policy framework to guide actions across all parts of society, including governments, businesses, media, educational institutions and communities, to achieve a shared vision of ending gender-based violence in one generation.

After 12 years of implementation under the first National Plan, assessing whether our efforts had made a meaningful difference towards ending violence against women was a difficult task. We ask: As we embark on setting up measurement and evaluation systems against the second National Plan, how do we avoid making the same mistakes again?

The Domestic, Family and Sexual Violence Commission was established in 2022 to focus on practical and meaningful ways to measure progress towards the objectives outlined in the National Plan. This session will discuss:
  1. the current plans, opportunities and challenges in monitoring progress, and evaluating the impact of this national framework, and
  2. the role of lived experience in evaluation, and how large publicly-funded institutions can balance their monitoring and sensemaking roles at the national level with accountability to victim-survivors.

The panel will explore common challenges faced when seeking to monitor and evaluate complex national policy initiatives, including data capture, consistency and capacity, and explore some of the opportunities ahead.

The audience will have the opportunity to contribute their insights and expertise on how we, as evaluators, approach the evaluation of complex systems-change at a national scale, and over extended durations, while also prioritising the voices of those most affected. How do we collectively contribute to understanding if these national policy agendas will make a difference?


Friday September 20, 2024 1:30pm - 2:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Navigating ethics dilemmas when evaluating for government: The good, the bad and the ugly
Friday September 20, 2024 1:30pm - 2:30pm AEST
Authors: Kristy Hornby (Grosvenor), Eleanor Williams (Australian Centre for Evaluation), Mandy Charman (Outcomes, Practice and Evidence Network)

Navigating ethics is an essential part of any evaluation journey. As evaluators we often encounter complex situations that require thoughtful consideration of ethical principles and practice, far beyond the formal ethics process itself.

This session will explore real-world scenarios and provide attendees with actionable strategies to enhance ethical decision-making in their evaluation practice. The panel will speak to questions of managing commissioners' expectations, how to speak frankly to program areas where under-performance is found, issues of confidentiality, ensuring culturally sensitive practice, and ensuring power imbalances are acknowledged and addressed.

The panel presentation will take attendees through the journey of ethical practice and will consider:
- The overall significance of ethical thinking in evaluation
- Common ethical challenges faced by evaluators
- Practical tools and frameworks that empower evaluators to uphold their ethical standards and deliver meaningful results that can withstand scrutiny
- From an internal evaluator perspective, the balancing act of managing these tensions successfully
- Case studies that can illustrate the application of practical ethics in evaluation
- Takeaways and recommendations.

Eleanor Williams, Managing Director of the Australian Centre for Evaluation; Mandy Charman, Project Manager for the Outcome, Performance and Evidence Network in the Centre for Excellence in Child and Family Welfare; and Kristy Hornby, Victorian Program Evaluation Practice Lead at Grosvenor, will be the panellists. They will draw on deidentified war stories from their current and previous roles to set out exactly what kinds of challenges evaluators can face in the conduct of their work, and attendees will learn from the panellists' hands-on experience about what to do about them. Attendees will be encouraged to participate in a dynamic dialogue with the panellists and with each other, to share their own experiences and strategies for addressing ethical concerns, building on the content shared through the session.
Speakers
Mandy Charman
Project Manager, Outcome Practice and Evidence Network, Centre for Excellence in Child and Family Welfare
Dr Mandy Charman is the Project Manager for the Outcome, Performance and Evidence Network (OPEN) in the Centre for Excellence in Child and Family Welfare. OPEN, which represents a sector–government–research collaboration, has been developed to strengthen the evidence base of the... Read More →
Kristy Hornby
Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and... Read More →
Eleanor Williams
Managing Director, Australian Centre for Evaluation
Eleanor Williams is a public policy, research and evaluation professional with 20 years' experience working with the public sector. She is the Managing Director of the Australian Centre for Evaluation and established the Australian Public Sector Evaluation Network in 2019. Eleanor... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia
  Footprints

1:30pm AEST

Backfire Effect: When Good Intentions Lead to Unintended Consequences
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Ghulam Muhammad Shah (ICIMOD), Farid Ahmad (ICIMOD, NP)

Over the past few years, we have conducted rigorous research to identify 'resilience markers' in the context of environmental management in mountain areas in the Hindu Kush Himalayan region. Our findings suggest that evaluations focused on OECD criteria, as well as traditional experimental and non-experimental designs, have limitations in capturing the dynamic and evolving nature of environmental changes. In such scenarios, a complexity-aware, learning-oriented, and adaptive MEL approach becomes crucial. This approach involves adaptive monitoring and management of socio-ecological behavior, incorporating feedback loops to adjust interventions and strategies based on responses from the entire system.

In our quest for the most suitable and fit-for-purpose design, monitoring, evaluation, and learning approaches to effectively monitor and evaluate resilience outcomes, we have introduced three unique concepts: the Intervention Design Effect (IDE), the External Reinforcing Factor's Trap (ERFT), and the Type-III Error in Evaluations. We acknowledge that these terms are not only novel but also innovative concerning the design of monitoring and evaluation for climate change and environmental programs. We believe that these critically important concepts have been overlooked by the field of 'evaluation science,' and we are introducing them for the first time.

These concepts are lined up for publication, starting with an AEA365 blog post.

The underlying principle is fundamental - it is crucial to identify, challenge, and explicitly define assumptions and risks from both the internal and external contexts of an intervention. By doing so, we ensure that the intervention is realistic and relevant to its implementation context, increasing the likelihood of success. Explicitly defining assumptions helps stakeholders better understand the rationale behind the intervention, fosters a shared understanding, and allows for constant testing of assumptions throughout implementation. This adaptive approach ensures the intervention remains responsive to changing needs and circumstances.

Poorly designed interventions become susceptible to the IDE. Investigating and understanding the IDE is critical in program design, planning, monitoring, evaluation, and learning, optimizing design and maximizing the likelihood of detecting potential distracting effects. Neglecting the IDE may lead to falling into an External Reinforcing Factor's Trap (ERFT), causing unintended consequences that supersede the intended outcomes of an intervention.

Simultaneously, using standard experimental, quasi-experimental, or non-experimental evaluation designs to assess resilience outcomes carries the risk of not considering critical scenarios emerging between baseline, midline, and endline evaluations. In statistical terms, missing decisive information between standard evaluation stages may lead to Type-I or Type-II errors. Unintentionally missing this crucial information introduces a Type-III error into standard evaluation designs.

The widely used indicators-based approach to assess the resilience of project/program outcomes is a top-down method, relying on assumptions that lack anticipatory resilience management of the socio-ecological system as a whole. This approach fails to capture dynamic interactions and complex relationships between the structural elements of a socio-ecological system, which determine the system's behavior.


Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Exploring the potential of qualitative sketch mapping in evaluation: mapping fear of crime.
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Nikki Sloan (Evaluator)

Combining sketch mapping and qualitative Geographical Information Science (GIS) provides a unique but under-utilised approach to measuring and evaluating people's experience of physical spaces. Using the case study Mapping Fear of Crime on the Australian National University Campus, this presentation will (1) review sketch mapping as a qualitative tool, (2) present its applied use to date, and (3) explore its potential for evaluation - all illustrated with colourful maps of participants' behavioural wayfinding due to fear of being a victim of crime.
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Investment logic mapping or evaluation logic modelling? Similarities and differences.
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Charlie TULLOCH (Policy Performance)

Evaluation logic modelling is a frequently used technique, looking at such things as inputs, activities, outputs and outcomes.
Since the early 2000s, the Department of Treasury and Finance (Victoria) has used an adapted logic modelling format called Investment Logic Mapping (ILM). It is now used nation-wide and internationally to support resource allocation planning, along with stakeholder engagement.

ILMs and evaluation logic modelling have many similarities, but some major differences.

This ignite presentation will compare and contrast these two tools, and describe when and why to use each.

Attendees will very quickly understand the main similarities and differences, their advantages and drawbacks.
Speakers
Charlie Tulloch
Director, Policy Performance
Policy Performance is a conference sponsor! Evaluation, training and all aspects of excellence in public sector service delivery. Helping those new to evaluation to thrive.
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Measuring Impact Through Storytelling: using Most Significant Change to evaluate the effectiveness of QHub for LGBTIQA+ young people.
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Gina Mancuso (Drummond Street Services), Arielle Donnelly (Drummond Street Services, AU)

LGBTIQA+ young people experience discrimination and marginalisation which contribute to poorer mental and physical health outcomes, compared to the general population. QHub is an initiative that creates safe spaces, offers mental health and well-being services, and provides outreach tailored for LGBTIQA+ young people in Western Victoria and the Surf Coast. QHub provides LGBTIQA+ young people and their families/carers with welcoming, inclusive and integrated support, as well as opportunities to connect with peers and older role models. This presentation will outline how the collection and selection of stories of change (Most Significant Change) is helping us evaluate the impact of QHub.
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Program Evaluation Fundamentals in the NSW Department of Planning, Housing and Infrastructure: An eLearning course on evaluation
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Anabelle (Pin-Ju) Chen (NSW Department of Planning, Housing and Infrastructure)

Introducing Program Evaluation Fundamentals (PEF) in the NSW Department of Planning, Housing and Infrastructure, an eLearning course designed to facilitate a coherent journey of learning within the department. With learning and adapting together in mind, the design of PEF empowers individuals at all levels to navigate the fundamentals of evaluation. Through interactive modules, learners will understand key evaluation concepts and cultivate best practices. PEF promotes transformative growth by emphasising foundational evaluation knowledge. By leveraging PEF, we can shift our approach, embrace innovation, and advance the field of evaluation across the public sector, fostering a supportive community of forward-thinking evaluators.
Speakers
Anabelle Chen
Senior Analyst, Program Evaluation, NSW Department of Planning and Environment
Anabelle (Pin-Ju) Chen is a distinguished senior analyst hailing from Taiwan, with a global perspective on evaluation, data analysis, and project management. Having studied in Taiwan, the United Kingdom, and Australia, Anabelle brings a diverse range of experiences and insights to... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Putting values on the evaluation journey map
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Samantha Abbato (Visual Insights People)

Values guide all evaluation processes, methods and judgements. Although evaluators are often not aware of the values shaping their work and can't readily name them, they know when they are straying off their values path through the experience of conflict or unease. Reflecting on the evaluation literature and two decades of evaluation practice from a 'values' perspective, it is argued that there has never been a more important time to build values literacy. This presentation demonstrates how values literacy can guide conversations with yourself, your team and others, and provide signposting and illumination for a more rigorous and ethical evaluation journey.
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Squaring up with rubrics
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Josh Duyker (Centre for Evaluation and Research Evidence, Victorian Department of Health)

Much like Felix the Cat, evaluators have a bag of tricks to get us out of sticky situations. But when you are staring face to face with a complex evaluand, juggling tricky stakeholders whilst sat on a mountain of data, it's not always clear what 'trick' you need! One twisted potential solution is the colourful, yet humble, rubric. In this reflective practice ignite presentation, I will guide you through our journey of using rubrics as a tool to wayfind through an evaluation, and our key takeaways on how rubrics can support evaluators to make comprehensive and balanced evaluative judgements.
Speakers
Josh Duyker
Evaluation and Research Officer, Centre for Evaluation and Research Evidence
I am an emerging evaluator, currently working at the Centre for Evaluation and Research Evidence in the Victorian Department of Health. I've completed a Master of Public Health and am embarking on a Masters of Evaluation. Through roles in the not-for-profit sector and my studies... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Voices of the Future: Elevating First Nations Leadership in the Evolution of Educational Excellence
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Skye Trudgett (Kowa), Sharmay Brierley (Kowa, AU)

This ignite presentation will delve into a pioneering evaluation within the education sector, where a series of education initiatives were designed and implemented by Aboriginal Community Controlled Organisations (ACCOs) and mainstream education partners to uplift and support young First Nations peoples. We will uncover how the initiative's evaluation framework was constructed with First Nations communities at its heart, applying the reimagining evaluation framework, utilising diverse data collection methods and producing Community Reports that reflect First Nations experiences and voices.

Attendees will be guided through the evaluative journey, showcasing the incorporation of wisdom to demonstrate the profound value of community-delivered initiatives that contribute to change. The session will highlight the success stories and learnings, emphasising how this approach not only benefits the current generation but also lays the groundwork for the prosperity of future generations.
Speakers
Skye Trudgett
CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application... Read More →
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

A practical approach to designing and implementing outcome measures in psychosocial support services.
Friday September 20, 2024 2:00pm - 2:30pm AEST
Authors: Lauren Gibson (Mind Australia), Dr. Edith Botchway (Mind Australia, AU), Dr. Laura Hayes (Mind Australia, AU)

Outcome measurement in mental health services is recommended as best practice and provides an opportunity for clients and staff to track progress and navigate the complex road to recovery together. However, there are many barriers to embedding outcome measures in mental health services, including time constraints, low perceived value among staff and clients, and infrequent feedback on outcomes. To overcome these challenges, a national not-for-profit provider of residential and non-residential psychosocial support services created an innovative approach to designing and implementing outcome measures. The objective of our presentation is to describe this approach, which has resulted in average outcome measure completion rates of over 80% across 73 services in Australia.

Design
We believe the key to achieving these completion rates is understanding the needs of outcome measure end-users, including clients, carers, service providers, centralised support teams, and funding bodies. In this presentation we will share how we:
  • "Begin with the end in mind" through working with stakeholders to create user personas and program logics to identify meaningful outcomes and survey instruments.
  • Design easy-to-use digital tools to record quality data and provide stakeholders with dashboards to review their outcomes in real time, visualising data at an individual client level and a service level.

Implementation
Also key to embedding outcome measures is having a structured, multi-stage approach for implementation, with tailored support provided to:
  • Prepare services (e.g., Training)
  • Install and embed outcome measures in routine practice (e.g., Service champions)
  • Maintain fidelity over time (e.g., Performance monitoring)

The presentation will highlight the salient barriers and enablers identified during each design and implementation stage.

Overall, the presentation will provide a practical example of how to design and implement outcome measures in mental health services to ensure they are adding value for relevant stakeholders and enabling efficient and meaningful evaluation.

Friday September 20, 2024 2:00pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

In the spotlight: An arts industry-led approach to evaluation
Friday September 20, 2024 2:00pm - 2:30pm AEST
105
Authors: Kirstin Clements (Arts Centre Melbourne)

How does a creative institution develop an effective evaluation framework that honours the artistic process while meeting rigorous research standards?

At Arts Centre Melbourne we asked ourselves, 'what if?'... What if we integrated the economic story into a fit-for-purpose value proposition? What if we see the emotive, subjective nature of the arts as an asset, rather than a challenge in our evaluation design? What if we tried to embed systems thinking and extend our approach beyond individual projects?

Like many purpose-driven industries, the arts face an increasingly competitive funding landscape and heightened expectations from stakeholders for evidence-based reporting on the value generated by initiatives. Historically, in the arts such reporting has been responsive to external demands and formats. One of our core goals has been to equip the organisation with the capability and capacity to pro-actively drive its own public value narrative through a transparent, consistent approach.

In this presentation, we spotlight Arts Centre Melbourne's innovative approach to building appetite for evaluation and to designing a fit-for-purpose organisational impact model and evaluation function. We offer insights into the conceptual and methodological approaches we've adopted to achieve our objectives: supporting effective advocacy for the public value of the arts, enhancing accountability to stakeholders, and fostering a culture of continuous learning.

In sharing how we have creatively navigated challenges and opportunities at Arts Centre Melbourne, we aim to provide valuable advice and inspiration for evaluators and supporting professionals, particularly those working in sectors where evaluation is yet to be understood as 'business-as-usual' activity.

Speakers
Kirstin Clements
Partner, Impact and Evaluation, Arts Centre Melbourne
Friday September 20, 2024 2:00pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Introducing a trauma informed AI assessment tool for evaluators of AI (artificial intelligence) assisted programs and services.
Friday September 20, 2024 2:00pm - 2:30pm AEST
104
Authors: Lyndal Sleep (Central Queensland University)

AI and other forms of digital technology are increasingly being used in program and service delivery. They promise increased efficiency, accuracy and objectivity; however, these technologies can also cause significant harm and trauma, as seen in Robodebt. It is vital for evaluators to be aware of the key questions to ask to prevent AI having unintended impacts on program processes, outputs and outcomes, or causing harm to service users.

Objective
This session aims to support the evaluation of AI-assisted programs and services by introducing evaluators to a new and innovative trauma-informed AI assessment tool.

Core argument
(1) AI is increasingly being used in programs and services, and understanding the resulting risks is essential for evaluators to assess whether services are meeting intended outcomes.
(2) Many evaluators are unaware of what types of risks to look for when assessing AI-assisted services, or what questions to ask - especially when conducting trauma-informed evaluations.
(3) A practical trauma-informed AI assessment tool has been developed by researchers from [Universities omitted], with funding from [omitted], to address this need, and will be briefly introduced.

A short paper session will:
(1) Highlight the problem that AI is increasingly being used to assist program and services delivery, but many evaluators are unaware of the main risks to consider when evaluating these services.
(2) Suggest the solution of a practical tool which considers these risks, with technological knowledge and within a trauma informed framework, that can be employed by evaluators.
(3) Introduce a trauma informed AI assessment tool, the method used to develop it, as well as its intended practical use by evaluators (both internal and external to organisations).

There will be 10 minutes for questions and discussion at the end of the presentation.

Friday September 20, 2024 2:00pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Closing plenary: Panel, John Gargani "Finding Our Way to the Future Profession of Evaluation"
Friday September 20, 2024 2:30pm - 4:00pm AEST
More details of the closing plenary, including panelists, to be confirmed.

John Gargani, President of Gargani + Company, former President of the American Evaluation Association, USA

As the AES 2024 conference comes to a close, we gather one last time to consider the journey ahead. We seek a destination none have seen—a profession that in ten years’ time fully supports societal and planetary health—along a path we have never traveled. The urgency of existential threats such as AI, global heating, and pandemics calls into question the traditional ways our profession has navigated the future. And new players such as impact investors, social entrepreneurs, effective altruists, and socially responsible corporations ensure that the journey will be crowded and some routes cut off.

With this in mind, we pose two questions to our panelists.
  1. What milestones and songlines should guide us on our way to an imagined future profession? How will we know if we have lost our way?
  2. How should we interact with other professions on a similar journey? Like commuters on a train who dare not speak, shipwrecked strangers who must quickly band together, or something else altogether?
Followed by:
Conference close by the AES President, and handover to aes25 Ngambri/Canberra


Speakers
John Gargani
President (former President of the American Evaluation Association), Gargani + Company
Dr John Gargani is an evaluator with 30 years of experience and eclectic interests. He is President of the evaluation consulting firm Gargani + Company, served as President of the American Evaluation Association in 2016, coauthored the book Scaling Impact: Innovation for the Public... Read More →
Friday September 20, 2024 2:30pm - 4:00pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia
 