Conference hashtag #aes24MEL
Education and training
Friday, September 20
 

10:30am AEST

Following the (matched) data to understand impact: adventures in quasi-experimental evaluation
Friday September 20, 2024 10:30am - 11:00am AEST
Authors: Mohib Iqbal (Department of Education), Kira Duggan (Department of Education, AU), Ben McNally (Department of Education, AU)

This presentation will showcase the use of quasi-experimental impact evaluation and of a relatively new data linkage capability within the Victorian public sector.
Impact evaluation provides important evidence on program effectiveness and helps to inform government investment decisions. Quasi-experimental design identifies a comparison group that is similar to the treatment group (the program participants) in terms of baseline or pre-intervention characteristics. Statistical methods such as propensity score matching and regression discontinuity can create valid comparison groups with a reduced risk of bias (White & Sabarwal, 2014).
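To make the matching step concrete, here is a minimal propensity score matching sketch in Python on synthetic data (an illustration of the general technique only, assuming pandas and scikit-learn; it is not the Department's implementation, and all variable names are hypothetical):

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Synthetic stand-in for administrative data: baseline covariates,
# a treatment flag and an outcome (all names hypothetical).
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "attendance": rng.normal(size=n),
    "prior_score": rng.normal(size=n),
    "disadvantage": rng.normal(size=n),
})
p_treat = 1 / (1 + np.exp(-(0.6 * df["disadvantage"] - 0.3 * df["prior_score"])))
df["treated"] = rng.binomial(1, p_treat)
df["outcome"] = 2.0 * df["treated"] + df["prior_score"] + rng.normal(size=n)
covariates = ["attendance", "prior_score", "disadvantage"]

# 1. Model the probability of treatment given baseline characteristics.
ps_model = LogisticRegression().fit(df[covariates], df["treated"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Match each treated unit to the untreated unit nearest on the score.
treated = df[df["treated"] == 1]
control = df[df["treated"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched = control.iloc[idx.ravel()]

# 3. Compare outcomes across the matched groups (effect on the treated).
att = treated["outcome"].mean() - matched["outcome"].mean()
print(f"Estimated effect on the treated: {att:.2f}")

Because the synthetic confounders are all observed and modelled, the matched comparison recovers roughly the simulated effect of 2; in real evaluations, checking covariate balance after matching is an essential further step.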

However, implementing these methods involves significant technical, data-availability and other practical challenges.
The Evaluation and Program Impact (EPI) branch at the Victorian Department of Education (DE) used quasi-experimental assessment in six different education program evaluations, spanning issues from teacher supply to support for vulnerable students. The approach was used both to evaluate impact/effectiveness and, in economic evaluations of interventions, to measure avoided costs. The presentation will outline the design, methodology and implementation of the quasi-experimental methods used in these six evaluations.

Key enablers of the use of quasi-experimental designs are data availability and expertise in undertaking advanced quantitative impact evaluations. This presentation will give an overview of the types of departmental data used (such as regularly administered student, parent/carer, teacher and school leader surveys, assessment results such as NAPLAN and administrative data) as well as the relatively new analytical capability available through linked service use data from the Victorian Social Investment Integrated Data Resource (VSIIDR) and Centre for Victorian Data Linkage (CVDL).
The presentation also contextualises quasi-experimental impact evaluations as one component of mixed-methods approaches, staged after evaluations of appropriateness, design and fidelity. Decisions on intervention effectiveness were made using a broader array of evidence, with quasi-experimental impact evaluation as one of many sources.
Chair
Allison Clarke
Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre...
Speakers
Kira Duggan
Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development...

Mohib Iqbal
Senior Evaluation Officer, Department of Education
I am a multi-disciplinary evaluator and researcher with 15 years of experience across education, health, international development, social protection, and migration sectors. I currently work for the Department of Education in Victoria and have previously worked with the World Bank...

Ben McNally
Manager, Evaluation and Research, Department of Education, Victoria
I have worked on evaluation and social research projects in consultancy and public sector settings. This has included evaluating reform programs in social services, employment, and school education. Talk to me about: evaluation practice in the Victorian Public Sector; in-house evaluation...
Friday September 20, 2024 10:30am - 11:00am AEST
101-102, 105 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Walking together: First Nations participation, partnerships and co-creation in evaluation
Friday September 20, 2024 10:30am - 11:30am AEST
106
Authors: Tony Kiessler (First Nations Connect), Alice Tamang (First Nations Connect, AU)

Effective First Nations engagement is integral to the design and delivery of culturally safe evaluations. The AES' First Nations Cultural Safety Framework discusses 10 principles for culturally safe evaluation and describes the journey of engagement. However, the question of how to engage effectively can be the first and most significant challenge faced by evaluators. There is little clarity on how to create opportunities for First Nations leadership and voices in our evaluations, how to engage appropriately, and who we should engage with. There is also the challenge of managing tight timeframes, client expectations and capabilities that can limit the focus on meaningful First Nations participation, partnership and co-creation.

This is a unique offering that enables practitioners and First Nations facilitators to walk together, explore shared challenges and identify opportunities to improve First Nations engagement. The session will explore the potential for partnerships in informing and implementing evaluations, opportunities to increase First Nations participation, privilege their experience and knowledge, and how evaluation practitioners can draw on these strengths through co-creation to amplify First Nations voices and leadership in evaluation practice.

This session aims to:
  • Explore a principles-based approach to First Nations engagement;
  • Discuss shared experiences on successful approaches to enhance First Nations partnership, participation and co-creation; and
  • Develop a shared understanding of how to take this knowledge forward through culturally safe evaluation commissioning, practice and reporting.

Discussion will draw on the collective experience of both the attendees and the facilitators, walking together. The sharing of ideas will be encouraged in a safe space that engages the audience in a collaborative dialogue with First Nations practitioners. This dialogue will explore current knowledge, capabilities and gaps, as well as the challenges (and how they can be overcome), as part of the broader journey to culturally safe evaluation practice.


Chair
Rachel George
Director, Research, Monitoring and Evaluation Practice, Tetra Tech International Development
Speakers
Tony Kiessler
Co-Convener, First Nations Connect
Tony is a Central Arrernte man, consultant and researcher living and working on beautiful Gundungurra Country in the NSW Southern Highlands. He is an evaluation, strategic planning and research consultant with a particular interest in health, human development and social inclusion...

Alice Tamang
Consultant, First Nations Connect
Alice is a Dharug woman based on Wurundjeri Country. She is a consultant and advisor, with a focus on facilitating connections between cultures, empowering individuals and communities to share knowledge and enhance cultural understanding. Alice primarily works on DFAT funded programs...

Nicole Tujague
Founder and Director, The Seedling Group
Nicole Tujague: Bachelor of Indigenous Studies (Trauma and Healing/Managing Organisations); 1st Class Honours, Indigenous Research; PhD in Indigenous-led Evaluation, Gnibi College, Southern Cross University. Nicole is a descendant of the Kabi Kabi nation from Mt Bauple, Queensland and the...
Friday September 20, 2024 10:30am - 11:30am AEST
106, 102 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

Reviewing and writing for the Evaluation Journal of Australasia
Friday September 20, 2024 10:30am - 11:30am AEST
103
Authors: John Guenther (Batchelor Institute of Indigenous Tertiary Education), Anthea Rutter (University of Melbourne, AU), Yvonne Zurynski (Macquarie University, AU)

The Evaluation Journal of Australasia (EJA) supports evaluators who wish to share their knowledge and practical experiences in a peer-reviewed article. Documenting evidence, including for programs which do not achieve expected results, is critical for improving evaluation practice, building the evidence base, and advancing evaluation methodologies that are rigorous and ethical.

The EJA depends on volunteer reviewers who can offer critical feedback on articles that are submitted. Reviewers help to improve the quality of manuscripts the Journal receives.

The focus of this presentation is on how to write a good review: how to be academically critical, while at the same time providing constructive feedback that will benefit authors and readers. The presenters will offer step-by-step advice on what to look for, how to judge the quality of a manuscript, and how to make constructive suggestions for authors to consider.

The presentation will also explain how reviewing fits within the publication process, from submission to production. It will be most helpful to potential authors and current and potential reviewers. Authors will learn how to prepare their articles so they receive a favourable review, and reviewers will receive clear guidance on presenting their review feedback to authors.
Chair
Kate O'Malley
Consultant
I provide targeted policy, advocacy and evaluation support on refugee and migration matters drawing on a lengthy career in the United Nations and the Australian Public Service and post-graduate studies in evaluation and diplomatic practice.
Speakers
John Guenther
Research Leader, Education and Training, Batchelor Institute of Indigenous Tertiary Education
John Guenther is a senior researcher and evaluator with the Batchelor Institute of Indigenous Tertiary Education, based in Darwin. Much of his work has been based in the field of education. He has worked extensively with community-based researchers in many remote parts of the Northern...

Anthea Rutter
Research Fellow, Centre for Program Evaluation, The University of Melbourne
Anthea Rutter is a Senior Research Fellow in the Assessment and Evaluation Research Centre (formerly the Centre for Program Evaluation) at The University of Melbourne. She has extensive experience working with a wide range of community, state and national organisations. She is particularly...

Jeff Adams
Managing Editor | Senior Lecturer, Evaluation Journal of Australasia | Eastern Institute of Technology
I am the Managing Editor of the Evaluation Journal of Australasia - talk to me about publishing in, or reviewing for, the journal. I also teach postgraduate Health Sciences at Eastern Institute of Technology, Auckland.
Friday September 20, 2024 10:30am - 11:30am AEST
103, 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Participatory Analysis Workshops: a novel method for identifying important factors across diverse projects
Friday September 20, 2024 11:00am - 11:30am AEST
104
Authors: Martina Donkers (Martina Donkers), Ellen Wong (ARTD, AU), Jade Maloney (ARTD, AU)

Some programs comprise a range of diverse projects striving towards a common goal - for example, grant programs, where a wide range of grantees carry out different projects with very different activities in pursuit of the grant program's objectives.

These can be a challenge to evaluate - with so many different activities, outputs cannot be easily aggregated, and each project may be responding to its local context in unique but important ways. Such programs need efficient ways to identify common factors affecting implementation and outcomes - ways that reflect the richness of the activities undertaken but do not place undue burden on organisations, particularly those receiving smaller grants.

We developed a novel method that uses participatory workshops to explore commonalities in implementation across projects and the various ways they seek to achieve common program outcomes. The theory-driven method builds on Most Significant Change and Qualitative Comparative Analysis, and combines data collection with collaborative participatory data analysis to build a rich qualitative understanding of projects in a relatively short timeframe and with fewer resources. Active participation from project leaders (e.g. grant recipients) builds cohesion across the program, and helps project leaders feel more connected and supported.
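As a rough illustration of the cross-case logic that Qualitative Comparative Analysis contributes (hypothetical projects and conditions, not the authors' data or tool), a crisp-set truth table takes only a few lines in Python:

import pandas as pd

# One row per project: binary implementation conditions and whether the
# common program outcome was observed (all values hypothetical).
projects = pd.DataFrame(
    [
        (1, 1, 0, 1),
        (1, 0, 1, 1),
        (0, 1, 1, 0),
        (1, 1, 1, 1),
        (0, 0, 1, 0),
    ],
    columns=["dedicated_staff", "local_partner", "flexible_funding", "outcome"],
)

# Group identical configurations and check how consistently each one
# co-occurs with the outcome - the core move of a QCA truth table.
truth_table = (
    projects
    .groupby(["dedicated_staff", "local_partner", "flexible_funding"])["outcome"]
    .agg(n_projects="count", consistency="mean")
)
print(truth_table)

Configurations that consistently co-occur with the outcome become candidate common factors, which the participatory workshops can then test and enrich with participants' contextual knowledge.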

This paper outlines the theory, approach and uses of Participatory Analysis Workshops, including their strengths and limitations and the types of data and insights the method can yield. We draw on our work with the NSW Reconstruction Authority, where we used the method to evaluate the COVID Community Connection and Wellbeing Program, to illustrate what we've learnt about how the method works and in what circumstances, and to identify other potential use cases. Participants will have an opportunity to ask questions to help inform future uses of the method. This information will equip evaluators with tools to navigate varying territory together to understand progress toward program outcomes.

Chair
Phillip Belling
Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of...
Speakers
Martina Donkers
Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured...

Ellen Wong
Consultant, ARTD Consultants
I'm a consultant at ARTD with a background in human geography and environmental studies. I bring this lens to the work I do and am particularly passionate about the intersection between people and the environment. My portfolio spans environmental policy, disaster recovery and community...

Jade Maloney
Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research...
Friday September 20, 2024 11:00am - 11:30am AEST
104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Value Propositions: Clearing the path from theory of change to rubrics
Friday September 20, 2024 11:00am - 12:30pm AEST
Authors: Julian King (Julian King & Associates Limited), Adrian Field (Dovetail Consulting Limited, NZ)

Evaluation rubrics are increasingly used to help make evaluative reasoning explicit. Rubrics can also be used as wayfinding tools to help stakeholders understand and participate meaningfully in evaluation. Developing rubrics is conceptually challenging work and the search is on for additional navigation tools and models that might help ease the cognitive load.

As a preliminary step toward rubric development, it is often helpful to co-create a theory of change, proposing a chain of causality from actions to impacts, documenting a shared understanding of a program, and providing a point of reference for scoping a logical, coherent set of criteria.

However, it's easy to become disoriented when getting from a theory of change to a set of criteria, because the former deals with impact and the latter with value. Implicitly, a theory of change may focus on activities and impacts that people value, but this cannot be taken for granted - and we argue that value should be made more explicit in program theories.

Specifying a program's value proposition can improve wayfinding between a theory of change and a set of criteria, addressing the aspects of performance and value that matter to stakeholders. Defining a value proposition prompts us to think differently about a program. For example, in addition to what's already in the theory of change, we need to consider to whom the program is valuable, in what ways it is valuable, and how the value is created.
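As a minimal sketch of that wayfinding step (the structure and names below are assumptions for illustration, not the presenters' framework), the to-whom, in-what-ways and how questions can be captured in a small data structure whose answers seed rubric criteria:

from dataclasses import dataclass

@dataclass
class ValueProposition:
    program: str
    to_whom: list[str]       # who the program is valuable to
    in_what_ways: list[str]  # the kinds of value created
    how_created: list[str]   # mechanisms linking activities to value

vp = ValueProposition(
    program="Community mentoring program",  # hypothetical example
    to_whom=["young people", "mentors", "local employers"],
    in_what_ways=["confidence", "job readiness", "social connection"],
    how_created=["trusted relationships", "structured goal setting"],
)

# Each valued aspect becomes a candidate rubric criterion, so every
# criterion can be traced back to who values it and how it arises.
criteria = {
    way: {
        "excellent": f"strong, well-evidenced gains in {way}",
        "adequate": f"some gains in {way}, unevenly distributed",
        "poor": f"little or no evidence of gains in {way}",
    }
    for way in vp.in_what_ways
}
for aspect, standards in criteria.items():
    print(aspect, "->", ", ".join(standards))

The point is traceability: each criterion addresses an aspect of value that stakeholders named, rather than only the impacts listed in the theory of change.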

In this presentation, we will share what we've learnt about developing and using value propositions. We'll share a simple framework for developing a value proposition and, using roving microphones, engage participants in co-developing a value proposition in real time. We'll conclude the session by sharing some examples of value propositions from recent evaluations.

Chair
Laura Bird
MERL Associate, Paul Ramsay Foundation
Speakers
Julian King
Director, Julian King & Associates
I'm an independent public policy consultant based in Auckland. I specialise in evaluation and Value for Investment. I'm affiliated with the Kinnect Group, Oxford Policy Management, the University of Melbourne and the Northern Institute. Subscribe to my weekly blog at https:/...

Adrian Field
Director, Dovetail
Adrian is the director of Dovetail, an Auckland-based evaluation consultancy, and a member of the Kinnect Group. Adrian has worked in evaluation in different capacities for some 20 years and doesn't really like how old that makes him feel. Adrian's experience traverses health, social...
Friday September 20, 2024 11:00am - 12:30pm AEST
Plenary 1, 114 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Fidelity to context: A realist perspective on implementation science
Friday September 20, 2024 1:30pm - 2:00pm AEST
105
Authors: Andrew McLachlan (NSW Department of Education)

At first glance, realist methodology seems ideally suited to investigating implementation problems (Dalkin et al., 2021). It is versatile in that it draws on theories from diverse fields of social inquiry. It is pragmatic in that the theories it adopts are deemed useful only in so far as they offer explanatory insight. And it is transferable: realist methodology is less concerned with generalising findings than with understanding how programs work under different conditions and circumstances.

As for implementation science, its founding aim is purpose-built for realist work; it seeks to improve the uptake of evidence-based practices by investigating the barriers and facilitators to implementation. Yet despite the affinity between realist methodology and implementation science, there have so far been few attempts to formalise the relationship (Sarkies et al., 2022).

This paper offers insights into how evaluators can harness realist methodology to better understand challenges of program implementation. It demonstrates how implementation concepts like fidelity (the degree to which a program is delivered as intended), adaptation (the process of modifying a program to achieve better fit), and translation (the ability to transfer knowledge across organisational borders) can be combined with realist concepts to develop a more active understanding of context.

In showing how to construct program theories that are responsive to changing conditions, the paper promises to equip evaluators with tools that can help them navigate the complexities of program implementation in their own work.



Speakers
Andrew McLachlan
Evaluation Lead - Strategy, NSW Department of Education
Andrew McLachlan is an Evaluation Lead for the NSW Department of Education. Before becoming an evaluator, Andrew had over 10 years of experience as a teacher, working in settings as diverse as far North Queensland and Bangladesh. Since 2021, Andrew has worked as an embedded evaluator...
Friday September 20, 2024 1:30pm - 2:00pm AEST
105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Program Evaluation Fundamentals in the NSW Department of Planning, Housing and Infrastructure: An eLearning course on evaluation
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Anabelle (Pin-Ju) Chen (NSW Department of Planning, Housing and Infrastructure)

Introducing Program Evaluation Fundamentals (PEF) in the NSW Department of Planning, Housing and Infrastructure, an eLearning course designed to facilitate a coherent journey of learning within the department. With learning and adapting together in mind, the design of PEF empowers individuals at all levels to navigate the fundamentals of evaluation. Through interactive modules, learners will understand key evaluation concepts and cultivate best practices. PEF promotes transformative growth by emphasising foundational evaluation knowledge. By leveraging PEF, we can shift our approach, embrace innovation, and advance the field of evaluation across the public sector, fostering a supportive community of forward-thinking evaluators.
Chair
Carina Calzoni
Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and...
Speakers
Anabelle (Pin-Ju) Chen
Senior Analyst, Evidence and Evaluation, NSW Department of Planning, Housing and Infrastructure
Anabelle (Pin-Ju) Chen is a distinguished senior analyst hailing from Taiwan, with a global perspective on evaluation, data analysis, and project management. Having studied in Taiwan, the United Kingdom, and Australia, Anabelle brings a diverse range of experiences and insights to...
Friday September 20, 2024 1:30pm - 2:30pm AEST
103, 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Putting values on the evaluation journey map
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Samantha Abbato (Visual Insights People)

Values guide all evaluation processes, methods and judgements. Although evaluators are often not aware of the values shaping their work and can't readily name them, they know when they are straying from their values path through the experience of conflict or unease. Reflecting on the evaluation literature and two decades of evaluation practice from a 'values' perspective, this presentation argues that there has never been a more important time to build values literacy. It demonstrates how values literacy can guide conversations with yourself, your team and others, and provide signposting and illumination for a more rigorous and ethical evaluation journey.
Chair
Carina Calzoni
Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and...
Speakers
Samantha Abbato
Director, Visual Insights People
My twenty-plus years of evaluation experience are built on academic training in qualitative and quantitative disciplines, including mathematics, health science, epidemiology, biostatistics, and medical anthropology. I am passionate about effective communication and evaluation capacity-building...
Friday September 20, 2024 1:30pm - 2:30pm AEST
103, 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Squaring up with rubrics
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Josh Duyker (Centre for Evaluation and Research Evidence, Victorian Department of Health)

Much like Felix the Cat, evaluators have a bag of tricks to get us out of sticky situations. But when you are staring face to face with a complex evaluand, juggling tricky stakeholders whilst sat on a mountain of data, it's not always clear what 'trick' you need! One twisted potential solution is the colourful, yet humble rubric. In this reflective practice ignite presentation, I will guide you through our journey of using rubrics as a tool to wayfind through an evaluation, and our key takeaways on how rubrics can support evaluators to make comprehensive and balanced evaluative judgements.
Chair
Carina Calzoni
Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and...
Speakers
Josh Duyker
Evaluation and Research Officer, Centre for Evaluation and Research Evidence
I am an emerging evaluator, currently working at the Centre for Evaluation and Research Evidence in the Victorian Department of Health. I've completed a Master of Public Health and am embarking on a Master of Evaluation. Through roles in the not-for-profit sector and my studies...
Friday September 20, 2024 1:30pm - 2:30pm AEST
103, 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Voices of the Future: Elevating First Nations Leadership in the Evolution of Educational Excellence
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Skye Trudgett (Kowa), Sharmay Brierley (Kowa, AU)

This ignite presentation will delve into a pioneering evaluation within the education sector, where a series of education initiatives were designed and implemented by Aboriginal Community Controlled Organisations (ACCOs) and mainstream education partners to uplift and support young First Nations peoples. We will uncover how the initiative's evaluation framework broke new ground by being constructed with First Nations communities at its heart, applying the reimagining evaluation framework, utilising diverse data collection methods and producing Community Reports that reflect First Nations experiences and voices.

Attendees will be guided through the evaluative journey, showcasing the incorporation of wisdom to demonstrate the profound value of community-delivered initiatives that contribute to change. The session will highlight the success stories and learnings, emphasising how this approach not only benefits the current generation but also lays the groundwork for the prosperity of future generations.
Chair
Carina Calzoni
Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and...
Speakers
Sharmay Brierley
Consultant, Kowa Collaboration
Sharmay is a proud Yuin woman and project lead at Kowa with prior experience supporting First Nations peoples across human services sectors. As a proud First Nations woman, and through lived experience, Sharmay has a strong understanding of the many challenges faced by First Nations...

Skye Trudgett
CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application...
Friday September 20, 2024 1:30pm - 2:30pm AEST
103, 110 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

In the spotlight: An arts industry-led approach to evaluation
Friday September 20, 2024 2:00pm - 2:30pm AEST
105
Authors: Kirstin Clements (Arts Centre Melbourne)

How does a creative institution develop an effective evaluation framework that honours the artistic process while meeting rigorous research standards?

At Arts Centre Melbourne we asked ourselves, 'what if?'... What if we integrated the economic story into a fit-for-purpose value proposition? What if we saw the emotive, subjective nature of the arts as an asset, rather than a challenge, in our evaluation design? What if we tried to embed systems thinking and extend our approach beyond individual projects?

Like many purpose-driven industries, the arts face an increasingly competitive funding landscape and heightened expectations from stakeholders for evidence-based reporting on the value generated by initiatives. Historically, in the arts such reporting has been responsive to external demands and formats. One of our core goals has been to equip the organisation with the capability and capacity to pro-actively drive its own public value narrative through a transparent, consistent approach.

In this presentation, we spotlight Arts Centre Melbourne's innovative approach to building appetite for evaluation and to designing a fit-for-purpose organisational impact model and evaluation function. We offer insights into the conceptual and methodological approaches we've adopted to achieve our objectives: supporting effective advocacy for the public value of the arts, enhancing accountability to stakeholders, and fostering a culture of continuous learning.

In sharing how we have creatively navigated challenges and opportunities at Arts Centre Melbourne, we aim to provide valuable advice and inspiration for evaluators and supporting professionals, particularly those working in sectors where evaluation is yet to be understood as 'business-as-usual' activity.

Speakers
Kirstin Clements
Partner, Impact and Evaluation, Arts Centre Melbourne
Friday September 20, 2024 2:00pm - 2:30pm AEST
105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Introducing a trauma informed AI assessment tool for evaluators of AI (artificial intelligence) assisted programs and services
Friday September 20, 2024 2:00pm - 2:30pm AEST
104
Authors: Lyndal Sleep (Central Queensland University)

AI and other forms of digital technology are increasingly being used in program and service delivery. They promise increased efficiency, accuracy and objectivity; however, these technologies can also cause significant harm and trauma, as seen in Robodebt. It is vital for evaluators to be aware of the key questions to ask to prevent AI from having unintended impacts on program processes, outputs and outcomes, or causing harm to service users.

Objective
This session aims to support evaluation of AI assisted programs and services by introducing evaluators to a new and innovative trauma informed AI assessment tool.

Core argument
(1) AI is increasingly being used in programs and services, and understanding the resulting risks is essential for evaluators to assess whether services are meeting intended outcomes.
(2) Many evaluators are unaware of what types of risks to look for when assessing AI assisted services, or what questions to ask - especially when conducting trauma informed evaluations.
(3) A practical trauma informed AI assessment tool has been developed by researchers from [Universities omitted], with funding from [omitted], to address this need, and will be briefly introduced.

A short paper session will:
(1) Highlight the problem that AI is increasingly being used to assist program and services delivery, but many evaluators are unaware of the main risks to consider when evaluating these services.
(2) Suggest the solution of a practical tool which considers these risks, with technological knowledge and within a trauma informed framework, that can be employed by evaluators.
(3) Introduce a trauma informed AI assessment tool, the method used to develop it, as well as its intended practical use by evaluators (both internal and external to organisations).

There will be 10 minutes for questions and discussion at the end of the presentation.

Chair
Kira Duggan
Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development...
Speakers
Friday September 20, 2024 2:00pm - 2:30pm AEST
104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia
 