Conference hashtag #aes24MEL
Filter: Environment and climate change
Friday, September 20
 

10:30am AEST

Involving children and young people in evaluations: Equity through active participation
Friday September 20, 2024 10:30am - 11:00am AEST
Authors: Sharon Marra-Brown (ARTD Consultants), Moya Johansson (ARTD Consultants, AU)

Think it's important to enable children and young people to have a voice in evaluations, but find it challenging? This paper presents tried and tested strategies for ensuring ethical engagement with children and young people and encouraging meaningful participation.

Involving children and young people in evaluation is critical to ensure that evaluations accurately reflect their experiences and capture the outcomes they consider most important. Children and young people have the right to have a say about their experiences, and evaluations that exclude them risk perpetuating inequities.

However, involving children and young people in evaluations can prompt ethical concerns in relation to their comprehension of research, their capacity to provide consent, potential coercion by parents, and potentially conflicting values and interests between parents and children. Depending on the subject, it can also create concerns about safety and readiness.

Based on our experience successfully achieving ethics approval for multiple evaluations of services for children and young people across Australia, which include interviews with children and young people who have accessed these services, we will talk through considerations for ensuring the voice of children and young people in evaluation while safeguarding them from unnecessary risks.

We will then take you through how we've overcome challenges engaging children and young people in evaluations with innovative, youth-centred solutions, including carefully considering the language we use and how we reach out. We will demonstrate the developmental benefits of meaningful participation for children and young people once ethical considerations have been navigated.

Finally, we will take you through our tips for ensuring meaningful and safe engagement with children and young people, and point you in the direction of guidelines and practice guides for involving young people in research and evaluation in a safe and meaningful way.

The presenters are evaluators with extensive experience in designing, delivering and reporting on evaluations that include data collection with children and young people. This includes recently achieving ethics approval for, and commencing, interviews with children as young as seven who are accessing a suicide aftercare service.

While much attention is devoted to ensuring safe and inclusive data collection with various demographics, specific considerations for engaging children and young people remain relatively uncommon. Recognising the unique needs of this population, coupled with the understandably cautious stance of ethics committees, underscores the necessity for a thoughtful and deliberate approach to evaluations involving children and young people.

Given the additional complexities and ethical considerations involved, the default tendency can be to exclude children and young people from evaluation processes. However, it is important that children and young people are able to have a say in the programs, policies and services that they use. Participation in evaluations can be a positive experience, if risks are managed and the process is designed to be empowering.

This session will provide valuable insights, actionable strategies, and an opportunity for participants to reflect on their own practices, fostering a culture of inclusivity and responsiveness in evaluation.
Chair

Laura Bird

MERL Associate, Paul Ramsay Foundation
Speakers

Sharon Marra-Brown

Director, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence. Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade plus in the public service or how I weave...

Mitchell Rice-Brading

ARTD Consultants
I started with ARTD in early 2022 after completing my Bachelor of Psychological Science (Honours) in 2021. This, in combination with experience as a Psychology research assistant, helped me develop strong research skills, namely the ability to synthesise and critically evaluate qualitative...
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Embracing the L in "MEL": A Journey Towards Participatory Evaluation in Government Programs
Friday September 20, 2024 12:00pm - 12:30pm AEST
103
Authors: Milena Gongora (Great Barrier Reef Foundation)

Best practice in evaluation encompasses a crucial step of learning, yet it often receives inadequate emphasis, particularly within government-funded initiatives. Our paper documents the journey of transforming a top-down, prescriptive evaluation process within a government-funded program into an inclusive, consultative approach aligned with Monitoring, Evaluation, and Learning (MEL) principles.

Funded by the Australian Government and managed by the Great Barrier Reef Foundation, the Reef Trust Partnership (RTP) launched in 2018, aiming to enhance the resilience of the Great Barrier Reef. Within it, a $200 million portfolio aims to improve water quality by working with the agricultural industry. A framework for impact evaluation was developed in the partnership's early days. While appropriate, it was top-down in nature due to the need to comply with broader government requirements.

Four years into implementation, the Foundation was ready to synthesise, interpret and report on the program's impact. The Foundation could have simply reported "up" to government. However, we acknowledged that in doing so, we risked missing critical context, simplifying findings, misinterpreting information and presenting yet another tokenistic, meaningless report.

Interested in doing things better, we instead circled back with our stakeholders in a participatory reflection process. Through a series of carefully planned workshops, we invited on-ground program practitioners to ground-truth our findings, share contextual nuances, and collectively strategise for future improvements.

Despite initial reservations, participants embraced the opportunity, fostering an atmosphere of open dialogue and knowledge exchange. This reflective process not only enriched our understanding of program impact but also enhanced collaboration, strengthening overall program outcomes.

Our experience highlights the importance of transcending tokenistic evaluation practices, particularly in environments where top-down directives prevail. Participatory approaches can be implemented at any scale, contributing to a culture of continuous improvement and strategic learning, ultimately enhancing the impact and relevance of evaluation efforts.

Chair

Kate O'Malley

Consultant
I provide targeted policy, advocacy and evaluation support on refugee and migration matters drawing on a lengthy career in the United Nations and the Australian Public Service and post-graduate studies in evaluation and diplomatic practice.
Speakers

Milena Gongora

Associate Director - Water Quality, Great Barrier Reef Foundation
Milena’s area of interest is nature conservation. With over 14 years of experience, her work ranges from managing the Mekong River to enhancing the resilience of the Great Barrier Reef. Over most of this time, her roles have involved evaluating the success of conservation initiatives...
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

If the destination is improvement, recommendations are the signpost
Friday September 20, 2024 12:00pm - 12:30pm AEST
106
Authors: Laura Baker (ACIL Allen), Larissa Brisbane (Department of Climate Change, Energy, the Environment and Water NSW, AU)

Recommendations are the sharp end of evaluation, connecting evidence and insights to the improvement we aim to achieve. Many evaluation theories focus on framing and conducting evaluations, rather than developing recommendations or the associated organisational change required to complete the journey.

Recommendations point the way beyond an evaluation report, as the journey doesn't end when the report is produced. This presentation tells the story of recommendation wayfinding. We will share an evaluation practitioner and a commissioner's journey on navigating the challenge of developing actionable recommendations to promote impact beyond program close and into future decisions.

Evaluators need ways to integrate diverse evidence sources and generate actionable insights. The consultant will share perspectives on where these insights and the associated recommendations "come from": how different data come together to inform insights, the process for developing recommendations (balancing independence and engagement from commissioners), and how to design recommendations for the program and beyond.

Commissioners need recommendations that make sense in their context. The commissioners will share considerations in what makes a recommendation useful, and how we use this evaluation journey to leverage learning, skill building, and improvement opportunities. They will also discuss the evaluation audience and how ambitious recommendations can be.

This work over a number of years has helped build the evaluation knowledge base within our organisations. We will close with our recommendations to you - with the top ideas that we plan to take with us on our next evaluation journey.
The presenters have developed evaluations for multiple end users, each with their own needs. They'll share the research and engagement approaches and tools which have been useful in different situations, as well as what was useful specifically for this project.
Chair

Rachel George

Director, Research, Monitoring and Evaluation Practice, Tetra Tech International Development
Speakers

Larissa Brisbane

Team Leader, Strategic Evaluation, Dept of Climate Change, Energy, the Environment and Water NSW
It was a short step from studying environmental science, and working on cross-disciplinary problem-solving, to evaluation, where I still ask 'why' and 'how do you know that'. I love hearing stories of what you've done and learned, especially in energy, climate change, environment and...

Laura Baker

Principal, ACIL Allen
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Squaring up with rubrics
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Josh Duyker (Centre for Evaluation and Research Evidence, Victorian Department of Health)

Much like Felix the Cat, evaluators have a bag of tricks to get us out of sticky situations. But when you are staring face to face with a complex evaluand, juggling tricky stakeholders whilst sitting on a mountain of data, it's not always clear which 'trick' you need! One twisted potential solution is the colourful yet humble rubric. In this reflective practice ignite presentation, I will guide you through our journey of using rubrics as a tool to wayfind through an evaluation, and our key takeaways on how rubrics can support evaluators to make comprehensive and balanced evaluative judgements.
Chair

Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and...
Speakers

Josh Duyker

Evaluation and Research Officer, Centre for Evaluation and Research Evidence
I am an emerging evaluator, currently working at the Centre for Evaluation and Research Evidence in the Victorian Department of Health. I've completed a Master of Public Health and am embarking on a Master of Evaluation. Through roles in the not-for-profit sector and my studies...
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Introducing a trauma informed AI assessment tool for evaluators of AI (artificial intelligence) assisted programs and services
Friday September 20, 2024 2:00pm - 2:30pm AEST
104
Authors: Lyndal Sleep (Central Queensland University)

AI and other forms of digital technologies are increasingly being used in program and service delivery. They promise increased efficiency, accuracy and objectivity; however, these technologies can also cause significant harm and trauma, as seen in Robodebt. It is vital for evaluators to be aware of key questions to ask to prevent AI having unintended impacts on program processes, outputs and outcomes, or causing harm to service users.

Objective
This session aims to support evaluation of AI assisted programs and services by introducing evaluators to a new and innovative trauma informed AI assessment tool.

Core argument
(1) AI is increasingly being used in programs and services, and understanding the resulting risks is essential for evaluators to assess whether services are meeting intended outcomes.
(2) Many evaluators are unaware of what types of risks to look for when assessing AI assisted services, or what questions to ask - especially when conducting trauma informed evaluations.
(3) A practical trauma informed AI assessment tool has been developed by researchers from [Universities omitted], with funding from [omitted], to address this need, and will be briefly introduced.

A short paper session will:
(1) Highlight the problem that AI is increasingly being used to assist program and services delivery, but many evaluators are unaware of the main risks to consider when evaluating these services.
(2) Suggest the solution of a practical tool which considers these risks, with technological knowledge and within a trauma informed framework, that can be employed by evaluators.
(3) Introduce a trauma informed AI assessment tool, the method used to develop it, as well as its intended practical use by evaluators (both internal and external to organisations).

There will be 10 minutes for questions and discussion at the end of the presentation.

Chair

Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development...
Speakers
Friday September 20, 2024 2:00pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia
 