Conference hashtag #aes24MEL
Filtered by: NFP or Philanthropy
Friday, September 20
 

10:30am AEST

To tinker, tailor, or craft from scratch? The tension in using validated tools in novel contexts
Friday September 20, 2024 10:30am - 11:00am AEST
104
Authors: Matt Healey (First Person Consulting), Alicia McCoy (First Person Consulting, AU), Tenille Moselen (First Person Consulting, AU)

In the dynamic realm of evaluation methodology, the debate over using validated tools versus customising evaluation metrics remains contentious. This session delves into the risks inherent in both approaches. These risks are often compounded when those in positions of power prefer validated tools over context-specific data collection questions or approaches. The resulting tension is only increasing as evaluators assess digital interventions for which no direct tool exists, leaving them to navigate uncharted territory.

Moreover, there is an ever-increasing range of validated tools available, but little guidance for evaluators - particularly emerging and early-career evaluators - on how to choose between them. This session presents experiences from a range of digital and in-person projects, exploring scenarios where there was no 'obvious solution'. It will be of particular relevance to those evaluating digital and novel programs.

Through candid dialogue and shared anecdotes, participants will reflect on their experiences in navigating decisions to adopt, adapt, or reject validated tools, and the learning that resulted. Embracing controversy, this session encourages attendees to challenge conventional wisdom and critically examine the balance between the reliability of validated tools, the importance of fitting data collection to context, and most importantly what 'good' looks like.

Join the conversation as we navigate the complex landscape of evaluation methodology, exploring the tensions between established practices and the pursuit of innovation in evaluation processes.

Chair
Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of…
Speakers
Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have: co-founded and grown an organisation (First Person Consulting) to a team of 16 people working…
Tenille Moselen

First Person Consulting
https://www.fpconsulting.com.au/our-team.html
Tenille has qualifications in public health, with experience in mental health and wellbeing, alcohol and other drugs, and international development. Her passion is creating change through design and bringing stakeholders together to address complex…
Alicia McCoy

Principal Consultant, First Person Consulting
Alicia has 15 years of experience leading research and evaluation teams in the not-for-profit sector and is passionate about the role that research and evaluation plays in creating lasting change for individuals, families and communities. Alicia’s areas of interest include evaluation…
Friday September 20, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

The Road Home - an evaluation journey to discover and demonstrate a new and exciting way to deliver a crisis housing response.
Friday September 20, 2024 10:30am - 11:30am AEST
105
Authors: Anne Smyth (LDC Group), Lesley Thornton (LDC Group, AU), Kym Coupe (First Step, AU), Caroline Lynch (Launch Housing, AU)

As all Wayfinders would understand, when we embarked on a developmental evaluation of the Road Home, we really had no idea how the program or evaluation would play out in practice. We did know however that the usual way of delivering crisis housing services was not working well for either clients or staff. Something needed to change. We needed to change. So, we did - we being the Road Home team working with the evaluators.

Road Home centres on a strong and engaged multidisciplinary team to deliver mental health, medical, legal and housing services to people in crisis accommodation, where they are, and when they need it the most. This integrated way of working is in stark contrast to the conventional, single discipline outreach and in-reach approaches that characterise service delivery in the community sector - its impact has been significant.

This panel will bring leading representatives of the Road Home team and the evaluators together to explore with our audience what we have learned; what it takes to do this well, the benefits to clients, staff and participating organisations, the pitfalls and challenges and the value of developmental evaluation and its methods.

We now have a much better idea of what Road Home looks like, what it takes to support and enable it, to achieve valued outcomes and to meaningfully evaluate it. The role of evaluators and the project manager in holding the uncertain and evolving space characteristic of developmental evaluation and wayfinding is central - it has taken clarity, alignment of purpose, a lot of patience and much persistence, not to mention flexibility. It has been and remains quite the journey!
Chair Speakers
Anne Smyth

Principal Consultant, LDC Group
I have extensive experience in working with the community and not-for-profit sectors. I am able to draw on 40 years of experience as an educator and researcher in university leadership and management development programs and as a consultant in the fields of organisational change and…
Lesley Thornton

Principal Consultant, LDC Group
As an evaluator and organisational development consultant, I have extensive experience in government and not-for-profit sectors working in areas of policy and service development, evaluation, leadership and organisational development. Drawing on the theory and practice across these…
Kym Coupe

Project Manager, First Step
Kym is project lead for the collaborative partnership between First Step and Launch Housing and is passionate about the benefits – to both staff and consumers – of collaborative and integrated service delivery. Kym has a Masters of Public Health and has worked in health and community…
Caroline Lynch

Service Manager - Women Services, Launch Housing
I am a trauma-informed, feminist leader who believes in using my influence for a more inclusive and equitable society. I oversee the four programs within the Women Services function at Launch Housing. This includes the Women’s Only Crisis Accommodation, the Transitional Housing…
Friday September 20, 2024 10:30am - 11:30am AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Participatory Analysis Workshops: a novel method for identifying important factors across diverse projects
Friday September 20, 2024 11:00am - 11:30am AEST
104
Authors: Martina Donkers (Martina Donkers), Ellen Wong (ARTD, AU), Jade Maloney (ARTD, AU)

Some programs comprise a range of diverse projects striving towards a common goal - for example grant programs, where a wide range of different grantees are carrying out different projects with very different activities in pursuit of the grant program objectives.

These can be a challenge to evaluate - with so many different activities, outputs cannot be easily aggregated, and each project may be responding to its local context in unique but important ways. These programs need efficient ways to identify common factors affecting implementation and outcomes that reflect the richness of the activities undertaken, but do not place undue burden on organisations, particularly those receiving smaller grants.

We developed a novel method that uses participatory workshops to explore commonalities in implementation across projects and the various ways they seek to achieve common program outcomes. The theory-driven method builds on Most Significant Change and Qualitative Comparative Analysis, and combines data collection with collaborative participatory data analysis to build a rich qualitative understanding of projects in a relatively short timeframe with fewer resources. Active participation from project leaders (e.g. grant recipients) builds cohesion across the program, and helps project leaders feel more connected and supported.

This paper outlines the theory, approach and uses of Participatory Analysis Workshops, including strengths and limitations, and the types of data and insights the method can yield. We use our work with the NSW Reconstruction Authority, where we applied this method to evaluate the Covid Community Connection and Wellbeing Program, to illustrate what we've learnt about how the method works and in what circumstances, and then identify other potential use cases. Participants will have an opportunity to ask questions to help inform future uses of this method. This information will equip evaluators with tools to navigate varying territory together to understand progress toward program outcomes.

Chair
Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Speakers
Martina Donkers

Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed-methods evaluation with a complexity and systems lens. I like rubrics, semi-structured…
Ellen Wong

Consultant, ARTD Consultants
I'm a consultant at ARTD with a background in human geography and environmental studies. I bring this lens to the work I do and am particularly passionate about the intersection between people and the environment. My portfolio spans environmental policy, disaster recovery and community…
Jade Maloney

Partner & CEO, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research…
Friday September 20, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Our five guiding waypoints: Y Victoria's journey and learning from applying organisation-wide social impact measurement
Friday September 20, 2024 11:30am - 12:00pm AEST
103
Authors: Caitlin Barry (Y Victoria), Eugene Liston (Clear Horizon Consulting, AU)

The demand for organisations to measure impact seems ever-increasing. However, impact measurement looks different depending on the level at which it occurs (program level, organisation-wide, ecosystem level, etc.). While many organisations focus on measuring social impact at the program level, the jump to effective measurement of impact at an organisation-wide level appears to be achieved far less often.

The literature providing guidance on how to implement org-wide social impact measurement makes it seem so straight-forward, like a Roman highway - all straight lines. But what is it really like in practice? How does it differ from program-level impact measurement? How can it be done? What resources does it take? And, what are the pitfalls?

The Y Victoria has spent the last three years on a journey to embed org-wide social impact measurement under the guidance of our evaluation partner. The Y Victoria is a large and diverse organisation covering 7 different sectors/service lines; over 5,500 staff; over 180 centres; and delivering services to all ages of the community. This presented quite a challenge for measuring organisation-wide impact in a meaningful way.

While the journey wasn't 'straight-forward', we've learnt a lot from navigating through it. This presentation will discuss the approach taken, tell the story of the challenges faced, trade-offs, lessons learnt (both from the client and consultant's perspective), and how we have adapted along the way.

Chair
Kate O'Malley

Consultant
I provide targeted policy, advocacy and evaluation support on refugee and migration matters drawing on a lengthy career in the United Nations and the Australian Public Service and post-graduate studies in evaluation and diplomatic practice.
Speakers
Jess Boyden

Senior Social Impact Manager - Recreation, YMCA Victoria
Hello! I'm Jess and I bring 20 years of experience in program design, strategy and social impact measurement within international aid and local community development settings. I specialise in creating practical and meaningful approaches to measuring social impact, using the power…
Caitlin Barry

Principal Consultant, Caitlin Barry Consulting
Caitlin has extensive experience in monitoring and evaluation and holds a Masters of Evaluation (First Class Honours) from the University of Melbourne and an Environmental Science degree (Honours) from James Cook University. The focus of Caitlin's presentation will be from her work…
Friday September 20, 2024 11:30am - 12:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Gamified, flexible, and creative tools for evaluating a support program for palliative children and their families
Friday September 20, 2024 11:30am - 12:00pm AEST
104
Authors: Claire Treadgold (Starlight Children's Foundation Australia), Erika Fortunati (Starlight Children's Foundation, AU)

Our program creates personalised experiences of fun, joy, and happiness for families with a palliative child, aiming to foster family connections and celebrate the simple joys of childhood during this challenging circumstance. Evaluating the program is of utmost importance to ensure that it meets the needs of the families involved. Equally, due to the program's sensitivity and deeply personal nature, a low-pressure, flexible evaluation approach is necessary.
In our session, we will showcase our response to this need and share highly engaging, low-burden tools for gathering participant feedback that leverage gamification and accessibility to boost response rates and reduce participant burden. In particular, we will focus on our innovative “activity book”, which evaluates the program through artistic expression. By emphasising creativity and flexibility, our tools aim to enrich the evaluation process and respect the diverse preferences and abilities of the participating families.
The core argument will focus on our innovative evaluation methodology, how it aligns with best practices in the literature, and our key learnings. Key points include the considerations needed for evaluating programs involving palliative children, empowering children and young people through their active involvement in the evaluation process, and how gamification and creativity boost participation and engagement.
Outline of the session:
  • Introduction to the palliative care program and the need for flexible, creative, and respectful evaluation methods
  • What the literature tells us about evaluation methods for programs involving palliative children and their families
  • A presentation of our evaluation protocol
  • Case studies illustrating the feedback collected and its impact
  • Our learnings and their implications for theory and practice
Chair
Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Speakers
Erika Fortunati

Research and Evaluation Manager, Starlight Children's Foundation Australia
Erika is the Research and Evaluation Manager at Starlight Children's Foundation, an Australian not-for-profit organisation dedicated to brightening the lives of seriously ill children. In her current role, Erika manages research projects and program evaluations to ensure that programs…
Friday September 20, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Embracing the L in "MEL": A Journey Towards Participatory Evaluation in Government Programs
Friday September 20, 2024 12:00pm - 12:30pm AEST
103
Authors: Milena Gongora (Great Barrier Reef Foundation)

Best practice in evaluation encompasses a crucial step of learning, yet it often receives inadequate emphasis, particularly within government-funded initiatives. Our paper documents the journey of transforming a top-down, prescriptive evaluation process within a government-funded program into an inclusive, consultative approach aligned with Monitoring, Evaluation, and Learning (MEL) principles.

Funded by the Australian Government and managed by the Great Barrier Reef Foundation, the Reef Trust Partnership (RTP) launched in 2018 with the aim of enhancing the resilience of the Great Barrier Reef. Within it, a $200 million portfolio aims to improve water quality by working with the agricultural industry. A framework for impact evaluation was developed in the Partnership's early days. Whilst appropriate, it was top-down in nature due to the need to comply with broader government requirements.

Four years into implementation, the Foundation was ready to synthesise, interpret and report on the program's impact. The Foundation could have simply reported "up" to government. However, we acknowledged that in doing so, we risked missing critical context, simplifying findings, misinterpreting information and presenting yet another tokenistic, meaningless report.

Interested in doing things better, we instead circled back with our stakeholders in a participatory reflection process. Through a series of carefully planned workshops, we invited on-ground program practitioners to ground-truth our findings, share contextual nuances, and collectively strategise for future improvements.

Despite initial reservations, participants embraced the opportunity, fostering an atmosphere of open dialogue and knowledge exchange. This reflective process not only enriched our understanding of program impact but also enhanced collaboration, strengthening overall program outcomes.

Our experience highlights the importance of transcending tokenistic evaluation practices, particularly in environments where top-down directives prevail. Participatory approaches can be implemented at any scale, contributing to a culture of continuous improvement and strategic learning, ultimately enhancing the impact and relevance of evaluation efforts.

Chair
Kate O'Malley

Consultant
Speakers
Milena Gongora

Associate Director - Water Quality, Great Barrier Reef Foundation
Milena’s area of interest is nature conservation. With over 14 years of experience, her work ranges from managing the Mekong River to enhancing the resilience of the Great Barrier Reef. Over most of this time, her roles have involved evaluating the success of conservation initiatives…
Friday September 20, 2024 12:00pm - 12:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Finding ways to empower multicultural survivors of violence through evaluation: strategies, learnings and reflections
Friday September 20, 2024 1:30pm - 2:00pm AEST
104
Authors: Lydia Phillips (Lydia Phillips Consulting), Jo Farmer (Jo Farmer Consulting)

As evaluators, we often work with people who have experienced trauma and/or marginalisation (whether we realise or not!). We're also seeing increased recognition in government and community organisations of the importance of lived experience and cultural safety in program design, implementation and evaluation.

Beginning an evaluation with a clear plan for how you'll engage and empower people from diverse cultural backgrounds and people who have experienced trauma can help to ensure success - of your project and of participants' experience.

So how can you design an evaluation framework to recognise diverse cultural backgrounds and empower survivors of violence?

And how can evaluators who don't have lived experience, or who don't identify with those cultural backgrounds, best navigate the design process?

This session will share strategies, learnings and reflections from a project working with a multicultural family violence service to develop a culturally-safe, trauma-informed evaluation framework for a two-year program.

It will:
  • explore what worked well and what was challenging in the project
  • discuss similarities and differences in the concepts of culturally-safe and trauma-informed practice, drawing on current literature; and
  • pose questions and provide suggestions for evaluators who want to develop their skills in culturally safe and trauma-informed evaluation practice.

The session will offer key tips and strategies that are translatable to other contexts and conclude with reflective questions for attendees.

Chair
Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development…
Speakers
Lydia Phillips

Principal Consultant, Lydia Phillips Consulting
I operate an independent consulting practice, providing evaluation and social policy services to community organisations and government. With a background in law and social policy, I have more than 15 years' experience building and using evidence in order to create positive social…
Friday September 20, 2024 1:30pm - 2:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Measuring Impact Through Storytelling: using Most Significant Change to evaluate the effectiveness of QHub for LGBTIQA+ young people.
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Gina Mancuso (Drummond Street Services), Arielle Donnelly (Drummond Street Services, AU)

LGBTIQA+ young people experience discrimination and marginalisation which contribute to poorer mental and physical health outcomes, compared to the general population. QHub is an initiative that creates safe spaces, offers mental health and well-being services, and provides outreach tailored for LGBTIQA+ young people in Western Victoria and the Surf Coast. QHub provides LGBTIQA+ young people and their families/carers with welcoming, inclusive and integrated support, as well as opportunities to connect with peers and older role models. This presentation will outline how the collection and selection of stories of change (Most Significant Change) is helping us evaluate the impact of QHub.
Chair
Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and…
Speakers
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

A practical approach to designing and implementing outcome measures in psychosocial support services.
Friday September 20, 2024 2:00pm - 2:30pm AEST
Authors: Lauren Gibson (Mind Australia), Dr. Edith Botchway (Mind Australia, AU), Dr. Laura Hayes (Mind Australia, AU)

Outcome measurement in mental health services is recommended as best practice and provides an opportunity for clients and staff to track progress and navigate the complex road to recovery together. However, there are many barriers to embedding outcome measures in mental health services, including time constraints, low perceived value among staff and clients, and a lack of regular feedback on outcomes. To overcome these challenges, a national not-for-profit provider of residential and non-residential psychosocial support services created an innovative approach to designing and implementing outcome measures. The objective of our presentation is to describe this approach, which has resulted in average outcome measure completion rates of over 80% across 73 services in Australia.

Design
We believe the key to achieving these completion rates is understanding the needs of outcome measure end-users, including clients, carers, service providers, centralised support teams, and funding bodies. In this presentation we will share how we:
  • "Begin with the end in mind" through working with stakeholders to create user personas and program logics to identify meaningful outcomes and survey instruments.
  • Design easy to use digital tools to record quality data and provide stakeholders with dashboards to review their outcomes in real time through visualising data at an individual client level, and service level.

Implementation
Also key to embedding outcome measures is having a structured, multi-stage approach for implementation, with tailored support provided to:
  • Prepare services (e.g., Training)
  • Install and embed outcome measures in routine practice (e.g., Service champions)
  • Maintain fidelity over time (e.g., Performance monitoring)

The presentation will highlight the salient barriers and enablers identified during each design and implementation stage.

Overall, the presentation will provide a practical example of how to design and implement outcome measures in mental health services to ensure they are adding value for relevant stakeholders and enabling efficient and meaningful evaluation.

Chair
James Copestake

Professor, International Development, University of Bath, UK
James Copestake is Professor of International Development at the University of Bath in the UK, where he is also Director of Studies for the Doctorate in Policy Research and Practice at the Institute of Policy Research. His publications range broadly across international development…
Speakers
Lauren Gibson

Researcher, Mind Australia
Dr. Lauren Gibson’s research focuses on understanding the prevalence and impact of initiatives aimed at improving physical and mental health outcomes among mental health service users. She has been a researcher within the Research and Evaluation team at Mind Australia for over two…
Friday September 20, 2024 2:00pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

In the spotlight: An arts industry-led approach to evaluation
Friday September 20, 2024 2:00pm - 2:30pm AEST
105
Authors: Kirstin Clements (Arts Centre Melbourne)

How does a creative institution develop an effective evaluation framework that honours the artistic process while meeting rigorous research standards?

At Arts Centre Melbourne we asked ourselves, 'what if?'... What if we integrated the economic story into a fit-for-purpose value proposition? What if we see the emotive, subjective nature of the arts as an asset, rather than a challenge in our evaluation design? What if we tried to embed systems thinking and extend our approach beyond individual projects?

Like many purpose-driven industries, the arts face an increasingly competitive funding landscape and heightened expectations from stakeholders for evidence-based reporting on the value generated by initiatives. Historically, in the arts such reporting has been responsive to external demands and formats. One of our core goals has been to equip the organisation with the capability and capacity to pro-actively drive its own public value narrative through a transparent, consistent approach.

In this presentation, we spotlight Arts Centre Melbourne's innovative approach to building appetite for evaluation and to designing a fit-for-purpose organisational impact model and evaluation function. We offer insights into the conceptual and methodological approaches we've adopted to achieve our objectives: supporting effective advocacy for the public value of the arts, enhancing accountability to stakeholders, and fostering a culture of continuous learning.

In sharing how we have creatively navigated challenges and opportunities at Arts Centre Melbourne, we aim to provide valuable advice and inspiration for evaluators and supporting professionals, particularly those working in sectors where evaluation is yet to be understood as 'business-as-usual' activity.

Chair Speakers
Kirstin Clements

Partner, Impact and Evaluation, Arts Centre Melbourne
Friday September 20, 2024 2:00pm - 2:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia
 