Conference hashtag #aes24MEL
Arts and recreation services
Wednesday, September 18
 

11:00am AEST

Design-stage evaluative thinking: helping NGOs and grant makers learn to love evaluation from the start
Wednesday September 18, 2024 11:00am - 12:00pm AEST
103
Authors: Claire Grealy (Rooftop Social), Duncan Rintoul (Rooftop Social, AU), Virginia Poggio (Paul Ramsay Foundation, AU), Luciana Campello (NSW Department of Communities and Justice, AU), Kirsty Burow (NSW Department of Communities and Justice, AU), Jacqueline Webb (National Association for the Prevention of Child Abuse and Neglect (NAPCAN), AU)

The evaluation of grant programs has long frustrated grantees and perplexed fund managers.
Evaluators often arrive at the end, and may find a strong narrative about the funded activity (assuming the project staff are still in place) but less of the documentation and data that demonstrate impact or learning, or that link each project to the fund objectives.

Fund managers have often had to be content with the limited results available to them, sometimes as basic as acquittals on activity and expenditure. This limits funders' ability to capture learning, feed into new fund designs, mount budget bids, or tell a compelling story about the work grant holders are doing.

This panel brings together a cross-section of key players and beneficiaries from a variety of contexts:
* a state government fund manager in the human services sector
* an evaluation lead from a large national philanthropic organisation
* an experienced project manager from a national NGO that receives grants from various sources
* two evaluation specialists who have deep experience working in this space, developing and delivering this kind of support.

Drawing on case studies from practice, this panel will share some innovative approaches from their work, which bring the right mix of expectation and support to the design stage of grant-based projects, from the time of submitting an EOI through to the point of evaluation readiness.

The fruit that hangs off this tree includes:
* strengthening the 'evaluability' of each project and the overall fund
* testing each project's assumptions and ambitions
* deep conversations between grant makers and grant holders about outcome alignment
* building the evaluative thinking and capability of project teams and organisations, activating the 'ripple effect' as participants share their newfound commitment and skills with their colleagues.
"You couldn't drag me to a program logic workshop before this. And now look at me - I took that process you did with us and yesterday I ran it with my team on another project."
Chair

Christina Kadmos

Principal, Kalico Consulting
Speakers

Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health... Read More →

Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation... Read More →

Jacqueline Webb

Strategic Projects Manager, NAPCAN
As Strategic Projects Manager at NAPCAN, I am leading an important DCJ grant initiative aimed at enhancing NSW workforce capabilities to support children and young people affected by sexual violence. With guidance from Rooftop Social, we’ve adopted an innovative evaluation approach... Read More →

Virginia Poggio

MERL Associate, Paul Ramsay Foundation
As a Measurement, Evaluation, Research, and Learning (MERL) Associate at the Paul Ramsay Foundation, I lead teams to deliver evidence-based advice to inform the Foundation’s strategic initiatives. My role involves commissioning, supporting, and managing independent evaluations of... Read More →

Luciana Campello

Senior Policy and Projects Officer, NSW Department of Communities and Justice
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Evaluation Lab: Using design to solve evaluation challenges
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
Authors: Matt Healey (First Person Consulting)

The Design and Evaluation Special Interest Group (DESIG) was established in 2017. Its primary aim has been to explore the intersection of evaluation and design, and that aim has been interpreted in different ways over time. In 2024, the DESIG identified an opportunity to take the SIG model in a slightly different direction, embarking on an innovative venture with the launch of the Evaluation Lab, an initiative aimed at turning talk into action by taking evaluators through a design process to address evaluation challenges.
Drawing inspiration from the concept of 'living labs,' which serve as real-world testing grounds, the Evaluation Lab created a space where evaluation professionals could come together. Employing a design-thinking process, the Lab guided participants through a structured expedition of defining, ideating, and prototyping solutions to tackle nominated challenges. Participants also learned pitch skills to communicate their solutions.
This Big Room Session provides an opportunity for the DESIG to outline the Evaluation Lab model, capped off with participants presenting their solutions through rapid-fire pitches, either live or pre-recorded, akin to explorers sharing tales of new lands discovered. The session's innovative twist lies in the audience's role, acting as both audience and judges. The audience will vote on their favourite solution, and be involved in crowning the first AES Evaluation Lab winner.
By blending lecture-style content with dynamic team presentations and active audience engagement, the Big Room Session not only highlights the critical role of design in navigating evaluation challenges but also demonstrates the practical application of these methodologies in charting a course through real-world problems.

Chair

Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health... Read More →
Speakers

Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have:Co-founded and grown an organisation (First Person Consulting) to a team of 16 people working... Read More →

Shani Rajendra

Principal Consultant & Head of Business Group (Social Impact), Clear Horizon
Shani is a Principal Consultant in Clear Horizon’s Social Impact team. Shani has extensive experience in community-led initiatives, organisational strategy, and social enterprise. She specialises in incorporating design thinking into evaluative practice. Having completed a Master... Read More →
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Thursday, September 19
 

11:30am AEST

A new tool for participatory evaluation: A case study of the process of conducting online workshops with young creators with disabilities to tell stories using virtual reality animation
Thursday September 19, 2024 11:30am - 12:30pm AEST
103
Authors: Samantha Abbato (Visual Insights People), Lisa Stafford (University of Tasmania, AU)

Researchers from fields including public health, community, and disability have been utilising film methods such as participatory video and digital storytelling for decades. Co-creation of film narratives for evaluation can engage participants as unique people with lived experiences connected to social and cultural worlds and relationships, including their connection with the community. Evaluation, however, has been reluctant to adopt participatory film methods.

Virtual Reality (VR) animation presents a distinctly participatory approach to evaluation data collection and a new lens for communicating findings. It places the participant in the driver's seat and the evaluation audience in the passenger seat alongside them. Using VR stories can increase the potential for the intended audience, including decision-makers, to engage deeply with the information communicated, through focused immersion in participant stories set in familiar local settings.

We present a case study examining the process of collaborating with young people with disabilities to tell their stories of inclusion in Tasmania, Australia. Three young people participated in online storyboarding and script-writing workshops over twelve months to develop short stories of everyday experiences in their community. An introduction to the participants and their stories, the three completed stories, and a collaborative call to action were made into a set of five connected VR short films. The films were displayed as a report on a website page and for viewing as a complete VR story on a headset.

The presenters examine the process of applying this new VR digital storytelling approach to participatory evaluation. The challenges and benefits of the approach for participants and its impact on the intended audience, including urban planning and design students, are discussed. Using the lessons learned from the case study, recommendations for evaluators considering using participatory digital storytelling and VR animation are made.
Chair

Michael Amon

Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD... Read More →
Speakers

Samantha Abbato

Director, Visual Insights People
My twenty-plus years of evaluation experience are built on academic training in qualitative and quantitative disciplines, including mathematics, health science, epidemiology, biostatistics, and medical anthropology. I am passionate about effective communication and evaluation capacity-building... Read More →
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Friday, September 20
 

10:30am AEST

To tinker, tailor, or craft from scratch? The tension in using validated tools in novel contexts
Friday September 20, 2024 10:30am - 11:00am AEST
104
Authors: Matt Healey (First Person Consulting), Alicia McCoy (First Person Consulting, AU), Tenille Moselen (First Person Consulting, AU)

In the dynamic realm of evaluation methodology, the discourse surrounding the use of validated tools versus the customisation of evaluation metrics remains a contentious issue. This session aims to delve into the inherent risks associated with both approaches. The tension is often compounded when those in positions of power prefer validated tools over fit-for-context data collection questions or approaches, and it is only increasing at a time when evaluators of digital interventions may have no direct tool to draw upon, leaving them to navigate uncharted territory.

Moreover, there is an ever-increasing range of validated tools available, but little direction to assist evaluators, particularly emerging and early career evaluators, in deciding between them. This session presents experiences from a range of digital and in-person projects, and explores scenarios where there was no 'obvious solution'. It will be of particular relevance to those undertaking evaluations of digital and novel programs.

Through candid dialogue and shared anecdotes, participants will reflect on their experiences in navigating decisions to adopt, adapt, or reject validated tools, and the learning that resulted. Embracing controversy, this session encourages attendees to challenge conventional wisdom and critically examine the balance between the reliability of validated tools, the importance of fitting data collection to context, and most importantly what 'good' looks like.

Join the conversation as we navigate the complex landscape of evaluation methodology, exploring the tensions between established practices and the pursuit of innovation in evaluation processes.

Chair

Phillip Belling

Evaluation Capacity Building Lead, NSW Department of Education
Talk to me about evaluation transforming lives and enabling social change. Talk to me about realist, developmental, embedded, responsive evaluation in education systems in Australia and in Southeast Asia. Talk to me about using ECB to transform teaching practice and the impact of... Read More →
Speakers

Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have:Co-founded and grown an organisation (First Person Consulting) to a team of 16 people working... Read More →

Tenille Moselen

First Person Consulting
https://www.fpconsulting.com.au/our-team.html
Tenille has qualifications in public health, with experience in mental health and wellbeing, alcohol and drug and international development. Her passion is creating change through design and bringing stakeholders together to address complex... Read More →

Alicia McCoy

Principal Consultant, First Person Consulting
Alicia has 15 years of experience leading research and evaluation teams in the not-for-profit sector and is passionate about the role that research and evaluation plays in creating lasting change for individuals, families and communities. Alicia’s areas of interest include evaluation... Read More →
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

In the spotlight: An arts industry-led approach to evaluation
Friday September 20, 2024 2:00pm - 2:30pm AEST
105
Authors: Kirstin Clements (Arts Centre Melbourne)

How does a creative institution develop an effective evaluation framework that honours the artistic process while meeting rigorous research standards?

At Arts Centre Melbourne we asked ourselves, 'what if?'... What if we integrated the economic story into a fit-for-purpose value proposition? What if we see the emotive, subjective nature of the arts as an asset, rather than a challenge in our evaluation design? What if we tried to embed systems thinking and extend our approach beyond individual projects?

Like many purpose-driven industries, the arts face an increasingly competitive funding landscape and heightened expectations from stakeholders for evidence-based reporting on the value generated by initiatives. Historically, in the arts such reporting has been responsive to external demands and formats. One of our core goals has been to equip the organisation with the capability and capacity to pro-actively drive its own public value narrative through a transparent, consistent approach.

In this presentation, we spotlight Arts Centre Melbourne's innovative approach to building appetite for evaluation and to designing a fit-for-purpose organisational impact model and evaluation function. We offer insights into the conceptual and methodological approaches we've adopted to achieve our objectives: supporting effective advocacy for the public value of the arts, enhancing accountability to stakeholders, and fostering a culture of continuous learning.

In sharing how we have creatively navigated challenges and opportunities at Arts Centre Melbourne, we aim to provide valuable advice and inspiration for evaluators and supporting professionals, particularly those working in sectors where evaluation is yet to be understood as 'business-as-usual' activity.

Speakers

Kirstin Clements

Partner, Impact and Evaluation, Arts Centre Melbourne
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Introducing a trauma informed AI assessment tool for evaluators of AI (artificial intelligence) assisted programs and services.
Friday September 20, 2024 2:00pm - 2:30pm AEST
104
Authors: Lyndal Sleep (Central Queensland University)

AI and other forms of digital technology are increasingly being used in program and service delivery. They promise increased efficiency, accuracy and objectivity; however, these technologies can also cause significant harm and trauma, as seen in Robodebt. It is vital for evaluators to be aware of the key questions to ask to prevent AI from having unintended impacts on program processes, outputs and outcomes, or from causing harm to service users.

Objective
This session aims to support evaluation of AI assisted programs and services by introducing evaluators to a new and innovative trauma informed AI assessment tool.

Core argument
(1) AI is increasingly being used in programs and services, and understanding the resulting risks is essential for evaluators to assess whether services are meeting intended outcomes.
(2) Many evaluators are unaware of what types of risks to look for when assessing AI assisted services, or what questions to ask, especially when conducting trauma informed evaluations.
(3) A practical trauma informed AI assessment tool has been developed by researchers from [Universities omitted], with funding from [omitted], to address this need, and will be briefly introduced.

A short paper session will:
(1) Highlight the problem that AI is increasingly being used to assist program and services delivery, but many evaluators are unaware of the main risks to consider when evaluating these services.
(2) Suggest the solution of a practical tool which considers these risks, with technological knowledge and within a trauma informed framework, that can be employed by evaluators.
(3) Introduce a trauma informed AI assessment tool, the method used to develop it, as well as its intended practical use by evaluators (both internal and external to organisations).

There will be 10 minutes for questions and discussion at the end of the presentation.

Chair

Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development... Read More →
Speakers
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia
 