Conference hashtag #aes24MEL
Environment and climate change
Wednesday, September 18
 

11:00am AEST

Design-stage evaluative thinking: helping NGOs and grant makers learn to love evaluation from the start
Wednesday September 18, 2024 11:00am - 12:00pm AEST
103
Authors: Claire Grealy (Rooftop Social), Duncan Rintoul (Rooftop Social, AU), Virginia Poggio (Paul Ramsay Foundation, AU), Luciana Campello (NSW Department of Communities and Justice, AU), Kirsty Burow (NSW Department of Communities and Justice, AU), Jacqueline Webb (National Association for the Prevention of Child Abuse and Neglect (NAPCAN), AU)

The evaluation of grant programs has long frustrated grantees and perplexed fund managers.
Evaluators often arrive at the end, and may find a strong narrative about the funded activity (assuming the project staff are still in place) but less of the documentation and data that demonstrates the impact or learning, or shows the link between each project and the fund objectives.

Fund managers have often had to be content with the limited results available to them, sometimes as basic as acquittals on activity and expenditure. This limits funders' ability to capture learning, feed into new fund designs, mount budget bids, or tell a compelling story about the work grant holders are doing.

This panel brings together a cross-section of key players and beneficiaries from a variety of contexts:
* a state government fund manager in the human services sector
* an evaluation lead from a large national philanthropic organisation
* an experienced project manager from a national NGO that receives grants from various sources
* two evaluation specialists who have deep experience working in this space, developing and delivering this kind of support.

Drawing on case studies from practice, this panel will share some innovative approaches from their work, which bring the right mix of expectation and support to the design stage of grant-based projects, from the time of submitting an EOI through to the point of evaluation readiness.

The fruit that hangs off this tree includes:
* strengthening the 'evaluability' of each project and the overall fund
* testing each project's assumptions and ambitions
* deep conversations between grant makers and grant holders about outcome alignment
* building the evaluative thinking and capability of project teams and organisations, activating the 'ripple effect' as participants share their newfound commitment and skills with their colleagues.
"You couldn't drag me to program logic workshop before this. And now look at me - I took that process you did with us and yesterday I ran it with my team on another project."
Chair

Christina Kadmos

Principal, Kalico Consulting
Speakers
Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation...
Jacqueline Webb

Strategic Projects Manager, NAPCAN
As Strategic Projects Manager at NAPCAN, I am leading an important DCJ grant initiative aimed at enhancing NSW workforce capabilities to support children and young people affected by sexual violence. With guidance from Rooftop Social, we’ve adopted an innovative evaluation approach...
Virginia Poggio

MERL Associate, Paul Ramsay Foundation
As a Measurement, Evaluation, Research, and Learning (MERL) Associate at the Paul Ramsay Foundation, I lead teams to deliver evidence-based advice to inform the Foundation’s strategic initiatives. My role involves commissioning, supporting, and managing independent evaluations of...

Luciana Campello

Senior Policy and Projects Officer, NSW Department of Communities and Justice
Wednesday September 18, 2024 11:00am - 12:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Learning from failure at a NFP - pitfalls and pointers
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Victoria Pilbeam (WWF-Australia)

Across social and environmental movements, we are often reticent to talk about failure. But as innovation and learning gain greater emphasis across the sector, not-for-profits are finding new ways to share and learn from their failures (e.g. Engineers Without Borders failure reports, Save the Children Fail Fest). In this presentation, I will share insights from the available research and reflect on my own journey developing failure programming at WWF-Australia. This presentation will provide practical guidance to evaluators and organisations navigating the challenging terrain of learning from failure.
Chair

Speakers
Victoria Pilbeam

MEL Adviser, The Pacific Community (SPC)
At the Pacific Community, I support MEL for fisheries, aquaculture and marine ecosystems across the region. Previously, I worked for WWF-Australia and in consulting with a range of not-for-profit, government, and philanthropic partners. I love MEL that is approachable, equitable...
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Evaluation Lab: Using design to solve evaluation challenges
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
Authors: Matt Healey (First Person Consulting)

The Design and Evaluation Special Interest Group (DESIG) was established in 2017. Its primary aim has been to explore the intersection of evaluation and design, and that aim has been interpreted in different ways over time. In 2024, the DESIG identified an opportunity to take the SIG model in a slightly different direction, embarking on an innovative venture with the launch of the Evaluation Lab, an initiative aimed at turning talk into action by taking evaluators through a design process to address evaluation challenges.
Drawing inspiration from the concept of 'living labs,' which serve as real-world testing grounds, the Evaluation Lab created a space where evaluation professionals could come together. Employing a design-thinking process, the Lab guided participants through a structured expedition of defining, ideating, and prototyping solutions to tackle nominated challenges. Participants also learned pitch skills to communicate their solutions.
This Big Room Session provides an opportunity for the DESIG to outline the Evaluation Lab model, capped off with participants presenting their solutions through rapid-fire pitches, either live or pre-recorded, akin to explorers sharing tales of new lands discovered. The session's innovative twist lies in the audience's role, acting as both audience and judges. The audience will vote on their favourite solution and be involved in crowning the first AES Evaluation Lab winner.
By blending lecture-style content with dynamic team presentations and active audience engagement, the Big Room Session not only highlights the critical role of design in navigating evaluation challenges but also demonstrates the practical application of these methodologies in charting a course through real-world problems.

Chair
Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
Speakers
Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have: co-founded and grown an organisation (First Person Consulting) to a team of 16 people working...
Shani Rajendra

Principal Consultant & Head of Business Group (Social Impact), Clear Horizon
Shani is a Principal Consultant in Clear Horizon’s Social Impact team. Shani has extensive experience in community-led initiatives, organisational strategy, and social enterprise. She specialises in incorporating design thinking into evaluative practice. Having completed a Master...
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Navigating the choppy waters of the evaluation landscape in the Pacific
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
106
Authors: Allan Mua Illingworth (Mua'akia Consulting and Insight Pasifika), Fiona Fandim (Pacific Community (SPC), FJ), Eroni Wavu (MEL Officer for Pacific Women Lead at Pacific Community (SPC) and cofounder of the Fiji Monitoring, Evaluation & Learning Community), Mereani Rokotuibau (Balance of Power Program, FJ) and Chris Roche (La Trobe University)

In recent years there have been a number of Pacific-driven initiatives designed to promote monitoring and evaluation practice that is culturally and contextually appropriate. These have occurred within projects and programs as well as at national and regional levels. At the same time, geo-political interest in the Pacific region has resulted in an increased number of bilateral and multilateral donor agencies becoming present in the region and/or funding development programs, local organisations, national governments and regional bodies. This has in turn led to an evaluation landscape where notions of 'international best practice', as well as donor policies and practices and the associated international research and consulting companies, risk crowding out emergent Pacific-led evaluation initiatives.

This panel will bring together key participants who are leading four examples of these Pacific experiences: the Rebbilib process initiated by the Pacific Community (SPC); Insight Pasifika (an emerging Pacific-led and owned collective focused on evaluation in the first instance); the Fiji Monitoring, Evaluation & Learning Community; and the Balance of Power program (a Pacific-led initiative, supported by the Australian Government, focused on improving the political, social and economic opportunities for women and girls). Each of these is seeking to create space for processes of monitoring, evaluation and learning that are consistent with Pacific ways of knowing and being. The panellists will share their experience, the challenges they face, and ideas about what forms of support from international donors, consultants and advisors would be enabling rather than undermining.

Moderated by Prof. Chris Roche, the panel and audience will also draw out lessons from these four cases about what might contribute to more systemic change in the evaluation landscape more generally.
Chair
Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Speakers
Allan Mua Illingworth

Adjunct Research Fellow, La Trobe University
Allan Mua Illingworth is a Monitoring and Evaluation specialist of Pacific Island heritage with a long career of international development experience and an extensive network of contacts who have worked to support development regionally and across many Pacific Island countries over...
Chris Roche

Professor of Development Practice, La Trobe University
I am Professor of Development Practice with the Centre for Human Security and Social Change at La Trobe University (https://www.latrobe.edu.au/socialchange) and former Deputy Director of the Developmental Leadership Program (www.dlprog.org) and member of the intellectual leadership...
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Thursday, September 19
 

10:30am AEST

Evaluating capacity building for sustainability scientists: Pathways for early career researchers
Thursday September 19, 2024 10:30am - 11:00am AEST
104

Authors: Lisa Walker (CSIRO)

In the quest for sustainable solutions to pressing global challenges, transdisciplinary approaches that integrate insights and methods from various fields are increasingly being recognised as key to driving change. This presentation will share insights from an evaluation of a five-year, $18 million sustainability science research program seeking not only to drive innovation but also to empower the next cohort of early career researchers (ECRs) to coordinate action across research, policy and practice to address complex sustainability problems.
Our formative, mixed-methods evaluation highlighted on-the-job learning, face-to-face engagement and networking as pivotal in building sustainability science capacity. We also found targeted recruitment, research team engagement and the provision of support and resources to supervisors as essential, and sometimes overlooked, components. This work contributes to the broader discussion on how evaluation can enhance the development of sustainability science, proposing a framework that emphasises the individual, team and institutional support mechanisms necessary for effective ECR capacity building.
Novelty in our approach lies in the integration of evaluative practices within the capacity-building process, offering a reflective lens on how transdisciplinary endeavours can be optimised to address sustainability challenges. This is particularly relevant for evaluators wanting to build their own skills, or those of others, to engage on complex sustainability issues. The study also underscores the significance of adaptive learning and evaluation in navigating the complexities of sustainability science, inviting a broader conversation on how evaluation can be leveraged to facilitate meaningful contributions to societal and planetary well-being.
Chair
Su-Ann Drew

Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for...
Speakers
Lisa Walker

CSIRO
I am a social scientist with a background in program monitoring, evaluation and sustainable development. I am currently working with CSIRO's Valuing Sustainability Future Science Platform (VS FSP) and manage the Monitoring, Evaluation, Learning and Research project within the VS...
Thursday September 19, 2024 10:30am - 11:00am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

11:00am AEST

Learn, evolve, adapt: Evaluation of climate change and disaster risk reduction programs
Thursday September 19, 2024 11:00am - 11:30am AEST
104
Authors: Justine Smith (Nation Partners)

There is a pressing need to reduce the risks associated with climate change and the disasters that are likely to increase as a result. Along with the need to take action comes the need to show we are making a difference - or, perhaps more importantly, the need to learn and evolve to ensure we are making a difference. However, when operating in an ever-changing, uncertain environment, with layers of complexity and outcomes that may not be realised for some time, or until disaster strikes, evidence of impact is not always easy to collect, nor always a priority.

Drawing on experience developing evaluation frameworks and delivering evaluation projects in the areas of climate change and disaster and emergency management, I will present some of the challenges and opportunities I have observed. In doing so, I propose that there is no 'one way' to do things. Rather, taking the time to understand what we are evaluating and to continually learn, evolve and adjust how we evaluate is key. This includes having clarity on what we really mean when we are talking about reducing risk and increasing resilience. Ideas I will explore include:
  • The concepts of risk reduction and resilience.
  • The difference between evaluation for accountability and for genuine learning and improvement.
  • Balancing an understanding of and progress towards big picture outcomes with project level, time and funding bound outcomes.
  • The challenge and potential benefits of event-based evaluation to learn and improve.

Evaluation has the capacity to contribute positively to action taken to reduce climate change risks and improve our management of disasters and recovery from disasters. As evaluators we too need to be innovative and open-minded in our approaches, to learn from and with those working directly in this space for the benefit of all.
Chair
Su-Ann Drew

Manager, Grosvenor
Su-Ann is a Manager specialising in program evaluation within Grosvenor’s public sector advisory practice. Su-Ann has more than a decade of rich and diverse professional experience, which enables her to offer a unique perspective and critical lens to solving complex problems for...
Speakers
Justine Smith

Principal, Nation Partners
With a background spanning research, government, non-government organisations and consulting, Justine brings technical knowledge and over 10 years of experience to the projects she works on. As a highly experienced program evaluator and strategic thinker, Justine has applied her skills...
Thursday September 19, 2024 11:00am - 11:30am AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Navigating the unfamiliar: Evaluation and sustainable finance
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Authors: Donna Loveridge (Independent Consultant), Ed Hedley (Itad Ltd UK, GB)

The nature and magnitude of global challenges, such as climate change, poverty and inequality, biodiversity loss and food insecurity, mean that $4 trillion is needed annually to achieve the Sustainable Development Goals by 2030. Government and philanthropic funding is not enough; additional tools include business and sustainable finance. Evaluators may relate to many of the objectives that business and sustainable finance seek to contribute to, but discomfort can arise in the mixing of profit, financial returns, impact and purpose.

Sustainable finance, impact investing, and business for good are growing globally and provide opportunities and challenges for evaluators, evaluation practice and the profession.
This session explores this new landscape and examines:
  • What makes us uncomfortable about dual objectives of purpose and profit, notions of finance and public good, and unfamiliar stakeholders and languages, and what evaluators can do in response.
  • The opportunities for evaluators to contribute to solving interesting and complex problems with current tools and skills, and where the space is for developing evaluation theory and practice.
  • How evaluation practice and evaluators' competencies might expand and deepen so they do not get left behind in these new fields, sustaining evaluation's relevance to addressing complex challenges.

The session draws on experience in Australia and internationally to share some practical navigation maps, tools and tips to help evaluators traverse issues of values and value, working with investors and businesses, and identify opportunities to add value.
Chair
Thursday September 19, 2024 12:00pm - 12:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Journey Mapping: Visualising Competing Needs within Evaluations
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104
Authors: Jolenna Deo (Allen and Clarke Consulting)

Journey mapping acts as a GPS for grasping audience or consumer experience in evaluating policies or programs, highlighting twists, hidden gems, and pitfalls. It can be a useful tool to help evaluators capture disparities and competing needs among intended demographics. This session will discuss the journey mapping method, drawing from an evaluation of a Community Capacity Building Program which used journey mapping to illustrate key consumer personas. It will explore the integration of multiple data sources to provide a comprehensive understanding of complex disparities and the cultural and historical contexts in which these arise.
Chair
Claire Grealy

Director, Rooftop Social
So looking forward to AES 2024! We are Silver Sponsors this year, which means we're keeping your devices charged up through the conference, and you'll find us next to the charging stations. I welcome any and all conversation about evaluation, strategy and design, research, facilitation...
Speakers
Jolenna Deo

Consultant, Allen and Clarke Consulting
Jolénna is a consultant at Allen + Clarke Consulting. She is a proud Mirriam Mer, Pasifika woman with a background in Development Studies, Pacific Studies and social policy, combining her interests in Indigenous methodologies and social justice. She is experienced in community and...
Thursday September 19, 2024 3:30pm - 4:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia
 
Friday, September 20
 

10:30am AEST

Involving children and young people in evaluations: Equity through active participation
Friday September 20, 2024 10:30am - 11:00am AEST
Authors: Sharon Marra-Brown (ARTD Consultants), Moya Johansson (ARTD Consultants, AU)

Think it's important to enable children and young people to have a voice in evaluations, but find it challenging? This paper presents tried and tested strategies for ensuring ethical engagement with children and young people and encouraging meaningful participation.

Involving children and young people in evaluation is critical to ensure that we arrive at evaluations that accurately reflect their experiences and capture the outcomes they consider most important. Children and young people have the right to have a say about their experiences, and evaluations that avoid their involvement risk perpetuating ongoing inequities.

However, involving children and young people in evaluations can prompt ethical concerns in relation to their comprehension of research, capacity to provide consent, potential coercion by parents, and the potential conflicting values and interests between parents and children. Depending on the subject, it can also create concerns about safety and readiness.

Based on our experience successfully achieving ethics approval for multiple evaluations of services for children and young people across Australia, which include interviews with children and young people who have accessed these services, we will talk through considerations for ensuring the voice of children and young people in evaluation while safeguarding them from unnecessary risks.

We will then take you through how we've overcome challenges engaging children and young people in evaluations with youth-centred innovative solutions, including carefully considering the language we use and how we reach out. We will demonstrate the developmental benefits of meaningful participation of children and young people once ethical considerations have been carefully considered and navigated.

Finally, we will take you through our tips for ensuring meaningful and safe engagement with children and young people. We will point you in the direction of Guidelines and practice guides for involving young people in research and evaluation in a safe and meaningful way.

The presenters are evaluators with extensive experience in designing, delivering and reporting on evaluations that include data collection with children and young people. This includes recently achieving ethics approval and commencing interviews with children as young as seven who are accessing a suicide aftercare service.

While much attention is devoted to ensuring safe and inclusive data collection with various demographics, specific considerations for engaging children and young people remain relatively uncommon. Recognising the unique needs of this population, coupled with the understandably cautious stance of ethics committees, underscores the necessity for a thoughtful and deliberate approach to evaluations involving children and young people.

Given the additional complexities and ethical considerations involved, the default tendency can be to exclude children and young people from evaluation processes. However, it is important that children and young people are able to have a say in the programs, policies and services that they use. Participation in evaluations can be a positive experience, if risks are managed and the process is designed to be empowering.

This session will provide valuable insights, actionable strategies, and an opportunity for participants to reflect on their own practices, fostering a culture of inclusivity and responsiveness in evaluation.
Chair

Laura Bird

MERL Associate, Paul Ramsay Foundation
Speakers
Sharon Marra-Brown

Director, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence. Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade-plus in the public service or how I weave...
Mitchell Rice-Brading

ARTD Consultants
I started with ARTD in early 2022 after completing my Bachelor of Psychological Science (Honours) in 2021. This, in combination with experience as a Psychology research assistant, helped me develop strong research skills, namely the ability to synthesise and critically evaluate qualitative...
Friday September 20, 2024 10:30am - 11:00am AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Embracing the L in "MEL": A Journey Towards Participatory Evaluation in Government Programs
Friday September 20, 2024 12:00pm - 12:30pm AEST
103
Authors: Milena Gongora (Great Barrier Reef Foundation)

Best practice in evaluation encompasses a crucial step of learning, yet it often receives inadequate emphasis, particularly within government-funded initiatives. Our paper documents the journey of transforming a top-down, prescriptive evaluation process within a government-funded program into an inclusive, consultative approach aligned with Monitoring, Evaluation, and Learning (MEL) principles.

Funded by the Australian Government and managed by the Great Barrier Reef Foundation, the Reef Trust Partnership (RTP) launched in 2018, aiming to enhance the resilience of the Great Barrier Reef. Within it, a $200 million portfolio aims to improve water quality by working with the agricultural industry. A framework for impact evaluation was developed in its early days. Whilst appropriate, it was top-down in nature due to the need to comply with broader government requirements.

Four years into implementation, the Foundation was ready to synthesise, interpret and report on the program's impact. The Foundation could have simply reported "up" to government. However, we acknowledged that in doing so, we risked missing critical context, simplifying findings, misinterpreting information and presenting yet another tokenistic, meaningless report.

Interested in doing things better, we instead circled back with our stakeholders in a participatory reflection process. Through a series of carefully planned workshops, we invited on-ground program practitioners to ground-truth our findings, share contextual nuances, and collectively strategise for future improvements.

Despite initial reservations, participants embraced the opportunity, fostering an atmosphere of open dialogue and knowledge exchange. This reflective process not only enriched our understanding of program impact but also enhanced collaboration, strengthening overall program outcomes.

Our experience highlights the importance of transcending tokenistic evaluation practices, particularly in environments where top-down directives prevail. Participatory approaches can be implemented at any scale, contributing to a culture of continuous improvement and strategic learning, ultimately enhancing the impact and relevance of evaluation efforts.

Chair

Kate O'Malley

Consultant
I provide targeted policy, advocacy and evaluation support on refugee and migration matters drawing on a lengthy career in the United Nations and the Australian Public Service and post-graduate studies in evaluation and diplomatic practice.
Speakers
Milena Gongora

Associate Director - Water Quality, Great Barrier Reef Foundation
Milena’s area of interest is nature conservation. With over 14 years of experience, her work ranges from managing the Mekong River to enhancing the resilience of the Great Barrier Reef. Over most of this time, her roles have involved evaluating the success of conservation initiatives...
Friday September 20, 2024 12:00pm - 12:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

If the destination is improvement, recommendations are the signpost
Friday September 20, 2024 12:00pm - 12:30pm AEST
106
Authors: Laura Baker (ACIL Allen), Larissa Brisbane (Department of Climate Change, Energy, the Environment and Water NSW, AU)

Recommendations are the sharp end of evaluation, connecting evidence and insights to the improvement we aim to achieve. Many evaluation theories focus on framing and conducting evaluations, rather than developing recommendations or the associated organisational change required to complete the journey.

Recommendations point the way beyond an evaluation report, as the journey doesn't end when the report is produced. This presentation tells the story of recommendation wayfinding. We will share an evaluation practitioner and a commissioner's journey on navigating the challenge of developing actionable recommendations to promote impact beyond program close and into future decisions.

Evaluators need ways to integrate diverse evidence sources and generate actionable insights. The consultant will share perspectives on where these insights and the associated recommendations "come from": how different data come together to inform insights, the process for developing recommendations (balancing independence and engagement from commissioners), and how to design recommendations for the program and beyond.

Commissioners need recommendations that make sense in their context. The commissioners will share considerations in what makes a recommendation useful, and how we use this evaluation journey to leverage learning, skill building, and improvement opportunities. They will also discuss the evaluation audience and how ambitious you can get with recommendations.

This work over a number of years has helped build the evaluation knowledge base within our organisations. We will close with our recommendations to you - with the top ideas that we plan to take with us on our next evaluation journey.
Chair

Rachel George

Director, Research, Monitoring and Evaluation Practice, Tetra Tech International Development
Speakers

Larissa Brisbane

Team Leader, Strategic Evaluation, Dept of Climate Change, Energy, the Environment and Water NSW
It was a short step from studying environmental science, and working on cross-disciplinary problem-solving, to evaluation, where I still ask 'why' and 'how do you know that'. I love hearing stories of what you've done and learned, especially in energy, climate change, environment and...

Laura Baker

Principal, ACIL Allen
Friday September 20, 2024 12:00pm - 12:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Squaring up with rubrics
Friday September 20, 2024 1:30pm - 2:30pm AEST
103
Authors: Josh Duyker (Centre for Evaluation and Research Evidence, Victorian Department of Health)

Much like Felix the Cat, evaluators have a bag of tricks to get us out of sticky situations. But when you are staring face to face with a complex evaluand, juggling tricky stakeholders whilst sitting on a mountain of data, it's not always clear which 'trick' you need! One twisted potential solution is the colourful, yet humble rubric. In this reflective practice ignite presentation, I will guide you through our journey of using rubrics as a tool to wayfind through an evaluation, and our key takeaways on how rubrics can support evaluators to make comprehensive and balanced evaluative judgements.
Chair

Carina Calzoni

Managing Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and...
Speakers

Josh Duyker

Evaluation and Research Officer, Centre for Evaluation and Research Evidence
I am an emerging evaluator, currently working at the Centre for Evaluation and Research Evidence in the Victorian Department of Health. I've completed a Master of Public Health and am embarking on a Master of Evaluation. Through roles in the not-for-profit sector and my studies...
Friday September 20, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

2:00pm AEST

Introducing a trauma informed AI assessment tool for evaluators of AI (artificial intelligence) assisted programs and services
Friday September 20, 2024 2:00pm - 2:30pm AEST
104
Authors: Lyndal Sleep (Central Queensland University)

AI and other forms of digital technology are increasingly being used in program and service delivery. They promise increased efficiency, accuracy and objectivity; however, these technologies can also cause significant harm and trauma, as seen in Robodebt. It is vital for evaluators to be aware of key questions to ask, to prevent AI having unintended impacts on program processes, outputs and outcomes, or causing harm to service users.

Objective
This session aims to support evaluation of AI assisted programs and services by introducing evaluators to a new and innovative trauma informed AI assessment tool.

Core argument
(1) AI is increasingly being used in programs and services, and understanding the resulting risks is essential for evaluators to assess whether services are meeting intended outcomes.
(2) Many evaluators are unaware of what types of risks to look for when assessing AI assisted services, or what questions to ask - especially when conducting trauma informed evaluations.
(3) A practical trauma informed AI assessment tool has been developed by researchers from [Universities omitted], with funding from [omitted], to address this need, and will be briefly introduced.

A short paper session will:
(1) Highlight the problem that AI is increasingly being used to assist program and service delivery, but many evaluators are unaware of the main risks to consider when evaluating these services.
(2) Suggest the solution of a practical tool which considers these risks, with technological knowledge and within a trauma informed framework, that can be employed by evaluators.
(3) Introduce a trauma informed AI assessment tool, the method used to develop it, as well as its intended practical use by evaluators (both internal and external to organisations).

There will be 10 minutes for questions and discussion at the end of the presentation.

Chair

Kira Duggan

Research Director, Systems and Services, Australian Institute of Family Studies
I am a social policy evaluation specialist and have worked with a broad range of government agencies and community service agencies across Australia and internationally. My experience is in advising on program evaluation and design; evidence-based policy and strategy development...
Speakers
Friday September 20, 2024 2:00pm - 2:30pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia