Conference hashtag #aes24MEL
Foundational & Intermediate
Wednesday, September 18
 

11:00am AEST

The psychology of evaluation capacity building: Finding the way with the rider, elephant and the pathway
Wednesday September 18, 2024 11:00am - 12:00pm AEST
106
Authors: Samantha Abbato (Visual Insights People)

Evaluation capacity building is increasingly becoming a core part of evaluation practice and a critical part of incorporating evaluation into the everyday activity of organisations (Preskill & Boyle, 2008; White, Percy & Small, 2018). Reaching the point where evaluation becomes the way of doing business requires a change in knowledge, skills, and attitudes.

Changes need to happen at the level of individuals, teams, organisations, and partnerships. This journey requires supporting and managing change to systematic enquiry processes as much as it requires evaluation expertise. In this skill-building session, we introduce Jonathan Haidt's 'rider, elephant and pathway' metaphor as a framework to support change and strengthen evaluation capacity (Haidt, 2018).

Haidt's metaphor for change includes the rider (our rational thinking side) atop an elephant (our emotional side). Behaviour change for individuals and collectives requires steps that (1) support the rider, such as giving clear directions, (2) motivate the elephant by tapping into emotions, and (3) shape a pathway to change, including clearing obstacles. In this interactive session, the facilitator will provide case studies applying Haidt's metaphor, spanning two decades. Through these examples, the power of this framework to support evaluation capacity building is demonstrated. Examples include using Haidt's framework for:
1. Building a Monitoring, Evaluation and Learning (MEL) system with a medium-sized community organisation;
2. Increasing the maturity of MEL in an existing large organisation; and
3. Increasing the impact of evaluation partnerships.

The active skill-building component incorporates:
  • Cartoon elephant, rider and pathway flashcards;
  • A 'snakes and ladders' style game; and
  • Evaluation-specific examples.

The combination of examples and activities is designed to support participant learning. The session will encourage discussion of barriers, enablers and actions to build evaluation capacity relevant to different situations and contexts.

Learning objectives include:
  • Knowledge of a sound and memorable psychological framework for supporting evaluation capacity building;
  • Ability to apply Haidt's metaphor.
Chair
Anthea Rutter

Research Fellow, Centre for Program Evaluation, The University of Melbourne
Anthea Rutter is a Senior Research Fellow in the Assessment and Evaluation Research Centre (formerly the Centre for Program Evaluation) at The University of Melbourne. She has extensive experience working with a wide range of community, state and national organisations. She is particularly...
Speakers
Samantha Abbato

Director, Visual Insights People
My twenty-plus years of evaluation experience are built on academic training in qualitative and quantitative disciplines, including mathematics, health science, epidemiology, biostatistics, and medical anthropology. I am passionate about effective communication and evaluation capacity-building...
Wednesday September 18, 2024 11:00am - 12:00pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Culturally Responsive Initiatives: Introducing the First Nations Investment Framework
Wednesday September 18, 2024 11:30am - 12:00pm AEST
104
Authors: Eugenia Marembo

Representatives of First Nations communities have been advocating for changes in the way initiatives are planned, prioritised, and assessed. This includes greater visibility on where funding is going, more partnerships on designing initiatives and more evaluation on the outcomes being achieved, to inform government decision making.

This paper presents key insights on what constitutes good practice when designing and appraising initiatives that affect First Nations people and communities. The National Agreement on Closing the Gap is built around four new Priority Reforms that will change the way governments work with Aboriginal and Torres Strait Islander people and communities. Priority Reform Three is about transforming government institutions and organisations. As part of this Priority Reform, parties commit to systemic and structural transformation of mainstream government organisations to improve accountability, and to respond to the needs of First Nations people.

The findings presented in this paper draw on insights from consultations with various First Nations community representatives and government stakeholders in New South Wales, and the subsequent process of developing a government department's First Nations investment framework, which seeks to strengthen the evidence on what works to improve outcomes for First Nations people. Additionally, the framework aims to improve practice across government processes and better inform how initiatives are designed, prioritised and funded.
Chair
Alice Muller

Senior Monitoring & Evaluation Advisor: FMNR Scale Up, World Vision Australia
An environmental scientist, working in international development, interested in evaluation and learning about all things community, trees, ecosystem restoration, climate action, scaling and systems transformation. I also really like coffee and chatting about gardening, travel and...
Speakers
Steven Legg

Associate Director, NSW Treasury
Eugenia Marembo

Senior Analyst, First Nations Economic Wellbeing, NSW Treasury
Wednesday September 18, 2024 11:30am - 12:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Enhancing evaluation value for small community organisations: A case example
Wednesday September 18, 2024 1:30pm - 2:00pm AEST
104
Authors: Stephanie Button (Assessment and Evaluation Research Centre, University of Melbourne), Allison Clarke (Assessment and Evaluation Research Centre, University of Melbourne, AU), Carolyn McSporran (Blue Light Victoria, AU)

This presentation aims to provide a case example of how two small-scale, standard process/outcomes evaluations for a low-budget community organisation increased value for the organisation by identifying and seizing opportunities for evaluation capacity building. Formal evaluations represent a significant financial commitment for low-budget community organisations. By maximising the value provided by such evaluations, evaluators can contribute more to these organisations' mission and ultimately to social betterment.

There are numerous evaluation capacity building models and frameworks, many of which are quite complex (for example, Volkov & King, 2007; Preskill & Boyle, 2008). Many emphasise planning, documentation, and other resource-intensive components as part of any evaluation capacity building effort. This session provides a case example of intentional but light-touch and opportunistic evaluation capacity building. Through such an approach, evaluators may need to undertake only minimal additional activities to provide extra value to an organisation. Reflection-in-action during the evaluation process is as important as the final reporting (Schwandt & Gates, 2021). The session emphasises, though, that a critical enabler is the organisation's leadership and culture, and its willingness to seize the opportunity offered by a formal evaluation. The session is co-presented by two members of the evaluation team and the Head of Strategy, Insights, and Impact of the client organisation.
Chair
Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
Speakers
Allison Clarke

Evaluator
Allison is passionate about using monitoring and evaluation for organisational learning. She has over 20 years' experience in the private and not-for-profit sectors in industrial research, probate research, and program development. She completed her Master of Evaluation at the Centre...
Stephanie Button

Research Associate & Evaluator, Assessment & Evaluation Research Centre
Stephanie has worked as a policy manager, analyst, strategist, researcher, and evaluator across the social policy spectrum in the public and non-profit sector for over 12 years. She is passionate about evidence-based policy, pragmatic evaluation, and combining rigour with equitable...
Carolyn McSporran

Head of Strategy, Insights and Impact, Blue Light Victoria
Passionate about social inclusion, Carolyn's work has spanned diverse portfolios across the justice and social services sectors. With a fervent belief in the power of preventative and early intervention strategies, she is committed to unlocking the full potential of individuals and...
Wednesday September 18, 2024 1:30pm - 2:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Revitalising Survey Engagement: Strategies to Tackle Low Response Rates
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Kizzy Gandy

Surveys are an excellent data collection tool when they reach their target response rate, but low response rates hinder the generalisability and reliability of the findings.

This Ignite presentation will discuss techniques Verian evaluators have applied to increase survey response rates while also assessing the efficacy and efficiency of these techniques. We will also explore other evidence-based strategies for boosting response rates and the value of drawing on other data sources if your response rates are still low.
Speakers
Hannah Nguyen

Analyst, Verian Group
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Sign here: Supporting Deaf participation in evaluation
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103
Authors: Susie Fletcher (Australian Healthcare Associates)

Auslan is a visual, signed language that was developed by and for the Australian Deaf community. People who use Auslan as their primary or preferred language are not necessarily fluent in English. Our team was engaged to review access to interpreter services for Auslan users, a population group that is often underrepresented in evaluation. In this presentation we will highlight some of the issues evaluators need to consider when working with this marginalised community, and share practical skills and techniques for making their evaluations more accessible.
Speakers
Susie Fletcher

Senior consultant, Australian Healthcare Associates
Dr Susie Fletcher is an experienced health services researcher with over 50 peer reviewed journal articles and 3 book chapters in mental health and primary care. She is passionate about improving health outcomes through integrating services across sectors; her recent work has included...
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Cultivating Equity: A Roadmap for New and Student Evaluators' Journeys
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
Authors: Ayesha Boyce (Arizona State University), Aileen Reid (UNC Greensboro, US)

Evaluation can be positioned as a social, cultural, and political force to address issues of inequity. We co-direct a 'lab' that provides new evaluators with hands-on applied research and evaluation experience to support their professional development. We are proud of our social justice commitments, and they show up in all aspects of our work. We believe the next generation of evaluators must be trained and mentored in high-quality technical, strengths-based, interpersonal, contextual, social justice-oriented, and values-engaged evaluation. We have found that novice evaluators are able to engage culturally responsive approaches to evaluation at the conceptual level, but have difficulty translating theoretical constructs into practice. This paper presentation builds upon our experiences and previous work of introducing a framework for teaching culturally responsive approaches to evaluation (Boyce & Chouinard, 2017) and a non-course-based, real-world-focused, adaptable training model (Reid, Boyce, et al., 2023). We will discuss how we have taught new evaluators three formal and informal methodologies that have helped them align their values with praxis. Drawing from our work across multiple United States National Science Foundation-funded projects, we will overview how the incorporation of photovoice methodology, just-in-time feedback, and reflective practice has supported our commitments to meaningfully and respectfully attend to issues of culture, race, diversity, power, inclusion, and equity in evaluation. We will also discuss our thoughts on the implications of globalization, Artificial Intelligence, and shifting politics for evaluation capacity building and the training of new evaluators.

Chair
Nick Field

Director (Public Sector), Urbis
Nick has twenty years of public sector consulting experience, backed more recently by six years as a Chief Operating Officer in the Victorian Public Sector. A specialist generalist in a broad range of professional advisory services, Nick has expertise in the implementation of state-wide...
Speakers
Ayesha Boyce

Associate Professor, Arizona State University
Ayesha Boyce is an associate professor in the Division of Educational Leadership and Innovation at Arizona State University. Her research career began with earning a B.S. in psychology from Arizona State University, an M.A. in research psychology from California State University...
Aileen M. Reid

Assistant Professor, UNC Greensboro
Dr. Aileen Reid is an Assistant Professor of Educational Research Methodology in the Information, Library and Research Sciences department and a Senior Fellow in the Office of Assessment, Evaluation, and Research Services (OAERS) at UNC Greensboro. Dr. Reid has expertise in culturally...
Wednesday September 18, 2024 1:30pm - 2:30pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

Link-Up Services and wayfinding: Co-creating and navigating a culturally safe national monitoring and evaluation strategy
Wednesday September 18, 2024 1:30pm - 3:00pm AEST
Authors: Kathleen Stacey (beyond...Kathleen Stacey & Associates Pty Ltd), Cheryl Augustsson (Yorgum Healing Services), Raelene Rosas (NT Stolen Generation Aboriginal Corporation), Pat Thompson (Link-Up (Qld) Aboriginal Corporation), Jamie Sampson (Link-Up (NSW) Aboriginal Corporation)

Link-Up Services support Aboriginal and/or Torres Strait Islander people who were forcibly removed, fostered or adopted from their families as children, and their descendants who live with the ongoing impact of forcible removal policies, to reconnect with family, community, culture and Country. Wayfinding is at the core of our work: navigating unfamiliar territory with clients towards a hoped-for destination of a greater sense of 'home', wherever this is possible, in a culturally safe, appropriate and trauma-informed manner.

In 2019, the National Indigenous Australians Agency funded the development of a national Link-Up monitoring and evaluation strategy with the eight Link-Up Services operating across six jurisdictions. Each Link-Up Service is either a stand-alone Aboriginal community controlled organisation or based in an Aboriginal community controlled organisation.

This interactive workshop invites participants into our collective experiences of co-creating and implementing the M&E Strategy on a national basis, presented from the voices and position of Link-Up Services. We believe our experiences and learnings will be instructive for monitoring and evaluation activity with other Aboriginal and Torres Strait Islander organisations and programs.

Travel with us in reflecting on our monitoring and evaluation wayfinding journey over three phases of work. Pause with us at key points throughout the session to exercise your critical self-reflection and analysis skills, share your ideas and learn what has worked well or presented challenges for us and why in creating, navigating and implementing a culturally safe monitoring and evaluation strategy in a complex and demanding service context.
Speakers
Kathleen Stacey

Managing Director, beyond…(Kathleen Stacey & Associates)
Kathleen Stacey is the Managing Director and Principal Consultant at beyond... She spent her formative working years within the public sector and academia, before establishing and expanding beyond... into its current form. The company conducts consultancy, evaluation, research and...
Raelene Rosas

Interim CEO, Northern Territory Stolen Generations Corporation
Patricia Thompson AM

CEO, Link-Up Queensland
CEO of Link-Up (Qld), an organisation that celebrates 40 years of supporting Stolen Generations this year. Has a wealth of management experience across all levels of government and importantly at a community level. Has represented Aboriginal & Torres Strait Islander people at a...
Wednesday September 18, 2024 1:30pm - 3:00pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Navigating a path to system impact: designing a strategic impact evaluation of education programs
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
104
Authors: Amanda Reeves (Victorian Department of Education), Rhiannon Birch (Victorian Department of Education, AU), Eunice Sotelo (Victorian Department of Education, AU)

To provide insight to complex policy problems, evaluations need to adopt a systems perspective and examine the structures, relationships and contexts that influence program outcomes.

This paper outlines the design of a 4-year strategic evaluation that seeks to understand how a portfolio of over 25 education programs is interacting and collectively contributing to system-level outcomes. In this context, policy makers require evaluation to look beyond the boundaries of individual programs and assess the holistic impact of this investment, to inform where and how resources can be directed to maximise system outcomes.

The strategic evaluation presented is theory-based and multi-layered, using logic modelling to identify outcomes at the program, cluster and system level and to draw linkages that develop a causal pathway to impact. The strategic evaluation and the evaluations of individual education programs are being designed together to build in common measures, enabling meta-analysis and synthesis of evidence to assess system-level outcomes. The design process has been broad and encompassing, considering a diverse range of methods to understand impact, including quantitative scenario modelling and value-for-money analysis.

The authors will describe how the strategic evaluation has been designed to respond to system complexity and add value. The evaluation adopts an approach that is:
• interdisciplinary, drawing on a range of theory and methods to examine underlying drivers, system structures, contextual factors and program impacts
• collaborative, using expertise of both internal and external evaluators, to design evaluations that are aligned and can tell a story of impact at the system-level
• exploratory, embracing a learning mindset to test and adapt evaluation activities over time.

This paper will be valuable for anyone who is interested in approaches to evaluating the relative and collective contribution of multiple programs and detecting their effects at the system level to inform strategic decision-making.
Chair
Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
Speakers
Amanda Reeves

Principal Evaluation Officer, Victorian Department of Education
Amanda is an evaluation specialist with over 12 years' experience leading evaluation projects in government, not-for-profit organisations and as a private consultant. She has worked across a range of issues and sectors including education, youth mental health, industry policy and...
Eunice Sotelo

Senior Evaluation & Research Officer, Department of Education (Victoria)
I'm here for evaluation but passionate about so many other things - education (as a former classroom teacher); language, neuroscience and early years development (recently became a mom so my theory reading at the moment is on these topics); outdoors and travel. Workwise, I'm wrangling...
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
104 113 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

When speed is of the essence: How to make sure the rubber hits the road
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
103
Authors: Kristy Hornby (Grosvenor)

There is a lot of interest in rapid M&E planning and rapid evaluations at present, born partly of the COVID-19 policy context and partly of shrinking appetites for the time and money spent on evaluations. It is unlikely this trend will reverse in the short term, so how do we acquit our responsibilities as evaluators, ethically and appropriately, in a rapid context? This session sets out a step-by-step approach to conducting a rapid evaluation, inviting attendees to follow along with their own program in mind so that they come away with a pathway for conducting their own rapid evaluation. The session uses a fictional case study to move the rapid evaluation approach forward, describing throughout how you can use literature reviews, qualitative and quantitative data collection and analysis techniques, and report-writing approaches innovatively to save time without compromising rigour.

We contend it is possible to do a rapid evaluation ethically and appropriately, but the backbone of doing so is good planning and execution. This session shares practical tips and approaches for doing so through each key phase of an evaluation, so attendees are well-equipped for their next rapid evaluation.

To consolidate the learning, attendees will be provided a framework to come away from the session with a high level plan of how to conduct their own rapid evaluation, to increase their chance of success.

Speakers
Kristy Hornby

Associate Director, Victorian Evaluation Lead, Grosvenor
Kristy has over ten years of evaluation experience, with expertise spanning the Victorian state government, federal government, local government and not-for-profit sectors. She has particular expertise in social services, employment, primary health, agriculture and environment and...
Wednesday September 18, 2024 2:30pm - 3:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Wayfinding for integrating social justice and culturally responsive and equitable evaluation practices in meta-evaluation: Learning from the UN evaluation quality assessments.
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
Authors: Sandra Ayoo (University of North Carolina Greensboro)

With improved interventions for societal and planetary wellbeing as the desired destination of evaluation, it is imperative that evaluators reflect on the meaning of quality and on methods to assess whether evaluation is achieving it. Meta-evaluation, a term coined by Michael Scriven in 1969, evaluates evaluations and aids in understanding how evaluations contribute to addressing structural and systemic problems in interventions and evaluation practice. Meta-evaluation has evolved over the past five decades and is included in the program standards of major professional associations. While the field of evaluation is confronted with major concerns regarding the centrality of social justice, there are currently no one-size-fits-all guidelines for meta-evaluation or for addressing social justice in evaluations. To address this, we reviewed meta-evaluation literature, mapped the American Evaluation Association's foundational documents against the United Nations Evaluation Group's Norms and Standards to explore their intersectionality on social justice, and analyzed 62 United Nations Population Fund evaluation reports alongside their management responses. The study findings indicated that meta-evaluation is contingent on context rather than established standards. Thus, it is crucial for evaluators to intentionally prioritize social justice in evaluation design and implementation and to select quality assurance tools that match the evaluation context and professional association guidelines to improve the quality of the intervention. I will share key characteristics of the United Nations Evaluation Group's Norms and Standards on social justice to stimulate discussions on evaluators' efforts to address systemic issues.
Collectively, participants will benefit from discussing and reflecting on their own practice by responding to questions such as: (a) what are examples of their work in collaborative and systems-informed ways to intentionally include social justice in their evaluations, and (b) what should the field of evaluation do to ensure that evaluations add value for people and planet?
Chair
Carlos Rodriguez

Senior Manager Strategy & Evaluation, Department of Energy, Environment and Climate Action
Speakers
Sandra Ayoo

Assistant Professor, University of North Carolina Greensboro
Dr. Ayoo is an Assistant Professor of Educational Research Methodology in the Department of Information, Library, and Research Science at the School of Education, University of North Carolina Greensboro. She teaches graduate courses in program evaluation and research methodology...
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
101-102 105 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Developing a Tool for Measuring Evaluation Maturity at a Federal Agency
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
105
Authors: Eleanor Kerdo (Attorney-General's Department), Claudia Oke (Attorney-General's Department, AU), Michael Amon (Attorney-General's Department, AU), Anthony Alindogan (Attorney-General's Department, AU)

To embed a culture of evaluation across the Australian Public Service (Commonwealth of Australia, 2021), we must first have an accurate understanding of the current state of evaluation capability and priorities across Commonwealth agencies. This paper shares tools for building an effective measurement framework for evaluation culture, and discusses how to use these for evaluation capability uplift.
We explore quantitative and qualitative methods to gather and analyse data to measure an organisation's readiness to change its culture towards evaluation. This includes assessing staff attitudes towards evaluation, the level of opportunity for staff to conduct and use evaluation, and confidence in their knowledge of evaluation.
We discuss the development of a staff evaluation culture survey based on Preskill & Boyle's ROLE, and how behavioural insight tools can be utilised to boost engagement. The paper discusses the utility of holding focus groups with senior leaders to understand authorising environments for evaluation and key leverage points. Also discussed are challenges and innovative solutions that were encountered throughout the assessment process.
This paper will be valuable for those who work in, or with, any government agency with an interest in evaluation capacity building and driving an evaluation culture within organisations. The paper explains each stage of measurement design, data analysis and results, and discusses opportunities for action.
1. Preskill, H., & Boyle, S. (2008). A Multidisciplinary Model of Evaluation Capacity Building. American Journal of Evaluation, 29(4), 443-459. https://journals.sagepub.com/doi/10.1177/1098214008324182

2. Michie, S., Atkins, L., & West, R. (2014). The Behaviour Change Wheel: A Guide to Designing Interventions. London: Silverback Publishing. www.behaviourchangewheel.com

3. Lahey, R. (2009). A Framework for Developing an Effective Monitoring and Evaluation System in the Public Sector: Key Considerations from International Experience.
Chair
Marwan El Hassan

Director, Future Drought Fund Program Evaluation and Support, Department of Agriculture, Fisheries and Forestry
I am the director of the Program Evaluation and Support team at the Future Drought Fund (FDF). My team is responsible for supporting the FDF's program areas in their monitoring, evaluation and learning work, and for ensuring alignment of our MEL work with other areas around the department...
Speakers
Michael Amon

Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD...
Anthony Alindogan

Evaluation Lead, Attorney-General's Department
Anthony is an experienced evaluator with a particular interest in outcomes measurement and value-for-money. He completed his Master of Evaluation degree from the University of Melbourne. Anthony is an enthusiastic writer and has publications in various journals including the Evaluation...
Claudia Oke

Project Officer / Data Analyst, Australian Public Service Commission
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Uncovering Hidden STEM Footprints: Leveraging Output Data from Questacon’s Outreach Programs
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
104
Authors: Jake Clark (Questacon - The National Science and Technology Centre), Jenny Booth (Questacon - The National Science and Technology Centre, AU), Sharon Smith (Questacon - The National Science and Technology Centre, AU), Nick Phillis (Questacon - The National Science and Technology Centre, AU)

Join our Monitoring and Evaluation team on an exciting journey as we delve into the untapped potential of output data in evaluation and how to reach beyond the 'low-hanging fruit'.

Notwithstanding the importance of evaluating outcomes to measure program success, monitoring the implementation and reach of initiatives is fundamental to good program management and evaluation. Output data on activity reach, target groups and participants often hold hidden gems of potential that are frequently overlooked. In this presentation we shine a spotlight on their significance and offer actionable tips to elevate monitoring data.

Our objective is to make this exploration enjoyable and enlightening, especially for foundational to intermediate level evaluators. We offer practical and universally applicable strategies for making the most of output data to enhance program insights.

KEY MESSAGES

Using existing tools and tapping into open-source data sets, you can create powerful visualisations and draw deeper inferences about your program's reach and participants.

I. Understanding equity and inclusion
• A better understanding of who is and isn't involved in your initiative.
• Looking for patterns using socio-demographic variables.
• Benchmarking your initiative against relevant population data.

II. Connecting outputs to outcomes
• Analysing participant characteristics and program journey to illuminate differences in outcomes.
• Uncovering program and policy questions that need further exploration.

Design of the Session: Drawing tangible examples from the education and informal learning STEM sector, we bridge the gap between theory and practice. Real-world strategies are shared to encourage active participation along with useful resource links.
Speakers

Jake Clark

Senior Monitoring and Evaluation Officer, Questacon - National Science and Technology Centre
What value does STEM outreach bring to an individual? How does it change someone's attitude/behaviour/disposition around STEM? And how do you quantify such probing questions? These are the types of queries I'm answering in my Senior Evaluation Officer role at Australia's National...
Wednesday September 18, 2024 4:30pm - 5:00pm AEST
104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

4:30pm AEST

Evaluation Lab: Using design to solve evaluation challenges
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
Authors: Matt Healey (First Person Consulting)

The Design and Evaluation Special Interest Group (DESIG) was established in 2017. Its primary aim has been to explore the intersection of evaluation and design, an aim that has been interpreted in different ways over time. In 2024, the DESIG identified an opportunity to take the SIG model in a slightly different direction, launching the Evaluation Lab: an initiative aimed at turning talk into action by taking evaluators through a design process to address evaluation challenges.
Drawing inspiration from the concept of 'living labs,' which serve as real-world testing grounds, the Evaluation Lab created a space where evaluation professionals could come together. Employing a design-thinking process, the Lab guided participants through a structured expedition of defining, ideating, and prototyping solutions to tackle nominated challenges. Participants also learned pitch skills to communicate their solutions.
This Big Room Session provides an opportunity for the DESIG to outline the Evaluation Lab model, capped off with participants presenting their solutions through rapid-fire pitches, either live or pre-recorded, akin to explorers sharing tales of new lands discovered. The session's innovative twist lies in the audience's dual role as viewers and judges. The audience will vote on their favourite solution and be involved in crowning the first AES Evaluation Lab winner.
By blending lecture-style content with dynamic team presentations and active audience engagement, the Big Room Session not only highlights the critical role of design in navigating evaluation challenges but also demonstrates the practical application of these methodologies in charting a course through real-world problems.

Chair

Duncan Rintoul

Managing Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, run a rad consulting firm that specialises in evaluation, lifelong learner. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health...
Speakers

Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have: co-founded and grown an organisation (First Person Consulting) to a team of 16 people working...

Shani Rajendra

Principal Consultant & Head of Business Group (Social Impact), Clear Horizon
Shani is a Principal Consultant in Clear Horizon’s Social Impact team. Shani has extensive experience in community-led initiatives, organisational strategy, and social enterprise. She specialises in incorporating design thinking into evaluative practice. Having completed a Master...
Wednesday September 18, 2024 4:30pm - 5:30pm AEST
Plenary 1, 114 Convention Centre Pl, South Wharf VIC 3006, Australia

5:00pm AEST

Trigger warnings - do they just trigger people more?
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
104
Authors: Kizzy Gandy (Verian (formerly Kantar Public))

As evaluators, one of our key ethical responsibilities is to avoid causing psychological harm or distress through our methods. We often start workshops or interviews with a warning that the topic may be upsetting, and provide participants with contact information for mental health services, on the assumption that this is the most ethical practice.

Trigger warnings are used with good intentions and are often recommended in evaluation ethics guidelines. However, what do we know about their impact? Is there a risk they actually trigger people more?

This talk examines the evidence on whether trigger warnings are an effective strategy for reducing the risk of trauma and re-traumatisation when discussing topics such as sexual assault, mental health, violence, drug use, and other sensitive issues. It also touches on new evidence from neuroscience about how our understanding of emotions has changed over time.

This session will not provide a definitive answer on when or how to use trigger warnings but aims to challenge the audience to think critically about whether trigger warnings are useful in their own work.
Speakers

Kizzy Gandy

National Director, Program Evaluation, Verian
Dr Kizzy Gandy is Verian's National Director of Program Evaluation. She leads a team of expert methodologists and provides quality assurance. With 20 years’ experience in consultancy, federal and state government, and academia, Kizzy has overseen the design and evaluation of over...
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
104, 113 Convention Centre Pl, South Wharf VIC 3006, Australia

5:00pm AEST

Development and implementation of a culturally grounded evaluation framework: Learnings from an Aboriginal and Torres Strait Islander Peak
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
Authors: Candice Butler (Queensland Aboriginal and Torres Strait Islander Child Protection Peak), Michelle McIntyre (Queensland Aboriginal and Torres Strait Islander Child Protection Peak, AU), John Prince (JKP Consulting, AU)

There is increasing recognition that evaluations of Aboriginal and Torres Strait Islander programs must be culturally safe and appropriate, and represent the worldviews, priorities, and perspectives of Aboriginal and Torres Strait Islander communities. Aboriginal and Torres Strait Islander peoples have the cultural knowledge and cultural authority to design appropriate evaluations that are safe, and that tell the true story of the impacts of our ways of working.

As a peak body for Aboriginal and Torres Strait Islander community-controlled organisations we wanted to ensure that the worldviews and perspectives of our members and communities are embedded in any evaluations of services delivered by our member organisations. This is a necessary step towards building an evidence base for our ways of working, developed by and for Aboriginal and Torres Strait Islander people. To that end we developed an evaluation framework to enable self-determination and data sovereignty in evaluation, and to build capacity among our member organisations to undertake and/or commission culturally grounded evaluations. Culturally grounded evaluations are led by Aboriginal and Torres Strait Islander people and guided by our worldviews and knowledge systems - our ways of knowing, being and doing.

This paper reports on the development and implementation process used in the project and describes the standards and principles which underpin the framework. An example of how the framework is being applied in practice is also outlined. Our principles for evaluation describe the core values which underpin culturally grounded and safe evaluation including self-determination; cultural authority; truth-telling; two-way learning; and holistic approaches. The evaluation standards and associated elements operationalise our principles and embed them in evaluative practice.
Chair

Carlos Rodriguez

Senior Manager Strategy & Evaluation, Department of Energy Environment and Climate Action
Speakers

Candice Butler

Executive Director, Centre of Excellence, Queensland Aboriginal and Torres Strait Islander Child Protection Peak
Wednesday September 18, 2024 5:00pm - 5:30pm AEST
101-102, 105 Convention Centre Pl, South Wharf VIC 3006, Australia
 