Conference hashtag #aes24MEL
Thursday, September 19

10:30am AEST

An evaluator in need of an evaluation
Thursday September 19, 2024 10:30am - 11:30am AEST
Room 105
Authors: Dana Cross (Grosvenor), Kristy Hornby (Grosvenor)

"If all you have is a hammer, then everything looks like a nail." - Maslow/Kaplan/Unknown

Maslow's Hammer (aka the law of the instrument or golden hammer) and déformation professionnelle are concepts that speak to cognitive biases that can limit our effectiveness. Essentially, they mean that we reach for what we know, and as evaluators, that is evaluation.

How can we, as evaluators and commissioners of evaluations, avoid cognitive bias and work effectively within (evaluation) policy parameters to ensure we are adding value, rather than using evaluation as the only tool in our toolbox?

We invite you to join us in a fast-paced interactive session to unpack:
  • the ways in which our expertise can get in our way
  • what it means to stay open to other tools as evaluation professionals and commissioners of evaluation
  • how this challenges us as individuals and as a profession.
Chair

Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Speakers

Dana Cross

Associate Director, Grosvenor
Dana is a public sector expert with over 17 years of deep experience advising government organisations on program evaluation, organisational review, service optimisation and performance management. She is a member of Grosvenor’s Executive Leadership Team as Head of Strategy...
Room 105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Navigating complex government reforms: a tool to examine program theory. How complete and relevant is government program theory?
Thursday September 19, 2024 11:30am - 12:00pm AEST
Room 105
Authors: Nerida Rixon

Developing program theory for complex government reforms and charting the 'how' we will reach our destination is not easy. Perhaps you, like me, rely on well-tested templates? Do we challenge them and continually refine them to reflect emerging research and make them more useful for our purposes?

This research is about public policy packages and reforms and the program theories (or reform theories) that explain them. It emerges from a desire to build better program theory, particularly in the context of whole-of-government reforms. Better program theory can drive better planning, monitoring and evaluation of performance, and better policy and public good.

Evidence shows Australian governments are not effectively planning, monitoring and evaluating performance of programs and policy packages. Theory can support development of meaningful performance indicators to track progress. Without strong program theory and clear strategy, as the Productivity Commission's recent 'Review of the National Agreement on Closing the Gap' study report suggests, we risk a 'spray and pray' approach to change, prioritisation of the wrong things and siloed policy responses.

A literature-informed checklist for analysing program theory for completeness and relevance to public administration is provided. Policy makers and evaluators are given a tool and lens to build more complete and relevant program theory and to improve existing program theory.

Analysis of program theory in 15 government reform strategies and outcomes frameworks is presented to show governments' strengths and opportunities. Governments are strong at identifying our destination, or the intended outcomes, though not always at identifying the 'how' of getting there. Governments could make their program theory more explicit and more complete by articulating the 'when': when we expect to see changes from implementing the reforms. Government program theory might also be more relevant if potential unintended outcomes were referenced.
Chair

Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Speakers
Room 105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia

12:00pm AEST

Optimising Evaluations of Wellbeing Programs in Schools
Thursday September 19, 2024 12:00pm - 12:30pm AEST
Room 105
Authors: Tamara Van Der Zant (Australian Council for Educational Research), Katherine Dix (Australian Council for Educational Research, AU)

In this presentation we will discuss the diverse and flexible data collection methods suitable for program evaluation in the context of schools. We will cover the types of evidence that can be used to evaluate social and emotional learning programs and wellbeing initiatives specifically, and considerations when working with educators, children and young people. We will invite all attendees to participate in discussions about challenges to evaluating these programs in complex, real-world contexts (including data quality, confounding factors, system requirements, etc.) and propose methods we use to navigate these challenges.

Optimising program evaluation methods is important because of the ever-growing number of wellbeing programs being offered to schools. Accordingly, the need for high quality evaluation to guide funding decisions and use of programs and initiatives to support student and educator wellbeing in schools has never been greater.

By drawing on comprehensive experience in undertaking wellbeing program evaluations, this presentation will share our lessons learnt and recommendations to support evaluators in crafting contextually appropriate evaluations. High-quality program evaluations, often a requirement for ongoing funding, address the growing need for meaningful and accessible evidence that is currently being sought by schools, educators, funders, and policy decision makers.
Chair

Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member...
Speakers

Tamara Van Der Zant

Research Fellow, Australian Council for Educational Research
Tamara is a Research Fellow in the Educational Monitoring and Research Division at ACER. Prior to this role she completed her PhD in emotion research at the University of Queensland. She brings experience in research design, conducting research with diverse populations, broad data...

Dr Katherine Dix

Principal Research Fellow, School and System Improvement, Australian Council for Educational Research
Dr Katherine Dix is a Principal Research Fellow at ACER, with over 20 years as a program evaluator, educational researcher and Project Director. Dr Dix is the National Project Manager for Australia’s participation in OECD TALIS 2024, and is a leading expert in wellbeing and whole-school...
Room 105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia

1:30pm AEST

The Art of Qualitative Sensemaking: Exploring New Methods
Thursday September 19, 2024 1:30pm - 2:30pm AEST
Room 105
Authors: Alli Burness (Tetra Tech), Sharon Marra-Brown (ARTD, AU), Matt Healey (First Person Consulting, AU), Monica Wabuke (Tetra Tech, FJ)

Sensemaking is the process of making meaning and distilling the signal from the noise in primary research. Inclusive and transparent sensemaking maintains the critical link between evidence and insights, ensuring that evidence is interpreted accurately and that the views of participants are understood correctly. Using intentional sensemaking approaches with integrity can ensure transparency and logical rigour in an evaluation or research project.

Despite its critical nature, sensemaking can often be the most opaque step in an evaluation process. While replication is a hallmark of good sensemaking, especially in academia, it is not always feasible in the fast-paced world of evaluation. The time required to do sensemaking well, the importance of applying the correct approaches and engaging the correct parties, and the critical role of a lead facilitator can be overlooked or underestimated. By shining a spotlight on this step in an evaluation, this session will highlight inclusive and accessible sensemaking approaches used across the design and evaluation spectrum to identify new or emergent approaches. It will pay particular attention to sensemaking when working in complex systems.

Panellists bring deep experience in evaluation or design research in Australian and international consulting settings. They will touch on sensemaking approaches that can maintain integrity through the rapid or agile sensemaking processes common in large or complex evaluations; popular sensemaking processes for coding data, and new or emerging methods; and how insights and recommendations emerge from the sensemaking process. The moderator will open the panel by reflecting on the definitions, understanding and application of sensemaking, with an emphasis on inclusive and accessible aspects. Our presenters will then explore methods through this same lens, with a focus on emergent or new approaches. Methods will be presented so that audience members can learn and apply them.
Chair

Janet Conte

Principal Evaluation Officer, DPIRD
I live in Perth (Boorloo) and have 3 children. I really enjoy being a co-convenor of the WA branch of the AES with Lisette Kaleveld. I'm interested in learning more about systems evaluation and building an evaluation culture.
Speakers

Matt Healey

Principal Consultant, First Person Consulting
My career in evaluation started fairly traditionally. I joined a small firm as a Research Assistant in early 2014 with no idea what evaluation was, or what I was in for! Since then I have: co-founded and grown an organisation (First Person Consulting) to a team of 16 people working...

Sharon Marra-Brown

Director, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence. Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade plus in the public service or how I weave...

Monica Wabuke

Associate Director - Research, Monitoring and Evaluation Practice, Tetra Tech International Development - Asia Pacific
Monica Wabuke is an Associate Director within Tetra Tech’s Research, Monitoring and Evaluation Practice (RME). She brings 14 years of experience in design, monitoring and evaluation and has provided technical support to DFAT, MFAT, EU, USAID and World Bank-funded projects and programs...

Alli Burness

Director, Australian Consulting, Tetra Tech
Alli is an Australian strategic designer and researcher with settler heritage, born and living on Bunurong Country. As Director of the Australian Consulting Practice at Tetra Tech International Development, Alli works with a First Nations team to support relational approaches across...
Room 105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Our new ways: Reforming our approach to impact measurement and learning
Thursday September 19, 2024 2:30pm - 3:00pm AEST
Room 105
Authors: Kaitlyn Scannell (Minderoo Foundation), Adriaan Wolvaardt (Minderoo Foundation, AU), Nicola Johnstone (Minderoo Foundation, AU), Kirsty Kirkwood (Minderoo Foundation, AU)

We have been on a journey to bring awareness, evidence and understanding to the impact of our organisation since inception, and in earnest since 2016. For years, we felt the tension of trying to solve complex problems with measurement and learning approaches that are better suited to solving simple problems.

To change the world, we must first change ourselves. In early 2023 we had the extraordinary opportunity to completely reimagine our approach to impact measurement and learning. What we sought was an approach to measurement and learning that could thrive in complexity, rather than merely tolerate it, or worse, resist it.

We are not alone in our pursuit. Across government and the for-purpose sector, practitioners are exploring and discovering how to measure, learn, manage, and lead in complexity. Those who explore often discover that the first step they need to take is to encourage the repatterning of their own organisational system. A system which, in the words of Donella Meadows, "naturally resists its own transformation."

In this presentation we will delve into two themes that have emerged from our journey so far:
  • Transforming ourselves - We will explore what it takes to embed a systems-led approach to measurement, evaluation and learning in an organisation.
  • Sharing knowledge - We will discuss methods for generating, sharing, and storing knowledge about what works for measuring, evaluating, and learning in complexity.

The purpose of this session is to share what we have learnt with anyone who is grappling with how their organisation might measure and learn in complexity. We have been touched by the generosity of those who have accompanied us on our journey, sharing their experiences and wisdom. This presentation marks our initial effort to pay that generosity forward.
Chair

Janet Conte

Principal Evaluation Officer, DPIRD
I live in Perth (Boorloo) and have 3 children. I really enjoy being a co-convenor of the WA branch of the AES with Lisette Kaleveld. I'm interested in learning more about systems evaluation and building an evaluation culture.
Speakers
Room 105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

The learning journey: competency self-assessment for personal learning and professional development
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Room 105
Authors: Amy Gullickson (University of Melbourne), Taimur Siddiqi (Victorian Legal Services, AU)

The AES, in collaboration with learnevaluation.org, offers a competency self-assessment to members. The aim is to help individuals understand their strengths and plan their learning journey, to help the AES continue to tailor its professional development offerings and develop pathways to professionalisation, and to contribute to ongoing research about evaluation learners. In this session, members of the AES Pathways Committee will briefly summarise the findings from the self-assessment and then invite participants into groups by discipline and sector to discuss: Which competencies are really core, and why? Reporting out from groups will reveal whether the core competencies differ based on the sectors and backgrounds of the evaluators. The follow-up discussion will then explore: What do the findings mean for evaluation practice, and for teaching and learning? How do they relate to professionalisation? If we want to increase clarity about what good evaluation practice looks like, what are our next steps related to the competencies?

Participants will benefit from reflecting on their own competency self-assessment in relation to the findings and discussion, and from discovering how the backgrounds of learners influence their ideas about core competencies. The session findings will be shared with the AES Pathways Committee to inform the AES' next steps for the competencies, the self-assessment, and the ongoing discussion of pathways to professionalisation.

Chair

Peter Bowers

Assistant Director, Australian Centre for Evaluation (ACE)
I am part of the Australian Centre for Evaluation in Commonwealth Treasury that was set up to increase the volume, quality and use of evaluation across the Commonwealth government. I have a particular interest in RCTs. Come and speak to me if you would like to run an RCT in your...
Speakers

Amy Gullickson

Associate Professor, The University of Melbourne
I'm an Associate Professor of Evaluation at the University of Melbourne Assessment and Evaluation Research Centre. I'm also a co-founder and current chair of the International Society for Evaluation Education (https://www.isee-evaled.com/), a long-time member of the AES Pathways Committee (and its predecessors), and an architect of the University of Melbourne’s fully online, multi-disciplinary Master and Graduate Certificate of Evaluation programs (https://study.unimelb.edu.au/find/courses/graduate/master-of-evaluation/). I practice, teach, and proselytize evaluation...

Taimur Siddiqi

Evaluation manager, Victorian Legal Services Board+Commissioner
Taimur is an experienced evaluation and impact measurement professional who is currently the evaluation manager at the Victorian Legal Services Board + Commissioner and a member of the AES Board Pathways Committee. He is also a freelance evaluation consultant and was previously the...
Room 105, 109 Convention Centre Pl, South Wharf VIC 3006, Australia
 