Conference hashtag #aes24MEL
Thursday, September 19
 

10:30am AEST

Navigating organisational turbulence: An evaluation-based strategic learning model for organisational sustainability
Thursday September 19, 2024 10:30am - 11:00am AEST
103
Authors: Shefton Parker, Monash University; Amanda Sampson, Monash University

Increasingly turbulent and rapidly changing global operating environments are disrupting institutions' plan implementation and strategy realisation. The session introduces a novel organisational collaborative strategic learning and effectiveness model, intended to bolster organisational resilience amidst such turbulence.
A scarcity of suitable organisational strategic learning systems-thinking models that utilise evaluation methodology in a joined-up way prompted the presenters to develop their own. The model is tailored for strategic implementation in a complex organisational system environment, operating across decentralised portfolios with multiple planning and operational layers. It amalgamates evaluation methodologies to identify, capture, share and respond to strategic learning in a complex system. It is hypothesised that the model will outperform conventional organisational performance-based reporting systems in terms of organisational responsiveness, agility, adaptability, collaboration, and strategic effectiveness.
The presentation highlights the potential value of integrating and embedding evaluation approaches into an organisation's strategy, governance and operations using a three-pronged approach:
- Sensing: Gathering relevant, useful and timely data (learning);
- Making sense: Analysing and contextualising learning data alongside other relevant data (institutional performance data, emerging trends, policy and legislative reform, etc.); and
- Good sense decisions: Providing timely and relevant evaluative intelligence and insights to support evidence-based good decision making.
The presenters advocate for a shift from viewing evaluation use as a 'nice to have' to a 'must have' aspect of organisational growth and sustainability. The model aims to foster a leadership culture where decision makers value the insights that contextualised holistic organisational intelligence can provide for:

i) Strategic planning: Enhanced planning and strategic alignment across portfolios;

ii) Operational efficiency: Reducing duplication in strategic effort and better collaboration towards strategic outcomes;

iii) Business resilience and sustainability: Improved identification and quicker response to emerging opportunities and challenges; and

iv) Strategic effectiveness: Informing activity adaptation recommendations for strategic goal realisation.
Chair
Michael Amon

Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD... Read More →
Speakers
Shefton Parker

Senior Evidence & Evaluation Adviser, Monash University - Institutional Planning
Dr Shefton Parker is an evaluator and researcher with over 15 years of specialist experience in program and systems evaluation within the Vocational and Higher Education sectors. Recently, his evaluation of innovative education programs was referenced as evidence in the University... Read More →
Amanda Sampson

Senior Manager, Institutional Planning, Monash University
I am leading the development and implementation of an Institutional Evaluation Model for a complex organisation, supporting organisational resilience, strategic adaptation and execution to realise the 10-year organisational strategic objectives. I am interested in learning how to... Read More →
Thursday September 19, 2024 10:30am - 11:00am AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

10:30am AEST

An evaluator in need of an evaluation
Thursday September 19, 2024 10:30am - 11:30am AEST
105
Authors: Dana Cross (Grosvenor), Kristy Hornby (Grosvenor)

"If all you have is a hammer, then everything looks like a nail." - Maslow/Kaplan/Unknown

Maslow's Hammer (aka the law of the instrument or golden hammer) and déformation professionnelle are concepts that speak to cognitive biases that can limit our effectiveness. Essentially, they mean that we use what we know; as evaluators, that is evaluation.

How can we as evaluators and commissioners of evaluations avoid cognitive bias and work effectively within (evaluation) policy parameters to ensure we are adding value and not using evaluation as the only tool in our toolbox?

We invite you to join us in a fast-paced interactive session to unpack:
  • the ways in which our expertise can get in our way
  • what it means to stay open to other tools as evaluation professionals and commissioners of evaluation
  • how this challenges us as individuals and as a profession.
Chair
Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member... Read More →
Speakers
Dana Cross

Associate Director, Grosvenor
Dana is a public sector expert, possessing over 17 years of deep experience advising government organisations on program evaluation, organisational review, service optimisation and performance management. She is a member of Grosvenor’s Executive Leadership Team as Head of Strategy... Read More →
Thursday September 19, 2024 10:30am - 11:30am AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

11:30am AEST

Navigating complex government reforms: a tool to examine program theory. How complete and relevant is government program theory?
Thursday September 19, 2024 11:30am - 12:00pm AEST
105
Authors: Nerida Rixon

Developing program theory for complex government reforms and charting the 'how' we will reach our destination is not easy. Perhaps you, like me, rely on well tested templates? Do we challenge them and continually refine them to reflect emerging research and make them more useful for our purposes?

This research is about public policy packages and reforms and the program theories (or reform theories) that explain them. This research emerges from a desire to build program theory better, particularly in the context of whole of government reforms. Better program theory can drive better planning, monitoring and evaluation of performance, and better policy and public good.

Evidence shows Australian governments are not effectively planning, monitoring and evaluating performance of programs and policy packages. Theory can support development of meaningful performance indicators to track progress. Without strong program theory and clear strategy, as the Productivity Commission's recent 'Review of the National Agreement on Closing the Gap' study report suggests, we risk a 'spray and pray' approach to change, prioritisation of the wrong things and siloed policy responses.

A literature-informed checklist to analyse program theory for completeness and relevance to public administration is provided. Policy makers and evaluators are given a tool and lens to build more complete and relevant program theory and to improve existing program theory.

Analysis of program theory in 15 government reform strategies and outcomes frameworks is presented to show governments' strengths and opportunities. Governments are strong at identifying our destination, or the intended outcomes, though not always at identifying 'how' we will get there. Governments could improve their program theory by making it more explicit and more complete, articulating 'when' we expect to see changes from implementing the reforms. Government program theory might also be more relevant if potential (unintended) outcomes are referenced.
Chair
Charlie Tulloch

Director, Policy Performance
Policy Performance is a proud conference sponsor! Charlie delivers evaluation projects, capability building support and drives public sector improvement. Charlie loves to help those who are new to evaluation or transitioning from related disciplines. He is a past AES Board member... Read More →
Speakers
Thursday September 19, 2024 11:30am - 12:00pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia

2:30pm AEST

Where next? Evaluation to transformation
Thursday September 19, 2024 2:30pm - 3:00pm AEST
103
Authors: Rachel Wilks (Grosvenor), Kristy Hornby (Grosvenor), Sarika Bhana (Grosvenor)

What is evaluation? Better Evaluation defines it as "any systematic process to judge merit, worth or significance by combining evidence and values". Many government organisations and some private and not-for-profit entities use evaluations as an auditing tool to measure how well their programs are delivering against intended outcomes and impacts and achieving value for money. This lends itself to viewing evaluation as an audit or 'tick-box' exercise when it is really measuring the delivery of an organisation's mandate or strategy (or part thereof). Viewing evaluation more as an audit than a core part of continuous improvement presents a risk of our reports collecting dust.

During this session, we will discuss factors that build a continuous improvement mindset across evaluation teams, as well as across the broader organisation. This will include exploring how to manage the balance between providing independent advice with practical solutions that program owners and other decision-makers can implement more readily, as well as how to obtain greater buy-in to evaluation practice. We present the features that evaluations should have to ensure findings and conclusions can be easily translated into clear actions for improvement.

We contend that it is important to consider evaluation within the broader organisational context, considering where this might link to strategy or how it may be utilised to provide evidence to support funding bids. This understanding will help to ensure evaluations are designed and delivered in a way that best supports the wider organisation.

We end by sharing our post-evaluation playbook - a practical tool to help take your evaluations from pesky paperweight to purposeful pathway.

Chair
Prescilla Perera

Principal Monitoring and Evaluation Officer, DFFH
Speakers
Rachel Wilks

Senior Consultant, Grosvenor
Rachel is a management consultant and an emerging evaluator at Grosvenor. She took her first steps into the evaluation world two years ago, and since then has been increasingly interested in how evaluation can be used in and across the public sector and not-for-profit space. Rachel... Read More →
Thursday September 19, 2024 2:30pm - 3:00pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia
  Tools

3:30pm AEST

Constructing a Wisdom Base: A Hands-On Exploration of First Nations Knowledge Systems
Thursday September 19, 2024 3:30pm - 4:30pm AEST
106
Authors: Skye Trudgett (Kowa), Haley Ferguson (Kowa, AU), Tara Beattie (Kowa, AU), Levi McKenzie-Kirkbright (Kowa, AU), Jess Dart (Clear Horizon, AU)

In the pursuit of understanding and honouring the depth of First Nations wisdom, this hands-on session at the AES conference introduces the Ancestral Knowledge Tapestry, a living guide for developing a repository of ancestral knowledge, practices, and philosophies. Participants will actively engage in co-creating a 'Wisdom Base,' a collective endeavour to encapsulate the richness of old and new First Nations knowledges and their application to contemporary evaluative practices.

Through interactive exercises, collaborative dialogue, and reflective practices, attendees will delve into the components of the Ancestral Knowledge Tapestry, exploring the symbiosis between deep knowing, artefacts, deep listening and truth-telling. The session aims to empower participants, particularly those from First Nations communities, to identify, document, and share their unique wisdom in ways that foster self-determination and cultural continuity.
Attendees will emerge from this workshop not only with a deeper appreciation for the intrinsic value of First Nations knowledge systems but also with practical insights into how to cultivate a Wisdom Base that not only preserves but actively revitalises First Nations wisdom for future generations.

Chair
Sandra Ayoo

Assistant Professor, University of North Carolina Greensboro
Dr. Ayoo is an Assistant Professor of Educational Research Methodology in the Department of Information, Library, and Research Science at the School of Education, University of North Carolina Greensboro. She teaches graduate courses in program evaluation and research methodology... Read More →
Speakers
Skye Trudgett

CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application... Read More →
Levi McKenzie-Kirkbright

Software Engineer, Kowa Collaboration
Software engineer at Kowa investigating how to implement Indigenous data sovereignty principles into software systems.
Tara Beattie

Consultant, Kowa Collaboration
Tara Beattie is a dedicated professional who is passionate about fostering positive change in Community.  As a Consultant at Kowa Collaboration, Tara leads projects designed to empower organisations in First Nations UMEL practices, aligning with Kowa's commitment to amplifying First... Read More →
Jess Dart

Chief Evaluator and Founder, Clear Horizon Consulting
Dr Jess Dart is the founder and Chief Evaluator of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australian Evaluation Society (AES), Jess is a recognised leader with over 25 years of... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
106 102 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Growing Australia's future evaluators: Lessons from emerging evaluator networks across the Asia Pacific
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Authors: Amanda Mottershead (Tetra Tech International Development), Qudratullah Jahid (Oxford Policy Management Australia, AU), Eroni Wavu (Pacific Community, FJ)

The sustainability of the evaluation sector requires emerging evaluators to be supported in pursuing high-quality practice. What this support needs to be and how it should be developed is much less certain. What topics should we focus on? How should we deliver it? Who should we deliver it to? How can the broader evaluation community support emerging evaluators?

Global experiences in emerging evaluator support contain a treasure trove of lessons which can fill this knowledge gap and inform effective support here in Australia. Experiences show that fostering a strong evaluation community, that includes emerging evaluators, can nurture, ignite and shape future evaluation practices. A variety of approaches are being adopted across the region, and the globe, to foster this sense of community, that range from formal approaches to capacity building to more informal approaches that focus on experience sharing.

In this session, we bring together current and former emerging evaluator leaders from across the Asia Pacific region to answer some of these questions and understand what approaches could work best for the Australian context. This will include presentations and discussion on in-demand topics, how to formulate support, how to target emerging evaluators and the best means of delivery. The session will be highly interactive, engaging the audience in a question-and-answer forum on this important topic. All panel members have been engaged with emerging evaluator networks in their countries or regions and bring diverse experiences to facilitate cross learning. The session will provide practical ways forward for the broader evaluation community to grow and support the future of evaluation.
Chair
Speakers
Qudratullah Jahid

Senior MEL Consultant, Oxford Policy Management
I am a monitoring, evaluation, research, and learning specialist with a background in bilateral and multilateral development organisations. With expertise in MEL frameworks and systems, I support OPM projects in the Indo-Pacific. My focus areas include MEL frameworks, mixed methods... Read More →
Amanda Mottershead

Consultant - Research, Monitoring and Evaluation, Tetra Tech International Development
I enjoy the breadth of evaluation in international development. I've had the opportunity to work across sectors including economic development, infrastructure, energy, education and inclusion. I enjoy generating evidence that promotes improvements to organisations, policies and programs... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
Plenary 1 114 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

Committed to mentoring
Thursday September 19, 2024 3:30pm - 4:30pm AEST
103
Authors: Julie Elliott (Independent Evaluator), Jill Thomas (J.A Thomas & Associates, AU), Martina Donkers (Independent Evaluator, AU)

Mentors and mentees from the AES Group Mentoring Program share rich experiences of group learning, knowledge sharing, and reflective practice, exploring the Wayfinding skills, knowledge, and expertise they have found through the program and the valuable lessons learned.

AES remains committed to mentoring, and this session provides a unique opportunity to hear perspectives across the mentoring spectrum, from Fellows to emerging evaluators, and the ways that sharing our professional practice enhances our work. Since 2021, the AES Group Mentoring Program has been a trailblazer in fostering professional growth and competencies for emerging and mid-career evaluators, enabling mentors and peers to help navigate unfamiliar territories, incorporating various tools and strategies.

Our dynamic panel will discuss how evaluators have adapted their approaches to mentoring and to evaluation practice with the support of the program. It's a session where personal and professional growth intersect and will offer a unique perspective on the transformative power of mentorship.

This discussion is for evaluators who are passionate about learning - both their own and that of other AES members! Whether you're a seasoned professional eager to contribute to your community, an emerging talent or a mid-career evaluator navigating contemporary evaluation ecosystems, this session is for you. Don't miss this opportunity to hear directly from mentors and mentees who value the shared, continuous journey of social learning and adaptation.

Chair
Laura Holbeck

Monitoring, Evaluation & Learning Manager, Australian Humanitarian Partnership, Alinea International
Speakers
Julie Elliott

Evaluator
Collaborator and evaluation scholar-practitioner committed to acknowledging the complexity inherent in all human settings.
Rick Cummings

Emeritus Professor, Murdoch University
Rick Cummings is an Emeritus Professor in Public Policy at Murdoch University. He has 40 years of experience conducting evaluation studies in education, training, health, and crime prevention primarily for the state and commonwealth government agencies and the World Bank. He currently... Read More →
Martina Donkers

Independent Evaluator
I'm an independent freelance evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured... Read More →
Lydia Phillips

Principal Consultant, Lydia Phillips Consulting
I operate an independent consulting practice, providing evaluation and social policy services to community organisations and government.With a background in law and social policy, I have more than 15 years' experience building and using evidence in order to create positive social... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
103 110 Convention Centre Pl, South Wharf VIC 3006, Australia

3:30pm AEST

The learning journey: competency self-assessment for personal learning and professional development
Thursday September 19, 2024 3:30pm - 4:30pm AEST
105
Authors: Amy Gullickson (University of Melbourne), Taimur Siddiqi (Victorian Legal Services, AU)

AES, in collaboration with learnevaluation.org, offers a competency self-assessment to members. The aim is to help individuals understand their strengths and plan their learning journey, to help the AES continue to tailor its professional development offerings and develop pathways to professionalisation, and to contribute to ongoing research about evaluation learners. In this session, members of the AES Pathways Committee will briefly summarise the findings from the self-assessment and then invite participants into groups by discipline and sector to discuss: Which competencies are really core, and why? Reporting out from groups will reveal whether the core competencies differ based on the sectors and backgrounds of the evaluators. The follow-up discussion will then explore: What do the findings mean for evaluation practice, and teaching and learning? How do they relate to professionalisation? If we want to increase clarity about what good evaluation practice looks like, what are our next steps related to the competencies?

Participants will benefit from reflecting on their own competency self-assessment in relation to the findings and discussion, and discovering how the backgrounds of learners influence their ideas about core competencies. The session findings will be shared with the AES Pathways Committee to inform AES' next steps for the competencies, the self-assessment, and the ongoing discussion of pathways to professionalisation.

Chair
Peter Bowers

Assistant Director, Australian Centre for Evaluation (ACE)
I am part of the Australian Centre for Evaluation in Commonwealth Treasury that was set up to increase the volume, quality and use of evaluation across the Commonwealth government. I have a particular interest in RCTs. Come and speak to me if you would like to run an RCT in your... Read More →
Speakers
Amy Gullickson

Associate Professor, The University of Melbourne
I'm an Associate Professor of Evaluation at the University of Melbourne Assessment and Evaluation Research Centre. I'm also a co-founder and current chair of the International Society for Evaluation Education https://www.isee-evaled.com/, a long-time member of the AES Pathways Committee (and its predecessors), and an architect of the University of Melbourne's fully online, multi-disciplinary, Master and Graduate Certificate of Evaluation programs https://study.unimelb.edu.au/find/courses/graduate/master-of-evaluation/. I practice, teach, and proselytize evaluation... Read More →
Taimur Siddiqi

Evaluation manager, Victorian Legal Services Board+Commissioner
Taimur is an experienced evaluation and impact measurement professional who is currently the evaluation manager at the Victorian Legal Services Board + Commissioner and a member of the AES Board Pathways Committee. He is also a freelance evaluation consultant and was previously the... Read More →
Thursday September 19, 2024 3:30pm - 4:30pm AEST
105 109 Convention Centre Pl, South Wharf VIC 3006, Australia