-
During 2021 a working group of the AES Pathways Committee piloted an online group mentoring model. This was in response to the high demand for mentorship recorded in the membership survey conducted as part of the 2019–22 strategic planning process. Under the Pilot, a diverse group of 22 mentees benefited from the mentorship of four AES Fellows – Penny Hawkins, Scott Bayley, Rick Cummings and Alan Woodward – and one associate mentor, Duncan Rintoul.
During the Pilot, a group of talented CPE Evaluation Capstone students – Kate Glastonbury, Ion Ioannidis, Bronwyn Ledgard, Martina Donkers and Georgia Pinto – evaluated the program. This is their report.
-
Evaluation by Tara Collins
This report presents the planning, methodology, key findings and recommendations of the evaluation of FestEVAL 2020. Designed and delivered by the Australian Evaluation Society (AES), FestEVAL was held from 21 to 25 September 2020 and offered a series of free, online evaluation-themed activities centred on the theme of “Celebrating Evaluation”.
The evaluation was led by a volunteer Master of Evaluation student, Ms Tara Collins. Ms Collins benefited from the support and guidance of the AES, as well as that of an experienced evaluation fellow, Dr Dorothy Lucks.
-
COVID has required the AES and its members to work in fundamentally different ways. The AES has implemented a range of measures to maintain services for its members under COVID restrictions, which have prevented face-to-face activities and led to the suspension of the 2020 conference. We surveyed AES members and others from the evaluation sector to identify further initiatives that the AES could take to improve services during the COVID period and beyond.
The objectives of the survey were to identify:
- Actions/initiatives that the AES could undertake in the immediate term to enhance the services offered to and accessed by its members.
- How services should be delivered in the longer term, once COVID restrictions ease (should the mode of operation return to pre-COVID arrangements, or should some of the changes continue?).
-
Evaluation led by Emily MacKay, with support from Penny Smallwood in scoping, survey design and quantitative analysis
This report presents the planning, approach and findings of the evaluation of ‘AES18’, the annual international evaluation conference of the Australian Evaluation Society. The AES, as both commissioner and provider of the conference, engaged two volunteer external evaluators to undertake the project; one of them, the report’s author, was a Master of Evaluation student.
Two key evaluation questions (KEQs) and a series of sub-questions were developed in collaboration with key AES stakeholders: (1) How valuable was the conference for the range of delegates? and (2) To what extent did the conference contribute to advancing the strategic objectives of the AES Board?
A three-phase, fixed convergent mixed-methods design was conceptualised for the project. Data was collected through interviews during the conference and an online survey of participants afterwards. The overarching purpose of the evaluation was to determine the overall value of the conference and to identify areas for improvement, whether in the conference itself or in AES services. The report conveys that the evaluation community placed a high degree of value on involvement in AES18, and notes positive developments towards addressing the strategic objectives of the AES.
-
By Greet Peersman and Patricia Rogers
Commissioned by the AES Board, this report outlines possible pathways to achieve one of the AES’ Strategic Priorities – ‘to strengthen the capacity and professionalism of the evaluation sector’. By professionalism, we do not mean working to make evaluation an exclusive profession, with barriers to entry including compulsory accredited qualifications and licensing. Instead, we mean continuously improving the quality of evaluation practice and products.
-
Evaluation by Jen Thompson
This report communicates the approach, methodologies and findings of the evaluation of the 2017 AES International Conference. The evaluation was conducted by a solo evaluator, Jen Thompson, a Master of Evaluation student at the University of Melbourne. The AES commissioned the evaluation and hosted the author as an intern for the duration of the evaluation and post-Capstone obligations.
Two key evaluation questions (KEQs) and nine sub-questions were developed in consultation with the AES2017 conference organising committee. The KEQs asked “Did the conference satisfy the professional interests of conference delegates?” and “Did the conference satisfy the strategic interests of the AES Board?”.
A mixed-methods research approach was employed, with two data collection points: 20 at-conference interviews and a post-conference online survey of 215 participants. The evaluation primarily explored the merit (quality) and worth (value) of the conference as perceived by participants and expressed in terms of satisfaction levels.
-
Evaluation by Benedict Shipp
This report is an evaluation of the AES International Evaluation Conference held in Perth from 19 to 21 September 2016. Themed “Landscapes”, the conference attracted more than 300 participants. It was preceded by two days of pre-conference workshops, which were out of scope for this evaluation.
The 2016 conference featured 110 presentations over three days, plus a special session for the AES Annual General Meeting and the AES Fellows. Each presentation was allocated to one of six conference strands (excluding the AES Annual General Meeting):
• Plenary session
• Evaluation Landscapes
• Natural and Built Landscapes
• Organisational, Political and Economic Landscapes
• Social and Cultural Landscapes
• Special Session.
The evaluation gathered data from three sources: an Online Attendee Survey, an Online Non-Attendee Survey, and semi-structured interviews conducted during the three days of the conference.
All the data collected was collated and used to answer the key evaluation question: ‘Overall, did the conference attendees perceive there was value in attending the conference?’
-
Evaluation by Emma Walters
The AES 2015 International Evaluation Conference was held in Melbourne from 6 to 9 September 2015. Themed “Reaching Across Boundaries”, the AES 2015 conference attracted over 500 participants. Data was collected from attendees and non-attendees during and after the conference to answer the overarching evaluation question: Overall, did the conference attendees perceive there was value in attending the conference? If so, why? If not, why not? The specific questions this evaluation report addresses are:
- What were the reasons participants registered for the AES 2015 conference?
- What were the reasons that non-participants did not register for the AES 2015 conference?
- Did conference participants find the conference valuable? If so, why? If not, why not?
- What elements of the conference were most valuable to participants?
- What elements of the conference were least valuable to participants?
- Do participants perceive they are likely to change their evaluation practice in any way as a result of attending the conference? If so, how?
- How could the conference be improved in future?
Attendees of the conference perceived there was value in attending: approximately four-fifths of respondents (79%) found it “fairly valuable” or “extremely valuable”, although this is a small drop from the 85% who found the 2014 conference “fairly” or “extremely valuable”. The AES 2015 attendees particularly valued the networking and learning opportunities offered.
-
Evaluation by Charlie Tulloch and Sigrid Patterson
The Darwin 2014 Australasian Evaluation Society (AES) Conference was held from 8 to 12 September 2014 at the Australian Centre for Indigenous Knowledges (ACIKE), located at the Casuarina Campus of Charles Darwin University (CDU). The conference’s theme was ‘Unleashing the Power of Evaluation’. It featured two days of pre-conference workshops, two and a half days of conference activities, a conference opening event, a social program, international keynote speakers, a broad range of presentations by attendees, and other activities.
The AES sought an evaluation to assess the value of the conference and to identify ways to improve it over the next three to five years. AES Board members and key Darwin event organisers helped to shape the focus of the evaluation around three key themes: the participant experience; organisation and logistics; and the strategic alignment of the conference with the AES’ goals.
The evaluation was conducted by two Master of Evaluation students at the University of Melbourne, with support from the Centre for Program Evaluation and the AES, in particular two evaluation ‘fellows’ who provided expert guidance throughout the process.
The evaluation deliverables include this report on findings with its appendices, associated data collection tools that could be reused in future years, and a separate report for the AES detailing post-event participant and non-participant survey findings.
The findings focus largely on identifying critical challenges and areas for improvement in future.
-
Evaluation by David Turner
The 2013 Australasian Evaluation Society (AES) conference took place in Brisbane, Queensland, and has been evaluated on the basis of an online survey of conference participants. This is the third consecutive year in which a post-conference online survey has been used.
Responses to the conference were generally positive: participants rated the overall conference experience favourably, and learning opportunities, networking and skill development were recognised as positive elements of the experience.
Comparison with the previous two years, however, shows cause for concern. Assessments of the learning and skill development opportunities were lower than for previous conferences, and negative comments were made about the quality of the program and some elements of the conference arrangements. New aspects of the conference, such as the use of social media, continue to attract little support or use.
Some suggestions are provided for consideration in planning future conferences.
-
Evaluation by David Turner
The 2012 Australasian Evaluation Society (AES) conference took place in Adelaide, South Australia. The conference was attended by over 400 people, with a wide range of participants including delegates from Papua New Guinea, Thailand, the USA, the UK and other countries. Pre-conference workshops were held to provide professional development and are included in this report. The primary source of participant feedback for this evaluation was an online survey, which achieved a 57% response rate.
Evaluation results show generally positive responses to the conference. Most respondents said the conference provided them with value in a range of areas, such as knowledge of new evaluation approaches or methods, awareness of research being done elsewhere, and opportunities to develop new evaluation skills. Some concerns remain about presentation quality, opportunities to interact effectively with peers, and the adequacy of the time allowed for presentations.
This conference was the first to embrace new social media technologies, although take-up was low.
Suggestions are provided for building on the 2012 experience in future.
-
Evaluation by David Turner
This report summarises feedback on the 2011 Australasian Evaluation Society (AES) international conference in Sydney, Australia. It reflects a decision to go beyond the evaluations of previous conferences, which focused heavily on participant satisfaction. Instead, this year’s evaluation looked at what participants learned from the conference and how likely they think they are to make use of new contacts, information or skills in their future work.
Responses to an online survey indicated that participants assessed their experience highly in terms of what they learned and how they expect to use it. Concerns and criticisms were raised about issues that reduced the opportunity for effective interaction with other participants, or otherwise reduced the opportunity to build on the conference for professional development. Recommendations are given for addressing these concerns and building on this year’s evaluation approach for future conferences.