NCFE is an educational charity specialising in vocational and technical learning. Like RM, it is a long-established education organisation focused on enhancing learning and improving life chances.
Both organisations also share a deep passion for, and understanding of, innovation, and have worked together for a number of years in various capacities.
In this latest project, the RM Compare team worked with NCFE to tackle a particular assessment challenge with a new holistic approach: Adaptive Comparative Judgement (ACJ).
NCFE used RM Compare to improve how they assessed applications for two competitions: the NCFE Assessment Innovation Fund and the NCFE Aspiration Awards.
Together we demonstrated the transformative potential of this approach, delivering significant improvements in efficiency, reliability and stakeholder engagement across two critical, live evaluation processes.
NCFE sought to enhance their assessment processes, particularly for the Assessment Innovation Fund (AIF) and the Aspiration Awards shortlisting.
Project 1
Assessment Innovation Fund (AIF) evaluation
Project 2
Aspiration Awards shortlisting
NCFE implemented RM Compare’s Comparative Judgement (CJ) system for both the AIF and Aspiration Awards processes.
The NCFE Assessment Innovation Fund (AIF) is an initiative to support and develop innovative approaches to assessment in education. Its aims are to break boundaries in assessment practices, support and pilot new ideas for the future of assessment, build trust and confidence in innovative assessment methods, and add value to stakeholders including learners, educators, employers, and government.
40 applications received, plus 4 seed scripts from previous applications
61 volunteer judges, each tasked with making 10 judgements
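To put those numbers in perspective, here is a quick back-of-envelope sketch, assuming (as is standard in CJ) that each judgement is a pairwise comparison putting two applications side by side:

```python
judges, judgements_each = 61, 10
items = 40 + 4                                     # applications plus seed scripts
total_judgements = judges * judgements_each        # 610 pairwise comparisons in total
views_per_item = total_judgements * 2 / items      # each comparison involves 2 items
print(total_judgements, round(views_per_item, 1))  # 610 comparisons, ~27.7 views per item
```

On these assumptions each application is seen roughly 28 times, considerably more scrutiny per entry than a single-marker review would provide.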
The Aspiration Awards honour the success of learners, apprentices, educators, support staff and educational organisations across the UK. There are multiple awards, including Against All Odds, Apprentice of the Year, Centre of the Year and Learner of the Year.
The CJ method substantially reduced overall evaluation time in both trials. For the AIF, time spent on evaluation fell by 58%, despite involving more judges and more reviews per application. Similarly, for the Aspiration Awards, total shortlisting time fell by 43%, again with significantly more judges and more reviews of each application. The additional eyes on each application gave teams a high degree of confidence in the process compared with previous methods.
Feedback from stakeholders was overwhelmingly positive. Judges found the CJ process user-friendly and efficient. They appreciated the simplicity and the ability to manage their judgements around other commitments. The increased engagement, with more judges involved than in previous methods, contributed to a more inclusive and comprehensive evaluation process. Administrators also found the setup and management of CJ sessions straightforward, with time savings and significantly improved visibility of progress.
Significantly more judges involved in both processes
Aspiration Awards judging expanded beyond the communications team to include more diverse perspectives
Judges appreciated the user-friendly interface and flexibility
Ability to complete judgements around other work commitments
Administrators found setup and management straightforward
The comparative approach proved effective in evaluating a wide range of innovative and subjective entries. Holistic judgements allowed for nuanced evaluations beyond rigid criteria.
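RM Compare’s internal scoring model is not described here, but ACJ systems typically convert pairwise judgements into a rank order using a Bradley-Terry-style model. The sketch below is a minimal, illustrative fit using the classic iterative (MM) update, assuming judgements arrive as (winner, loser) pairs; all names are hypothetical:

```python
from collections import defaultdict

def bradley_terry(judgements, iters=100):
    """Estimate a relative quality score per item from pairwise (winner, loser)
    judgements using the classic Bradley-Terry MM update. Higher = better."""
    items = {i for pair in judgements for i in pair}
    wins = defaultdict(int)    # times each item was preferred
    pairs = defaultdict(int)   # times each unordered pair was compared
    for winner, loser in judgements:
        wins[winner] += 1
        pairs[frozenset((winner, loser))] += 1
    p = {i: 1.0 for i in items}  # initial strength estimates
    for _ in range(iters):
        new_p = {}
        for i in items:
            denom = sum(pairs[frozenset((i, j))] / (p[i] + p[j])
                        for j in items if j != i)
            new_p[i] = wins[i] / denom if denom else p[i]
        total = sum(new_p.values())  # normalise so strengths sum to len(items)
        p = {i: v * len(items) / total for i, v in new_p.items()}
    return p

# Toy usage: three entries, A preferred over B twice, B over C, A over C.
scores = bradley_terry([("A", "B"), ("A", "B"), ("B", "C"), ("A", "C")])
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```

The appeal for subjective entries is that judges only ever answer “which of these two is better?”, and the model aggregates those holistic decisions into a defensible scale.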
The findings from both trials indicate significant improvements in time efficiency, reliability, and stakeholder satisfaction. Both trials demonstrated high reliability scores, indicating consistent and dependable judgements. In the AIF trial, the reliability score reached 0.73, which is considered acceptable given the breadth of applications. For the Aspiration Awards, the reliability scores were even higher, averaging above 0.80 across all categories. CJ provided direction to, and evidence for, the moderation processes for both activities, further enhancing reliability.
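The reliability figures quoted are consistent with the Scale Separation Reliability (SSR) statistic commonly reported for ACJ sessions, which compares the spread of the estimated scores with their measurement error, much like Cronbach’s alpha. A hedged sketch of how such a figure is typically derived, assuming a score and a standard error are available per item (RM Compare’s exact calculation is not published here):

```python
from statistics import pvariance

def scale_separation_reliability(scores, std_errors):
    """SSR = (observed variance - mean squared error) / observed variance.
    Values near 1 mean the rank order is well separated relative to noise."""
    observed_var = pvariance(scores)
    error_var = sum(se ** 2 for se in std_errors) / len(std_errors)
    true_var = max(observed_var - error_var, 0.0)
    return true_var / observed_var

# Toy usage with illustrative logit-scale scores and standard errors.
print(round(scale_separation_reliability(
    [-1.8, -0.6, 0.1, 0.9, 1.7], [0.45, 0.5, 0.4, 0.5, 0.45]), 2))
```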
Substantial time savings while increasing the depth of evaluation
Consistently high reliability scores across different evaluation contexts
More reviews per application, robust moderated data and outlier detection (see the sketch after this list)
Increased participation and positive feedback from stakeholders
Easily adaptable to different assessment scenarios
Comprehensive analytics for informed decision-making
Demonstrated effectiveness across varying numbers of entries and judges
Significant savings over existing processes
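On the outlier-detection point above: a common ACJ moderation technique is to check how often each judge agrees with the final rank order and flag judges whose agreement is unusually low. A minimal illustration of the idea, not RM Compare’s published method; all names are hypothetical:

```python
def judge_agreement(judgements_by_judge, scores):
    """For each judge, the share of their decisions that match the final
    rank order. Unusually low agreement flags a judge for moderation review."""
    agreement = {}
    for judge, pairs in judgements_by_judge.items():
        hits = sum(1 for winner, loser in pairs if scores[winner] > scores[loser])
        agreement[judge] = hits / len(pairs)
    return agreement

# Toy usage: judge J2 disagrees with the consensus more often than J1.
scores = {"A": 2.1, "B": 0.7, "C": 0.2}
by_judge = {"J1": [("A", "B"), ("B", "C")], "J2": [("C", "A"), ("B", "C")]}
print(judge_agreement(by_judge, scores))  # {'J1': 1.0, 'J2': 0.5}
```

In practice ACJ tools use more formal misfit statistics, but the principle, comparing each judge’s decisions against the consensus scale, is the same.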
The encouraging results from this study point to a number of opportunities for further exploration and exploitation.
The principles underpinning the RM Compare system are well understood by NCFE, which will endeavour to release even more value for its staff, customers and users.
The RM Compare team looks forward to working with NCFE and other awarding organisations in the exciting domain of Adaptive Comparative Judgement.
The implementation of RM Compare at NCFE has resulted in significant improvements across multiple dimensions of their assessment processes.
By leveraging ACJ technology, NCFE has set new standards in assessment efficiency and reliability, demonstrating the platform’s potential to revolutionise assessment practices across various sectors.
Work is ongoing to investigate how NCFE might use RM Compare technology to improve processes and add value to a number of scenarios including:
Efficient shortlisting of high-volume applications
Streamlining processes for other internal awards
Project prioritisation, AI frameworks, and new product development
Enhancing high-volume tender processes
Alternative to traditional criteria-based methods in formative and summative assessments