The role that high-quality assessment can play in apprenticeships
Catherine Large’s speech: Annual Apprenticeship Conference 2022
Hello everyone, thank you for inviting me. I’ve been working on, within, and around, the apprenticeships system for the past 15 years, and it’s good to see so many familiar faces in the audience.
As the qualifications regulator, it is our job to make sure that the apprenticeship end-point assessments that are in our purview are awarded consistently and fairly, with the apprentice’s interests protected as a priority. It is from this position that I bring you some perspectives today.
I’d like to focus on 3 key things:
- Firstly, the value that high-quality assessment can bring to apprenticeships. I’m keen to consider some of the changes that have come into effect in recent years, and where they might take the sector next.
- Secondly, and building on this, I’d like to reflect on the organisations delivering apprenticeship assessments. You will be aware that end-point assessment organisations (EPAOs) were asked in 2020 to apply to be regulated by Ofqual, as part of the Institute for Apprenticeships and Technical Education’s programme to streamline the quality assurance process. I’ll provide an update today on the progress we’ve made so far, and what the market might look like in the future.
- And finally, I’d like to talk about our approach to regulating delivery, to ensure that apprentices are assessed fairly, in a way that produces a result that employers can rely on. Where there is a choice, we will choose the route that best protects the apprentice – that, as we see it, is what our statute empowers us to do. This complements the work of the Institute, which brings the perspective of employers. Working together, in strategic partnership, we can uphold quality.
1. High-quality assessment
So, to start with, let’s consider the role that high-quality assessment can play in apprenticeships. The reforms enacted have made the assessment approach for all apprenticeships synoptic, so apprentices are assessed holistically at the end point, rather than continuously throughout. All apprentices are now independently assessed at the end of their training, rather than only assessed by their centres. New grading approaches have also been introduced, to enable employers to differentiate better between apprentices entering the labour market. Overall, the reforms have aimed to introduce an assessment that truly assesses what an apprentice is capable of doing at the conclusion of their training.
We have now reviewed a proportion of the end-point assessments currently available in the market, delivered by a range of organisations. We are therefore getting a good idea of how EPAOs develop assessments, and how assessment could be improved per apprenticeship standard. To give you a few examples:
The assessment plan, developed by employers and approved by the Institute, is an important blueprint. We would expect organisations to bring their own sectoral and technical assessment design expertise to bear in interpreting it.
Overall, we see better quality assessment where EPAOs have made sure that materials for different audiences – training providers, assessors, apprentices themselves – align and do not contradict each other, where a clear and simple method of grade aggregation is supplied, and where a clear and robust quality assurance process is in place.
We expect EPAOs to take into account the relevant employment context in their assessment design, as set out in the assessment plan, and, where a standard covers multiple job roles, to make sure that questions aren’t focused on a part of the industry that the apprentice may not have experience of.
We expect multiple-choice tests to provide only one possible correct answer where there is a single mark available in the mark scheme. EPAOs must provide enough information to allow assessors to apply the mark scheme accurately and consistently.
We want EPAOs to set clear parameters and expectations for how observations should be conducted, and to include some exemplification of the grading criteria to show what an assessor should be looking for in practice. We want to see examples provided of what an apprentice might do to demonstrate the different grade criteria.
For assessments that use observation, and/or a portfolio, showcase or journal, we’d like to see the expectations of apprentices set out clearly – for example, the number of pieces of evidence required, and the types and format of evidence to include – and a clear and effective process in place for authenticating this evidence.
These are all examples of good practice to inform assessment development – which I know many of you will be familiar with already – and they have a direct read-across to the regulations that awarding organisations need to adhere to. We have published these examples as findings from our technical evaluation process, and they are available to you if you search for Ofqual and apprenticeships on GOV.UK. The team are here delivering a workshop on this today too.
In the future, we’re looking forward to how the assessment reforms will give us more information about apprentices as cohorts of learners. As we collect more outcomes data, it will be interesting to see how grade profiles continue to mature in different sectors as the numbers of assessments increase, and how differentiation through grading is meeting the expectations of employers.
We are looking forward to researching more deeply how different assessment methods are used, in line with the broader work we are doing on vocational and technical qualifications, such as our recent ‘CASLO’ research that some of you will be aware of. We are keen, for example, to understand how best to assess ‘soft skills’ and behaviours – the ‘highly valued competencies’ that employers demand. These are inherently difficult to assess but incredibly important to get right for employers.
And we are looking forward to the benefits that taking a more consistent approach to assessment can bring to individual industries and sectors, where delivery was perhaps patchy before.
2. Organisations
Moving on then to the organisations who deliver apprenticeship assessment, many of whom are in the audience and on stage today.
The decision that the Institute made in 2020 to ensure that all apprenticeship assessments are awarded by organisations that are subject to statutory regulation was an important, underpinning development. We are delighted that so many end-point assessment organisations, of all shapes and sizes, have met the necessarily high bar we set for entering the regulated community of awarding organisations in recent months. We continue to carefully evaluate applications from others, and to give proactive support and feedback. The more organisations engage with us, the more we can help them understand the process. So, if this applies to you, please do make sure you’re keeping in close contact with your assigned Ofqual relationship manager.
By way of a specific update on progress, 111 new EPAOs have applied for Ofqual recognition to date, and we are expecting a further 20 to apply by 16 May. We have recognised 41 new EPAOs so far, including 19 who did this on their first attempt. We have never received and approved so many applications for recognition in Ofqual’s 12-year history. This has been done without lowering the bar on quality.
This is not simply growth from already well-established awarding organisations: these are organisations that are specialists in their respective fields, that have demonstrated they can meet our requirements, and we welcome them into the regulated community. They range from niche organisations like the Institute of Cast Metal Engineers, recognised for a single apprenticeship standard – the Metal Casting, Foundry and Patternmaking Technician – to those that are broader in scope, offering popular standards such as Business Administrator and Associate Project Manager.
Together with established awarding organisations already in regulation, this brings the total number of organisations offering regulated end-point assessments to over 100, with many applications still to be considered.
There are 19 EPAOs that submitted their application by the January deadline and are waiting for a decision on their latest application as we speak. They – you – will all have heard from us by 1 April.
The Education and Skills Funding Agency will be taking steps to remove end-point assessment organisations from its register if they are not recognised by Ofqual after two applications. These organisations will not yet have demonstrated that they meet our requirements by providing sufficient evidence to us, almost two years after the announcement was first made. If that sounds a bit tough, then it all comes back, for me, to the goal of protecting the quality of assessments for apprentices and their current and potential employers. There is absolutely nothing stopping organisations applying to us at any point in the future, as many times as they like.
I am alive to the risks in the market as it changes, and we remain in close dialogue with the ESFA and the Institute as they manage these developments. Our work together so far has brought more than 90% of apprentices in our first phase into the protection of regulation. We are united in our aims for quality and will continue to prioritise the apprentices themselves as we progress through the programme.
I’m confident that training providers and employers will continue to have a thriving market of end-point assessment organisations to work with in the future. And, of course, I highly recommend making an Ofqual-recognised awarding organisation your EPA provider of choice from this point on.
3. Regulatory approach
Finally today, I’d like to just touch briefly on the work that Ofqual is doing to regulate end-point assessments in delivery. Apprenticeship assessment is a particular form of assessment. Like many vocational qualifications, it is not done en masse in an exam hall but in the field, in workplaces, on site, and at multiple different points in the year. The delivery approach has changed following the reforms, with assessment now required to be delivered independently, outside the centre’s purview, with appropriate quality assurance in place.
Our regulatory approach therefore also has to be different and tailored to this unique set of circumstances. Over the past year we have established our new field team, whose job it is to engage EPAOs directly and take time to understand their approaches to assessment. An important and growing part of this team’s role is to travel to those places where assessments are carried out and observe apprentices being assessed. From manufacturing environments to hair salons, the team has been welcomed by both EPAOs and – most crucially – employers, who understand how important it is for Ofqual to see assessments first hand, so we can properly appreciate the hard work both they and the apprentice put into a meaningful apprenticeship programme. Through this work our aim is to better understand the quality of the assessments being offered to apprentices and their experience of them, and to support EPAOs with continuous improvement.
We are also establishing specific sector fora for EPAOs to collectively share and address issues. We have seen a real willingness to engage, and to work together to resolve problems, for example the EPAOs offering assessments in financial services, who recently agreed to work jointly on common grade descriptors. We have seen a welcome readiness from EPAOs to adjust their approach to accommodate common practice. This is critical to establishing consistency across EPAOs and strengthening comparability for apprentices and employers. We also have a close feedback loop with the Institute, to make sure that any adjustments needed to apprenticeship standards and assessment plans are fed back and acted on.
We call this collaborative working with awarding organisations co-regulation. Yes, we are the regulator, and yes, we have rules that they/you have to comply with. But there is no reason not to work co-operatively, if you have quality as your shared goal, and the interests of the apprentices themselves as your priority.
Conclusion
To conclude, therefore, I would just say one thing. We are clearly motivated by improving quality in the apprenticeship assessment system and, if you are too, please do talk to us. There are many ways you can get involved in our work, whether as a sector expert helping with our evaluations and observations, as an assessment expert informing our research, as a training provider on our centre reference group, or as an employer or apprentice providing us with feedback about how the system is working for you. We want you to help us continue to ensure that the assessments taken by apprentices and used by employers are as valid and reliable as they can possibly be.