Confidence in Qualifications: Understanding the regulation of End-Point Assessments
Sally Collier spoke at The Apprenticeships Conference 2019 about external quality assurance for end-point assessments:
I’m delighted to be here today and to have the opportunity to explain some more about our regulation of end-point assessments. I want to spend some time this morning considering:
- our approach to external quality assurance (or EQA, as I shall call it henceforth), and how we work closely with the Institute
- and what I believe are the wider benefits of regulation to the apprenticeships programme as a whole
It is very important to me that everybody involved in this complex system understands what we do and how we work very effectively with employers and professional bodies. I want to start by setting out what we’re trying to achieve through regulation, how we go about delivering our intentions, and how we can help to resolve things when they don’t go to plan.
So, what is regulation in this context?
Well, it’s not about enforcing rules for the sake of enforcing rules. Regulation is about supporting the right behaviours. And for us, those behaviours mean high standards in qualifications. As a regulator, it is important that we operate objectively and consistently. We must set out our expectations clearly, so that those who play a part in developing, delivering and awarding qualifications understand very clearly what is expected of them.
It is also important to understand why we have those expectations. We regulate for quality not quantity. Good qualifications are trusted and their value is understood. They are a tool for signalling ability and are a recognised currency in the labour market.
A key objective of ours is that the public have confidence in qualifications. Our expectation is that those who rely on qualifications can trust that individual learners have the knowledge, skills and behaviours they need.
As a regulator, we are supported in our objectives through the legal tools and powers with which we are endowed. Of course, we hope we don’t have to use them, but when we do, we use them proportionately, and in combinations that will have the best effect.
Yes, that does mean that on occasions we may issue fines and direct End-Point Assessment Organisations – or EPAOs as I shall call them henceforth – to do certain things, but it is always with the intention of securing standards and promoting public confidence, which go hand in hand. In short, regulation enables us to set and maintain high standards, and to protect the interests of learners and those who rely on qualifications.
So, how does our regulation translate specifically into EQA?
As you may know, Ofqual is currently one of 4 options that employers can choose to provide EQA for apprenticeship end-point assessments.
Like other EQA providers, we work extremely closely with the Institute, supporting its overarching responsibility for apprenticeship quality. But because we are the only statutory regulator for non-degree apprenticeships, we are also different to other EQA providers in a number of ways.
1. Treating end-point assessments as qualifications
First, we treat end-point assessments as qualifications. This means that we approach them with the same rigour and focus as other qualifications that we regulate, and are able to draw on all of our regulatory tools. For example, it is absolutely essential that qualification users have confidence that the organisations delivering end-point assessments have the right capacity and competence.
We achieve this by requiring EPAOs that offer end point assessments regulated by us to be recognised by us. That means they have to reach the high quality bar that we set for all regulated awarding organisations, and be compliant with our rules. We believe that all EQA providers should have similarly high expectations in this area. It benefits no one to allow organisations to deliver end-point assessments without being confident in their knowledge and understanding of assessments as well as their professional subject-specific expertise. If we don’t all have similar expectations there could be a race to the bottom and quality could be undermined. We cannot allow that to happen.
2. The need to be open and flexible
Second, one of the important ways that we have responded to the reforms is to recognise that we need to be open and flexible about the type of assessment organisation that we recognise, and to be responsive to employer needs. We know that some EPAOs are different to those that we already regulate. And we appreciate that our regulatory approach for a niche EPAO needs to be different from that for an organisation delivering a vast breadth of qualifications. That’s why we consider each application carefully, on its own merits. Indeed, we meet with every potential applicant before they begin applying for recognition to discuss their specific circumstances.
We recognise that applying for recognition can seem a daunting process. But those who have applied to us tell us that what they do as part of their application stands them in good stead for their ongoing operations. Circumstances can be very different, so we want to understand the specific needs of particular industries, and the context and scale in which they are operating. We know that for some industries, only a certain professional body or type of organisation will have the capacity and capability to meet the needs of that particular sector.
For example, we are currently working with a professional body that is small but long established, and certainly an expert organisation. They are operating in a niche sector and want to offer end-point assessments against a single Apprenticeship Standard to around 40 apprentices a year. There is no reason why they cannot become recognised. If they can demonstrate to us that they have sufficient resources and processes in place to deliver valid assessments, and are capable of ongoing compliance, then we will find a way to recognise them and support them to develop a full understanding of our rules.
I know that about a third of the EPAOs we recognise now are niche, sector-specific, or professional bodies themselves, and this number is growing day by day. If you are this kind of organisation and have considered recognition in the past but decided it is not for you, do look again; we would be delighted to hear from you.
So, how do we ensure quality in the end-point assessments we regulate?
Our regulatory approach is tailored to each type of qualification. We have carefully created a new approach to ensure effective quality assurance of end-point assessments that secures standards and their safe delivery, and which dovetails with the roles of other agencies and, importantly, meets the needs of employers. We have set end-point assessment-specific rules and published our EQA approach so that it is clear and transparent to all who choose to use it.
Before we agree to provide EQA, we apply our assessment expertise to a technical review of the Assessment Plan – confirming that it is capable of supporting quality end-point assessments. When we do that, we work closely with the Institute to feed back our advice to the group of employers, to ensure that the assessment plan meets everyone’s needs and supports valid assessment – that is, the delivery of end-point assessments that test the right things, at the right level, wherever and whenever apprentices complete their assessment.

Whether it’s employers in high speed rail infrastructure or accountants who have designed the standard, we recognise that they know their profession and are committed to its future – they are the ones who know what good looks like. But they can often use a little help in knowing how to judge assessment success – knowing how best to assess an individual – and that’s where our expertise comes in.
We have now reviewed 118 Assessment Plans. Last year we published findings from 100 of them in order to share best practice. As a result, we gave advice about the detail of assessment procedures – do they appropriately match the knowledge, skills and behaviours that the apprentice needs to be able to evidence in order to do the job? For example, it wouldn’t be right to assess the practical skills a butcher requires through a multiple choice test. We also gave advice on grading and grading criteria, which is notoriously difficult to get right because the grade needs to send the right signal of competence and differentiate appropriately between learners.
The lessons from our Assessment Plan reviews are, from our perspective, a key first stage in designing quality into end-point assessments upfront. Without a decent Assessment Plan, the EPAO has less chance of designing assessment materials that will work.
A second key stage in how we ensure quality is what we call the Technical Evaluation. This is where we work with industry professionals and our own assessment specialists to review the end-point assessment materials produced by EPAOs.
In essence, we are checking that the EPAO has interpreted the Assessment Plan correctly and designed assessment materials that are sufficiently valid. To assist with this, we bring in subject experts to ensure that we get an industry perspective. For example, our team are currently working with experts in land-based and heavy vehicle engineering, teaching assistants, and learning and development practitioners, to evaluate end-point assessment materials that have recently come on to the market.
Collaborative work like this strengthens assessment. It gives employers and apprentices confidence that end-point assessments are fit for purpose, and it gives EPAOs – particularly those developing end-point assessments for the first time – important feedback and guidance so that they can be sure their end-point assessments are meeting the needs of their customers.
These reviews have led to changes that range from points of detail through to major differences in interpretation between EPAOs. For example, in the customer service standard we found a wide range of different assessment approaches – from multi-choice tests to portfolio submissions – that, had they gone ahead, would have led to a standard of competence meaning something different in each case.
We consider it our job to ensure that there is a consistent approach being taken across the market, and that apprentices who take their end-point assessment against one standard with one EPAO are not subject to a different testing regime with any other. It is extremely important to us that the apprentice who passes in Coventry has reached the same standard as the apprentice in Carlisle. And similarly, that the apprentice who passes in 2019 has reached the same standard as the apprentice who passes in 2021.
The maintenance of standards is our core business as a regulator, and we consider it just as important in apprenticeships as in other qualifications; upholding these standards is what will give employers confidence in the apprenticeships system.
We bake quality into regulated qualifications from the start, but it is equally important that we then monitor them over their lifecycle to ensure that quality is maintained. Just as with other sectors, developments can occur that mean intervention can become necessary. At these times, it is also important that others within the sector hear and take heed of any lessons learned.
For example, last year we became increasingly alert to risks regarding assessor capacity and capability across end-point assessments. So over the last 6 months we have undertaken audits with 15 EPAOs across 5 apprenticeship standards. Our focus for these audits has been whether EPAOs have sufficient qualified people to deliver end-point assessments, and whether those people are available when they are required. We found that EPAOs are, in general, making great efforts to ensure they have a sufficient supply of capable assessors to meet demand. But that’s not to say that they are finding it easy. In particular, we know that they are finding it challenging to access the data they need to accurately forecast end-point assessment demand, and therefore to be able to arrange and resource assessment appropriately.
It is possibly no surprise that we found stronger performing EPAOs were those that:
- First, have set up robust systems for forecasting demand, taking into account indicative Gateway dates, how long each end-point assessment takes to conduct, and the locations of apprentices and assessors;
- Second, are keeping their forecast under regular review – making changes where necessary, for example to take account of things like the on-programme apprenticeship taking longer than expected and the rate of EPA resits; and
- Third, have systems for early identification of instances where a shortfall of assessors is likely, enabling them to take action.
What’s evident to us is that these EPAOs are taking steps to know where and when they will need assessors, and are making quite substantial efforts to do so. We’ve worked with specific EPAOs who have encountered problems to ensure any necessary improvements are being made. We will continue to keep this area under review as volumes of end-point assessments increase. And we will continue to undertake proactive, thematic reviews wherever appropriate and share that intelligence.
Audits are just one of our regulatory tools. We can and will take other action, including issuing directions and penalties, where that is a necessary and appropriate course of action. The new apprenticeship system is already sufficiently mature for us to have utilised the full breadth of our regulatory powers to protect the interests of apprentices and maintain the confidence of employers in apprenticeships.
For example, we are currently taking action where apprentices undertaking an end-point assessment were unable to upload the work they had done in a timed assessment task to an online platform. This led to lost work and data, and could have resulted in an invalid assessment. We directed the Awarding Organisation concerned to stop awarding in order to protect the apprentices, and directed them to fix the problem to ensure that the apprentices were not unfairly disadvantaged. This should not have happened and should not happen again. That’s why we are taking regulatory action, and we will be publishing an announcement on this specific case today. The situation for those learners has been resolved – their interests have been protected. We need to ensure that lessons here are learned by others too, so that such a situation does not arise again.
So, as you have heard, we have a well-developed, tried and tested approach to EQA that provides employers, training providers and apprentices with confidence that the end-point assessment is all that it should be: that there is quality, reliability, consistency and comparability.
We also believe our approach is having positive benefits for the wider system. For example, our recognition process is an established bar for the competence and capacity of effective EPAOs. It is a standard to which many EPAOs now aspire. We also know that we regulate many EPAOs for which we are not the nominated EQA provider. Organisations we regulate deliver nearly 500 end-point assessments, for 200 (approved) standards – which is over 50% of the market. The good practice of these EPAOs’ regulated operations will naturally strengthen the end-point assessment market. And, as previously indicated, we are always sharing feedback from our experiences to benefit the wider system.
In summary, I think there are 4 areas where our EQA work is benefitting the wider system:
- First, we believe that our approach has added value to employer involvement, and not replaced it
- Second, we believe the feedback we have provided on assessment plans has benefitted more than just the EPAOs we regulate, helping to deliver assessments that are conducted in the most appropriate ways
- Third, we know our approach is helping to ensure that there is consistency between EPAOs and has set a bar for their competence, so standards can be more easily maintained
- And fourth, where we have identified challenges we have taken action and shared these concerns, so the wider system can be on the lookout for similar issues.
I hope that gives you a useful overview of our work in this important area.