Employer-led end-point assessment external quality assurance
Following the publication of my introductory article on end-point assessment external quality assurance (EQA), here is the next article in the series.
This article is from the perspective of an employer-led EQA provider, the National Skills Academy for Rail (NSAR), following an interview with Barry Smith, Head of Assessment.
An introduction
NSAR provides EQA for 14 digital apprenticeship standards, 4 freight and logistics standards, and 5 rail standards, as confirmed in its recognition letter of November 2018 from the Institute for Apprenticeships and Technical Education (IfATE).
There are currently 30 End-point Assessment Organisations (EpAOs) approved on the register to deliver End-point Assessment (EPA) for standards that have NSAR as their EQA provider, all at varying stages of readiness and delivery.
How did NSAR get involved in EQA for Standards?
Given the massive investment employers have made in creating apprenticeship standards and assessment plans, and their desire to maximise the quality and impact of apprenticeships in their industry, many wanted to retain their involvement and support the work being done to assure the quality of assessment, seeing through what they had helped to start.
NSAR has always had a strong track record in providing EQA for non-publicly funded training and competence assessment, so when the rail sector was developing its apprenticeship standards, it wanted NSAR to support it in developing an employer-led EQA approach.
You are the Skills Academy for Rail, so how can you be an EQA provider for digital?
When the Tech Partnership, the original intended EQA provider for digital apprenticeships, knew it had to wind down, it put its EQA responsibility out to competitive tender.
We knew this employer-led model had to continue and thrive if it was to remain an option for others (and if rail was to get the chance to have its own version).
So, we tendered for the role and made the digital employers an offer. We committed to take their existing digital employer board under the NSAR group banner and to work with them to get their vision off the drawing board.
Digital isn’t entirely new territory for us. All rail employers are digital employers and, indeed, digital apprenticeship users. The Department for Transport has also set the rail sector a huge ‘digital rail’ modernisation challenge, so we had a vested interest in the success of these apprenticeships and in supporting other digital employers to achieve their vision.
Now, 9 months on, the Digital Apprenticeship Quality Board has met five times (working with NSAR) and is directing the work of the NSAR:DIGITAL EQA service.
As we signalled our intent to be a source of support for employer-led EQA, we were approached by the British International Freight Forwarding Association and the Chartered Institute for Logistics and Transport to see if we would be interested in taking on their EQA responsibilities and introducing the employer-led approach to four freight and logistics standards.
We discussed this with them and agreed to support them, too. We are now setting up their employer-led apprenticeship quality board under the NSAR:LOGISTICS EQA service. So, we now very much see ourselves as supporting employer-led EQA, helping employers make it happen where and when we can.
How have you designed your framework for EQA?
We are very clear that EPA and apprenticeship assessment happen in a marketplace and that employers need to have confidence in that marketplace (as do training providers and, most importantly, apprentices).
Who better than employers to say to other employers, as customers in the EPA marketplace, ‘Don’t worry about which EpAO you choose from an assessment outcome perspective. Whichever one you choose, we are overseeing the work to ensure the assessment outcome for the apprentice is the same’.
We operate a ‘3C’ framework for EQA, focused on the areas of highest risk to the validity of end-point assessment:
- Compliance – Standards compliance and readiness check against the Assessment Plan requirements
- Consistency – Quality and consistency in assessment in managing and delivering end-point assessments
- Comparability – Comparability review of assessment outcomes across assessors and across End-point Assessment Organisations
Our quality assurance judgements are made based on a set of quality statements produced by our Apprenticeship Quality Boards, which are published on our website.
These statements set out the minimum expectations for EpAOs and we use these Quality Statements as the basis of our EQA reports.
When do you begin EQA?
We start all our EQA work with a Market Insight activity. We work with EpAOs to understand the standard, the assessment plan and the apprentice standard marketplace through the lens of EPA delivery.
We do a bit of triangulation work with AELP forums and other provider groups and then develop a clear sense of where the assessment risks are.
Each assessment method, each combination of methods and each marketplace, with its own distinct features, gives us a unique standards-based profile of risks to validity in assessment, and we start from there.
We start with the greatest risks and move through our cycle from that point. So, for digital infrastructure technician or warehouse operative it might be comparability (multiple EpAOs); for cyber security or train driver it might be consistency (a single active EpAO).
What is the outcome of your EQA?
There will be summative reports to IfATE on performance at individual EpAO level and at individual standard level. We also report in a much more formative style to EpAOs on the ongoing internal quality assurance (IQA) activity we see (e.g. standardisation, moderation, etc.).
Here we try to add value to their own continuous improvement processes, working hard to be critical friends and trusted partners offering the right amount of professional challenge.
We are very conscious of our own role in looking to add value to EpAOs in their own continuous improvement journeys.
The IfATE is due to release an updated EQA framework following the March 2019 DfE Strategic Guidance and the House of Commons Committee of Public Accounts report of May 2019. Where does that leave you with an already developed framework and service?
We have been working, alongside the other EQA providers, with the IfATE in the development of the new framework.
Good EQA clearly must be focused on risks to validity in assessment, proper functioning of the EPA market place and the need for assessment plan compliance, assessor consistency and comparability of assessment outcomes.
EpAOs are commercial businesses, and their assessment materials are commercially sensitive and subject to copyright. How do you deliver effective EQA within these constraints?
We respect commercial sensitivities and intellectual property. We also respect that EpAOs operate in a competitive environment. Data is used confidentially and sensitively to monitor trends, and we will only request information required to conduct our core EQA responsibilities.
The EQA reports we produce for each EpAO are reported to our Apprenticeship Quality Boards, and it is the Board that makes recommendations to the IfATE. We see our role as creating, maintaining and curating an assessment community of shared understanding of how to interpret and implement assessment plans.
This means coming to an agreement across the EPA community when there is any uncertainty or confusion. Having helped establish this community of shared understanding, we then work to de-risk the EPA marketplace by ensuring new entrants are brought up to speed and share and comply with this understanding.
What is the EQA cycle?
We have a yearly cycle, but the frequency of different activities is driven by risk.
How much do you charge?
First of all, we have to recover our costs and we look at charging as a cost-recovery exercise. There is no free EQA. There are only different ways that existing EQAs recover their cost.
All our thinking on charging and cost recovery is shared (open book) with EpAOs, and the employer board has ownership with full visibility of the EQA service P&L accounts. We have a transparent policy on charging, and all our charges are available on our website.
Our charges are set per standard, so there is no cross-subsidising through a common charge applied regardless of standard.
The employers were adamant that this was what was needed if EQA charges were to be fair. We are aware that others have taken a different view.
For us, we felt that EQA charges had to be apprenticeship standard specific because the risk profile of each assessment plan is different, the EpAOs in each marketplace are not the same, and the volumes of apprentices are different.
So, charges derived from the conditions around each standard avoid large-volume apprenticeships subsidising lower volumes, and EpAOs in large-volume marketplaces subsidising other EpAOs (possibly even competitors) in other EPA marketplaces. At the heart of what we are trying to do is the need to find proportionate EQA, funded in a responsive and fair way.
The charges are determined by acknowledging the combination of variables that drive cost. These include the IfATE EQA requirements (what we have to do), the nature of the assessment marketplace, e.g. how many EpAOs are in the marketplace (who and how many times we have to do it), and the number of apprentices likely to take EPA.
This allows us to forecast and we recover the cost through actual EPA numbers, i.e. those that take the assessments. Some of these variables change annually, so we actively manage and adjust annually. Our goal is to get as close to breakeven each year as is feasible.
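To make the cost-recovery arithmetic described above concrete, here is a minimal sketch of how a per-assessment charge for a single standard might be forecast and then reconciled against actual EPA volumes at year end. The figures, variable names and the simple breakeven reconciliation are illustrative assumptions only, not NSAR's actual charging model.

```python
# Illustrative sketch only: a simplified per-standard cost-recovery model.
# The figures and names are hypothetical, not NSAR's published charges.

def forecast_charge(annual_eqa_cost: float, forecast_epa_volume: int) -> float:
    """Set the per-assessment charge so that forecast income covers forecast cost."""
    return annual_eqa_cost / forecast_epa_volume

def year_end_balance(charge: float, actual_epa_volume: int, annual_eqa_cost: float) -> float:
    """Positive = surplus, negative = shortfall; either feeds into next year's adjustment."""
    return charge * actual_epa_volume - annual_eqa_cost

# Hypothetical standard: EQA activity costed at £40,000 for the year,
# with 800 apprentices forecast to take EPA.
charge = forecast_charge(annual_eqa_cost=40_000, forecast_epa_volume=800)           # £50.00 per assessment
balance = year_end_balance(charge, actual_epa_volume=720, annual_eqa_cost=40_000)   # -£4,000 shortfall
print(f"Charge per EPA: £{charge:.2f}, year-end balance: £{balance:.2f}")
```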
What are you doing to help reduce the bureaucracy for EpAOs who may be EQA’d by multiple EQA providers, or who deliver multiple standards EQA’d by NSAR?
This is an interesting question and your starting assumption seems to chime well with the prevailing narrative when EQA providers and EpAOs relationships are discussed.
The IfATE is about to launch a new common EQA framework, which we, alongside other EQA providers, have helped to develop and which we will all, as EQA providers, implement and work to. This will help address the concerns around bureaucracy for EpAOs working with multiple EQA providers.
But let’s look behind the ‘headlines’ a bit. I wonder just how many EpAOs actually have ‘multiple’ EQA providers. There are some. Very many have just one, some have two or three, but across the 200-odd EpAOs I do wonder just how many have ‘multiple’.
(Note from the author: if you are interested in how many EQA providers EpAOs work with, please see the end of this article for some follow-up analysis carried out after the interview.)
Unnecessary duplication around organisation-based EQA concerns needs to be, and will be, sorted through the new EQA framework that is about to be launched, but at an apprenticeship standards level I think the picture changes.
Each standard and assessment plan is pretty unique; the occupational competence as a result of doing an apprenticeship is certainly unique. So, to guarantee properly functioning EPA marketplaces, EQA has to happen at an apprenticeship standard level.
What we need to do is get to a place where a kind of 80:20 model comes into play, where organisational EQA concerns are checked once (the 80) and the standards specific EQA concerns are checked each time (the 20). The new framework and readiness check arrangements will help here.
So, the trade-off for being an EpAO large enough to offer EPA for large numbers of standards, accessing large numbers of apprentice marketplaces and multiple routes to publicly funded assessment, is having a range of EQA providers to work with.
Soon we will all work to the same EQA framework anyway. There are returns in operating across lots of standards, but remember, very many EpAOs only work with one or a very few EQA providers. So, let’s look at the evidence and understand what the real picture of coverage is.
Where we see the same EpAOs across digital, logistics and freight, we look to work out where we can find efficiencies, make secure inferences and draw safe conclusions, using this to our and their advantage.
The House of Commons Education Committee last October recommended that Ofqual should be given responsibility for the EQA of all end-point assessments (paragraph 19). Will that mean an end to EQA providers other than Ofqual over time?
This is an interesting one. I was surprised when the committee arrived at this conclusion without much EQA having been done and without speaking to those EQA providers that were operational.
I know that lots of others had lots to say about EQA, but I might have been inclined to want to hear from the EQA providers themselves.
As I see it:
- if we believe employer leadership in external quality assurance of the apprenticeship marketplaces they operate in is not important, and that employers bring nothing of real value to this space; and
- if we think we already know what EQA will need to be like and how it needs to perform going forward (before we have reached a steady state, before the system is tested and before any evidence is in), and we consider Ofqual to be the fit-for-purpose solution; and
- if we think we should close down the choice of EQA options for employers writing standards, including losing the way some of these choices invite other types of organisation, such as training providers and professional institutions, to bring greater choice and innovation into the EPA space (without having to be recognised and regulated); and
- if we think there is the political appetite for what may need to be a larger, taxpayer-funded, close-to-government organisation to take on sole responsibility for what will end up being a large number of apprenticeship standards (along with everything else it does),
then the answer would probably be ‘yes’, but that’s a lot of ‘ifs’.
I think the future lies in something potentially more responsive built on the best of what we are seeing emerge and being applied to what we understand to be the real EQA needs of the EPA system.
This means looking at and acknowledging the key things that the Ofqual, employer and professional organisation models bring to EQA. This could be underpinned by regulatory clarity and statutory reach that make sure the overall approach is robust and accountable.
One clear one for us is the ability to immediately access assessment-, assurance- and occupational-competence-savvy employers (our Boards) for clarifications, rulings and elaborations on standards and assessment plan issues.
Thank you!
This detailed, interview-based article on the operation and perspective of EQA at NSAR has been produced thanks to the time and contribution of Barry Smith, Head of Assessment at NSAR.
Follow-up
Following my interview with Barry Smith, I was intrigued to look behind the headlines and find out just how many EQA providers each EpAO works with.
Based on the June IfATE and ESFA spreadsheets from their websites, of the 236 EpAOs, it appears that 72% work with just one EQA provider, 15% with two EQA providers, 7% with three EQA providers, and 6% with four or more EQA providers.
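For readers who want to see how this kind of count can be reproduced, below is a minimal sketch of the analysis. It assumes the IfATE and ESFA registers have been combined into a CSV with one row per EpAO and standard pairing and columns named 'EpAO' and 'EQA provider'; the file name and column names are assumptions for illustration, not the published spreadsheet layout.

```python
# Illustrative sketch: count how many distinct EQA providers each EpAO works with.
# The CSV layout and column names below are assumptions, not the actual
# IfATE/ESFA spreadsheet structure.
import pandas as pd

register = pd.read_csv("epao_register.csv")  # hypothetical combined export of the registers

# Number of distinct EQA providers per EpAO.
providers_per_epao = register.groupby("EpAO")["EQA provider"].nunique()

# Distribution as percentages, banding four or more together.
banded = providers_per_epao.clip(upper=4)
distribution = (banded.value_counts(normalize=True).sort_index() * 100).round(1)
print(distribution.rename(index={4: "4 or more"}))
```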
Jacqui Molkenthin, JEML Consulting
I hope you have found this background article useful. Over the next few weeks I will be writing articles and adding videos and podcasts on EQA following interviews with a range of EQA providers. If you are an EQA provider who is not already booked in for an interview and you are interested in being interviewed, please do get in touch.