Share learner data (previously known as Data Collection) beta assessment
Collecting learner, training and qualification level information
From: Government Digital Service
Assessment date: 23/04/19
Stage: Beta
Result: Met
Service provider: Education and Skills Funding Agency (ESFA)
Service description
This service collects learner and training/qualification level information from the Further Education (FE) sector. Data is submitted to ESFA to enable payments for some providers and build funding allocations for others.
This data covers learners who are at least 16 years old, not in schools or undertaking higher education, and comes from a range of training providers including FE colleges, private training providers and local authorities. It helps distribute around £8 billion of government funding.
The service validates data on submission, then allows the provider to cleanse the data before funding calculations are made. Once the calculations are complete, a number of reports give detailed feedback on submissions.
Service users
There are over 2,500 organisations in England that use the service, with about 6,000 users in total. Core users are responsible for managing and consuming data for financial forecasting and business intelligence within the following providers:
- private training providers
- local authorities
- colleges
- universities
- employer providers
- National Careers Service
1. Understand user needs
Decision
The team met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- over the course of the discovery, alpha, and private beta the team have developed a good understanding of the needs of different types of learner data providers and the difficulties they face with the current system
- the team have carried out a significant amount of user research, and a total of 94 users took part in the private beta
- the team have embraced the shift to a user needs-driven approach and are doing work within their organisation to enable other teams to become user needs-driven, including training other user researchers.
What the team needs to explore
Before their next assessment, the team needs to:
- continue investigating more impactful improvements to the service that may meet the users’ needs better, such as an API and displaying key data within the service rather than users having to download reports
- continue doing more research on the accessibility of the service and research with users who need assisted digital support
- continue sharing insights and collaborating with other teams within DfE to address the end-to-end user needs that go beyond the submission of learner data.
2. Do ongoing user research
Decision
The team met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team have carried out regular user research and have iterated the design of the service based on the findings of this research
- the team have had to overcome challenges while planning and carrying out research due to users’ previous negative experiences, and they realise they need to build a better relationship with their users by demonstrating that they are now actively trying to meet their needs
- the team have defined some KPIs that relate directly to findings from user research, including around the speed with which payments are made – as they understand that this is particularly important for smaller providers.
What the team needs to explore
Before their next assessment, the team needs to:
- explore appropriate ways to study how people submit their data using the new service, such as shadowing different types of providers
- do more research with less experienced providers and evaluate their journey from on-boarding to submitting learner data
- work closely with performance analysts and the help desk to analyse the real usage of the service. Use these insights to help focus further user research on meeting user needs and addressing usability problems, thereby decreasing the current emphasis on users’ self-reported needs and experiences.
3. Have a multidisciplinary team
Decision
The team met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team are engaging with graduate and apprenticeship schemes, seeking to bring new talent into the team and build in-house capability
- they have trained existing in-house staff with a strong interest in digital development, building on an underlying understanding of their organisational purpose and users’ needs
- contract staff on the team have trained and developed in-house people, ensuring effective knowledge transfer and reducing the risk of losing skills and knowledge as the team make-up changes.
What the team needs to explore
Before their next assessment, the team needs to:
- work with senior leaders in ESFA to make use of the available freedom to apply a recruitment and retention allowance, bringing in additional high-quality specialist staff in a competitive market.
4. Use agile methods
Decision
The team met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team is employing agile practices to ensure iterative, incremental improvements to their service based on real world user experience
- they are working effectively with their wider organisation to influence policy choices, reduce waste and unnecessary information requests
- they have worked to ensure effective collaboration with others in ESFA, including working with stakeholders and key representatives from across the organisation to ensure alignment through a board chaired by the Service Owner.
5. Iterate and improve frequently
Decision
The team met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team has learned and improved its service throughout this development phase, making improvements based on user research
- they are in control of their development pipeline, and are able to make regular changes and get rapid feedback from real users as a consequence
- they are working effectively and improving internal practices through open and constructive retrospective meetings, critically reviewing their own processes to identify areas to improve.
What the team needs to explore
Before their next assessment, the team needs to:
- continue to consolidate these efforts, and explore opportunities to collaborate or share development with other components of the wider ‘education’ journeys.
6. Evaluate tools and systems
Decision
The team met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- a cloud-first approach was taken to the system architecture
- the team replaced the existing IaaS (infrastructure as a service) hosting entirely with Azure PaaS (platform as a service), reducing maintenance costs
- Redis Cache was replaced with direct cloud-based storage, with no loss of performance
- a Microsoft technology stack including .NET Core was used to overcome the constraints of the existing system, which was hard to debug because its logic was embedded in SQL statements
- the use of Azure DevOps and Visual Studio Online made the build and deployment process fully automated.
What the team needs to explore
Before their next assessment, the team needs to:
- explore the use of APIs as an alternative to direct file uploads (a hypothetical sketch follows).
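As a purely illustrative sketch of the kind of endpoint the team could explore on their stated ASP.NET Core stack (the controller name, route and response shape below are hypothetical and not part of the service):

```csharp
using System;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

// Hypothetical sketch only: an API route that a provider's own management
// information system could call, instead of a person uploading a file by
// hand through the browser. Names and shapes are illustrative, not taken
// from the service.
[ApiController]
[Route("api/submissions")]
public class LearnerSubmissionController : ControllerBase
{
    [HttpPost]
    public IActionResult Submit(IFormFile ilrFile)
    {
        if (ilrFile == null || ilrFile.Length == 0)
            return BadRequest("No ILR file supplied.");

        // The same validation and cleansing steps that run for browser
        // uploads would run here before funding calculations are made.
        var submissionId = Guid.NewGuid();
        return Accepted(new { submissionId });
    }
}
```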
7. Understand security and privacy issues
Decision
The team met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- because personally identifiable data is stored by the service, its architecture has been made GDPR-compliant to ensure data security
- the team includes an independent Security Architect, who is a permanent member of staff, and a Security Information Risk Advisor who looks at residual risk, for example the risk of moving to the cloud
- the service has the necessary security accreditation, supported by the penetration testing that has been carried out.
What the team needs to explore
Before their next assessment, the team needs to:
- produce a security threat model for the public beta phase to anticipate how the service will behave at the maximum volume of concurrent users.
8. Make all new source code open
Decision
The team met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team open sourced code wherever possible, and the codebase uses Autofac for dependency injection (see the sketch after this list)
- they make use of .NET Core, so the code can run on any platform, and follow the MVC (Model-View-Controller) pattern
- they have documented everything on GitHub.
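For context, a minimal sketch of how Autofac is typically wired into an ASP.NET Core MVC application is shown below; the `IFundingCalculator` service is a hypothetical name used for illustration, not something taken from the team's codebase.

```csharp
using Autofac;
using Autofac.Extensions.DependencyInjection;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

// Hypothetical example service: names are illustrative only.
public interface IFundingCalculator
{
    decimal EstimateFunding(int learnerCount);
}

public class FundingCalculator : IFundingCalculator
{
    // Placeholder logic purely for the example.
    public decimal EstimateFunding(int learnerCount) => learnerCount * 1000m;
}

public class Startup
{
    // Standard MVC registration still goes through IServiceCollection.
    public void ConfigureServices(IServiceCollection services) =>
        services.AddControllersWithViews();

    // Autofac-specific registrations: called automatically because the
    // host uses AutofacServiceProviderFactory below.
    public void ConfigureContainer(ContainerBuilder builder) =>
        builder.RegisterType<FundingCalculator>()
               .As<IFundingCalculator>()
               .InstancePerLifetimeScope();

    public void Configure(IApplicationBuilder app)
    {
        app.UseRouting();
        app.UseEndpoints(endpoints => endpoints.MapDefaultControllerRoute());
    }
}

public class Program
{
    public static void Main(string[] args) =>
        Host.CreateDefaultBuilder(args)
            // Replace the built-in container with Autofac.
            .UseServiceProviderFactory(new AutofacServiceProviderFactory())
            .ConfigureWebHostDefaults(web => web.UseStartup<Startup>())
            .Build()
            .Run();
}
```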
What the team needs to explore
Before their next assessment, the team needs to:
- identify opportunities for other services within DfE to share code produced for the Share Learner Data service.
9. Use open standards and common platforms
Decision
The team met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the service uses open file formats: the government publishes these using the ILR (Individualised Learner Record) specification (a simplified parsing sketch follows this list)
- the open data used in the service is anonymised.
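ILR submissions are XML files that conform to the published specification. As an illustration only, assuming a drastically simplified structure rather than the real schema, reading learner records could look like this:

```csharp
using System;
using System.Xml.Linq;

// Illustrative only: parses a drastically simplified stand-in for an
// ILR file. The real specification defines a much richer XML schema,
// published by the government for each ILR year.
class IlrReader
{
    static void Main()
    {
        var doc = XDocument.Parse(
            @"<Message>
                <Learner>
                  <LearnRefNumber>L0001</LearnRefNumber>
                  <DateOfBirth>2002-09-01</DateOfBirth>
                </Learner>
              </Message>");

        foreach (var learner in doc.Descendants("Learner"))
        {
            var reference = (string)learner.Element("LearnRefNumber");
            var dob = (DateTime)learner.Element("DateOfBirth");
            Console.WriteLine($"{reference}: born {dob:yyyy-MM-dd}");
        }
    }
}
```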
What the team needs to explore
Before their next assessment, the team needs to:
- identify opportunities for other services within DfE to share architecture platform hosting Share Learner data.
10. Test the end-to-end service
Decision
The team met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- testing was carried out with 94 individual private beta users (from 300 invited to take part) without significant problems
- the IT Service Manager will conduct an operational readiness review
- the end-to-end testing is carried out using agile methods
- the next steps for the service include getting a detailed operational view of the system, with support from the helpdesk.
What the team needs to explore
Before their next assessment, the team needs to:
- identify possible ways to produce the read-only dashboard requested by the product owner.
11. Make a plan for being offline
Decision
The team met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team can rebuild the service and bring it back online within 4 hours
- the team can use cloud traffic management to carry out page redirects if necessary.
What the team needs to explore
Before their next assessment, the team needs to:
- understand user volume thresholds more precisely in public beta; the team currently anticipates around 1,500 users
- work on measuring the channel shift uplift metric.
12. Make sure users succeed first time
Decision
The team met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the service explains what it is for on its start page
- the team are working with other parts of ESFA and DfE to make sure their service is designed in the context of the other parts of the user journey
- the team has thought through different approaches to solving the problem, including an API that collects the data automatically
- the team includes a content designer and an interaction designer and are working with other designers in the department
- support channels are well signposted.
What the team needs to explore
Before their next assessment, the team needs to:
- gain clarity on how users will find the start of the service
- continue working with the team responsible for induction into the service, so they can be clear about what expectations are set early on
- focus on reducing the number of errors that users experience while trying to upload their data. The average number of errors users face (proportionate to their data size) should fall over time as the service continuously improves throughout the beta period
- continue to future-proof the system so that an API-driven solution is possible in the future
- ensure decisions are consistently made with an emphasis on user needs first. A balance between pragmatism and idealism is needed, but the balance currently leans a little too far towards pragmatism
- ensure the final page of the service meets users’ needs (that is, a forecast of how much funding they will receive) in the way that is least burdensome for them (for example, without needing to download a file).
13. Make the user experience consistent with GOV.UK
Decision
The team met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team is using GOV.UK Design System patterns, namely the start page pattern and the file upload pattern.
What the team needs to explore
Before their next assessment, the team needs to:
- continue using the GOV.UK Design System
- continue to test whether users, especially new users, understand the acronyms and other terminology that the service uses.
14. Encourage everyone to use the digital service
Decision
The team met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team have been using GOV.UK patterns and the GOV.UK Design System to avoid potential issues with service design for users with access needs
- a specialist accessibility audit has been conducted.
15. Collect performance data
Decision
The team met point 15 of the Standard.
What the team has done well
The panel was impressed that:
- the team has processes in place for calculating data on the 4 mandatory KPIs
- a provisional figure for cost per transaction has been calculated, including apportionments for relevant staff and other resources.
What the team needs to explore
Before their next assessment, the team needs to:
- clarify how it will apply the mandatory KPIs to the service, for example how the completion rate will be measured and what constitutes a completed transaction and a started transaction (a worked example follows this list)
- do more to identify service-specific metrics beyond the 4 mandatory KPIs, which may not always be helpful in measuring success (see point 16).
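As a hypothetical illustration of the point above: once ‘started’ and ‘completed’ transactions have been defined, the completion rate itself is simple arithmetic. The figures below are invented for the example.

```csharp
using System;

// Invented figures, purely to illustrate the calculation once
// 'started' and 'completed' transactions have been defined.
int started = 2_500;    // e.g. submissions begun in the period
int completed = 2_300;  // e.g. submissions that passed validation

double completionRate = (double)completed / started;
Console.WriteLine($"Completion rate: {completionRate:P1}"); // ≈ 92.0%
```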
16. Identify performance indicators
Decision
The team met point 16 of the Standard.
What the team has done well
The panel was impressed that:
- the team is recruiting a data scientist/analyst to help identify and capture suitable metrics
- the team is planning to make better use of the Google Analytics tracking, to identify where users experience problems
- the team is capturing data from other sources, including the service desk.
What the team needs to explore
Before their next assessment, the team needs to:
- identify KPIs that specifically measure the success of the service in meeting the needs of users and the business. Until now, the team’s focus has been on collecting and providing data to meet the Service Standard, but we’d like to see more measures specific to the service, derived from user needs and the purpose of the service. A GDS performance analyst can help the team understand how to do this
- look into segmenting their users (e.g. by ILR, EAS and ESF users) to help identify problems experienced by specific sets of users.
17. Report performance data on the Performance Platform
Decision
The team met point 17 of the Standard.
What the team has done well
The panel was impressed that:
- the team is in discussions with the Performance Platform team about the data format and the upload process (the service does not yet have a dashboard).
What the team needs to explore
Before their next assessment, the team needs to:
- have a publicly available dashboard, and processes in place to ensure it is kept up to date.
18. Test with the minister
Decision
The team met point 18 of the Standard.
What the team has done well
The panel was impressed that:
- the team had taken time to work through the service with senior leaders in ESFA as an effective proxy for testing the service with a minister.
What the team needs to explore
Before their next assessment, the team needs to:
- continue to ensure senior leaders are fully supportive of the service under development.