Plans by tech giants for more encrypted messaging risk greater child exploitation and abuse
- Anne Longfield warns that the introduction of end-to-end encrypted messaging by tech giants could make it much harder for platforms to detect grooming, scan for child abuse material and share reports with law enforcement agencies.
- Children’s Commissioner urges Government to stand up to social media giants and introduce online harms legislation in 2021.
- Children’s Commissioner for England survey reveals 9 out of 10 children in England aged 8-17 use a messaging app or messaging website, including 7 out of 10 children aged 8-10.
- Survey reveals 60% of 8-year-olds and 90% of 12-year-olds report using a messaging app with an age restriction of 13 or older.
- Over a third of children say that they have received something that made them feel uncomfortable on a messaging service – and almost one in ten children report using a messaging site to talk to strangers.
Anne Longfield, Children’s Commissioner for England, is today (Tuesday) publishing a report, “Access Denied: How end to end encryption threatens children’s safety online”, looking at children’s use of private messaging services like WhatsApp and Facebook Messenger.
The study suggests that millions of children in England are using messaging platforms that they are not old enough to be accessing. The report comes following announcements by Facebook, and indications by other platforms such as Snap, that they plan to apply end-to-end encryption to all their messaging services. End-to-end encryption makes it impossible for the platform itself to read the contents of messages, and risks preventing police and prosecutors from gathering the evidence they need to prosecute perpetrators of child sexual exploitation and abuse.
The report includes a survey revealing the extent of children’s use of messaging services, including by children much younger than the minimum age requirement. It shows:
- Nine out of ten children aged 8-17 are using messaging services.
- 60% of 8-year-olds and 90% of 12-year-olds reported using a messaging app with an age restriction of 13 or older.
- Almost one in ten children report using a messaging service to talk to people they don’t already know.
- One in six girls aged 14-17 reported having received something distressing from a stranger via a private message.
- One in 20 children say they have shared videos or photos of themselves with strangers.
- Over a third of 8-10-year-olds and over half of 11-13-year-olds admit that they have said they were older than they were in order to sign up to an online messaging service.
The report warns that the privacy of direct messaging platforms can conceal some of the most serious crimes against children, including grooming, exploitation and the sharing of child sexual abuse material. An NSPCC investigation found that Facebook, Instagram and WhatsApp were used in child abuse image and online child sexual offences an average of 11 times a day in 2019. It also found that the rate of grooming offences committed in the UK appears to have accelerated further over the course of lockdown, with 1,220 offences recorded in just the first three months of national lockdown. Facebook-owned apps (Facebook, Instagram, WhatsApp) accounted for 51% of these reports and Snapchat a further 20%.
The Children’s Commissioner’s survey found that WhatsApp – an end-to-end encrypted service owned by Facebook – is the most popular messaging app among all age groups, used by 62% of children surveyed. Chat services attached to large social media sites, such as Snapchat, Instagram, Facebook and TikTok, are also popular, particularly among teenagers. None are yet end-to-end encrypted by default, but all – with the exception of TikTok – have made public their plans to do so in the near future or suggested that they are looking into it. All have age limits which children routinely ignore, and which platforms do little to meaningfully enforce.
It is now over 18 months since the publication of the Government’s Online Harms White Paper, and over three years since the publication of the Internet Safety Strategy green paper which preceded it. Added to this delay, the Children’s Commissioner is concerned that end-to-end encrypted messaging services could be defined as “private communications” and might therefore not be subject to the duty of care in the same way as other platforms.
The Children’s Commissioner is also warning that end-to-end encryption could be a cynical attempt on the part of some tech firms to side-step sanctions and litigation, as the UK Government prepares to establish a new legal ‘duty of care’ on companies towards their users. If a platform is unable to read a message shared across its servers, it follows that it would be hard for the Government to hold it accountable for the message’s contents.
The report makes a series of recommendations, including:
- The Government should introduce its online harms legislation to Parliament in 2021. The legislation should set a strong expectation on platforms to age verify their users and allow for strong sanctions against companies which breach their duty of care. This should include GDPR-style fines and a requirement to issue notifications to users when tech firms are found to be in breach of their duty of care. It should also bring the full range of services used by children in scope – including social media, messaging services (end-to-end encrypted or not), gaming platforms, and more.
- The ICO should take robust action against platforms which do not conform to the requirements of the Age Appropriate Design Code when the transition period ends in September 2021.
- The Government should ensure that if tech giants are unable to demonstrate that a new feature will not put younger users at heightened risk of harm, that feature is not implemented.
- Tech giants should not apply end-to-end encryption to children’s accounts if doing so reduces children’s safety, and they must introduce better mechanisms for proactively monitoring their platforms for child sexual exploitation. Tech companies should also retain the ability to scan for child sexual abuse material. Platforms which fail to meet these tests should be judged to have breached their duty of care.
- The Government’s proposed duty of care should cover ‘private communications’, including those which are end-to-end encrypted.
Anne Longfield, Children’s Commissioner for England, commenting on the report, said:
“This report reveals the extent to which online messaging is a part of the daily lives of the vast majority of children from the age of 8. It shows how vigilant parents need to be but also how the tech giants are failing to regulate themselves and so are failing to keep children safe.
“The widespread use of end-to-end encryption could put more children at risk of grooming and exploitation and hamper the efforts of those who want to keep children safe.
“It has now been 18 months since the Government published its Online Harms White Paper and yet little has happened since, while the threat to children’s safety increases.
“It’s time for the Government to show it hasn’t lost its nerve and that it is prepared to stand up to the powerful internet giants, who are such a big part of our children’s lives. Ministers can show they mean business by promising to introduce legislation in 2021 and getting on with the job of protecting children from online harms.”
Simone Vibert, Senior Policy Analyst for the Children’s Commissioner, and author of the report, said:
“Messaging services play an important role in children’s lives, helping them to keep in touch with family and friends. But there is a more sinister side to these platforms. This research shows that hundreds of thousands of children are using messaging apps to contact strangers, including sharing images and photos, and that they are receiving images and messages back which make them feel uncomfortable.
“The fact that there are age limits on these apps shows that the tech giants themselves are aware of the risks, and yet most do very little, if anything, to reliably check the age of their users. Our research shows a majority of children are using a messaging app which they aren’t old enough to be using.
“It is yet more evidence of the need for a statutory duty of care on online platforms, including messaging apps.”