Home Secretary Priti Patel plans to speak at a child welfare charity event focused on exposing the perceived ills of end-to-end encryption and calling for tighter regulation of the technology. At the same time, a new report will say tech companies need to do more to protect children online.
Patel will headline an April 19 roundtable hosted by the National Society for the Prevention of Cruelty to Children (NSPCC), according to a draft invitation seen by WIRED. The event is expected to be deeply critical of the encryption standard, which makes it harder for investigators and tech companies to monitor communications between people and to detect child grooming or illegal content, including terror material or child abuse images.
End-to-end encryption secures communications between the people involved: only the sender and recipient of a message can see what is being said, and the platform providing the technology cannot access the content of the messages. The technology has become increasingly standard in recent years, with WhatsApp and Signal using end-to-end encryption by default to protect people’s privacy.
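The principle can be sketched with a toy example (this is an illustration only, not any real messaging protocol – apps like WhatsApp and Signal use far more sophisticated schemes such as the Signal protocol): the platform's servers relay only ciphertext, while the key needed to read it is held solely by sender and recipient.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy XOR cipher, for illustration only.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
# The key is shared only between sender and recipient,
# never with the platform relaying the message.
key = secrets.token_bytes(len(message))

ciphertext = encrypt(key, message)          # all the platform's servers ever see
assert ciphertext != message                # the platform cannot read the content
assert decrypt(key, ciphertext) == message  # the recipient, holding the key, can
```

This is exactly the property at issue in the debate: because the intermediary never holds the key, it cannot scan message content for harmful material.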
The Home Office move comes as Facebook plans to deploy end-to-end encryption across all of its messaging platforms, including Messenger and Instagram – a plan that has sparked fierce debate in the UK and elsewhere over the risks the technology may pose to children.
During the event, the NSPCC will unveil a report on end-to-end encryption by PA Consulting, a UK firm that has advised the Department for Digital, Culture, Media and Sport (DCMS) on the upcoming online safety regulation. An early draft of the report, seen by WIRED, argues that wider use of end-to-end encryption would protect adults’ privacy at the expense of children’s safety, and that any strategy tech companies adopt to mitigate the effects of end-to-end encryption will “almost certainly be less effective than the current ability to scan for harmful content.”
The report also suggests that the government design regulation “specifically aimed at encryption” to prevent technology companies from engineering away their ability to police illegal communications. It recommends that the upcoming online safety bill – which will impose a duty of care on online platforms – make tech companies’ reporting of online child abuse data mandatory rather than voluntary.
The online safety bill will force companies whose services use end-to-end encryption to show how effective they are at combating the spread of harmful content on their platforms – or risk fines from the communications regulator Ofcom, which will be responsible for enforcing the rules. As a last resort, Ofcom could require a company to use automated systems to remove illegal content from its services.
The NSPCC says this setup does not go far enough to curb encryption: in a statement released last week, the charity urged digital secretary Oliver Dowden to strengthen the proposed regulation by preventing platforms from deploying end-to-end encryption until they can demonstrate that they are able to protect children’s safety. Facebook currently tackles the circulation of child sexual abuse content on WhatsApp by removing accounts with banned images in their profile photos, or groups whose names suggest illegal activity. WhatsApp says it bans more than 300,000 accounts a month that it suspects of sharing child sexual abuse material.
“Ofcom will have to meet a series of tests before it can act against a regulated platform,” says Andy Burrows, the NSPCC’s head of child safety online policy. “It will have to be able to demonstrate evidence of serious and sustained abuse, which will be practically very difficult to do because end-to-end encryption will remove a significant portion of the reporting flow.”