The UK’s Technology Minister, Michelle Donelan, hailed the new legislation as a “landmark” initiative, commenting, “Our government is ardently committed to transforming the UK into the world’s safest online environment.”
The legislation sets out a new framework for tackling illegal and harmful online material. Once it becomes law, platforms such as Facebook and TikTok will be required to act quickly against prohibited content, either preventing its publication or removing it promptly.
Ofcom, the UK’s communications regulator, will be responsible for enforcing the new rules. The obligations on social media companies include the following:
- Swift action against illegal content, including material that encourages self-harm.
- Barriers to prevent minors from encountering harmful or age-restricted content.
- Rigorous age verification and age restriction enforcement.
- Greater transparency about the risks posed to minors on major social media sites, including the publication of risk assessments.
- Clear, accessible ways for children and their parents to report online concerns.
Child online safety has drawn widespread attention, with the National Society for the Prevention of Cruelty to Children, the Internet Watch Foundation, bereaved parents who attribute their children’s deaths to online harm, and survivors of sexual assault all backing the bill.
Firms that fail to comply could face penalties of up to £18 million or 10% of their global annual revenue.
The 300-page bill has been extensively revised since its first draft four years ago, with new clauses addressing issues such as trolling, deepfake pornography, animal cruelty, and fraudulent advertising.
In a significant policy shift in November 2022, the UK government dropped its plans to regulate “legal but harmful” material, responding to concerns that such measures might infringe on free speech.
Earlier proposals would have allowed fines against social media firms that failed to tackle harmful but non-criminal content, and also suggested possible sanctions for senior managers.
Yet the debate is not over. The bill’s most contested provision would require the scanning of encrypted communications for illegal material, which would likely mean client-side scanning of messages before they are encrypted, an approach critics argue is neither technically feasible nor compatible with user privacy.
Encrypted messaging services such as Meta’s WhatsApp and Signal, which use end-to-end encryption so that no one other than the sender and recipient can read messages, had suggested they would leave the UK market if scanning obligations were imposed.
In a subsequent development, the UK government clarified that Ofcom would only require scanning where feasible technology exists. The concession was cautiously welcomed by technology companies and privacy campaigners.
In an ongoing dialogue, the UK has already urged Meta to proceed cautiously with plans to introduce end-to-end encryption on platforms such as Instagram and Facebook Messenger, citing child safety concerns.
Meta, for its part, plans to roll out end-to-end encryption across Messenger and Instagram DMs, arguing that the technology strengthens user security.