Ofcom gets more powers over tech groups in online safety campaign

UK authorities will have the power to order tech companies to redesign their platforms, and to impose fines if they fail to police child sexual exploitation material, under new online safety legislation.

The rules will target end-to-end encrypted platforms, where messages can be viewed only by the sender and receiver. Such platforms are under increasing political pressure to give governments and law enforcement access to content, including messages, photos and videos.

The Home Office announced the amendment to the Online Safety Bill on Wednesday, allowing communications regulator Ofcom to fine tech companies £18mn or 10 per cent of their annual turnover, whichever is higher, if they fail to meet child protection standards that have yet to be defined.

Under the proposals, the regulator could order tech companies to install yet-to-be-developed software on encrypted platforms, or to develop their own technologies to detect inappropriate material.

The move comes as tech companies seek to balance protecting the privacy of their users’ data with protecting vulnerable users, while working with law enforcement agencies and lawmakers who cannot view the content on encrypted platforms.

Apple previously attempted to introduce scanning software to crack down on child sexual abuse images, but was forced to backtrack after a backlash from privacy activists last year.

Meanwhile, Meta, which owns Facebook, Instagram and WhatsApp, has pledged to roll out end-to-end encryption on Facebook Messenger, a move the Home Office and child safety charities have previously lobbied against.

In a public submission to the Bill’s committee last month, the company said it was concerned about how Ofcom’s power to require message scanning for inappropriate content would operate. “It is unclear how this would be possible in an encrypted messaging service, and it would have significant privacy, security and safety implications for users,” wrote Richard Earley, public policy manager at Meta UK.

Under the legislation, Ofcom will decide whether platforms are doing enough to prevent, detect and remove explicit material, and whether it is necessary and proportionate to ask platforms to change their products.

“Privacy and security are not mutually exclusive – we need both, and we can have both and that’s what this amendment delivers,” Home Secretary Priti Patel said.

The government has awarded five projects across the UK more than £550,000 to develop technologies to stop the spread of child sexual abuse material, which platforms could be asked to use in their products in the future.

These include external software that could be integrated into existing encrypted platforms, as well as age verification technology that could be applied before consumers access encrypted services.

Figures released on Wednesday by children’s charity the NSPCC suggest online grooming offences in the UK have jumped by more than 80 per cent in four years, averaging around 120 offences a week.

Meta-owned platforms were used in 38 per cent of cases where the means of communication was known, and Snapchat in 33 per cent.
