UK Admits Encryption Hurdles to Online Safety Law After WhatsApp, Signal Threaten to Pull Out

The UK acknowledged possible technical hurdles in its planned crackdown on illegal online content after encrypted messaging companies including WhatsApp threatened to pull their service from the country.

Regulator Ofcom can only compel tech companies to scan platforms for illegal content such as images of child sexual abuse if it’s “technically feasible,” culture minister Stephen Parkinson told the House of Lords on Wednesday, as the chamber debated the government’s Online Safety Bill. He said the watchdog will work closely with businesses to develop and source new solutions.

“If appropriate technology does not exist which meets these requirements, Ofcom cannot require its use,” Parkinson said. Ofcom “cannot require companies to use proactive technology on private communications in order to comply” with the bill’s safety duties.

The remarks aim to allay tech companies’ concerns that scanning their platforms for illegal content could compromise the privacy and encryption of user data, giving hackers and spies a back door into private communications. In March, Meta Platforms’ WhatsApp went so far as to threaten to pull out of the UK.

“Today really looks to be a case of the Department for Science, Innovation and Technology offering some wording to the messaging companies to enable them to save face and avoid the embarrassment of having to row back from their threats to leave the UK, their second largest market in the G7,” said Andy Burrows, a tech accountability campaigner who previously worked for the National Society for the Prevention of Cruelty to Children.

Protecting Children
The sweeping legislation, which aims to make the web safer, is in its final stages in Parliament after six years of development. Parkinson said that Ofcom would nevertheless be able to require companies to “develop or source a new solution” to allow them to comply with the bill.

“It is right that Ofcom should be able to require technology companies to use their considerable resources and their expertise to develop the best possible protections for children in encrypted environments,” he said.

Meredith Whittaker, president of encrypted messaging app Signal, earlier welcomed a Financial Times report suggesting the government was pulling back from its standoff with technology companies. The report cited anonymous officials as saying there is no service today that can scan messages without undermining privacy.

However, security minister Tom Tugendhat and a government spokesman said it was wrong to suggest the policy had changed.

Feasibility
“As has always been the case, as a last resort, on a case-by-case basis and only when stringent privacy safeguards have been met, it will enable Ofcom to direct companies to either use, or make best efforts to develop or source, technology to identify and remove illegal child sexual abuse content – which we know can be developed,” the spokesman said.

Ministers met big tech companies including TikTok and Meta in Westminster on Tuesday.

Language around technical feasibility has been used by the government in the past. In July, Parkinson told Parliament, “Ofcom can require the use of technology on an end-to-end encrypted service only when it is technically feasible.”

The NSPCC, a major advocate of the UK crackdown, said the government’s statement “reinforces the status quo in the bill and the legal requirements on tech companies remain the same.”

Accredited Tech
Ultimately, the legislation’s wording leaves it up to the government to decide what is technically feasible.

Once the bill comes into force, Ofcom can serve a company with a notice requiring it to “use accredited technology” to identify and prevent child sexual abuse or terrorist content, or face fines, according to July’s published draft of the legislation. There is currently no accredited technology because the process of identifying and approving services only begins once the bill becomes law.

Previous attempts to solve the dilemma have revolved around so-called client-side or device-side scanning. But in 2021 Apple Inc. delayed such a system, which would have searched photos on devices for signs of child sex abuse, after fierce criticism from privacy advocates, who feared it could set the stage for other forms of tracking.

Andy Yen, founder and CEO of privacy-focused VPN and messaging company Proton, said: “As it stands, the bill still permits the imposition of a legally binding obligation to ban end-to-end encryption in the UK, undermining citizens’ fundamental rights to privacy, and leaves the government defining what is ‘technically feasible.’”

“For all the good intentions of today’s statement, without additional safeguards in the Online Safety Bill, all it takes is for a future government to change its mind and we’re right back where we started,” he said. 

© 2023 Bloomberg LP

Deepfakes Sharing to Be Criminalised in UK Under New Online Safety Bill

Under a planned amendment to the UK’s new Online Safety Bill, people who share so-called “deepfakes”, explicit images or videos manipulated to depict someone without their consent, will be among those specifically criminalised for the first time and could face time behind bars.

The UK government said it will also bring forward a package of additional laws to tackle a range of abusive online behaviour including the installation of equipment, such as hidden cameras, to take or record images of someone without their consent.

These will cover so-called “downblousing” – where photos are taken down a woman’s top without consent. The Ministry of Justice (MoJ) said this delivers on British Prime Minister Rishi Sunak’s pledge to criminalise the practice, in line with previous steps taken to outlaw “upskirting” – or filming up a woman’s clothing without consent.

“We must do more to protect women and girls from people who take or manipulate intimate photos in order to hound or humiliate them,” said UK Deputy Prime Minister and Justice Secretary Dominic Raab.

“Our changes will give police and prosecutors the powers they need to bring these cowards to justice and safeguard women and girls from such vile abuse,” he said.

The amendment to the Online Safety Bill is intended to broaden the scope of current intimate image offences, so that more perpetrators will face prosecution and potentially time in jail.

According to official figures, around one in 14 adults in England and Wales has experienced a threat to share intimate images, and police recorded more than 28,000 reports of private sexual images being disclosed without consent between April 2015 and December 2021.

The latest package of MoJ reforms follows growing global concerns around the abuse of new technology, including the increased prevalence of “deepfakes”.

These typically involve the use of editing software to make and share fake images or videos of a person without their consent, which are often pornographic in nature.