UK Plans 48-Hour Deadline On Abuse Images

Technology companies could be legally required to remove intimate images shared without consent within 48 hours under new proposals announced by the UK government.

The measures, set out as an amendment to the Crime and Policing Bill currently before the House of Lords, would place intimate image abuse on the same regulatory footing as child sexual abuse material and terrorist content.

Under the proposals, platforms that fail to act within the timeframe could face fines of up to 10% of their global turnover or have their services blocked in the UK.

Prime Minister Keir Starmer said the move was part of an “ongoing battle” with technology firms to better protect victims. Speaking on BBC Breakfast, he said the change would spare victims from repeatedly reporting the same content as it resurfaces on different platforms.

“This is about stopping the whack-a-mole,” he said, adding that similar systems already exist for removing terrorist material online.

The proposed law would allow victims to flag abusive content once, after which platforms would be required not only to remove the images but also to prevent them from being re-uploaded. Internet service providers would also be given clearer guidance on blocking access to websites that host illegal content, targeting sites that currently fall outside the scope of the Online Safety Act.

Campaigners welcomed the move. Janaya Walker, interim director of the End Violence Against Women Coalition, said the proposals “rightly place the responsibility on tech companies to act”.

Government data shows that women, girls and LGBT people are disproportionately affected by intimate image abuse. A report published in July 2025 also found that young men and boys were increasingly targeted through financial sexual extortion, commonly known as sextortion.

A separate parliamentary report in May 2025 recorded a 20.9% rise in reports of intimate image abuse during 2024.

Sir Keir said enforcement would involve a mix of regulatory oversight and criminal processes, but he did not expect the measures to include prison sentences for technology executives.

Technology Secretary Liz Kendall said the proposals marked the end of what she described as a “free pass” for tech firms, adding that victims should not be forced to wait days for harmful content to be removed.

The announcement follows a dispute earlier this year between the government and X, after its AI tool generated sexualised images of real women. The feature was later withdrawn. New legislation introduced in February has already made the creation and sharing of non-consensual deepfake images a criminal offence in the UK.
