Australia Fines Twitter (X) Over Handling of Child Sex Abuse Content

Australia has imposed a fine of 610,500 Australian dollars ($386,000) on X, the platform formerly known as Twitter, for alleged shortcomings in disclosing its approach to addressing child sex abuse content. The penalty adds to the growing challenges facing the Elon Musk-owned social media platform.

The Australian eSafety Commission, the country’s online safety regulator, said in a statement on Monday that X had failed to adequately answer several questions about its efforts to combat child abuse material. The platform was accused of leaving some questions unanswered, leaving sections blank, or providing incomplete and inaccurate responses.

X has publicly declared that combating child sexual exploitation is its top priority, but the eSafety Commissioner, Julie Inman Grant, stressed that such statements must be backed by substantive action.

In February, Inman Grant had questioned five tech companies, including X, about their measures to address crimes against children on their platforms. The responses revealed significant gaps and inconsistencies, and X’s non-compliance was judged more severe than that of the other companies.

X now has 28 days to either request the withdrawal of the notice or make the payment. As of now, the platform has not commented on the matter.

The commission highlighted that X did not respond to critical questions, including the platform’s response time to reports of child sexual exploitation and the methods used to detect such content in livestreams.

When asked about measures to prevent grooming of children by sexual predators, X argued that it is not a platform widely used by young people and its technology is currently insufficient in capability and accuracy.

In a similar vein, Google failed to address several key questions on child abuse and received a formal warning against future non-compliance. The American tech giant affirmed its commitment to fighting child sexual abuse material and its willingness to work constructively with the eSafety Commissioner.

Earlier, the Australian regulator had identified “serious shortfalls” in how Apple, Meta, Microsoft, Skype, Snap, WhatsApp, and Omegle handled online child sexual exploitation.

These developments highlight the growing scrutiny of tech companies’ responsibilities regarding harmful content on their platforms, particularly content related to child exploitation.
