Australia's online safety regulator has fined Elon Musk's social media platform, X, A$610,500 ($386,000) for failing to cooperate with an investigation into its anti-child-abuse practices.
The fine is a setback for a company already grappling with declining revenue, as concerns about its content moderation practices have driven advertisers away.
The e-Safety Commission, which oversees online safety, found that X, known as Twitter before Musk's rebranding, had failed to comply.
The company had not provided satisfactory answers to critical questions, including how quickly it responds to reports of child abuse content on the platform and what methods it uses to detect such material.
While the fine is small compared to the $44 billion Musk paid for the platform in October 2022, the reputational damage is significant.
X has been under scrutiny for its lenient content moderation policies, which have seen numerous banned accounts reinstated and prompted advertisers to leave.
The European Union had also initiated an investigation into X for potentially violating its new tech rules, particularly concerning disinformation related to Hamas’s attack on Israel.
Commissioner Julie Inman Grant stressed the importance of transparency in how platforms address illegal content, suggesting that a company that declines to answer such crucial questions most likely has no satisfactory answers to give.
X had closed its Australian office following Musk’s acquisition, leaving no local representative to respond to the regulator’s inquiries.
The San Francisco-based company did not immediately respond to a request for comment.
Under Australian laws introduced in 2021, internet companies can be compelled to disclose information about their online safety practices or face fines.
If X refuses to pay the fine, the regulator has the option to pursue legal action against the company.
Despite Musk’s prior commitment to prioritize the removal of child exploitation on the platform, the regulator found X’s responses lacking when questioned about its measures to prevent child grooming.
X asserted that it was not widely used by young people and claimed that available anti-grooming technology was insufficient for deployment.
The e-Safety Commission also issued a warning to Alphabet's Google for noncompliance over its handling of child abuse content, deeming some of Google's responses "generic." The company expressed disappointment at the warning and reaffirmed its commitment to online safety efforts.
X’s noncompliance was deemed more serious, encompassing a failure to provide information about response times to child abuse reports, steps taken to detect abuse during livestreams, and the number of staff dedicated to content moderation, safety, and public policy.
The company confirmed it had reduced its global workforce by 80% and had no public policy staff in Australia, compared to two before Musk’s takeover.
It also disclosed that its proactive detection of child abuse material in public posts had declined since the company was taken private, and said the technology for detecting such material in private messages was still in development.