An Australian regulator has fined Elon Musk’s social media platform X A$610,500 ($386,000) for failing to cooperate with an investigation into its anti-child-abuse practices, a setback for a company that has struggled to retain advertisers amid claims it has grown lax on content moderation.
The e-Safety Commission penalized X, which Musk renamed from Twitter, for failing to answer questions about how quickly it responded to reports of child abuse material on the site and how it detected such material.
Though small compared with the $44 billion Musk paid for the site in October 2022, the fine is a blow to the company’s image at a time when advertisers have been cutting spending on a platform that has scaled back most content moderation and reinstated thousands of banned accounts.
X was recently accused of failing to control misinformation regarding Hamas’ attack on Israel, prompting the EU to announce that it was looking into possible violations of its new digital regulations.
“If you’ve got answers to questions, if you’re actually putting people, processes and technology in place to tackle illegal content at scale, and globally, and if it’s your stated priority, it’s pretty easy to say,” Commissioner Julie Inman Grant said in a telephone interview.
Inman Grant, who was a public policy director at Twitter until 2016, continued, “The only reason I can see to fail to answer important questions about illegal content and conduct occurring on platforms would be if you don’t have answers.”
X dissolved its Australian office after Musk’s purchase, leaving Reuters unable to reach a local representative. The media email address for the San Francisco-based company did not immediately respond to a request for comment.
Under Australian regulations that came into force in 2021, the regulator can penalize internet companies that fail to provide information about their online safety practices. Inman Grant said the regulator could take the company to court if X refuses to pay the fine.
Musk stated that “removing child exploitation is priority #1” after taking the company private. The Australian regulator, however, said that when questioned about how it prevented child grooming on the platform, X claimed the site was “not a service used by large numbers of young people.”
According to X, the anti-grooming technology that is currently on the market “is not of sufficient capability or accuracy to be deployed on Twitter.”
Inman Grant said the commission had also issued a warning to Alphabet’s (GOOGL.O) Google for noncompliance with its request for information on handling child abuse content, describing the search giant’s replies to several questions as “generic.” Google said it was disappointed by the warning, having worked with the regulator.
Google’s director of government affairs and public policy for Australia, Lucinda Longcroft, stated, “We remain committed to these efforts and collaborating constructively and in good faith with the e-Safety Commissioner, government, and industry on the shared goal of keeping Australians safer online.”
The regulator said X’s noncompliance was more serious, including its refusal to say how quickly it responded to reports of child abuse material, what steps it took to detect child abuse material in live streams, and how many staff it employed in content moderation, safety and public policy roles.
The company told the regulator it had cut its global workforce by 80% and no longer had any public policy staff in Australia, down from two before Musk’s acquisition. X also reported to the regulator that its proactive detection of child abuse material in public posts had declined after Musk took the company private.
The company told the regulator it does not use tools to detect such material in private conversations because “the technology is still in development.”