On Friday, European Union (EU) tech regulators ordered Meta Platforms (META.O) to provide more information by December 22 on the steps it has taken to address child sexual abuse content on its photo and video sharing app Instagram. Failure to comply with the order could trigger a formal investigation under the bloc's new online content rules.
The European Commission issued a first request for information in October concerning the measures Meta had taken to curb the spread of violent and terrorist content. A second request, on the measures taken to safeguard children, was sent last month.
In response to its most recent inquiry, the European Commission said it was “also requesting information regarding Instagram’s recommender system and the amplification of potentially harmful content.”
The request for information was made under the European Union's Digital Services Act (DSA), a new set of rules that requires large technology companies to do more to monitor and control dangerous and unlawful content hosted on their platforms.
Failure to comply with such requests may result in a formal investigation and financial penalties.
TikTok, owned by Chinese giant ByteDance, and Elon Musk's X have received comparable information requests.
Meta's adherence to the EU's rules on preventing child sexual abuse content is essential to building a safer online environment. By deploying detection technologies, fostering partnerships, ensuring transparency, and preparing for future requirements, Meta can demonstrate a sustained commitment to user safety and to combating illegal activity online.