The apps Instagram, Facebook and WhatsApp can be seen on the display of a smartphone in front of the logo of the Meta internet company.
Image Credits: Jens Büttner / picture alliance / Getty Images

Meta put on watch over terrorism content in the EU

Ireland’s media regulator has once again put social media giant Meta on watch over terrorist content takedowns, issuing a decision against Facebook on Monday. Coimisiún na Meán said the tech giant would have to take “specific measures” to prevent its services from being used to disseminate terrorist content and report back to the regulator on the measures taken.

The decision follows similar determinations by Coimisiún na Meán in November against Meta-owned Instagram, along with TikTok and X.

The Irish authority plays an outsized role in regulating tech giants’ compliance with a range of EU digital rulebooks because so many of them choose to locate their regional headquarters in Ireland.

The relevant bit of Ireland’s online safety framework that Coimisiún na Meán is enforcing in today’s decision is the Terrorist Content Online Regulation, a pan-EU law on terrorist content takedowns agreed by the bloc’s lawmakers back in 2021. It requires hosting service providers, in this case social media platforms, to remove terrorist content within one hour of receiving a removal order from a competent authority. Penalties under the regime can reach up to 4% of global annual turnover.

“Under the Terrorist Content Online Regulation, hosting service providers which receive two or more final removal orders from EU competent authorities within the last 12 months may be determined as being exposed to terrorist content,” the Irish regulator wrote in a press release. “An Coimisiún has reached this decision [against Meta-owned Facebook] following the notification of two or more final removal orders in respect of this provider and following engagement with this provider.”

It’s not clear exactly which type of terrorist content was found on Facebook and notified to the regulator; we’ve asked Coimisiún na Meán for more details. We’ve also contacted Meta for a response to the decision.

Update: Meta spokesman Ben Walters emailed a statement in which the company said: “This designation means CnaM [Coimisiún na Meán] can assess the measures we have in place to deal with terrorist content.”

The company also claimed to have “one of the most comprehensive approaches in the industry targeting dangerous organisations and individuals,” adding: “[L]ast quarter, we found and removed over 99% of this type of violating content before it was even reported to us.”
