Meta launches an open-source moderation system to combat the spread of terrorist images


Vincent Mannessier

December 13, 2022 at 3:55 p.m.


© Michael Vi / Shutterstock

In a post published on Meta's blog, a company executive announced a cooperative system designed to fight terrorist content on the Internet more effectively.

While Meta, like many tech companies in 2022, has laid off a large part of its workforce, it is not abandoning moderation. Far from irreproachable in this area, Mark Zuckerberg's company still devotes significant resources to it. As the group prepares to take over the chairmanship of the Global Internet Forum to Counter Terrorism (GIFCT) next month, it is unveiling a project that approaches moderation differently, called Hasher Matcher Actioner (HMA).

The GIFCT and Meta's 2022 moderation record

Nick Clegg, Meta's president of global affairs, published a blog post presenting the tool ahead of the company's upcoming GIFCT chairmanship. The forum is an NGO founded in 2017 as a partnership between YouTube, Microsoft, Twitter and Meta. Its objective is clear: to fight the spread of terrorist, violent and extremist content on the Internet. Other companies, associations and governments also take an active part.

In his post, Clegg shares Meta's moderation figures: more than 40,000 people work on enforcement, and the company invested more than 5 billion dollars in it last year. He also explains that the experience accumulated over the years has made the company more efficient in this area and has enabled the creation of new tools such as HMA.

How does HMA work?

Clegg explains that HMA is based on a system and technology that already exist and are used by most major platforms. Concretely, as soon as a moderator spots content that breaks the rules, they remove it and record it in a database. Rather than storing the images or videos themselves, the database keeps only a "hash", a kind of unique digital fingerprint specific to each piece of content. Every media file posted on the platform is then compared against these hashes and is deleted, or flagged for human review, if the system finds a match. Besides using considerably less storage space, this approach avoids keeping terrorist, violent or child-abuse images and videos in the database itself.
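To make the hasher → matcher → actioner flow concrete, here is a minimal sketch in Python. It is an illustration of the general principle described above, not Meta's actual implementation: production systems use perceptual hashes (Meta has open-sourced PDQ for images, for example) so that re-encoded or slightly altered copies still match, whereas this sketch uses a plain SHA-256 digest to stay self-contained. The class and method names are hypothetical.

```python
import hashlib


class HasherMatcherActioner:
    """Toy model of a hash-and-match moderation pipeline (illustrative only)."""

    def __init__(self):
        # The shared database stores only hashes, never the media itself.
        self.banned_hashes: set[str] = set()

    @staticmethod
    def hash_media(media_bytes: bytes) -> str:
        # "Hasher": derive a fixed-size fingerprint of the content.
        # Real systems use perceptual hashes (e.g. PDQ), not SHA-256.
        return hashlib.sha256(media_bytes).hexdigest()

    def flag(self, media_bytes: bytes) -> None:
        # A moderator removes the content and records only its hash.
        self.banned_hashes.add(self.hash_media(media_bytes))

    def check_upload(self, media_bytes: bytes) -> str:
        # "Matcher" + "actioner": compare each new upload against the
        # hash database and block (or queue for review) on a match.
        if self.hash_media(media_bytes) in self.banned_hashes:
            return "blocked"
        return "allowed"


# Hypothetical usage:
hma = HasherMatcherActioner()
hma.flag(b"bytes of a removed image")
assert hma.check_upload(b"bytes of a removed image") == "blocked"
assert hma.check_upload(b"bytes of a harmless photo") == "allowed"
```

The key property this sketch illustrates is that platforms only need to exchange fingerprints: the `banned_hashes` set could be shared between companies without any prohibited media ever leaving the platform that removed it.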

The system Meta is presenting is therefore not revolutionary, but the way it is being delivered is. The real innovation is that it is open source, which on the one hand gives smaller companies, previously deprived of such technology, access to it, and above all makes it possible to pool the hashes of every entity taking part in the program. As the author of the post points out, people who post this type of content rarely limit themselves to a single platform. The success of HMA will therefore depend largely on its adoption by content platforms, whatever their size.

Source: Meta


