Islamic extremists sidestep Facebook’s content police

From Politico

Photos of beheadings, extremist propaganda and violent hate speech related to Islamic State and the Taliban were shared for months within Facebook groups over the past year despite the social networking giant’s claims it had increased efforts to remove such content.

The posts — some tagged as “insightful” and “engaging” via new Facebook tools to promote community interactions — championed the Islamic extremists’ violence in Iraq and Afghanistan, including videos of suicide bombings and calls to attack rivals across the region and in the West, according to a review of social media activity between April and December. At least one of the groups contained more than 100,000 members.

In several Facebook groups, competing Sunni and Shia militias trolled each other by posting pornographic images and other obscene photos into rival groups in the hope Facebook would remove those communities.

In others, Islamic State supporters openly shared links to websites with reams of online terrorist propaganda, while pro-Taliban Facebook users posted regular updates about how the group took over Afghanistan during much of 2021, according to POLITICO’s analysis.

During that time period, Facebook said it had invested heavily in artificial intelligence tools to automatically remove extremist content and hate speech in more than 50 languages. Yet the scores of Islamic State and Taliban content still on the platform show those efforts have failed to stop extremists from exploiting the platform.

When POLITICO flagged the open Facebook groups promoting Islamic extremist content to Meta, the parent company of Facebook, it removed them, including a pro-Taliban group that was created in the spring and had grown to 107,000 members. Yet within hours of its removal, a separate group supportive of the Islamic State had reappeared on Facebook, and again began to publish posts and images in favor of the banned extremist organization in direct breach of Facebook’s terms of service. Those groups were eventually removed after also being flagged.

In the now-deleted open Facebook group of roughly 107,000 members reviewed by POLITICO, scores of graphic videos and photos, with messages written in local languages, had been uploaded during much of 2021 in support of the Islamic State, which remains officially banned from the platform because of its international designation as a terrorist group.

“There’s clearly a problem here,” said Adam Hadley, director of Tech Against Terrorism, a nonprofit organization that works with smaller social networks, but not Facebook, in combating the rise of extremist content online. He added he was not surprised that the social network was struggling to detect the extremist content because its automated content filters were not sophisticated enough to flag hate speech in Arabic, Pashto or Dari. “When it comes to non-English language content, there’s a failure to focus enough machine language algorithm resources to combat this,” he added.

Facebook is extremely quick to detect certain words and phrases in English, so much so that we have to write T0mmy R0b1ns0n and the like in the hope of fooling it. There were 24 hours in FB jail, 30-day bans, and that memorable day in April 2019 when FB deleted my account (and those of hundreds of other British patriots and activists; men and women who thought they had the right of free speech and free association) without warning, in the time it took me to leave my PC to load the washing machine. Maybe we should have conducted business in Old English or Welsh.

Except the account of convicted terrorist Baz Hockton was conducted in English. Not quite good, grammatical English, but recognisably a part of our language nonetheless. And it continued under the radar for far too long, until last week.