Abu Bakr Naji wants you to believe that it was the prayers of the mujahideen in Afghanistan that caused the Chernobyl disaster. “They raised their palms in prayer against the Russians and after a short period of time God wiped out many in the Chernobyl disaster,” Naji wrote in The Management of Savagery: The Most Critical Stage Through Which the Ummah Will Pass—one of the most infamous terrorist strategies published in the post-9/11 era.
Naji’s 2004 book is notorious for more than ludicrous correlations between mujahideen prayers and a nuclear disaster. It is infamous for foreshadowing the governance strategy later used by the Islamic State of Iraq and Syria (ISIS). What makes The Management of Savagery dangerous is its deliberate, almost clinical, dissection of international terrorist groups, and the corrective strategies Naji proffers until the establishment of a “Caliphate.”
In an era when more than 70 percent of Americans above the age of 12 use YouTube for streaming audio, making it more popular for audio than any other streaming service, it only makes sense that The Management of Savagery made its way onto YouTube as an audiobook. The text also found a home on Facebook, where large “educational” pages shared direct links to it.
Facebook, YouTube, Microsoft, and Twitter formed the Global Internet Forum to Counter Terrorism, and, at a meeting in San Francisco and another held in Jordan a month ago, I specifically noted to these platforms the need for improved Arabic detection practices. Yet The Management of Savagery’s presence on Facebook and YouTube highlights the inability of technology companies to identify, disrupt, and remove legacy terrorist content in multiple languages on their sites. Echoing ISD’s findings, a research report commissioned by social media companies and published through the Global Research Network on Terrorism and Technology noted that “jihadists are aware that Arabic, in addition to considering it a sacred language, provides a linguistic firewall which their adversaries find difficult to penetrate.”
Throughout June and July, the team of ISD researchers I led used simple Arabic-language searches to surface networks of Facebook and YouTube users, channels, and pages sharing al Qaeda and ISIS legacy terrorist content across both platforms. It was as simple as typing “The Management of Savagery” into the YouTube and Facebook search fields. ISD shared all of its findings, including links to the content on both sites, with both Facebook and Google. Both companies are reviewing the content or have taken it down. Facebook has yet to take down multiple posts linking to a direct download of The Management of Savagery, shared more than 180 times across the platform.
Much of the technology companies’ focus over the past four years has been on removing and limiting the impact of ISIS-affiliated content on their platforms at the behest of governments around the world. While much Islamist terrorist activity online has migrated to alternative, end-to-end encrypted platforms such as Telegram, the ability to still find troves of terrorist content on the largest platforms in the world, in Arabic, the fastest-growing language on social media, is confounding. To be clear, Facebook says it has taken down millions of pieces of ISIS- and al Qaeda-related content this year alone, while Google says it has reviewed a million pieces of potentially terrorist-related content on its platforms.
Terrorist sympathizers on Facebook and YouTube have clearly taken advantage of these gaps, deploying simple tactics to evade both manual and automated detection by the companies. Facebook users labeled terrorist content or pages as “educational,” tagging terrorist material as “books,” “public figures,” or “companies,” in order to evade detection. Under the companies’ own rules, educational content that might otherwise violate a platform’s codes of conduct is often spared takedown. Facebook’s Community Guidelines outline this exception under the company’s “Objectionable Content” section; YouTube outlines a similar exception under its “Violent Criminal Organizations” policy.
Facebook’s design actually aids users in finding more content. The “related pages” option functions as an algorithmic bridge between networks of terrorist content propagators. I was linked from pages sharing explicitly terrorist content to other “softer” pages supportive of Salafi-jihadism, Salafi-jihadist principles, and Salafi-jihadist ideologues. Two clicks away from The Management of Savagery through related pages is a “Company” page titled “Jihad for the Sake of God.” This page claims more than 42,000 followers and is dedicated to countering “misguided sheiks.”
On YouTube, terrorist propagandists are linking extremist material to mainstream content. This approach appears designed to game the platform’s search algorithms by associating mainstream content with terrorist content, thereby driving traffic back to dangerous material. A number of the playlists also included a smattering of mainstream content, including videos of popular Egyptian singers, belly dancers, and mobile phone videos of families.
Islamist supporters seeded eight different direct links to the PDF version of The Management of Savagery on Facebook. The links were shared 184 times across the platform, and in some instances were downloaded thousands of times. One public page with more than 137,000 followers posted a link to the PDF version of the terrorist strategy that has been shared 61 times on Facebook. The same link has been liked more than 160 times.
Both YouTube and Facebook have made strides in detecting terrorist content. Facebook, for instance, claimed last year to have removed 99 percent of “terror content” before it was reported by third parties. In the first three months of 2019, YouTube said it removed 89,968 videos that “violated its violent extremism policy.” The question remains: is what we are finding in Arabic part of the remaining one percent that was neither reported nor automatically detected? Most likely both. And how large is that one percent, given how much content is uploaded to both sites, not only in a day, but in any given hour? YouTube similarly contends that “90 percent of violent extremist videos that were uploaded and removed in the past six months were removed before receiving a single human flag, and of those, 88 percent had fewer than ten views.”
It’s evident that much of the social media platforms’ effort has focused on English-language content. The inability to detect clearly defined terrorist content in Arabic, the digital sphere’s fastest-growing language, is a gaping hole for the companies. YouTube and Facebook should update their manual and automated Arabic-language detection efforts to ensure legacy terrorist content no longer festers across their platforms. This should include the basics: the Arabic names of al Qaeda and ISIS ideologues, and the titles of terrorist strategies, pamphlets, books, and films. While the platforms have been diligent at removing some English-language terrorist content, the same approach has yet to be reproduced in Arabic. Until that basic level of detection is instituted, terrorist content will, in the words of the infamous ISIS refrain, “bakiya, bakiya, bakiya,” or “remain, remain, remain.”