10 May, 2019 05:02

Facebook accused of promoting terrorism with auto-generated content

After five months of research, a whistleblower says Facebook allows terrorist sympathizers to thrive not only by failing to remove their accounts, but also by auto-creating celebratory videos and pages for terrorist groups.

The National Whistleblower Center in Washington (NWC) has published a 48-page comprehensive study into how Facebook neglects to curb, and even boosts, Al-Qaeda affiliated militants' outreach, arguing that Facebook's lack of action facilitates the recruitment of terrorists.

The study, conducted by an unnamed whistleblower from August through December 2018, showed that out of 3,228 accounts belonging to 'friends' of self-identified terrorists, only 30 percent were removed by the platform.

These "friends" hail from around the world. Those who were singled out for the survey are based in the Middle East, Europe, Asia, and Latin America. Among them, "many openly identified as terrorists themselves and shared extremist content," the executive summary of the report states.

The social media giant, which recently prided itself on taking down 99 percent of Al-Qaeda and Islamic State (IS, formerly ISIS) related content before it is reported by users, actually creates such content itself with its auto-generating tools, according to the report.

For example, it refers to an auto-generated page Facebook "set up" for Al-Shabaab militant group, complete with an ISIS logo. Somalia-based Al-Shabaab has long been associated with Al-Qaeda. In 2015, a splinter group of its fighters pledged allegiance to IS.

The page came about because some users listed their membership in the terrorist group as their occupation, prompting Facebook's AI to create a "local business" page.

Another popular feature embedded in the Facebook algorithm allows for glorification of terrorism and extremism, the report says, referring to "celebration" and "memories" videos. It turns out that the seemingly innocuous feature makes no distinction between a user sharing a video with, say, birthday celebrations, and violence-filled extremist content. One such chilling "celebration" video featuring puddles of blood racked up over 2,000 likes, the report notes.

The "frame" feature that allows users to pledge allegiance to a favorite sports team or express support for a country, has been abused by terrorist sympathizes to pledge allegiance to extremist groups – and Facebook is apparently OK with it, since before a frame can be used publicly it must be approved by a Facebook moderator, the study points out.

The whistleblower was able to establish that at least a portion of users who self-identify as terrorists on their account pages actually are terrorists. Out of 63 profiles that liked the auto-generated "business page" of Al-Qaeda affiliate Tahrir Al-Sham (formerly Al-Nusra) on Facebook, 31 were confirmed by a local NGO to be actual terrorists active in Syria's Idlib province.

The NWC says it has filed a petition with the US Securities and Exchange Commission (SEC) for it to "sanction Facebook for its dishonesty about terror and hate content on its website."

Facebook has come under fire for what some believe was a slow and half-hearted response to the recent mosque attack in New Zealand, which saw an attacker livestreaming the massacre of 50 Muslims. Although the video was taken down, copies spread throughout social media.

Responding to allegations that it aids terrorist supporters, Facebook told AP that it has been investing heavily into stamping out terrorist content and is "detecting and removing terrorism content at a far higher success rate than even two years ago."

"We don't claim to find everything and we remain vigilant in our efforts against terrorist groups around the world," the company said.
