13 Aug, 2017 08:18

YouTube’s ‘extremist-tackling’ technology misses mark by removing wrong content

YouTube and parent company Google’s efforts to crack down on extremist content have resulted in embarrassment after the platform’s technology targeted videos from monitoring groups, journalists and the Syrian opposition, amongst others.

A number of accounts, such as those of Airwars, Bellingcat, Middle East Eye and Orient News, have had videos removed or been suspended as a result of past videos deemed ‘extreme’ by YouTube’s machine learning.

Earlier this month, YouTube shared an update on its “commitment to fight terror [sic] content online,” boasting of “better detection and faster removal” thanks to “cutting edge” machine learning.

The post reported that more than 75 percent of videos removed for “violent extremism” over the previous month were taken down “before receiving a single human flag.” However, the absence of human review has led to inaccurate removals.

The airstrike monitoring group Airwars reported that three of its videos were removed this week: footage of a 2015 coalition airstrike near Ar-Rutbah, Iraq; a 2016 coalition strike on an Islamic State (IS, formerly ISIS) vehicle near Manbij, Syria; and a 2016 coalition strike on an IS barracks near Hit, Iraq.

The videos were later reinstated after consultation with YouTube, but now carry an 18+ age restriction.

“Crucial video archives of recent conflicts are being purged,” Airwars tweeted.

Channels targeted by the company’s technology for “extremist” content receive three strikes for flagged material before being removed from the platform.

In June, Google announced its plans for tackling extremism, pointing to increased use of machine learning, more independent human experts as part of YouTube’s Trusted Flagger program, a tougher stance on controversial videos, and expanded counter-radicalisation efforts.

The Trusted Flagger program includes the help of NGOs and organizations including the Anti-Defamation League (ADL), the No Hate Speech Movement and the Institute for Strategic Dialogue.

The ADL was founded to stop the defamation of Jewish people, but has been accused of labeling any criticism of Israel, including that coming from Jewish groups, as “anti-Semitism.”

Some of the removed content includes evidence used in the Chelsea Manning court case [a portion of an As Shahab video] and footage of IS destroying ancient artifacts.

In July, the company drew criticism when it removed a number of accounts relating to the Syrian war, including that of the state-run Syrian Arab News Agency.

A number of channels tied to rebel groups fighting the Assad government were also removed.
