4 Sep, 2021 07:24

Facebook apologizes after its AI put ‘primates’ label on video about black men

Facebook has promised to fix its auto-generated recommendation system after the company’s AI linked a video that heavily featured black men to primates, shocking viewers.

“We apologize to anyone who may have seen these offensive recommendations,” Facebook said in a statement to the media, adding that the entire topic recommendation feature has been disabled. 

“As we have said, while we have made improvements to our AI, we know it’s not perfect, and we have more progress to make,” the statement continued.

Darci Groves, a former content design manager at Facebook, said a friend sent her a screenshot of a video featuring black men, which included the company’s auto-generated prompt asking viewers if they wanted to “keep seeing videos about primates.” 

The video, uploaded by UK tabloid the Daily Mail in June 2020, contained clips of two separate incidents in the US – one of a group of black men arguing with a white individual on a road in Connecticut, and another of several black men arguing with white police officers in Indiana before being detained.

According to The New York Times, Groves posted the screenshot in question to a product feedback forum for current and former Facebook employees. A product manager then called the recommendation “unacceptable” and promised to investigate the incident. 

Last year, Facebook formed a team at Instagram to study how different minority users are affected by algorithms. The move was made after the social media giant was criticized for overlooking racial bias on its platforms. 

Algorithms used by tech giants have come under fire for embarrassing mistakes in the past. In 2015, Google apologized after its Photos app labeled a picture of black people “gorillas.”


