18 Apr, 2021 10:49

Humans now trust algorithms more than each other, according to new research

Researchers from the University of Georgia have conducted a study that confirms what many already suspected: humans now tend to trust algorithms more than each other, especially when it comes to tedious tasks.

The premise of the study was simple: some 1,500 participants were shown photos and asked to count the number of people in them. 

To complete the task, which involved images containing anywhere from 15 to 5,000 people, participants could take suggestions from a computer algorithm or rely on the averaged guesses of their fellow humans.
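To make that setup concrete, here is a minimal sketch of the two kinds of advice on offer. The numbers and function names are purely illustrative assumptions, not data or code from the study itself.

```python
# Toy illustration of the choice participants faced: take the average of other
# people's guesses, or accept a single algorithmic estimate. All values are made up.
from statistics import mean

def crowd_estimate(human_guesses):
    """Average of fellow participants' guesses (the 'wisdom of the crowd')."""
    return mean(human_guesses)

def algorithm_estimate(detected_count):
    """Stand-in for an automated head count; here it is simply a given number."""
    return detected_count

# Hypothetical guesses for one photo containing roughly 1,200 people
human_guesses = [950, 1100, 1400, 1250, 1000]

print("Crowd average:", crowd_estimate(human_guesses))  # -> 1140
print("Algorithm:", algorithm_estimate(1185))           # -> 1185
```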

As the crowd size or complexity of the task increased, the participants, understandably, relied more and more on the algorithm to count the people. After all, computers are especially good at tedious tasks that humans shy away from, such as counting.

“It seems like there’s a bias towards leaning more heavily on algorithms as a task gets harder and that effect is stronger than the bias towards relying on advice from other people,” says management information systems PhD student Eric Bogert, from the University of Georgia.

The researchers concede that, in this particular task at least, the answer is unambiguous: a count is either right or wrong, and that lack of nuance or perspective makes the task well suited to an algorithm rather than a human.

“This is a task that people perceive that a computer will be good at, even though it might be more subject to bias than counting objects,” says Aaron Schecter, an information systems researcher from the University of Georgia.

However, the researchers emphasized that our perception of how accurate an algorithm is plays an important role: outsourcing a task to a machine can allow bias and discrimination to creep in without the human participants realizing it.

“One of the common problems with AI is when it is used for awarding credit or approving someone for loans,” Schecter says. 

“While that is a subjective decision, there are a lot of numbers in there – like income and credit score – so people feel like this is a good job for an algorithm. But we know that dependence leads to discriminatory practices in many cases because of social factors that aren't considered.”

Algorithms already dictate huge portions of human activity: stock-market trading, social media feeds, and online marketplace pricing are among the myriad tasks deemed too tedious for humans. But therein lies the potential for disaster, as this latest research highlights.
