19 Apr, 2021 21:50

People trust algorithms more than other human beings, study finds. Who do they think programs those computers?

No matter how many algorithmic ‘glitches’ and backdoors we find in our electronics, we can’t seem to put them down. Most of us would argue we need technology, but a growing subset trust it more than people. Why?

In an era when humans are hamstrung by fear of bias, it’s easier to delegate certain tasks to computers, especially if the task is particularly trying or difficult. It’s easy to believe that a seemingly impartial silicon chip must be immune to the social pressures of human interaction in all its forms. But someone has to program the algorithms that make these devices work, and chances are that “someone” has their own set of prejudices and preconceived notions that they don’t check at the door when they arrive at work every day.

That’s not necessarily a problem - humans are biased creatures, and the sooner we learn to live with that reality instead of hiring con artists to beat it out of us with a sledgehammer (or PR flacks to convince the world other forms of it don’t exist), the less stressful our lives will be. But computer modeling is vulnerable to deadly flaws, and grasping at silicon straws isn’t going to save us. Oxford University scientists may work day and night on ethnically-based algorithms to determine how likely an innocent individual is to contract a serious disease over the course of their life, but instead of concentrating on protecting the vulnerable group against that disease, the UK government seems to be focusing on marginalizing these people, a strategy whose uselessness was already proven during World War II, to the tragedy of all involved.

Lest we believe such deadly algorithms are a thing of the past, Google is still using its “Machine Learning Fairness” algorithms to shove a warped vision of “equality” down our throats.

A study published last week by the University of Georgia found humans are relying more and more on computer algorithms “as a task gets harder.” Additionally, “that effect is stronger than the bias towards relying on advice from other people,” study supervisor Eric Bogert said in the writeup, published in Nature’s Scientific Reports.

The results are troubling, not least because once humans get started down the slippery slope of depending on computers for everything, they quickly abandon human interaction as too troublesome, too difficult, fraught with too many unseen hazards or potential offenses, and in general too much work. Reality doesn’t necessarily measure up to this viewpoint, and the real danger for humanity is where it collides with this wishful thinking.

The benefits of technology can be seen wherever we rest our eyes, and we undeniably move more smoothly through space and time. But we’ve also seen the harms it can do, whether it’s creating an entire generation of overmedicated children incapable of sitting still for more than five minutes before demanding their screens, turning their parents into twitchy addicts afflicted with (but in denial of) the same condition, or producing empathy-free sociopaths who simply find real life dull in comparison to the myriad options available in virtual reality and don’t understand that the figures they interact with in real life are living, breathing human beings. Many of us, it seems, have been ‘spoiled’ by technology and can no longer find joy in the comparatively slower pace of normal life.

Some might argue this is being done on purpose - humans becoming inexorably, permanently dumbed down and wedded to technology that is growing ever more expensive, complex and probing, giving whoever owns the backdoors to the beloved software of the moment an unthinkable level of access to people’s inner thoughts and motivations. Controllers of these companies can easily obtain the secret thoughts and behavioral patterns of an entire population without having to resort to time-consuming methods like interrogation, and - perhaps more importantly - the path from user to technology is not a one-way street. These extremely powerful corporations don’t just slurp up the ideas they find inside the user’s mind; given enough time, they can start feeding “nudges” and other suggestions to the individual, gaining ever more control over every thought that passes through our heads the longer the user interacts with the program.

Why would a smartphone or laptop user permit such violation of their inner thoughts? Surely they’ve read enough literature about the dangers of these devices over the last several decades. Is it a form of Stockholm Syndrome? Have we learned to love our silicon captors because we believe that in voluntarily relinquishing control of our own thoughts to a circle of computerized devices we are benefiting from their digital assistance?

Humans certainly realize they’re dependent on their digital drug. For an example of just how far we’ve gone down the rabbit hole, take away the iPhone from a millennial living in a trendy New York City neighborhood after a few rounds of drinks. They will panic, utterly incapable of finding their way home without the helping hand of Siri or an Uber. Few are willing to get into the driver’s seat of their car without Google or Siri charged up and ready to guide them. Watching a modern couple on a date is somewhat depressing, as both halves of the couple spend 90% of the time gazing at their phones rather than each other. This is all considered normal.

Some would argue we’re learning to trust the bots more than humans because we benefit from a growing symbiosis with them. Certainly this seems to be the case with young ‘early adopters,’ whose popularity is measured by the number of devices they flaunt on their limbs, however secretly embarrassed they must be when their Amazon Halo delivers its stern writeup of how they gained X amount of weight this week and how their conversational tone has become notably more shrill. Younger still, preteens can rake in hundreds of thousands of dollars dancing and lip-syncing in seemingly pointless mimicry to the same 15-second clips of pop songs for millions of TikTok watchers - lest anyone think 15 minutes of fame was too tedious.

Ultimately, our devices know us better than anyone else. It is these silicon parasites that develop their own identities based on ours, becoming a source of comfort, a second self to talk to when we’re lonely and haven’t taken the time to go out into the world and develop real friends (ironically, because we’ve been spending too much time at home on our laptops). The “second you” never makes any demands and can be shut off when it wants too much attention, making it the perfect friend, as you always know what it’s going to do next.

This isn’t how human bonds are supposed to work - though don’t expect to learn that from watching the people around you. More and more, this is how people abuse their relationships. What, you don’t want to sit there and watch me unburden myself for three hours and then be ignored for the rest of the night while we stare at a screen together? Clearly that’s your problem, not mine. That attitude is the real purpose of all this up-with-robots, down-with-humans argument. It takes a truly callous individual to keep doing the wrong thing while knowing they’re treating a person like an object and preventing them from achieving their goals in life. We humans even have a saying for it: the definition of insanity is doing the same thing over and over and expecting different results. But robots and algorithms don’t have this problem, unless it’s programmed into their software. And how would such an app be marketed? “Dysfunctional Girlfriend”? Not a big money-maker, that one.

But far from solving that problem, humanity is only rushing headlong further into it. Today’s generation, raised amid omnipresent mask-wearing, distance learning, and a society that encourages digital narcissism, is growing up without the ability to recognize emotional cues in its surroundings, in a world wholly unprepared to teach them how to understand the signs of fear, love, hate, and other emotional indicators out in the wild.

Computers, on the other hand, don’t have such pesky emotional underlayers for their interlocutor to translate - what you see is what you get. Even humor is beyond the range of Alexa’s or Siri’s capabilities - AI can’t understand or tell jokes, though it can repeat quips someone else has written without grasping the humor within. On the bright side, this is what sets us humans apart from the silicon brutes - just try having a chuckle with Alexa, and you’ll be running to find a human friend in minutes - only to meet hordes of half-human, half-Alexas all solicitously asking you “what’s wrong?” without caring at all for the answer, like something out of Invasion of the Body Snatchers.

Putting one’s trust in a computer may be the route toward certitude, predictability, and security. In a world where humans are conditioned to reach for the path of least resistance, many of us may want to skip over the nuances of learning human emotion - after all, aren’t humans those nasty things that spread disease and spill things and otherwise demand attention? Sounds too much like work. But if we’re truly trusting computers more than humans, that means we’re jettisoning one of the key puzzle pieces that make up our consciousness. Before we throw it away, we might want to reconsider what a tedious life a Siri or an Alexa truly lives - when humans aren’t entertaining it, that is. Be careful what you wish for.

The statements, views and opinions expressed in this column are solely those of the author and do not necessarily represent those of RT.
