21 Aug, 2019 01:57

First they came for the bots: US academics make case for 1984-style silencing of any dissent

With the "Russian meddling" theory of Trump's victory on life support heading into 2020, US academic researchers have heeded the patriotic call and put forth a new definition of "disinformation" that includes inconvenient truths.

Social media platforms must expand their definitions of "coordinated inauthentic behavior" beyond the usual bots-and-trolls model to include conversations about topics harmful to the state if they hope to curb the spread of disinformation on their platforms, a trio of University of Washington researchers insists in a paper released ahead of the 2019 Conference on Computer-Supported Cooperative Work. To help in this quest, the researchers have redefined "disinformation" to include truths arranged to serve a purpose.

Evaluating disinformation is less about the truth value of one or more pieces of information and more about how those pieces fit together to serve a particular purpose.

Such an Orwellian redefinition would include the lion's share of journalism, especially opinion journalism, and sure enough, the researchers have their knives out for those who become "unwitting agents" in the spread of disinfo by writing based on anonymous tips – otherwise known as "reporting."

All it takes is one article on a "conspiracy theory" to cause a rift in society, the researchers warn, as a single story spreads to multiple outlets and then throughout the social media infosphere. Governments, meanwhile, may spend billions of dollars manipulating public opinion over social media, because apparently it's OK to lie, as long as you're helping your country.

The paper tiptoes around propaganda campaigns run by the "good guys," such as the notorious pro-Clinton operation Correct the Record, an acknowledged US effort. New Knowledge, meanwhile, rather than being called out for its fake Russian bot campaign to influence the 2017 Alabama Senate election, is cited as an academic source!

Understanding that bot- and troll-hunting has limited use, the researchers focus on "actors who are not explicitly coordinated and, in some cases, are not even aware of their role in the campaign" – i.e. ordinary social media users with opinions the researchers don't like.

One "case study" examines content "delegitimizing" the White Helmets while neglecting to mention that the group and the publicity surrounding it are, themselves, part of a well-funded western influence operation against the Syrian government (with a sideline in terrorism and head-chopping). The researchers complain that anti-WH voices were not the expected bots and trolls but included "western journalists" and overlapped with "'anti-war' activism" – as if "anti-war" were an artifact of a bygone era when one could, realistically, be against war. They complain that not enough accounts retweeted pro-White Helmets articles and videos – essentially, that the problem was not enough of the right kind of propaganda.

Conspiracy theories especially get under the researchers' skin, as they have trouble untangling "conspiracy pushers" from those following mainstream news and seem incapable of realizing that people looking for answers in the aftermath of a tragedy are inclined to look in multiple places.

The researchers warn their peers not to minimize the effects of Russian "influence operations" in 2016, even if their analysis shows them to be minimal – clearly, they aren't looking hard enough (i.e., if you don't see the effects, it's not that they aren't there, it's that you aren't using sophisticated enough instruments. May we interest you in this fine Hamilton68 dashboard?).

Scientists are cautioned never to allow their hypothesis to color the way they report the results of their experiments. If the lab doesn't show something, it isn't there. But these researchers are not scientists – they, like the New Knowledge "experts" they so breathlessly cite, are propagandists. They are the droids they are looking for. At one point, they even admit that they "wrestl[ed] with creeping doubt and skepticism about our interpretations of [operations promoting progressive values] as problematic – or as operations at all." Skepticism, it seems, lost.

Social media platforms are warned that their current model of deplatforming people based on "coordinated inauthentic behavior" leaves much to be desired. If they truly want to be ideal handmaidens of the national security state, they must "consider information operations at the level of a campaign and problematize content based on the strategic intent of that campaign." It's not whether the information is true, it's where it came from – and what it might lead to – that matters. Such a model would complete the transformation of platforms into weapons in the state's arsenal for suppressing dissent. The researchers acknowledge this might be at odds with "commonly held values like 'freedom of speech'" (scare quotes theirs), but hey, do you want to root out those Russian influence operations or not? We've got an election to win!

If at first you don't succeed, redefine success. None have heeded this maxim better than the Russiagate crowd and their enablers in the national security state, and academic researchers have long provided the grist for these propaganda mills. But the sheer chutzpah of expanding the definition of disinformation to include truths arranged to have an effect – a definition that could include most of journalism, to say nothing of political speeches and government communications – is unprecedented.

Helen Buyniski

Helen Buyniski is an American journalist and political commentator, working at RT since 2018.

The statements, views and opinions expressed in this column are solely those of the author and do not necessarily represent those of RT.
