Sunday, November 28

Covid-19 pandemic drives ‘degradation of social discourse’, says researcher

The Delta outbreak has sparked an increased willingness to use derogatory and offensive slurs as part of public discourse, according to a researcher at the University of Auckland.

WhatsApp, Facebook and Instagram apps in a close-up of the screen of an Apple iPhone X smartphone.

Photo: 123rf

The director of the Disinformation Project, Kate Hannah, said terms like Nazism were being used carelessly.

But where has the apparent recent lowering of standards come from?

Those who tune in to the Covid-19 briefings on social media will find posts and videos accompanied by a flood of anti-vax sentiment.

Many of the comments come from new, anonymous accounts.

So could they be bots, in other words, a software program that mimics humans?

Hannah said that was unlikely.

“That is very much grassroots-based,” she said.

“It is organized through the Telegram platform, usually by real people encouraging others to participate in this and obviously there is a real sense of feedback.”

Those who flooded the comment section were able to see the impact they were having in real time, delivering a dopamine hit at a time when many were feeling quite low, she said.

However, it also had the side effect of disconnecting others from what could be important public health messages.

But many of those behind the comments genuinely held their beliefs, Hannah said.

What had also been noteworthy was the hardening of public debate since the August outbreak.

“We have really witnessed a degradation of social discourse, such that really vulgar, obscene, degrading, rude, misogynistic and racist terminology is now simply accepted and used.”

Terms like Nazism, Communism, and Authoritarianism were being thrown around casually, even by those who should know better.

Last week, a prominent member of the public service compared some of the country’s leading scientists to Nazi war criminal and physician Josef Mengele, and a National MP posted on Facebook: “Generations before us have fought against the tyranny of socialism, now it’s our turn.”

Brainbox Institute director and researcher Tom Barraclough said it was important to remember that there were real people behind the comments.

“I think there is a real risk when we talk about things like this that we see everything as an intentional misinformation campaign to do harm, when in reality it is much more likely that there are shades of grey in all this, and many people promoting political messages online in a very strong and coordinated way might not actually have any kind of coherent political ideology.”

Getting away from the computer and reaching out to people was key.

“Get into conversations with people and try to understand where they are coming from, and treat them as if their concerns are sincere and come from a place of prosocial concern,” Barraclough said.

But what couldn’t be ignored was the minority of nefarious bad faith actors trying to drive that debate, he said.

Social media platforms had more work to do to address the speech they were allowing, but no one should expect or want them to act as arbiters of free speech, he said.

Hannah said that while bots weren’t behind Facebook’s stream of comments, they were still playing a role in the proliferation of misinformation.

“We observed a couple of weeks ago that a petition started by a New Zealand-based organisation regarding vaccine mandates was picked up very quickly by different foreign-language Twitter accounts, which were retweeting it in large numbers, which did suggest a bot farm.”

Auckland University Business School Associate Professor Arvind Tripathi said disinformation was not new, as rumours and innuendo have always existed.

But social media had put misinformation on steroids, with the platforms’ own algorithms facilitating that, he said.

“With social media algorithms, the idea is that the things you like, comment on, or otherwise show a preference for will lead the platform to bombard you with more of those things,” Tripathi said.

Barraclough put it another way: “People need to understand that what they see on social media platforms like Facebook, Twitter and YouTube is not a representative image of what everyone is saying or thinking.

“The platforms use AI recommendation systems, which means that they will mainly show you what they think you want to see.

“Social media is not like a window to look out, it’s more like having someone bring you things they think you want to see. That’s crucial to understand when it comes to navigating what you see online.”
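The engagement-driven feedback loop Tripathi and Barraclough describe can be sketched in a toy example. This is a hypothetical illustration only, not any platform’s actual algorithm: it simply ranks posts by how often a user has previously engaged with each post’s topic, so the feed amplifies existing preferences rather than showing a representative sample.

```python
from collections import Counter

def rank_feed(posts, engagement_history):
    """Rank posts by how often the user engaged with each post's topic.

    posts: list of (post_id, topic) tuples.
    engagement_history: list of topics the user previously liked or commented on.
    """
    topic_counts = Counter(engagement_history)
    # Topics the user engaged with most float to the top; topics the user
    # never engaged with sink, regardless of how common they are overall.
    return sorted(posts, key=lambda post: topic_counts[post[1]], reverse=True)

# A user who mostly engaged with vaccine-related posts sees more of them first.
posts = [(1, "sport"), (2, "vaccines"), (3, "cooking"), (4, "vaccines")]
history = ["vaccines", "vaccines", "sport"]
feed = rank_feed(posts, history)
```

In this sketch the two vaccine posts rank first and the never-engaged cooking post ranks last, which is the “someone bringing you things they think you want to see” effect in miniature.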
