Covid-19 Delta outbreak: pandemic has led to a “degradation of social discourse”, researcher says


The Delta outbreak has triggered a greater willingness to use derogatory and offensive slurs as part of public discourse, according to a University of Auckland researcher.

The head of The Disinformation Project, Kate Hannah, said terms like Nazism were being used without a second thought.

But where does the apparent recent decline in standards come from?

Anyone who logs into the Covid-19 social media briefings will find posts and videos accompanied by a flood of anti-vax sentiment.

Most of the comments come from new anonymous accounts.

So, could they be bots – in other words software that mimics humans?

Hannah said it was unlikely.

“It’s very grassroots-driven,” she said.

“It’s organized through the Telegram platform, usually by real people encouraging others to take part, and there’s obviously a real sense of payback.”

Those flooding the comments section could see the impact they were having in real time, providing a hit of dopamine at a time when many were feeling quite powerless, she said.

However, it also had the side effect of distracting others from what could be an important public health message.

But many of those behind the comments would genuinely stand by their beliefs, Hannah said.

What had also been remarkable was the coarsening of public debate since the August outbreak.

“We have really seen a degradation of social discourse – therefore an acceptability of the really vulgar, obscene, derogatory, crass, misogynist and racist terminology that has just been used.”

Terms like Nazism, Communism, and authoritarianism were used casually, including by those who should know better.

Last week, a prominent member of the civil service compared some of the country’s top scientists to Nazi doctor and war criminal Josef Mengele, and a National MP posted on Facebook: “Generations before us have fought against the tyranny of socialism, now it’s our turn”.

Brainbox Institute director and researcher Tom Barraclough said it was important to remember that behind the comments were real people.

“I think there’s a real risk when we talk about things like that, that we just see it as an intentional disinformation campaign to do harm, when it’s actually much more likely that there are shades of grey through all of this. For a lot of people delivering political messages online in a very strong and coordinated fashion, they may in fact not have any sort of cohesive political ideology.”

Getting away from the computer and reaching out to people was key.

“Get into conversations with people and try to understand where they’re coming from and treat them like their concerns are genuine and come from a place of pro-social concern,” Barraclough said.

But the minority of notorious bad-faith actors trying to steer such debate could not be ignored, he said.

Social media platforms had more work to do to respond to the speech they were allowing, but no one should expect or want them to act as the arbiter of free speech, he said.

Hannah said that while the bots weren’t behind the torrent of comments on Facebook’s live streams, they still played a role in the proliferation of disinformation.

“We observed a few weeks ago that a petition launched by a New Zealand-based organization on Change.org regarding vaccine mandates was very quickly picked up by various foreign-language Twitter accounts which retweeted it in volume, which really did suggest a bot farm.”
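The pattern Hannah describes – a sudden, high-volume wave of retweets from many accounts that share little with the petition’s natural audience – can be sketched as a simple heuristic. The snippet below is purely illustrative: the field names, thresholds and the `flag_possible_bot_burst` function are hypothetical and do not reflect The Disinformation Project’s actual methods.

```python
from collections import Counter
from datetime import timedelta

def flag_possible_bot_burst(retweets, window=timedelta(hours=1), min_volume=200):
    """retweets: list of dicts with 'account', 'language' and 'timestamp' (datetime) keys."""
    retweets = sorted(retweets, key=lambda r: r["timestamp"])
    flagged = []
    for i, first in enumerate(retweets):
        # Collect every retweet that falls inside the window starting at this retweet.
        in_window = [r for r in retweets[i:]
                     if r["timestamp"] - first["timestamp"] <= window]
        if len(in_window) < min_volume:
            continue
        accounts = {r["account"] for r in in_window}
        languages = Counter(r["language"] for r in in_window)
        # Many distinct accounts, dominated by a language unrelated to the
        # petition's audience, all inside one short window: worth a closer look.
        if len(accounts) > 0.9 * len(in_window) and languages.most_common(1)[0][0] != "en":
            flagged.append((first["timestamp"], len(in_window), dict(languages)))
    return flagged
```

Real analysis would go further – account age, posting cadence, shared profile templates – but volume inside a narrow window is the signal Hannah points to here.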

Arvind Tripathi, associate professor at the University of Auckland Business School, said disinformation was not new because rumor and innuendo had always been around.

But social media has put disinformation on steroids, with the platforms’ own algorithms facilitating that, he said.

“So with the social media algorithm, the idea is that the things that you like or comment on or that you show a preference for, they’ll bombard you with more of those things,” Tripathi said.

Barraclough put it another way: “People need to understand that what they see on social media platforms like Facebook, Twitter and YouTube is not a representative image of what everyone is saying or thinking.

“The platforms use AI recommendation systems, which means they will mainly show you what they think you want to see.

“Social media isn’t like a window to look out of, it’s more like someone bringing you things they think you want to see. That’s crucial to understand when it comes to navigating what you see online.”
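As a rough illustration of the engagement-driven ranking Tripathi and Barraclough describe, the sketch below scores candidate posts against a user’s past interactions so that “more of those things” float to the top of the feed. The topic labels, scores and function name are invented for the example and are not any platform’s real algorithm.

```python
from collections import defaultdict

def rank_feed(posts, engagement_history):
    """posts: list of (post_id, topic); engagement_history: topics the user liked or commented on."""
    # Build a simple preference score per topic from past engagement.
    preference = defaultdict(float)
    for topic in engagement_history:
        preference[topic] += 1.0

    # Score each candidate post by how closely it matches past engagement,
    # so the feed keeps serving more of what the user already reacted to.
    scored = [(preference[topic], post_id, topic) for post_id, topic in posts]
    scored.sort(reverse=True)
    return [(post_id, topic) for _, post_id, topic in scored]

# A user who engaged twice with vaccine-sceptic posts sees similar posts ranked first.
history = ["vaccine-sceptic", "vaccine-sceptic", "rugby"]
posts = [("p1", "public-health"), ("p2", "vaccine-sceptic"), ("p3", "rugby")]
print(rank_feed(posts, history))  # p2 first, then p3, then p1
```

The point both researchers make follows directly: nothing in such a ranking measures what people in general are saying, only what this user has already engaged with.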

