By Karen Kelly

These days, it’s hard to find a polite disagreement. Friends and family on social media seem to be drifting to—and sharing—stronger and stronger opinions on everything from COVID restrictions to the war in Ukraine to cultural norms.

Merlyna Lim

Researcher Merlyna Lim has found that this is not just a hardening of societal divisions: it is an unintended consequence of what algorithms are designed to do.

“Social media algorithms are not designed to facilitate logical conversation or democracy, but to keep the user engaged as long as possible,” says Lim, a Canada Research Chair in Digital Media and Global Network Society, who tracks the behaviour of social media algorithms and machine learning in her laboratory.

“They want to feed emotions, so the more a post is hated or loved, the higher its score and its propensity to go viral. There’s little to no space for anything in between.”

While that love-hate dichotomy may be great for business, it’s not so great for a society.

“I call these algorithmic enclaves,” says Lim, a professor of Communication and Media Studies (COMS). “We may start out on the political right or left, but if you repeatedly click a love or hate emoji on information that comes from similar beliefs, you’ll likely be exposed to more extreme information over time. You are then drawn into an enclave, an echo chamber of interaction with people who share the same extreme emotion, blind hatred or love.”

Lim’s findings were echoed by Facebook whistleblower Frances Haugen, who testified before the U.S. Senate in October 2021 that the company gives scores to emojis—with the greatest emotional reactions getting bumped up in users’ news feeds.
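The mechanism Lim and Haugen describe can be pictured as an engagement-weighted ranking: each reaction type carries a score, and posts with the strongest emotional reactions rise in the feed. The sketch below is purely illustrative, in Python; the reaction weights and sample posts are invented, not Facebook’s actual values.

```python
# Hypothetical sketch of engagement-weighted feed ranking.
# All weights and post data are invented for illustration;
# they are not the platform's real formula.

REACTION_WEIGHTS = {
    "like": 1,
    "love": 5,    # assumed: emotional reactions outscore a plain like
    "angry": 5,
    "comment": 15,
    "share": 30,
}

def engagement_score(reactions):
    """Sum each reaction count multiplied by its weight."""
    return sum(REACTION_WEIGHTS.get(kind, 0) * count
               for kind, count in reactions.items())

def rank_feed(posts):
    """Order posts by descending engagement score."""
    return sorted(posts,
                  key=lambda p: engagement_score(p["reactions"]),
                  reverse=True)

posts = [
    {"id": "measured-analysis", "reactions": {"like": 200}},
    {"id": "outrage-bait",
     "reactions": {"angry": 80, "comment": 40, "share": 20}},
]

for post in rank_feed(posts):
    print(post["id"], engagement_score(post["reactions"]))
```

Under these made-up weights, the post drawing anger, comments, and shares scores 1600 against 200 for the calmly liked one and is ranked first, which is the love-hate dynamic the article describes: middling reactions simply do not compete.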

That’s a concern when that formula is applied to serious issues such as a pandemic, an election, or a war—especially when 54% of Americans surveyed by the Pew Research Center reported getting news from social media, with Facebook being the most popular site.

As new issues arise in society, Lim finds it doesn’t take long for the algorithmic enclaves to form. She points to the recent protests and occupation of downtown Ottawa.

“There were all of these narratives—anti-vaccine, anti-mandate, anti-immigrant, anti-Semitism, anti-Trudeau—that didn’t fit together. But the far right tapped into those grievances through social media and brought them under the umbrella of freedom and patriotism. They tapped into an emotion, which is what social media thrives on.”

How to Resist the Algorithm

Resisting social media content that is carefully tailored to your strongest emotions is not easy—even for someone like Lim.

“Most of us would think that we are rational, but that’s not always true, myself included. Most of the time, our immediate reaction is generated by emotion, so when it comes to online information, we have to ask whether we like something because it’s true or because we want it to be true. Rejection and acceptance both need to be questioned.”

Lim says the overwhelming amount of emotional content and the speed at which it moves make it almost impossible for consumption to be a “rational, well-informed type of consumption.”

But she says there are ways to approach it more carefully.

“Slow down, slow down, slow down. Speed is the enemy of the healthy consumption of information,” she recommends, adding that people need to get out of their comfort zones and consume news from diverse sources.

Wednesday, March 30, 2022