Engagement-Driven Algorithms are Causing Social Division: Is There a Better Way?

Engagement-driven algorithms dominate social media, influencing opinions and contributing to social division.

The impact of engagement-focused social media algorithms on the deepening of societal divides is increasingly worrisome, especially in Western states like the U.S., where extremist voices are incredibly loud. These algorithms favor content that elicits strong emotional reactions, chiefly anger, fear, and joy, with anger proving the most virally contagious.

Algorithms Increase Anger and Division

A recent study reported:

“Anger is more contagious than joy, indicating that it can drive more angry follow-up tweets and anger prefers weaker ties than joy for the dissemination in social network, indicating that it can penetrate different communities and break local traps by more sharing between strangers.”

Social media algorithms that are optimized to maximize engagement alone are emotion-blind: they promote whatever content provokes the strongest reactions, regardless of whether that reaction is anger or joy. The result is an amplification effect around outrage and divisive issues, deepening divisions within society.
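As a rough illustration of the point above (a minimal sketch, not any platform's actual code), an engagement-only ranker scores posts purely on predicted interaction signals, blind to the emotion driving them. All names and weights here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float   # hypothetical engagement-prediction signals
    predicted_shares: float
    outrage_score: float      # 0..1, from an assumed sentiment model

def engagement_score(p: Post) -> float:
    # Emotion-blind: only raw predicted engagement counts.
    return p.predicted_clicks + 2.0 * p.predicted_shares

def tempered_score(p: Post) -> float:
    # One possible alternative: down-weight outrage-heavy content.
    return engagement_score(p) * (1.0 - 0.5 * p.outrage_score)

def rank(posts, scorer):
    # Highest-scoring posts surface first in the feed.
    return sorted(posts, key=scorer, reverse=True)
```

Under `engagement_score`, an outrage-heavy post with strong share predictions will outrank a calmer post with similar reach; under `tempered_score`, the ordering can flip. The sketch only shows why the optimization target, not moderation after the fact, determines what spreads.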

Current Measures Fall Short

Platforms have tried to humanize moderation, such as Facebook's adoption of Community Notes, which add context explaining why a piece of content is misleading. But these measures are barely a firewall against the viral spread of divisive content that the algorithms themselves reward.

As long as the underlying engagement-driven model goes unaddressed, the cycle of outrage perpetuates itself.

China offers a notable example of an alternative in the form of Douyin (the local version of TikTok), where the state controls algorithms to amplify themes such as “positive energy” and “knowledge sharing,” to instil positive behaviour patterns in young people.

This orientation is a stark departure from its international version, which is characterized by pranks, politics, and polarizing content, serving different social and cultural priorities.

The Impact on Younger Audiences and Society

As more young people obtain their news and cultural understanding from TikTok, engagement-driven algorithms that amplify divisiveness to hold attention are shaping the psyche of a generation. Some have wondered whether the dynamics of TikTok’s content suggest geopolitical motives.

What is clear, however, is the vast disparity between the content environments of Chinese-controlled and international platforms, and, by extension, what that disparity means for society.

The concept of governments controlling social media algorithms, as in China, raises thorny ethical and practical concerns. Heavy-handed regulation in democratic societies risks bias and manipulation by incumbents, and is difficult to enforce consistently across regions.

European regulators have already shaken up tech ecosystems with legislation such as the Digital Services Act, although the effectiveness of these laws in policing social content remains a subject of debate.

So, what’s the solution? Should the U.S. government consider taking control of platforms like Instagram to influence what users see?

Probably not, especially after former President Donald Trump recently joked that he’d make TikTok’s algorithm “100% MAGA” if given the chance. Hardly a reassuring prospect.

Still, there’s a growing argument for greater oversight of what gains traction on social platforms, and for prioritizing more constructive, empathy-driven content that fosters understanding rather than division.

The Dangers of Privately Coordinated Content Moderation

Private ownership, as seen in the shake-ups at Twitter (now X) under Elon Musk, complicates impartial content moderation. Owner-biased algorithms may exacerbate polarization rather than mitigate it.

One root cause is the gap between online personas and real-world relationships: Many people can express hostility to minority groups online even as they have very positive real-life relationships with individuals within those groups.

Algorithmic incentives encourage divisive expression through engagement-driven dopamine hits, fueling social division.

The Unavoidable Status Quo

Curb algorithmic amplification, and perhaps polarization decreases, but the emotional incentives to share content remain. People would still find ways to spread offensive material, just with fewer viral accelerants.

Algorithms, however, underpin the modern internet; removing them entirely would sharply reduce engagement and ad revenue for platforms, which would be detrimental to business. Wholesale removal is therefore unlikely, given the power of tech lobby groups.

In the absence of responsible curators of algorithmic content, whether governments, regulators, or independent bodies, this cycle of amplification around emotionally charged topics appears set to continue.

Therefore, social media users are likely to continue seeing polarizing content, perpetuating the anger and division that run rampant, particularly now that many people use their networks for news and social interaction.

Bottom Line

Engagement-centric algorithms thrive on stirring strong feelings. These algorithms do so by driving endless cycles of outrage that keep users hooked and, in the process, divide societies.

Mohsin Pirzada
Mohsin Pirzada is a freelance writer and editor with over 7 years of experience in SEO content writing, digital…