
Social Media may change for good

Following calls for laws that hold social media platforms accountable for user content posted on their platforms, Section 230 in the US, the law that laid the foundation of the modern internet, may be repealed or amended. And that would change the internet forever!

A couple of months ago, The Wall Street Journal reported how Facebook India’s public policy director Ankhi Das (who has since resigned) sparked off a political storm after she and the company allegedly refused to take down anti-Muslim posts by a Hindu nationalist legislator because doing so could damage the firm’s business interests.

Even the recent Bangalore riots were triggered by an allegedly blasphemous Facebook post about the Prophet Muhammad.

The two examples cited above raise a couple of questions about “content moderation” on social media:

  1. Are governments doing enough to hold social media platforms responsible for the content users create and for what the platforms choose to leave up and/or take down?

Let’s try to answer these as we go along.

No law to govern user-generated content exists - Section 230 and the USA

In 1996, the USA wrote the 26 words of Section 230 of the Communications Decency Act that shaped the modern internet and the open forums we see today.

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (Section 230 of the Communications Decency Act of 1996).

At the moment, social media companies cannot be sued over what their users post online, or over the decisions they take on labelling a post or on what they decide to leave up and/or take down.

Section 230 is part of a federal law that protects social media companies from being held liable for content posted by users. In simpler terms: websites themselves are generally not responsible for illegal or offensive things users post on them.

There were some changes in May. Aren’t those enough?

Simply put — No!

In May 2020, after Twitter appended fact checks to several of his tweets regarding voting by mail, Donald Trump signed an Executive Order that, according to the Council on Foreign Relations, would:

“Direct Trump’s administration to consider regulations that narrow the scope of Section 230 and investigations of companies engaging in “unfair or deceptive” practices.”

On the surface, according to the Lawfare blog, the order has just enough grounding (a threat!) to worry the platforms, but it does not hold up to legal scrutiny.

If it did hold up to legal scrutiny, we would already be seeing an ongoing legal battle between Twitter/Facebook and Trump. On October 6, Facebook deleted Trump’s post comparing COVID-19 to the flu and Twitter slapped a fact-check label on it.

Trump merely tweeted that Section 230 should be repealed entirely. Following the recent antitrust report against the Silicon Valley goliaths and that tweet, the US Congress has been reviewing the law even more closely.

Why is it under scrutiny, 24 years later?

We all know that over the past couple of years, hate speech (incendiary content around religion and caste) and fake news (including doctored photos and videos) have risen across the internet, and more often than not, they go undetected. The platforms may choose to leave some of this content up and take some down.

For example, Twitter chose to flag Donald Trump’s tweets, but did not flag a tweet by Iran’s Ayatollah Ali Khamenei that threatened violence against Israel, on the grounds that it was “sabre rattling” and did not violate its terms of service.

All this power in the hands of private players to filter content has left US senators worried about both censorship and the spread of misinformation, with some even saying that Section 230 is outdated.

Democrats (Joe Biden!) take issue with the spread of misinformation without consequences for the sites. Republicans (Donald Trump!) say the platforms use their moderation powers to censor views they do not agree with, removing content rather than staying neutral. And both sides agree that they want to see the social networks held accountable.

Even some industry experts agree that the law should be revisited. Prof Fiona Scott Morton, the Theodore Nierenberg Professor of Economics at the Yale School of Management, told BBC’s Tech Tent podcast:

“[It] allows digital businesses to let users post things but then not be responsible for the consequences, even when they’re amplifying or dampening that speech. That’s very much a publishing kind of function — and newspapers have very different responsibilities. So we have a bit of a loophole that I think is not working well for our society.”

In yesterday’s (28 October) Senate hearing with the CEOs of Facebook, Google, and Twitter on ‘content moderation’ and Section 230:

Facebook’s Mark Zuckerberg was in favour of changing the law: “The internet has also evolved. And I think that Congress should update the law, to make sure that it’s working as intended.”

Twitter’s Jack Dorsey told the committee that Section 230 “is the most important law protecting internet speech” and that its abolition “will remove speech from the internet”.

Google’s Sundar Pichai, like Jack Dorsey, defended the law: “Our ability to provide access to a wide range of information is only possible because of existing legal frameworks like section 230.”

But will repealing Section 230 help at all?

Let’s step back for a second and consider the power these social media companies have. When we sign up for social media, we agree to share our personal data, in turn becoming susceptible to misinformation and, possibly, manipulation. And that is what social media profits from: serving inflammatory content that keeps us hooked.

Even if the antitrust suit against Google goes through and it is no longer the default search engine on the iPhone or in other browsers, we would still use its services most of the time: Gmail, Hangouts, its office suite, and so on. We are so immersed in its ecosystem that even if we want to phase ourselves out, we cannot!

As with Google, we depend on Facebook (including WhatsApp and Instagram), Twitter, and LinkedIn in our daily lives, for multiple reasons.

Long story short, the world has a handful of social tech giants that control and influence the market. If Section 230 is repealed and these goliaths’ business models change to police the content their platforms publish, it may give them even more power. Yes, and it will also kill freedom of speech!

Oh, and not to forget, government officials would be able to exert their influence to strong-arm these companies and dictate how speech should be moderated online.

For a country like India, that is terrifying.

So repealing is out. What about modifying the act?

‘Content moderation’ is still necessary. And we need to hold social media companies accountable because, while most of us use social media positively, there may be some users and groups who use social platforms to radicalize their audiences through fake news and videos, ultimately leading to outrage. Such posts still need to be filtered out.

Traditional media would not be spared for errors of judgement when it came to publishing content, so why should social media be? While all social media platforms have features to monitor content, they still cannot fact-check or monitor every single piece of content. On Twitter alone, there are over 456,000 tweets every minute.

Given the market share the US tech goliaths hold, the influence they have, and the content they choose to leave up and/or take down, there is a need for a standing law that holds them accountable for the content users create, to stop malicious content from creating problems. At the same time, there need to be provisions that protect them from the defamation charges traditional media would face when content is posted in good faith.

A level playing field, in short, is imperative for all media.

What do you think: should there be laws to hold social media platforms and users accountable for the content they create? Leave your thoughts below!

PS: According to AMlegals, even in India there is no specific law that holds the social media giants responsible for the content created on their platforms.

