Social Media Companies Are Acting Like Publishers, So They Should Expect To Be Regulated as Publishers

By now, we’re all starting to accept that the volatility of 2020 did not, unfortunately, end with the flip of the calendar. Nothing demonstrated that more than the attack on the US Capitol, and since then, it seems battles are raging in just about every corner – even on Robinhood and Wall Street. Social media, often a battleground in and of itself, became the subject of a new, heated debate when Twitter and Facebook suspended Donald Trump’s accounts, and Apple, Google and Amazon Web Services followed by refusing to host or support sites such as Parler. This gave new life to the big question – platform or publisher? The distinction is important as it has far-reaching global implications.

While these social media companies seek to define themselves as mere platforms, by moderating and removing content (and the people who post it) they are actively taking control of the content and content providers on their websites and making editorial decisions, just as a newspaper or online news source would. In other words, they are acting as publishers.

In life, as in law, it’s not just what you say that matters; it’s also what you do. And ultimately, what social media companies are doing – even though it differs from what they are saying – means they are likely to become subject to regulation in areas where, until now, they have largely been left to regulate themselves.

Regulation of this sector has been slow to arrive, but demands for increased regulation are clearly picking up pace from governments and societies around the world. Initially there will no doubt be a patchwork of different regulations, covering different activities and enforcing a variety of ideals. Broadly, we anticipate that these will dictate how Big Tech should apply its editorial power in a fair and consistent manner, determine the guardrails of that power and what people can do when they feel it has been used improperly, and address what happens when an individual or institution has been harmed by content published by these publishers.

Currently, for platforms based solely in the US, Section 230 of the Communications Decency Act generally provides website publishers with immunity from liability arising out of third-party content. While jurisdiction and enforcement have historically protected them there, the world is changing and becoming smaller, and Big Tech can’t hide behind that shield forever.

The UK’s Online Harms Bill, currently scheduled to be finalised and come into force later this year, will impose a duty of care on social media platforms to protect users. Platforms will be required to assess what content could be harmful to users, even if it is not illegal, and take steps to protect users from that content. This will apply to companies around the world wherever users in the UK access their content. The Bill will include powers to impose large fines and to block companies from offering their services in the UK. Meanwhile in Europe, the EU is looking to take the Digital Services Act and Digital Markets Act forward. These focus on content moderation and self-preferencing respectively, and both proposed Acts carry substantial fines.

This is where the debate gets interesting. There is a publishing spectrum. At one end is the telephone – essentially just a platform. Anyone can say anything damaging about someone on the line, and the telephone company could never be implicated because it did not edit or control that content; it merely transmitted the conversation. At the other end is a fully edited newspaper, where every word is written by an employee of the company and sits entirely under the company’s editorial control. ISPs are more akin to telephone operators, and social media companies’ claim that they are just platforms has historically placed them closer to telephone companies than to newspapers on this spectrum. But when Big Tech makes significant editorial decisions, such as removing the US President and the Iranian leader from its services while leaving the leaders of other countries active, it shows that these companies actually sit much further along the spectrum, much closer to newspapers. Society needs clarity on how their editorial decisions are made, along with consistent guidelines and opportunities to object to those decisions.

So why would Big Tech opt in to being a publisher, and therefore become subject to a publisher’s regulation? In short, to protect their reputations, and because they have no choice if they want to maintain viable businesses that the bulk of society wants to be part of. Facing regulation is still better than being shut down altogether or becoming a fringe platform. Public opinion is also changing and starting to demand that these companies take responsibility for their actions. It is undoubtedly preferable to have democratically elected governments regulating these publishers after debate, public consultation and input from Big Tech itself, rather than allowing a few billionaires to decide what the rules are, how those rules are enforced and when they should be changed.

As the ancient Chinese curse says: may you live in interesting times. Well, we certainly do, with all the conflict and complexity that the Internet, Big Tech and social media bring. Whether you’re in tech, in law or a social media commentator of any sort, things are getting interesting – and that will only continue as new leaders and new countries weigh in and new legislation starts to be passed and enforced. We’ll be watching closely and will be here to advise, provide counsel and keep you informed every step of the way.