For years critics have attacked the Big Tech giants for not doing enough to protect users, particularly children and the vulnerable. Over the last week Big Tech has faced a three-pronged legislative and regulatory assault, which has the potential to threaten entire business models.
Earlier this week, the UK government published its long-awaited plans for online safety legislation, setting out new ‘duty of care’ requirements for platforms such as Facebook, YouTube and WhatsApp. The proposed laws will give the UK watchdog Ofcom new powers to block access to online services if it deems that they have failed to do enough to protect users.
The European Commission has also published drafts of two major pieces of tech regulation, which have the potential to place an increased burden on Big Tech to take more responsibility for regulating content.
In the US, Facebook was sued last week by the Federal Trade Commission and a coalition of 48 attorneys general for anti-competitive practices, as authorities try to expand the definition of an illegal monopoly. Following the antitrust case filed against Google in October, this is further evidence that the tide of sentiment is turning against these tech giants – amid a desire at the highest levels to ensure that the digital economy works for the benefit of consumers, not just in the interests of shareholders.
The Online Harms White Paper
In the UK, the Government has published an Online Harms White Paper, with a view to bringing forward new legislation (The Online Safety Bill) next year. This new legislation will target social media websites and search engines (it will not apply to media companies) with multibillion-pound fines if they fail to remove illegal and harmful content from their platforms.
The proposed legislation will create a statutory duty of care. The Government is acting following the death in 2017 of 14-year-old Molly Russell, who took her own life after viewing a stream of self-harm images on Instagram.
Ofcom will be given enhanced powers to enforce the new laws, including the power to issue fines of up to £18 million or 10% of global turnover in the event of non-compliance. The White Paper also raises the prospect of criminal sanctions for senior managers if they fail to comply with requests from Ofcom.
This is a vital step forward in protecting the privacy and rights of all online users, but particularly the most vulnerable (including children), who have had very little support to date. The proposals are also intended to cover material which is legal but may be harmful – such as disinformation – although how this will work in practice remains to be seen.
Meanwhile in the EU…
Proposed EU legislation threatens to break up Big Tech companies if they repeatedly engage in anti-competitive behaviour. This warning comes in the form of two significant new pieces of tech legislation (both Regulations), which were published in draft form earlier this week.
The Digital Markets Act aims to tackle unfair competition in the sector. At the same time, the Digital Services Act will force tech companies to take more responsibility for illegal behaviour on their platforms.
In practice, this means that ‘very large companies’ (which Brussels defines as those with more than 45m users or the equivalent of 10% of the bloc’s population), such as Facebook and Amazon, will have to take greater responsibility for policing what is on their sites – including greater transparency about advertising for users. Failure to do so could result in fines of up to 6% of their turnover.
This is the first significant overhaul of the EU’s approach to the internet in two decades and seeks to ensure that platforms with the largest audience reach – and therefore the potential to cause the most severe harm – are held to a proportionate standard of due diligence obligations.
The Regulations will be directly applicable in EU member states. Given that the UK has left the EU and the Regulations will not be in force before the end of the transition period on 31 December 2020, this legislation will not be directly effective in the UK. It therefore remains to be seen whether the UK will choose to deviate from this framework; given the content of the Online Harms White Paper, significant divergence seems unlikely.
After years of inaction, it is particularly encouraging that global authorities have taken a multi-jurisdictional approach to regulating the social media giants. The proposed powers of enforcement are extensive and undoubtedly pose a threat to Big Tech business models where compliance falls short.
In the UK, it is interesting that Ofcom, previously a regulator widely considered to lack the teeth to take on the media, is proposed to take on significant additional powers and responsibilities to hold Big Tech to account. Introducing a statutory duty of care is a vital step forward and should finally mean that everyone will be better protected online, especially children and the vulnerable.
The proof will lie first in whether the proposed legislation makes it onto the statute book, and then in how effectively it is enforced. For the moment, the signs are very encouraging that the wind may finally be turning against Big Tech and towards fairer, more responsible regulation to protect users.