Over the past few weeks, the words ‘Online Safety Bill’ have been gaining traction, with the promise of a safer way to use social media and the internet rippling through the media. Last week’s testimony to the UK Parliament from Frances Haugen – the so-called Facebook whistleblower – has given us incredible insight, and food for thought, on just what should be prioritised in the new Bill – and on what good could look like for Facebook, and perhaps for social media more generally.
Frances Haugen, an American data engineer and product manager, has earned herself the title of ‘Facebook whistleblower’ thanks to her commitment to exposing what she says are the inherent flaws in the social media platform, including its repeated and institutionalised prioritisation of profit over user safety. Following testimony to the US Congress earlier this month on the subject of social media regulation, on Monday 25 October the former Facebook employee gave evidence to a British parliamentary committee and shared her views on how the UK’s draft Online Safety Bill could be expanded.
Ms Haugen’s evidence only confirmed the urgent need for regulation in this space – and the Online Safety Bill may just be the blueprint we need.
What is the Online Safety Bill?
The Online Safety Bill is part of the UK government’s aim to ‘make the UK the safest place to go online’ – a bold claim and an optimistic objective. The 145-page draft legislation follows the ‘Online Harms White Paper’, which was set out by the Home Office and the Department for Digital, Culture, Media & Sport in April 2019. The Bill would place a duty of care on social media companies towards their users: they would have to remove harmful content quickly from their platforms or face a hefty fine from Ofcom. The emphasis throughout is on the protection and safety of children, but the Bill also extends to protecting users from scams and fraud. If approved, the Online Safety Bill would be a landmark piece of global legislation intended to combat the worst excesses of social media and the internet.
Frances Haugen, Facebook whistleblower
As the Bill is still only in draft form, there is scope to widen its remit – and this is where Ms Haugen comes in. Her compelling testimony exposed what she believes to be Facebook’s failings and ways of working, the key issues with the current technology, and what should be prioritised when it comes to regulation. One major concern she highlighted was Facebook’s use of algorithms, which she says structurally amplify ‘contentious’ material such as fake news, misinformation and election interference, and significantly contribute to the toxicity of social media by ensuring the spread of the most toxic content.
In addition, the word ‘harm’ was on repeat, with Ms Haugen pointing out the real risks social media poses, especially to young people: addiction, as well as physical and mental health problems. Despite the significant concerns she brought to light, Ms Haugen was clear that she does not think social media is fundamentally evil – but she knows there is substantial room for improvement. Like us at Schillings, Ms Haugen believes there is a positive role for social media in the world, but that using it shouldn’t come at the price of our safety, security and privacy.
The problems we are experiencing are not unavoidable or necessary by-products; instead, Ms Haugen says, they are the direct result of Facebook’s profit-first approach, which means that possible solutions to safety concerns are never properly explored or implemented.
Ms Haugen wants to help fix Facebook – so as well as focusing on what she says is wrong, in her testimony she also suggested how things could be made right. She reassures us that there is a way for social media companies to make our experience one of minimised harm and increased enjoyment – but what would this look like?
- More enjoyment, less profit: The biggest potential change is that a safer, more enjoyable experience on Facebook would be one that reduces addiction, reduces time spent on the platform and therefore reduces profit for Facebook – at least in the short term. Ms Haugen believes that Facebook would be a more profitable and sustainable company in the long term if its product were safe and enjoyable for users and wasn’t causing them such serious harm. According to Ms Haugen, by making changes now – changes which may mean less profit, but would guarantee an improved, safer user experience – the popularity of the platform should grow.
- The algorithm would be a thing of the past: Ms Haugen remarked that anger is the easiest mood to generate, and more likely than, say, empathy to create engagement. With no algorithm favouring the most ‘engaging’ content – which, in reality, is the most contentious and anger-generating content – we’d see more human-led and meaningful content from family and friends, and fewer AI-curated or sponsored posts. This, in turn, would minimise the harmful and intentionally contentious posts in our feeds, so we’d spend less time on the platform, get less toxic news and (hopefully) be less angry and less addicted to our screens.
- Viral misinformation would be reduced and slowed down: with a limited share function, a post could only be reshared twice in a chain, not thousands of times; beyond that, users would have to click on the link and reshare the original piece themselves. This would increase the chance of them reading it and create a human hurdle to the rapid spread of misinformation. Twitter has already moved in this direction and accepted the cost of change, for example by encouraging users to read a link before they retweet it, but Facebook doesn’t currently do the same for sharing posts (of course, limiting sharing would decrease Facebook’s profits). If Facebook were to take a similar approach to Twitter and change the way we share posts, we’d be more informed and get to vet these viral posts, leading to a more fact-based and thorough approach to sharing content.
- Our data would be used – for good: Facebook holds a lot of data. We know this and accept it as an inevitable part of using social media. However, this data could be put to positive use. For example, using lookalike tools, Facebook could share data about people at risk of self-harm with charities, which could then help those people. Currently, these data tools are used solely to target ads more closely and make more money, not to make the world a better place – but this could change, so that those who need help the most are exposed to opportunities to get that help.
- Facebook employees could raise concerns: When working for Facebook, Ms Haugen saw content which made her concerned for national security, but she was unable to escalate it due to a lack of resources. In her testimony, she said that employees were given specific metrics and focused on moving those metrics rather than spending time on issues that sat outside them – and almost exclusively, the metrics were focused on amplification and growth. To compound the situation, rewards and promotions at the company were apparently given for meeting these metrics. With regulation and mandatory requirements, the metrics would expand to include safety, privacy and security; employees would be rewarded for meeting these and creating positive change, and would have both the capacity to escalate concerns and the resources to address them.
Ms Haugen’s contributions could indeed shape the remit of the Bill – and her vision of a safer Facebook experience could become a reality. The future of our online safety, and the direction of social media regulation in the UK and beyond, may have just changed thanks to this whistleblower.