Google Cracks Down On Removing Defamatory Content Online But Has This Solved The Problem Of ‘Fake News’?

Ben Hobbs 11 Jun 2021

The ability of private information and defamatory allegations to spread exponentially online has always been an issue. Anonymity has always given people the confidence to post things which they would not say in their own name. Add the two together and the subjects of such posts are left in an unenviable position, particularly if those posts appear prominently in search results. These individuals may now be able to take some comfort from Google’s recent actions.

Google’s dominance over online search effectively means that your reputation is largely determined by the first page of results which appear about you. This is not new, and nor is the scourge of websites which allow people to pay to anonymously post private information or defamatory allegations, and then charge the subjects to have that content removed.

Individuals who are written about frequently in the media, or who have a presence online, are likely to enjoy a degree of control over, and resilience to, the search results which appear about them. However, they are not immune from the scourge of these websites, which are likely to have an even greater impact on individuals who do not have such a large online presence.

If people targeting individuals see their posts ranking highly in search results, or if pay-to-remove websites see victims paying what can be substantial sums for removal, the problem – and the harm to victims – will only continue or worsen.

Now, following articles by the New York Times on this topic, Google has introduced a number of measures designed to address the problem, including amendments to its algorithm and the introduction of a ‘known victims’ concept.

For many years, Google resisted calls for it to take more responsibility for, or to involve human intervention in, the search results it delivers, relying instead on the objectivity and independence of its algorithm. Over time, this resistance softened somewhat, notably following the Court of Justice of the European Union’s decision in the ‘Google Spain’ case in 2014. That decision reinforced EU citizens’ right to have links to inaccurate, outdated and irrelevant information removed from search results and came to be known as the ‘right to be forgotten’. Developments outside the courts, such as the 2016 US Presidential election, also had an impact, with Google adopting changes to its algorithm to see “more authoritative content” appear in search results. Its latest measures are a continuation of these steps.

Google says that its recent changes to its algorithm are part of long-standing measures it is taking against sites which are “gaming our system” and appearing higher in results than they should. In determining how prominently a website should appear, Google evaluates, among other things, how many websites link to that site and the quality of those linking websites. Previously, websites charging to remove content could copy and paste defamatory content or private information from other sites in order to exploit Google’s algorithm. The recent changes Google has implemented will make this tactic less effective.
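For readers who want a concrete sense of why copy-pasted content could game a link-based ranking, the sketch below is a deliberately simplified toy model. Google’s actual ranking system is proprietary and far more sophisticated; the site names and the simple link-counting function here are invented purely for illustration of the general idea that duplicated pages can inflate a site’s apparent popularity, and that discounting duplicates removes that boost.

```python
# Toy illustration only: not Google's algorithm. All site names are hypothetical.
from collections import defaultdict

# Hypothetical link graph: each key is a site, each value is the set of sites it links to.
links_to = {
    "news-site.example": {"pay-to-remove.example"},
    "copy-site-1.example": {"pay-to-remove.example"},  # hosts a copy of the same post
    "copy-site-2.example": {"pay-to-remove.example"},  # hosts another copy
    "blog.example": {"news-site.example"},
}

# Pages whose content merely duplicates content already counted elsewhere.
duplicates = {"copy-site-1.example", "copy-site-2.example"}

def inbound_link_score(graph, discount_duplicates=False):
    """Count inbound links per site, optionally ignoring links from duplicate pages."""
    scores = defaultdict(int)
    for source, targets in graph.items():
        if discount_duplicates and source in duplicates:
            continue  # links from copy-pasted pages no longer boost the target
        for target in targets:
            scores[target] += 1
    return dict(scores)

print(inbound_link_score(links_to))                            # pay-to-remove.example scores 3
print(inbound_link_score(links_to, discount_duplicates=True))  # pay-to-remove.example scores 1
```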

The ‘known victims’ concept enables individuals to report to Google that they have been attacked on pay-to-remove sites, or have had explicit or intimate images posted without their consent; Google will then automatically suppress similar content in search results generated for their name. It will be very interesting to see what impact these measures have.

Keeping private information and defamatory allegations out of search results will obviously limit their impact and visibility, but the material will remain out there. If legal action or other steps can be taken to remove it at source, that will plainly deliver the most comprehensive result. Taking action in respect of search results should not therefore be seen as a silver bullet, although it may well address a large part of the problem.

While Google’s announcement about removing defamatory content from search results where websites charge victims for its removal is a positive development, it has not solved the problem of ‘fake news’. It is clear that search results can, do and will continue to contain false and defamatory content, meaning that anyone searching for an individual or company should remain vigilant.