A damning report by the UK House of Commons Home Affairs Committee claims social media companies are failing to protect users from hate speech and illegal content, and are helping extremist groups to profit.
The committee launched an inquiry into hate crimes, hate speech and the rise of extremism online following the murder of Jo Cox MP in the lead-up to the Brexit referendum last June.
Tech giants including Google, Facebook and Twitter were all interviewed as part of the investigation into online extremism and its proliferation across social media.
The report concluded that social media companies profit from their failure to protect users effectively.
“There may be some lasting financial implications for Google’s advertising division from this episode,” the committee notes.
The report heavily criticized Google for discrepancies within its own policies and the failure to effectively enforce them.
The report says Google is highly efficient at enforcing copyright rules yet strikingly inept at moderating extremist, hateful or illegal content to anywhere near the same standard, even after the committee raised multiple specific examples during its investigation.
In addition, the inquiry alleged that Google was directly reaping “substantial profits from hosting dangerous and illegal content.”
“One of the world’s largest companies has profited from hatred and has allowed itself to be a platform from which extremists have generated revenue,” the report stated.
The committee also referred to the Metropolitan Police’s Counter Terrorism Internet Referral Unit (CTIRU), which is responsible for monitoring social media for hate speech and extremist content in the UK.
The committee described this arrangement as outsourcing a major compliance and security function of these corporations to the British government, free of charge.
“Google, Facebook and Twitter are expecting the taxpayer to bear the costs of keeping their platforms and brand reputations clean of extremism,” the report said.
The Home Affairs Committee also claims that Prime Minister Theresa May’s call on 18 April for a general election has hampered its work and prevented it from establishing clear guidelines to combat the issue of hate speech.
It made several recommendations which it hopes will be implemented once a new Parliament is formed in June.
“We recommend that the Government consult on a system of escalating [financial] sanctions to include meaningful fines for social media companies which fail to remove illegal content within a strict timeframe.”
The committee also called for quarterly reports from social media companies detailing the number of hate crime incidents reported on their platforms, as well as greater transparency about the measures they take to combat the proliferation of “illegal and dangerous content.”
In April, the German government approved plans to fine social media companies up to €50 million ($53 million) for failing to remove hateful or illegal content in a timely manner.
“It is the wrong approach to make social networks into a content police,” said the head of the Digital Society Association consumer group, Volker Tripp, as cited by Reuters.