British Prime Minister Theresa May wants to fine Google, Facebook, and other companies for not doing enough to counter extremist content on their platforms, but she has been sharply criticized by her own country’s terrorism legislation watchdog.
Max Hill QC, the government’s appointed expert on counterterrorism, has lashed out against May’s policy, stating: “I struggle to see how it would help if our parliament were to criminalise tech company bosses who ‘don’t do enough’.”
He continued:
“How do we measure ‘enough’? What is the appropriate sanction?
“We do not live in China, where the internet simply goes dark for millions when government so decides.
“Our democratic society cannot be treated that way.”
The demand by Mrs. May and Amber Rudd, Britain’s Home Secretary, that security agencies be given a ‘back door’ into the end-to-end encryption used by Facebook, WhatsApp, Apple, and other companies with messaging apps is seen as a move that could further heighten the risk of hacking.
The planned fine is equally indicative of certain parties within the government failing to understand how security and online content moderation actually work at massive scale.
Google, meanwhile, recently announced a four-step plan to address the problem of extremist content.
There is no doubt that this is a persistent problem that needs addressing, but punitive action against tech companies could push them to over-broaden what they construe as extremist content, at the expense of freedom of speech.
Tech companies are certainly doing their part. Facebook has been working on a slew of tools and algorithm changes, Google engineers have been working on technologies to prevent re-uploads of banned content, and both companies have ramped up their review staff.
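The article does not spell out how re-upload prevention works; one widely reported approach is matching new uploads against fingerprints of content that has already been removed. The sketch below is a minimal illustration of that idea only, not any company’s actual system: the hash set and helper names are hypothetical, and real deployments reportedly rely on perceptual hashes that survive re-encoding, rather than the exact-match cryptographic hash used here.

```python
import hashlib

# Hypothetical store of fingerprints for content that has already been removed.
# Real systems reportedly use perceptual/robust hashes so that re-encoded or
# slightly altered copies still match; a plain SHA-256 only catches exact copies.
BANNED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # example entry
}

def fingerprint(data: bytes) -> str:
    """Return a hex digest identifying the uploaded file's exact bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_banned(data: bytes) -> bool:
    """Flag an upload whose fingerprint matches previously removed content."""
    return fingerprint(data) in BANNED_HASHES

if __name__ == "__main__":
    upload = b"some uploaded file bytes"
    if is_known_banned(upload):
        print("Upload blocked: matches previously removed content.")
    else:
        print("Upload accepted for normal review.")
```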
The big question: what exactly have the governments accusing tech companies of complacency done themselves to help? Have they implemented large-scale education initiatives to teach children about hate speech and how it can destroy a society? Have they invested time and effort in creating more tolerant and inclusive policies that would give them the standing to point a finger at these companies?
The blame game is a dangerous one for governments to engage in, and it can only cause unnecessary fallout in other areas. China is a stark example of where such severe measures can end up. Is that the society we want to live in?
The high-handed approach is simply not going to work. A more cooperative tone is what’s required. Strict action against non-compliance is not the problem here; the problem is the level of compliance that governments expect, and the ambiguity around what “compliance” actually means.
These moves by the EU and the UK open up a Pandora’s box of extremely challenging questions that nobody seems to have an answer for. Rather than take a belligerent stand against the platforms on which such content lives, it would be far more prudent to work with these companies and assist them in tackling the problem, which affects society as a whole rather than a particular group of people.