The most disturbing aspect of the Home Secretary’s howlingly incompetent maunderings on Sunday’s Andrew Marr show wasn’t her view of encryption. Although the idea of banning or compromising strong crypto is disturbing, it is vanishingly unlikely that it will come to pass. Whenever similar ideas have been floated in the past, the culprit has inevitably been dogpiled by anyone and everyone who understands security — and the arguments have not changed.
More worrying was the suggestion that those who ‘understand the necessary hashtags’ should prevent problematic material being uploaded to the Internet. This is a sinister proposal, which could have dire implications if allowed to develop. Yet the Guardian has come out in favour of it, and there is a growing chorus of voices calling for increasing censorship of the online world. In this week’s PMQs, the PM confirmed that she wishes to see tech companies do more to combat extremist material online.
The Daily Mail ran a splash last Friday bemoaning the ease with which they could use Google to access instructions on how to use vehicles to run people over and calling Google ‘the terrorists’ friend’, implying that they are to blame. This is a transparently atrocious argument: if an intern at a newspaper can find that material so easily, then certainly it is reasonable to assume an aspiring terrorist could do likewise. But the same applies to the police or the security services. And it is already an offence under section 2 of the Terrorism Act to disseminate such material. So if it is there, it is trivially accessible, and it is criminal, then it is a failure in policing that nothing has been done about it.
Now the obvious response is that the volume of such material is such that it would be a drain on the police and security services to have to constantly deal with it. But the same applies to the tech companies: given their tiny profit margins, it will inevitably be algorithms rather than human beings which are assigned to detect problematic material. Out of charity to Rudd, we might assume this is what she meant by ‘the necessary hashtags’.
Finding extremist content is hard
Algorithms are blunt instruments when it comes to censorship: we’ve all heard of the Messrs Babcock and Lifshitz who can’t sign up for online services, or of companies in Middlesex who find their URLs on blacklists. Claire Perry MP had her own website blocked by the porn filters she campaigned for, presumably owing to the density of certain keywords on her blog.
Even more sophisticated algorithms are unlikely to be perfect. We have not yet reached the stage where machine understanding is an adequate substitute for real human intelligence. And even if the service providers were to have a team of humans looking at all information flagged by algorithm before making the decision to block, it is unlikely that they would be able to provide detailed and informed scrutiny.
Yet — and this is crucial — if it is within the capability of Google or Facebook to develop dowsing algorithms which can detect pernicious content, then it is definitely within the capability of the fine minds working in the doughnut in Cheltenham to do the same, creating spiders which crawl the web looking for extremist material in the hopes of eradicating it…or even using it to pursue further leads. But it’s not just GCHQ. The police’s Counter-terrorism Internet Referral Unit has been extremely successful in ensuring terrorist material is taken down.
After all, it is our police and security services who should be doing our policing and seeing to our security. While there are certain basic duties on private citizens (and companies) to assist the police within bounds and report crimes which come to their attention, they are not compelled to independently devote their resources to the investigation and detection of crime, other than in very narrow circumstances.
The individual vs the state
In the UK, our unique constitutional arrangement gave rise to the very powerful principle that it is the state which is more heavily fettered than its citizenry. In 1765, a group of the King’s messengers broke into the home of one John Entick, a writer, searching for some ‘very seditious pamphlets’ he had allegedly written. He sued Carrington, the King’s Chief Messenger, and won, because there was no law enabling the state to act as it did.
This case established the balance of power between individual and state: it is often paraphrased along the lines of dicta such as ‘an individual can do anything not prohibited; the state may do only that which is explicitly allowed’.
If the state itself were to impose on the citizenry the types of curtailment of free expression it is proposing for social media, there would be uproar. For someone to be convicted of publishing obscene material or committing a hate crime, a jury must first be convinced that what the accused has done lies outside the bounds of what society can accept…and the records show that British juries tend to be far more tolerant of others’ rights to express themselves freely than scolding politicians might expect.
It is nearly unthinkable that a democratic state could circumvent this fair and just mechanism for determining the bounds of free expression, and instead set up a national firewall, with algorithms for detecting problematic material and preventing it being uploaded, with no meaningful appeals process or independent oversight and transparency. This would put us in the digital company of China or Iran.
If there is to be an imperfect system, and we must be very clear that any system for detecting objectionable content will be imperfect, then the burden of this imperfection must lie with the state and not the public. The state exists to serve us, and not the other way around. Police do not have free rein to shut down presses, seize published materials, or silence speakers…even when it is ‘for our own good’.
When they want to do so, there are significant obstacles they must overcome to show that what they are doing is necessary and proportionate – not just once, in the abstract sense, but for each and every particular case. And then it may proceed to trial, either as a criminal prosecution or a judicial review to assess whether the police acted reasonably.
Pressuring tech companies is cheating
This is all a bit bothersome for politicians flailing about, wanting to be seen to be Doing Something about the torrents of objectionable material online. So instead, they are leaning on tech companies to do their dirty work for them, because these standards of rigour do not apply.
The debate surrounding Twitter bans illustrates the point well. Those (predominantly from the ‘alt-right’) who find themselves on the receiving end of a ban will bewail a curtailment of their free speech. Their critics will immediately respond that as a private company, Twitter is entitled to ban whomsoever it pleases in line with its T&Cs, and constitutional protections of free speech do not apply to it quite in that way.
The critics may be correct on the law (at least in the US), but they miss the point. Social media has become pervasive to the extent that if people wish to participate in the public discourse, they have little option but to use it. This is why state pressure on the companies amounts to censorship by the back door.
Allowing service providers to maintain neutrality about the content posted on them is crucial to the flourishing of the Internet. And handing them powers which rightly belong to emanations of the state is detrimental to the social order. The companies may well respond to the pressure. The Pakistani government has been boasting this week that Facebook has now removed ‘85% of blasphemous material’ at their behest. But we must ask ourselves if this is how the UK, where the rule of law has defined our constitutional history, should be operating. And our laws must be proportionate and designed to maximise the liberty of the subject, not curtail it.
To be very clear, the attack on Westminster was heinous, and must be condemned. Published material with the sole purpose of enabling terrorism is criminal, and has no place in our society; and if it poses a real threat to our national security, then it is the duty of the state to do what it can to see it purged.
Overreaction is conceding defeat
The existing security regime has pushed terror right to the edge: ‘spectaculars’ such as 9/11 are vanishingly unlikely to progress beyond the initial planning stages. Even suicide bombs and the like have logistical and technical requirements beyond those an individual could likely meet without raising his threat signature to the point that he would come to the attention of the security services. And so the only options remaining are to use everyday items: knives, axes, motor vehicles.
Despite their huge propaganda value, the physical impact of such attacks, while deeply tragic, is relatively minor when set against overall patterns of crime and disorder. We might say there’s a Pareto principle at work here: the security effort required to mitigate these low-tech attacks is huge…near-total surveillance, arming all police (or deploying the Army on the streets), banning encryption, censoring the Internet. And it is unclear whether even these measures would meet with 100% success, how the terrorist threat might evolve in response, or what other threats might arise.
There must come a point where we draw a line. There is always a limit at which we are willing to accept risk in exchange for liberty. This may be on the personal level: people drink, they smoke, they travel in cars and aeroplanes. All of these are decisions made by offsetting potential adverse consequences against potential benefits.
The same applies to society at large. It is a bad deal, and poor risk management, to focus on zero tolerance of risk if the cost is high. And many of the options proffered in the wake of the Westminster attack suffer from just this problem. It was an attack on our democracy not solely because of the venue, but because an overreaction to it threatens our values.
As things stand, Rudd’s statements in reaction to the attack are a prime example of the politician’s syllogism from Yes, Prime Minister: ‘we must do something; this is something…therefore we must do this’. So palpably ridiculous was her commentary on crypto that even Matt Hancock, the minister responsible for digital affairs, has issued a statement in support of strong encryption as a key part of contemporary communication. But we cannot allow the point on censorship to go unchallenged.
Stephen is a Policy Analyst for Conservatives for Liberty. A pragmatic minarchist and instinctual conservative, he likes free speech, free markets, and free people. Follow him on Twitter: @
The views expressed in this article are those of the author and do not necessarily reflect the views of Conservatives for Liberty