I realise it’s been a while since I posted something up here, so here’s an article I wrote recently for Top10VPN’s new Privacy Central site:
The UK has been unlucky enough to know terrorism for quite some time. Many will remember the IRA campaigns of the 1970s and ’80s. This was an era before smartphones and the internet, yet the Irish paramilitary group continued to wage a successful campaign of terror on the mainland.
It continued to recruit members and organise itself to good effect. Politicians of the modern era, led by Theresa May and various members of her government, would do well to remember this when they launch into yet another assault on Facebook, Google, and the technology platforms that are alleged to provide a “safe haven” for Islamic terrorists today.
Now she is calling for greater regulation of cyberspace, something the independent reviewer of terrorism legislation has openly criticised. Along with increasing moves across Europe and the world to undermine end-to-end encryption in our technology products, these are dangerously misguided policies which would make us all less safe, less secure and certainly less free.
Our “Sliding Doors” moment
Every time a terror attack hits, the government continues its war of words not simply against the perpetrators, but against the tech companies who are alleged to have provided a “safe haven” for them. After all, such rhetoric plays well with the right-wing print media and large parts of her own party.
“Safe haven” has become something of a mantra for the prime minister, alongside her other favourite: “strong and stable”. She argues that terrorists are hiding behind encrypted communications on platforms like Facebook’s WhatsApp and Apple’s iMessage, and are using social media platforms like YouTube to recruit members and distribute propaganda.
“We cannot allow this ideology the safe space it needs to breed. Yet that is precisely what the internet, and the big companies that provide internet-based services, provide,” May said after the London Bridge attacks. “We need to work with allied democratic governments to reach international agreements that regulate cyberspace to prevent the spread of extremism and terrorism planning.”
Part of the regulation May wants to bring in could include fining tech companies that don’t take down terrorist propaganda quickly enough. Max Hill QC, independent reviewer of terror legislation, has rightly questioned this hard-line approach.
“I struggle to see how it would help if our parliament were to criminalise tech company bosses who ‘don’t do enough’. How do we measure ‘enough’? What is the appropriate sanction?” he said in a speech reported by The Times.
“We do not live in China, where the internet simply goes dark for millions when government so decides. Our democratic society cannot be treated that way.”
China is an interesting parallel to draw, because in many ways it offers a glimpse into an alternative future for the UK and Europe; one in which government has total control over the internet, where freedom of speech is suppressed and privacy is a luxury no individual can claim to have.
The problem is that no one sees authoritarianism coming, because it happens slowly, drip by drip. Regulating cyberspace would begin a slow slide into the kind of dystopic future we currently know only from sci-fi films. As Margaret Atwood’s heroine Offred says in her acclaimed novel The Handmaid’s Tale: “Nothing changes instantaneously: in a gradually heating bathtub you’d be boiled to death before you knew it.”
In many ways, we sit today at a Sliding Doors moment in history. Which future would you prefer?
The problem with backdoors
End-to-end encryption in platforms like WhatsApp and on our smartphones and tablets is something Western governments are increasingly keen to undermine as part of this clampdown. It doesn’t seem to matter that this technology keeps the communications of consumers and countless businesses safe from the prying eyes of nation states and cybercriminals – it’s also been singled out as providing, you guessed it, a “safe space” for terrorists.
The Snoopers’ Charter already includes provisions for the government to force tech providers to effectively create backdoors in their products and services, breaking the encryption that keeps our comms secure. In fact, the government is trying to sneak through these provisions without adequate scrutiny or debate. They were leaked to the Open Rights Group and can be found here.
It remains to be seen whether the British government could actually make this happen. An outright ban is unworkable and the affected tech companies are based almost entirely in the US. But the signs aren’t good. Even the European Commission is being strong-armed into taking a stance against encryption by politicians keen to look tough on terror in a bid to appease voters and right-wing newspaper editors. Let’s hope MEPs stand up to such calls.
The problems with undermining encryption in this way are several-fold. It would give the state far too much power to pry into our personal lives, something the UK authorities can already do thanks to the Investigatory Powers Act (IPA), which has granted the government the most sweeping surveillance powers of any Western democracy. It would also embolden countries with poor human rights records to do the same.
Remember, encryption doesn’t just keep terrorist communications “safe” from our intelligence services, it protects journalists, human rights activists and many others in hostile states like those in the Middle East.
More importantly, it protects the communications of all those businesses we bank with, shop with, and give our medical and financial records to. The government can’t have its cake and eat it: recommending businesses secure their services with encryption on the one hand, but then undermining the very foundations on which our economy is built with the other.
Once a provider has been ordered to create a “backdoor” in their product or service, the countdown will begin to that code going public.
Even the NSA and CIA can’t keep hold of their secrets: attackers have managed to steal and release top secret hacking tools developed by both. In the case of the former this led to the recent global ransomware epidemic dubbed “WannaCry”.
Why should we set such a dangerous precedent, putting our data and privacy at risk, while the real criminals simply migrate to platforms not covered by the backdoor program?
“For years, cryptologists and national security experts have been warning against weakening encryption,” Apple boss Tim Cook has said in the past. “Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.”
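Cook’s point can be made concrete with a toy sketch of why an end-to-end design leaves the provider nothing useful to hand over. This is a deliberately simplified, insecure illustration – a tiny Diffie-Hellman group and an XOR keystream, all invented demo values, not real cryptography – showing that only the two endpoints ever derive the shared key; the service in the middle relays ciphertext it cannot read.

```python
import hashlib

# Toy end-to-end encryption sketch (NOT real cryptography): the two
# endpoints agree a shared secret via Diffie-Hellman, and the relaying
# service only ever sees ciphertext. Real systems use vetted libraries
# and much larger groups.

P = 0xFFFFFFFB  # small public prime (2**32 - 5, demo only)
G = 5           # public generator

def keypair(private):
    """Return (private, public) for this toy DH group."""
    return private, pow(G, private, P)

def shared_key(my_private, their_public):
    """Both sides derive the same secret, hashed into a key."""
    secret = pow(their_public, my_private, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_crypt(key, data):
    """Toy stream cipher: XOR with a SHA-256-derived keystream."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Alice and Bob generate keys; only the *public* halves cross the wire.
alice_priv, alice_pub = keypair(123456789)
bob_priv, bob_pub = keypair(987654321)

k_alice = shared_key(alice_priv, bob_pub)
k_bob = shared_key(bob_priv, alice_pub)
assert k_alice == k_bob  # both ends hold the same key; the server never does

ciphertext = xor_crypt(k_alice, b"meet at 6")
# The provider relays `ciphertext` but, holding no private key, cannot read it.
plaintext = xor_crypt(k_bob, ciphertext)  # Bob recovers the message
```

A backdoor mandate would mean breaking exactly this property – forcing the provider to hold, or be able to reconstruct, the key that today never leaves the endpoints.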
In short, we need more police officers, constructive relationships with social media companies, and smarter ways of investigating terror suspects. Dragnet surveillance, encryption backdoors and more internet regulation are the quickest way to undermine all those democratic freedoms we hold so dear – and send us hurtling towards that dystopic authoritarian future.
Europe’s new data protection laws might have been over a decade in the making, but it would take about as long again to read every piece of advice that’s since been produced on how to comply. In search of some simple answers to a typically complex piece of European legislation, I asked a few legal experts for their thoughts.
With 13 months to go before the compliance deadline, organisations across the country will be scrabbling to ensure they’re not one of the unlucky ones caught out in the months following 25 May.
Start with the data
Most experts I spoke to were in agreement that firms need to start by mapping their data – after all, you’ve got to know where it is and what you do with it first before working out how to keep it safe.
“For those that are compliant with existing laws, GDPR is going to be an evolution. For the others, it’s going to be a deep, radical change. In general, I think that every organisation should be working on assessing their current practices in light of GDPR,” Forrester analyst Enza Iannopollo told me.
“My advice is, regardless of the kind of support an organisation chooses, it must put together a team of internal people – hopefully the privacy team – and make sure that that team leads the work. Compliance with GDPR is not a one-off effort, but an ongoing process that has to be ingrained in firms’ business model,” she said.
Change the culture
That cultural change might be the hardest thing for organisations to achieve, although a good start is hiring a Data Protection Officer (DPO) – one of the key requirements of the GDPR. Another is the privacy impact assessment, which PwC’s US privacy lead, Jay Cline, recommends as a key stage once you’ve completed a data inventory.
“Data protection impact assessments (DPIAs) are the eyes and ears of the privacy office throughout the company,” he told me by email. “DPIAs are how chief privacy officers enlist the help of the whole company to keep their privacy controls current with all the change going on in the company.”
For Alexandra Leonidou, Senior Associate at Foot Anstey, there’ll be a key role for non-IT functions inside the organisation.
“Who needs to know about the GDPR? Who are the key stakeholders? This isn’t just something for IT, information security teams or data officers. Boards should be aware of the risks, and HR teams need to think about employee data. Getting GDPR compliance right will be critical for marketing and communications teams’ activity,” she told me.
“You will need to engage key stakeholders and implement measures that leave you with an acceptable level of commercial risk.”
Leonidou was also keen to stress the need for independence in the DPO role.
“Guidance from Europe suggests that this role is likely to be incompatible with certain existing C-suite executives,” she explained. “The awareness-raising that follows on from the allocation of accountability will be an ongoing process.”
For those still in the dark, some useful free resources include the Article 29 Working Party and our very own Information Commissioner’s Office. It’s also expected that even after 25 May, the regulators will give firms a little bedding-in time before they start going after some high-profile offenders.
All over Europe, organisations of all sizes are scrabbling desperately to get their house in order for 25 May 2018. What happens then? Only the biggest shake-up to Europe’s data protection laws in nearly a generation. The implications are immense, both in terms of the scope of the new regulation and the companies who will now be held liable.
There’s just one problem. The UK’s Snoopers’ Charter, or Investigatory Powers Act. Its enshrining into law of mass surveillance powers could create major problems down the line, possibly putting UK firms at a competitive disadvantage precisely at a time when they need the digital economy most.
What’s the problem?
Let’s start at the beginning. UK firms will have to comply with GDPR, even with Brexit looming. That’s because the extrication of the country from the EU will take at least two years from whenever Article 50 is triggered – presumably in March – and probably much, much longer. And even beyond that, the UK government has said in its Brexit white paper:
“The European Commission is able to recognise data protection standards in third countries as being essentially equivalent to those in the EU, meaning that EU companies are able to transfer data to those countries freely.
As we leave the EU, we will seek to maintain the stability of data transfer between EU Member States and the UK.”
This implies that the UK will, broadly speaking, harmonise its laws with the GDPR. But the bulk data collection powers granted by the IPA mean the regime is certainly not equivalent to that in Europe. Emily Taylor, CEO of Oxford Innovation Labs and associate fellow of Chatham House, told me that the Court of Justice of the European Union (CJEU) shows no signs of shifting its stance on bulk data collection – having recently ruled against the forerunner to the Snoopers’ Charter, DRIPA.
“Other elements of the judgment are likely to cause problems with the Investigatory Powers Act: the CJEU says that targeted data retention may be allowable, but must be restricted solely to fighting serious crime; warrants must be signed off by a court, not a minister; and the data concerned must be retained within the EU. All these will potentially conflict with core elements of the IP Act,” she told me.
If it’s kept as-is, the Act could therefore impact the legality of data transfers between Europe and a newly independent UK, which will be bad news for most firms reliant on a thriving digital economy.
“The impact of conflicts between the GDPR and our Investigatory Powers Act may be to hamper the competitiveness of UK tech, particularly as the GDPR seeks to protect EU citizens’ data wherever it will be processed,” she argued.
Not great for America
This is a hot-button issue for Europe. In fact, it’s the reason data transfers to the US were put under threat after Safe Harbour was torn down over fears of US authorities snooping on Europeans’ data. Despite a new agreement – Privacy Shield – being put in place, there could still be bumps in the road ahead.
“Transatlantic data flows will not be legal unless there is a robust framework in place to offer EU citizens’ data equivalent protection to what is enjoyed in the EU,” said Taylor.
“President Trump’s ‘America First’ policy is likely to renew tensions over Privacy Shield – a shaky compromise which was hurriedly reached following the CJEU’s obliteration of its predecessor ‘Safe Harbour’.”
KPMG’s global privacy advisory lead, Mark Thompson, told me that firms outside of Europe that need to comply with the GDPR are better off keeping data on European citizens inside the EU so as not to fall foul of any changes to data transfer agreements.
“Despite the USA and EU having some cultural alignment, there is potential for significant culture clash between the EU’s view of a fundamental human right to privacy and the US view on what constitutes privacy, which is significantly different,” he added.
We’ll have to wait a while to see what the fallout from all this is. But with the UK government unlikely to countenance any changes to the IPA, there could be bad news for the country’s digital economy in the next few years if nothing changes.
The idea behind Data Protection Day is to make consumers think twice about leaving an even bigger digital footprint online than they already have, and to get businesses to take data privacy more seriously.
On both counts it’s a challenging prospect, according to many of the experts I spoke to.
David Gibson, vice president of strategy at Varonis, told me that improving privacy protection comes down to better monitoring of data for fraud and abuse.
“The proof that traditional methods don’t work is in the increasing frequency and magnitude of data breaches related to unstructured data,” he argued.
“Not only is there more data to worry about, but it’s containing more sensitive and valuable information and it’s getting easier for attackers to exfiltrate that data since it’s typically not monitored. If what you’re trying to steal isn’t being watched, you have a much better chance of getting away.”
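Gibson’s argument – that exfiltration goes unnoticed because access to unstructured data simply isn’t watched – can be sketched in a few lines. The log entries, per-user baselines and threshold below are invented for illustration; real monitoring products model far richer behaviour than a simple count.

```python
from collections import Counter

# Toy sketch of access monitoring for unstructured data: flag users
# whose file-access volume spikes far above their own baseline.
# Event data and thresholds here are invented for illustration.

access_log = [
    ("alice", "q3_forecast.xlsx"), ("alice", "budget.docx"),
    ("bob", "hr_records.csv"), ("bob", "payroll.csv"),
    ("bob", "contracts.pdf"), ("bob", "ma_notes.docx"),
    ("bob", "board_minutes.docx"), ("bob", "source_code.zip"),
]

baseline = {"alice": 3, "bob": 2}  # typical daily accesses per user

def flag_anomalies(log, baseline, multiplier=2):
    """Return users who accessed more files than multiplier x their baseline."""
    counts = Counter(user for user, _ in log)
    return sorted(
        user for user, n in counts.items()
        if n > multiplier * baseline.get(user, 0)
    )

flagged = flag_anomalies(access_log, baseline)  # bob's spike stands out
```

Even something this crude illustrates the point: watched data gives an attacker far less room to “get away” unnoticed.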
Rackspace senior director of legal, Lillian Pang, admitted that firms still don’t prioritise data privacy at a board level, and this needs to change if things are to get better for consumers.
“Only then will firms start taking it seriously and filter down the privacy compliance needs to the ground level of its business. In some respects, you could say that privacy needs to be led from the top level of any business and administered from the ground level,” she told me.
“Many firms pay lip service to the importance of data privacy but few really understand or recognise that a robust data privacy program in a firm solidifies its information security and helps to further safeguard the firm’s business.”
The EU General Data Protection Regulation could be the push that many firms need to start taking the issue seriously, according to Gemalto data protection CTO, Jason Hart.
“The EU Data Protection Regulation is set to be finalised later this year, but companies need to start taking the steps to change how they protect their data now, otherwise they could find themselves subject to compliance penalties, and also put their reputation and consumer confidence at risk,” he warned.
“As the reporting requirements of the new EU regulation make data breaches more visible, we can expect the economic and business consequences of a breach to continue to escalate, so businesses need to start taking steps to ensure they are prepared for when new regulation comes into force.”
So are awareness-raising exercises like Data Protection Day even worth the effort? Well, the general consensus is that anything like this is probably a bonus, although the jury’s out on how effective it can be.
“Although Data Privacy Day is a great opportunity to raise awareness of the issue, understanding the importance of protecting data needs to be an all year round initiative,” said Hart. “Businesses need to realise the importance of the data they hold in their systems and how the loss of this can impact their customers.”
Data Protection Day (Data Privacy Day in the US) is on 28 January.
It’s widely expected that next week the government will unveil details of its hugely controversial Snoopers’ Charter, aka the Investigatory Powers Bill. To preempt this, and in a bid to influence the debate, cyber security firm F-Secure and 40 other tech signatories presented an open letter opposing the bill.
Most controversially, the bill is expected to force service providers to allow the authorities to decrypt secret messages if requested to do so in extremis. This is most likely to come in the form of some kind of order effectively banning end-to-end encryption.
I heard from F-Secure security adviser Sean Sullivan on this to find out why the bill is such a bad idea.
To précis what I wrote in this Infosecurity article, his main arguments are that forcing providers to hold the encryption keys will:
- Make them a likely target for hackers, weakening security
- Send the wrong signal out to the world and damage UK businesses selling into a global marketplace
- Lead to China and other potentially hostile states in which a service provider operates demanding the same encryption keys – undermining security further
- Be useless, as the bad guys will end up using another platform which can’t be intercepted
I completely agree. Especially with Sullivan’s argument that the providers would become a major target for hackers.
“End-to-end encryption makes good sense and is the future of security,” he told me by email. “Asking us to compromise our product, service, and back end would be foolish – especially considering all of the back end data breach failures that have occurred of late. If we don’t hold the data, we cannot lose control of it. That’s just good security.”
One other point he made was the confusion among politicians about tech terminology as basic as “backdoor” and “encryption”.
“A lot of UK politicians end up putting their foot in their mouth because they don’t properly understand the technology. They try to repeat what their experts have told them, but they get it wrong. UK law enforcement would probably love to backdoor your local device (phone) but that’s a lost cause,” he argued.
“The politicians (who actually know what they’re talking about) really just want back end access. As in, they want a back door in the ‘cloud’. They want to mandate warranted access to data in transit and/or in the back end (rather than data at rest on the device) and fear that apps which offer end-to-end encryption, in which the service provider doesn’t hold any decryption keys, are a threat.”
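The distinction Sullivan draws – warranted “back end” access versus true end-to-end encryption – ultimately comes down to what the provider’s servers store. A toy contrast below, with an intentionally trivial XOR “cipher” and invented names standing in for real systems, shows why a key-escrowing back end is a breach and warrant target while an end-to-end back end holds nothing readable:

```python
# Toy contrast (illustrative assumptions throughout, not real crypto):
# Design 1, "back end access": the provider escrows the key, so a
# warrant (or a hacker who breaches the server) recovers plaintext.
# Design 2, end-to-end: the server stores ciphertext only.

def xor(key, data):
    """Trivial demo 'cipher': XOR data with a repeated key."""
    return bytes(a ^ b for a, b in zip(data, key * len(data)))

user_key = b"device-held-key!"  # in the E2E design, this never leaves the phone

# Design 1: ciphertext AND escrowed key sit together on the server.
backend_store = {"msg": xor(user_key, b"hello"), "escrowed_key": user_key}
server_side_read = xor(backend_store["escrowed_key"], backend_store["msg"])
# Anyone with server access – lawful or not – reads the message.

# Design 2: the server holds ciphertext only; compelled access
# yields nothing decryptable, and only the device can read it.
e2e_store = {"msg": xor(user_key, b"hello")}
device_side_read = xor(user_key, e2e_store["msg"])
```

The policy fight is over whether providers may be ordered to run Design 1 – and, as Sullivan notes, a server full of escrowed keys is exactly the back end attackers would queue up to breach.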
Let’s see what happens, but given the extremely low technology literacy levels among most politicians I’ve got a bad feeling about this one.