Last week, the Irish High Court handed down a judgment on transatlantic data flows that could have far-reaching implications for US tech firms and point the way towards economic disaster for the UK.
Yes, it might not have received much coverage at the time, but the court’s decision was a biggie.
It asked the Court of Justice of the European Union (CJEU) to scrutinise the mechanism by which Facebook and many other firms transfer data: standard contractual clauses (SCCs).
Why? Because Austrian law student Max Schrems is still not happy that his personal data could theoretically be snooped on by the US authorities whilst residing in Facebook datacentres over there. His previous battle with Facebook over this issue led to the collapse of the Safe Harbour agreement between the EU and US.
Its replacement, Privacy Shield, is the other main legal mechanism – aside from SCCs – that governs transfers of personal data from the EU to the US.
“In simple terms, US law requires Facebook to help the NSA with mass surveillance and EU law prohibits just that,” Schrems said in a written statement following the court’s decision. “As Facebook is subject to both jurisdictions, they got themselves in a legal dilemma that they cannot possibly solve in the long run.”
Emily Taylor, CEO of Oxford Innovation Labs and Chatham House associate fellow, took time out to discuss the issue with me.
“The reference to the CJEU is no surprise, and the fact that the US government applied to be joined as party shows how high the stakes are on all sides – for governments, for big data platforms like Facebook, and for individuals,” she told me.
“The case shows that the Snowden revelations continue to reverberate on both sides of the Atlantic. The CJEU has taken a consistently hard line against mass data collection and retention, and increasingly relies on the EU Charter of Fundamental Rights. The Charter allows for ‘more extensive protection’ of fundamental rights such as privacy, compared with the more familiar European Convention.”
That spells some uncertain times ahead for Silicon Valley, especially with Privacy Shield also facing an uncertain future.
That’s not all though. The case tells us much about what may happen to post-Brexit Britain.
Our digital economy is worth around £160bn and responsible for over 1.5m jobs, by some estimates. That makes it a vital part of the economy, and means unhindered data transfers with the EU – our biggest trading partner and the largest trading bloc in the world – are absolutely essential.
So how do we fit the square peg of the EU’s requirements around strong privacy protections for citizens into the round hole of the UK’s brand spanking new Investigatory Powers Act? Also known as the Snoopers’ Charter, the new law has given the UK authorities probably more power than any country on earth – save for China and North Korea – to snoop on their own citizens.
“It is difficult to see how the UK’s mass data collection requirements under the Investigatory Powers Act could satisfy the EU Charter and this could have a severe impact on EU-UK data flows, potentially damaging UK business interests post-Brexit,” Taylor concluded.
That should be getting people in all sorts of high places very nervous indeed.
I realise it’s been a while since I posted something up here, so here’s an article I wrote recently for Top10VPN’s new Privacy Central site:
The UK has been unlucky enough to know terrorism for quite some time. Many will remember the IRA campaigns of the 1970s and ’80s. This was an era before smartphones and the internet, yet the Irish paramilitary group still managed to wage a sustained campaign of terror on the mainland.
It continued to recruit members and organise itself to good effect. Politicians of the modern era, led by Theresa May and various members of her government, would do well to remember this when they launch into yet another assault on Facebook, Google, and the technology platforms that are alleged to provide a “safe haven” for Islamic terrorists today.
Now she is calling for greater regulation of cyberspace, something the independent reviewer of terrorism legislation has openly criticised. Along with increasing moves across Europe and the world to undermine end-to-end encryption in our technology products, these are dangerously misguided policies which would make us all less safe, less secure and certainly less free.
Our “Sliding Doors” moment
Every time a terror attack hits, the government renews its war of words not simply against the perpetrators, but against the tech companies alleged to have provided a “safe haven” for them. After all, such rhetoric plays well with the right-wing print media, and with large parts of her party.
“Safe haven” has become something of a mantra for the prime minister, alongside her other favourite: “strong and stable”. She argues that terrorists are hiding behind encrypted communications on platforms like Facebook’s WhatsApp and Apple’s iMessage, and are using social media platforms like YouTube to recruit members and distribute propaganda.
“We cannot allow this ideology the safe space it needs to breed. Yet that is precisely what the internet, and the big companies that provide internet-based services, provide,” May said after the London Bridge attacks. “We need to work with allied democratic governments to reach international agreements that regulate cyberspace to prevent the spread of extremism and terrorism planning.”
Part of the regulation May wants to bring in could include fining tech companies that don’t take down terrorist propaganda quickly enough. Max Hill QC, independent reviewer of terror legislation, has rightly questioned this hard-line approach.
“I struggle to see how it would help if our parliament were to criminalize tech company bosses who ‘don’t do enough’. How do we measure ‘enough’? What is the appropriate sanction?” he said in a speech reported by The Times.
“We do not live in China, where the internet simply goes dark for millions when government so decides. Our democratic society cannot be treated that way.”
China is an interesting parallel to draw, because in many ways it offers a glimpse into an alternative future for the UK and Europe; one in which government has total control over the internet, where freedom of speech is suppressed and privacy is a luxury no individual can claim to have.
The problem is that no one sees authoritarianism coming, because it happens slowly, drip by drip. Regulating cyberspace would begin a slow slide into the kind of dystopic future we currently know only from sci-fi films. As Margaret Atwood’s heroine Offred says in her acclaimed novel The Handmaid’s Tale: “Nothing changes instantaneously: in a gradually heating bathtub you’d be boiled to death before you knew it.”
In many ways, we sit today at a Sliding Doors moment in history. Which future would you prefer?
The problem with backdoors
End-to-end encryption in platforms like WhatsApp and on our smartphones and tablets is something Western governments are increasingly keen to undermine as part of this clampdown. It doesn’t seem to matter that this technology keeps the communications of consumers and countless businesses safe from the prying eyes of nation states and cybercriminals – it has also been singled out as providing, you guessed it, a “safe space” for terrorists.
The Snoopers’ Charter already includes provisions for the government to force tech providers to effectively create backdoors in their products and services, breaking the encryption that keeps our comms secure. In fact, the government is trying to sneak these provisions through without adequate scrutiny or debate. They were leaked to the Open Rights Group and can be found here.
It remains to be seen whether the British government could actually make this happen. An outright ban is unworkable and the affected tech companies are based almost entirely in the US. But the signs aren’t good. Even the European Commission is being strong-armed into taking a stance against encryption by politicians keen to look tough on terror in a bid to appease voters and right-wing newspaper editors. Let’s hope MEPs stand up to such calls.
The problems with undermining encryption in this way are manifold. It would give the state far too much power to pry into our personal lives, something the UK authorities can already do thanks to the Investigatory Powers Act (IPA), which has granted the government the most sweeping surveillance powers of any Western democracy. It would also embolden countries with poor human rights records to do the same.
Remember, encryption doesn’t just keep terrorist communications “safe” from our intelligence services, it protects journalists, human rights activists and many others in hostile states like those in the Middle East.
More importantly, it protects the communications of all those businesses we bank with, shop with, and give our medical and financial records to. The government can’t have its cake and eat it: recommending businesses secure their services with encryption on the one hand, but then undermining the very foundations on which our economy is built with the other.
Once a provider has been ordered to create a “backdoor” in their product or service, the countdown will begin to that code going public.
Even the NSA and CIA can’t keep hold of their secrets: attackers have managed to steal and release top-secret hacking tools developed by both. In the case of the former, this led to the recent global ransomware epidemic dubbed “WannaCry”.
Why should we set such a dangerous precedent, putting our data and privacy at risk, while the real criminals simply migrate to platforms not covered by the backdoor program?
“For years, cryptologists and national security experts have been warning against weakening encryption,” Apple boss Tim Cook has said in the past. “Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.”
In short, we need more police officers, constructive relationships with social media companies, and smarter ways of investigating terror suspects. Dragnet surveillance, encryption backdoors and more internet regulation are the quickest way to undermine all those democratic freedoms we hold so dear – and send us hurtling towards that dystopic authoritarian future.
It’s hard to find an optimist in the cyber security industry in these post-referendum days. I spoke to a fair few for an upcoming feature for Infosecurity Magazine and the consensus seems to be that a Brexit will be bad for staffing, the digital economy and the financial stability of UK-based security vendors.
That’s not even to mention the legal and compliance implications. Chatham House associate fellow Emily Taylor recommended firms continue on the road to compliance with the EU General Data Protection Regulation (GDPR). Aside from the fact that any firm with EU customers will still need to comply with the far-reaching law, she reckons that if we want to protect the free flow of digital information between the EU and UK, we’ll need to continue following European laws in this area.
Snoopers gonna snoop
However, a Brexit would cause other problems, notably because the current Snoopers’ Charter looks set to enshrine in legislation the principle of bulk surveillance – the very thing which effectively led to the scrapping of the Safe Harbour agreement between the US and EU. If the bill goes through as is and we leave Europe but stay in the single market, we’ll have to change that bit, Taylor told me.
“A case brought by David Davis and Tom Watson questioning the legality of bulk surveillance powers under the old DRIPA laws is currently being considered by the CJEU,” she explained.
“It’s not clear which way the CJEU will go on this, because many member states have lined up to support the British approach. However, if CJEU follows its recent decisions, it could strike down bulk data collection. If we wanted to stay in the single market, we’d have to amend our IP Bill in response.”
Even if we broke away from Europe completely and adopted the status of a “third country” like the US, we’d still have to adopt measures “to give equivalent protection to EU citizens’ data as they enjoy within the EU,” she argued. And bulk surveillance would certainly be a no-no in this scenario.
The uncertainty – which could continue potentially for years while Brexit deals are worked out – is also viewed by many as damaging to the cyber security industry, and tech in general. Immigration lawyer and partner at MediVisas, Victoria Sharkey, claimed firms may be unwilling to employ skilled workers if there’s a chance they might have to leave in a couple of years’ time.
“This is certainly going to be the case where significant training and investment is involved,” she added.
In fact, EU nationals are apparently already packing their bags.
“I am already seeing EU nationals who have been here for years make plans to leave and either go home or go to another EU country. They are worried for their jobs, are worried that they will be told to leave and so would rather leave on their own terms, and they are also being made to feel unwelcome,” Sharkey continued.
“I feel that when we do leave that it is going to become significantly harder for UK employers to encourage the best in their industry to come and work in the UK.”
This, for an industry which has always struggled with skills gaps and shortages, is potentially catastrophic.
Can we overcome?
Philip Letts, CEO of global enterprise services platform blur Group, has run businesses in Silicon Valley and the UK. He also pointed out the potential damage that political and financial uncertainty could have on the industry.
“The politicians are in uncharted territory. We don’t yet have a clear timetable for the triggering of Article 50, nor the trade deals that are going to have to be negotiated. There is a political vacuum. Business confidence is low and many will hunker down, try to avoid risk and wait for this to play out,” he told me.
“Globally, the US tech heavyweights will want to remain in the UK and the EU, and they will do both, operating across different European centres. But the EU market is more lucrative than the UK, so things may shift over time.”
So is the tech and cyber security sector really doomed? Not so, according to KPMG UK head of technology, Tudor Aw.
“I believe the resilient UK tech sector can withstand the challenges of Brexit and thrive,” he told me.
“Technology is increasingly a key sector that underpins all other sectors – whether it be back office systems or strategic enablers such as IoT and data analytics. Companies will need to invest in technology to drive efficiencies and strategic growth – one only has to look at developments across a diverse range of sectors such as healthcare, automotive, property, retail and the military to see that technology spend will only increase regardless of Brexit.”
It’s a moot point now, but I wonder how much better it could have thrived had we not voted out on 23 June.
It’s widely expected that next week the government will unveil details of its hugely controversial Snoopers’ Charter, aka the Investigatory Powers Bill. To preempt this, and in a bid to influence the debate, cyber security firm F-Secure and 40 other tech signatories presented an open letter opposing the bill.
Most controversially, the bill is expected to force service providers to help the authorities decrypt secret messages if requested to do so in extremis. This is most likely to come in the form of some kind of order effectively banning end-to-end encryption.
I heard from F-Secure security adviser Sean Sullivan to find out why the bill is such a bad idea.
To précis what I wrote in this Infosecurity article, his main arguments are that forcing providers to hold the encryption keys will:
- Make them a likely target for hackers, weakening security
- Send the wrong signal out to the world and damage UK businesses selling into a global marketplace
- Lead to China, or other potentially hostile states in which a service provider operates, demanding the same encryption keys – undermining security further
- Be useless, as the bad guys will end up using another platform which can’t be intercepted
I completely agree. Especially with Sullivan’s argument that the providers would become a major target for hackers.
“End-to-end encryption makes good sense and is the future of security,” he told me by email. “Asking us to compromise our product, service, and back end would be foolish – especially considering all of the back end data breach failures that have occurred of late. If we don’t hold the data, we cannot lose control of it. That’s just good security.”
One other point he made was the confusion among politicians about tech terminology as basic as “backdoor” and “encryption”.
“A lot of UK politicians end up putting their foot in their mouth because they don’t properly understand the technology. They try to repeat what their experts have told them, but they get it wrong. UK law enforcement would probably love to backdoor your local device (phone) but that’s a lost cause,” he argued.
“The politicians (who actually know what they’re talking about) really just want back end access. As in, they want a back door in the ‘cloud’. They want to mandate warranted access to data in transit and/or in the back end (rather than data at rest on the device) and fear that apps which offer end-to-end encryption, in which the service provider doesn’t hold any decryption keys, are a threat.”
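Sullivan’s point about keys can be shown in a few lines of code. Below is a toy Python sketch (my own illustration, not anything F-Secure or WhatsApp actually runs) of why end-to-end encryption locks the provider out: the two endpoints derive a shared key via a Diffie-Hellman exchange, so the relay in the middle only ever sees public values and ciphertext. The tiny prime and XOR keystream stand in for real primitives like X25519 and AES-GCM; this is a teaching demo, not usable cryptography.

```python
# Toy end-to-end encryption demo: the "service provider" relays A, B and
# the ciphertext, but never the private exponents, so it cannot decrypt.
# NOT real crypto -- 64-bit prime and XOR keystream are for illustration only.
import hashlib
import secrets

P = 0xFFFFFFFFFFFFFFC5  # 2**64 - 59, a prime; trivially breakable, demo only
G = 5

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Encrypt/decrypt by XOR with a SHA-256-derived keystream (toy cipher)."""
    out, counter = bytearray(), 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Alice and Bob each keep a private exponent on their own device.
a = secrets.randbelow(P - 2) + 1
b = secrets.randbelow(P - 2) + 1
A = pow(G, a, P)  # public values: this is all the relay ever sees
B = pow(G, b, P)

# Both endpoints derive the same key; the relay, lacking a or b, cannot.
key_alice = hashlib.sha256(str(pow(B, a, P)).encode()).digest()
key_bob = hashlib.sha256(str(pow(A, b, P)).encode()).digest()
assert key_alice == key_bob

ciphertext = keystream_xor(key_alice, b"meet at noon")  # what the relay carries
print(keystream_xor(key_bob, ciphertext))  # b'meet at noon'
```

A “back end” backdoor of the kind Sullivan describes would mean the provider holding a copy of that key after all, which is exactly what makes it a target.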
Let’s see what happens, but given the extremely low technology literacy levels among most politicians I’ve got a bad feeling about this one.