End-to-end encryption: What happens next?

The Online Safety Bill (OSB) is still winding its way through parliament. But while much of the analysis so far has been on its provisions to force social media companies to remove “harmful” content, there’s an elephant lurking in the corner of the room. Clause 110 compels not only social media firms but also messaging app providers to identify and take down child sexual exploitation and abuse (CSEA) content.

There’s one big problem here: end-to-end encryption (E2EE), which makes message content unreadable even to providers like WhatsApp. It appears as if the government might be looking at client-side scanning as a solution. Experts I spoke to for an upcoming feature are unconvinced.

What’s client-side scanning?

Put simply, this “accredited technology” would require individuals to download software to their devices. It would run locally, scanning for suspicious keywords and for image content matching a CSEA database, before each message is encrypted and sent. On paper, this preserves E2EE while allowing the authorities to police child abusers. In reality, it will fail on both counts, for several reasons:

  • Researchers have already worked out it could generate too many false positives to be useful, and could be hacked in other ways
  • If client-side scanning were targeted by foreign governments or cyber-criminals, it could put private data at risk
  • The bosses of several big-name messaging apps say they’d rather exit the UK than comply with the OSB, which would also make UK firms and consumers less secure
  • If client-side scanning comes into force, child abusers will simply gravitate to unpoliced apps, as criminals have in the past with services like EncroChat
  • There’s a concern that the technology could be used in the future to police other content types – government mission creep
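To make the false-positive point concrete, here is a deliberately simplified sketch of the scan-before-encrypt flow. Everything in it – the hash function, the database, the message pipeline – is an illustrative invention, not any vendor’s or government’s actual design; real systems use far more sophisticated perceptual hashes, but the failure mode is the same: coarse signatures collide.

```python
# Toy sketch of client-side scanning: hash content on-device against a
# database of known-bad signatures BEFORE the message is encrypted.
# The "perceptual hash" here is invented for illustration only.

def toy_perceptual_hash(pixels):
    """Coarse signature: one bit per pixel, set if above the image mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

FLAGGED_DB = set()  # stand-in for the CSEA signature database

def register_flagged(pixels):
    FLAGGED_DB.add(toy_perceptual_hash(pixels))

def scan_before_send(pixels):
    """Runs locally, before encryption: True means the message is flagged."""
    return toy_perceptual_hash(pixels) in FLAGGED_DB

# A known image is registered in the database...
register_flagged([10, 200, 15, 190])

# ...and an entirely different, innocent image happens to share the same
# coarse signature, so it is flagged too: a false positive.
assert scan_before_send([20, 220, 25, 210])
```

Because the signature must be fuzzy enough to catch re-encoded or cropped copies, collisions like this are inherent to the approach, which is exactly the false-positive problem researchers have flagged.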

Matthew Hodgson, CEO of secure messaging app Element, argued that the new provisions directly contradict the GDPR by undermining encryption.

“It undermines privacy and security for everyone because every secure communication app which happens to have abusive users could be obligated to incorporate a third-party scanning solution, which then means every single user is at risk of that scanning solution being exploited by an attacker to break their privacy,” he told me.

“Any business depending on E2EE for privacy may find themselves at a loss, given encryption vendors would be forced to stop providing their services in the UK, as it is literally impossible to preserve privacy whilst also adding a mechanism to let third parties exfiltrate user data.”

Corelight cyber security specialist, Matt Ellison, cautioned against government putting its faith in a “magic technical solution” that doesn’t exist – adding that Apple abandoned similar plans for client-side scanning after a privacy uproar.

“Ultimately the government is proposing to significantly weaken the security of almost the entire nation, for the ability to perform a lawful intercept of an individual suspected of a crime,” he told me.

“Should all vehicles be fitted with a remote kill switch, in case you are deemed to be committing a crime in your vehicle? Should all houses have the same door key type, with authorities maintaining a master key that could get into everyone’s house to gather evidence without you knowing, again, if you are under suspicion?”

Ellison argued that a smartphone is much more than just a technically advanced mobile phone.

“The reality is that they are an intimate and highly integrated aspect of our lives and mass surveillance approaches such as this are a gross invasion of privacy and civil liberties.”

What should happen?

According to Hodgson, there are plenty of ways law enforcers could hunt down child abusers.

“These include investigation/infiltration of forums where abusers recruit or advertise, or by analysing communication metadata, or by educating users within apps, and in general, to be mindful of abuse,” he added.

“Blanket surveillance which undermines the privacy of everybody is not the answer.”

Ross Anderson, who wrote a paper on this challenging the conclusions of NCSC technical director Ian Levy, agreed that old-fashioned policing techniques are the answer, rather than technology solutions which promise much but deliver little. The debate between law enforcement and government on one side, and encryption specialists and tech vendors on the other, has been raging for years. Throughout, the former have argued that tech wizards simply need to apply themselves more diligently to the task in order to find an answer. The latter retort that E2EE can’t be broken without undermining security for everyone.

So where does that leave us? With Labour backing the bill, it will undoubtedly become law. But what of Clause 110? If it survives unchanged, the best privacy and security advocates can hope for is that its most controversial provisions are never enforced. That’s what happened with the Investigatory Powers Act – which incidentally already gives the British government theoretical powers to force tech firms to break encryption. It will probably happen again.


Data Transfers and a Chaotic Post-Brexit Future

Last week, the Irish High Court made a judgement on transatlantic data flows that could have far-reaching implications for US tech firms and point the way towards economic disaster for the UK.

Yes, it might not have received much coverage at the time, but the court’s decision was a biggie.

It asked the Court of Justice of the European Union (CJEU) to scrutinise the mechanism by which Facebook and many other firms transfer data: standard contractual clauses (SCCs).

Why? Because Austrian law student Max Schrems is still not happy that his personal data could theoretically be snooped on by the US authorities whilst residing in Facebook datacentres over there. His previous battle with Facebook over this issue led to the collapse of the Safe Harbour agreement between the EU and US.

Its replacement, Privacy Shield, is the other main legal mechanism – aside from SCCs – governing data transfers from the EU to the US.

“In simple terms, US law requires Facebook to help the NSA with mass surveillance and EU law prohibits just that,” Schrems said in a written statement following the court’s decision. “As Facebook is subject to both jurisdictions, they got themselves in a legal dilemma that they cannot possibly solve in the long run.”

Emily Taylor, CEO of Oxford Innovation Labs and Chatham House associate fellow, took time out to discuss the issue with me.

“The reference to the CJEU is no surprise, and the fact that the US government applied to be joined as party shows how high the stakes are on all sides – for governments, for big data platforms like Facebook, and for individuals,” she told me.

“The case shows that the Snowden revelations continue to reverberate on both sides of the Atlantic.  The CJEU has taken a consistently hard line against mass data collection and retention, and increasingly relies on the EU Charter of Fundamental Rights. The Charter allows for ‘more extensive protection’ of fundamental rights such as privacy, compared with the more familiar European Convention.”

That spells some uncertain times ahead for Silicon Valley, especially with Privacy Shield also facing an uncertain future.

That’s not all though. The case tells us much about what may happen to post-Brexit Britain.

Our digital economy is worth around £160bn and responsible for over 1.5m jobs, by some estimates. That makes it a vital part of the economy, and means unhindered data transfers with the EU – our biggest trading partner and the largest trading bloc in the world – are absolutely essential.

So how do we fit the square peg of the EU’s requirements around strong privacy protections for citizens into the round hole of the UK’s brand spanking new Investigatory Powers Act? Also known as the Snoopers’ Charter, the new law has given the UK authorities probably more power than any country on earth – save for China and North Korea – to snoop on their own citizens.

“It is difficult to see how the UK’s mass data collection requirements under the Investigatory Powers Act could satisfy the EU Charter and this could have a severe impact on EU-UK data flows, potentially damaging UK business interests post-Brexit,” Taylor concluded.

That should be getting people in all sorts of high places very nervous indeed.


Why Theresa May’s Encryption Plans Are a Danger to Us All

I realise it’s been a while since I posted something up here, so here’s an article I wrote recently for Top10VPN’s new Privacy Central site:

The UK has been unlucky enough to know terrorism for quite some time. Many will remember the IRA campaigns of the 1970s and ’80s. This was an era before smartphones and the internet, yet the Irish paramilitary group continued to wage a successful campaign of terror on the mainland.

It continued to recruit members and organise itself to good effect. Politicians of the modern era, led by Theresa May and various members of her government, would do well to remember this when they launch into yet another assault on Facebook, Google, and the technology platforms that are alleged to provide a “safe haven” for Islamic terrorists today.

Now she is calling for greater regulation of cyberspace, something the independent reviewer of terrorism legislation has openly criticised. Along with increasing moves across Europe and the world to undermine end-to-end encryption in our technology products, these are dangerously misguided policies which would make us all less safe, less secure and certainly less free.

Our “Sliding Doors” moment

Every time a terror attack hits, the government continues its war of words not simply against the perpetrators, but against the tech companies who are alleged to have provided a “safe haven” for them. After all, such rhetoric plays well with the right-wing print media, and large parts of the party.

“Safe haven” has become something of a mantra for the prime minister, alongside her other favourite: “strong and stable”. She argues that terrorists are hiding behind encrypted communications on platforms like Facebook’s WhatsApp and Apple’s iMessage, and are using social media platforms like YouTube to recruit members and distribute propaganda.

“We cannot allow this ideology the safe space it needs to breed. Yet that is precisely what the internet, and the big companies that provide internet-based services, provide,” May said after the London Bridge attacks. “We need to work with allied democratic governments to reach international agreements that regulate cyberspace to prevent the spread of extremism and terrorism planning.”

Part of the regulation May wants to bring in could include fining tech companies that don’t take down terrorist propaganda quickly enough. Max Hill QC, independent reviewer of terror legislation, has rightly questioned this hard-line approach.

“I struggle to see how it would help if our parliament were to criminalize tech company bosses who ‘don’t do enough’. How do we measure ‘enough’? What is the appropriate sanction?” he said in a speech reported by The Times.

“We do not live in China, where the internet simply goes dark for millions when government so decides. Our democratic society cannot be treated that way.”

China is an interesting parallel to draw, because in many ways it offers a glimpse into an alternative future for the UK and Europe; one in which government has total control over the internet, where freedom of speech is suppressed and privacy is a luxury no individual can claim to have.

The problem is that no one sees authoritarianism coming, because it happens slowly, drip by drip. Regulating cyberspace would begin a slow slide into the kind of dystopic future we currently know only from sci-fi films. As Margaret Atwood’s heroine Offred says in her acclaimed novel The Handmaid’s Tale: “Nothing changes instantaneously: in a gradually heating bathtub you’d be boiled to death before you knew it.”

In many ways, we sit today at a Sliding Doors moment in history. Which future would you prefer?

The problem with backdoors

End-to-end encryption in platforms like WhatsApp and on our smartphones and tablets is something Western governments are increasingly keen to undermine as part of this clampdown. It doesn’t seem to matter that this technology keeps the communications of consumers and countless businesses safe from the prying eyes of nation states and cybercriminals – it’s also been singled out as providing, you guessed it, a “safe space” for terrorists.

The Snoopers’ Charter already includes provisions for the government to force tech providers to effectively create backdoors in their products and services, breaking the encryption that keeps our comms secure. In fact, the government is trying to sneak through these provisions without adequate scrutiny or debate. They were leaked to the Open Rights Group and can be found here.

It remains to be seen whether the British government could actually make this happen. An outright ban is unworkable and the affected tech companies are based almost entirely in the US. But the signs aren’t good. Even the European Commission is being strong-armed into taking a stance against encryption by politicians keen to look tough on terror in a bid to appease voters and right-wing newspaper editors. Let’s hope MEPs stand up to such calls.

The problems with undermining encryption in this way are several-fold. It would give the state far too much power to pry into our personal lives, something the UK authorities can already do thanks to the Investigatory Powers Act (IPA), which has granted the government the most sweeping surveillance powers of any Western democracy. It would also embolden countries with poor human rights records to do the same.

Remember, encryption doesn’t just keep terrorist communications “safe” from our intelligence services, it protects journalists, human rights activists and many others in hostile states like those in the Middle East.

More importantly, it protects the communications of all those businesses we bank with, shop with, and give our medical and financial records to. The government can’t have its cake and eat it: recommending businesses secure their services with encryption on the one hand, but then undermining the very foundations on which our economy is built with the other.

Once a provider has been ordered to create a “backdoor” in their product or service, the countdown will begin to that code going public.

It’s inevitable.

Even the NSA and CIA can’t keep hold of their secrets: attackers have managed to steal and release top secret hacking tools developed by both. In the case of the former this led to the recent global ransomware epidemic dubbed “WannaCry”.
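The mechanics are easy to sketch. In the toy cipher below (an illustrative XOR construction, emphatically not real cryptography), decryption depends only on possessing the key – so a leaked escrow or “backdoor” copy of that key is exactly as powerful as the original, and the maths cannot tell police apart from criminals.

```python
# Toy demonstration of why key escrow is fragile: whoever holds the key
# reads the message. The cipher is an illustrative XOR stream construction,
# NOT secure cryptography, and the key/message names are invented.
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from the key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR encrypt/decrypt: applying it twice with the same key round-trips."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

users_key = b"known only to the two endpoints"
ciphertext = xor_cipher(users_key, b"private message")

# End-to-end: only the endpoints hold users_key, so only they can decrypt.
assert xor_cipher(users_key, ciphertext) == b"private message"

# With a backdoor, an escrowed copy of the key exists somewhere. Anyone
# who steals that copy decrypts everything, indistinguishably from the
# legitimate parties.
stolen_escrow_copy = users_key
assert xor_cipher(stolen_escrow_copy, ciphertext) == b"private message"
```

There is no mathematical property separating a “lawful” holder of the escrowed key from a thief, which is why a backdoor for one is a backdoor for all.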

Why should we set such a dangerous precedent, putting our data and privacy at risk, while the real criminals simply migrate to platforms not covered by the backdoor program?

“For years, cryptologists and national security experts have been warning against weakening encryption,” Apple boss Tim Cook has said in the past. “Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.”

In short, we need more police officers, constructive relationships with social media companies, and smarter ways of investigating terror suspects. Dragnet surveillance, encryption backdoors and more internet regulation are the quickest way to undermine all those democratic freedoms we hold so dear – and send us hurtling towards that dystopic authoritarian future.


One Year to GDPR Compliance Deadline: Time to Panic Yet?

Europe’s new data protection laws might have been over a decade in the making, but it would take about as long again to read every piece of advice that’s since been produced on how to comply. In search of some simple answers to a typically complex piece of European legislation, I asked a few legal experts for their thoughts.

With 13 months to go before the compliance deadline, organisations across the country will be scrabbling to ensure they’re not one of the unlucky ones caught out in the months following 25 May.

Start with the Data

Most experts I spoke to were in agreement that firms need to start by mapping their data – after all, you’ve got to know where it is and what you do with it first before working out how to keep it safe.

“For those that are compliant with existing laws, GDPR is going to be an evolution. For the others, it’s going to be a deep, radical change. In general, I think that every organisation should be working on assessing their current practices in light of GDPR,” Forrester analyst Enza Iannopollo told me.

“My advice is, regardless of the kind of support an organisation chooses, it must put together a team of internal people – hopefully the privacy team – and make sure that that team leads the work. Compliance with GDPR is not a one-off effort, but an ongoing process that has to be ingrained in firms’ business model,” she said.

Change the culture

That cultural change might be the hardest thing for organisations to achieve, although a good start is hiring a Data Protection Officer (DPO) – one of the key requirements of the GDPR. Another is the privacy impact assessment, which PwC’s US privacy lead, Jay Cline, recommends as a key stage once you’ve completed a data inventory.

“Data protection impact assessments (DPIAs) are the eyes and ears of the privacy office throughout the company,” he told me by email. “DPIAs are how chief privacy officers enlist the help of the whole company to keep their privacy controls current with all the change going on in the company.”

For Alexandra Leonidou, Senior Associate at Foot Anstey, there’ll be a key role for non-IT functions inside the organisation.

“Who needs to know about the GDPR? Who are the key stakeholders?  This isn’t just something for IT, information security teams or data officers. Boards should be aware of the risks, and HR teams need to think about employee data. Getting GDPR compliance right will be critical for marketing and communications teams’ activity,” she told me.

“You will need to engage key stakeholders and implement measures that leave you with an acceptable level of commercial risk.”

Leonidou was also keen to stress the need for independence in the DPO role.

“Guidance from Europe suggests that this role is likely to be incompatible with certain existing C-suite executives,” she explained. “The awareness-raising that follows on from the allocation of accountability will be an ongoing process.”

For those still in the dark, some useful free resources include the Article 29 Working Party and our very own Information Commissioner’s Office. It’s also expected that even post-May 25, the regulators will give firms a little bedding-in time before they start going after some high-profile offenders.


GDPR and Snoopers’ Charter: A Marriage Made in Hell

All over Europe organisations of all sizes are currently scrabbling desperately to get their house in order for 25 May 2018. What happens then? Only the biggest shake-up to Europe’s data protection laws in nearly a generation. The implications are immense, both in terms of the scope of the new regulation and the companies who will now be held liable.

There’s just one problem: the UK’s Snoopers’ Charter, or Investigatory Powers Act. Its enshrining into law of mass surveillance powers could create major problems down the line, possibly putting UK firms at a competitive disadvantage precisely at a time when they need the digital economy most.

What’s the problem?

Let’s start at the beginning. UK firms will have to comply with GDPR, even with Brexit looming. That’s because the extrication of the country from the EU will take at least two years from whenever Article 50 is triggered – presumably in March – and probably much, much longer. And even beyond that, the UK government has said in its Brexit white paper:

“The European Commission is able to recognise data protection standards in third countries as being essentially equivalent to those in the EU, meaning that EU companies are able to transfer data to those countries freely.

As we leave the EU, we will seek to maintain the stability of data transfer between EU Member States and the UK.”

This implies that the UK will, broadly speaking, harmonise its laws with the GDPR. But the bulk data collection powers granted by the IPA mean the regime is certainly not equivalent to that in Europe. Emily Taylor, CEO of Oxford Innovation Labs and associate fellow of Chatham House, told me that the European Court of Justice (CJEU) shows no signs of shifting its stance on bulk data collection – having recently ruled against the forerunner to the Snoopers’ Charter, DRIPA.

“Other elements of the judgment are likely to cause problems with the Investigatory Powers Act: the CJEU says that targeted data retention may be allowable, but must be restricted solely to fighting serious crime; warrants must be signed off by a court, not a minister; and the data concerned must be retained within the EU.  All these will potentially conflict with core elements of the IP Act,” she told me.

If it’s kept as-is, the Act could therefore impact the legality of data transfers between Europe and a newly independent UK, which would be bad news for most firms reliant on a thriving digital economy.

“The impact of conflicts between the GDPR and our Investigatory Powers Act may be to hamper the competitiveness of UK tech, particularly as the GDPR seeks to protect EU citizens’ data wherever it will be processed,” she argued.

Not great for America

This is a hot-button issue for Europe. In fact, it’s the reason why data transfers to the US were put under threat after Safe Harbour was torn down over fears of US authorities snooping on Europeans’ data. Despite a new agreement – Privacy Shield – being put in place, there could still be bumps in the road ahead.

“Transatlantic data flows will not be legal unless there is a robust framework in place to offer EU citizens’ data equivalent protection to what is enjoyed in the EU,” said Taylor.

“President Trump’s ‘America First’ policy is likely to renew tensions over Privacy Shield – a shaky compromise which was hurriedly reached following the CJEU’s obliteration of its predecessor ‘Safe Harbour’.”

KPMG’s global privacy advisory lead, Mark Thompson, told me that firms outside of Europe that need to comply with the GDPR are better off keeping data on European citizens inside the EU, so as not to fall foul of any changes to data transfer agreements.

“Despite the USA and EU having some cultural alignment, there is potential for significant culture clash between the EU’s view of a fundamental human right to privacy and the US view on what constitutes privacy, which is significantly different,” he added.

We’ll have to wait a while to see what the fallout of all this is. But with the UK government unlikely to countenance any changes to the IPA, there could be bad news for the country’s digital economy in the next few years if nothing changes.


Data Protection Day: Shot in the Arm or a Waste of Time?

It’s Data Protection Day next Thursday, if you hadn’t noticed – and you’re forgiven for not doing so. I only remembered it after researching an analysis piece today for Infosecurity Magazine.

The idea is to raise awareness among consumers to think twice about leaving a bigger digital footprint online than they already have, and to try and get businesses to take data privacy more seriously.

On both counts it’s a challenging prospect, according to many of the experts I spoke to.

David Gibson, vice president of strategy at Varonis, told me that improving privacy protection comes down to better monitoring for fraud and abuse.

“The proof that traditional methods don’t work is in the increasing frequency and magnitude of data breaches related to unstructured data,” he argued.

“Not only is there more data to worry about, but it’s containing more sensitive and valuable information and it’s getting easier for attackers to exfiltrate that data since it’s typically not monitored. If what you’re trying to steal isn’t being watched, you have a much better chance of getting away.”

Rackspace senior director of legal, Lillian Pang, admitted that firms still don’t prioritise data privacy at a board level, and this needs to change if things are to get better for consumers.

“Only then will firms start taking it seriously and filter down the privacy compliance needs to the ground level of its business. In some respects, you could say that privacy needs to be led from the top level of any business and administered from the ground level,” she told me.

“Many firms pay lip service to the importance of data privacy but few really understand or recognise that a robust data privacy program in a firm solidifies its information security and helps to further safeguard the firm’s business.”

The EU General Data Protection Regulation could be the push that many firms need to start taking the issue seriously, according to Gemalto data protection CTO, Jason Hart.

“The EU Data Protection Regulation is set to be finalised later this year, but companies need to start taking the steps to change how they protect their data now, otherwise they could find themselves subject to compliance penalties, and also put their reputation and consumer confidence at risk,” he warned.

“As the reporting requirements of the new EU regulation make data breaches more visible, we can expect the economic and business consequences of a breach to continue to escalate, so businesses need to start taking steps to ensure they are prepared for when new regulation comes into force.”

So are awareness-raising exercises like Data Protection Day even worth the effort? Well, the general consensus is that anything like this is probably a bonus, although the jury’s out on how effective it can be.

“Although Data Privacy Day is a great opportunity to raise awareness of the issue, understanding the importance of protecting data needs to be an all year round initiative,” said Hart. “Businesses need to realise the importance of the data they hold in their systems and how the loss of this can impact their customers.”

Data Protection Day (Data Privacy Day in the US) is on 28 January.


Why F-Secure and Others Are Opposing the Snoopers’ Charter

It’s widely expected that next week the government will unveil details of its hugely controversial Snoopers’ Charter, aka the Investigatory Powers Bill. To preempt this, and in a bid to influence the debate, cyber security firm F-Secure and 40 other tech signatories presented an open letter opposing the bill.

Most controversially, the bill is expected to force service providers to allow the authorities to decrypt secret messages if requested to do so in extremis. This is most likely to come in the form of some kind of order effectively banning end-to-end encryption.

I heard from F-Secure security adviser Sean Sullivan to find out why the bill is such a bad idea.

To precis what I wrote in this Infosecurity article, his main arguments are that forcing providers to hold the encryption keys will:

  • Make them a likely target for hackers, weakening security
  • Send the wrong signal out to the world and damage UK businesses selling into a global marketplace
  • Lead to China, or other potentially hostile states in which a service provider operates, also requesting these encryption keys – undermining security further
  • Be useless, as the bad guys will end up using another platform which can’t be intercepted

I completely agree. Especially with Sullivan’s argument that the providers would become a major target for hackers.

“End-to-end encryption makes good sense and is the future of security,” he told me by email. “Asking us to compromise our product, service, and back end would be foolish – especially considering all of the back end data breach failures that have occurred of late. If we don’t hold the data, we cannot lose control of it. That’s just good security.”

One other point he made was the confusion among politicians about tech terminology as basic as “backdoor” and “encryption”.

“A lot of UK politicians end up putting their foot in their mouth because they don’t properly understand the technology. They try to repeat what their experts have told them, but they get it wrong. UK law enforcement would probably love to backdoor your local device (phone) but that’s a lost cause,” he argued.

“The politicians (who actually know what they’re talking about) really just want back end access. As in, they want a back door in the ‘cloud’. They want to mandate warranted access to data in transit and/or in the back end (rather than data at rest on the device) and fear that apps which offer end-to-end encryption, in which the service provider doesn’t hold any decryption keys, are a threat.”

Let’s see what happens, but given the extremely low technology literacy levels among most politicians I’ve got a bad feeling about this one.