Can tech companies do more to stop terrorism?

A vigil is held at Potters Fields Park in London on Monday for the victims of the terror attack on London Bridge and at Borough Market.
(Daniel Leal-Olivas / AFP/Getty Images)
British officials renewed demands this week for stricter regulation of the software industry, to prevent the online services it develops from becoming a haven for terrorists.

Conversations on WhatsApp, Telegram and Signal can’t be tapped by police as easily as traditional phone calls or emails, hindering the ability of law enforcement officials to spy on terrorists coordinating activities. Though evidence is often scant, politicians have suggested that several attacks in Europe over the last couple of years could have been prevented if investigators had been able to monitor secure online chats.

“We cannot allow this ideology the safe space it needs to breed,” British Prime Minister Theresa May said in the wake of a terrorist rampage that left seven people dead in London on Saturday.

Officials around the world have called on companies that produce chat apps to find ways to be more accommodating to government investigators. And they’ve also demanded that the world’s most used social media services do more to stop the spread of offensive material, such as beheading videos or recruitment messages from terrorist organizations.

It’s been a challenge, though, for lawmakers to strike a balance between the need for secure communications and law enforcement’s need to gather evidence. Pressure on the software industry last year led to “encouraging” gains for the counter-terrorism community, European leaders said last week. What the latest criticism might bring about is unclear.

The tech industry has gotten more aggressive with encryption

Encryption programs scramble the contents of messages or other files using a formula that integrates special passwords. Only people who know the secret phrase can decrypt or unscramble the information.

With hackers increasingly trying to get at people’s data, whether to steal trade secrets or credit card numbers, app makers have turned to encryption as a way to prevent data breaches. For example, if hackers managed to infiltrate WhatsApp systems, they still wouldn’t be able to read people’s conversations, assuming the encryption is set up correctly. In that situation, only the users in a conversation have the requisite keys.
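The key-holding principle described above can be sketched with a deliberately simplified toy cipher. This is an illustration of the idea only, not a real cryptographic scheme: production apps like WhatsApp use vetted primitives such as the Signal protocol, not a hash-derived XOR stream. The passphrase and message here are invented for the example.

```python
import hashlib
from itertools import cycle

def derive_key(passphrase: str) -> bytes:
    # Derive a fixed-length key from a shared secret phrase.
    # (Real systems use slow key-derivation functions and
    # authenticated ciphers, not a bare hash.)
    return hashlib.sha256(passphrase.encode()).digest()

def toy_encrypt(plaintext: bytes, passphrase: str) -> bytes:
    # XOR each byte against the key stream -- illustration only, NOT secure.
    key = derive_key(passphrase)
    return bytes(b ^ k for b, k in zip(plaintext, cycle(key)))

def toy_decrypt(ciphertext: bytes, passphrase: str) -> bytes:
    # XOR is its own inverse, so decryption reuses the same operation.
    return toy_encrypt(ciphertext, passphrase)

message = b"meet at the bridge"
scrambled = toy_encrypt(message, "correct horse battery staple")

assert scrambled != message  # unreadable without the secret phrase
assert toy_decrypt(scrambled, "correct horse battery staple") == message
assert toy_decrypt(scrambled, "wrong guess") != message
```

The last assertion captures the point in the paragraph above: an intruder who copies the ciphertext but lacks the key gets gibberish, not conversations.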

Governments are supportive of such technology, and many regulations now either mandate it or absolve firms of liability when they make encryption part of their security procedures.

“Everybody who works in IT recognizes encryption is one of the lines of defense,” said Nigel Hawthorne, European privacy lead for Campbell, Calif., cybersecurity start-up Skyhigh Networks.

But encryption also acts as a block on mass surveillance of online chatter. Authorities had come to rely on searching through the online data to identify enemy maneuvers, just as they had done with letters, radios and phone calls before.

The majority of Internet traffic is now encrypted, which limits the amount of information accessible to snooping hackers and investigators.

But plenty of data remains only loosely protected — most email, for example — because the companies that provide the services hold decryption keys of their own. That means they can turn over messages when ordered by a court or law enforcement body. It’s certainly the case with apps that rely on advertising for revenue, because scanning conversations helps them decide which ads to show.

Authorities have been crafty. Leaked National Security Agency files in 2013 said agency experts had found ways to defeat encryption and access some discussions on the Internet.

But exploiting security gaps and coming up with workarounds has been viewed as inefficient and insufficient. Continued complaints from officials in the U.S., France, Germany, Britain and elsewhere suggest that they want broader access to online conversations.

“If you are law enforcement, you want to have all the powers you can think of and there’s no doubt full encryption stymies that,” Hawthorne said. And “we’re seeing more of the world moving toward all-encrypted technology.”

Political pressure has led to action from tech companies

The wave of criticism following the attacks in London and Manchester has consisted mostly of generalities about what officials want from the tech industry.

That might be because many of the easiest-to-solve issues have been addressed. A slew of Islamic State attacks has led Facebook, Twitter, YouTube and other social media platforms to pledge quicker removals of uploads by terrorists. Each of the services bans violent imagery or posts that incite violence or hatred.

The companies have deployed technology to identify previously banned material, and they’ve increased the size of their moderation teams to respond faster when users flag something objectionable.

Politicians still want the companies to be more aggressive about developing software that automatically catches never-before-seen terrorist material. But that’s a challenge that lacks an overnight solution, technologists say.

There’s a growing acknowledgement, including from Congress, that the use of encryption shouldn’t be prohibited. What May and other leaders may want to see now instead is tech companies offering to help devise surveillance measures that could be effective despite messages being untappable.

Authorities may seek new regulations that allow them to hack into apps or gadgets. In some cases, the tech companies could be party to the legal hacking — perhaps being forced to swap the encryption formula so that any messages in a thread from that point forward are readable by investigators.

Governments could ask software vendors to be more forthcoming about the information that isn’t scrambled, which could include the names of recipients, the locations where messages were typed and the times messages were sent. Or there could be rules requiring social media companies to proactively turn over data when certain suspects reach out to a new contact.
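The distinction above — an unreadable message body surrounded by readable routing details — can be sketched as a hypothetical message "envelope." The field names and values here are invented for illustration; they are not drawn from any real app's data format, and the scrambler is a toy stand-in for real encryption.

```python
import hashlib
import json
from itertools import cycle

def toy_encrypt(plaintext: bytes, passphrase: str) -> bytes:
    # Toy XOR scrambler for illustration only -- not a real cipher.
    key = hashlib.sha256(passphrase.encode()).digest()
    return bytes(b ^ k for b, k in zip(plaintext, cycle(key)))

# Even when the body is unreadable, the envelope around it can stay in
# the clear: who wrote to whom, when, and from where. All identifiers
# below are hypothetical.
envelope = {
    "sender": "user-4821",
    "recipient": "user-9034",
    "sent_at": "2017-06-05T10:14:00Z",
    "origin": "London, UK",
    "body": toy_encrypt(b"meet at the bridge", "shared secret").hex(),
}

print(json.dumps(envelope, indent=2))
```

A provider that holds only the envelope could comply with a metadata request without ever being able to read the body — which is roughly the kind of cooperation the paragraph above describes.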

Collaboration also may be possible in helping authorities access in real time whatever data is legally available, instead of a slower, back-and-forth process, said Daniel Weitzner, principal research scientist at MIT’s Computer Science and Artificial Intelligence Lab.

Weitzner, a former Obama administration official, said a discussion about data retrieval speeds would be more fruitful than “dead-end arguments about putting backdoors” in encryption technology — arguments the tech world has fought because any vulnerability opened to law enforcement could be discovered and exploited by criminal hackers as well.

“It’s easier to get there and work out standard procedures,” Weitzner said.

The European Commission and British ministers have said they would introduce in the coming weeks specific legislative proposals on apps, data and encryption.

paresh.dave@latimes.com

Twitter: @peard33

Advertisement