11 Mar, 2016 08:50

Giving govt ‘backdoors’ to encrypted data will compromise security even more – Cryptography pioneer

This man helped lead a revolution in computer cryptography decades ago - but with more and more information stored online today, concerns over the safety of our digital communications are bigger than ever. The world faces a dilemma between safety and privacy. The American government is trying to force corporations to hand over encryption keys, saying it is needed to fight terror. The infrastructure of nations is becoming ever more dependent on digital networks, opening up another vulnerable spot. With the entire planet steadily marching into the digital era, how secure are our secrets? Or should the locks be left open for the sake of a safer society? We ask the man who pioneered encryption technology and whose inventions are used everywhere today: Professor Martin Hellman is on Sophie&Co.


Sophie Shevardnadze: Cryptography pioneer, Professor Martin Hellman, welcome to the show, it's really great to have you with us. Now, Professor, the FBI's counter-terrorism experts say Apple and Google are helping terrorists with strong encryption technology. Apple automatically encrypts messages on FaceTime, new iPhones are locked, and not even Apple can access the information on the phone. Is encryption really making us less safe, is it helping terrorists?

Martin Hellman: Encryption does both. It makes us more safe and less safe, and there's a trade-off there. Back in the late 70s and early 80s, when I was fighting with the NSA, Bobby Inman, who was the Director of the NSA at that time, said things like "the FBI is now safe". But just a year ago, in an interview in the Stanford Alumni magazine, he said that if he could have foreseen the changes - the way Chinese hackers, for example, are trying to get into American defense establishments - he would have been much more worried, as I was, about securing encryption rather than trying to weaken it. So there's a trade-off. I agree more with him today, too, so we have to look at the trade-off.

SS: Now the debate over handing the government an encryption key has intensified following the recent attack in California. Using the perpetrators' conversations as an example, the security services want a back door into private encryption. Is all this a power grab by law enforcement, or do you think their concerns are legitimate?

MH: I think it's both: the concerns are legitimate, and it is a power grab - they're using these incidents without really having full information at this point about what they would have gotten. If they can't get at the data, how do they know it would have let them stop these incidents? So they have legitimate concerns, and, yes, if I could arrange it so that law enforcement could get criminal and terrorist communications but everyone else's were protected, I'd do it. The big problem is that the government is not being specific about how it hopes to do these things. And not just the American government - the British and the French have been saying very similar things. A very similar problem occurred about 20 years ago, when I served on a National Research Council committee - that's our National Academy of Sciences' research arm - which, at Congressional request, looked at a very similar, vague proposal from the government. We spent a lot of time trying to figure out how to make it work, and, as experts in cybersecurity, we could not figure out how to make it work back then, and I don't see how to make it work today in a way that gives the government what it wants without creating tremendous problems.

SS: But do you think the government should be handed an encryption key? Right now?

MH: What encryption key? That's part of the problem. See, they make it sound simple, like there's a single encryption key, but it's much more complex than that. Is this a key that lets them get into everything, or into one conversation? It's a very vague proposal. It was a vague proposal 20 years ago, too. That committee, by the way, had a former Attorney General representing law enforcement's interests - the FBI reports to the Attorney General - and it had a former Deputy Director of the NSA serving on it, so our national security interests were represented, and we reached a unanimous conclusion in our report. While the report didn't say it quite so explicitly, we were unable to figure out how to make this kind of access work. It's a reasonable-sounding request, but the devil is in the details. We need more details on how the government would hope to do this. We could not see how to do it then, and we don't see how to do it today.

SS: The FBI, by the way, actually wants to make it illegal for Google and Apple to encrypt their devices. Is the government afraid of the companies' encryption capabilities? I mean, surely they can't introduce anything that can't be hacked by the government itself?

MH: Encryption today can be made so secure that even the U.S. government, or the Russian government, could not break it. So, yes, it is possible to build that in, and that is what appears to have been built in. As I said before, there's a trade-off: if you give the government access, then you create all kinds of problems. For example, if the proposal of 20 years ago, which was less vague than the one today, had been enacted, there would have been, basically, a master key - an escrow key - and the database of escrow keys becomes a really tempting target for criminals, for foreign governments, and just for people in general who are up to no good. Complexity is the enemy of security, so when you introduce complexity to allow access, you also introduce potential security flaws. A good example: the government in Athens was being listened in on because they had bought telephone gear - not an encryption app, just telephone gear - that allowed wiretapping, and someone got into it and was listening to the Prime Minister.
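
Hellman's objection can be made concrete with a toy model. The sketch below is not the actual escrowed-encryption proposal of the 1990s, just a minimal illustration, using the Python cryptography package (an assumed dependency), of why a key-escrow database is such a tempting target: every message carries its session key wrapped under one escrow key, so stealing that key and the database reveals all traffic.

```python
# Minimal illustration (not the real 1990s escrow design): each message's
# session key is also wrapped under a single escrow key held by an authority,
# so compromising the escrow key exposes every message ever sent.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()      # held by the escrow authority
escrow_db = []                          # the "tempting target" Hellman describes

def send(plaintext: bytes) -> bytes:
    session_key = Fernet.generate_key()                  # fresh per-message key
    ciphertext = Fernet(session_key).encrypt(plaintext)
    wrapped = Fernet(escrow_key).encrypt(session_key)    # law-enforcement access field
    escrow_db.append((wrapped, ciphertext))
    return ciphertext

send(b"meet at noon")
send(b"transfer the funds")

# Anyone who obtains escrow_key plus the database recovers everything:
for wrapped, ciphertext in escrow_db:
    session_key = Fernet(escrow_key).decrypt(wrapped)
    print(Fernet(session_key).decrypt(ciphertext))
```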

SS: It's been reported that ISIS is taking technology quite seriously. They are said to be training their members in using encrypted services, concealing communication online, et cetera. Do you think your baby, your invention, is becoming a dangerous tool?

MH: Oh, absolutely. But it's very similar... Imagine that automobiles had been invented in the classified community many years ago, and then someone discovered how to make an automobile in the open world. You can see why governments would be very concerned: up until now only law enforcement had police cars, while criminals were using horses and buggies, and this is going to make their work a lot harder. But giving automobiles to individuals and to commerce has tremendous benefits for society - ambulances save lives - so you need to look at the trade-offs.

SS: Telegram and others are now coming under fire because ISIS members use them to talk covertly. If those messengers allow law enforcement in, can ISIS develop its own software to send encrypted messages? How difficult is it, by the way?

MH: That's another problem: even if the apps that go on smartphones, tablets, et cetera, have government access - if we can figure out how to do that without introducing security flaws - anyone who wants to and is really dedicated can add their own encryption on top. However, and I did point this out in the committee 20 years ago, very few people will do that. Perhaps they should, but a lot of people won't.
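
A minimal sketch of the point Hellman is making here: a determined user can encrypt before any messaging app sees the text, so mandated access to the app alone does not reach the content. The example assumes the Python cryptography package and that the two parties have already agreed on a key out of band.

```python
# Layering your own encryption on top of any messenger (toy example).
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()   # agreed between sender and recipient beforehand
box = Fernet(shared_key)

outgoing = box.encrypt(b"the app only ever sees this ciphertext")
# Paste `outgoing` into any messenger; the service (backdoored or not) sees only gibberish.

print(box.decrypt(outgoing).decode())  # the recipient, holding shared_key, reads the text
```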

SS: Law enforcement can still trace communications metadata - who talked to whom, and when. Isn't that enough to trace suspicious activity? I mean, despite initial claims by officials, the terrorists who carried out the Paris attacks didn't even bother to encrypt their messages.

MH: Well, you've hit an important point, and this is something I said 20 years ago in that committee that's even truer today. Let's say, for the sake of argument, that today law enforcement gets 100 times as much raw data as it did in the "good old days" back in the 1950s, when, with a warrant, they could tap a line and understand exactly what was being said. They're getting things like cellphone locations; we have automated license plate readers; we have facial recognition - not used so much in this country, but, for example, in Great Britain. So let's say, for the sake of argument, that law enforcement is getting 100 times as much raw data, and that even 90% of it is dark, encrypted - which I think is much more than is actually the case. They would still be getting 10 times as much useful information as they did in the "good old days" of the 1950s. But 90% of the time they'd be very frustrated. So we need to look at this and see how important this information would actually be, and whether there's adequate information there now, without introducing security flaws and other major problems.
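
Hellman's arithmetic, worked through as a quick sketch using the figures he gives (100 times the raw data, 90 percent of it "dark"):

```python
# Back-of-the-envelope from the interview: even under pessimistic assumptions,
# law enforcement would still see far more usable data than in the 1950s.
baseline_1950s = 1.0                       # usable data in the "good old days" (arbitrary unit)
raw_today = 100 * baseline_1950s           # assume 100x as much raw data collected today
dark_fraction = 0.90                       # assume 90% of it is encrypted and unreadable

usable_today = raw_today * (1 - dark_fraction)
print(usable_today / baseline_1950s)       # -> 10.0: still ten times the 1950s level
```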

SS: With all the access to information, the FBI knew about the Tsarnaev brothers, the Belgian authorities knew about the Paris attackers - so what's the point of breaking encryption if the information isn't used anyway?

MH: Well, before September 11, too, there were lots of warning signs that were overlooked. So, yes, there are probably much better ways to improve our intelligence than introducing what's called "exceptional access" - these back doors.

SS: Now, you have been part of the...

MH: By the way, there's a new National Research Council study committee being formed right now. It's just been approved to look at this problem again, and you're going to have cybersecurity experts and representatives from law enforcement, national security, and privacy advocates on this committee. This is, again, part of the National Academy of Sciences, and they will look at this. It will be a much more expedited effort than the one we did 20 years ago, which took two years. We should wait for the results of that committee, we should wait for their conclusions, rather than trying to form an opinion now, in an emotionally charged environment after the Paris attacks and San Bernardino. We're likely to make the wrong decision. We really need to think these things through.

SS: Professor, you have been part of the computer privacy debate for quite some time. Now, privacy campaigners say they're not opposed to targeted surveillance that monitors terrorism suspects. They're against bulk data collection on everybody else. Does this bulk collection of information really make a difference?

MH: It can. There was a student at Stanford who did a study and looked at this question... There's a limit on who you can target - you have to be, I think, within three hops of a target of interest. But because of frequently called numbers, like a cell phone company's customer support - I call them and a terrorist calls them - I am only one or two hops away from the terrorist by that measure.
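
To illustrate the "hops" rule Hellman refers to, here is a small sketch with an invented call graph: a widely dialled number such as a carrier's support line places unrelated callers within two hops of a target, well inside a three-hop collection rule.

```python
# Hypothetical call graph; breadth-first search gives the hop distance
# (number of call links) between two phone numbers.
from collections import deque

calls = {
    "target":       {"support_line", "associate"},
    "support_line": {"target", "me", "many_other_customers"},
    "me":           {"support_line"},
    "associate":    {"target"},
}

def hops(graph, start, goal):
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for neighbour in graph.get(node, ()):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, dist + 1))
    return None

print(hops(calls, "target", "me"))   # -> 2: already inside a three-hop rule
```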

SS: The CIA has lashed out against Ed Snowden, saying his revelations about NSA spying prompted terrorists to use encryption, making the fight against terrorism harder. Is this them trying to deflect the blame for a missed attack?

MH: Certainly, Snowden's revelations had both positive and negative impacts. I mean, they alerted terrorists to certain problems, although I think they already knew that the CIA - or even more so, the NSA - was able to listen in on them; that's why Al-Qaeda moved away from using satellite phones, things like that. So, I think they knew that. But, yes, it also alerted us to the fact that there were things going on that should not have been going on, and so it had some positive impact.

SS: Despite the government crackdown, companies are making their security features stronger. Is this an attempt to cash in on the public's distrust of the government in the wake of the NSA scandals?

MH: I wouldn't say they're cashing in. I think they're taking action to improve security because they see the need for it, and their customer base sees the need for it. This is something I was pointing out 40 years ago, actually, before it was really that much needed. We were a little bit ahead of our time in pointing out the need for secure encryption. I remember, 30-40 years ago, saying: "I can foresee the day when you might buy a loaf of bread with an electronic funds transfer." Back in those days, electronic funds transfers were multimillion-dollar transactions between banks. I couldn't call it a debit card, because there were no debit cards, but I pointed out the need for protection: when all your financial transactions are digital, you need security, you need encryption.

SS: The new Freedom Act curbed the NSA's bulk phone data collection, but if it only applies to phone records, does it really make a big difference? I mean, the NSA's spying on the Internet and social media will continue, and that's where everything really happens…

MH: The Internet is a tremendous boon to law enforcement and to espionage - being able to see what's going on, because all the information is in computer-readable form. To put that in perspective, I'd have to redo the calculation, but I think it was something like billions, maybe even trillions, of words that can be searched for a dollar if they're in computer-readable form. Compare that with the cost of having human beings listen in on telephone calls, or automated word recognition. Once you have information like email in computer-readable form, it's really easy to sort through it, looking for keywords, key phrases, things like that.
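
A trivial sketch of the economics Hellman is pointing at: once messages are machine-readable, scanning them for keywords is a few lines of code rather than hours of human listening. The corpus and keywords below are invented.

```python
# Keyword sweep over machine-readable text (illustrative only).
corpus = [
    "lunch at noon tomorrow",
    "wire the payment before friday",
    "the package arrives at the usual place",
]
keywords = {"payment", "package"}

hits = [msg for msg in corpus if keywords & set(msg.split())]
print(hits)   # messages flagged for a human analyst to look at
```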

SS: A new bill, called the "Cybersecurity Information Sharing Act", grants the NSA and other agencies access to Internet user data without any warrant. Can the government be trusted not to abuse the powers it's given under CISA? What do you think - just your personal take?

MH: Governments are made of people, and there are good people, there are cautious people, and also some who push the limits. We've seen that with some of Snowden's revelations; we saw it during the Nixon White House, with the enemies list, the use of the IRS, and the misuse of the FBI - Martin Luther King was being harassed by the FBI. So there's always a potential for abuse in any government, and we need to make sure that we have adequate safeguards.

SS: How do you make sure, though?

MH: By not granting too-sweeping powers. For example, the FISA court - the Foreign Intelligence Surveillance Act court that approves these secret warrants. It was only through Snowden's revelations and the debate that followed that I learned that all FISA court judges are appointed by the Chief Justice of the U.S. Supreme Court. That's a single point of failure: if the Chief Justice has the wrong perspective, he will appoint people who can abuse our civil liberties; if he's the right person, that won't happen. But we need more checks and balances.

SS: We keep bringing up Snowden. What do you think of the man? Do you think he’s a traitor, or do you think he’s a hero?

MH: I think he will be seen as both by different people, and some people see him as being part of each. I hesitate to call him a "traitor", though. I'm friendly with Daniel Ellsberg, who revealed the Pentagon Papers during the Vietnam War, and one of the things that's really hard for him, one of the things he really hates, is being seen as a traitor. Knowing what I know now about the Vietnam War, what he did was very heroic. So I think only time will tell, but history shows that we need to be cautious about calling anyone a "traitor" who reveals abuses of power.

SS: Encryption and digital technology are used in everything now, right? Not just smartphones, but also banking, the military, even nuclear weapons. Does this make us vulnerable to a new kind of terrorism - not a physical attack, but a cyberattack on our critical infrastructure?

MH: There's concern about that, and there's debate as to how real the danger is. There have been some reports that a cyberattack could kill something like a third of the American population, if it brought down the electric grid in ways that fry transformers that take years to replace, and the whole food distribution system were to melt down, literally. There are other people who say that's an exaggeration, and it's hard to say at this point where the truth lies. We need to be paying more attention to cybersecurity, and we need to be paying more attention to where we're headed, not just where we are now, because right now the grid might not be that connected, but if we continue on the path we're on, it might well be in that place 5 or 10 years from now.

SS: Are the encryption locks protecting nuclear weapons safe enough? Is there a way for terrorists to infiltrate nuclear weapons systems online and use them to do harm?

MH: I doubt that. That's highly classified information and I don't have access to it, but there is one interesting point about nuclear weapons. Bruce Blair, who was a Minuteman launch control officer in the 70s - and I think he still heads Global Zero, which is working for nuclear disarmament - had a column, 10 years ago, in which he told Robert McNamara, the former Secretary of Defense, something that he said made McNamara turn livid. After the Cuban missile crisis, McNamara ordered that, basically, combination locks be put on all the Minuteman missiles and their nuclear warheads. What Bruce told McNamara was that yes, the locks were put on, but the code was set to all zeroes - eight zeroes - and he said: "On the bulletin board in my launch silo, there was a piece of paper that said: 'Be sure to set all your dials to zero.'" So they put the locks on as McNamara had ordered, but they didn't make them secure. We've got to really watch for things like that.

SS: The growing online economy also depends on strong encryption. Is our increasingly digitised financial system vulnerable to cyber-terrorism?

MH: It's vulnerable to theft as well as to cyber-terrorism, and we've certainly seen a lot of money siphoned off, and it could be a lot more. Someone told me, a couple of years ago, that my invention, public key cryptography, protected trillions of dollars a day in transactions, and I said: "You must be wrong, you must mean trillions of dollars a year." And he said: "No, trillions of dollars a day," and I looked into it and he was right. Foreign exchange transactions alone are about $4 trillion a day.
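
For readers who want to see the mechanism behind the public key cryptography discussed in this exchange, here is a toy version of the Diffie-Hellman key agreement Hellman co-invented, with deliberately tiny, insecure numbers; real deployments use moduli of 2048 bits or more, or elliptic curves.

```python
# Toy Diffie-Hellman key exchange (insecure parameters, for illustration only).
p, g = 23, 5                       # public modulus and generator

a = 6                              # Alice's secret exponent
b = 15                             # Bob's secret exponent
A = pow(g, a, p)                   # Alice publishes A = g^a mod p
B = pow(g, b, p)                   # Bob publishes B = g^b mod p

shared_alice = pow(B, a, p)        # (g^b)^a mod p
shared_bob = pow(A, b, p)          # (g^a)^b mod p
assert shared_alice == shared_bob  # both now hold the same secret (here, 2)
print(shared_alice)
```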

SS: Now, to break encryption, computers would have to spend an almost infinite number of years, right? But with the appearance of quantum machines, which could solve that task within 20 minutes - is nothing ever going to be secure again? Is encryption obsolete?

MH: Encryption is certainly not obsolete now. You're referring to quantum computers, and there have been papers published showing that if a large enough quantum computer can be built, then all of the currently used public key encryption systems would be broken. But it would require a quantum computer with thousands of qubits, as they are called - that's quantum bits. Right now, we're at just a few. It might be a problem in the future, that is true, but right now, we're safe.
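
For a sense of scale behind "thousands of qubits": one published construction (Beauregard, 2003) runs Shor's algorithm on an n-bit modulus with roughly 2n + 3 logical qubits; the estimate below uses that figure and ignores error correction, which multiplies the physical-qubit count enormously.

```python
# Rough logical-qubit counts for factoring common RSA modulus sizes,
# assuming the ~2n + 3 figure from Beauregard's 2003 circuit.
for n in (1024, 2048, 4096):
    print(f"RSA-{n}: about {2 * n + 3} logical qubits")
```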

SS: What’s the weakest element of the encryption system?

MH: I think the better question is: what's the weakest link in security? And it's usually people. But another good example in encryption is that even a totally secure encryption system can be broken. This comes from a friend of mine - he was an undergraduate here at Stanford, Paul Kocher, who now runs a company called Cryptography Research Inc. One of the things he discovered was that smartcards, which are in widespread use in Europe and are now coming into greater use in the U.S. as well, have a microprocessor, an encryption engine, and a secret key built into them. What he found is that you can read the key by measuring the power drawn by the chip. When you go to a merchant, you put your smartcard into a reader, the reader applies power to the card, and the card does its operations. As it works through your secret key, it does one operation if a bit is a zero and another operation if that bit is a one, and one of those operations takes more power than the other. So as it goes through the bits one at a time - let's say there are a thousand bits - he measures a thousand power levels and is able to read off the key, even though the encryption system itself is secure from a mathematical point of view. You've got to watch for things like that.
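
A toy model of the attack Hellman describes (what Kocher called simple power analysis): if processing a 0 key bit draws less power than processing a 1 bit, the power trace spells out the key. The power values below are invented; real traces are noisy and need statistical techniques.

```python
# Simulated simple power analysis on a made-up 16-bit key.
import random

secret_key = [random.randint(0, 1) for _ in range(16)]

def power_trace(key_bits):
    # Cheaper operation for a 0 bit, more expensive one for a 1 bit.
    return [1.0 if bit == 0 else 1.8 for bit in key_bits]

trace = power_trace(secret_key)                       # what the attacker measures
recovered = [0 if sample < 1.4 else 1 for sample in trace]
assert recovered == secret_key
print("".join(map(str, recovered)))
```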

SS: Tell me something. You say people are the weakest link - do you think there's a way to take the human out of the equation?

MH: I think we need to be more careful about who we give power to, which comes back to some of the questions you asked earlier, and we need to do more auditing. The Office of Personnel Management, which has 20 million records on Americans with security clearances, me among them... I got a letter from the Office of Personnel Management about a week ago saying that my Social Security number and a lot of other personal information had been divulged. So here you have that information being hacked into; there wasn't adequate security surrounding it. We need more audits, more "red teams", so to speak, where people come in and try to break into things, and we shouldn't just trust the people we entrusted with the information in the first place.

SS: Professor, thank you very much for this wonderful insight into the world of cryptology. We've been talking to Professor Martin Hellman, encryption pioneer and co-inventor of public key cryptography, about privacy protection in the digital age. That's it for this edition of Sophie&Co, I will see you next time.
