When to keep a secret

I was listening to an interview with Bruce Schneier at Columbia Law School on the NSA and all that (audio here). It is well worth listening to; I think that Snowden’s whistleblowing is worth it just for all the talks and interviews Schneier has given as a result.
Schneier pointed to another positive consequence of the Snowden revelations: the NSA and its counterparts now know that whatever they do, there is a chance it will come out in the open. The implicit assumption that no one will ever find out what you’re doing makes people careless and makes them cross boundaries they wouldn’t cross otherwise.
It reminded me of something journalist and historian Timothy Garton Ash wrote following the WikiLeaks cables a few years ago. He said that anyone wanting to keep information secret should be able to withstand the following test: if this piece of information became public, could you credibly explain why it should not have become public?
Indeed, few would have batted an eyelid if it turned out that the NSA only read the email of Al-Qaeda members and listened to the phone calls of Kim Jong-un.


How the NSA cheated cryptography

Of all the revelations made by Edward Snowden, I find the recent one about Dual_EC_DRBG definitely the most intriguing and possibly the most shocking – even if it wasn’t really news.
It intrigues me because it is about elliptic curves. I love elliptic curves. I studied them quite extensively when I worked as a mathematician and although I don’t use them anymore, I still feel a fondness for them.
But more importantly, it intrigues me because initially I didn’t realise what had really happened – and judging from comments and articles I’ve seen, I wasn’t the only one.
The NSA didn’t weaken a crypto standard. Rather, it put a backdoor inside the standard. There’s an important difference. As a consequence, if you use Dual_EC_DRBG, you’re still well-protected if the adversary you’re defending against isn’t the NSA. But if it is, you’re pretty much stuffed.
Dual_EC_DRBG is a pseudorandom number generator (or deterministic random bit generator; hence the name). It is one of four of its kind that were defined in the 2006 NIST standard SP 800-90A (PDF). The standard was written with the help of some people at the NSA. As we now know*, the NSA effectively wrote the standard.

Well, this is awkward.

Randomness is an essential part of any crypto system. It is also where many crypto systems have weaknesses, so if you’re implementing cryptography, it makes sense to use a standard provided by a reputable organisation like NIST.
What pseudorandom number generators do is turn a small ‘seed’ of truly random data into a continuous stream of numbers that look random, which lets you draw as many unpredictable numbers as you need. Entropy is usually defined as a measure of randomness, but here (and possibly in general) it is best seen as a measure of an adversary’s surprise: high entropy means the adversary knows very little about the random numbers the system generates.
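To make the idea concrete, here is a minimal sketch of a generic hash-based generator. This is my own illustration (it is not Dual_EC_DRBG, nor any of the standardised constructions): a short seed is stretched into an arbitrarily long stream by hashing the seed together with a counter.

```python
import hashlib

def prng_stream(seed: bytes, n_blocks: int):
    """Stretch a short seed into n_blocks pseudorandom 32-byte blocks
    by hashing seed || counter with SHA-256 (a generic sketch, not a
    standardised DRBG)."""
    for counter in range(n_blocks):
        yield hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()

# Whoever knows the seed can reproduce the whole stream; to anyone
# else the blocks should be indistinguishable from random bytes.
blocks = list(prng_stream(b"16 random bytes!", 3))
```

Note that the stream can never carry more entropy than the seed itself; what the generator buys you is length, not extra randomness.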
Dual_EC_DRBG uses a given elliptic curve. Elliptic curves come with an extra structure, called a group structure. For the purpose of this post, it suffices to say that this allows you to walk along the curve but, rather than simply following the shape of the curve, your walk makes you seemingly go all over the place. It is this all-over-the-placeness which makes them useful to generate pseudorandom numbers (and for cryptography in general).
[Image: the group structure on an elliptic curve. Don’t worry if it doesn’t make sense.]
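The ‘walk’ can be made concrete with a toy curve. The parameters below are a textbook example of my own choosing, nothing like the 256-bit curve in the standard: the group law ‘adds’ two points by drawing a line through them and reflecting the third intersection, and repeatedly adding a point to itself makes the successive multiples jump all over the place.

```python
# A tiny curve for illustration only: y^2 = x^3 + 2x + 2 over the
# integers mod 17. (The curve in the standard works mod a 256-bit prime.)
P_MOD, A, B = 17, 2, 2

def ec_add(p1, p2):
    """Add two points on the curve; None is the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                               # P + (-P) = infinity
    if p1 == p2:                                  # tangent line (doubling)
        slope = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                                         # chord through two points
        slope = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (slope * slope - x1 - x2) % P_MOD
    return (x3, (slope * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, pt):
    """Walk k steps: pt + pt + ... + pt (k times)."""
    result = None
    for _ in range(k):
        result = ec_add(result, pt)
    return result

# Walking step by step from the point (5, 1) jumps all over the curve:
walk = [ec_mul(k, (5, 1)) for k in range(1, 6)]
print(walk)   # [(5, 1), (6, 3), (10, 6), (3, 1), (9, 16)]
```

The group generated by (5, 1) here has only 19 elements; the all-over-the-placeness is the same phenomenon, just on a vastly smaller scale.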

Apart from the curve, the algorithm also uses two given points P and Q on this curve. Like the curve, they are given in an appendix to the NIST standard.
Now there exists a relationship between these points P and Q: if you start at Q and you continue walking, then, for some large number e, after e steps you end up at P. This is not a secret: it is a simple property of the group structure of elliptic curves. But if the curve is large (which the one used in this standard is), it will take you a long time to compute e. Think in terms of millions of years. So no one knows e and no one can know e.
No one? Well, if you simply choose a point P on the curve and choose a (very large) number e, you can use that to compute a point Q. If you then give out these P and Q to someone, they will still need a million years to compute e. But you know it.
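On the same toy curve (y² = x³ + 2x + 2 mod 17, whose group has exactly 19 points), the trapdoor construction looks like this. The numbers are made up for illustration; on the curve in the standard the brute-force loop below would run for millions of years.

```python
# Toy curve as before: y^2 = x^3 + 2x + 2 mod 17; its group has 19 points.
P_MOD, A, B = 17, 2, 2

def ec_add(p1, p2):
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None
    if p1 == p2:
        slope = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        slope = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (slope * slope - x1 - x2) % P_MOD
    return (x3, (slope * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, pt):
    result = None
    for _ in range(k):
        result = ec_add(result, pt)
    return result

Ppt = (5, 1)                        # the point P we will publish
e = 7                               # our secret number of steps
Qpt = ec_mul(pow(e, -1, 19), Ppt)   # publish Q so that e steps from Q reach P

assert ec_mul(e, Qpt) == Ppt        # we, the designers, know the shortcut

# Anyone else has to walk from Q until they happen to hit P -- i.e. solve
# a discrete logarithm. Trivial among 19 points, hopeless among ~2^256.
steps, pt = 1, Qpt
while pt != Ppt:
    steps, pt = steps + 1, ec_add(pt, Qpt)
print(steps)   # 7 -- but only because the curve is tiny
```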
And that’s exactly what the NSA did. They provided the P and the Q in the standard. They, as has become clear from Snowden’s documents, know e. We don’t. And we can’t even compute it.
Does this matter?
It does. In 2007, Dan Shumow and Niels Ferguson, two researchers then working for Microsoft, showed (pdf) that, if you know e, cracking the pseudorandom number generation becomes a little easier. A little easier? Actually, it becomes almost child’s play. They effectively showed that, to the NSA, your high-entropy pseudorandom number generator generates output with very few surprises.
In practice this means that anyone who knows e can read almost all TLS traffic (which includes HTTPS) that is encrypted using an implementation based on Dual_EC_DRBG.
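The gist of the Shumow-Ferguson observation can be shown on the same toy curve. This is heavily simplified (no output truncation, tiny numbers, parameters of my own choosing), so take it only as an illustration of why knowing e breaks everything: each output is the x-coordinate of s·Q for the internal state s, so anyone who knows e can lift the output back onto the curve, multiply by e to get s·(e·Q) = s·P, and read off the next state.

```python
# Toy curve as before: y^2 = x^3 + 2x + 2 mod 17 (order-19 group).
P_MOD, A, B = 17, 2, 2

def ec_add(p1, p2):
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None
    if p1 == p2:
        slope = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        slope = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (slope * slope - x1 - x2) % P_MOD
    return (x3, (slope * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, pt):
    result = None
    for _ in range(k):
        result = ec_add(result, pt)
    return result

Qpt = (5, 1)
e = 7                      # the trapdoor: P lies e "steps" past Q
Ppt = ec_mul(e, Qpt)

def dual_ec_step(state):
    """One heavily simplified Dual_EC step: update the state with P,
    produce output with Q. (The real standard truncates the output
    and works on a 256-bit curve; this toy does neither.)"""
    state = ec_mul(state, Ppt)[0]        # new state s = x(old_state * P)
    return state, ec_mul(state, Qpt)[0]  # output   r = x(s * Q)

def attacker_next_state(output):
    """Knowing e, lift the output back to a point R = +/- s*Q on the
    curve; then e*R = +/- s*P, whose x-coordinate is the next state.
    (Either square root works, since x(e*R) = x(-e*R).)"""
    y = next(y for y in range(P_MOD)
             if (y * y - (output ** 3 + A * output + B)) % P_MOD == 0)
    return ec_mul(e, (output, y))[0]

state, out1 = dual_ec_step(2)            # the victim produces one output...
state, out2 = dual_ec_step(state)        # ...and then another
predicted = ec_mul(attacker_next_state(out1), Qpt)[0]
print(predicted == out2)                 # True: out1 alone betrayed out2
```

From one untruncated output, the holder of e recovers the full internal state and can predict every subsequent output; the truncation in the real standard merely leaves a handful of candidate states to try.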
After the likely backdoor was found in 2007, NIST actually updated the standard. It now describes a method to choose ‘good’ P and Q yourself (for you can’t just choose arbitrary points). But it still says that if you want your crypto to be FIPS 140-certified, you need to use the points they’ve chosen for you. “Trust us,” you read between the lines, “we know they work.”
So why would anyone trust them, especially after it was shown that someone could likely have inserted a backdoor? That is beyond me. But the standard is used in quite a few implementations.
What makes this even stranger is that, as Matthew Green pointed out in an excellent blog post, the algorithm is pretty flawed in a number of other ways too. No wonder the crypto world suddenly finds itself in an existential crisis.
Now it would have been bad if the NSA had somehow managed to make us all use weaker cryptography. Still, the playing field would have remained level, albeit with lower security for everyone.
It would have been a little worse if the NSA knew of a secret algorithm that enabled them to break cryptography. (It is possible that one of the future revelations Bruce Schneier hinted at will show they can do that for certain crypto standards.) Still, ultimately that is just beating your opponent by being more clever.
But what the NSA did was plain cheating. The crypto remains secure against any of us. But they can crack it. Because they wrote it. And they put a backdoor into it. And even though we know (and have known for some time) there was such a backdoor, it still doesn’t help us.
Cheating with the privacy of billions of Internet users is nothing but very, very wrong.
(Apart from the linked blog post by Matthew Green, there is this Wired piece on Dual_EC_DRBG that the aforementioned Bruce Schneier wrote back in 2007, when Edward Snowden was but a junior employee at the CIA working in Switzerland. Like just about anything Schneier has written on cryptography, it is well worth a read.)
* The NSA hasn’t owned up and it is unlikely they ever will. While no one doubts that the NSA planted a backdoor into Dual_EC_DRBG, we can’t prove it. Throughout the blog post, I have assumed we are sure. It made for easier reading. And, frankly, we are quite sure.

There is no 'I know what I am doing' trump card in security

Ever since Edward Snowden revealed details of the NSA’s PRISM program, I had been wanting to write something about it.
While most people in the security community are rather unhappy, if not outraged, about PRISM, a lot of focus has been on the fact that the NSA is apparently evil.
While this may be true, I don’t think this is relevant. Of course, no one wants to be spied upon by an organisation they consider evil. But what I think is relevant here is that even if the people at the NSA are good and well-meaning, mass-surveillance is still very wrong. (As Robert Graham put it: “NSA is wrong, not evil”.)
So, inspired by the Black Hat keynote given by the NSA’s director, Gen. Keith Alexander, I wrote a blog post about it:

We have all been there. To make progress on the product you’re working on, you need to get some extra permission: a port needs to be opened, or perhaps some files need to be uploaded onto a protected system. You ask the IT department for this permission and, much to your frustration, they won’t give it to you until you’ve explained in full detail why you need it, and even then they will have to check with their management.
“But I know what I’m doing. And my manager says it is fine.”

Read the rest of the post at Virus Bulletin.