
> Most people assumed that the NSA was engaged in wiretapping activity, but BULLRUN was a big shock. People generally assumed that information assurance was at least as important to the NSA as signals intelligence and pointed to things like DES (which the NSA strengthened) as evidence.

Why would people have assumed, after the failure to enforce Clipper chip usage, that the NSA would simply have given up on getting access to the plaintext of messages? That makes no sense whatsoever; NSA has been breaking codes (not merely wiretapping) for as long as NSA has been around, just as it has long been an open secret (if unacknowledged publicly) that NSA hacks into specific systems when it can.

It is true that COMSEC is a core function of NSA, but that is not inconsistent with NSA performing targeted SIGINT or general cryptanalysis. If their codebreakers can break into equipment used by the US government or major commercial interests, then so can foreign entities, and therefore NSA needs to try anyway, even if we're talking about "pure" information assurance.

Likewise, NSA's history has shown it willing to give people cryptosystems that are strong against everyone but the NSA. That's also in keeping with NSA's drive for COMSEC, since (from their POV) they won't be needlessly decrypting 'innocent' information anyway.

So I'm not saying you're wrong about what the academic community was assuming, but I am saying I can't imagine why the academic community was so naïve. Indeed, I thought the ever-present distrust of NSA was exactly why they looked for backdoors in things like Dual_EC_DRBG, and used "nothing up my sleeve" constants in algorithms designed in academia.



> "Why would people have assumed that after the failure to enforce Clipper chip usage, that NSA would have simply given up on getting access to the plaintext of messages?"

Consider this exchange on HN nearly three years ago:

  @moe: I had always considered the Clipper-Chip
        incident to hint at the tip of an iceberg.

        Do you really think that was an isolated one-time
        event?

  @kgo: The Clipper Chip was introduced in the open.
        They tried to push it through legislation. It's
        not like the NSA blackmailed Intel executives to
        include the capabilities secretly in their
        Pentium Processors without notifying their
        customers.

        Same with the new proposed legislation. But all
        that demonstrates is that the NSA has an
        interest in being able to (legally) monitor
        encrypted communications. Which everyone already
        knows.

        If someone had 'busted' the NSA trying to do
        something sneaky and/or covert and/or illegal,
        then you could argue that it's the tip of some
        iceberg of nefarious activity. But like I said
        this was all done out in the open.

        You might as well say that because we know the
        FBI wiretaps phones through legally obtained
        court orders, that's the tip of the iceberg that
        points to millions of illegal wiretaps. It's a
        bad inference.
(I would emphasize certain parts of kgo's response that I find particularly... accidentally prescient?... but the result would just be me emphasizing the whole damn thing...)

https://news.ycombinator.com/item?id=2018456

The gist here is that Clipper, being a public effort to subvert encryption, should not be seen as evidence that the NSA was interested in non-public subversion of security.

Was this naive? In retrospect? Obviously. At the time? Arguably.

Were people who reasonably should have been informed caught off guard by the revelations brought to us by Snowden? You betcha.

[edited for formatting]


Thanks for the quote, it's quite illuminating.

This is actually essentially what I was arguing, not just moe's point:

        But all
        that demonstrates is that the NSA has an
        interest in being able to (legally) monitor
        encrypted communications. Which everyone already
        knows.
I don't condone designing backdoors into products. Maybe I should research BULLRUN more, but the impression I had was that the intention was to subvert crypto in a way that only the NSA could use (either via escrow or constants-up-their-sleeve), perhaps to take advantage of exploits that happened to get shipped to the commercial sector, but not to design broken gear that anyone could subvert if they knew the secret handshake. Obviously even this is sneaky and shady as can be, but that goes with the territory (though @kgo would seem to agree that it's nefarious :-D).

But even assuming malice here, I'd have said even at the time, with the ongoing changes to communication patterns across the world, that there's simply no way you can assume NSA isn't trying to hack everything they can. When I was reading Schneier a decade or more ago they were the very definition of Mallory, and nothing has changed on that since. But I suppose you're right, that people at the time weren't seeing NSA the same way I was.


"NSA has been breaking codes (not merely wiretapping) since NSA has been around, just like it has long been an open secret (if unacknowledged publically) that NSA hacks into specific systems if they can."

BULLRUN is not a cryptanalysis program, and it is a stretch to call it a "hacking" program -- and certainly not one that targets specific systems. BULLRUN is the NSA program to deliberately introduce exploitable bugs in cryptographic protocols, software, and so forth.

"the history of NSA has shown them before to be willing to give people cryptosystems that are strong against everyone but the NSA."

First of all, the NSA does not frequently give cryptosystems to anyone outside of the US government. That being said, there are two prominent counterexamples to your statement: DES and DSA.

"I thought the ever-present distrust of NSA was exactly why they looked for backdoors in things like Dual_EC_DRBG, and used "nothing up my sleeve" constants in algorithms designed in academia."

It is worth noting that even when the Dual_EC_DRBG backdoor was discovered, there were people who were skeptical about it being deliberate. Even Bruce Schneier expressed some skepticism about it being an NSA backdoor:

https://www.schneier.com/blog/archives/2007/11/the_strange_s...

> I don't understand why the NSA was so insistent about including Dual_EC_DRBG in the standard. It makes no sense as a trap door: It's public, and rather obvious.
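For what it's worth, the trapdoor structure critics pointed to is easy to demonstrate in miniature. The sketch below is a toy, not the real thing: actual Dual_EC_DRBG is specified over NIST P-256 with truncated outputs, while this uses a 28-point textbook curve over GF(23) and made-up parameters, purely to show how knowing the hidden relation between the two published points lets an observer predict future output:

```python
# Toy Dual_EC_DRBG-style trapdoor on E: y^2 = x^3 + x + 1 over GF(23)
# (a 28-point textbook curve; Q = (0, 1) generates the group).
# All parameters here are illustrative, not the real NIST values.
p, a, b = 23, 1, 1
O = None  # point at infinity

def inv(x):
    return pow(x, p - 2, p)  # modular inverse (p is prime)

def add(P1, P2):
    """Elliptic-curve point addition in affine coordinates."""
    if P1 is O: return P2
    if P2 is O: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return O
    if P1 == P2:
        lam = (3 * x1 * x1 + a) * inv(2 * y1) % p
    else:
        lam = (y2 - y1) * inv(x2 - x1) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, P1):
    """Scalar multiplication by double-and-add."""
    R = O
    while k:
        if k & 1:
            R = add(R, P1)
        P1 = add(P1, P1)
        k >>= 1
    return R

Q = (0, 1)
e = 5              # the designer's secret
P = mul(e, Q)      # published alongside Q; the relation P = e*Q stays hidden

def drbg_step(s):
    """One Dual_EC-style step: next state plus one output word."""
    s_next = mul(s, P)[0]   # s_{i+1} = x(s_i * P)
    r = mul(s, Q)[0]        # output r_i = x(s_i * Q)
    return s_next, r

s0 = 3
s1, r0 = drbg_step(s0)
s2, r1 = drbg_step(s1)      # honest next output

# Attacker sees only r0, but knows e: lift r0 back onto the curve,
# multiply by e to get (s0*Q)*e = s0*P, whose x-coordinate is the
# next state -- and from there every future output follows.
y = pow((r0**3 + a * r0 + b) % p, (p + 1) // 4, p)  # sqrt mod p (p = 3 mod 4)
R = (r0, y)
s1_guess = mul(e, R)[0]
r1_guess = mul(s1_guess, Q)[0]
assert r1_guess == r1       # attacker predicted the next output
```

The honest generator and the attacker share no state; the attacker recovers it from a single output word only because P was constructed as e*Q. Without knowledge of e, relating P to Q is an elliptic-curve discrete-log problem, which is why only the party who chose the constants gets the shortcut.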

As for nothing-up-my-sleeve constants, that goes beyond just showing that there is no back door. Suppose you were analyzing a cryptosystem and it required some particular constant. What about that choice of constant makes the system secure? If a different constant were used, would the system still be secure? Even if you never gave any thought to backdoors, nothing-up-my-sleeve numbers would matter -- cryptographers like to know the reasoning behind design decisions, and it is much easier to explain why you picked 1234567890 as a constant than 43895762356157265.
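To make that concrete, one common way a designer demonstrates nothing-up-my-sleeve is to derive the constant from a public, reproducible recipe, such as hashing a descriptive label, so anyone can rerun the derivation. The label below is made up for illustration:

```python
import hashlib

# A "nothing up my sleeve" constant: derived from a public recipe
# rather than presented as an unexplained literal.
label = b"example-constant-v1"  # hypothetical label, chosen for illustration
nums_constant = int.from_bytes(hashlib.sha256(label).digest(), "big")

# Anyone can rerun this and get the same 256-bit value; an arbitrary
# literal like 43895762356157265 offers no such check.
```

(Even this only narrows the designer's freedom: with enough candidate labels one could still grind for a favorable output, which is why fully rigid choices such as digits of pi or small integer sequences are preferred when possible.)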


You'll note that Schneier's argument against NSA being the source of the DRBG backdoor was that it was too obvious (i.e. the work of clowns, not the work of the most advanced cryptography outfit in the world), not that NSA wouldn't engage in that kind of activity.

He also mentioned how it was NSA (and NSA specifically) who were 'so insistent' on including that 'obviously' broken algorithm in the standard. I mean, that question kind of answers itself (even at the time!) as long as you're not presupposing that NSA turned pure white in 1995.



