On Tue, Nov 1, 2011 at 4:40 PM, Marsh Ray marsh@extendedsubset.com wrote:
Definition of attack: something that breaks a security claim. Microsoft breaks a claim the GCM designers never made; Joux very cleverly breaks a part of the standard that their proof doesn't cover. The designers also didn't claim anything near the real security (http://cr.yp.to/antiforgery/securitywcs-20050227.pdf is a tighter proof). True, NIST should put the claims front and center in the document, especially when they have such a nice form. But it is the responsibility of implementers of cryptography to exhaustively document what their security claims are. Hiding behind NIST won't save your data if you do something stupid.
NIST is standardizing cryptographic algorithms for government use. They make tradeoffs in their algorithms the same as any other user: speed vs. security. Blindly implementing standards is a way to fall into a trap. DES is a standard; Blowfish isn't as much of one. Which would you rather use?
Furthermore, NIST didn't write the draft that Ferguson's comments were about. As the response indicates, GCM can be secure if you make the right parameter choices. It's not substantially different from RSA in that regard. Now, why the options, and why q = 2^128 instead of some prime? Because NIST wanted to support high-speed encryption of fiber-optic links in hardware, and Galois fields are easy in hardware. This is the story I get from the comments on the standard. If you are working in software, I would suggest something else.
The standard is complicated, but the real problem is picking the IV. I have no idea why NIST decided to use 96-bit IVs but fix the first 32 bits, so you are left with a 64-bit counter, then support longer IVs through hashing, and then ultimately have to backtrack and support only 64-bit counters as nonces. But GCM itself is simple: it is polynomial evaluation, followed by encryption of the resulting value. The NSA has had these incidents before: ECMQV used to be in Suite B, before unknown key-share attacks were published against it. NIST P-256 was fast on 32-bit machines, but put the reduction right in the middle of a word on 64-bit machines. Etc., etc. I'm sure there are many we never hear about.
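To make the "polynomial evaluation" point concrete, here is a minimal Python sketch of the Horner-style evaluation at the core of GHASH. The gf_mul helper multiplies in GF(2^128) with GCM's reduction polynomial x^128 + x^7 + x^2 + x + 1; note that real GHASH uses a reflected bit ordering for field elements, so this shows the structure rather than matching the spec byte for byte:

    def gf_mul(x, y):
        """Multiply two elements of GF(2^128); reduction polynomial
        x^128 + x^7 + x^2 + x + 1 (the one GCM uses)."""
        R = (1 << 128) | 0x87
        acc = 0
        for i in range(128):
            if (y >> i) & 1:
                acc ^= x          # add x * t^i when bit i of y is set
            x <<= 1
            if x >> 128:
                x ^= R            # reduce when x reaches degree 128
        return acc

    def poly_eval(h, blocks):
        """GHASH-style authentication: Horner evaluation, at the point h,
        of the polynomial whose coefficients are the message blocks
        (each block a 128-bit integer). GCM then encrypts this value
        to produce the tag."""
        acc = 0
        for b in blocks:
            acc = gf_mul(acc ^ b, h)
        return acc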
More fundamentally, implementing a system such as Tor requires making choices about how standards are used and fit together. It is the rare (and wonderful!) standard that has no options. Making those choices requires knowledge of the security offered by the standard; it is never going to be possible to rely on guidance alone. Implementers who implement AES with giant tables are in for a surprise when they run on cloud machines. Hashing a password is pointless when that hash is a fast hash function: checking all 10-character passwords takes about a week on a nice graphics card. Understanding of cryptography is required even if standards exist.
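On the password point, the fix is a deliberately slow, salted KDF rather than a bare hash. A minimal sketch with Python's standard library (the iteration count here is an arbitrary example, not a recommendation):

    import hashlib, hmac, os

    def hash_password(password, salt=None, iterations=100_000):
        # PBKDF2 makes the attacker pay the iteration cost per guess;
        # a single bare hash call costs a GPU almost nothing.
        salt = salt if salt is not None else os.urandom(16)
        dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
        return salt, iterations, dk

    def check_password(password, salt, iterations, expected):
        dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
        return hmac.compare_digest(dk, expected)  # constant-time comparison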
Nonces are required for protection against replay attacks. If nonces are unique, GCM is secure and pretty fast. Poly1305 is faster. HMAC does not have as high a proven security margin, and when a hash function breaks, so does HMAC. This makes a good case for GCM or Poly1305 in Tor. Right now we have nothing: cells are encrypted with AES in counter mode only.
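To illustrate the nonce discipline, here is a minimal sketch using the pyca/cryptography package (my choice of library for illustration; Tor itself is C). The point is that the nonce comes from a per-key counter the sender never rewinds or reuses:

    import struct
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    class GcmSender:
        """Encrypts with AES-GCM using a never-repeating counter nonce."""
        def __init__(self, key):
            self._aead = AESGCM(key)
            self._counter = 0  # must be unique per key: never rewind it

        def seal(self, plaintext, aad=b""):
            # 96-bit nonce: 32 fixed zero bits plus a 64-bit message
            # counter, mirroring the fixed-field-plus-counter layout
            # discussed above.
            nonce = struct.pack(">IQ", 0, self._counter)
            self._counter += 1
            return nonce, self._aead.encrypt(nonce, plaintext, aad)

    key = AESGCM.generate_key(bit_length=128)
    sender = GcmSender(key)
    nonce, ct = sender.seal(b"relay cell payload")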
In order to do something constructive, I might go through Torspec and look at how things are used and what we rely on them for. This might give us a better idea of what the requirements are.
You can indeed. Or any stream cipher, really (provided it doesn't do very bad things).
Sincerely, Watson Ladd