Last Call Review of draft-ietf-jose-json-web-algorithms-31
I have reviewed this document as part of the security directorate's ongoing effort to review all IETF documents being processed by the IESG. These comments were written primarily for the benefit of the security area directors. Document editors and WG chairs should treat these comments just like any other last call comments.
This document sets the initial IANA registry values for the labels to be used to specify choices of cryptographic algorithms in the context of the JSON Web Encryption, JSON Web Signature, and JSON Web Key documents (parallel I-Ds). Some aspects of how the algorithms are used are specified here; other aspects are specified by reference to other documents.
The issues I found with this document (all of which are minor) are as follows:
Section 3.4, line 4, says Elliptic Curve operations are generally faster to execute (for equivalent security) than RSA operations. While that is true for private key operations (and even more dramatically so for key generation), it is generally not true for public key operations. This is a nit in the text, since these trade-offs are well understood.
Section 4.5 Direct Encryption: It might be too late to change existing implementations, but when using pre-negotiated keys it is generally a good idea to include a key identifier in the header. This removes ambiguity when there are multiple pre-negotiated keys (perhaps because they are in the process of being updated and are not atomically updated on both ends of the connection).
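To illustrate the point, here is a minimal sketch of the disambiguation I have in mind, using the existing JOSE "kid" (Key ID) header parameter to select among several pre-negotiated keys (the function name and key-store shape are hypothetical):

```python
def select_key(header: dict, keys: dict) -> bytes:
    """Select the pre-negotiated key named by the "kid" header parameter.

    Without a key identifier, a receiver holding several pre-negotiated
    keys (e.g. during a key rollover) cannot tell which one the sender
    used and must resort to trial decryption.
    """
    try:
        return keys[header["kid"]]
    except KeyError:
        raise KeyError("no pre-negotiated key matches the kid in the header")
```

This is only a sketch; the essential point is that the identifier travels in the (integrity-protected) header rather than being implied by context.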
Section 5.1: I don't believe the pairings of AES128/HMAC-SHA256, AES192/HMAC-SHA384, and AES256/HMAC-SHA512 are "natural" in the sense of providing equivalent cryptographic strength. Without the HMAC, they would be, but HMAC-SHA256 is generally believed to have 256 bits of cryptographic strength, making it suitable for pairing with any of the three AES key sizes. The choices of the longer SHA-2 variants are conservative, however, and so do no harm if these pairings are already in widespread use.
Section 5.2: Similarly, it is not appropriate to have the length of the Message Authentication Code (MAC) reflect the key length, since the MAC length does not affect cryptographic strength but rather the strength against on-line MAC guessing attacks. For this purpose, 128 bits is generally considered adequate for all key sizes, and many implementations truncate this to 96 bits or even 64. Again, the current specification is conservative (if slightly wasteful of bandwidth), and so does no harm if this is already in widespread use.
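To make the truncation point concrete, here is a small sketch (function names are mine, not the draft's) computing a full HMAC-SHA-512 tag and truncating it to 128 bits, with a constant-time comparison for verification:

```python
import hmac
import hashlib

def truncated_hmac(key: bytes, msg: bytes, tag_len_octets: int = 16) -> bytes:
    # Compute the full HMAC-SHA-512 output, then keep only the leading
    # octets. 16 octets (128 bits) is generally adequate protection
    # against on-line MAC guessing regardless of the AES key size.
    return hmac.new(key, msg, hashlib.sha512).digest()[:tag_len_octets]

def verify_tag(key: bytes, msg: bytes, tag: bytes) -> bool:
    # Recompute at the received tag's length and compare in constant time.
    return hmac.compare_digest(truncated_hmac(key, msg, len(tag)), tag)
```

The security argument is that a forger must guess the truncated tag on-line, so 2^128 (or even 2^96) trials is out of reach independently of the underlying key length.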
Section 5.2.2.1 says "The number of octets in the input key K is the sum of MAC_KEY_LEN and ENC_KEY_LEN." I believe it would be better to say something like "MUST be the sum". The text goes on to say that the two keys must not overlap, but it is also important that an implementation not tolerate a gap between the two keys if a too-large key is provided.
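The check I have in mind can be sketched as follows (the function name is mine; the variable names follow the draft's MAC_KEY_LEN / ENC_KEY_LEN conventions):

```python
def split_input_key(K: bytes, mac_key_len: int, enc_key_len: int):
    """Split the composite input key K into (MAC key, encryption key).

    The length of K must be exactly MAC_KEY_LEN + ENC_KEY_LEN: an
    implementation should tolerate neither an overlap between the two
    sub-keys nor a gap left by an over-long input key.
    """
    if len(K) != mac_key_len + enc_key_len:
        raise ValueError(
            "input key K must be exactly MAC_KEY_LEN + ENC_KEY_LEN octets"
        )
    return K[:mac_key_len], K[mac_key_len:]
```

Rejecting over-long keys matters because silently ignoring trailing octets means two parties could "agree" on different effective keys without noticing.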
Section 5.2.2.2 says the authenticated decryption operation has four inputs. I believe it has a fifth: the IV. Alternatively, the IV could be considered to be prepended to the ciphertext (and hence implicitly included in 'E').
Section 5.3 specifies AES/GCM in much less detail than the description of AES/HMAC in Section 5.2. Is this because the referenced document [NIST.800-38D] includes all the needed details (like use of PKCS7 padding)?
Section 6.2.1.1: Because of the controversy over NSA allegedly planting backdoors in the "NIST curves" listed, there is a growing demand that additional curves be supported. You might want to specify how additional curves can be specified in-line and/or how to get additional curves added to the IANA registry.
Section 8.7: I believe the advice in this section is too strong. It is generally considered secure to encrypt multiple data sets with the same key so long as the IV is correctly chosen, and it is also generally considered secure to encrypt the same data set with multiple different keys. There are many scenarios in which this is nearly impossible to avoid. What is important when encrypting multiple data sets with the same key is that the IV be chosen appropriately – which is very challenging when using GCM, as noted in Section 8.4.
Section 8.8: I believe the advice in this section is too weak. It is generally bad practice to derive cryptographic keys from passwords, as passwords almost never have adequate entropy. Where possible, it would be better to have a strong (i.e. randomly chosen) cryptographic key associated with an entity and then use the password to acquire that cryptographic key. There are a number of means for doing that, including having the strong key encrypted with the password (or XORed with a key derived from it) stored someplace the entity can access it, having a server that will return the key based on a provided password, or using a strong password authentication protocol. Deriving the key from a password using PBES should be a last resort, and demanding a longer password to derive a 256-bit key is only fooling yourself... you are never likely to get more than 64 bits of entropy.
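The first alternative mentioned above (a randomly generated strong key stored XORed with a password-derived mask) can be sketched as follows; the function names, salt handling, and iteration count are illustrative assumptions, not part of the draft:

```python
import os
import hashlib

def wrap_key(strong_key: bytes, password: str, salt: bytes,
             iterations: int = 100_000) -> bytes:
    # Derive a mask from the password and XOR it with the randomly
    # generated strong key. The stored (wrapped) value reveals nothing
    # about the key without the password, and the key itself retains
    # its full entropy regardless of how weak the password is.
    mask = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                               iterations, dklen=len(strong_key))
    return bytes(a ^ b for a, b in zip(strong_key, mask))

# Unwrapping is the same XOR with the same password-derived mask.
unwrap_key = wrap_key
```

The point is that the password only gates access to the key; the key protecting the data is still a full-entropy random value, unlike a key derived directly from the password via PBES.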