Minutes for CFRG at IETF-95
minutes-95-cfrg-2

Meeting Minutes Crypto Forum (cfrg) RG
Date and time 2016-04-08 13:00
Title Minutes for CFRG at IETF-95
State Active
Last updated 2016-05-19

Crypto Forum Research Group
Friday 8 April 2016 morning session
Chairs: Alexey Melnikov (present) & Kenny Paterson


==========================================================================
CFRG status (Alexey)                                             - 15 mins
==========================================================================

[agenda; note well]

[document status, from slides]

XMSS: Extended Hash-Based Signatures is in RGLC; please review

other PAKE documents will not be finished until the requirements document
is finished.

Argon2 just accepted as a CFRG work item. It could be useful for openpgp.

W3C asked for help with the webcrypto algorithms document. No update since
the end of last year.

6090bis: now that we are done with most major elliptic curve work,
it's a good time to revive this one.


Crypto Review Team

was suggested by Stephen Farrell earlier this year.

Many people are asking to publish RFCs for national crypto; sometimes
it goes to an AD, or to CFRG, or directly to the ISE.  It would be good to
have a team of people to give consistent reviews and capture the state of
the art for algorithms in use.

[CFRG Crypto Review Panel slides]

The basic problem the chairs have is that documents only get reviews if
there is a big outcry or big excitement on the list, so it's hard to get
reviews on a reasonable timeline and to get consistent reviews.

Sort of like a mini-directorate as different IETF areas have.

Panel review not a requirement before publishing or accepting as
CFRG document.

[discussion slide]

Vasily Dolmatov: the original idea was to make public review for MTI
algorithms mandatory. This idea is quite good, since MTI algorithms
will be used for years; there should be extensive public review of them.
The proposed panel, by contrast, would review all crypto in the IETF; that has
nothing in common with the needs of the security area. We need public review,
not some kind of appointed panel of reviewers, and we need review of
MTI algorithms, not of all text related to crypto. A lot of crypto
specialists exist in the world, and only a very small subset of them
participate in the IETF, so a review panel could not provide objective
review. I would like to remind everyone that the IETF home page says
that the IETF does not try to take over protocols that are used
on the Internet but do not belong to the IETF.


yoav nir: I think I've wrapped my head around it.  Let's try an example.
Suppose someone comes in with the Kalyna cipher from the Ukrainian government;
what is the panel supposed to do with this cipher?  Say it's broken?
Alexey: find existing research, point out if there are any known
analyses of it, both positive and negative; not a request to do original
research or pick on spelling nits.
yoav: how is that different than sending it to the CFRG list and asking
if there is stuff?  Do they also look at document quality?
alexey: yes, document quality is a key thing to look at.  The document should
be readable and implementable.
stephen farrell: great idea; well-informed literature review, plus
whatever guidance we can offer.  Rather than depending on the kindness
of strangers, having a set of people willing to do this stuff is a
great idea and should help the IETF as well.
alexey: This doesn't prevent any extra review or discussion on the CFRG list.
It's fine for rough consensus on the list to disagree with the panel.
farrell: an example is the coda signature thing; a document in the IETF
wanted to be stds-track in MANET, and it had never had any review.  A year
ago I asked CFRG if anyone had seen any review, and no one had, and we
processed that information.  Having a team set up would be really good.
alexey: Very useful when we can't find volunteers on the CFRG list; having
a dedicated person (or group of people) would really help.
stephen: not necessarily a decision, but providing an informed opinion.
david mcgrew: great idea.  Someone asked "how is this different from the
current process?"  We can identify resources in advance, so there's not a
long wait next time to track someone down.  A previous commenter was worried
about transparency.  You could write something up that would address
transparency concerns; send the writeup to the list, not the chairs, etc.
alexey: mentioned in the slides that reviews are supposed to be public.
If there are concerns that a particular reviewer doesn't like a particular
author, that's a conflict of interest the chairs want to know about.
The process is supposed to be fair in this respect.
rich salz: also think it's a good idea.  Some really good candidates
for this group would be grad students with good current knowledge
of literature, and as a second effect might get new blood involved.
yaron sheffer: good idea.  Quick suggestion: most of the target audience
of the reviews does not understand the details of the research; suggest that
you adopt a more formal way of presenting the findings (more formal
than secdir's ready/not-ready/xxx), so that people can get a feel
after a few reviews for how one algorithm/choice compares to another.
Something like the level of research/review done (high/medium/low),
the level of best practices used, etc.
alexey: send mail to the chairs to remind us.
Not entirely sure what the format would look like, but something
uniform would probably help.

alexey: looking for volunteers. Self-nomination and nominating
other people are both great.  If you think someone would be
good and they're not involved with IETF/IRTF, send the chairs email;
we'll talk to people and try to convince them and whatnot.
yoav, channeling derek atkins: you can probably coordinate with
the secdir chair (Tero) for samples of the process and template format.



==========================================================================
XMSS: Extended Hash-Based Signatures (Andreas Huelsing, remotely)
[draft-irtf-cfrg-xmss-hash-based-signatures-03]                  - 15 mins
==========================================================================

[slides; no further questions/discussion]

==========================================================================
Secure state management for hash-based signatures (David McGrew) - 20 mins
==========================================================================

non-volatile cloning is unique to hash-based signatures, but
volatile cloning breaks all sorts of things, so we don't try to
deal with it here.

Introduce a reservation step into the signature model, in addition to
key generation, actual signing, and verification
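The reservation idea above can be sketched as follows. This is an
illustrative model, not code from the draft (class and method names here are
my own): a batch of one-time indices is reserved and persisted to
non-volatile storage *before* any of them is used to sign, so a crash or
restart can only waste reserved indices, never reuse one.

```python
# Illustrative sketch of the "reservation" step for stateful hash-based
# signatures: persist a high-water mark before handing out indices, so
# state loss costs a few leaves instead of a catastrophic index reuse.

class SignatureState:
    def __init__(self, total_leaves):
        self.total_leaves = total_leaves   # capacity of the key pair
        self.reserved_up_to = 0            # persisted high-water mark
        self.next_index = 0                # volatile cursor

    def persist(self):
        # Stand-in for a write to non-volatile storage (flash, disk).
        pass

    def reserve(self, batch):
        """Reserve a batch of one-time indices and persist the mark."""
        if self.reserved_up_to + batch > self.total_leaves:
            raise RuntimeError("key pair exhausted")
        self.reserved_up_to += batch
        self.persist()                     # must hit stable storage first

    def sign_index(self):
        """Hand out the next one-time index; reserve more if needed."""
        if self.next_index >= self.reserved_up_to:
            self.reserve(batch=16)         # amortize persistence cost
        idx = self.next_index
        self.next_index += 1
        return idx

state = SignatureState(total_leaves=1024)
indices = [state.sign_index() for _ in range(20)]
```

After a restart, a signer would resume from the persisted `reserved_up_to`,
skipping any reserved-but-unused indices; the batch size trades wasted
leaves against persistence overhead.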



==========================================================================
Argon2 (Dmitry Khovratovich, remotely)                           - 20 mins
==========================================================================

mcgrew: thanks for bringing your work to the Internet community.
If we use an H other than blake, would the blake compression scheme be
used internally?
dmitry: yes.  The original compression function is like one round of blake,
and we added multiplications there, so it's not one round of blake,
so it mostly has to stay.
Any 1024-bit-wide implementation we could adjust, and we could
also take several rounds per invocation of sha3, but that is more work.
mcgrew: I anticipate that people will only have sha2/sha3 and would like
to use that; someone wanting to reuse.  If the sha3 compression function
worked, that might be interesting.
dmitry: if we use sha3, it will probably be 3-4 times slower and thus
less good.  blake parallelizes well, and is otherwise very fast.
kaduk: existing deployments sharing password files across different
systems, which supports standardizing a password file format.
stephen farrell: echo the thanks for bringing this; concerned that every extra
parameter adds confusion, and parameters in the hash function itself
make things even worse.  In the RFC that people would reference, I
would like it to be one thing, and not a branch of things, since
that would be confusing for other people.  Cutting it down to one
thing would be more useful than one thing with N parameters.
alexey: make it easier for engineers so we don't have to pick.
farrell: at the same time, not having to explain the differences.
dmitry: support that; our intention was to remove as many parameters as
possible from users' control, leaving only time+memory.  We would
like to have a single function for password hashing, and argon2i seems
good for that.
farrell: please continue to resist tendency of people who ask for
more parameters, and maybe go further to get rid of argon2d
rich salz: echoing farrell, the last thing the IETF needs is more
crypto families.  Don't want to say "this works everywhere" when
it's not true, but a document saying "this is how to use it" is good.
If there's a document saying "this is a family", it doesn't really help.
Pick the one that is best for IETF uses, and document only that.

==========================================================================
Quantum Resistant Cryptography discussion                        - 30 mins
==========================================================================

alexey: David Mcgrew is playing devil's advocate today
mcgrew: hash-based sig is motivated by post-quantum crypto.
people are starting to take the possibility of quantum computers seriously,
based on the NSA change in stance, and the amount of money going into
building quantum computers.  Try to provide security not just now,
but into the future.
mcgrew: what are the use cases for PQC?  I present three.
(1) software/firmware signatures.  router in the field for 15-20 years;
when it boots, want to verify that it's running official software.
Need to build a bootloader, which is not crypto-agile, and need a signature
algorithm.  "Router that hasn't been rebooted since 1998 ... also hasn't
been updated since 1998"
(2) long-term confidentiality; resist store-now-break-later attacks
government uses require multiple decades of confidentiality
kyle rose: for firmware signatures or hardware root-of-trust, tpm, sgx, etc.,
I don't expect the hardware to last long enough that this would be an issue.
For a device that's 15 years old where you can't upgrade the firmware, you
either get performance problems from scale or the device is just woefully
out of date.  Confidentiality seems more reasonable, if there are secrets
valuable enough to keep around for 30-40 years; but it's not
what I'd most be concerned about.
mcgrew: infrastructure and IoT; you can't necessarily upgrade.
kyle rose: I guess I don't have experience with hardware that lasts that
long, so I don't have ideas about hardware deployed in 1995 that's still
running.
derek: yes, there are many devices that want firmware updates that will
last 10-15 years
tero: firmware signatures are not necessarily easy; talk to factory automation
folks, they never update *anything*.  If something went wrong, the whole
factory could stall.  Even if they do update, it's not automated, so the human
doing things could verify authenticity in some other way, out of band.
Also, the company that makes the firmware may have gone out of business, and
the devices could be broken and unfixable.
For confidentiality, in normal personal use there is usually some sort
of side-channel leakage; companies and such have more secrets and try
harder to keep things secret for 30 years, but...
Andreas: for some things you can't replace the hardware, like satellites.
derek: there are techniques to back out firmware updates if they fail
dkg: to add to firmware signatures, another reason it's not so important:
if we talk about a signature scheme used for 20 years into the future,
there is an implicit assumption that no intermediate firmware gets attacked.
The chances that over the course of 20 years there are no signed
firmware updates that have bugs in them are very small.  A dedicated
attacker could go back, install an old buggy firmware, and attack
those bugs.  One more vote for long-term confidentiality as the
long-term concern.
kevin falk: a couple of areas relate to signatures on large datasets;
we're facing the question of validating machine-learning algorithms.
Large data sets become the software algorithms through training, so the
integrity of data sets going into the future becomes more important.
Also compliance for cross-validation.
mcgrew: data-set integrity is interesting; please provide details on list
joe salowey: I tend to agree with others that signatures on firmware are
a concern, but not necessarily the major concern for me.
Certainly long-term confidentiality is a problem, in that data is
out there now that is encrypted, and that will be vulnerable.
Even for normal corporations, things 10-20 years in the future can still
be bad.  I also think that data-set integrity can be relevant.  There
may be a statute of limitations that would make that not so important,
but that's not many cases.
(3) long-term secure remote management
mcgrew: we kind of covered that with long-term confidentiality.
Suppose you build long-term (IoT) devices expected to last 15 years that
actually run for 25 years.  You log in remotely using, say, TLS, if there's
a web interface.  We mostly covered that already.
tim polk: these are great use cases; I'm all for working on PQC and
people deploying it where they need to.  I would caution that
wherever possible we'll still need crypto agility; the ultimate
black swan is not the quantum computer itself, but rather new
quantum algorithms; Shor and Grover are not the only smart people out
there.  Don't want people to say "I'm doing PQC, I'm done", so still
build in agility.
mcgrew: yes, agility is crucial
PHB: it's also worth considering our experience with computational
breaks of DES.  There are password crackers doing 350 billion cracks/second.
These devices have been out there for a while, but I'm pretty sure
that they're rarely used, only on a fraction of traffic.
When looking at defeating mass surveillance as opposed to targeted
surveillance, there are other countermeasures we can take that
still use standard crypto as opposed to PQC, like putting PFS in
more places.  The day the first 2048-bit QC appears will not be the day
RSA falls; that comes when QCs can do such factorings fast.
mcgrew: must do our homework before that happens.
PHB: I agree, but we should understand the nature of the threat and
all the ways that we can defeat it

mcgrew: [new slide] where is PQC on the list of priorities
It's important, we should work on it, but there are other priorities
as well that might come first.
stephen farrell: thanks for this slide; it's a fantastic thing to do work
here.  Need to be careful not to oversell the threat; we have a lot of
complex security solutions that don't get deployed, so we ourselves are
the threat.  Work here should be done, but don't oversell it.  We've had
a number of WGs where we debated what Phill brought up; will ECC
fall faster than DH, etc.  The big problem now is non-adoption
of security technologies because they're too complex, and adopting
PQC might exacerbate that.
dkg: we also suffer from the over-complexity of security systems that
are already deployed, and from having things that we can't turn off
or don't know how to, like logjam, ecdhe, etc.  Additional
complexity that does get deployed is also an issue.
mcgrew: agree with comments about robustness; things like you
describe are a bigger priority than QC
joe salowey: continuing in that vein, the ability to be agile
and to sunset bad things is important.  That falls into your
robustness space and could be expanded.  How do I know that
my implementation is doing the right thing and not doing
things it's not supposed to?  Something of an implementation
issue, but that's becoming more important.
farrell: as well as algorithms, there's a useful piece of work here.
If the properties of ciphers change, what is the impact
on Internet protocols?  Could do this independent of thinking
about quantum computers.  If your key sizes get really big, what
protocols break?  As the IETF works on the next generation of protocols,
it could think about where fragmentation is needed, etc.
russ housley: if you take the cross-product of this and the
previous slide, it's important to think about what we would need
to do if a QC were to become more likely/visible.
The previous slide talked about signing software; if we focus
on the QC-resistance of hash-based signatures, then we can
sign things in the face of RSA/etc. falling.  Right now if
RSA falls, we have no way to deploy software with confidence that
it's the right software being loaded.
mcgrew: need to be sure that agility updates are secure.
dkg: want to see a statement about how we can mix post-quantum
with existing crypto such that it is no worse than existing crypto,
should PQC turn out to be bunk.  It might be really easy, but I haven't
seen any academic analysis of that.
mcgrew: there have been a number of attempts at that.  Scott, I, and others
submitted a draft to ipsecme, "postquantum pre-shared keys for ike".
IKEv1 protects from a QC if using a large PSK; IKEv2 does not mix
the PSK into all derived keys.  This draft builds out PSKs on the side
for potential future use.  A lot of people say "that's great, but not
a free lunch".  Wanted a scheme with identity binding, so as not to make
IKEv2's other security worse.  So it's not a short draft.
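The mixing dkg asks about is usually done by feeding both secrets through a
single key-derivation step, so the result stays secure if *either* input
does.  A minimal illustrative sketch (HKDF-style extract-and-expand built
from HMAC-SHA-256 in the Python standard library; the function name, labels,
and inputs here are assumptions for illustration, not taken from the
ipsecme draft):

```python
import hmac
import hashlib

def combine_secrets(classical_secret: bytes, pq_psk: bytes,
                    context: bytes) -> bytes:
    """Derive a session key from both a classical (EC)DH secret and a
    post-quantum PSK.  Under standard PRF assumptions, if either input
    stays secret, so does the output.  Illustrative only."""
    # Extract: concentrate both secrets into a pseudorandom key.
    prk = hmac.new(b"hybrid-extract", classical_secret + pq_psk,
                   hashlib.sha256).digest()
    # Expand: bind the session context (identities, transcript hash).
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()

dh = bytes(32)          # stand-in for an ECDH shared secret
psk = b"community-psk"  # stand-in for the out-of-band pre-shared key
key = combine_secrets(dh, psk, b"ikev2-session-42")
```

An attacker with a quantum computer would learn `dh` but not `psk`; a flaw
in PSK distribution would leak `psk` but not `dh`; either way `key` remains
unpredictable, which is the "no worse than existing crypto" property.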
PHB: Following bob ross's comment, we need to distinguish PQC that
is a drop-in replacement for RSA, etc.; even if you came today with
a perfect algorithm, we'd have to wait 10 years to have full confidence
in it.  Want to distinguish that from post-quantum [xxx]s.
Say, with blockchains, where the work factor to fake one goes up a lot.
Also, Kerberos.  When I give out S/MIME certs, I also give out a Kerberos
ticket that connects up to my service.  I'm not going to use it
immediately, but I have it there if I need to bootstrap when RSA fails,
so I can deploy stuff that's interesting.
debbie cooley: Back to the intermediate-solution idea.  Currently RSA/ECC;
in the future we could mix in a PSK; after that, full post-quantum.
Pre-shared key distribution is the "non-free-lunch" part of that.  We hear
a lot about PSK for authentication, but that's only two-party, so the
key distribution part is very difficult.  Could allow a PSK for a small
community, not really for authentication, just for protection from a
quantum computer; that might reduce the work factor for mitigating the
threat.  There's an opportunity to design an easier-to-implement-and-deploy
solution that gives what you want but minimizes the distribution problem.
mcgrew: I agree with what you said, but want to hear more about the
opportunity you see.  A post-quantum key as distinct from the PSK in IKE?
deb: Just use the PSK for confidentiality, and use EC for authentication.
As long as the community is not too big, you still get some protection.
Authentication is real-time, so it should stand the longest;
we could get away with classical crypto longer if it's only used for
authentication.
russ: to follow on PHB: when sha256 was standardized and we wanted
to get people to go sha1-->sha256, I was security AD.  We guessed
5 years; it took way longer.  If we're going to get a quantum-safe
sigalg out there and deployed, we need to get it going now.  We can't
wait; otherwise it will be time to deploy and we'll have missed it.
Let's get on with it.
dkg: on deb cooley's proposal of a PSK shared among a community:
I'm not sure how much that really solves the distribution issue.
I think it actually raises a different set of distribution
difficulties.  You have to decide who is in the "in" group and the
"out" group.  For confidentiality, that changes over time.
PHB: re russ, not saying "don't do quantum sigs", but also have a plan B.
I'm gonna have plan C and D.
deb: the dirty little secret is that PSKs are "not always pairwise" today;
pairwise would be great, but it's not often done.
As for russ's point, we need to work on the real system, but the more
agile we can be, the more that helps in the meantime.
kyle rose: re dkg's point on mixing post-quantum in with classical keyex:
there may still be value in using a PSK.  Just based on my cursory
knowledge of the literature, keyex is where we have the least confidence
in existing post-quantum algorithms.  That's where we need the most
work at this point to gain greater confidence.
richard barnes: continuing on kyle's point, some of the combined deployments
with two algorithms together can be useful for those objectives.
It would be useful, from my perspective as an implementor, to have this
group vet that the combinations are being done in a sound way.
mcgrew: great comments.  Still have more slides...
lots of standards that might need PQC
what is easy/hard to do?
I think we can get hash-based sig specs in a final deployable form
within a year
What can CFRG do to build standards and get them adopted?
alexey: from the chairs: NIST has a medium-to-long-term view, and CFRG is
more immediate.  It's hard for the CFRG chairs to say "come back in three
years when this is studied more".  We should definitely coordinate, but
hopefully do something quicker.
derek: there are also some braid-based algorithms coming down the pipe
mcgrew: people come to CFRG and ask, "what should we do now?".
"Use super-sized symmetric and be agile about asymmetric" might be good.
andreas: Many PQC algorithms allow for different trade-offs. CFRG / IETF could
collect performance requirements
mcgrew: yes, and the size of keys and such.
stephen farrell: something CFRG/community could do is improve confidence
that we're not doing something already that's fragile.  Demonstrating
robust implementations would be good.  I'm afraid that we're going to
scare people into doing things that are worse than what they already have.
Once we get people to take this seriously, they want to go out and do
the latest thing.  We want to make sure that's the right thing.