
Minutes IETF116: ufmrg: Wed 00:30

Meeting Minutes: Usable Formal Methods Research Group (ufmrg) RG
Date and time: 2023-03-29 00:30
State: Active
Last updated: 2023-03-28


UFM at IETF-116

29/03/2023 9:30
Pacifico Yokohama North G314–G315
Chairs: Jonathan Hoyland, Stephen Farrell


  • Welcome and Agenda Bash (chairs, 10)
  • Formal Methods background (Jonathan Hoyland, 15)
  • Current work-in-progress: VDAF analysis (Chris Patton, 15)
  • Open discussion and review of (some) problem areas (all, 45)
  • Foundational End-to-End Verification of High-Speed Cryptography (Bas
    Spitters, 20, may shift to later in agenda)
  • Delegated Credentials analysis (Jonathan Hoyland, 15, if time
    permits, time did not)
  • Next steps/action (Chairs, 10)

Notes (by Chris Patton and Benjamin Beurdouche)

Thanks to the note takers!

Intro (Stephen)

  • UFMRG is currently a "proposed" RG; we will get bumped to a proper
    RG in a year or so if we are productive.
  • We welcome folks with expertise in formal methods as well as those
    who know nothing about this field.
  • Goals

    • Provide a forum for those in the IETF and beyond to discuss the
      use of formal methods
    • Explore strengths and limitations
    • Explore how to use formal methods to make IETF better
  • Word of caution

    • We don't get to tell the IETF what to do. Our goal is to
      demonstrate the benefit of formal methods to the IETF
    • Don't "parachute" into other WGs and try to dictate their
      approach
  • UFMRG Wiki

    • Chris Wood and others put up some links to exemplary analyses
      relevant to the IETF
    • This can be edited by anyone, please add info there
  • Open discussion for later

    • Sample problem
    • Scoping topics for RG
    • Why are formal methods not "usable"?
    • "You can't publish confirmatory results"
    • How do formal methods fit into RFC development
    • How to model privacy
    • How to structure the wiki
  • Sample problem

    • Looking for volunteers!

Formal Methods background — Jonathan Hoyland

  • Formal methods has two parts: "formal analysis" (Jonathan) and
    "formal verification" (Bas, later)
  • Goal of formal analysis is to prove a protocol meets its goals
  • Symbolic vs Computational analysis

    • Symbolic analysis typically doesn't go into the details of the
      message formats; it looks at abstract terms with no internal
      representation and how they interact. The cryptography in that
      setting is typically considered perfect. The goal is to have a
      high-level look at protocols and make sure their design is
      sound.
    • Computational analysis usually provides a more fine-grained
      analysis with concrete security bounds. Usually the complexity
      is higher.
  • Mechanization of the proofs with tools

    • The huge benefit of mechanisation is replayability: we can
      replay results easily.
    • One inconvenience is the cost of learning the tooling; it also
      sometimes requires a lot of computational resources (hundreds
      of hours of computation and thousands of dollars). That is not
      always the case, however, and sometimes proofs can be replayed
      very efficiently on laptops. It depends on the tool and the
      model itself.
    • Needham-Schroeder protocol
      • Used as a learning tool to demonstrate how formal methods
        work
      • There is an attack: the identity of the recipient is not
        included in the messages, which leads to a message-forwarding
        attack. Demonstration of how this protocol is modelled in
        Tamarin and how the tool shows that secrecy of the shared
        secret doesn't hold.
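The attack mentioned above (Lowe's man-in-the-middle attack on Needham-Schroeder public-key) can be sketched outside any tool. This is a toy Python model, not Tamarin syntax; the `enc`/`dec` helpers are illustrative stand-ins for the symbolic "perfect cryptography" assumption, where a ciphertext is opaque to everyone but its intended recipient:

```python
# Toy symbolic model of Lowe's attack on the Needham-Schroeder
# public-key protocol. Cryptography is "perfect": a ciphertext is just
# (recipient, payload), and only the named recipient can open it.
def enc(to, payload):
    return (to, tuple(payload))

def dec(me, ciphertext):
    to, payload = ciphertext
    assert to == me, "only the intended recipient can decrypt"
    return payload

NA, NB = "Na", "Nb"  # the honest parties' nonces

# 1. A -> E: {Na, A}_pk(E)   (A genuinely starts a session with E)
m1 = enc("E", [NA, "A"])

# E opens it and replays the contents to B, impersonating A:
# 1'. E(A) -> B: {Na, A}_pk(B)
m1b = enc("B", dec("E", m1))

# 2. B -> "A": {Na, Nb}_pk(A) -- note B's own identity is NOT included
na, claimed_sender = dec("B", m1b)
m2 = enc("A", [na, NB])

# E forwards m2 unchanged. A sees her nonce Na, believes the reply
# came from E, and answers E with Nb:
_, nb_for_e = dec("A", m2)
m3 = enc("E", [nb_for_e])

# E now knows Nb and can complete the run with B.
nb_leaked = dec("E", m3)[0]
b_accepts = dec("B", enc("B", [nb_leaked]))[0] == NB

print(claimed_sender, b_accepts, nb_leaked == NB)  # A True True
```

The Needham-Schroeder-Lowe fix adds B's identity to message 2, so A would notice the reply names B rather than E; that missing identity check is exactly the kind of flaw Tamarin surfaces automatically.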


  • Chris Patton: asking why the proofs are so huge in general

    • Jonathan Hoyland gives the example of TLS 1.3, which is difficult
      to study in one piece because it has so many components. This is
      in contrast with Needham-Schroeder, whose issue nonetheless went
      undetected for 17 years.
  • Colin Perkins: What sort of problem can it find?

  • Jonathan: Quite a few. For example, Deadlock/Livelock,
    Authentication, Confidentiality, Integrity...
  • Colin: Presumably, this is one of the many tools for this? (There
    are many other tools.)
  • Jonathan: Yes. I previously stated it is not the most
    user-friendly, but it is one of the best
  • Doug: Can you speak about formal analysis vs model checking? We
    used model checking to prove some properties but didn’t get
    anything out of it.

    • Jonathan: A model checker will give you a proof, but it is not
      guided by a human. It should be able to prove the same things,
      but it’s unlikely to do so.
  • Christopher Inacio:

    • Jonathan: F-star can do both symbolic analysis as well as ...
    • Benjamin Beurdouche: Distinguish between proof assistants (Coq,
      Isabelle, ...) and automated analysis tools focused on protocols
      (ProVerif, Tamarin), which have a limited scope and are more
      automated with "fancy heuristics"
  • Jonathan: They are application specific, and contain features to
    cater for that

  • Christopher I.:
  • ?: How do we document a protocol spec so that formal analysis can
    be easily understood in context? What is the next step towards FM?

    • Jonathan: the next step is to put experts in a room where people
      having issues with modelling their protocol can meet an expert.
      Also provide a big pile of everything we know about formal
      methods. Also be able to influence tool development.
  • Pieter Kasselman: How to make this usable and scalable?

    • Jonathan: I agree, we need to do a lot on training materials
  • Richard Barnes: One [use]case I came across was an access control
    protocol. We can find vulnerabilities in these situations. This is
    a nice non-academic usage of it [presumably: formal methods].

    • Jonathan: How did you learn to use the methods?
    • Richard: Mainly the documentation.
  • Carsten Bormann: To really close the loop, we need to get the
    protocol designers on board. In addition, we need to get reviewers
    and implementers on board as well. One really important aspect of
    this is evolution. Protocols evolve, and proofs need to move along
    with them and be kept up-to-date. I have experience working with
    these tools and ...

    • Jonathan:

VDAF analysis — Chris Patton

  • Chris Patton: presenting pen and paper cryptographic proof of VDAF

    • VDAF: Really lightweight Multi-Party Computation draft

      • Developed as part of the IETF's PPM working group (privacy
        preserving telemetry)
    • VDAF: Distribute computation amongst a small set of
      non-colluding aggregation servers

      • Arithmetic secret sharing
      • Add up shares across measurements and send to the aggregation
        servers to compute aggregate shares
      • Aggregate shares are combined by a collector
      • Very efficient for specific algebraic use cases, but this is
        not general MPC
      • Privacy: as long as the aggregators are honest, the adversary
        cannot learn more about the data than what is visible in the
        aggregate shares.
      • Robustness: combining shares leads to valid values
    • DAP is a higher level protocol to operate VDAF over HTTP.

    • Approach: model maximal flexibility of the system by using
      abstract high-level functions which can be instantiated
      differently at will
    • Methodology: computational, pen and paper

      • The adversary runs a collection of corrupt clients
        interacting at will with aggregators. The adversary wins if,
        for some report, it gets an aggregator to accept an invalid
        share of that report and compute an incorrect value of the
        aggregate.
      • Indistinguishability game for privacy. The probability of
        winning the game should be negligible.
      • This work uses the Universal Composability approach, which
        allows composable formal bounds and different corruption
        models

        • Privacy: no compromise, find the tightest bound
        • Robustness: slightly relaxed bounds; breaking robustness
          means losing some data but is not catastrophic. We can
          adjust security parameters.
      • Using the UC framework led to finding some details that
        needed to be fed back into the specification

    • Proofs are not always checked. We hope to have a mechanized
      computational proof.

      • Experts are saying that mechanizing is complicated.
        Interestingly, it is not clear to everyone why.
      • DAP evolves rapidly; we might want to have a symbolic
        analysis as well
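The arithmetic secret sharing described above can be illustrated in a few lines. This is a hypothetical minimal sketch (the field choice and helper names are illustrative assumptions, not taken from the draft):

```python
import secrets

P = 2**61 - 1  # a Mersenne prime; measurements are elements of Z_P

def share(measurement, n=2):
    """Split a measurement into n additive shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((measurement - sum(shares)) % P)
    return shares

# Three clients each split a measurement between two aggregators.
measurements = [3, 5, 11]
server0, server1 = [], []
for m in measurements:
    s0, s1 = share(m)
    server0.append(s0)
    server1.append(s1)

# Each aggregator adds up its shares locally (aggregate shares) ...
agg0 = sum(server0) % P
agg1 = sum(server1) % P

# ... and the collector combines the aggregate shares.
total = (agg0 + agg1) % P
print(total)  # 19: the sum is recovered; no individual value ever was
```

Privacy rests on each share in isolation being uniform in Z_P, so one aggregator alone learns nothing; robustness (that shares encode a valid measurement at all) is what the VDAF machinery adds on top of this.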


  • Stephen: Can you clarify the difference between pen-and-paper and
    formalized proofs?

    • Formalized proofs: written so that a computer can verify and
      check them.
    • Pen and paper: written in LaTeX, checked by fellow reviewers (or
      not checked)
  • Stephen asks who knows the difference between pen-and-paper and
    mechanized computational proofs

    • Chris Patton explains that a proof is machine-checked if a
      computer can verify each step
  • Wes Hardaker: the human is the problem because humans miss details

    • Chris: it is an ongoing process and models evolve naturally
  • Colin Perkins: is the issue with mechanization that tools don't
    exist?

    • Chris: there is a cultural bias towards learning specific tools
  • Arnaud Taddei: is there a list of tools with examples and
    tutorials?

    • Chris Patton: for formal methods or pen and paper?
    • AT: there is a trend towards simulation, but no standard way to
      create a model
    • Chris: no one has the same set of tools; we potentially have the
      ability in the IETF to restrict the scope of the tool
    • Chris: let's do the list!

Foundational End-to-End verification of High-Speed Cryptography — Bas Spitters

  • HACSpec: specification language for cryptography (presented at Real
    World Crypto 2023)

    • Problem statement: many different tools and languages to
      describe cryptographic implementations; can we unify this?
    • The languages are very different, which prevents people from
      reading and reviewing without language-specific knowledge
    • The approach: use one single formal language and use tools to
      transpile to verification languages (EasyCrypt and FStar)

      • We take a pen-and-paper informal standard like an RFC
      • We formalize this RFC language into code in the HACSpec
        language, which is very close to Rust
      • We use compilers to generate code for Jasmin, FStar... which
        can be used for proving properties about the code
      • Ultimately we can use these frameworks for verification and
        code generation
        • For example, we can generate assembly code from Jasmin
          or portable C code with FStar
    • Design goals

      • Familiar and easy to read/write
      • Succinct and strongly typed
      • Executable
    • Example of ChaCha20 specification in HACSpec and how it is
      translated to F*/Coq/EasyCrypt
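For a flavour of what such an executable specification looks like, here is the ChaCha20 quarter-round from RFC 8439 written as a plain function. This is an illustrative Python transcription, not actual HACSpec syntax (HACSpec is a typed subset of Rust):

```python
MASK32 = 0xFFFFFFFF  # ChaCha20 state words are 32-bit integers

def rotl32(x, n):
    """Rotate a 32-bit word left by n bits."""
    return ((x << n) | (x >> (32 - n))) & MASK32

def quarter_round(a, b, c, d):
    """The ChaCha20 quarter-round from RFC 8439, section 2.1."""
    a = (a + b) & MASK32; d = rotl32(d ^ a, 16)
    c = (c + d) & MASK32; b = rotl32(b ^ c, 12)
    a = (a + b) & MASK32; d = rotl32(d ^ a, 8)
    c = (c + d) & MASK32; b = rotl32(b ^ c, 7)
    return a, b, c, d

# Test vector from RFC 8439, section 2.1.1
out = quarter_round(0x11111111, 0x01020304, 0x9B8D6F43, 0x01234567)
print([hex(w) for w in out])
# ['0xea2a92f4', '0xcb1cf8ce', '0x4581472e', '0x5881c4bb']
```

Because the spec is ordinary executable code, it can be checked against the RFC's test vectors before any transpilation to F*/Coq/EasyCrypt.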

  • libCRUX: Cryptographic Library written in HACSpec

    • ECDH, AEAD, signatures, hashes, KDFs.
    • All specifications written in HACSpec and transpiled to
      different verification frameworks to generate code (see above)
    • Most verified code is generated C or assembly, which is finally
      wrapped into Rust

    • HACSpec is written in a functional subset of Rust that is easily
      translated to different languages

      • Specification language

        • Coq: one of the proof assistants with a large
          mathematical library
        • Formal, concise specification
        • Verification of the correctness of this specification
      • Functional correctness and constant-time code

        • Jasmin: one of the tools to generate formally verified
          assembly
        • Relate the high-level functional specification to the
          low-level implementation details
        • Secure compilation of the specification into highly
          optimised ASM code while preserving correctness and
          security guarantees
      • Cryptographic security

        • EasyCrypt: mechanised computational proofs based on the
          game-playing approach

  • Chris: do you have any code for generics?

    • Yes it’s coming in v2
  • Chris: What is the maintenance model for libCRUX?

    • There is a company that will do maintenance
  • Jonathan Hoyland: How usable are the tools?

    • Bas: HACSpec itself is quite easy to use as it is close to Rust,
      we gave BSc students the task to specify an implementation and
      test it, and they managed to do this very efficiently.

Open discussion and review of problem areas — Chairs

  • Benjamin Beurdouche:

    • One thing is identifying/listing tools by capabilities.
    • Other thing: try collecting within the IETF to find out where
      WGs are in need of formal methods.
      • Obvious for security areas, but less so for other areas.
      • We should not do the work here, but host discussions about it
  • Colin:

    • IRTF runs a workshop where we could put publications
  • Michael Rosa: What do we call "formal methods"? Do we still want to
    automatically extract from RFC text some specs that could be used
    for formal analysis?

  • Tetsuya: Different syntax in different tools but similar technique.
    It’s hard to define security properties.

    • Jonathan: A lot of protocols that we use are simple. We could
      provide some libraries so that people don’t have to start from
      scratch
    • HOL, Isabelle, ... have some existing libraries as well
  • Colin

    • What can we automate in terms of writing specifications?
  • Chris Patton: is it a goal of the work to be opinionated about
    techniques?

    • Stephen Farrell: we probably don't want to do this, as we will
      probably disagree on it
    • Jonathan Hoyland: we might want to recommend some of these though
  • Doug: similarly, how usable is "usable"? Different SDOs have
    provided intermediate representations that could be used to
    formalize things.

Next steps / action — Chairs

  • The chairs will send an email about this.
  • We can possibly have a call between now and IETF-117 if we want to.