Hey! I'm David, cofounder of zkSecurity and the author of the Real-World Cryptography book. I was previously a crypto architect at O(1) Labs (working on the Mina cryptocurrency), before that I was the security lead for Diem (formerly Libra) at Novi (Facebook), and a security consultant for the Cryptography Services of NCC Group. This is my blog about cryptography and security and other related topics that I find interesting.

# EdDSA, Ed25519, Ed25519-IETF, Ed25519ph, Ed25519ctx, HashEdDSA, PureEdDSA, WTF? posted March 2020

## The Edwards-curve Digital Signature Algorithm (EdDSA)

You've heard of EdDSA, right? The shiny new signature scheme (well, new… it's been around since 2011, wake up).

Since its inception, EdDSA has evolved quite a lot, and it has gone through some amount of standardization. It's even set to be adopted by NIST in FIPS 186-5!

First, some definitions:

• EdDSA stands for Edwards-curve Digital Signature Algorithm. As its name indicates, it is supposed to be used with twisted Edwards curves (a type of elliptic curve). Its name can be deceiving though, as it is not based on the Digital Signature Algorithm (DSA) but on Schnorr signatures!
• Ed25519 is the name given to the algorithm combining EdDSA and the Edwards25519 curve (a curve birationally equivalent to Curve25519, but discovered later and much more efficient).

EdDSA, Ed25519, and the more secure Ed448 are all specified in RFC 8032.

## RFC 8032: Edwards-Curve Digital Signature Algorithm (EdDSA)

RFC 8032 takes some new directions from the original paper:

• It specifies a malleability check during verification, which prevents ill-intentioned people from forging an additional valid signature from an existing signature of yours. Whenever someone talks about Ed25519-IETF, they probably mean "the algorithm with the malleability check".
• It specifies a number of Ed25519 variants, which are the reason for this post.
• Maybe some other stuff I'm missing.

To sign with Ed25519, the original algorithm defined in the paper, here is what you're supposed to do:

1. compute the nonce as HASH(nonce_key || message)
2. compute the commitment R = [nonce]G with G the generator of the group.
3. compute the challenge as HASH(commitment || public_key || message)
4. compute the proof S = nonce + challenge × signing_key
5. the signature is (R, S)

where HASH is just the SHA-512 hash function.
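To make steps 1–5 concrete, here's a toy sketch in Python. Big caveat: for brevity it swaps the Edwards25519 curve for a tiny multiplicative Schnorr group, so this is not Ed25519 and is not secure in any way; only the deterministic-nonce / commitment / challenge / proof structure is the same.

```python
import hashlib

# Toy stand-in group: p = 2q + 1, g of prime order q.
# NOT Edwards25519, NOT secure -- for illustration only.
p, q, g = 1019, 509, 4

def H(*parts):
    h = hashlib.sha512()
    for part in parts:
        h.update(part)
    return int.from_bytes(h.digest(), "little") % q

def keygen(seed):
    # mimics Ed25519: both secrets come out of a single SHA-512 call
    digest = hashlib.sha512(seed).digest()
    signing_key = int.from_bytes(digest[:32], "little") % q
    nonce_key = digest[32:]
    public_key = pow(g, signing_key, p)
    return signing_key, nonce_key, public_key

def sign(signing_key, nonce_key, public_key, message):
    nonce = H(nonce_key, message)                     # 1. deterministic nonce
    R = pow(g, nonce, p)                              # 2. commitment [nonce]G
    c = H(R.to_bytes(2, "little"),                    # 3. challenge
          public_key.to_bytes(2, "little"), message)
    S = (nonce + c * signing_key) % q                 # 4. proof
    return R, S                                       # 5. signature

def verify(public_key, message, signature):
    R, S = signature
    c = H(R.to_bytes(2, "little"),
          public_key.to_bytes(2, "little"), message)
    # [S]G == R + [challenge]A, written multiplicatively: g^S == R * A^c
    return pow(g, S, p) == (R * pow(public_key, c, p)) % p
```

Verification works because g^S = g^(nonce + c·signing_key) = R · public_key^c.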

At a high-level this is similar to Schnorr signatures, except for the following differences:

• The nonce is generated deterministically (as opposed to probabilistically) from a fixed nonce_key (itself derived from your private key) and the message M. This is one of the cool features of Ed25519: it prevents you from accidentally re-using the same nonce with two different messages.
• The challenge is computed not only with the commitment and the message to sign, but with the public key of the signer as well. Do you know why?

Important: notice that the message here does not need to be hashed before being passed to the algorithm, as it is already hashed as part of the algorithm.

Anyway, we still don't know WTF all the variants specified are.

## PureEdDSA, ContextEdDSA and HashEdDSA

Here are the variants that the RFC actually specifies:

• PureEdDSA, shortened as Ed25519 when coupled with Edwards25519.
• HashEdDSA, shortened as Ed25519ph when coupled with Edwards25519 (and where ph stands for "prehash").
• Something with no name we'll call ContextEdDSA, defined as Ed25519ctx when coupled with Edwards25519.

All three variants can share the same keys. They differ only in their signing and verification algorithms.
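For instance, the key expansion that all three variants share (as specified in RFC 8032) hashes a 32-byte seed with SHA-512, clamps the first half into the secret scalar, and keeps the second half as the nonce key:

```python
import hashlib

def expand_secret(seed):
    # RFC 8032 Ed25519 key expansion, shared by all three variants
    assert len(seed) == 32
    h = hashlib.sha512(seed).digest()
    scalar = bytearray(h[:32])
    scalar[0] &= 248         # clamping: clear the 3 lowest bits
    scalar[31] &= 127        # clear the highest bit
    scalar[31] |= 64         # set the second-highest bit
    nonce_key = h[32:]       # the "prefix", used to derive nonces
    return bytes(scalar), nonce_key
```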

By the way, Ed448 is a bit different, so from now on I'll focus on EdDSA with the Edwards25519 curve.

Ed25519 (or pureEd25519) is the algorithm I described in the previous section.

Easy!

Ed25519ctx (or ContextEd25519) is pureEd25519 with an additional modification: the HASH(.) function used in the signing protocol I described above is re-defined as HASH(x) = SHA-512(some_encoding(flag, context) || x) where:

• flag is set to 0
• context is a context string (mandatory only for Ed25519ctx)

In other words, the two instances of hashing in the signing algorithm now include some prefix. (Intuitively, you can also see that these variants are totally incompatible with each other.)

Right off the bat, you can see that ContextEd25519's big difference is just that it mandates some domain separation on top of Ed25519.

Ed25519ph (or HashEd25519), finally, builds on top of ContextEd25519 with the following modifications:

• flag is set to 1
• context is now optional, but advised
• the message is replaced with a hash of the message (the specification says that the hash has to be SHA-512, but I'm guessing that it can be anything in reality)

OK. So the big difference now seems to be that we are doubly-hashing.
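Both modifications can be sketched directly from RFC 8032: the prefix string and the flag/context encoding below are the ones defined in the RFC (the prefix is absent entirely in pureEd25519).

```python
import hashlib

def dom2(flag, context):
    # the prefix from RFC 8032: flag = 0 for Ed25519ctx, 1 for Ed25519ph
    assert len(context) <= 255
    return (b"SigEd25519 no Ed25519 collisions"
            + bytes([flag, len(context)]) + context)

def ctx_hash(flag, context, x):
    # the re-defined HASH(x) = SHA-512(dom2(flag, context) || x)
    return hashlib.sha512(dom2(flag, context) + x).digest()

def prehash(message):
    # in Ed25519ph, the message itself is also replaced by SHA-512(message)
    return hashlib.sha512(message).digest()
```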

## Why HashEdDSA and why double hashing?

First, pre-hashing sucks: it kills the collision resistance of the signature algorithm. In PureEdDSA we assume that we sign the original message, not a hash of it (although this is not always true in practice: the caller of the function can do whatever they want). In that case, a collision on the hash function wouldn't matter (i.e., it wouldn't let someone create one signature that validates two different messages), because they would also have to find a collision on the nonce, which is computed using a secret (the nonce key).

But if you pre-hash the message, then finding a collision there is enough to obtain a signature that validates two messages.
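A toy demonstration of this, with an intentionally broken prehash standing in for a hash whose collision resistance has fallen, and HMAC standing in for the signing step itself (the forgery only depends on the prehash colliding, not on which signature scheme sits behind it):

```python
import hashlib
import hmac

def broken_prehash(message):
    # stand-in for a collision-broken hash: collisions are trivial here
    return message[:8]

def sign_ph(key, message):
    # prehash-then-sign; HMAC is a stand-in for the actual signature
    return hmac.new(key, broken_prehash(message), hashlib.sha512).digest()

def verify_ph(key, message, sig):
    return hmac.compare_digest(sig, sign_ph(key, message))

m1 = b"pay Bob 1 dollar"
m2 = b"pay Bob 1000000 dollars"
sig = sign_ph(b"secret key", m1)
# the signature over m1 also validates m2: a cross-message forgery
assert verify_ph(b"secret key", m2, sig)
```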

Thus, you should use PureEdDSA if possible. And use it correctly (pass it the actual message, not a hash of it).

Why is HashEdDSA a thing then?

The EdDSA for more curves paper, which introduced HashEdDSA, has this to say:

> The main motivation for HashEdDSA is the following storage issue (which is irrelevant to most well-designed signature applications). Computing the PureEdDSA signature of M requires reading through M twice from a buffer as long as M, and therefore does not support a small-memory "Init-Update-Final" interface for long messages. Every common hash function H′ supports a small-memory "Init-Update-Final" interface for long messages, so H′-EdDSA signing also supports a small-memory "Init-Update-Final" interface for long messages. Beware, however, that analogous streaming of verification for long messages means that verifiers pass along forged packets from attackers, so it is safest for protocol designers to split long messages into short messages to be signed; this splitting also eliminates the storage issue.

## Why am I even looking at this rabbit hole?

Because I'm writing a book, and it'd be nice to explain what the hell is going on with Ed25519.

Typo in step 4 of the signing procedure:

S = (nonce + commitment × signing_key)

Should be:

S = (nonce + challenge × signing_key)

indeed, thanks!

#### david

discussion on HN: https://news.ycombinator.com/item?id=22569716

#### Frank

I see no explanation for ContextEdDSA, therefore I assume that

let's call it ContextEdDSA''
should be
ContextEd25519''

?

#### david

The naming is pretty bad in the RFC:

* the RFC defines Ed25519ctx
* I call that ContextEd25519
* generalized to EdDSA I call that ContextEdDSA

#### Matthew Riley

> The challenge is computed not only with the commitment and the message to sign, but with the public key of the signer as well. Do you know why?

The EdDSA paper says the “use of A is an inexpensive way to alleviate concerns that several public keys could be attacked simultaneously”.

#### Kieran Miller

Great article! Quick comment on pre-hashing - I wouldn't say it sucks as it has real value in systems where private keys remain non-exportable inside of a cryptographic token (e.g., HSM).

In many real-world scenarios the signing keys are centrally managed in a network-based HSM and so clients have to send their data to sign to the HSM (preferably via a signing service proxy). When the data to sign is small (e.g., challenge-response mechanisms, countersigning a signature, etc.) this is not a big deal but in cases where the data is large (e.g., signing large documents, code signing, etc.) this can become very slow and consume unnecessary bandwidth. For the latter case, pre-hashing is extremely valuable, especially when lots of signing is happening (e.g., CI/CD environment for large software companies signing each build), as it keeps the data over the network to a minimum and speeds things up.

All in all, pre-hashing is a trade-off depending on your particular situation. Also, while it is obviously possible to find hash collisions for any fixed-length hash function, using a properly-vetted hash function (and migrating away once it has been shown to have issues or otherwise becomes deprecated) should reasonably shield you from collisions. Of course, this all depends on the lifetime of your keys and how long your signatures need to be considered secure/valid.

Guess that wasn't quite a "quick" comment. Looking forward to reading your book!

#### david

Thanks for your comment Kieran! Interestingly, we are pretty much faced with the same problem in Libra.