Usage of existing crypto material serialization framework (JWK)


Thibault Normand

Apr 12, 2022, 11:08:27 AM
to The Update Framework (TUF)
Hello all,

I'm quite new to TUF, and its user/dev community.

I have a question concerning the use of the JWK standard (RFC 7517) to describe crypto material in TUF, so that we don't need to reimplement marshallers and can follow an existing standard.

I'm also quite confused by the fact that the specification merges the signature scheme and the key in the same place. Is that expected? My concern is that we could use different signer/verifier algorithms with the same key at implementation time. Imagine a FIPS-compliant build vs. another one. In the case of ECDSA, a key could be used with a FIPS-compliant signer (RFC 6979, deterministic signatures) or a non-compliant signer (non-deterministic signatures), depending on the execution environment.

Furthermore, I'm open to contributing.

Regards,


Tony Arcieri

Apr 12, 2022, 11:12:27 AM
to Thibault Normand, The Update Framework (TUF)
On Tue, Apr 12, 2022 at 9:08 AM 'Thibault Normand' via The Update Framework (TUF) <theupdate...@googlegroups.com> wrote:

> In the case of ECDSA, a key could be used with a FIPS-compliant signer (RFC 6979, deterministic signatures) or a non-compliant signer (non-deterministic signatures), depending on the execution environment.

From the perspective of anyone but the signer these details are irrelevant.

Verifiers cannot tell the difference, as the method only affects the selection of `k`, which is known only to the signer. While it indirectly affects the `r` component of the signature, both methods verify the same way, and a verifier looking at `r` cannot tell whether `k` was chosen randomly or via RFC 6979.
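This point can be checked directly with Go's standard library: two randomized ECDSA signatures over the same digest differ byte-for-byte, yet both verify against the same public key. A sketch; the key and message are placeholders, not anything from the TUF spec:

```go
package main

import (
	"bytes"
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/sha256"
	"fmt"
)

// signTwice signs the same digest twice with a randomized nonce and
// reports whether each signature verifies and whether the bytes match.
func signTwice() (v1, v2, same bool) {
	priv, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		panic(err)
	}
	digest := sha256.Sum256([]byte("example payload"))

	// Each call draws a fresh nonce, so the DER-encoded signatures differ.
	sig1, _ := ecdsa.SignASN1(rand.Reader, priv, digest[:])
	sig2, _ := ecdsa.SignASN1(rand.Reader, priv, digest[:])

	v1 = ecdsa.VerifyASN1(&priv.PublicKey, digest[:], sig1)
	v2 = ecdsa.VerifyASN1(&priv.PublicKey, digest[:], sig2)
	same = bytes.Equal(sig1, sig2)
	return
}

func main() {
	v1, v2, same := signTwice()
	fmt.Println(v1, v2, same)
}
```

The verifier only sees `(r, s)` and the public key; nothing in the output reveals how `k` was chosen.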

--
Tony Arcieri

Thibault Normand

Apr 12, 2022, 11:49:30 AM
to The Update Framework (TUF)
> From the perspective of anyone but the signer these details are irrelevant.

I completely agree.

I'm referring to https://theupdateframework.github.io/specification/latest/#keyval - which is used to declare a signature scheme, but it also declares the key used to verify.
In this example:

```
{ "keytype" : "ecdsa-sha2-nistp256", "scheme" : "ecdsa-sha2-nistp256", "keyval" : { "public" : PUBLIC } }
```

The `keytype` value is the same as the `scheme` value. It's probably an error in the spec, but to me this would be better presented like the following example.

```
{ "scheme" : "ecdsa-sha2-nistp256", "keyval" : {"kty":"EC", "crv":"P-256", "x":"f83OJ3D2xF1Bg8vub9tLe1gHMzV76e8Tus9uPHvRVEU", "y":"x_FEzRu9m36HLN_tue659LNpXW6pCyStikYjKIWI5a0", "kid":"Public key used in JWS spec Appendix A.3 example" }}
```

This way, we decouple the signer/verifier implementation from the crypto material it consumes as a resource.
That's also why I'm asking why we don't use the JWK spec to store the public key, rather than describing the signer as part of the key as in the current specification.
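For illustration, decoding a JWK like the one above into a usable verification key needs only Go's standard library. This is a sketch assuming a P-256 key, using the example coordinates quoted in this thread:

```go
package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"encoding/base64"
	"errors"
	"fmt"
	"math/big"
)

// jwkEC holds the public fields of an RFC 7517 EC key.
type jwkEC struct {
	Kty, Crv, X, Y string
}

// toPublicKey turns a P-256 JWK into a crypto/ecdsa public key.
func toPublicKey(k jwkEC) (*ecdsa.PublicKey, error) {
	if k.Kty != "EC" || k.Crv != "P-256" {
		return nil, errors.New("unsupported key type")
	}
	// JWK coordinates are base64url without padding (RFC 7517).
	xb, err := base64.RawURLEncoding.DecodeString(k.X)
	if err != nil {
		return nil, err
	}
	yb, err := base64.RawURLEncoding.DecodeString(k.Y)
	if err != nil {
		return nil, err
	}
	pub := &ecdsa.PublicKey{
		Curve: elliptic.P256(),
		X:     new(big.Int).SetBytes(xb),
		Y:     new(big.Int).SetBytes(yb),
	}
	// Reject coordinates that do not form a valid curve point.
	if !pub.Curve.IsOnCurve(pub.X, pub.Y) {
		return nil, errors.New("point is not on P-256")
	}
	return pub, nil
}

func main() {
	// Coordinates from the JWS Appendix A.3 example key quoted above.
	pub, err := toPublicKey(jwkEC{
		Kty: "EC", Crv: "P-256",
		X: "f83OJ3D2xF1Bg8vub9tLe1gHMzV76e8Tus9uPHvRVEU",
		Y: "x_FEzRu9m36HLN_tue659LNpXW6pCyStikYjKIWI5a0",
	})
	if err != nil {
		panic(err)
	}
	fmt.Println("decoded a valid P-256 point:", pub.X.Sign() > 0)
}
```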

To extend my FIPS example, we could have something like:

```
{ "scheme" : "fips-ecdsa-sha2-nistp256", "keyval" :  {"kty":"EC", "crv":"P-256", "x":"f83OJ3D2xF1Bg8vub9tLe1gHMzV76e8Tus9uPHvRVEU", "y":"x_FEzRu9m36HLN_tue659LNpXW6pCyStikYjKIWI5a0", "kid":"Public key used in JWS spec Appendix A.3 example" } }
```

to denote another verifier/signer implementation while using the same key.

IMHO, the current specification is a little foggy about the signer and the linked crypto material, and implementations (at least the Go one) are currently reproducing the same issue (lack of separation of responsibilities).

Santiago Torres

Apr 12, 2022, 12:07:46 PM
to Thibault Normand, The Update Framework (TUF)
Hi,

I'm just a bystander, so take my comments as such.
> In this example:
>
> ```
> { "keytype" : "ecdsa-sha2-nistp256", "scheme" : "ecdsa-sha2-nistp256", "keyval" : { "public" : PUBLIC } }
> ```
>
> The `keytype` value is the same as the `scheme` value. It's probably an error in the spec, but to me this would be better presented like the following example.

This is a historical artifact, meant to help with backwards
compatibility. Overall, I don't think it'd be ideal to keep these two
around, but there may be implementations still handling the "scheme"
field. There is a proposal to merge these two:

https://github.com/secure-systems-lab/securesystemslib/issues/308

But overall this feels like a "trivial" change, compared to moving over
to the JOSE ecosystem...

>
> ```
> { "scheme" : "ecdsa-sha2-nistp256", "keyval" : {"kty":"EC", "crv":"P-256", "x":"f83OJ3D2xF1Bg8vub9tLe1gHMzV76e8Tus9uPHvRVEU", "y":"x_FEzRu9m36HLN_tue659LNpXW6pCyStikYjKIWI5a0", "kid":"Public key used in JWS spec Appendix A.3 example" }}
> ```
>
> ...
>
> To extend my FIPS example, we could have something like:
>
> ```
> { "scheme" : "fips-ecdsa-sha2-nistp256", "keyval" : {"kty":"EC", "crv":"P-256", "x":"f83OJ3D2xF1Bg8vub9tLe1gHMzV76e8Tus9uPHvRVEU", "y":"x_FEzRu9m36HLN_tue659LNpXW6pCyStikYjKIWI5a0", "kid":"Public key used in JWS spec Appendix A.3 example" } }
> ```

Case in point: using a P-256 key without point compression is iffy, and I wonder why that is. Personally I'd argue for something between both sides, but without these confusing "kty", "crv", and similar fields...
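For reference, the size difference point compression makes is easy to check with Go's standard library. A sketch with a freshly generated key; the encodings are the SEC 1 forms that crypto/elliptic produces:

```go
package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"fmt"
)

// encodedSizes returns the byte lengths of the uncompressed
// (0x04 || X || Y) and compressed (0x02/0x03 || X) SEC 1 encodings
// of a fresh P-256 public key.
func encodedSizes() (uncompressed, compressed int) {
	priv, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		panic(err)
	}
	u := elliptic.Marshal(elliptic.P256(), priv.X, priv.Y)
	c := elliptic.MarshalCompressed(elliptic.P256(), priv.X, priv.Y)
	return len(u), len(c)
}

func main() {
	u, c := encodedSizes()
	// P-256: 65 bytes uncompressed vs. 33 bytes compressed.
	fmt.Printf("uncompressed: %d bytes, compressed: %d bytes\n", u, c)
}
```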


> to denote another verifier/signer implementation but use the same key.
>
> IMHO, the current specification is a little foggy about the signer and the
> linked crypto material and implementations (at least Go) are currently
> reproducing the same issue (lack of responsibility segregation).

I agree that making things a little clearer is a good goal, but I'd argue against pulling in the JOSE ecosystem to deal with this ambiguity...

Cheers!
-Santiago

Thibault Normand

Apr 12, 2022, 12:15:09 PM
to Santiago Torres, The Update Framework (TUF)

> Case in point, using a p256 key without point compression is iffy, and I
> wonder why this is.

I'm happy you noticed it because it was part of a second email.

> I agree that making things a little clearer is a good goal, but I'd
> argue against dealing with the JOSE ecosystem in to deal with this
> ambiguity...

JWK / JOSE or something else :-)

But let's not define yet another crypto material serialization format; reusing an existing one pools security issue coverage and effort.

I assumed that, since TUF uses JSON, JOSE/JWK is one of the most widely used standards for JSON payloads (as COSE is for CBOR).
That's what drove me to JOSE.

Santiago Torres

Apr 12, 2022, 12:25:08 PM
to Thibault Normand, The Update Framework (TUF)
On Tue, Apr 12, 2022 at 06:14:57PM +0200, Thibault Normand wrote:
> > Case in point, using a p256 key without point compression is iffy, and I
> > wonder why this is.
>
> I'm happy you noticed it because it was part of a second email.

Haha, I just paid attention to this thread today...

>
> > I agree that making things a little clearer is a good goal, but I'd
> > argue against dealing with the JOSE ecosystem in to deal with this
> > ambiguity...
>
> JWK / JOSE or something else :-)
>
> But don't define yet another crypto material serialization format to
> mutualize all security issue coverage and efforts.
>
> I assumed that TUF uses JSON and JOSE/JWK is one of the most used standards
> in the case of JSON usages and COSE for CBOR.
> That's what drove me to JOSE.

Yeah, I think that's reasonable. I'll say that the TUF key format
*predates* JOSE by about 5 years. It's also strange, because for some of
our keys (e.g., RSA) we use a slight wrapper over PKCS#8/7/PGP(!!),
whatever. Strictly speaking, I would assume that some of these formats
achieve enough in terms of FIPS requirements.

One way to look at it is that we are passing the key material as-is from
underlying implementations, plus some meta-information about the key that
we reason about (i.e., scheme, keyid). This is partly why I don't
know how much you win by pulling in JOSE (or COSE, for that matter). A
lot of these formats are trying not only to replace but to reason about
the same things as PKCS7, PKCS12, and a bunch of other TLS-related
RFCs. This makes things a little hard to study tractably without
merging their PKI semantics (and/or creating some sub-spec to specify
what all of those fields mean in our context). I am wary because of
that; it adds more surface to reflect upon in terms of security
properties.

Cheers!
-Santiago

Thibault Normand

Apr 12, 2022, 1:08:59 PM
to Santiago Torres, The Update Framework (TUF)
> I'm happy you noticed it because it was part of a second email.


I forgot to add the important word "incoming" after "second" in the sentence.


> A lot of these formats are trying to not only replace but reason about
> the same things that PKCS7, PKCS12, as well as a bunch of other TLS
> related RFCs.

Yes, the PKCS formats are a bit too opaque and focused on private/public key storage.
JOSE brings transparency to key components and offers storage for various kinds of crypto material (symmetric keys, seeds, etc.).

BTW, I'm not proposing to use JOSE so much as asking why we are trying to define yet another format, which will lead to feature requests already covered by existing RFCs.

Tony Arcieri

Apr 12, 2022, 1:25:41 PM
to Thibault Normand, The Update Framework (TUF)
On Tue, Apr 12, 2022 at 9:49 AM 'Thibault Normand' via The Update Framework (TUF) <theupdate...@googlegroups.com> wrote:

> To extend my FIPS example, we could have something like:
>
> ```
> { "scheme" : "fips-ecdsa-sha2-nistp256", "keyval" : {"kty":"EC", "crv":"P-256", "x":"f83OJ3D2xF1Bg8vub9tLe1gHMzV76e8Tus9uPHvRVEU", "y":"x_FEzRu9m36HLN_tue659LNpXW6pCyStikYjKIWI5a0", "kid":"Public key used in JWS spec Appendix A.3 example" } }
> ```
>
> to denote another verifier/signer implementation but use the same key.

I don't think there's any reason for this to exist, especially on a public key.

For starters, RFC6979 can be used in a FIPS-compliant manner. RFC6979 is built on HMAC-DRBG, which is a SP 800-90A-compliant DRBG. RFC6979 Section 3.3 describes how to instantiate HMAC-DRBG with an "entropy_input" parameter which will produce a nondeterministic signature (but guaranteed to have a uniformly random k). So there's no reason to muddy up algorithm identifiers with these concerns: implementations which aim to be FIPS-compliant can handle these concerns internally.

In regard to attaching this information to public keys: it has no material impact on how signatures are verified, nor can a verifier even tell the difference, which means a signer can lie about how they produced the signature. In that regard, attaching it to the public key serves no discernible purpose.

--
Tony Arcieri

Thibault Normand

Apr 12, 2022, 2:09:32 PM
to Tony Arcieri, The Update Framework (TUF)
> I don't think there's any reason for this to exist, especially on a public key.

OK :-)

> For starters, RFC6979 can be used in a FIPS-compliant manner. RFC6979 is built on HMAC-DRBG, which is a SP 800-90A-compliant DRBG. RFC6979 Section 3.3 describes how to instantiate HMAC-DRBG with an "entropy_input" parameter which will produce a nondeterministic signature (but guaranteed to have a uniformly random k). So there's no reason to muddy up algorithm identifiers with these concerns: implementations which aim to be FIPS-compliant can handle these concerns internally.

Thank you for the explanation. My FIPS example probably sent you in the wrong direction; it was just an example of reusing a key with different signing algorithms.


> In regard to attaching this information to public keys: this has no material impact on how signatures are verified, nor can a verifier even tell the difference, which means a signer can lie about how they produced the signature. In that regard attaching it to the public key serves no discernable purpose.

Sure, but using a flat descriptor adds confusion (at least to me) and potential change-conflict problems.
In addition, it drives implementers (at least the Go ones) to bad decisions (marshalling behavior inside the signer). And in my experience, the more confusing a spec is, the less it will be adopted.

As an example of JWK integration, I'm thinking more of something like https://w3c-ccg.github.io/security-vocab/#classes - used for DIDs (Decentralized Identifiers). Digital signatures tend to prove provenance and digital identity based on trust in the public key. This envelope describes various identity verification mechanisms and splits keys from signatures to handle the different encoding requirements. TUF essentially uses local crypto-based proofs, but we could imagine, in the future, various mechanisms used to prove the validity of the integrity proof in a distributed manner.

Jussi Kukkonen

Apr 14, 2022, 5:17:25 AM
to The Update Framework (TUF), Thibault Normand
Hi Thibault,

I'm not a cryptography person but I am pretty familiar with (ab)using the TUF metadata format to make it do what I need... which leads me to ask: what is really preventing you from using JWK keys in TUF?

I'm not saying it would work without any integration, but it seems like the required marshalling is really minimal: the "keyval" structure in the TUF public key definition is left completely up to the keytype, and you can define new keytypes. I would imagine you would just define new TUF JWK keytype(s), plonk the JWK public key structure inside "keyval"... and as far as the metadata is concerned, that's it.

If it all works, you'd then try to get the keytype definition included in the TUF spec -- possibly as the default/suggested keytype -- and convince implementations to support it.

Is there something I'm missing here?

Jussi



Thibault Normand

Apr 14, 2022, 6:08:04 AM
to Jussi Kukkonen, The Update Framework (TUF)
> I'm not a cryptography person but I am pretty familiar with (ab)using the TUF metadata format to make it do what I need... which leads me to ask: what is really preventing you from using JWK keys in TUF?

Nothing; I could completely abuse the spec. But I always prefer sticking to the spec and trying to remove ambiguities over hacking around the implementation, eventually diverging from the spec and running into interoperability problems.

My comment is more about the key object, which carries both `keytype` and `scheme` attributes. This is quite confusing: `keytype` is associated with `keyval`, so why do we specify a `scheme` here, which can refer to the signing method, while `keytype` can take the same value as `scheme` - which is semantically incorrect? While reviewing the go-tuf implementation, it felt like I was almost reimplementing a new JSON-based key encoding standard. That feeling drove me to the question: why reinvent something that is an industry standard today?

> I'm not saying it would work without any integration but it seems like the required marshalling is really minimal: the "keyval" structure in TUF public key definition specifically is left completely up to the keytype, and you can define new keytypes. I would imagine you would just define new TUF JWK keytype(s), plonk the JWK public key structure inside "keyval"... and as far as the metadata is concerned, that's it.

Using `keytype` as a discriminant to know when `keyval` contains a JWK is not a good idea either. A `keyencoding` attribute should probably be introduced to tell the consumer which encoding is used in `keyval`, since we could have PEM, JWK, point, compressed point, raw, etc.

> If it all works you'd then try to get the keytype definition included in the TUF spec -- possibly as the default/suggested keytype -- and convince implementations to support it.

I'll try :-)

Jussi Kukkonen

Apr 14, 2022, 8:44:07 AM
to Thibault Normand, The Update Framework (TUF)

Can you explain what you mean by the "signer object" exactly? A TUF signature structure consists of two things only:
  • a string identifier
  • the signature itself
The spec does list some public key fields confusingly close to where the signature structure is defined, but the fact remains: the signature structure does *not* contain any details about the key. Maybe you are looking at some specific implementation?

As to your question "why reinvent something" -- I don't have an answer, but I do think the answer could be of historical relevance only: TUF has existed in some form for more than a decade. Its invention cannot be prevented (without a time machine). In fact, TUF might be about the same age as JSON Web Keys...

Jussi



Thibault Normand

Apr 14, 2022, 9:46:32 AM
to Jussi Kukkonen, The Update Framework (TUF)
> Can you explain what you mean by the "signer object" exactly? A TUF signature structure consists of two things only:

From the spec, this is known as a Key:

```
{ "keytype" : "ecdsa-sha2-nistp256", "scheme" : "ecdsa-sha2-nistp256", "keyval" : { "public" : PUBLIC } }
```

`keytype` and `scheme` can contain identical values. IMHO it is semantically incorrect to merge them: the spec mentions "rsa" and "ed25519", which are acceptable key types, but "ecdsa-sha2-nistp256" is not a key type; it is a signature method using ECDSA with SHA-256 as the hash over the P-256 curve. To me it looks like mixing concepts, especially when "scheme" and "keytype" share the same value. That's my initial observation.

For semantic consistency, it should be:

```
{ "keytype" : "nistp256", "scheme" : "ecdsa-sha2-nistp256", "keyval" : { "public" : PUBLIC } }
```

But by doing this, we mix the Key info (`keytype` and `keyval`) with the Signer (`scheme`). Also, semantically, IMHO `keytype` should refer to the key family, so "rsa" or "ec" - which would lead us to add a `keycurve` attribute to specify the curve used when `keytype` is "ec".

Finally, while I was designing this feature, I realized it was like reimplementing a JSON-based serialization format to handle TUF Key deserialization, which is why I asked the initial question: "Why not use JWK?"

From this point, I started to study how the Key / Signer / Signature concepts are defined in TUF and implemented in go-tuf. Since the specification mixes the concepts, the go-tuf implementation follows the same direction.
I started a fork - https://github.com/Zenithar/go-tuf/tree/feat_key_signer_decoupling - where I am splitting the TUF Key serialization requirements from the signer instance, to allow more flexibility in signing method implementations (delegating to Vault / KMS, etc.) and to try to fix various issues concerning key deserialization (possible DoS via JSON deserialization, possible public/private key mismatch, use of uncompressed point marshalling, etc.).

> As to your question "why reinvent something" -- I don't have an answer but I do think the answer could have historical relevance only: TUF has existed in some form for more than a decade.

Probably, but inevitably, if you want to extend key support (P-384, Ed448, etc.), you will have to handle the key serialization topic - something already well covered by external standards that have become industry standards. I don't want to weigh the pros and cons of standards usage; I'm just trying to bridge standards so we can use the most effective parts of each. Also, from an implementer's PoV, reusing existing standards increases the chance of having library support, versus being forced to reimplement key serialization logic in order to support TUF, which could also be a source of issues.

> It's invention cannot be prevented (without a time machine). In fact TUF might be about the same age as JSON Web keys...

And I would like to thank you for its invention. :-)

Lukas Puehringer

Apr 19, 2022, 8:58:22 AM
to Thibault Normand, The Update Framework (TUF)
On 14.04.2022, at 15:46, 'Thibault Normand' via The Update Framework (TUF) <theupdate...@googlegroups.com> wrote:

> > Can you explain what you mean by the "signer object" exactly? A TUF signature structure consists of two things only:
>
> From the spec, this is known as a Key:
>
> ```
> { "keytype" : "ecdsa-sha2-nistp256", "scheme" : "ecdsa-sha2-nistp256", "keyval" : { "public" : PUBLIC } }
> ```
>
> keytype and scheme could contain identical values. IMHO this is semantically incorrect to merge key type which is used to denote a key type as the spec mention "rsa", "ed25519" which are acceptable key types but "ecdsa-sha2-nistp256" is not a key type but a signature method using ECDSA with SHA256 for hash and P-256 as signing material. For me, it looks like mixing concepts, especially when I can see "scheme" and "keyType" sharing the same value. That's my initial observation.
>
> It should be for semantic consistency
>
> ```
> { "keytype" : "nistp256", "scheme" : "ecdsa-sha2-nistp256", "keyval" : { "public" : PUBLIC } }
> ```

I agree with all of this and would love to see a semantically more accurate use of these fields. It seems like you read through https://github.com/secure-systems-lab/securesystemslib/issues/308, which describes the issue and calls for a revision of the supported values here. I suppose nobody has attempted to fix it so far because, from an operational PoV, the issue does not seem important enough to justify a likely backwards-incompatible change.

> But by doing this, we mix the Key info (keyType and keyVal), and the Signer (scheme).

https://github.com/theupdateframework/taps/blob/master/tap9.md provides a detailed rationale for why the signing scheme is part of the authenticated (!) public key info in TUF metadata, and not part of the attacker-controlled (!) signature info as it used to be.