
Revision 10's Extra 'algorithm' Note #29

Open
cjslep opened this issue May 16, 2018 · 15 comments
Labels
project Meta issue on project, principles, processes protocol Core mechanics of the protocol

Comments

@cjslep commented May 16, 2018

Revision 10 begins deprecating the 'algorithm' parameter. It added:

If algorithm is provided and differs from the key metadata identified by the keyId then an implementation MUST produce an error.

I interpreted this as being at verification time. This means that the only way for verification to continue if algorithm is provided is for it to match the keyId metadata.

This makes the following addition in revision 10 kind of moot:

Note: The application verifying the signature MUST derive the algorithm from the metadata associated with the keyId and MUST NOT use the value of algorithm from the signed message.

Since they will be the same (having thrown no error), this prioritization shouldn't matter. In the case they aren't the same, an error is produced so it trivially satisfies the MUST NOT.

Is there a nuance I am missing reading these?
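To make my reading concrete, here is a minimal Go sketch of how I understand the revision 10 rules (the metadata map and function names are hypothetical, not from any real library):

```go
package main

import (
	"errors"
	"fmt"
)

// keyMetadata maps a keyId to the algorithm registered for that key.
// The contents here are hypothetical examples.
var keyMetadata = map[string]string{
	"https://example.com/actor#main-key": "rsa-sha256",
}

// deriveAlgorithm applies my reading of the revision 10 rules: the
// algorithm is always taken from the keyId metadata, and a mismatching
// 'algorithm' parameter in the signature is a hard error.
func deriveAlgorithm(keyID, sigAlgorithm string) (string, error) {
	meta, ok := keyMetadata[keyID]
	if !ok {
		return "", errors.New("unknown keyId")
	}
	// "If algorithm is provided and differs from the key metadata
	// identified by the keyId then an implementation MUST produce an error."
	if sigAlgorithm != "" && sigAlgorithm != meta {
		return "", errors.New("algorithm parameter does not match keyId metadata")
	}
	// Absent an error, the metadata value and the signed value are the
	// same, so deriving from metadata changes nothing.
	return meta, nil
}

func main() {
	alg, err := deriveAlgorithm("https://example.com/actor#main-key", "rsa-sha256")
	fmt.Println(alg, err)
}
```

As written, the last return statement is the only path where verification proceeds, which is why the Note's prioritization never comes into play.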

@msporny (Contributor) commented May 16, 2018

I interpreted this as being at verification time. This means that the only way for verification to continue if algorithm is provided is for it to match the keyId metadata.

Yes, that is the proper interpretation.

This makes the following addition in revision 10 kind of moot:

Note: The application verifying the signature MUST derive the algorithm from the metadata associated with the keyId and MUST NOT use the value of algorithm from the signed message.
Since they will be the same (having thrown no error), this prioritization shouldn't matter. In the case they aren't the same, an error is produced so it trivially satisfies the MUST NOT.
Is there a nuance I am missing reading these?

You are correct in that the text is more or less moot. We're attempting to deprecate the feature while ensuring backwards compatibility for existing implementations in the field until those implementations are updated. We state the same sort of thing in multiple places in the specification so that implementers don't miss our intent to deprecate the feature.

Do you have a suggestion on better language that would achieve these goals?

@cjslep (Author) commented May 16, 2018

I see, thanks for clarifying! I don't have better language here.

However, I now have an additional question about this deprecation: are existing implementations that depend only on the algorithm parameter (not using metadata about keyId) now noncompliant with the spec? That is, there appears to be no transition period during which algorithm was still required to be present but implementations were expected to switch to keyId metadata for the actual determination, which would have allowed peers to understand one another while migrating.

I am just diving this deep because I am implementing a fresh library in golang and want to be sure it is future-proof and still compatible with existing previously-conforming implementations. Which does mean doing the extra work to add the algorithm parameter despite it going away. Other new implementations may not want to do this.

@msporny (Contributor) commented May 16, 2018

However, I now have an additional question in the case of this deprecation: are the existing implementations that depend on only the algorithm parameter (not using metadata about keyId) now noncompliant with the spec?

None of the existing implementations will break based on this update. All of the existing implementations should still be semi-compliant with the spec, in that none of them were checking metadata associated with keyId.

We expect that if existing implementations are not upgraded, they will eventually break once we remove the requirement for the algorithm parameter. That is, clients won't send the algorithm parameter, which will confuse older implementations... at which point developers will complain and hopefully push people to upgrade their systems. Some systems will not be able to upgrade, which is why we're still allowing algorithm to exist: we will deprecate it, but may never be able to fully remove it.

This is just a signal to the implementers that we would like to remove it entirely and to get feedback from them on doing that.

We know of no exploit in the wild that takes advantage of the algorithm parameter being specified by the client. We're just making the change in advance of a potential security vulnerability.

I am just diving this deep because I am implementing a fresh library in golang and want to be sure it is future-proof and still compatible with existing previously-conforming implementations. Which does mean doing the extra work to add the algorithm parameter despite it going away. Other new implementations may not want to do this.

Great! Can we list you as an implementer? Is your implementation available on the public web (so we can point to it?)

@nightpool commented May 16, 2018

almost every s2s implementation on this page: https://activitypub.rocks/implementation-report/ uses HTTP Signatures for authenticating the content sent between servers. The largest of these is Mastodon, with 2000 servers and over a million users.

(as far as i understand it, this change technically breaks mastodon's implementation since we require the algorithm parameter, and throw an error if it doesn't exist, so an implementation that treated it as optional and didn't include it would not be able to communicate with us. like you said though, no implementations do this currently so it's not a huge deal)
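As a sketch of what treating algorithm as optional could look like on the verifier side (the parser and the hard-coded fallback are illustrative assumptions, not Mastodon's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// parseSignatureParams does a minimal parse of the comma-separated
// key="value" pairs in a Signature header. It is a sketch only: it does
// not handle quoted commas or escapes.
func parseSignatureParams(header string) map[string]string {
	params := map[string]string{}
	for _, part := range strings.Split(header, ",") {
		kv := strings.SplitN(part, "=", 2)
		if len(kv) != 2 {
			continue
		}
		params[strings.TrimSpace(kv[0])] = strings.Trim(kv[1], `"`)
	}
	return params
}

// algorithmOrDefault treats 'algorithm' as optional, falling back to the
// algorithm known from keyId metadata (hard-coded here as an assumption).
func algorithmOrDefault(params map[string]string) string {
	if alg, ok := params["algorithm"]; ok {
		return alg
	}
	return "rsa-sha256" // assumed keyId metadata
}

func main() {
	params := parseSignatureParams(`keyId="key-1",headers="date",signature="xxx"`)
	fmt.Println(algorithmOrDefault(params))
}
```

An implementation that errors on a missing algorithm parameter, rather than falling back like this, is the compatibility hazard being described.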

@nightpool

The projects that i know currently implement HTTP Signatures off the top of my head are Mastodon, Pleroma, Kroeg, Hubzilla, Peertube, Bridgy Fed, and Friendica. I don't know how many of those use existing libraries vs implementing it wholesale, like Mastodon does.

@msporny (Contributor) commented May 16, 2018

Mastodon, Pleroma, Kroeg, Hubzilla, Peertube, Bridgy Fed, and Friendica

Note to self (@msporny) to track down implementations and see what libs are being used.

Thanks @nightpool!

@nightpool

as of mastodon/mastodon#7525 mastodon no longer requires the algorithm field. (although we still assume the signature is rsa-sha256)

@nightpool

well, I'm here and I have a passing familiarity with most of these codebases, so I might as well take a look:

i was mistaken about friendica, I was thinking they were working on an AP implementation, but i can't find it now.

@cjslep (Author) commented May 19, 2018

@msporny FYI my implementation is available as a stand alone golang lib at: https://github.com/go-fed/httpsig

@msporny (Contributor) commented May 20, 2018

@cjslep wrote:

@msporny FYI my implementation is available as a stand alone golang lib at: https://github.com/go-fed/httpsig

Noted here: #1 (comment)

Thank you! I'm in the process of noting all of the other implementations as well.

@liamdennehy (Contributor)

This proposal assumes there is metadata the verifier has associated with a keyId.

One of the benefits of using PKI is that certificates are issued to identities that don't require knowledge of each other. There is at least one scheme (required in legislation in the EU) that requires attributes encoded in the certificate itself and the specific issuer to be the entire basis of trust and identification when two strangers meet. There is no way to derive the algorithm from any pre-existing metadata or the certificate itself in this instance. This change would make the protocol incompatible with that requirement.

If a verifier requires specific attributes of the signature - such as headers, algorithm - then this protocol should allow those to be communicated in a standard way just as e.g. TLS specifies acceptable cipherspecs, and not assume that the verifier will be able to guess, especially as the field of algorithms evolves and broadens.

@liamdennehy (Contributor)

We're attempting to deprecate the feature while ensuring backwards compatibility for existing implementations in the field until such a time as the field implementations are updated

I think at some point we should explicitly forbid the legacy mechanisms - I don't think there's much value in publishing a specification that includes legacy sidesteps that came up during its development, and especially ones acknowledged to be removed for good security reasons.

Appreciate this is a big change, but then v11 is already quite breaking. Definitely needs discussion...

@liamdennehy liamdennehy added project Meta issue on project, principles, processes protocol Core mechanics of the protocol labels Aug 22, 2019
@nightpool

One of the benefits of using PKI is that certificates are issued to identities that don't require knowledge of each other.

The current spec has nothing to do with PKI. I don't think attempting to introduce PKI into a proposal that currently avoids that whole sphere of complexity is a good idea.

There is at least one scheme (required in legislation in the EU) that requires attributes encoded in the certificate itself and the specific issuer to be the entire basis of trust and identification when two strangers meet

A scheme of what? Required by who, for what purpose? Such a scheme (if building it is useful) can be productively built on top of HTTP Signatures without requiring us to include it in the base spec.

@liamdennehy (Contributor)

attempting to introduce PKI

Not my intention, but I am trying to make sure we don't cut off potential use cases. PKI was only illustrative.

A scheme of what? Required by who, for what purpose?

https://www.berlin-group.org
Building on: https://en.wikipedia.org/wiki/EIDAS

Indeed they can be built on top of this, but not if we don't communicate the hash algorithm:

The algorithm field encodes two distinct algorithm parameters: the signature algorithm that performs the signing, and the hash algorithm used to derive the digest that is actually signed. My original objection was against implying we could remove all of it and that somehow the key would know which hash algorithm to use, something not encoded into the key or even preferentially associated with it.

With the algorithm defaulting to hs2019 (meaning SHA-512), this is no longer an issue.

Given that the value of algorithm determines other behaviour (such as the permissibility of (created)), I don't think making it optional eases implementation. Implementers will need to make more assumptions and implement more logic than if they always had a single, reliable source for the behaviour mode.
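As an illustration of algorithm driving behaviour: recent drafts say an implementation MUST produce an error if (created) is used with an algorithm starting with rsa, hmac, or ecdsa, while hs2019 permits it. A minimal Go sketch of that check:

```go
package main

import (
	"fmt"
	"strings"
)

// createdAllowed sketches the draft rule: the (created) pseudo-header
// must not be used when the algorithm starts with "rsa", "hmac", or
// "ecdsa"; hs2019 (and any future non-prefixed name) permits it.
func createdAllowed(algorithm string) bool {
	for _, prefix := range []string{"rsa", "hmac", "ecdsa"} {
		if strings.HasPrefix(algorithm, prefix) {
			return false
		}
	}
	return true
}

func main() {
	fmt.Println(createdAllowed("hs2019"), createdAllowed("rsa-sha256"))
}
```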

@ioggstream (Contributor)

Given the value of algorithm determines other behaviour (such as permissibility of (created)), I don't think it eases implementation by making this optional

algorithm could then replace the versioning scheme I proposed in #53

I think that we could define an algorithm table with:

  • a list of allowed virtual headers (having the spec state "if the algorithm is FOO and the virtual header is BAR then fail..." for every combination is bloated and open to misinterpretation)
  • the hash algorithm to use
  • a de/serialization process

This would simplify the use of sha256 too.
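In Go, the proposed table could be sketched like this (the entries and field names are illustrative, not normative):

```go
package main

import "fmt"

// algorithmProfile sketches the proposed table: each algorithm name
// carries the hash to use and the virtual headers it permits.
type algorithmProfile struct {
	hash           string
	virtualHeaders []string
}

// algorithmTable is a hypothetical registry; real values would be
// defined by the specification.
var algorithmTable = map[string]algorithmProfile{
	"rsa-sha256": {
		hash:           "sha-256",
		virtualHeaders: []string{"(request-target)"},
	},
	"hs2019": {
		hash:           "sha-512",
		virtualHeaders: []string{"(request-target)", "(created)", "(expires)"},
	},
}

func main() {
	p := algorithmTable["hs2019"]
	fmt.Println(p.hash, p.virtualHeaders)
}
```

A single table lookup then replaces the scattered "if algorithm is FOO and virtual header is BAR then fail" prose.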
