[WG-UMA] Signed messages


Paul C. Bryan

Mar 31, 2010, 8:23:59 PM
to WG UMA
Signed messages

Does UMA require messages from the requester to the host to be signed in any circumstances, or does it need to ensure that the option is preserved in OAuth, or does it not care (Paul's current contention as discussed on 2010-03-25)?

I will argue that messages from the requester to the host need not be signed in order to sufficiently demonstrate to the user that access was authorized.


Background

User administers policies in AM. User trusts AM to make access control decisions for protected resources. User controls resources at host. User instructs host to defer to policy decisions from AM for protected resources.

There are no current requirements establishing any need for host to make secondary use of any token, or be provided any information about the authorizing user or requester through the UMA protocol.


Trust model

AM's burden to the user is to demonstrate it made appropriate access control decisions when access to resources was requested.

Host's burden to user is to demonstrate that it had authorization from AM when it provided access to a requester.

Requester's burden to user is to stipulate that whatever token it is issued is maintained securely and not shared with any third party.

Both AM and host's burden to user is to stipulate that secrets used to secure messages and/or tokens are maintained securely.


Bearer token scenario (structured token a la WRAP)

In this scenario, the requester obtains an access token from AM, which it presents to host in a TLS-secured connection when a resource is requested. The token itself is presumably signed by AM, and host can validate it on its face. It is also presumed to be scoped for the resource(s) that the requester is actually requesting from host. The same bearer token can be reused by requester to access resources multiple times from host, for the life of the bearer token.
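To make the host-side check concrete, here is a minimal sketch of validating such a structured token "on its face". Everything in it is an assumption for illustration (the token layout, the claim names `exp` and `scope`, and the use of a symmetric HMAC key shared with AM); neither UMA nor WRAP specifies a token format:

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical symmetric key provisioned to host out of band; a real AM
# signature would more plausibly be asymmetric, so host holds only a
# verification key, but the expiry and scope checks are the same either way.
AM_KEY = b"am-verification-key-provisioned-out-of-band"

def b64url_encode(b):
    return base64.urlsafe_b64encode(b).decode().rstrip("=")

def b64url_decode(s):
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

def mint_token(scope, lifetime=3600):
    """AM side: issue a signed, scoped, expiring bearer token."""
    body = b64url_encode(json.dumps(
        {"scope": scope, "exp": int(time.time()) + lifetime}).encode())
    sig = b64url_encode(hmac.new(AM_KEY, body.encode(), hashlib.sha256).digest())
    return body + "." + sig

def validate_token(token, resource):
    """Host side: validate the token 'on its face' -- signature, expiry, scope."""
    try:
        body, sig = token.rsplit(".", 1)
        expected = hmac.new(AM_KEY, body.encode(), hashlib.sha256).digest()
        if not hmac.compare_digest(expected, b64url_decode(sig)):
            return False                      # not signed by AM
        claims = json.loads(b64url_decode(body))
        if claims["exp"] < time.time():
            return False                      # token lifetime has elapsed
        return resource in claims["scope"]    # scoped to the requested resource?
    except (ValueError, KeyError):
        return False
```

Note that the host never contacts AM in this flow; everything it needs is in the token, which is precisely why key management for the verification key becomes the central infrastructure question.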

This scenario requires key management infrastructure to manage the key used by host to verify the AM signature in the bearer token. This scenario allows host to stipulate it had authorization to provide access to the resource for the life of the bearer token. Since there is no published specification for token revocation, there is no facility to prevent replay attacks using a leaked bearer token.

Host trusts its own TLS connection and certificate issued by an industry-trusted third party. Host trusts the key issued to it to verify signatures by AM in the bearer token. Key revocation and expiry mechanisms are required for host to detect when AM key is no longer valid, as well as a mechanism for new AM key to be issued to and verified by host.

AM demonstrates it made appropriate access control decisions when access was requested by producing the bearer token it issued to the requester. Host demonstrates to the user it had authorization from AM by producing the bearer token that was used to access resources. User can verify AM's signature on the token to determine whether it was valid at the time access occurred.


Signed message scenario (opaque token and associated secret a la OAuth 1.x)

In this scenario, the requester negotiates a key that it uses to sign requests to host. Host can verify the signature of requester and associate it with a particular consumer/access token. Host would be expected to perform a back-channel request to a PDP to verify that the requester wielding the access token is allowed to access the resource. Each message contains attributes such as time stamp and nonce. Each message is signed separately, and is unique from all others.
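The per-message mechanics can be sketched roughly as follows. The `oauth_*` parameter names echo OAuth 1.x, but the base-string construction here is deliberately simplified and is not the real OAuth 1.0a signature algorithm:

```python
import hashlib
import hmac
import secrets
import time

def sign_request(method, uri, token, token_secret):
    """Requester side: produce per-message signature parameters.
    Each message is unique thanks to the timestamp and nonce."""
    params = {
        "oauth_token": token,
        "oauth_timestamp": str(int(time.time())),
        "oauth_nonce": secrets.token_hex(8),
    }
    base = "&".join([method.upper(), uri] +
                    [f"{k}={v}" for k, v in sorted(params.items())])
    params["oauth_signature"] = hmac.new(
        token_secret, base.encode(), hashlib.sha256).hexdigest()
    return params

# Host side: remembered nonces form the replay window; in practice this
# store needs expiry tied to the allowed timestamp skew.
seen_nonces = set()

def verify_request(method, uri, params, token_secret, max_skew=300):
    """Host side: verify signature, timestamp freshness, and nonce uniqueness."""
    sig = params.get("oauth_signature", "")
    check = {k: v for k, v in params.items() if k != "oauth_signature"}
    base = "&".join([method.upper(), uri] +
                    [f"{k}={v}" for k, v in sorted(check.items())])
    expected = hmac.new(token_secret, base.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, sig):
        return False            # signature invalid or message tampered with
    if abs(time.time() - int(check["oauth_timestamp"])) > max_skew:
        return False            # stale message
    if check["oauth_nonce"] in seen_nonces:
        return False            # replay
    seen_nonces.add(check["oauth_nonce"])
    return True
```

Even in this toy form, the replay protection is visible as state the host must keep: the nonce store must be retained for at least the timestamp window.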

This scenario requires key management infrastructure to manage the key issued to requester for host to verify its signatures. This scenario requires a channel for host to request authorization for resource(s) using the associated access token. This channel can be secured by TLS and AM can trust its own TLS connection and certificate issued by industry-trusted third party.

AM demonstrates it made appropriate access control decisions when access was requested by producing the request from host for the resource, along with the requester access token that accompanied it. Host demonstrates it had authorization from AM by producing the signed message from requester and the response from AM for the policy decision it made. Host is hard-pressed to prove it prevented replay attacks, as this is akin to proving a negative.


Comparison

Both scenarios require approximately the same magnitude of key management infrastructure. Neither WRAP nor OAuth 1.x provides the necessary protocol specification to support this, hence we would need to extend either one to meet the requirements established under UMA.

Signed messages have a slight security edge: each request is unique, so there is less opportunity for replay attack than in the bearer token scenario. However, bearer tokens do allow use of TLS to secure the channel, and the burden on requester is only to obtain a token and supply it in a message, rather than to use a backing secret to digitally sign each message.


Conclusion

Currently, there is no requirement that UMA must specifically mitigate replay attacks. This seems to be the only advantage signatures have over bearer tokens, and the advantage is only slight, given that host can only stipulate it has a system in place to prevent such attacks, not actually prove it to user. Given this, the conclusion seems to be that bearer tokens—though still suffering from resource scoping issues—are a more appropriate choice over signatures.

Should the underlying requirements change, this conclusion should be revisited.

ankit gurung

Apr 1, 2010, 9:11:21 AM
to Paul C. Bryan, WG UMA
Hi Paul,

I see another potential security requirement. Suppose that instead of www.copmonkey.com, a typing error lands the user on cpomonkey.com or copmnokey.com, which hackers have put up, and which also speaks our protocol (OAuth or whatever). How do you trust that AM? If the entire rule set is kept on that server and the output is an Allow/Disallow type of return value, even a simple DNS poisoning leaves us vulnerable. One more thing that is bugging me: if all sites use one password, as with OpenID, CardSpace, or whatever, losing your password to a keylogger or spyware app is going to be a bummer. Won't it lock you out of everything?

_______________________________________________
WG-UMA mailing list
WG-...@kantarainitiative.org
http://kantarainitiative.org/mailman/listinfo/wg-uma


Thomas Hardjono

Apr 1, 2010, 1:22:17 PM
to Paul C. Bryan, WG UMA

Paul,

 

I’m a newbie at UMA, but two cases come to mind where digital-signing may come-in handy.

 

The first is for logging purposes.  The Host may wish to log all requests (successful or failed) to the Host. It needs to distinguish real requests from bogus requests by external parties.

 

The other is for sub-function outsourcing, where the Host is in fact outsourcing some functions to another entity (opaque to the Requestor, and to the AM even). This outsource entity may desire to see the original signed request.

 

Just some thoughts.

 

/thomas/

hardjono[at]mit.edu

Paul C. Bryan

Apr 1, 2010, 1:26:18 PM
to WG UMA
Okay, you've outlined a couple of attacks, worth separating:

1. Phishing

Problem:
User types in the wrong domain name, or is instructed to click a link that takes the user to a legitimate AM's doppelganger. User supplies credentials to the AM lookalike, which can now run amok with user's account on the legitimate AM. Alternately, user incorrectly types in the AM host name at host when performing introduction; now the doppelganger can allow open access to "protected" resources.

Solution:
Today, no different than any other site on the Internet hosting secure resources, such as PayPal or your bank. In the host-introduction scenario, at least the user won't see the host when accessing the legitimate AM, and this should be grounds to dig further and see just who the host was introduced to. This doesn't pass the "mom" test, but it probably does pass the "son, can you come over and help me figure out this newfangled access management stuff" test.

2. DNS poisoning

Problem:
Someone poisons a caching DNS server with different IP addresses such that when it is used to lookup the AM, the wrong address is returned.

Solution:
Today, TLS does mitigate this. Industry-trusted third parties perform checks when issuing certificates to provide some level of assurance that the certificate is going to the operator of the domain/host name. We presume that the attacker will be unable to get a certificate to match the domain/host name, and clients configured to make HTTPS requests should validate the provided certificates. Further changes to the DNS infrastructure—namely the increasing use of DNSSEC—are also serving to mitigate DNS-based attacks. In the case of bearer tokens—where host does not go directly to the AM for decisions—then the security is based on the key infrastructure used for verifying signed tokens. 
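For illustration, the client-side discipline amounts to refusing any connection whose certificate does not verify against a trusted CA and the requested host name. A short Python sketch of the defaults involved (function names here are just illustrative):

```python
import socket
import ssl

def make_verifying_context():
    """Build a TLS context that requires a certificate chaining to a trusted
    CA and matching the requested host name. With a poisoned DNS answer, the
    attacker's server fails this handshake, assuming it cannot obtain a valid
    certificate for the spoofed name."""
    ctx = ssl.create_default_context()
    # create_default_context() already sets check_hostname=True and
    # verify_mode=CERT_REQUIRED; the sketch relies on exactly those defaults.
    return ctx

def open_verified(host, port=443):
    """Connect to host only over a certificate-verified TLS channel; raises
    ssl.SSLCertVerificationError on a name or chain mismatch."""
    ctx = make_verifying_context()
    sock = socket.create_connection((host, port), timeout=10)
    return ctx.wrap_socket(sock, server_hostname=host)
```

The whole mitigation rests on clients never disabling that verification, which is why "clients configured to make HTTPS requests should validate the provided certificates" is doing real work in the paragraph above.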

Paul C. Bryan

Apr 1, 2010, 1:46:32 PM
to WG UMA
Okay: today, neither of these use cases is a documented requirement. And, as I mentioned, if the requirements change, then the conclusion of my write-up should likely be revisited. That said, let's look at these potential new requirements:

1. Logging

In the case of bearer tokens, presumably if a request comes in from a "bogus" external party, it would not have the requisite bearer token to satisfy access. The host could log these just as well as those that did come with authentic bearer tokens ("valid" and "invalid" flags).
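As a sketch of how cheap that flagged logging is, note the one subtlety: the bearer token is itself a secret, so the log should record a digest of it rather than the token verbatim (all names below are illustrative, and a real deployment would presumably use a keyed or salted digest):

```python
import hashlib
import time

ACCESS_LOG = []

def log_access(resource, bearer_token, valid):
    """Record an access attempt, flagged valid/invalid. Only a digest of the
    token is logged, never the token itself."""
    entry = {
        "ts": int(time.time()),
        "resource": resource,
        "token_digest": (hashlib.sha256(bearer_token.encode()).hexdigest()
                         if bearer_token else None),
        "valid": valid,
    }
    ACCESS_LOG.append(entry)
    return entry
```

The digest still lets the host correlate log entries with a token it later learns about, without the log becoming a store of replayable secrets.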

If the issue is that, in order to log a message, you'd want to log the signature along with it—presumably for the purposes of non-repudiation—so that you could verify it at a later time, and that this would be a problem with bearer tokens because you don't want to write the bearer tokens (secrets) in your log, then we need to understand something: what type of key infrastructure will be in place to allow signatures in logs to be verified potentially long after the transaction was logged?

Let's presume public key infrastructure because symmetric keys will be near impossible for host to use in any non-repudiation case. Okay, so now what will you log? Well, the message, including the signature, and presumably including the certificate of the signer. This certificate was signed by some trusted third-party. It's unlikely to be that of an industry-trusted third party (such as Verisign); more likely it will be that of AM. So, host now needs to store the certificate of the issuing AM, and be able to prove at a later time that it was from the issuing AM.

Does this sound like it's getting hard? It does to me. Seems like a high price to pay so that you can verify long after the fact the authenticity of logged requests.  

2. Outsourcing

I'm not sure why the outsourced component would ever have a legitimate basis to see the original request—unless it is simply acting as an internal sub-component of the original host, at which point none of it is our business protocol-wise. If the second application is indeed outsourced, and wants to protect its resources on behalf of the user, why would it not just be another host in UMA terms, with the first host becoming a requester to it? This gives the user a degree of control over what resources in Host2 he/she is willing to share with Host1. Recursion: see recursion.

Paul

Thomas Hardjono

Apr 2, 2010, 10:54:45 AM
to WG UMA

Hi Paul,

 

>>> If the issue is that in order to log a message, you'd want to log
>>> the signature along with it—presumably for the purposes of
>>> non-repudiation—which you could verify at a later time,
>>> which would be a problem with bearer token because you
>>> don't want to write the bearer tokens—secrets—in your log,
>>> then we need to understand something: what type of key
>>> infrastructure will be in place to allow signatures in
>>> logs to be verified potentially long after the transaction was logged?

 

Unfortunately, when it comes to legal disputes (i.e. in court), all the forensic evidence will come into play. I'm thinking of the healthcare scenario under the recent HITECH Act. Consider the Host as a small third-party medical database provider (supposedly neutral) and the Requester as a health insurance company (who wants to know every detail of a person's health record).

If I were the Host, I would like to have all possible evidence of access logged. So much so that I would insist that actual digital signatures be applied (which hold up in court, BTW).

 

 

>>> Let's presume public key infrastructure because symmetric keys
>>> will be near impossible for host to use in any non-repudiation
>>> case. Okay, so now what will you log? Well, the message, including
>>> the signature, and presumably including the certificate of the signer.
>>> This certificate was signed by some trusted third-party.
>>> It's unlikely to be that of an industry-trusted third
>>> party (such as Verisign); more likely it will be that of AM.

 

OK, so I don't really get this one. I would have thought that the Host would insist that the Requestor use a cert issued by a public CA like Verisign. (It'd be nice if the CA was also an AM, but we can't rely on that.)

 

>>> So, host now needs to store the certificate of the issuing AM,
>>> and be able to prove at a later time that it was from the issuing AM.

 

If a public CA was the issuer, then the Host need only keep a small number of CA certs. There aren't that many industry-grade CAs around; even fewer than the number of CA certs shipped in browsers. :)

 

 

>>> Does this sound like it's getting hard? It does to me.
>>> Seems like a high price to pay so that you can verify long after
>>> the fact the authenticity of logged requests.

 

If the Host was taken to court and is later penalized on a per-incident basis (as in the HITECH Act), then no, this is not a high price for business survivability. It'd be interesting to hear the opinions of real-world Hosts.

 

/thomas/

Hardjono[at]mit.edu

Paul C. Bryan

Apr 2, 2010, 11:40:15 AM
to WG UMA
On Fri, 2010-04-02 at 10:54 -0400, Thomas Hardjono wrote:

> If I was the Host, I would like to have all possible evidence of
> access to be logged. So much so, that I would insist that actual
> digital signatures be applied (which holds-up in court BTW).

Currently, we do not have a requirement that reads along the lines of
"provide hosts the maximum amount of forensic evidence such that they
can prove their cases in court."

Please consider that my conclusion was based on the published
requirements in the UMA working group; if the requirements change, the
conclusion should likely be revisited. First, I think it's important to
get some buy-in to change the requirements.

So, presuming such a requirement was added, what exactly would such
digital signatures be used to prove in court? What supporting evidence
would be required to prove this? What kinds of added burdens will this
add to all parties to achieve this?

> ok, so I don’t really get this one. I would have thought that the Host
> would insist that the Requestor use a cert issued by a public CA like
> Verisign. (It’d be nice if the CA was also an AM, but we can’t rely on
> that.)

What exactly would Verisign provide assurance of when issuing such a
certificate? How would this be used by AM and host to correlate on this
requester? I agree we can't assume that all AMs are CAs or vice-versa.

> If the Host was taken to court and is later penalized on a
> per-incident basis (as in HITECH Act), then no this is not a high
> price for business survivability. It’d be interesting to hear the
> opinions of real-world Hosts.

Perhaps if host's liability is particularly high, it would/should invoke
technology beyond the UMA protocol to protect itself—or, in some cases,
even reject the use of distributed policy delegation models entirely, as
they remove too much of host's control over access decisions.

Also, if host is subject to statutory per-incident fines for breaches of
security or privacy, it's likely that the host would not even accept
that user is free to choose their own arbitrary AM either—the lawyers
would probably conclude that it's too risky, digital signatures or not.

Paul

Eve Maler

Apr 2, 2010, 12:43:48 PM
to Paul C. Bryan, WG UMA
Great discussion, guys; thanks!

Since our list of requirements is "emergent" and we sometimes pick up new requirements and even design principles after a need emerges, we might want to get more specific here. Note that requirement R0b (meaning it came from our charter) already talks about "policies and enforceable contract terms", which must be balanced with DP8, "We should avoid adding crypto burdens as part of our simplicity goal".

The answer to Paul's question below might be found in the answer to what sorts of assurance the host will seek that it's safe to defer access decisions to another party (that is, to become a mere policy enforcement point). While some low-sensitivity use cases may make this handover of control easy to do, we've got other use cases where we can anticipate hosts whitelisting trustworthy AMs, and possibly wanting extra assurances that granting access "wasn't their fault". (I'm thinking of custodian and hData...)

Issues like this are one reason why it's important for us to continue the process of vetting our scenarios and deciding their disposition. We can probably expect our requirements doc to grow, if slowly, as a result.

Eve


Eve Maler
e...@xmlgrrl.com
http://www.xmlgrrl.com/blog

Thomas Hardjono

Apr 2, 2010, 2:18:21 PM
to WG UMA
Hi Paul,

Comments inline.

> -----Original Message-----
> From: wg-uma-...@kantarainitiative.org [mailto:wg-uma-
> bou...@kantarainitiative.org] On Behalf Of Paul C. Bryan
> Sent: Friday, April 02, 2010 11:40 AM
> To: 'WG UMA'
> Subject: Re: [WG-UMA] Signed messages
>

> On Fri, 2010-04-02 at 10:54 -0400, Thomas Hardjono wrote:
>
> > If I was the Host, I would like to have all possible evidence of
> > access to be logged. So much so, that I would insist that actual
> > digital signatures be applied (which holds-up in court BTW).
>
> Currently, we do not have a requirement that reads along the lines of
> "provide hosts the maximum amount of forensic evidence such that they
> can prove their cases in court."

I would think that a broader "auditability" should be a requirement for any entity (offering any services today) wishing to garner trust from its users/consumers and partners.

Currently there is already an audit requirement R0d ("Allow an individual to audit and monitor various aspects of access relationships"). This is kind of vague/fuzzy, and I'm not sure how one would implement this requirement.

There is also another proposed requirement P8 Access to Audit Logs ("A Host must inform the AM protecting a particular resource on that Host in a timely way of all successful Requester access events")

[NB. I don't get the current P8: I thought the Host is holding/protecting the resource (and not the AM). The AM is issuing/managing the tokens for the resource sitting at the Host.]

Perhaps we could make P8 tighter. How's this:

A Host protecting a resource belonging to a User must log all access events (in a non-repudiable manner) to that resource, and make such log data available to both the AM (who issued all access tokens to that resource) and the User owning that resource.
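One way to approximate "in a non-repudiable manner" is a tamper-evident log. As a hedged illustration (nothing the UMA drafts specify), the sketch below hash-chains entries so that after-the-fact alteration is detectable; genuine non-repudiation toward third parties would additionally require the Host to digitally sign each entry, which is exactly the cost being debated in this thread:

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder predecessor hash for the first entry

def _digest(event, prev):
    # Canonical JSON so the digest is stable regardless of key order.
    payload = json.dumps({"event": event, "prev": prev}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append_entry(chain, event):
    """Append an access event; each record carries the hash of its
    predecessor, so altering any earlier record breaks every hash after it."""
    prev = chain[-1]["hash"] if chain else GENESIS
    chain.append({"event": event, "prev": prev, "hash": _digest(event, prev)})
    return chain[-1]

def verify_chain(chain):
    """Recompute the whole chain; returns False if any record was altered."""
    prev = GENESIS
    for rec in chain:
        if rec["prev"] != prev or rec["hash"] != _digest(rec["event"], prev):
            return False
        prev = rec["hash"]
    return True
```

An AM or User given periodic copies of the chain head could detect later rewriting by the Host, which is roughly the "make such log data available" clause in the tightened P8.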

>
> Please consider that my conclusion was based on the published
> requirements in the UMA working group; if the requirements change, the
> conclusion should likely be revisited. First, I think it's important to
> get some buy-in to change the requirements.
>
> So, presuming such a requirement was added, what exactly would such
> digital signatures be used to prove in court? What supporting evidence
> would be required to prove this? What kinds of added burdens will this
> add to all parties to achieve this?

Yes, I absolutely understand this (crawl first, then run). There are technologies usable for non-value transactions (e.g. logging in to Facebook), and others for value-carrying transactions (e.g. giving access to my health records or credit reports).

So, I won't argue any longer for this signing thing :)


> > ok, so I don’t really get this one. I would have thought that the Host
> > would insist that the Requestor use a cert issued by a public CA like
> > Verisign. (It’d be nice if the CA was also an AM, but we can’t rely on
> > that.)
>
> What exactly would Verisign provide assurance of when issuing such a
> certificate? How would this be used by AM and host to correlate on this
> requester? I agree we can't assume that all AMs are CAs or vice-versa.

FYI, when you buy a certain class of certificate, Verisign does a bunch of manual checks/verifications about your identity. If you are a company, it checks the Dun & Bradstreet data, domain records as well as other data (which is why some certs take up to 2 weeks to issue). Then there is the actual recorded issuance, which may involve attorneys being present in-person.

So to summarize, a CA like Verisign ensures that your real-world brick-and-mortar identity matches your digital identity. That is all Verisign claims/implies in its X.509 certificates (nothing more). Underlying all this is the Certification Practice Statement (a 105-page legal doc), which explains the liabilities and "warranties" (e.g. $10K to $10M). IMHO, it is precisely this CPS doc that is one of the key reasons corporations buy certs from Verisign.

(PS. The above does not apply to the cornflakes-box $19 email cert :)


/thomas/

Paul C. Bryan

Apr 2, 2010, 2:37:25 PM
to WG UMA
I believe what was requested of me was to draw some kind of line in the
sand regarding signing, which only made sense to me to do based on
something concrete—the currently documented requirements. I'm perfectly
fine with requirements changing; this is fully expected. Should they
change, we should definitely review the conclusion.

If we are contemplating adding new requirements, I think it would be
wise to project the effect/impact such requirements would have, and
balance these against other considerations, namely the complexity-burden
for implementors and ability for users—namely my mom—to understand and
effectively use such a system.

Paul

-----Original Message-----
From: Eve Maler <e...@xmlgrrl.com>
To: Paul C. Bryan <em...@pbryan.net>
Cc: 'WG UMA' <wg-...@kantarainitiative.org>
Subject: Re: [WG-UMA] Signed messages

Eve Maler

Apr 2, 2010, 3:38:25 PM
to Paul C. Bryan, WG UMA
Fair enough -- I had imposed an AI on you to get the discussion rolling, and it's rolling now, which is good. :-) In the context of our decision to move onto a "WRAP paradigm" basis, a question arose about whether there's additional value *to an UMA ecosystem* of signing messages. Here are some potential circumstances we could find ourselves in:

- No hosts are interested in signed messages from requesters, when it comes right down to it, even in cases of high trust/sensitivity (maybe in this case the signing option even withers in the OAuth 2.0 spec?)

- Some deployed hosts demand signed messages from requesters for their own protection, but UMA doesn't "care" and doesn't enforce it -- it's entirely out of band (it becomes something negotiated separately, whether statically or at run time, by means standard or proprietary)

- We decide to take on-board some UMA protocol implications in some hosts wanting signed messages, and provide some interoperable means for a host to convey this demand and for a requester to heed it

Right now we're in a mode of passively accepting the OAuth 2.0 ability to sign messages. Do we need to say something in our own spec? advocate some change to OAuth? or just continue to be silent on the matter? That's what I'm interested in figuring out in order to put the issue to bed, and I invite everyone to weigh in...

Eve
