Using Vault's AWS backend to access S3 buckets in multiple AWS accounts


Mathieu D

Jul 9, 2015, 8:43:32 AM
to vault...@googlegroups.com
Hi,

I have to implement quite a specific workflow:

We are going to manipulate data in S3 buckets in numerous distinct AWS accounts (one account per customer we have).
The workflow is the following: the customer goes to our website, into his own personal space, and triggers an action, supplying a password that he set previously.
This signed action would grant one of our applications temporary permission to access his S3 bucket, do what we have to do with the data, and leave the bucket. Then the permission would expire.
The temporary permission, and the fact that the customer provided his signature (password) for it, should appear in some audit trail we can show to the customer.

Would Vault (with its AWS backend) be helpful in such a scenario?
What structure should we put in place: one Vault, or many Vaults?

Mathieu

Clay Bowen

Jul 9, 2015, 9:51:25 AM
to vault...@googlegroups.com
The way I can see this happening is:
1) Customer goes to website, enters password
2) Password is passed, along with user name to Vault
3) Vault returns token, which expires after x time, based on initial user creation
4) Applications on your end use TOKEN to retrieve encryption keys needed to access S3 back-end.  Keys are stored in memory, never written to storage.
5) After session ends or the application has completed its process, encryption keys are removed from memory.
6) TOKEN expires.
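As a rough illustration, steps 1-4 could look like this using Vault's userpass auth backend (the usernames, secret paths, and values here are hypothetical, and CLI flags may differ between Vault versions):

```shell
# Steps 1-3: authenticate with the customer's username and password;
# Vault returns a client token with a limited lifetime
vault auth -method=userpass username=customer1 password='password-entered-on-site'

# Step 4: use the returned token to read the encryption keys needed
# to access the S3 back-end (held in memory only, never written out)
vault read secret/customer1/s3-keys
```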

There are a few holes in this workflow (this is off the cuff; you can probably come up with a better scenario), but in general it will allow the critical information (the keys) to be stored safely and accessed only with the customer's password.  The AUDIT function will allow you to see that a token was issued. Here's an example from my test system, which uses the x509 CERT authentication method; the client_token field in the response shows that a token was issued:

{
  "type": "response",
  "error": "",
  "auth": {
    "display_name": "",
    "policies": ["users", "scripts"],
    "metadata": {"cert_name": "tester", "common_name": "*"}
  },
  "request": {
    "operation": "write",
    "path": "auth/cert/login",
    "data": null,
    "remote_address": "10.16.43.125"
  },
  "response": {
    "auth": {
      "client_token": "sha1:e5bd84547aa94559348f7e175d2037450e84a376",
      "display_name": "cert-Tester",
      "policies": ["users", "scripts"],
      "metadata": {"cert_name": "tester", "common_name": "*"}
    },
    "secret": null,
    "data": null,
    "redirect": ""
  }
}

For production you would likely have 3 or so Vault servers behind a load balancer, with one active and the rest on standby.  A good HA backend for your data (there are several available) is also required.  This would still be a single "Vault", though - the same storage would be accessed by all the machines.

I'd suggest you clone the repository and set up a test system to test the architecture.

Thanks,
Clay

Jeff Mitchell

Jul 9, 2015, 9:55:42 AM
to vault...@googlegroups.com
Hello,

The AWS backend is designed to work with Amazon's IAM product, which I
believe means (but am not sure) that it can only generate access
credentials for resources under a single AWS account. You could mount
multiple copies of the backend and have each work with a different
customer's IAM account, but that is problematic in other ways.
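For what it's worth, mounting one copy of the AWS backend per customer would look roughly like this (the mount path, credentials, and role name are hypothetical; each mount needs root credentials for that customer's account, which is part of what makes this awkward):

```shell
# One AWS backend mount per customer account (illustrative only)
vault mount -path=aws-customer1 aws

# Root credentials for that customer's AWS account
vault write aws-customer1/config/root \
    access_key=AKIA...CUSTOMER1 secret_key=...

# A role whose generated IAM credentials permit S3 access in that account
vault write aws-customer1/roles/s3-access policy=@s3-access-policy.json
```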

A better approach might be to think about using Vault to store S3
access keys for individual customers (in the "generic" backend),
separated into different paths. Then, based on which customer is
triggering an action, you can issue tokens to the requesting
applications associated with the correct policies to allow them to
read the correct paths and the needed S3 access keys from Vault. When
the token expires, they will be unable to read those keys again
without getting a new token.
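A sketch of that layout, assuming the CLI and policy syntax of the time (the customer names, paths, policy name, and TTL are all hypothetical):

```shell
# Store each customer's S3 keys under a separate path in the generic backend
vault write secret/customers/customer1/s3 access_key=AKIA... secret_key=...

# A policy that can read only that customer's path
cat > customer1.hcl <<'EOF'
path "secret/customers/customer1/*" {
  policy = "read"
}
EOF
vault policy-write customer1-read customer1.hcl

# When that customer triggers an action, issue a short-lived token
# bound to the policy (flag names vary between Vault versions)
vault token-create -policy=customer1-read -ttl=15m
```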

The hole in this workflow is that the S3 access keys don't change, so
a badly behaving application could save or log the information
somewhere instead of properly releasing it from memory.

--Jeff

Mathieu D

Jul 9, 2015, 10:56:42 AM
to vault...@googlegroups.com
Sounds interesting, thanks.