Gob vulnerable to resource exhaustion attacks?


tav

Jan 12, 2012, 7:42:11 AM
to golan...@googlegroups.com
Hey fellas,

I was reading through the gob source again (kudos to whoever
architected it — it's beautiful!), and it looks to me like it might be
vulnerable to fairly easy resource exhaustion attacks when decoding
slices.

The decodeSlice method allocates an underlying array of whatever size the
stream specifies. Thus if you accept gob-encoded requests from untrusted
sources, e.g. over the internet, it would be pretty easy to exhaust
memory on the receiving server.
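
To make this concrete, here's a minimal sketch of the exposed pattern
(the Request type, the port, and the modern encoding/gob import path are
just for illustration): a server that decodes gobs straight off an
untrusted connection, where the length of any incoming slice is chosen
by the peer.

    package main

    import (
        "encoding/gob"
        "log"
        "net"
    )

    // Request is a made-up message type; the length of Payload is read
    // from the wire during decoding.
    type Request struct {
        Payload []byte
    }

    func main() {
        ln, err := net.Listen("tcp", ":9000")
        if err != nil {
            log.Fatal(err)
        }
        for {
            conn, err := ln.Accept()
            if err != nil {
                log.Fatal(err)
            }
            go func(c net.Conn) {
                defer c.Close()
                var req Request
                // Nothing here bounds how much memory the decoder may
                // allocate on behalf of the remote peer.
                if err := gob.NewDecoder(c).Decode(&req); err != nil {
                    log.Println("decode:", err)
                }
            }(conn)
        }
    }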

One solution I can think of is to specify an allocation limit on the
decoder when dealing with untrusted sources and keep a running tally
of allocated storage.

Or perhaps I've just misread the problem. Thoughts?

--
All the best, tav

plex:espians/tav | t...@espians.com | +44 (0) 7809 569 369
http://tav.espians.com | http://twitter.com/tav | skype:tavespian

Albert Strasheim

Jan 12, 2012, 12:02:38 PM
to golang-nuts
Hello

On Jan 12, 2:42 pm, tav <t...@espians.com> wrote:
> I was reading through the gob source again (kudos to whoever
> architected it — it's beautiful!), and it looks to me like it might be
> vulnerable to fairly easy resource exhaustion attacks when decoding
> slices.

We've discussed this before:

https://groups.google.com/group/golang-nuts/browse_thread/thread/e3c85563eb0b31aa

Some of the panics in gob have been turned into errors, but I don't
know if it's had another round of "hardening".

Regards

Albert

Kyle Lemons

Jan 12, 2012, 3:04:16 PM
to tav, golan...@googlegroups.com
While I do agree that a well-crafted gob could do what you describe, I think the attempt to allocate too much memory will panic (e.g. if you try to allocate 1e9 objects). The attack would have to request a large amount of memory, but not so much that the allocation was impossible to fulfill, yes?
~K

Andrew Gerrand

Jan 12, 2012, 5:06:15 PM
to tav, golan...@googlegroups.com
On 12 January 2012 23:42, tav <t...@espians.com> wrote:
> Hey fellas,
>
> I was reading through the gob source again (kudos to whoever
> architected it — it's beautiful!), and it looks to me like it might be
> vulnerable to fairly easy resource exhaustion attacks when decoding
> slices.
>
> The decodeSlice method allocates an underlying array of whatever size the
> stream specifies. Thus if you accept gob-encoded requests from untrusted
> sources, e.g. over the internet, it would be pretty easy to exhaust
> memory on the receiving server.
>
> One solution I can think of is to specify an allocation limit on the
> decoder when dealing with untrusted sources and keep a running tally
> of allocated storage.
>
> Or perhaps I've just misread the problem. Thoughts?

You're right - there are certainly DoS vectors against servers that blindly
decode gob streams. It's something that should probably be addressed,
but it will require some careful thought.

To solve this problem right now you could create a pass-through Reader
implementation that understands the gob format (at a basic level), and
puts a hard limit on the size of objects that can be sent. It should
be sufficient to simply close the connection to any client that sends
an object that is too big.
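
Roughly, such a pass-through Reader could vet the unsigned-integer length
header that precedes every gob message and refuse to pass along anything
that claims to be larger than a chosen cap. A sketch follows; the gobguard
name, the limit, and the error messages are only illustrative, not an
existing API.

    package gobguard

    import (
        "bufio"
        "errors"
        "fmt"
        "io"
    )

    // LimitedGobReader wraps an untrusted stream and refuses to pass
    // through any gob message whose declared length exceeds Max bytes.
    type LimitedGobReader struct {
        br      *bufio.Reader
        Max     uint64
        header  []byte // length-header bytes not yet handed to the decoder
        pending uint64 // body bytes of the current message not yet handed out
    }

    func NewLimitedGobReader(r io.Reader, max uint64) *LimitedGobReader {
        return &LimitedGobReader{br: bufio.NewReader(r), Max: max}
    }

    // readHeader consumes the unsigned-integer length prefix of the next
    // gob message: one byte below 128 is the value itself; otherwise the
    // byte is the negated count of big-endian bytes that follow.
    func (l *LimitedGobReader) readHeader() error {
        b, err := l.br.ReadByte()
        if err != nil {
            return err
        }
        l.header = append(l.header[:0], b)
        var n uint64
        if b < 0x80 {
            n = uint64(b)
        } else {
            count := -int(int8(b))
            if count < 1 || count > 8 {
                return errors.New("gobguard: corrupt message length")
            }
            for i := 0; i < count; i++ {
                c, err := l.br.ReadByte()
                if err != nil {
                    return err
                }
                l.header = append(l.header, c)
                n = n<<8 | uint64(c)
            }
        }
        if n > l.Max {
            return fmt.Errorf("gobguard: message of %d bytes exceeds limit of %d", n, l.Max)
        }
        l.pending = n
        return nil
    }

    func (l *LimitedGobReader) Read(p []byte) (int, error) {
        // At a message boundary, vet the next length header before
        // passing any of it along.
        if l.pending == 0 && len(l.header) == 0 {
            if err := l.readHeader(); err != nil {
                return 0, err
            }
        }
        // Replay the buffered header bytes first, then stream the body.
        if len(l.header) > 0 {
            n := copy(p, l.header)
            l.header = l.header[n:]
            return n, nil
        }
        if uint64(len(p)) > l.pending {
            p = p[:l.pending]
        }
        n, err := l.br.Read(p)
        l.pending -= uint64(n)
        return n, err
    }

A server would then wrap the connection before handing it to gob, e.g.
gob.NewDecoder(NewLimitedGobReader(conn, 1<<20)), and drop the connection
on any error from Decode.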

If you wrote the client code (even if you don't trust it), you should
be able to choose a maximum payload size.

Andrew

Rob 'Commander' Pike

Jan 12, 2012, 6:01:45 PM
to Andrew Gerrand, tav, golan...@googlegroups.com

It would be trivial to add a method to the Decoder to limit the size it receives, since all messages have a header giving the length.

But not now.

-rob

