Memory issues on heroku


ninian

Jul 14, 2010, 3:52:13 AM
to Dragonfly
Hi all,

I'm having issues with memory allocation on Heroku - going over the
300MB limit and then failing to load some libs my app uses elsewhere.
I've been unable to replicate the problem locally, but I suspect it
may have to do with RMagick and Dragonfly.

Has anyone encountered similar issues?

I'm thinking of writing a minimagick processor for cropping and
resizing, as it seems less prone to memory bloat. Are there any out
there already?

Thanks,

NL

Mohan Zhang

Jul 15, 2010, 12:28:38 AM
to Dragonfly
I can confirm that I've seen this a couple of times too, though it
sounds like you've researched it more than I have, so I'm afraid I
can't add any useful details. I've been caching pages aggressively as
a workaround, but that's obviously not ideal for all situations. Are
you just running heroku app:restart when it starts happening?

Mo

ninian

Jul 15, 2010, 3:54:14 AM
to Dragonfly
Evidently, Heroku won't work with minimagick, so that's not an option.

Although this is a little off-topic: aside from caching aggressively,
I've been going through my app trying to reduce the bloat everywhere,
using memprof to help locate the bottlenecks.

I'm also going to investigate less costly methods of resizing
pictures.

NL

foz

Aug 8, 2010, 4:36:00 PM
to Dragonfly
I've worked a lot with RMagick, and if you are doing multiple
operations on images, you can end up eating loads of RAM very
quickly. Many of the RMagick methods return intermediate images; these
start to add up, and it's easy to end up with hundreds of megabytes of
RAM used on a single request. Even though the memory used is garbage
collected, the resident size of the Ruby process will stay huge. The
solution, for me, was to try to keep RAM usage low at all times.

Here's a simple way to find out your current memory usage (on Mac/Unix):
`ps -o rss= -p #{$$}`
By logging this number at several places in my code, I was able to
find out where the usage peaks were.
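That logging can be wrapped in a tiny helper (a sketch - the method
name and label argument are just for illustration):

```ruby
# Log and return the resident set size (in KB) of the current process.
# Shells out to ps, so this works on Mac/Linux but not Windows.
def report_memory(label = '')
  rss_kb = `ps -o rss= -p #{$$}`.to_i
  puts "#{label}: #{rss_kb} KB"
  rss_kb
end
```

Sprinkling report_memory('after resize') between processing steps
makes the usage peaks easy to trace back to a spot in the code.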

To combat the bloat, you can dispose of intermediate images after
they are used by calling image.destroy!. I used this method to bring
my process down from over 200MB to under 40MB.
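One way to make sure the destroy! always happens, even when a
processing step raises, is an ensure wrapper (a sketch -
with_destroyed is a hypothetical helper, not part of RMagick or
Dragonfly):

```ruby
# Hypothetical helper: yields the image to the block and guarantees
# destroy! is called afterwards, even if the block raises.
def with_destroyed(image)
  yield image
ensure
  image.destroy! if image.respond_to?(:destroy!)
end
```

e.g. with_destroyed(image.scale(0.5)) { |small| small.write('out.png') }
disposes of the intermediate as soon as it has been written.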

I should note however that I am not using Heroku for hosting, and I'm
only just starting to get into Dragonfly, so I'm not sure how much
this will help - it's certainly possible the problem is in Dragonfly
itself.

Mark Evans

Aug 9, 2010, 9:52:08 AM
to Dragonfly
Thanks for the info - that's useful.
I've been doing a couple of tests with RMagick, and it seems as
though memory usage creeps up but doesn't come down again.
I've not looked into this much, so I'm a bit ignorant here, but using
this simple test:

require 'rubygems'

puts `ps -o rss= -p #{$$}`
require 'RMagick'
puts `ps -o rss= -p #{$$}`
image = Magick::Image.from_blob(File.read('path/to/image/file.png')).first
puts `ps -o rss= -p #{$$}`
image.destroy!
puts `ps -o rss= -p #{$$}`

I get:

10920
13196
14592
14596

Should the image.destroy! call not make the last number smaller than
the previous one?

Mark
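As foz notes above, freed memory goes back to the allocator rather
than the OS, so the resident size doesn't necessarily drop. The same
effect can be seen in plain Ruby with no RMagick involved (a sketch;
the ~50MB of small strings is an arbitrary choice):

```ruby
def rss_kb
  `ps -o rss= -p #{$$}`.to_i # resident set size in KB (Mac/Linux)
end

before = rss_kb
blobs = Array.new(500_000) { 'x' * 100 } # ~50MB across many small strings
during = rss_kb
blobs = nil # drop the references...
GC.start    # ...and collect them
after = rss_kb

# `during` sits well above `before`; `after` typically stays closer to
# `during` than to `before`, because the allocator keeps the freed pages.
puts [before, during, after].inspect
```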

Mark Evans

Aug 12, 2010, 9:11:56 AM
to Dragonfly
After a bit of investigation, it turns out there are a few places
where calling image.destroy! does indeed help quite a lot, so I've
added this and will be releasing it with the next major version of
Dragonfly over the next couple of weeks.
Thanks again
Mark

Mohan Zhang

Aug 12, 2010, 9:19:34 PM
to Dragonfly
Huge thanks to foz for reviving the thread and Mark for following up.
Can't wait for the next release. Thanks a bunch!

foz

Aug 13, 2010, 5:27:26 PM
to Dragonfly
Hi Mark,

My guess is that doing it this way:

image = Magick::Image.from_blob(File.read('path/to/image/file.png')).first

The .first part is possibly leaving the original Magick::Image hanging
around.
Maybe try this:

image = Magick::Image.from_blob(File.read('path/to/image/file.png'))
# do something with image.first
image.destroy!

Or better:

image = Magick::ImageList.new('path/to/image/file.png')
image.destroy!

Also, where possible, using Magick::Image.ping and getting rid of
intermediate images helps too:

scaled = image.scale(scale_amount)
distorted = scaled.distort( ... )
scaled.destroy!
distorted.write(filename)
distorted.destroy!

I have a report_memory() function that logs usage after each series
of operations. Things don't always return to the original memory
level, but at the end of it all you can end up pretty close - at
least not at 2x or 10x where you started!

regards,
foz

Mark Evans

Aug 24, 2010, 8:13:54 PM
to Dragonfly
I've just released a new version (0.7.0) which hopefully improves
memory consumption a bit by making use of image.destroy!. I've not
done many performance tests, but the ones I did do showed a definite
improvement.
Mark

ninian

Aug 29, 2010, 2:39:55 PM
to Dragonfly
Hey Mark,

First, thanks again for all your effort on Dragonfly. I'm using it in
every project...

Although it seemed at first that memory consumption was down,
somewhere between 0.7.0 and now things went wrong... Right now, after
resizing 9 pictures on a slideshow page (they are very large, fair
enough), my process has reached 383.8MB, and it isn't going back down.

Have you made some significant changes on that front?

Cheers,

NL

Mark Evans

Sep 1, 2010, 8:24:26 AM
to Dragonfly
No, I haven't.

Having said that - it seems that using RMagick's in-memory to_blob
and from_blob was pretty bad for memory usage, even with
image.destroy! calls around the place.

So I've released yet another version - 0.7.5

RMagick stuff now uses the filesystem by default (this seems to still
work ok on Heroku, as Tempfiles are allowed), though it can be
configured to do everything in-memory (see the Configuration doc) if
it has to.

I did a couple of tests in a Rails app and it seems to be a lot better
on memory.

Hopefully this will make things better.
Please let us know if you're still having problems

Mark

ninian

Oct 7, 2010, 3:46:07 AM
to Dragonfly
Hey Mark,

I was able to contain the problem somewhat with very aggressive
caching, but the core memory consumption issue is still there.

Do you think it would be hard to have Dragonfly write the resized
images to S3 and serve those when they are present? Or can any
RMagick specialists suggest ways to limit the memory usage of the
processing? Some single images can push my Ruby process up to 250MB...

Cheers,

NL



Mark Evans

Oct 10, 2010, 6:47:59 AM
to Dragonfly
Hi - what kind of processing are you doing? And what size images?
That seems quite high.
I personally don't know of anything else that can be done to keep
memory lower, as Dragonfly already uses the RMagick
destroy!/ping/non-blob methods - I wonder if that's just as far as
RMagick goes.
The other solution is creating processors which don't use RMagick.
This is pretty easy - e.g.

Dragonfly[:images].processor.add :my_resize do |temp_object, *args|
  t = Tempfile.new('dragonfly')
  `convert -some -options #{temp_object.path} #{t.path}`
  t
end

would shell out to the ImageMagick command line (btw the above is
pseudo-code - you might want a t.close in there, etc.)
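A slightly fleshed-out version of the same idea, usable outside
Dragonfly too (a sketch - shell_process and its tool: parameter are
assumptions, the parameter existing only so the external command can
be swapped out, and arguments are escaped with Shellwords):

```ruby
require 'tempfile'
require 'shellwords'

# Run an external command over an input file and hand back a Tempfile
# holding the result. Defaults to ImageMagick's convert, but any tool
# taking "input [options] output" style arguments will do.
def shell_process(input_path, tool: 'convert', args: [])
  out = Tempfile.new('dragonfly')
  out.close # only the path is needed; the external tool writes the file
  cmd = Shellwords.join([tool, input_path, *args, out.path])
  system(cmd) or raise "command failed: #{cmd}"
  out
end
```

e.g. shell_process(photo_path, args: %w[-resize 100x100]) would do the
resize in a separate process, so the Ruby process's memory stays flat.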

As for serving straight from S3, you could always do something like

class Album
  before_save :assign_thumb, :if => :cover_image_uid_changed?

  image_accessor :cover_image
  image_accessor :cover_image_thumb

  private

  def assign_thumb
    self.cover_image_thumb = cover_image ? cover_image.thumb('100x50') : nil
  end
end

to have another accessor for the thumbnail, then serve directly from
S3 with the correct url
"http://s3.url.whatever.that.is/bucket_name/#{album.cover_image_thumb_uid}"

Haven't tried that but it should work, or something similar

Mark