one way to get around S3 slowness

Jason LaPier

Apr 26, 2008, 11:39:43 PM
to papercli...@googlegroups.com
I'm building an app for a media company whose customers will need to
upload image files at print-level DPI: an average of 2MB per file,
usually 2 files at a time. The "boundless" nature of S3 sounds like a
great idea for someone who is going to run out of space on a little
VPS real quick, but with uploads that big going up to the server and
then making another trip over to S3 from there (and not too quickly),
this could be problematic.

Problem: Need S3 for long-term storage, but it's too slow to use in
"real time" with large uploads

Possible Solution: Put two paperclip attachments on the model, one
stored locally, one on S3. Run a background or cron script to move the
file from one attachment to the other (a rough sketch of such a task
is below, after the model code).

Example: (http://pastie.caboo.se/187484)

class ImageFile < ActiveRecord::Base
  # local copy, written to the app server's own filesystem
  has_attached_file :local, :styles => { :thumbnail => ["200x200!", :png] }

  # long-term copy on S3
  has_attached_file :remote, :styles => { :thumbnail => ["200x200!", :png] },
    :storage => :s3,
    :s3_credentials => File.new(File.join(File.dirname(__FILE__), "/../../config/s3.yml")),
    :bucket => "make-up-your-own-test-bucket",
    :path => ":class/:attachment/:id/:style.:extension"

  # for the views: use the local copy while it exists, otherwise the S3 copy
  def good_image
    local_file_name ? local : remote
  end

  # copy the local file up to S3, then clear the local attachment
  def archive_to_remote
    if local_file_name
      self.remote = local.to_file
      if self.save and remote.original_filename and remote.exists?
        self.local = nil
        self.save
      else
        false
      end
    end
  end
end
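
In the views I just call good_image and let the model decide which copy
to serve, so a thumbnail ends up being something like this (assuming an
@image_file instance set up in the controller):

<%= image_tag @image_file.good_image.url(:thumbnail) %>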

My preliminary tests show that this works. Of course, it's probably
regenerating the thumbnail during the archive, but I can live with
that.
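
For the cron side of it, I'm thinking of something along these lines.
This is an untested sketch; the images:archive task name is made up, and
it would get kicked off from crontab every few minutes with something
like "cd /path/to/app && rake images:archive RAILS_ENV=production":

namespace :images do
  desc "Push locally stored uploads over to S3 and clear the local copies"
  task :archive => :environment do
    # only touch records that still have a local file hanging around
    ImageFile.find(:all, :conditions => "local_file_name IS NOT NULL").each do |image|
      image.archive_to_remote
    end
  end
end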

Is this a Good Idea or a Bad Idea?

- Jason L.

--
My Rails and Linux Blog: http://offtheline.net
