uploader = CsvUploader.new(:store)
import.file = uploader.upload Down.open(import_file) # import_file is the S3 object.key
class CsvUploader < Shrine
  plugin :direct_upload, max_size: 5 * 1024 * 1024 # 5 MB
end

require 'shrine'
require 'shrine/storage/s3'
s3_options = {
  access_key_id: ENV['AWS_ACCESS_KEY_ID'],
  secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
  region: ENV['AWS_REGION'],
  bucket: ENV['AWS_BUCKET']
}

Shrine.storages = {
  cache: Shrine::Storage::S3.new(prefix: 'cache', **s3_options),
  store: Shrine::Storage::S3.new(prefix: 'store', **s3_options)
}
Shrine.plugin :activerecord
Shrine.plugin :cached_attachment_data # for forms
Shrine.plugin :logging
Shrine.plugin :backgrounding
Shrine::Attacher.promote { |data| Delayed::Job.enqueue PromoteJob.new(data) }
Shrine::Attacher.delete { |data| Delayed::Job.enqueue DeleteJob.new(data) }
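For completeness, PromoteJob and DeleteJob are roughly the standard Delayed::Job wrappers from the Shrine backgrounding docs, something like the following (the Struct-based job classes are an assumption, not the exact code from the app):

class PromoteJob < Struct.new(:data)
  def perform
    Shrine::Attacher.promote(data) # moves the cached file to :store
  end
end

class DeleteJob < Struct.new(:data)
  def perform
    Shrine::Attacher.delete(data) # deletes the replaced/destroyed file
  end
end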
Can I do a move, or a copy followed by a delete, in the same way? I want to remove the uploaded files after they have been saved to the permanent location.
yeah, not obvious I guess :) there are two separate flows that were using paperclip that I am trying to replace:
- Flow #1 Web user using the website attaches a CSV file directly to an Import model
- The Import model has that CsvUploader attribute
- Uses the cache/store approach and seems to be working as expected
- Flow #2 Background task attaches CSV files to an import model (per previous email)
- In this case the files are uploaded to another S3 bucket completely - let's call it Upload
- A rake task runs every 10 minutes looking for files in Upload; if any are found, it creates an Import and attaches the CSV to it (rough sketch of the task below)
- I'm not sure at this point how the cache/store approach works in this flow, or how I should address it
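For concreteness, the task looks roughly like this; the bucket name, task name and Rails environment dependency are placeholders, and the attach step is exactly the part in question:

require 'aws-sdk-s3'

namespace :imports do
  task process_uploads: :environment do
    # The separate "Upload" bucket that the background task scans every 10 minutes.
    bucket = Aws::S3::Resource.new(region: ENV['AWS_REGION']).bucket('upload-bucket')

    bucket.objects.each do |object|
      next unless object.key.end_with?('.csv')

      import = Import.new
      # ...the open question: attach the object at object.key as import.file
      # via the cache/store flow, then save the Import and clean up the source.
    end
  end
end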
Shrine.storages = {
  cache: Shrine::Storage::S3.new(bucket: "my-bucket", ...),
  store: Shrine::Storage::S3.new(bucket: "my-bucket", ...),
  other_store: Shrine::Storage::S3.new(bucket: "upload-bucket", ...), # the separate "Upload" bucket the files land in
}
Shrine.plugin :refresh_metadata
uploaded_file = CsvUploader.uploaded_file("id" => object_key, "storage" => "other_store")
uploaded_file.refresh_metadata! # retrieves metadata from the S3 object (recommended)
store = CsvUploader.new(:store)
stored_file = store.upload(uploaded_file) # S3 object is copied from :other_store to :store
import = Import.new
import.file_attacher.set(stored_file) # stored file is assigned without any re-uploading
import.save
Shrine.storages = {
  cache: Shrine::Storage::S3.new(bucket: "my-bucket", ...),
  store: Shrine::Storage::S3.new(bucket: "my-bucket", ...),
}
remote_file = Down.open(import_url)
store = CsvUploader.new(:store)
stored_file = store.upload(remote_file) # downloads and uploads the remote file to :store
import = Import.new
import.file_attacher.set(stored_file) # assigns the stored file without any re-uploading
import.save
I just need to remove the uploaded file after I have copied it.
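For reference, once the copy to :store has succeeded, the source object can be deleted through the same UploadedFile, along these lines (object_key and :other_store follow the earlier snippet; treat it as a sketch rather than exact code):

uploaded_file = CsvUploader.uploaded_file("id" => object_key, "storage" => "other_store")

store = CsvUploader.new(:store)
stored_file = store.upload(uploaded_file) # copies the object to permanent storage

uploaded_file.delete # removes the original object from the "Upload" bucket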