Rendering of large PDF files: is flushing memory possible?


Robert Neumayr

Mar 25, 2016, 3:59:46 AM3/25/16
to Prawn
Hello!

I have a project that renders pretty big PDF files. In one instance the PDF file is more than 600 pages with 4 images on almost every page. This eventually leads to a NoMemoryError in Ruby, because the PDF is held entirely in memory. Is there any way to flush the memory to the file during the creation process? I am thinking of something along the lines of "create 100 pages - flush - next 100 pages - flush - ..."

Thanks for taking the time and for this awesome library!

William Ross

Mar 25, 2016, 4:28:57 AM3/25/16
to prawn...@googlegroups.com
Dear Robert,

I would suggest writing out a series of smaller files and concatenating them with pdftk.
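Roughly along these lines, if that helps. This is an untested sketch: the records collection, the chunk size, and the file names are just placeholders for whatever drives your pages.

  require "prawn"

  CHUNK_SIZE = 100  # records (roughly pages) per intermediate file

  # Render one chunk into its own small PDF so only this chunk is in memory.
  def render_chunk(records, path)
    Prawn::Document.generate(path) do |pdf|
      records.each_with_index do |record, i|
        pdf.start_new_page unless i.zero?
        pdf.text record.title
        record.image_paths.each { |img| pdf.image img, width: 200 }
      end
    end
  end

  part_paths = records.each_slice(CHUNK_SIZE).map.with_index do |slice, n|
    path = "part_#{n}.pdf"
    render_chunk(slice, path)
    path
  end

  # pdftk concatenates its inputs in the order given:
  #   pdftk part_0.pdf part_1.pdf ... cat output merged.pdf
  system("pdftk", *part_paths, "cat", "output", "merged.pdf")

That way only one chunk's worth of pages ever lives in memory, and the merge step at the end is cheap by comparison.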


We do something similar to combine generated PDFs with uploaded PDF files, and it seems reliable. For example:

  def collate_correspondence
    FileUtils.mkdir_p(Rails.root.join("tmp/applications/#{id}"))
    merged_path = Rails.root.join("tmp/applications/#{id}/correspondence_#{id}.pdf")
    # Quote each path so the merge command copes with spaces in filenames.
    source_pathnames = (reviews + references).map(&:local_tempfile_path).map { |f| "'#{f}'" }
    if merge_pdfs(source_pathnames, merged_path)
      self.correspondence_file = File.open(merged_path)
      self.save
    end
  end

  def merge_pdfs(source_paths, destination_path)
    Pdftk.new.merge(source_paths, destination_path)
  end
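
(Pdftk there is just a thin wrapper of ours around the command-line tool, not a gem; it amounts to roughly this, give or take error handling:)

  class Pdftk
    # Shells out to the pdftk binary:
    #   pdftk 'a.pdf' 'b.pdf' cat output merged.pdf
    # The source paths are already quoted by the caller above.
    def merge(source_paths, destination_path)
      system("pdftk #{source_paths.join(' ')} cat output '#{destination_path}'")
    end
  end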


This is all quite slow because the files are pulled from S3, but the concatenation operation seems quick.

yours,

Will





