Default memory resource limits for ImageMagick on Cloud Functions

Shai Ben-Tovim

Sep 12, 2017, 10:13:36 AM
to Firebase Google Group


Having suffered quite a few "Error: memory limit exceeded. Function killed." errors when using ImageMagick inside a Firebase Cloud Function (FCF), it seems that IM's default resource limits (set via environment variables, as described in the IM docs) are way higher than what a default function actually has available.

This is what IM thinks it has on an FCF instance:

  File       Area     Memory        Map       Disk   Thread  Throttle       Time

--------------------------------------------------------------------------------

 18750    4.295GB       2GiB       4GiB  unlimited        8         0  unlimited  
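For reference, this listing can be reproduced from inside a deployed function by shelling out to IM's identify command. A minimal sketch, assuming the standard Node.js runtime with ImageMagick available on the PATH:

```typescript
import { execSync } from "child_process";

// Print the resource limits ImageMagick believes it has inside the
// function sandbox ("identify -list resource" is a standard IM command).
export function logImageMagickLimits(): void {
  const listing = execSync("identify -list resource", { encoding: "utf8" });
  console.log(listing);
}
```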


Meanwhile, the default memory allocated to a function is 256MB. As a result, IM allocates in-memory pixel buffers as if it had 2GiB to work with, and it never falls back to disk-based buffers even though it has already exceeded the function's limit.
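A per-invocation workaround that seems possible is to cap the pixel cache explicitly with IM's -limit option when spawning convert, so it spills to disk before hitting the function's allocation. A rough sketch, where the paths, geometry and the 128MiB figure are placeholders rather than tested values:

```typescript
import { spawnSync } from "child_process";

// Cap ImageMagick's pixel cache explicitly so it falls back to disk instead
// of exceeding the function's allocation. All values here are illustrative.
export function resizeWithLimits(src: string, dest: string): void {
  const result = spawnSync("convert", [
    "-limit", "memory", "128MiB", // ceiling for the in-RAM pixel cache
    "-limit", "map", "128MiB",    // ceiling for memory-mapped caching
    src,
    "-resize", "1024x1024>",      // shrink-only resize, placeholder geometry
    dest,
  ]);
  if (result.status !== 0) {
    throw new Error(`convert failed: ${result.stderr?.toString()}`);
  }
}
```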


It seems that on GAE the IM resource limits are set more realistically; at least the memory limit roughly matches what my instance actually has:

  File       Area     Memory        Map       Disk   Thread  Throttle       Time

--------------------------------------------------------------------------------

   768    2.098GB    1000MiB  1.9539GiB  unlimited        1         0  unlimited
As developers cannot set "regular" environment variables (the way they can in GAE) to override these defaults themselves, it would be helpful if FCF could provision these limits based on the resources actually available to the function. Image manipulation functions tend to run longer and be more resource intensive than others, and simply increasing the memory allocated to the function (in the GCP console) is not great cost-wise, nor is it bullet-proof against outlier cases.
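In the meantime, the closest thing to a workaround I can see is setting IM's documented MAGICK_* limit variables on the child process environment before spawning convert, since the child inherits whatever the function code puts there. A sketch with illustrative numbers only (and note that /tmp on FCF appears to be memory-backed, so pushing buffers to disk may not buy as much as it does elsewhere):

```typescript
import { spawn } from "child_process";

// Spawn convert with ImageMagick's documented MAGICK_* limit variables set
// for this child only, since FCF gives no GAE-style way to configure them
// at deploy time. The 128MiB / 1GiB values are illustrative, not tuned.
export function convertWithEnvLimits(args: string[]) {
  return spawn("convert", args, {
    env: {
      ...process.env,
      MAGICK_MEMORY_LIMIT: "128MiB", // in-RAM pixel cache
      MAGICK_MAP_LIMIT: "128MiB",    // memory-mapped pixel cache
      MAGICK_DISK_LIMIT: "1GiB",     // disk spill (but /tmp is RAM-backed here)
    },
  });
}
```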
