Google Cloud Build Array Size limits


Jacob Rhoden

Jan 11, 2020, 1:45:35 AM
to google-appengine-go
We have a package that consists of a single byte array: a 13 MB lookup/translation table stored in an array in a .go file. When I attempt to deploy to Google App Engine, the build fails with an out-of-memory error. (Details below.)

Any ideas for how to build projects that contain large arrays?

Thanks!




Step #1 - "builder": 2020/01/11 06:27:11 Failed to build app: [go build -o /tmp/staging/usr/local/bin/start .] with env [PATH=/go/bin:/usr/local/go/bin:/builder/google-cloud-sdk/bin/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin HOSTNAME=c77b1898fb9a HOME=/builder/home BUILDER_OUTPUT=/builder/outputs DEBIAN_FRONTEND=noninteractive GOROOT=/usr/local/go/ GOPATH=/go GO111MODULE=on GOCACHE=/tmp/cache GOPATH=/go] failed: err=exit status 2

...
...
...

fatal error: runtime: out of memory

runtime stack:
runtime.throw(0xe44a39, 0x16)
	/usr/local/go/src/runtime/panic.go:774 +0x72
runtime.sysMap(0xc0c0000000, 0x4000000, 0x15c6958)
	/usr/local/go/src/runtime/mem_linux.go:169 +0xc5
runtime.(*mheap).sysAlloc(0x159e940, 0x2000, 0x2000, 0xc00003ded8)
	/usr/local/go/src/runtime/malloc.go:701 +0x1cd
runtime.(*mheap).grow(0x159e940, 0x1, 0xffffffff)
	/usr/local/go/src/runtime/mheap.go:1255 +0xa3
runtime.(*mheap).allocSpanLocked(0x159e940, 0x1, 0x15c6968, 0xc00001d320)
	/usr/local/go/src/runtime/mheap.go:1170 +0x266
runtime.(*mheap).alloc_m(0x159e940, 0x1, 0xc000030012, 0x458b8a)
	/usr/local/go/src/runtime/mheap.go:1022 +0xc2
runtime.(*mheap).alloc.func1()
	/usr/local/go/src/runtime/mheap.go:1093 +0x4c
runtime.systemstack(0x0)
	/usr/local/go/src/runtime/asm_amd64.s:370 +0x66
runtime.mstart()
	/usr/local/go/src/runtime/proc.go:1146

goroutine 1 

Chaoming Li

Jan 12, 2020, 4:45:51 PM
to google-appengine-go
You can put the large array into a JSON file and deploy it with the project. In the main() function, read the JSON file and keep it in a package-level variable, so each instance only needs to read the file once in its lifetime. I imagine it will make your instances slower to spin up.

I have used this for static lookup data files, though none as big as 13 MB. I hope this helps.

Jacob Rhoden

Jan 22, 2020, 1:35:20 PM
to google-appengine-go
Hi Chaoming,

On Monday, 13 January 2020 08:45:51 UTC+11, Chaoming Li wrote:
You can put the large array into a JSON file and deploy with the project. In the main() function, you can read the JSON file and keep it in an array variable so that each instance will only need to read the file once in its lifecycle. I can imagine it will make your instance slower to spin up.

Thanks for your reply! It did occur to me that we could write code to convert the binary data into a JSON file, and then Go code to convert that JSON back into binary at runtime, but the only reason to do so would be to work around an apparent Google Cloud Build limitation. Having looked at it further, it seems the problem is a build-time memory limit: the .go file containing the data is simply too large for the compiler.

I have since discovered that with "Google Cloud Run" we can do the Docker build on our local machine, which avoids the memory limitation. However, this creates a secondary problem: "Google Cloud Run" is only available (with local Datastore/Firebase) in a couple of regions, and those regions have high latency from where I live.

Looking forward to the growing availability of Google Cloud Run.

Best regards,
Jacob
