I'm using http_jar to pull external jar deps for our repo from S3 (about 150 jars in total). On a slow connection, I've seen Bazel simply hang, with the console output alternating between downloading two or more files. This doesn't happen on a fast connection.
I've managed to reproduce this consistently using a network throttling tool and a simplified workspace that uses http_file.
Repro repo: https://github.com/mightyguava/bazel-external-bug
I uploaded 100 random 100kB files to s3, referencing them using http_file in the repo above.
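For reference, the WORKSPACE entries look roughly like the sketch below. The bucket name, file names, and checksum values here are placeholders, not the actual values from the repro repo:

```python
# WORKSPACE -- illustrative sketch only; bucket, file names, and sha256
# values are placeholders, not the real ones in the repro repo.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_file")

http_file(
    name = "file_1",
    urls = ["https://<bucket>.s3.amazonaws.com/file_1"],
    sha256 = "<checksum>",
)

http_file(
    name = "file_2",
    urls = ["https://<bucket>.s3.amazonaws.com/file_2"],
    sha256 = "<checksum>",
)
# ...and so on, for 100 files.
```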
With my network connection throttled to 2 Mbps (I'm using the "Network Link Conditioner" from the Apple developer tools), and running the command:
bazel build `bazel query //external:all | grep file_ | cut -d : -f 2 | sed -e 's/^/@/' -e 's/$/\/\/file/'`
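The query pipeline just expands to one @repo//file label per external file repo. As a standalone illustration (using a made-up sample of query output in place of the real `bazel query //external:all`, which also lists non-file repos that the grep filters out):

```shell
# Hypothetical sample of `bazel query //external:all` output, piped
# through the same filter as above to produce build targets.
printf '%s\n' '//external:file_1' '//external:file_2' '//external:bazel_tools' \
  | grep file_ | cut -d : -f 2 | sed -e 's/^/@/' -e 's/$/\/\/file/'
# Prints:
#   @file_1//file
#   @file_2//file
```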
I consistently end up in a state that looks like https://cl.ly/3T1g3N0D3a2b. The files that hang appear to be random. I have to Ctrl-C twice to kill Bazel, and the Bazel server remains unresponsive and is killed by the client on the next invocation.
I'm not sure whether the problem is with my configuration, Bazel, or S3, but my guess is that Bazel is trying to download too many things in parallel.
Would appreciate any help in debugging this.
Thanks,
Yunchi