Sharing local cache


Alexandre Garcia-Mayans

Jan 17, 2019, 4:50:53 PM
to bazel-...@googlegroups.com
Hi,

We are trying to optimize our Bazel setup for our current workflow, where multiple developers build on the same machine. Is it possible to share the local cache somehow?

I've seen that even with the same user, if I copy a repository and trigger a build in the original and then in the copy, compilation starts from scratch in the copy.

Regards,
Alexandre Garcia Mayans

Alexandre Rostovtsev

Jan 17, 2019, 5:13:35 PM
to the...@gmail.com, bazel-...@googlegroups.com
Have you tried the --disk_cache flag to use a local directory as the remote cache? See https://docs.bazel.build/versions/master/remote-caching.html#disk-cache

(Note that currently, Bazel doesn't garbage collect that directory, so you would probably want a script to occasionally prune it. See https://github.com/bazelbuild/bazel/issues/5139)
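For illustration, a minimal sketch of what that could look like (the cache path and the 30-day threshold below are just placeholders I picked, not anything Bazel prescribes):

build --disk_cache=~/.cache/bazel-disk-cache

# Occasional pruning, e.g. from cron: delete cache files not accessed in 30 days.
# Note this relies on atime being recorded, which noatime mounts disable.
find ~/.cache/bazel-disk-cache -type f -atime +30 -delete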


Alex Garcia

Jan 17, 2019, 5:18:42 PM
to Alexandre Rostovtsev, bazel-...@googlegroups.com
We do have a remote cache already set up. The problem is not really compilation time but rather wasted space.

I'd expect the --disk_cache flag to still get the data into a local cache per repository, right?

david.o...@gmail.com

Jan 18, 2019, 9:36:28 AM
to bazel-discuss
On Thursday, January 17, 2019 at 11:18:42 PM UTC+1, Alex Garcia wrote:
> We do have a remote cache already set up. The problem is not really compilation time but rather wasted space.
>
>
> I'd expect the --disk_cache flag to still get the data into a local cache per repository, right?

This is exactly why I added this feature:

"Allow path options to use user specific paths": [1]

Here is one example of how it is used: [2]

build --disk_cache=~/.gerritcodereview/bazel-cache/cas
build --repository_cache=~/.gerritcodereview/bazel-cache/repository

[1] https://github.com/bazelbuild/bazel/pull/4852
[2] https://github.com/GerritCodeReview/gerrit/blob/master/.bazelrc#L2-L3
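For the shared-machine case from the original question, a rough sketch (the path, group name, and permissions below are my own assumptions, not something Bazel requires) would be to point every checkout's .bazelrc at the same directory:

build --disk_cache=/var/cache/bazel/disk-cache
build --repository_cache=/var/cache/bazel/repository

and make that directory writable by all developers, e.g.:

sudo mkdir -p /var/cache/bazel
sudo chgrp -R dev /var/cache/bazel
sudo chmod -R g+rwX /var/cache/bazel

Since the disk cache is content-addressed, separate checkouts of the same repository should then get hits from one another.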

Alexandre Rostovtsev

Jan 18, 2019, 5:22:21 PM
to Alex Garcia, buc...@google.com, bazel-...@googlegroups.com
Right, a local cache will still be there taking some space. There is a plan to reduce local caching when a remote cache is available - https://github.com/bazelbuild/bazel/issues/6862 - but the work is at a very early and experimental stage.

At the moment, you might need to regularly run bazel clean.

+Jakob Buchgraber, who knows the most about this area.

david.o...@gmail.com

Jan 18, 2019, 5:28:52 PM
to bazel-discuss
On Friday, January 18, 2019 at 11:22:21 PM UTC+1, Alexandre Rostovtsev wrote:
> Right, a local cache will still be there taking some space. There is a plan to reduce local caching when a remote cache is available - https://github.com/bazelbuild/bazel/issues/6862 - but the work is at a very early and experimental stage.
>
>
> At the moment, you might need to regularly run bazel clean.

I don't see how the disk cache is related to bazel clean; even with the `--expunge_async` option it must not wipe out the cache, so currently only `rm -rf` is your friend.
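To make the distinction concrete (reusing the Gerrit paths from above purely as an example):

# Wipes the outputs and on-disk state under the workspace's output base:
bazel clean --expunge

# The --disk_cache directory is untouched by clean and has to be removed or pruned by hand:
rm -rf ~/.gerritcodereview/bazel-cache/cas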

Alexandre Rostovtsev

Jan 18, 2019, 5:36:54 PM
to david.o...@gmail.com, bazel-discuss
I think we might be talking about different caches. I meant the files cached under outputBase for a given workspace.
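For reference (this is just the standard way to locate that directory), it can be printed from inside the workspace with:

bazel info output_base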


mikhail...@fivetran.com

Apr 22, 2020, 9:44:53 AM
to bazel-discuss
I'd like to second this question.

outputBase is not shared between job runs on CI, so we waste a lot of time re-analyzing the state of the build (the action cache, specifically).

Is it by design?
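For illustration, a minimal sketch of the kind of sharing I mean, using the --output_base startup option (the path is just an example, and only one Bazel server can use a given output base at a time):

bazel --output_base=/ci-cache/bazel-output-base build //...

i.e. the CI system would have to persist that directory between runs.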


On Saturday, January 19, 2019 at 1:36:54 AM UTC+3, Alexandre Rostovtsev wrote:
I think we might be talking about different caches. I meant the files cached under outputBase for a given workspace.

On Fri, Jan 18, 2019 at 5:28 PM <david.o...@gmail.com> wrote:
On Friday, January 18, 2019 at 11:22:21 PM UTC+1, Alexandre Rostovtsev wrote:
> Right, a local cache will still be there taking some space. There is a plan to reduce local caching when a remote cache is available - https://github.com/bazelbuild/bazel/issues/6862 - but the work is at a very early and experimental stage.
>
>
> At the moment, you might need to regularly run bazel clean.

I don't see how the disk cache is related to bazel clean; even with the `--expunge_async` option it must not wipe out the cache, so currently only `rm -rf` is your friend.
