I've just landed
a change in depot_tools which adds a new --no-history option to fetch (and gclient). This makes it possible to get a working checkout (without history) in ~26 min over an 8 Mbit/s DSL line [1].
In essence this works like git clone --depth, but it is DEPS-aware and works consistently with our pinned subprojects (shallowing each repository by the minimum number of commits needed to track the right revisions).
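For the curious, the underlying git mechanism is an ordinary shallow clone. The sketch below is my own illustration (not depot_tools code): it builds a throwaway three-commit repository and shows that git clone --depth 1 downloads only the tip commit, which is what keeps the transfer small.

```shell
#!/bin/sh
# Illustrative only: plain shallow clone, the mechanism --no-history builds on.
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Create a throwaway upstream repo with three commits.
git init -q upstream
cd upstream
git config user.email demo@example.com
git config user.name demo
for i in 1 2 3; do
  echo "$i" > file.txt
  git add file.txt
  git commit -q -m "commit $i"
done
cd ..

# Shallow-clone only the latest commit; older history is never downloaded.
# Note: --depth requires a file:// URL (it is ignored for plain local paths).
git clone -q --depth 1 "file://$tmp/upstream" shallow
cd shallow
echo "commits in shallow clone: $(git rev-list --count HEAD)"
```

The difference with --no-history is that it applies this shallowing across the whole DEPS tree rather than to a single repository.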
Why:
This is to deal with the difficulties that many people have reported in this ML [2].
Our repository/history is huge (~9 GB of .git objects), and git doesn't support resuming interrupted downloads, so fetching Chrome over a slow or flaky network connection can be a huge pain.
When should you use it:
In general, this option is discouraged for regular chromium developers [3].
However, if you are trying to fetch chromium over a very slow connection and are not interested in the history, this can save several gigs (approx. -7 GB) from your checkout and noticeably reduce fetch times (~26 min instead of hours).
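Concretely, for a fresh checkout that would look something like the commands below (a sketch; it assumes depot_tools is already on your PATH and is not runnable without a network connection):

```shell
# Fresh, history-less checkout (DEPS-aware shallow clone of all pinned repos):
mkdir chromium && cd chromium
fetch --no-history chromium

# Subsequent syncs of an existing checkout can also stay shallow:
gclient sync --no-history
```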
How does this affect your workflow:
This change should not interfere in any way with your current workflow (as long as you don't pass --no-history).
If you see anything weird related to history/shallow clones, either locally or on the bots, please let me know.
P.S. Many thanks to our infra folks for the prompt response / suggestions in the CR.
[1] I was able to complete a fetch in 26 mins over an 8 Mbit/s DSL line instead of hours. Other infra folks experienced similar speedups. See comments in
crrev.com/437903002.
[2]
[chromium-dev] "fetch chromium" keeps hanging/getting stuck on Windows 7
[chromium-dev] Initial checkout with git taking long
[chromium-dev] Trying to get latest source code fails when fetching
[chromium-dev] Gclient sync takes too long
[3] I did a manual smoke test, and gclient sync / git cl upload seem to work fine with a shallow clone (tried on Mac and Ubuntu). I won't be surprised, though, if some of our other tools assume a non-shallow checkout (certainly the bisect scripts do :) ).