--
Chromium Developers mailing list: chromi...@chromium.org
View archives, change email options, or unsubscribe:
http://groups.google.com/a/chromium.org/group/chromium-dev
you probably want to use -A, but keep in mind that your description as-is doesn't make sense. "old" pack files aren't a thing purely based on timestamps. they're incremental by default so having multiple packs in your objects tree is perfectly normal.
On Thu, Nov 19, 2015 at 1:39 PM, Mike Frysinger <vap...@chromium.org> wrote:
> you probably want to use -A, but keep in mind that your description as-is doesn't make sense. "old" pack files aren't a thing purely based on timestamps. they're incremental by default so having multiple packs in your objects tree is perfectly normal.

The point of -a and -A is to not incrementally pack, and instead pack everything into a single pack (from the git-repack docs). Does that not actually mean "a single pack on disk"?
As to -A vs. -a, it seems like the only difference there is that if I use -A I have to run an extra git gc pass after the repack. Is that not true?
On Thu, Nov 19, 2015 at 1:43 PM, Peter Kasting <pkas...@google.com> wrote:
> On Thu, Nov 19, 2015 at 1:39 PM, Mike Frysinger <vap...@chromium.org> wrote:
>> you probably want to use -A, but keep in mind that your description as-is doesn't make sense. "old" pack files aren't a thing purely based on timestamps. they're incremental by default so having multiple packs in your objects tree is perfectly normal.
>
> The point of -a and -A is to not incrementally pack, and instead pack everything into a single pack (from the git-repack docs). Does that not actually mean "a single pack on disk"?

that isn't what your e-mail said. it said "i saw packs with old timestamps, therefore i then ran git repack", with the implication that the packs, being old, were useless. i'm pointing out that logic doesn't make sense by itself.
> As to -A vs. -a, it seems like the only difference there is that if I use -A I have to run an extra git gc pass after the repack. Is that not true?

-A will drop useless objects that are already in a pack. -a won't do that. since your stated goal is to minimize, it sounds like you want -A.
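(Not part of the original exchange; a minimal sketch of the sequence being discussed, based on the flags documented in git-repack(1) and git-prune(1).)

```shell
# Repack everything into a single pack. With -A, unreachable objects in
# the old packs are kept around as loose objects rather than deleted;
# -d removes the now-redundant old packs.
git repack -A -d

# Expire the loose, unreachable objects left behind by -A (the "extra
# git gc pass" mentioned above; git gc would also cover this).
git prune
```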
Hmm. I think maybe the issue is that I have another Chromium checkout inside my webrtc checkout. Unfortunately, running "git gc --prune=now" there eventually gives: "fatal: Out of memory? mmap failed: No error". Despite this being on a machine with 64 GB of RAM.
I wanted to do a repack, but it is prohibitively slow. The following worked for me and is much faster:

Do a fresh clone. This will ensure that the repo has one big packfile. (And the server packs it for you, so no need to do it locally.)

$ git clone https://chromium.googlesource.com/chromium/src.git chromium_src

Restore your local work to this repo (you will lose your stashes):

$ cd chromium_src
$ git remote add with_old_packs file:///path/to/chromium/src
$ git fetch with_old_packs --tags

Swap the fragmented objects and packfiles with the fresh tidy packfiles:

$ cd /path/to/chromium/src/.git
$ mv objects objects.bak
$ mv /path/to/fresh/chromium_src/.git/objects objects

Stop Git from complaining about missing stashes:

$ git stash clear

If you are convinced that you did not lose any local work, remove `chromium_src` and `.git/objects.bak`.

You should now have two packfiles, a big one (~4.5 GiB) with the Chromium (and Blink) history, and a small one with your local work. There should be no loose objects.
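(Not in the original message; a quick way to confirm the end state described above, using standard git plumbing.)

```shell
# Loose-object and pack counts; after the swap you'd expect
# "count: 0" (no loose objects) and "packs: 2".
git count-objects -v -H

# Eyeball the pack sizes directly.
ls -lh .git/objects/pack/*.pack
```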
--
Somehow I missed this thread!

On Thu, Nov 19, 2015 at 9:34 PM, 'Peter Kasting' via Chromium-dev <chromi...@chromium.org> wrote:
> My drive is filled with large git pack files, many of which have old dates, totaling 55 GB.

55 GB of pack files is insane. An ideal value should be ~4 GB. A reasonable value should be around 5-6 GB.

I'd expect "git repack -a -d" to collapse everything into one pack file. Peter, what you are seeing feels like just a git bug.
> Unfortunately, running "git gc --prune=now" there eventually gives: "fatal: Out of memory? mmap failed: No error". Despite this being on a machine with 64 GB of RAM.

This is a combined effect of:
- depot_tools setting a very high core.deltaBaseCacheLimit (which is a good idea in general)
- your machine having a lot of cores, which hints git to use a high pack.threads concurrency (which is a good idea in general)

So prune ends up spawning a lot of threads, each one with a huge delta-cache budget. The reason the internet is not helping here is that people on the internet usually don't have 40+ core workstations and don't mess around with deltaBaseCacheLimit. You have the privilege of doing both :)

The answer here is to either temporarily lower deltaBaseCacheLimit (say to 128M) or temporarily reduce pack.threads (I'd probably go for the latter).
On top of that, there is a possibility that even if you succeed in repacking everything, the final pack file won't be as small as you expect. The reason is that all these repack operations don't rebuild the delta chains, which is what might hit your sizeof(.git/objects/pack) the most if you sync ~daily for ~years.

What you really want in this case is to pass -f (no reuse delta) to git repack. This will take ages (a night, if you set the right deltaBaseCacheLimit or pack.threads options), but it is the only thing that would actually bring full sanity to your packs... other than restarting from scratch, which honestly might not be such a bad idea ;-)
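Putting the whole thread together, the full "overnight" repack being described might look like this (the thread-count value is an illustrative guess, not a recommendation from the thread):

```shell
# -a/-d: collapse everything into a single pack and drop the old ones.
# -f:    ignore existing deltas and recompute the delta chains from
#        scratch; this is the slow-but-thorough step discussed above.
git -c pack.threads=4 repack -a -d -f
```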