Hi,
on our Gerrit 2.9.4 server, pushing 21 changes created via
  for i in {4..24}; do
    # ~1 MB of random data per file (bs=1 makes dd slow to run,
    # but the resulting 1024000-byte file is the same)
    dd if=/dev/urandom of="rf$i" bs=1 count=1024000
    git add .
    git commit -m "commit $i"
  done
takes very long, roughly 2 minutes. However, this only happens for a single project. For other projects on the same server it takes only about 14 seconds, which is fine. So it's not a network bandwidth issue, and Git also shows a transfer rate of about 8 MB/s. Running GC on the project (which we do once a week anyway) does not improve performance either.
I've read that a large number of refs/changes can cause the "Processing changes" step to take longer. However, what counts as "large" in this context? We have about 40k changes, which I personally don't consider a lot.
Any other hints on what could cause "Processing changes" to take so long? What exactly does the server do in this step, and how can it be profiled?
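On the client side I can at least get per-phase timestamps with Git's built-in tracing; the gap before Gerrit's "Processing changes" message should show whether the time goes into transfer or the server. A sketch of what I mean (the remote name "gerrit" is an assumption; a local bare repo stands in for the real server so the commands are runnable):

```shell
set -e
export GIT_AUTHOR_NAME=t GIT_AUTHOR_EMAIL=t@example.com
export GIT_COMMITTER_NAME=t GIT_COMMITTER_EMAIL=t@example.com
work=$(mktemp -d)
git init -q --bare "$work/server.git"   # stand-in for the Gerrit server
git init -q "$work/client"
cd "$work/client"
git remote add gerrit "$work/server.git"
git commit -q --allow-empty -m probe
# GIT_TRACE timestamps each phase; GIT_TRACE_PACKET logs the pack
# protocol exchange. Both write to stderr.
GIT_TRACE=1 GIT_TRACE_PACKET=1 git push gerrit HEAD:refs/heads/master 2> push-trace.log
wc -l < push-trace.log   # timestamped trace lines to inspect
```

Whether anything comparable exists on the Gerrit side (beyond the httpd/sshd logs) is exactly what I'm asking about.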
Thanks for any insights.
Regards,
Sebastian