What compression algorithm is used by the pipeline zip step, and is it tunable?

Tim Black

Dec 2, 2019, 2:56:02 PM
to Jenkins Users
Our projects produce large artifacts that now need to be compressed, and I'm considering my alternatives. The zip step would be a nice non-plugin solution, but I'm curious what compression technique it uses. The step's documentation page doesn't show any options that pertain to compression tuning.
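
For reference, here's roughly how I'm calling it today (the directory is a placeholder for our artifact output); as far as I can tell, none of the documented parameters touch compression:

    // Minimal zip step usage; 'build/output' stands in for our artifact dir
    zip zipFile: 'artifacts.zip', dir: 'build/output', archive: true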

I've also seen the Compress Artifacts Plugin, but I can't tell from its docs either whether the algo is tunable. Also I'd rather not depend on another plugin.

If neither of the above works, I'll simply use the sh step to call xz, gzip, bzip2, or the like, from my Linux-based master.
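
Something along these lines is what I have in mind, where the level is at least tunable (directory name again a placeholder):

    // Shell fallback: tar piped to xz with an explicit compression level
    sh 'tar -cf - build/output | xz -6 > artifacts.tar.xz'
    // or gzip at maximum compression:
    // sh 'tar -cf - build/output | gzip -9 > artifacts.tar.gz'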

Thanks for your consideration,
Tim Black

Björn Pedersen

Dec 3, 2019, 2:09:19 AM
to Jenkins Users
Hi,

I would probably try to compress on the agent before even trying to transfer the large data to the master. This avoids load on the master a) due to transfer and b) due to compression.
And if the artifacts get really huge, consider storing them independently of Jenkins (S3, a Maven-style repo, whatever matches your use case).
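
A rough sketch of what I mean (agent label and paths are just examples):

    node('linux-agent') {    // run on the agent, not the master
        // compress where the data lives, so only the small file crosses the wire
        sh 'tar -cJf artifacts.tar.xz build/output'
        archiveArtifacts artifacts: 'artifacts.tar.xz'
    }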


Björn

Tim Black

Dec 4, 2019, 3:00:05 PM
to Jenkins Users
Thanks Björn. We're currently on a single master, and I will definitely take performance into consideration when we scale. We're looking into installing and integrating with Artifactory in the coming months, which should help with managing artifacts, but I suspect there will still be the issue of "who does the compression".
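
For when we get there, I'm picturing something like the Artifactory plugin's scripted upload API (the server ID and repo path below are placeholders):

    // Upload the pre-compressed archive to Artifactory instead of the master
    def server = Artifactory.server 'our-artifactory'   // configured server ID
    def uploadSpec = '''{
      "files": [
        { "pattern": "artifacts.tar.xz", "target": "our-repo/builds/" }
      ]
    }'''
    server.upload spec: uploadSpec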

Many of our artifacts are already compressed entities, but I have confirmed that zipping everything (using the Jenkins zip step) shrinks them by more than half, so I'm definitely on the right track.