Yes, archiving artifacts is very slow; this has apparently always been the case with Jenkins. There are numerous Jira issues that have come and gone without a fix. It is what it is..
As I commented here, I'm also seeking large performance gains by replacing copyArtifact with a shell call in my pipelines. Until I can install a proper artifact management system and replace all archive/copyArtifact steps with calls to its REST API (which I'll be doing later this year), I'm hoping to find a quick alternative.
HTTP/wget/curl is problematic when you want to fetch anything other than a single artifact or all artifacts, because HTTP has no notion of a directory: you have to fetch the index and preprocess it before fetching what you really want. With scp I could just use Unix glob patterns to fetch what I want in a single, simple call.
HTTP/wget/curl is also problematic because it requires Jenkins API tokens and authentication. I'm using Ansible to set up my Jenkins infrastructure inside a firewalled LAN, and the jenkins users on all nodes are already set up for passwordless SSH back and forth.
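To illustrate the index-plus-preprocess dance, here is a minimal sketch of the HTTP route, assuming the standard Jenkins JSON API; the job URL, user, and API_TOKEN are placeholders. Note the double-encoding gotcha: `%2F` in the job name must itself be URL-encoded to `%252F`.

```shell
# extract_paths <regex>: pull relativePath values out of the build's JSON
# artifact index on stdin and filter them -- a poor man's glob, since HTTP
# gives us no directory listing to glob against.
extract_paths() {
  grep -o '"relativePath" *: *"[^"]*"' | sed 's/.*"\([^"]*\)"$/\1/' | grep "$1"
}

# fetch_artifacts <regex>: the whole dance. JOB_URL and user:API_TOKEN are
# placeholders. -g stops curl from treating the [] in the tree filter as its
# own URL globbing syntax.
fetch_artifacts() {
  JOB_URL='https://jenkins.example/job/ProjectFolder/job/MyProject/job/feature%252Ffoo'
  curl -sfg -u "user:API_TOKEN" \
      "$JOB_URL/lastSuccessfulBuild/api/json?tree=artifacts[relativePath]" |
    extract_paths "$1" |
    while read -r p; do
      # one round trip per artifact -- exactly the overhead scp's glob avoids
      curl -sf -u "user:API_TOKEN" --create-dirs -o "artifacts/$p" \
        "$JOB_URL/lastSuccessfulBuild/artifact/$p"
    done
}

# Usage: fetch_artifacts '\.tar\.gz$'
```

Compare that with the single scp call below; the HTTP version needs an extra request for the index plus one request per artifact, on top of token management.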
So, scp would be a slam dunk for me if I could construct the source path correctly. The problem is that Jenkins uses an algorithm to create unique folder names, both for workspaces and for branch jobs, and I'm not sure that algorithm is stable, so I don't know whether it's safe to reconstruct and reference job paths on the controller's disk.
E.g., to fetch artifacts from the corresponding branch of an upstream multibranch pipeline job whose full project name is "ProjectFolder/MyProject/feature%2Ffoo", I would do something like this in the downstream multibranch pipeline:
scp -r jenkins-controller:<JENKINS_HOME>/jobs/ProjectFolder/jobs/MyProject/branches/<HOW_DO_I_COMPUTE_THE_BRANCH_PORTION_OF_PATH?>/lastSuccessfulBuild/artifact/<GLOB>
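One possible workaround rather than reimplementing the mangling algorithm, assuming the branch-api plugin behaves as I recall: when it mangles a branch folder name, it drops a `name-utf8.txt` file containing the original (unmangled) branch name, e.g. "feature/foo", into that folder. If that holds, the branch directory could be resolved at runtime instead of computed (the function below is a hypothetical helper, meant to be run over ssh on the controller):

```shell
# resolve_branch <branches-dir> <branch-name>
# Print the on-disk folder for a branch: try the literal folder name first,
# then fall back to the name-utf8.txt marker that (if memory serves) the
# branch-api plugin leaves inside mangled branch folders.
resolve_branch() {
  dir=$1 name=$2
  for d in "$dir"/*/; do
    base=$(basename "$d")
    if [ "$base" = "$name" ]; then
      printf '%s\n' "$d"; return 0
    fi
    if [ -f "$d/name-utf8.txt" ] && [ "$(cat "$d/name-utf8.txt")" = "$name" ]; then
      printf '%s\n' "$d"; return 0
    fi
  done
  return 1
}

# Hypothetical usage from the downstream pipeline, with the scp glob intact:
#   branch_dir=$(ssh jenkins-controller "... resolve_branch \
#     '\$JENKINS_HOME/jobs/ProjectFolder/jobs/MyProject/branches' 'feature/foo'")
#   scp -r "jenkins-controller:$branch_dir/lastSuccessfulBuild/artifact/<GLOB>" .
```

This sidesteps the question of whether the mangling is stable across versions, at the cost of one extra ssh round trip per fetch.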