[JIRA] (JENKINS-47046) s3Upload with includePathPattern does not upload files


sorin.sbarnea@gmail.com (JIRA)

Jun 2, 2018, 2:14:03 PM
to jenkinsc...@googlegroups.com
Sorin Sbarnea commented on Bug JENKINS-47046
 
Re: s3Upload with includePathPattern does not upload files

I can confirm this bug: the `path` is completely ignored when `includePathPattern` is used, making it impossible to upload packages to specific target locations inside the bucket.
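A hypothetical invocation illustrating the reported behavior (bucket name, prefix, and glob are made up for illustration):

```groovy
// Illustrative only: with includePathPattern set, the matched files
// end up at the bucket root and the path: prefix is ignored.
s3Upload(
  bucket: 'my-bucket',            // hypothetical bucket name
  path: 'releases/1.0/',          // expected key prefix, but ignored
  includePathPattern: '**/*.jar',
  workingDir: 'build'
)
```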

 
This message was sent by Atlassian JIRA (v7.3.0#73011-sha1:3c73d0e)

sorin.sbarnea@gmail.com (JIRA)

Jun 2, 2018, 2:15:01 PM
to jenkinsc...@googlegroups.com
Sorin Sbarnea edited a comment on Bug JENKINS-47046
I can confirm this bug: the `path` is completely ignored when `includePathPattern` is used, making it impossible to upload packages to specific target locations inside the bucket.


I am still trying to find a way to work around this bug, but no solution yet. Anyone?


oliver.schoenborn@gmail.com (JIRA)

Feb 28, 2019, 2:45:02 AM
to jenkinsc...@googlegroups.com
Oliver Schoenborn commented on Bug JENKINS-47046
 
Re: s3Upload with includePathPattern does not upload files

The only thing that works for me is
 

s3Upload( 
  bucket: 'BUCKET', 
  path: "PATH_TO_FOLDER", // no trailing slash 
  file: "FOLDER", 
  workingDir: "PARENT_OF_FOLDER" 
)

 
With includePathPattern, the only pattern that worked partially was "**/*.yaml"; other patterns like "*.yaml" and "**/*" did not work. I say partially because it uploaded only one file even though there were several (there is a bug related to this).


oliver.schoenborn@gmail.com (JIRA)

Feb 28, 2019, 2:54:03 AM
to jenkinsc...@googlegroups.com
Oliver Schoenborn edited a comment on Bug JENKINS-47046
The only thing that works for me is
 
{code:java}

s3Upload(
  bucket: 'BUCKET',
  path: "PATH_TO_FOLDER", // no trailing slash
  file: "FOLDER",
  workingDir: "PARENT_OF_FOLDER"
){code}
 
With includePathPattern, the only pattern that worked *partially* was "**/*.yaml"; other patterns like "*.yaml" and "**/*" did not work. I say partially because it uploaded only one file even though there were several (there is a bug related to this).


findFiles is also an option, as documented at https://github.com/jenkinsci/pipeline-aws-plugin/issues/83.
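A minimal sketch of that findFiles approach (the glob, bucket name, and key prefix are illustrative): it sidesteps includePathPattern entirely by calling s3Upload once per matched file, so the path prefix is honored.

```groovy
script {
  // enumerate matching files in the agent workspace
  def files = findFiles(glob: '**/*.yaml')
  files.each { f ->
    // one upload per file keeps the path: prefix intact
    s3Upload(file: f.path, bucket: 'my-bucket', path: "configs/${f.name}")
  }
}
```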

robwhite83+jenkins@gmail.com (JIRA)

Mar 21, 2019, 7:18:03 PM
to jenkinsc...@googlegroups.com

Sad to see no update on this.

I can confirm it's an issue for me too and, yes, it happens only on my slave agents.

demetrio.lm@gmail.com (JIRA)

Jun 26, 2019, 9:23:03 AM
to jenkinsc...@googlegroups.com

My workaround is to upload the files one by one:

 

pipeline {
  agent any
  // declarative pipelines cannot hold bare 'def' statements in a stage;
  // constants belong in an environment block instead
  environment {
    AWS_ACCOUNT_ID = '01010101010101010'
    REGION = 'eu-north-1'
    ROLE = 'MyIamRole'
    EXTERNAL_ID = 'MyExternalId'
    BUCKET = 'my-artifacts'
    PROJECT = 'my-project'
  }
  stages {
    stage('Build app and upload artifacts to S3') {
      agent {
        label 'my-slave-with-maven'
      }
      steps {
        // build source code
        dir('./SourceCode') {
          sh 'mvn -B clean package'
        }
        // the script block must sit inside steps
        script {
          // upload files to S3 one by one
          def jar_files = findFiles(glob: "**/SourceCode/${PROJECT}/target/*.jar")
          jar_files.each {
            echo "JAR found: ${it.path}"
            withAWS(externalId: "${EXTERNAL_ID}", region: "${REGION}", role: "${ROLE}", roleAccount: "${AWS_ACCOUNT_ID}") {
              s3Upload(file: "${it.path}", bucket: "${BUCKET}", path: "${PROJECT}/", acl: 'BucketOwnerFullControl')
            }
          }
        }
      }
    }
  }
}