[JIRA] (JENKINS-59484) EXECUTOR_NUMBER is always expanded to '0' in workspace name when checking out with Perforce to a subdirectory


mbrunton@eidosmontreal.com (JIRA)
Sep 23, 2019, 9:09 AM
to jenkinsc...@googlegroups.com
Matthew Brunton created an issue
 
Jenkins / Bug JENKINS-59484
EXECUTOR_NUMBER is always expanded to '0' in workspace name when checking out with Perforce to a subdirectory
Issue Type: Bug
Assignee: Unassigned
Components: p4-plugin
Created: 2019-09-23 13:08
Environment: Docker image jenkins/jenkins:2.176.3 in a docker network with a Perforce instance (https://github.com/p4paul/helix-docker) using p4-plugin 1.10.3
Priority: Blocker
Reporter: Matthew Brunton

When using the p4 plugin to check out to a subdirectory, EXECUTOR_NUMBER is always expanded to '0' in the Perforce workspace name. This breaks concurrent builds that rely on the Perforce workspace root, since every build uses the same workspace and overwrites its root directory.

When not checking out to a subdirectory, the executor number is expanded correctly. It still doesn't match the result of 'echo ${EXECUTOR_NUMBER}', but at least multiple builds don't clash over workspaces.

This occurs both when using 'checkoutToSubdirectory' under options and when manually wrapping 'checkout scm' in a dir() step.

 

To reproduce, I used the jenkins/jenkins:2.176.3 Docker image of Jenkins (I was also able to reproduce it on 2.150.3 and 2.164.3) with p4-plugin 1.10.3 and a clean Docker instance of Perforce (https://github.com/p4paul/helix-docker). I created a stream depot with a dummy stream to use for tests. The workspace name format is still the default jenkins-${NODE_NAME}-${JOB_NAME}-${EXECUTOR_NUMBER}.
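As a rough illustration of how that default client-name template behaves, the substitution can be sketched as below (the real expansion happens inside the plugin's Java code; this Python sketch is only for illustration):

```python
# Sketch of ${VAR}-style template expansion, as the p4-plugin's default
# client-name format would behave per executor. Illustrative only.
import re

def expand(fmt, env):
    """Replace ${VAR} tokens in fmt with values from env."""
    return re.sub(r"\$\{(\w+)\}", lambda m: env[m.group(1)], fmt)

fmt = "jenkins-${NODE_NAME}-${JOB_NAME}-${EXECUTOR_NUMBER}"
for executor in range(3):
    env = {"NODE_NAME": "master",
           "JOB_NAME": "testMultibranch-testStream",
           "EXECUTOR_NUMBER": str(executor)}
    print(expand(fmt, env))
# Each concurrent build should get its own client name (…-0, …-1, …-2);
# the bug is that EXECUTOR_NUMBER expands to '0' for all of them when a
# subdirectory checkout is used.
```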

I have been able to reproduce it by starting several concurrent builds on the same node (I tried both the master node and a slave node) with the following Jenkinsfiles:

 

pipeline
{
  agent any
  options
  {
    checkoutToSubdirectory('test')
  }
  stages
  {
    stage('testStage')
    {
      steps
      {
        script
        {
          sh "echo ${EXECUTOR_NUMBER}"
        }
      }
    }
  }
}

 

pipeline
{ 
  agent any
  options
  {
    skipDefaultCheckout(true)
  }
  stages
  {
    stage('testStage')
    {
      steps
      { 
        script
        { 
          dir('test')
          {
            checkout scm
            sh "echo ${EXECUTOR_NUMBER}"
          }
        }
      }
    }
  }
}

Example output of three concurrent builds:

P4 Task: establishing connection.
... server: helixdocker_build.helix_1:1666
... node: e75d28131ce5
(p4):cmd:... p4 where /var/jenkins_home/workspace/testMultibranch_testStream%403/test/Jenkinsfil___
p4 where /var/jenkins_home/workspace/testMultibranch_testStream%403/test/Jenkinsfile(p4):stop:5
Building on Node: master
(p4):cmd:... p4 client -o jenkins-master-testMultibranch-testStream-0
p4 client -o jenkins-master-testMultibranch-testStream-0
P4 Task: establishing connection.
... server: helixdocker_build.helix_1:1666
... node: e75d28131ce5
(p4):cmd:... p4 where /var/jenkins_home/workspace/testMultibranch_testStream%402/test/Jenkinsfil___
p4 where /var/jenkins_home/workspace/testMultibranch_testStream%402/test/Jenkinsfile(p4):stop:5
Building on Node: master
(p4):cmd:... p4 client -o jenkins-master-testMultibranch-testStream-0
p4 client -o jenkins-master-testMultibranch-testStream-0
P4 Task: establishing connection.
... server: helixdocker_build.helix_1:1666
... node: e75d28131ce5
(p4):cmd:... p4 where /var/jenkins_home/workspace/testMultibranch_testStream/test/Jenkinsfile
p4 where /var/jenkins_home/workspace/testMultibranch_testStream/test/Jenkinsfile(p4):stop:5
Building on Node: master
(p4):cmd:... p4 client -o jenkins-master-testMultibranch-testStream-0
p4 client -o jenkins-master-testMultibranch-testStream-0

 

 

This message was sent by Atlassian Jira (v7.13.6#713006-sha1:cc4451f)

kwirth@perforce.com (JIRA)
Sep 23, 2019, 10:30 AM
to jenkinsc...@googlegroups.com
Karl Wirth commented on Bug JENKINS-59484
 
Re: EXECUTOR_NUMBER is always expanded to '0' in workspace name when checking out with Perforce to a subdirectory

Hi Matthew Brunton - Please check on the Jenkins slave and master whether an '@<executor number>' or '@script' directory is being created under 'workspaces'. I'm interested to know whether '@' or a different delimiter is being used.

mbrunton@eidosmontreal.com (JIRA)
Sep 23, 2019, 12:35 PM
to jenkinsc...@googlegroups.com

I cleaned the workspaces and then hit 'build' 4 times quickly. Below are the resulting folders:

drwxr-xr-x  3 jenkins jenkins 4096 Sep 23 16:29 testMultibranch_testStream@3
drwxr-xr-x  2 jenkins jenkins 4096 Sep 23 16:29 testMultibranch_testStream@3@tmp
drwxr-xr-x  3 jenkins jenkins 4096 Sep 23 16:29 testMultibranch_testStream@4
drwxr-xr-x  2 jenkins jenkins 4096 Sep 23 16:29 testMultibranch_testStream@4@tmp
-rw-r--r--  1 jenkins jenkins   54 Sep 23 16:29 workspaces.txt

The two builds that should have used testMultibranch_testStream and testMultibranch_testStream@2 failed with an error:

ERROR: P4: Task Exception: hudson.AbortException: P4JAVA: Error(s):
Path '/var/jenkins_home/workspace/testMultibranch_testStream/test/...' is not under client's root '/var/jenkins_home/workspace/testMultibranch_testStream%403/test'.

kwirth@perforce.com (JIRA)
Sep 30, 2019, 11:10 AM
to jenkinsc...@googlegroups.com
Karl Wirth commented on Bug JENKINS-59484
 
Re: EXECUTOR_NUMBER is always expanded to '0' in workspace name when checking out with Perforce to a subdirectory

Hi Matthew Brunton - Thanks. I was easily able to reproduce this with the Jenkinsfiles you provided by spamming 'Build Now', so I will raise this with the developers.

 

Example console output from a job that gets the Jenkinsfile from SCM (necessary):
Started by user unknown or anonymous
Obtained Jenkinsfile from p4-JenkinsMaster-//depot/JENKINS-59484/... //jenkins-${NODE_NAME}${JOB_NAME}${EXECUTOR_NUMBER}/...
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] node
Running on Jenkins in /var/lib/jenkins/workspace/JENKINS-59484-ExecutorNumber@3
[Pipeline] {
[Pipeline] stage
[Pipeline] { (testStage)
[Pipeline] script
[Pipeline] {
[Pipeline] dir
Running in /var/lib/jenkins/workspace/JENKINS-59484-ExecutorNumber@3/test
[Pipeline] {
[Pipeline] checkout
... p4 client -o jenkins-master-JENKINS-59484-ExecutorNumber-0
... p4 info
... p4 info
... p4 client -o jenkins-master-JENKINS-59484-ExecutorNumber-0
... p4 client -i
... View:
... p4 counter change
... p4 changes -m1 -ssubmitted //jenkins-master-JENKINS-59484-ExecutorNumber-0/...
... p4 counter change
... p4 changes -m1 -ssubmitted //jenkins-master-JENKINS-59484-ExecutorNumber-0/...@978,___
... p4 repos -C
P4: builds: 1978
... p4 client -o jenkins-master-JENKINS-59484-ExecutorNumber-0
... p4 info
... p4 info
... p4 client -o jenkins-master-JENKINS-59484-ExecutorNumber-0
... p4 client -i
... View:
P4 Task: establishing connection.
... server: vm-kwirth-swarm182-xenial:1666
... node: vm-kwirth-swarm182-xenial
... p4 where /var/lib/jenkins/workspace/JENKINS-59484-ExecutorNumber%403/test/Jenkinsfi___
Building on Node: master
... p4 client -o jenkins-master-JENKINS-59484-ExecutorNumber-0
... p4 info
... p4 info
... p4 client -o jenkins-master-JENKINS-59484-ExecutorNumber-0
... p4 client -i
... View:
P4 Task: establishing connection.
... server: vm-kwirth-swarm182-xenial:1666
... node: vm-kwirth-swarm182-xenial

P4 Task: reverting all pending and shelved revisions.
... p4 revert /var/lib/jenkins/workspace/JENKINS-59484-ExecutorNumber%403/test/...
... rm [abandoned files]
duration: (14ms)

P4 Task: cleaning workspace to match have list.
... p4 reconcile -f -w /var/lib/jenkins/workspace/JENKINS-59484-ExecutorNumber%403/test___
duration: 0m 3s

P4 Task: syncing files at change: 1978
... p4 sync /var/lib/jenkins/workspace/JENKINS-59484-ExecutorNumber%403/test/...@1978
P4 Task: attempt: 1
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: P4: Task Exception: com.perforce.p4java.exception.P4JavaException: com.perforce.p4java.exception.P4JavaException: hudson.AbortException: P4JAVA: Error(s):
Path '/var/lib/jenkins/workspace/JENKINS-59484-ExecutorNumber%403/test/...' is not under client's root '/var/lib/jenkins/workspace/JENKINS-59484-ExecutorNumber%404/test'.

Finished: FAILURE
 

 

 


kwirth@perforce.com (JIRA)
Oct 3, 2019, 10:46 AM
to jenkinsc...@googlegroups.com
Karl Wirth commented on Bug JENKINS-59484
 
Re: EXECUTOR_NUMBER is always expanded to '0' in workspace name when checking out with Perforce to a subdirectory

Have met with the developers:

Concurrency bug = JENKINS-58119

Sub-directory support is an enhancement request. To work around JENKINS-48882 we have to look at the directory name to figure out which executor we are on. Using 'dir()' breaks this check, hence '0' every time. The check will need a redesign so that it works at lower directory levels, and it will need to cope with both Windows and Linux paths.
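A rough sketch of the check described there: the executor number is inferred from an '@N' suffix on the workspace directory name, so a checkout under dir('test') hides the suffix and the inference falls back to '0'. The function name and logic below are illustrative assumptions, not the plugin's actual Java code:

```python
# Hypothetical stand-in for the JENKINS-48882 workaround: infer the
# executor number from an '@N' suffix on the last path component.
import re

def executor_from_workspace(path):
    """Return N from a trailing '@N' on the final path component, else 0."""
    leaf = path.rstrip("/").rsplit("/", 1)[-1]
    m = re.search(r"@(\d+)$", leaf)
    return int(m.group(1)) if m else 0

print(executor_from_workspace("/var/jenkins_home/workspace/job@3"))       # 3
print(executor_from_workspace("/var/jenkins_home/workspace/job@3/test"))  # 0: 'test' has no '@N', so the bug appears
```

This also shows why the redesign needs to walk up the path (and handle both '/' and '\\' separators) rather than only inspecting the leaf directory.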

kwirth@perforce.com (JIRA)
Oct 3, 2019, 10:57 AM
to jenkinsc...@googlegroups.com
Karl Wirth edited a comment on Improvement JENKINS-59484
Have met with the developers:

Concurrency bug = TBD.

Sub-directory support is an enhancement request. To work around JENKINS-48882 we have to look at the directory name to figure out which executor we are on. Using 'dir()' breaks this check, hence '0' every time. The check will need a redesign so that it works at lower directory levels, and it will need to cope with both Windows and Linux paths.

msmeeth@perforce.com (JIRA)
Dec 17, 2019, 11:41 AM
to jenkinsc...@googlegroups.com
Matthew Smeeth started work on Improvement JENKINS-59484

Change By: Matthew Smeeth
Status: Open → In Progress
