[JIRA] (JENKINS-2111) removing a job (including multibranch/org folder branches/repos) does not remove the workspace


andrew.bayer@gmail.com (JIRA)

Sep 1, 2016, 11:38:04 AM
to jenkinsc...@googlegroups.com
Andrew Bayer updated an issue
 
Jenkins / Bug JENKINS-2111
removing a job (including multibranch/org folder branches/repos) does not remove the workspace
Change By: Andrew Bayer
Summary: removing a job (including multibranch/org folder branches/repos) does not remove the workspace
Priority: Major → Blocker
 
This message was sent by Atlassian JIRA (v7.1.7#71011-sha1:2526d7c)

andrew.bayer@gmail.com (JIRA)

Sep 1, 2016, 11:40:15 AM
Andrew Bayer commented on Bug JENKINS-2111
 
Re: removing a job (including multibranch/org folder branches/repos) does not remove the workspace

From a Jesse Glick comment over on JENKINS-34177:

I think this is actually a more general core issue: Job.delete (or some associated ItemListener.onDeleted) should proactively delete any associated workspaces it can find on any connected nodes. WorkspaceCleanupThread as currently implemented is not going to find them.

cecchisandrone@gmail.com (JIRA)

Sep 2, 2016, 9:17:07 AM

We have written a script and scheduled it with cron:

#TODO handle folders with spaces
IFS=$'\n'

# find empty job directories & form the new folder structure for workspace directories
emptydirs_jobs=$(find . -type d -empty | cut -d '/' -f2-6)
emptydirs_workspace=$(find . -type d -empty | cut -d '/' -f2,4,6)

# remove the corresponding directory from workspace
for i in $emptydirs_workspace; do
  rm -rf /var/jenkins_home/workspace/$i
done

# remove empty directories from jobs
for j in $emptydirs_jobs; do
  rm -rf /var/jenkins_home/jobs/$j
done

I hope this can help you until a fix is provided.
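The script above splits on newlines, and its TODO about folders with spaces is real. A minimal space-safe sketch of the same orphan-removal idea, run here against a throwaway tree so it can be tried safely (the /var/jenkins_home layout it mimics is an assumption; names with embedded newlines are still not handled):

```shell
#!/usr/bin/env bash
# Space-safe sketch of orphan-workspace cleanup on a throwaway tree.
# Point ROOT at a real Jenkins home only after reviewing what it deletes.
set -eu
ROOT=$(mktemp -d)
mkdir -p "$ROOT/jobs/kept job" "$ROOT/workspace/kept job" "$ROOT/workspace/orphan job"

# Reading whole lines (not words) keeps directory names with spaces intact.
find "$ROOT/workspace" -mindepth 1 -maxdepth 1 -type d |
while IFS= read -r ws; do
  # A workspace is an orphan when no job directory of the same name exists.
  if [ ! -d "$ROOT/jobs/$(basename "$ws")" ]; then
    echo "removing $ws"
    rm -rf "$ws"
  fi
done
```

This covers only top-level jobs; multibranch folders nest one level deeper, which is what the cut field tricks above approximate.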


andrew.bayer@gmail.com (JIRA)

Sep 16, 2016, 7:43:05 PM

Jesse Glick Feels to me like Job.performDelete may be more of the right place to do this?

andrew.bayer@gmail.com (JIRA)

Sep 16, 2016, 7:55:02 PM

Also, blergh, finding all the workspaces for a Pipeline job is...hard. node.getWorkspaceFor isn't useful here. I think we'd need to look at every FlowNode on each WorkflowRun of a given WorkflowJob to see if it has a WorkspaceAction, and then act on those WorkspaceActions...which is demented. Oy.

jspiewak@gmail.com (JIRA)

Sep 16, 2016, 8:05:05 PM

FWIW, I wrote this today:

Closure cleanMultiBranchWorkspaces
cleanMultiBranchWorkspaces = { item ->
  if (item instanceof com.cloudbees.hudson.plugins.folder.Folder) {
    if (item.name == 'archive') {
      println "Skipping $item"
    } else {
      println "Found folder $item, checking its items"
      item.items.each { cleanMultiBranchWorkspaces(it) }
    }
  } else if (item instanceof org.jenkinsci.plugins.workflow.multibranch.WorkflowMultiBranchProject) {
    println "Found a multi-branch workflow $item"

    def workspaces = jenkins.model.Jenkins.instance.nodes.collect { it.getWorkspaceFor(item).listDirectories() }.flatten().findAll { it != null }

    def activeBranches = item.items.name
    println "Active branches = $activeBranches"

    if (workspaces) {
      workspaces.removeAll { workspace ->
        activeBranches.any { workspace.name.startsWith(it) }
      }

      workspaces.each {
        println "Removing workspace $it.name on ${it.toComputer().name} without active branch"
        it.deleteRecursive()   // actually remove the stale directory
      }
    }
  }
}

jenkins.model.Jenkins.instance.items.each { cleanMultiBranchWorkspaces(it) }

Need to switch the startsWith to a regex for more exact matching.

Created a job with the Groovy plugin executing this as a system script.

jglick@cloudbees.com (JIRA)

Sep 17, 2016, 11:17:05 AM

I will try to solve this for branch projects as part of JENKINS-34564, since these are especially likely to be created and discarded rapidly.

I think a general implementation need not really be that difficult. Each node (master, agent) should just pay attention to when a workspace is used. (If in core, via WorkspaceList; otherwise, perhaps via WorkspaceListener.) Then record a workspaces.xml, a sibling of workspace/, with a list of records: relative workspace path, Item.fullName, timestamp. Periodically, or when an agent comes online, etc., iterate the list and check for jobs which no longer exist under that name (covers JENKINS-22240), or workspaces which have not been used in a long time. If in a plugin (JENKINS-26471) you could get fancy and modify behavior according to free disk space, etc.
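This bookkeeping scheme can be sketched in shell, on a throwaway tree and with a flat TSV index standing in for the proposed workspaces.xml (the file name workspaces.tsv and its layout are illustrative assumptions):

```shell
#!/usr/bin/env bash
# Sketch of the periodic sweep: each index record is
# "relative workspace path <TAB> job full name <TAB> last-used epoch".
set -eu
ROOT=$(mktemp -d)
mkdir -p "$ROOT/jobs/live-job" "$ROOT/workspace/ws-live" "$ROOT/workspace/ws-dead"
now=$(date +%s)
printf 'ws-live\tlive-job\t%s\nws-dead\tdeleted-job\t%s\n' "$now" "$now" > "$ROOT/workspaces.tsv"

# Remove any recorded workspace whose job no longer exists under that name.
# The timestamp column would drive a "not used in a long time" rule the same way.
while IFS=$'\t' read -r ws job ts; do
  if [ ! -d "$ROOT/jobs/$job" ]; then
    echo "job '$job' is gone; removing workspace $ws"
    rm -rf "$ROOT/workspace/$ws"
  fi
done < "$ROOT/workspaces.tsv"
```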

yury.zaytsev@traveltainment.de (JIRA)

Sep 27, 2016, 4:38:02 AM

So was it addressed in JENKINS-34564? I've had a look at the commit mentioned in Jira, but I couldn't easily see any code pertaining to the deletion of the workspaces.

jglick@cloudbees.com (JIRA)

Oct 7, 2016, 2:26:03 PM

I couldn't easily see any code pertaining to the deletion of the workspaces.

here it is

mneale@cloudbees.com (JIRA)

Jan 8, 2017, 11:59:05 PM

Jesse Glick does that imply this can be closed, as a newer branch-api-plugin has a fix for this?

jenkins@michaelpporter.com (JIRA)

Jan 27, 2017, 9:09:02 AM

I'm not sure if my issue is related.

We use multibranch to build a PHP site. Part of the script creates a database based on the branch name. It would be nice to have an onDelete hook where we can include code to clean up the site when we remove the branch. Something in the Groovy script would be nice.

onDelete {
    // clean up DB and composer files.
}

I can make a new ticket if this is not the right thread for this.


sepstein@arris.com (JIRA)

Feb 6, 2017, 2:08:03 PM
Scott Epstein edited a comment on Bug JENKINS-2111
[~jglick]
Jesse, I'm new to Jenkins and this forum.  I apologize for any newbie errors in advance.

Am I seeing the problem that you resolved?

I have directories for Multi-configuration projects sticking around after the Discard Old Builds / Max # of builds to keep (15) has been exceeded.  Jobs at the bottom of the Build History (after the 15th job) disappear as new jobs complete.  The directories that are kept that I'd expect to be deleted are named as follows.

/export/build/<slave node>/sub-build/<Jenkins project>/<CONFIGURATION>/build/999033
/export/build/<slave node>/sub-build/<Jenkins project>/<CONFIGURATION>/build/999035
   where CONFIGURATION=${label}_${target}_${platform}_${type}

The 33 and 35 in the 999033 and 999035 at the end of the path match the Build History build numbers.

Do the above directories correspond to workspaces?


My Discard Old Build settings are:
  Strategy                     Log Rotation     (note: this is the only option given)
  Days to keep builds          7
  Max # of builds to keep   15

I turned on some logging.  Are the negative ones below a problem?  Is there a way for me to determine if ANYTHING is being deleted?

Feb 02, 2017 4:42:37 PM FINE hudson.tasks.LogRotator
Running the log rotation for hudson.matrix.MatrixConfiguration@6ec64ed8[<Jenkins project>/label=e,platform=a,target=s,type=d] with numToKeep=-1 daysToKeep=-1 artifactNumToKeep=-1 artifactDaysToKeep=-1
Feb 02, 2017 4:44:30 PM FINE hudson.tasks.LogRotator
Running the log rotation for hudson.matrix.MatrixConfiguration@7b748f18[<Jenkins project>/label=e,platform=a,target=c,type=d] with numToKeep=-1 daysToKeep=-1 artifactNumToKeep=-1 artifactDaysToKeep=-1

I am running 1.609. I don't see your fix for this discarder issue listed in the change log: https://jenkins.io/changelog/


Thank you for your help.

jglick@cloudbees.com (JIRA)

Feb 15, 2017, 2:21:03 PM

does that imply this can be closed as a newer branch-api-plugin has a fix for this?

No, it has a fix for the limited case that the Job is in fact a branch project, and the agent is online at the time.

Michael Porter see JENKINS-40606.

Scott Epstein no those are build directories, not workspaces, so unrelated to this ticket.

vivek.pandey@gmail.com (JIRA)

Feb 20, 2018, 11:06:05 AM
Vivek Pandey assigned an issue to rsandell
 
Change By: Vivek Pandey
Assignee: rsandell

captrespect@gmail.com (JIRA)

Mar 13, 2018, 12:57:03 PM
Jon Roberts commented on Bug JENKINS-2111
 
Re: removing a job (including multibranch/org folder branches/repos) does not remove the workspace

As a workaround for this I had to write a cron job that scans the git branches, matches them up with the workspace folders, then removes them. An option that would do this whenever the job is removed would be fantastic.
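A cron job along those lines can be sketched like this, on a throwaway tree with the branch list stubbed in (in real use it would come from something like `git ls-remote --heads`, and the %2F-style name mangling discussed elsewhere in this thread would need handling):

```shell
#!/usr/bin/env bash
# Sketch: remove workspace folders that no longer match a live git branch.
set -eu
ROOT=$(mktemp -d)
mkdir -p "$ROOT/workspace/master" "$ROOT/workspace/feature-x" "$ROOT/workspace/old-branch"
live_branches='master
feature-x'   # stubbed; a real script would query the repository

for ws in "$ROOT"/workspace/*/; do
  name=$(basename "$ws")
  # Keep the folder only if its name exactly matches a live branch.
  if ! printf '%s\n' "$live_branches" | grep -qx -- "$name"; then
    echo "no live branch for $name; removing $ws"
    rm -rf "$ws"
  fi
done
```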

jglick@cloudbees.com (JIRA)

Sep 21, 2018, 7:59:08 PM
Jesse Glick assigned an issue to Jesse Glick
 
Change By: Jesse Glick
Assignee: rsandell → Jesse Glick

jglick@cloudbees.com (JIRA)

Sep 21, 2018, 7:59:19 PM
Jesse Glick started work on Bug JENKINS-2111
 
Change By: Jesse Glick
Status: Open → In Progress

jglick@cloudbees.com (JIRA)

Oct 24, 2018, 3:01:04 PM
Jesse Glick commented on Bug JENKINS-2111
 
Re: removing a job (including multibranch/org folder branches/repos) does not remove the workspace

If anyone is interested in trying my fix for this (and other) issues, you can install this experimental build.

mneale@cloudbees.com (JIRA)

Oct 24, 2018, 3:15:03 PM

nice to see this happening!

Also, what is up with the date on this ticket: created in 2008? It foretold Pipeline-as-Code?

tyler.jeremy.smith@gmail.com (JIRA)

Oct 29, 2018, 9:05:06 AM

I tried your experimental build and it worked for the slave agents. When the apparently scheduled cleanup occurred, it removed the <name> and <name>@tmp folders from all the slave agents. However, I still have a <name>@script folder on my Jenkins master. Is that the responsibility of another component to delete?

tyler.jeremy.smith@gmail.com (JIRA)

Oct 29, 2018, 9:30:05 AM
Tyler Smith edited a comment on Bug JENKINS-2111


Edit:

Nevermind. I recently updated the Subversion plugin, which enabled lightweight checkouts and no longer creates <name>@script folders, so it is a non-issue.

jglick@cloudbees.com (JIRA)

Oct 30, 2018, 9:00:27 AM

I still have a <name>@script folder on my Jenkins master. Is that the responsibility of another component to delete?

Yes, this is created by Pipeline builds under certain SCM configurations (“heavyweight checkouts”), and is not covered by the workspace infrastructure. Probably those should be cleaned up, too, but it would be a separate patch.

itsmeshankar1@gmail.com (JIRA)

Nov 8, 2018, 4:35:06 PM
Uma Shankar commented on Bug JENKINS-2111
 
Re: removing a job (including multibranch/org folder branches/repos) does not remove the workspace

Jesse Glick Changes to Branch API show strange behavior; the screenshots referenced below were not archived.

Jenkins job: [screenshot]

Workspace folder when building on the Jenkins master: one branch looks better, but another looks the same as before (why does one branch work while the other falls back to the old behavior?).

When building on a slave it looks good, but the path is too long, and -Djenkins.branch.WorkspaceLocatorImpl.PATH_MAX=2 doesn't help now. I was able to shorten the path before; it looks like the new plugin isn't honoring the variable.

Jenkins Version  - 2.138.2

Branch API 2.0.21-rc632.1afb188ed43f

jglick@cloudbees.com (JIRA)

Nov 8, 2018, 4:46:03 PM

Uma Shankar for compatibility, existing workspaces created under the old scheme continue to be used (but their names are now tracked). The new naming policy applies only when defining a workspace for a given job on a given node for the first time.

itsmeshankar1@gmail.com (JIRA)

Nov 9, 2018, 9:28:03 AM
 
Re: removing a job (including multibranch/org folder branches/repos) does not remove the workspace

Jesse Glick I cleared up everything from my workspaces (master/slave) and I still see the same issue with folder names on the slave workspace (still adding %2F; I created a new branch to test, so there was no history for this branch).

 

Also, the Jenkins master workspace has a file called `workspace.txt`, but the slave doesn't have it.

itsmeshankar1@gmail.com (JIRA)

Nov 9, 2018, 9:41:04 AM
Uma Shankar edited a comment on Bug JENKINS-2111
A build on the master looks good (screenshots omitted).

jglick@cloudbees.com (JIRA)

Nov 9, 2018, 1:42:03 PM

And you are using a multibranch project in each case? For now, the feature is limited to branch projects by default; you can run with -Djenkins.branch.WorkspaceLocatorImpl.MODE=ENABLED to apply it to all projects.

Also check your system log for any warnings.
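For reference, the MODE flag above goes on the Jenkins JVM command line; the war path and any other options here are illustrative:

```shell
# Apply the new workspace-naming behavior to all projects, not only branch projects.
java -Djenkins.branch.WorkspaceLocatorImpl.MODE=ENABLED -jar jenkins.war
```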

jglick@cloudbees.com (JIRA)

Nov 9, 2018, 1:55:07 PM
 

rsandell just released. Arguably it should have been 2.1, but I have never been able to make sense of the version numbering schemes used in plugins historically maintained by Stephen Connolly (since we certainly are not enforcing semver).

Change By: Jesse Glick
Status: In Review → Resolved
Resolution: Fixed
Released As: 2.0.21

itsmeshankar1@gmail.com (JIRA)

Nov 9, 2018, 2:24:04 PM
Uma Shankar commented on Bug JENKINS-2111
 
Re: removing a job (including multibranch/org folder branches/repos) does not remove the workspace

All the projects are using Declarative Pipeline with multibranch. I am running Jenkins with -Djenkins.branch.WorkspaceLocatorImpl.PATH_MAX=2. I will add -Djenkins.branch.WorkspaceLocatorImpl.MODE=ENABLED and see if that helps.

jglick@cloudbees.com (JIRA)

Nov 9, 2018, 2:37:03 PM

PATH_MAX is ignored by the new implementation except for purposes of identifying folders apparently created by the old implementation. In other words, it has no effect on a node without existing workspaces.

itsmeshankar1@gmail.com (JIRA)

Nov 9, 2018, 2:46:46 PM
Uma Shankar edited a comment on Bug JENKINS-2111
That didn't help; I still see %2F in folder names on slaves (screenshot omitted).

I don't see any related info in the system log.

jglick@cloudbees.com (JIRA)

Nov 9, 2018, 3:26:10 PM

Uma Shankar no idea. Please install this update and create a logger (Manage Jenkins » System Log) on jenkins.branch.WorkspaceLocatorImpl recording at least FINER and check for messages there when you build a branch project for the first time on a given agent. Also check whether c:\Jenkins\workspace\workspaces.txt exists and, if so, what it contains.

itsmeshankar1@gmail.com (JIRA)

Nov 12, 2018, 8:48:37 AM
Uma Shankar commented on Bug JENKINS-2111
 
Re: removing a job (including multibranch/org folder branches/repos) does not remove the workspace

Jesse Glick I tried to follow your instructions. Earlier there wasn't any workspace.txt, but I see one now; however, it is empty and the folders still contain %2F.

 

Also, no logs are being generated by the jenkins.branch.WorkspaceLocatorImpl logger (note: I selected `All` to record).

jsoref+jenkins@gmail.com (JIRA)

Nov 12, 2018, 5:49:04 PM

Jesse Glick this commit (https://github.com/jenkinsci/branch-api-plugin/commit/481e4857c9a450c2904ff5188aa9076cf1db1f1c) resulted in a fairly consistent but slightly random and mind-boggling build failure.

We just upgraded to Jenkins ver. 2.151 today (and upgraded all of our plugins at the same time).

We were able to review our plugin update list and then sifted through commits. Eventually we settled on this one and fixed it by downgrading Branch API from 2.0.21 to 2.0.20, which effectively backed this change out.

We have a declarative Jenkinsfile pipeline which uses parallel in a stage... It looks approximately like this. Note that we define an agent for the outer job and for Scala Full, but not for Frontend Prod. Frontend Prod builds happily (which means it can find the npm.sh file); Scala Full fails with an error saying that it can't find the sbt.sh file (these files all exist in a single git repository). If we pin all stages to a single node, it works. We're using the branch plugin to monitor pull requests from GitHub and it's automagically building our commits based on that...

pipeline {
    agent {
        label "pipeline"
    }

    environment {
        PIPELINE_SCRIPTS="$WORKSPACE/pipeline/scripts"
        SBT_SCRIPT="$PIPELINE_SCRIPTS/sbt.sh"
        NPM_SCRIPT="$PIPELINE_SCRIPTS/npm.sh"
    }

    stages {
        stage ("Checkout") {
            steps {
                script {
                    result = sh (script: "git log -1 --pretty=format:'%an' | grep 'Jenkins'", returnStatus: true)
                    if (result != 0) {
                        echo ("Non jenkins user commit")
                    }
                }
            }
        }

        stage("Compile") {
            parallel {
                stage("Scala Full") {
                    agent {
                        label "pipeline"
                    }
                    steps {
                        sh "$SBT_SCRIPT compile"
                    }
                }
                stage("Frontend Prod") {
                    steps {
                        sh "$NPM_SCRIPT install"
                    }
                }
            }
        }
    }
}

jglick@cloudbees.com (JIRA)

Nov 13, 2018, 8:42:10 AM

Uma Shankar sorry, I have no idea what is happening on your machine. You may want to install the Support Core plugin, which captures much richer diagnostics, and either attach a generated support bundle here or send it to me privately. (If attaching publicly, I recommend using the anonymization option, unless this is just a test server with no confidential projects.)

Josh Soref I cannot even guess what might be causing your build failure. If you manage to narrow it down to a minimal, reproducible test case, please file a bug report and link to it from here.

dariusz.danilko@gmail.com (JIRA)

Dec 3, 2018, 4:51:11 AM

Josh Soref: I would expect that if you specify a nested 'pipeline' agent then the inner one should get a new workspace, which is why it does not have access to the git repo that had been cloned into the workspace of the outer agent. What are you trying to achieve with such a pipeline anyway?

dariusz.danilko@gmail.com (JIRA)

Dec 3, 2018, 4:55:03 AM
Dariusz Daniłko edited a comment on Bug JENKINS-2111

You could get access to it if you `stash`ed the git repo after checkout and then `unstash`ed it in the nested pipeline stage.

However, the pipeline does not look clean to me.
What are you trying to achieve with such a nested pipeline anyway?

jsoref+jenkins@gmail.com (JIRA)

Dec 3, 2018, 6:17:06 AM

The nested agents do get their own git clones. The ones without a specified agent run on the same node as the main pipeline. I only need to use stash to share build products (omitted because it isn't relevant to the failure).

david.aldrich@emea.nec.com (JIRA)

Feb 18, 2019, 6:08:03 AM

I see this message in our console log:

Feb 18, 2019 8:29:06 AM WARNING jenkins.branch.WorkspaceLocatorImpl getWorkspaceRoot
JENKINS-2111 path sanitization ineffective when using legacy Workspace Root Directory ‘${ITEM_ROOTDIR}/workspace’; switch to ‘${JENKINS_HOME}/workspace/${ITEM_FULL_NAME}’ as in JENKINS-8446 / JENKINS-21942

It seems harmless, but how can I fix it?

jglick@cloudbees.com (JIRA)

Feb 18, 2019, 1:22:04 PM

David Aldrich it is not harmless. See JENKINS-21942, especially my last comment of 2018-04-24. Or just stop doing builds on the master.

david.aldrich@emea.nec.com (JIRA)

Feb 20, 2019, 3:26:06 AM

Jesse Glick Thanks for your reply. I'm afraid I'm stuck on this one. I don't understand what the console message means nor where to look in order to fix it. Please will you explain what I need to do?

carsten.pfeiffer@gebit.de (JIRA)

Oct 1, 2019, 7:49:09 AM
Carsten Pfeiffer assigned an issue to Carsten Pfeiffer
 
Change By: Carsten Pfeiffer
Assignee: Jesse Glick → Carsten Pfeiffer

alexander.samoylov@gmail.com (JIRA)

Mar 3, 2020, 6:50:06 AM
Alexander Samoylov commented on Bug JENKINS-2111
 
Re: removing a job (including multibranch/org folder branches/repos) does not remove the workspace

I am using Jenkins 2.204.3 and Pipeline Plugin 2.6, and this issue is clearly reproducible for me. I delete the job with the "Delete Pipeline" button and I see all of these workspaces still present on every node:

  • <workspace> itself
  • <workspace>@1,2,3...
  • <workspace>@tmp

Should I reopen this ticket or wait for the next version of the core or plugin?
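A quick audit in the spirit of the earlier scripts can enumerate such @-suffixed leftovers before deciding what to delete; a sketch on a throwaway tree (directory names are illustrative):

```shell
#!/usr/bin/env bash
# List @-suffixed leftovers (@tmp, @script, @2, ...) beside a workspace.
set -eu
ROOT=$(mktemp -d)
mkdir -p "$ROOT/job" "$ROOT/job@tmp" "$ROOT/job@2" "$ROOT/job@script"

# Listing only; review the output before turning this into an rm -rf.
find "$ROOT" -mindepth 1 -maxdepth 1 -type d -name '*@*' | sort
```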


jglick@cloudbees.com (JIRA)

Mar 3, 2020, 10:17:05 AM

Alexander Samoylov neither. You should file a fresh issue with complete, self-contained, explicit steps to reproduce from scratch, and link to this one. (By the way, "Pipeline Plugin 2.6" is meaningless; it is just an aggregator with no code.)
