[JIRA] (JENKINS-54126) Jenkinsfile not found in PR on GitHub -- Does not meet criteria


jsoref+jenkins@gmail.com (JIRA)

May 24, 2019, 4:42:04 PM
to jenkinsc...@googlegroups.com
Josh Soref updated an issue
 
Jenkins / Bug JENKINS-54126
Jenkinsfile not found in PR on GitHub -- Does not meet criteria
Change By: Josh Soref
Summary: Jenkinsfile not found in PR on GitHub -- Does not meet criteria
 

bitwiseman@gmail.com (JIRA)

May 29, 2019, 1:22:03 PM
to jenkinsc...@googlegroups.com

bitwiseman@gmail.com (JIRA)

May 29, 2019, 1:28:03 PM
to jenkinsc...@googlegroups.com
Liam Newman commented on Bug JENKINS-54126
 
Re: Jenkinsfile not found in PR on GitHub -- Does not meet criteria

Aleksey Smyrnov, Jesse Scott, Joseph Collard, Josh Soref, David Sanftenberg, Alun Edwards, Ebrahim Moshaya, Nelson Wolf
We are trying to test whether this underlying issue could be fixed or improved by moving to okhttp3. If you are willing to help, please try the patched version of the plugin shown in the description of https://issues.jenkins-ci.org/browse/JENKINS-57411. Thanks.

adam.beswick@bbc.co.uk (JIRA)

Jun 4, 2019, 9:02:05 AM
to jenkinsc...@googlegroups.com

I tried 2.5.4-rc849.b58a1bae7fce, but it did not fix the issue for me.

 

I had been running builds on the branch (without a PR up) with no problem. Then as soon as I made a PR for the branch, Jenkins lost track of the branch and PR because it could no longer see the Jenkinsfile on scan.

adam.beswick@bbc.co.uk (JIRA)

Jun 4, 2019, 9:05:04 AM
to jenkinsc...@googlegroups.com
Adam Beswick edited a comment on Bug JENKINS-54126
I tried 2.5.4-rc849.b58a1bae7fce, but it did not fix the issue for me.

 

I had been running builds on the branch (without a PR up) with no problem. Then as soon as I made a PR for the branch, Jenkins lost track of the branch and PR because it could no longer see the Jenkinsfile on scan. I had made the PR after merging master, and resolving conflicts in the Jenkinsfile.

adam.beswick@bbc.co.uk (JIRA)

Jun 4, 2019, 9:07:04 AM
to jenkinsc...@googlegroups.com
Adam Beswick edited a comment on Bug JENKINS-54126
I tried 2.5.4-rc849.b58a1bae7fce, but it did not fix the issue for me.

I had been running builds on the branch (without a PR up) with no problem. Then as soon as I made a PR for the branch, Jenkins lost track of the branch and PR because it could no longer see the Jenkinsfile on scan. I had made the PR after merging master, and resolving conflicts in the Jenkinsfile.


Deleting the cache (as above) also fixed this for me.

carpnick@gmail.com (JIRA)

Jun 21, 2019, 3:42:04 PM
to jenkinsc...@googlegroups.com

Workaround that worked for me:

Groovy config - for a container

 

//See BUG - https://issues.jenkins-ci.org/browse/JENKINS-54126
org.jenkinsci.plugins.github_branch_source.GitHubSCMSource.cacheSize=0

And

GitHubServerConfig server = new GitHubServerConfig("my_github_credential_API"); 
server.setApiUrl(githubAPIurl)
server.setClientCacheSize(0) //<--See https://issues.jenkins-ci.org/browse/JENKINS-54126. Had to disable this one too to get this to stop happening

Once both of those are configured, delete the directory:

$jenkins_home/org.jenkinsci.plugins.github_branch_source.GitHubSCMProbe.cache

 

Now that directory is not getting re-populated. Not sure why the 2nd code block was required, but passing it along, for what it's worth, to the next person.

 

carpnick@gmail.com (JIRA)

Jun 21, 2019, 3:54:04 PM
to jenkinsc...@googlegroups.com
Nick Carpenter edited a comment on Bug JENKINS-54126
Workaround that worked for me:

Groovy config - for a container

 
{code:java}
//See BUG - https://issues.jenkins-ci.org/browse/JENKINS-54126
org.jenkinsci.plugins.github_branch_source.GitHubSCMSource.cacheSize=0
{code}
And
{code:java}

GitHubServerConfig server = new GitHubServerConfig("my_github_credential_API");
server.setApiUrl(githubAPIurl)
server.setClientCacheSize(0) //<--See https://issues.jenkins-ci.org/browse/JENKINS-54126. Had to disable this one too to get this to stop happening
{code}

Once both of those are configured, delete the directory:

$jenkins_home/org.jenkinsci.plugins.github_branch_source.GitHubSCMProbe.cache

 

Now that directory is not getting re-populated. Not sure why the 2nd code block was required, but passing it along, for what it's worth, to the next person.
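
For anyone who wants to apply both halves of this workaround in one place, here is a minimal init.groovy.d sketch. It is not taken from the thread: the credential ID and API URL are the placeholders used above, and the GitHubPluginConfig registration is an assumption about the github plugin's API, so verify it against your plugin versions before relying on it.

{code:java}
// Hypothetical init.groovy.d sketch combining both settings described above.
// The credential ID and API URL are placeholders; the GitHubPluginConfig
// lookup/registration is an assumption -- check it against your plugin versions.
import org.jenkinsci.plugins.github.config.GitHubPluginConfig
import org.jenkinsci.plugins.github.config.GitHubServerConfig

// 1. Disable the GitHub Branch Source on-disk cache (JENKINS-54126 workaround).
//    If the plugin reads this property at class-load time, pass it as a -D JVM
//    option on startup instead of setting it here.
System.setProperty(
    'org.jenkinsci.plugins.github_branch_source.GitHubSCMSource.cacheSize', '0')

// 2. Also disable the client-side cache on the GitHub server config.
def server = new GitHubServerConfig('my_github_credential_API') // placeholder credential ID
server.setApiUrl('https://api.github.com')                      // or your GitHub Enterprise API URL
server.setClientCacheSize(0)

def config = GitHubPluginConfig.all().get(GitHubPluginConfig)
config.setConfigs([server])
config.save()
{code}

As in the comment above, the GitHubSCMProbe.cache directory still needs to be deleted once after these settings are in place.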

 

tliddle30@gmail.com (JIRA)

Jul 15, 2019, 9:11:04 AM
to jenkinsc...@googlegroups.com

Hello All - 

I had this same issue where the Jenkinsfile was not being discovered during the scan. After deleting the specified cache file and re-scanning, it worked perfectly.

bitwiseman@gmail.com (JIRA)

Jul 16, 2019, 11:48:03 AM
to jenkinsc...@googlegroups.com

Thomas Liddle
It is good to know the workaround still works. Thanks!

sudheermg@yahoo.com (JIRA)

Jul 22, 2019, 3:52:05 PM
to jenkinsc...@googlegroups.com

This issue occurred on one of our branches today. On logging in to the Jenkins server I noticed the entire branch was missing under the jobs folder. We use a Multibranch Pipeline project and Jenkins 2.164.2. This is so annoying, as we lost the entire history of the PRs that went to this branch. We map build numbers to PRs to do feature testing, and now we don't have those details.

jon_cormier@yahoo.com (JIRA)

Aug 20, 2019, 9:04:05 AM
to jenkinsc...@googlegroups.com

We encountered this problem today, we have:
Jenkins v2.176.2
Branch API Plugin v2.5.4
GitHub Branch Source v2.5.5
Multijob plugin v 1.32
Pipeline: Multibranch 2.21

Deleting the files under $JENKINS_HOME/org.jenkinsci.plugins.github_branch_source.GitHubSCMProbe.cache resolved the problem. This isn't a viable long-term solution, though.

jayache80@gmail.com (JIRA)

Aug 27, 2019, 10:31:04 PM
to jenkinsc...@googlegroups.com
Jay Ache commented on Bug JENKINS-54126

Steps to reproduce:
1. Create a branch testbranch that is one commit behind the tip of master (or whatever branch it is to be merged into).
2. Make a commit to testbranch that will cause a conflict with the latest on master.
3. In GitHub, create a pull-request for testbranch to be merged into master. (It will warn you that it can't be automatically merged, but "don't worry, you can still create the pull-request".) Jenkins Multibranch Pipeline will refuse to create a build for both the branch and the PR.
4. Rebase testbranch off the latest on master, resolve the conflicts, and git push origin testbranch --force.
5. Jenkins Multibranch Pipeline will catch that the branch changed and build the branch; however, it still doesn't detect the pull-request and refuses to create a build for it.

Closing and re-opening the pull-request doesn't help.
Closing the pull-request and opening a new one (for the same, conflict-free branch) does work (but is obviously not ideal).
rm -rf $JENKINS_HOME/org.jenkinsci.plugins.github_branch_source.GitHubSCMProbe.cache/* does work.

I'd also like to state that there indeed was a Jenkinsfile on testbranch, and that the trigger of this bug is when there's a merge conflict at the time the pull-request is created.

jayache80@gmail.com (JIRA)

Aug 27, 2019, 10:35:14 PM
to jenkinsc...@googlegroups.com
Jay Ache edited a comment on Bug JENKINS-54126
Steps to reproduce:
1. Create a branch {{testbranch}} that is one commit behind the tip of {{master}} (or whatever branch it is to be merged into).
2. Make a commit to {{testbranch}} that will cause a conflict with the latest on {{master}}.
3. In GitHub, create a pull-request for {{testbranch}} to be merged into {{master}}. (It will warn you that it can't be automatically merged, but "don't worry, you can still create the pull-request".) Jenkins Multibranch Pipeline will refuse to create a build for both the branch and the PR.
4. Rebase {{testbranch}} off the latest on {{master}}, resolve the conflicts, and {{git push origin testbranch --force}}.
5. Jenkins Multibranch Pipeline will catch that the branch changed and build the branch; however, it still doesn't detect the pull-request and refuses to create a build for it.

Closing and re-opening the pull-request doesn't help.
Closing the pull-request and opening a new one (for the same, conflict-free branch) does work (but is obviously not ideal).
{{rm -rf $JENKINS_HOME/org.jenkinsci.plugins.github_branch_source.GitHubSCMProbe.cache/*}} does work.

I'd also like to state that there indeed was a Jenkinsfile on {{testbranch}}, and that the trigger of this bug is when there's a merge conflict _at the time the pull-request is created_.

Jenkins 2.176.2
GitHub Source Plugin 2.5.6
Pipeline: Multibranch 2.21

jordanjennings@gmail.com (JIRA)

Sep 6, 2019, 12:28:05 PM
to jenkinsc...@googlegroups.com

We're seeing this all the time on PRs that use the draft PR feature in GitHub. I found a related issue reported for it already:

https://issues.jenkins-ci.org/browse/JENKINS-57206

alli@allir.org (JIRA)

Oct 7, 2019, 1:41:04 PM
to jenkinsc...@googlegroups.com

Seeing this happen still,

Jenkins 2.190.1
Github Branch Source Plugin. 2.5.8
Pipeline: Multibranch: 2.21

This has been happening from time to time for us and now again today.

I did notice in a related/linked issue, https://issues.jenkins-ci.org/browse/JENKINS-57206, that this might be related to the webhook from GitHub containing:
"mergeable_state": "unknown",
I can confirm that is what was sent in our case. To recover, we had to delete the cache at `$JENKINS_HOME/org.jenkinsci.plugins.github_branch_source.GitHubSCMProbe.cache/*`, as it would not recover by re-scanning nor by re-triggering the webhook.


bitwiseman@gmail.com (JIRA)

Oct 7, 2019, 10:48:05 PM
to jenkinsc...@googlegroups.com

Jordan Jennings - That is a completely separate issue. 

Jay Ache Jon Cormier Aðalsteinn Rúnarsson 

Would you be willing to try the OkHttp3 update in JENKINS-57411? It will not fix the issue while it is happening, but we are hoping it may prevent or reduce the occurrences; we need more people who have actually seen the issue to try the fix.

alli@allir.org (JIRA)

Oct 11, 2019, 5:50:05 AM
to jenkinsc...@googlegroups.com

Hey, Liam Newman

This is happening very rarely for us, but it is annoying when it happens.
I can see about running that OkHttp3 update; it seems to be a patch version behind, though. Are there any drawbacks to installing the OkHttp3 version?

alli@allir.org (JIRA)

Oct 11, 2019, 5:57:06 AM
to jenkinsc...@googlegroups.com
Hey, [~bitwiseman]

This is happening very rarely for us, but it is annoying when it happens :P

I can see about running that OkHttp3 update; it seems to be a patch version behind, though. Are there any drawbacks to installing the OkHttp3 version?


 

As a side note, it might be related: sometimes when we add a GitHub repo and the credentials provided in Jenkins don't have access, we seem to run into a caching issue as well.

So we add the Jenkins user to the GitHub repo with Admin access, but rescanning the pipeline still fails with an invalid-credentials error. Also, the "Validate" function in the job configuration will output `Error: Credentials Ok`.
This can be worked around by changing the case of any letter in the repo URL in the job configuration (for example, change `https://github.com/user/repo` to `https://github.com/User/repo`).

I haven't tried deleting the GitHub Branch Source cache, but I'm guessing that would also solve the issue; I will try that next time instead of the above-mentioned workaround.

jimklimov@gmail.com (JIRA)

Nov 27, 2019, 8:57:05 PM
to jenkinsc...@googlegroups.com

+1 for the issue that has been annoying for so long now
Yes, we do use GitHub caching; we advocated for it to appear and hope it stays

So I set out digging in the data and found that cached HTTP-404s in *.0 files correlate with very short *.1 files (the compact error message from the GitHub REST API), so I selected those to look deeper:
````
:; find . -name '*.1' -size 1 | sed -e 's,^./,,' -e s',.1$,,' | while read F ; do egrep 'HTTP.*404' "$F.0" >&2 && echo "=== $F" && head -1 "$F.0" && ls -la "$F"* ; done
````

For reasons unknown, however, the cached response for some of the URLs is an HTTP 404 even with valid JSON in the (gzipped) `hashstring.1` file:
````

{"message":"No commit found for the ref refs/heads/4.2.0-FTY","documentation_url":"https://developer.github.com/v3/repos/contents/"}

````

and the corresponding `hashstring.0` file looks like:
````
:; cat fbe7227813e6f1a6bbb2f1e5202a84a2.0

https://api.github.com/repos/42ity/libzmq/contents/?ref=refs%2Fheads%2F4.2.0-FTY
GET
1
Authorization: Basic NDJpdHktY2k6NjA5MDk2YTVmNzNhNTc1YzE1OWYxZjI3NDJlZmI1YjhiMTQzZmIzMw==
HTTP/1.1 404 Not Found
31
X-OAuth-Scopes: admin:repo_hook, public_repo, repo:status, repo_deployment
X-Accepted-OAuth-Scopes:
X-GitHub-Media-Type: github.v3; format=json
Content-Encoding: gzip
Transfer-Encoding: chunked
Connection: keep-alive
Content-Type: application/octet-stream
X-Cache: MISS from thunderbolt.localdomain
X-Cache-Lookup: MISS from thunderbolt.localdomain:8080
Via: 1.1 thunderbolt.localdomain (squid/3.4.4)
Server: GitHub.com
Date: Thu, 28 Nov 2019 00:41:22 GMT
Status: 304 Not Modified
X-RateLimit-Limit: 5000
X-RateLimit-Remaining: 5000
X-RateLimit-Reset: 1574905280
Cache-Control: private, max-age=60, s-maxage=60
Vary: Accept, Authorization, Cookie, X-GitHub-OTP
ETag: "2513f4bbc2abb8b63adbec8336a82810a4fb5dc5"
Last-Modified: Wed, 05 Dec 2018 10:54:24 GMT
Access-Control-Expose-Headers: ETag, Link, Location, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval, X-GitHub-Media-Type
Access-Control-Allow-Origin: *
Strict-Transport-Security: max-age=31536000; includeSubdomains; preload
X-Frame-Options: deny
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Referrer-Policy: origin-when-cross-origin, strict-origin-when-cross-origin
Content-Security-Policy: default-src 'none'
X-GitHub-Request-Id: F066:2FC4:FE494:24E933:5DDF17B1
OkHttp-Sent-Millis: 1574901681913
OkHttp-Received-Millis: 1574901682110

TLS_RSA_WITH_AES_128_GCM_SHA256
2
MIIECDCCAvCgAwIBAgIUEG8XFkmTLxiL4iPSXqLddY7e6AswDQYJKoZIhvcNAQEFBQAwga0xCzAJBgNVBAYTAkNaMRcwFQYDVQQIDA5QcmFndWUgc3VidXJiczEQMA4GA1UEBwwHUm96dG9reTENMAsGA1UECgwERUVJQzERMA8GA1UECwwIQklPUyBMQUIxJDAiBgNVBAMMG3RodW5kZXJib2x0LnJvei5sYWIuZXRuLmNvbTErMCkGCSqGSIb3DQEJARYcRWF0b25JUENPcGVuc291cmNlQEVhdG9uLmNvbTAeFw0xOTA3MDgwMDAwMDBaFw0yMDA3MTYxMjAwMDBaMGgxCzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpDYWxpZm9ybmlhMRYwFAYDVQQHEw1TYW4gRnJhbmNpc2NvMRUwEwYDVQQKEwxHaXRIdWIsIEluYy4xFTATBgNVBAMMDCouZ2l0aHViLmNvbTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAKIFH+JTppW1pvbrqnLU1SCYOsFsI6vdoL66M/497v413h1TOEwGWEo1wvZq3YhD65VSlxrsEj7xGd+ZUy2/mzRh2XmGRolJUWd/XKCQ+lJukRLX3BYhRBXfGK9Njv/afR1OIs96A4dTZA7PpPwC5Gvk34iTcJe4gilud//3UqD55A0jk+uEwQqosAImeGQg4Ayqo3K5rR+NhF8NnR7kXT1Cijk6jySbgX5Lhu8FPu7LdiPntxjuvFNJNaRy+6t4PxHJ1iRRlDdsVHyZMcZGb8klafrKsr7kLBWSMKiVaXTdlNc26bUOctH+LySlZB6Q7LgSec3MBqXZBFk0AzfwxPcCAwEAAaNkMGIwIwYDVR0RBBwwGoIMKi5naXRodWIuY29tggpnaXRodWIuY29tMA4GA1UdDwEB/wQEAwIFoDAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwDAYDVR0TAQH/BAIwADANBgkqhkiG9w0BAQUFAAOCAQEAocIF+SVNlLFzWv0A/OUu4TG+aRBdzplMrF6Gy8JxwBSp22SB1PD2H71R5bi4U7UA3vgnpLbyg283XhZndNern1rIf49XXTqFbPC1xcZi85NcYc6xE18pnO0GQRaVgple2MOZXrn32FPgV2Zn/5XxGlQU1eL8leLc8tvMZkokmuBWRkuvCkx7xM5YMSAo4lRsL6zqzio/RLTOqWP1d6qSsGsf3Zc4HJ5RUTeA2QnyO1TRVvO+8bo5rQUHBOVmYhc006zs35LsjaUhG/6R1POZW2OS55U8ArQgLE/dZZV9mNJsTdd2hefv3v0+/whB+Y3stiO7zDMVFIOoHEd0+cUfGg==
MIIELzCCAxegAwIBAgIJAOz23xAU+F0TMA0GCSqGSIb3DQEBCwUAMIGtMQswCQYDVQQGEwJDWjEXMBUGA1UECAwOUHJhZ3VlIHN1YnVyYnMxEDAOBgNVBAcMB1JvenRva3kxDTALBgNVBAoMBEVFSUMxETAPBgNVBAsMCEJJT1MgTEFCMSQwIgYDVQQDDBt0aHVuZGVyYm9sdC5yb3oubGFiLmV0bi5jb20xKzApBgkqhkiG9w0BCQEWHEVhdG9uSVBDT3BlbnNvdXJjZUBFYXRvbi5jb20wHhcNMTgwNDAzMTIxNzU2WhcNMjgwMzMxMTIxNzU2WjCBrTELMAkGA1UEBhMCQ1oxFzAVBgNVBAgMDlByYWd1ZSBzdWJ1cmJzMRAwDgYDVQQHDAdSb3p0b2t5MQ0wCwYDVQQKDARFRUlDMREwDwYDVQQLDAhCSU9TIExBQjEkMCIGA1UEAwwbdGh1bmRlcmJvbHQucm96LmxhYi5ldG4uY29tMSswKQYJKoZIhvcNAQkBFhxFYXRvbklQQ09wZW5zb3VyY2VARWF0b24uY29tMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAogUf4lOmlbWm9uuqctTVIJg6wWwjq92gvroz/j3u/jXeHVM4TAZYSjXC9mrdiEPrlVKXGuwSPvEZ35lTLb+bNGHZeYZGiUlRZ39coJD6Um6REtfcFiFEFd8Yr02O/9p9HU4iz3oDh1NkDs+k/ALka+TfiJNwl7iCKW53//dSoPnkDSOT64TBCqiwAiZ4ZCDgDKqjcrmtH42EXw2dHuRdPUKKOTqPJJuBfkuG7wU+7st2I+e3GO68U0k1pHL7q3g/EcnWJFGUN2xUfJkxxkZvySVp+sqyvuQsFZIwqJVpdN2U1zbptQ5y0f4vJKVkHpDsuBJ5zcwGpdkEWTQDN/DE9wIDAQABo1AwTjAdBgNVHQ4EFgQUAf/vfDxEB9kv3Cfo9fb3ikvyWNswHwYDVR0jBBgwFoAUAf/vfDxEB9kv3Cfo9fb3ikvyWNswDAYDVR0TBAUwAwEB/zANBgkqhkiG9w0BAQsFAAOCAQEAlwBAM+b+mxtzgP+Q5AFWzLqj2TwSWXERGNnZQFDVeoZXb2y7UqaAf+Dz8WvTOrn51/fE5jsyqYHUCBXucbFJIuFx4G7vhsspcraIgenTGoP5N4L2UamrEkrqBl1CkYVhP2aykdA9G2Tu/61/rHMNycuLCf/CrZA54QlVQ8M8KtAQo+CEKcGeDBabP4TOtWvPO7ScM9kj5vRTiwy0DaVIL2VaNWLsdqT9tQ8e01wB1CRtjBFb1lhr3zMT0wXF8gAA9zcL6h1/1yiD5lNFKYUTKtsAuLpNb51lUq1k8eshyqiCHMrSm9/nj4L1WcWSiiR4MxvU2DTGUmwrKJ6Z3tf1Xw==
0
````

It seems that a large portion of such files appeared Jul 22 between 15:45-16:30 UTC, so maybe there was a GitHub outage at that time... there were a few this year. The few other short files apparently point to scans/builds of recently merged PRs, so the ephemeral branch really is not there.

For reasons unknown, the "Date:" timestamp in the .0 header file is fresh, probably from the last scan; the result and content on disk remain unchanged. Manually submitted requests through the same proxy do return the expected contents of the Git branch (wrapped in GitHub's REST API JSON markup).
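
Building on the cache layout described above, here is a minimal Groovy sketch (for the script console) that purges only the cache entries whose `<hash>.0` header file records a 404, instead of wiping the whole directory as the earlier comments do. It is not from the original comment: everything beyond the directory name and the 404 match is an assumption, so run it against a copy first, or with Jenkins stopped, since OkHttp also keeps a journal file in this directory.
````
// Hypothetical cleanup sketch based on the cache layout described above:
// each entry is a <hash>.0 header/metadata file plus a <hash>.1 body file.
// Assumes JENKINS_HOME is set in the environment; adjust the path otherwise.
def cacheDir = new File(System.getenv('JENKINS_HOME'),
        'org.jenkinsci.plugins.github_branch_source.GitHubSCMProbe.cache')

cacheDir.eachFileMatch(~/.*\.0/) { header ->
    if (header.text =~ /HTTP.*404/) {                            // same match as the egrep above
        // The first line of the .0 file is the cached request URL (what "head -1" printed above).
        println "Purging cached 404 for: ${header.withReader { it.readLine() }}"
        header.delete()
        new File(cacheDir, header.name[0..-3] + '.1').delete()   // matching body file, if present
    }
}
````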

I previously tried forcing the job configs to be not-disabled (via the on-disk XMLs and a reload of the Jenkins configuration); this got the jobs no longer marked with gray balls in the dashboard... but then they were re-marked, probably due to this cache issue. For our OrgFolders producing MultiBranch pipelines, the half-successful magic looked like this:
````
:; for D in /var/lib/jenkins/jobs/*/jobs ; do ( cd "$D" && for F in */branches/*/config.xml ; do sed 's,<disabled>true</disabled>,<disabled>false</disabled>,' -i "$F" ; done ); done
````

jimklimov@gmail.com (JIRA)

Nov 27, 2019, 9:03:07 PM
to jenkinsc...@googlegroups.com
Jim Klimov edited a comment on Bug JENKINS-54126
+1 for the issue that has been annoying for so long now
Yes, we do use GitHub caching; we advocated for it to appear and hope it stays :) With the poor internet uplink we have, with GitHub REST API quotas for uncached requests, and with no possibility to get hooks (thus requiring polling), our farm couldn't really work without it.

So I set out digging in the data and found that cached HTTP-404s in \*.0 files correlate with very short \*.1 files (the compact error message from the GitHub REST API), so I selected those to look deeper:
{code}
:; find . -name '*.1' -size 1 | sed -e 's,^./,,' -e s',.1$,,' | while read F ; do egrep 'HTTP.*404' "$F.0" >&2 && echo "=== $F" && head -1 "$F.0" && ls -la "$F"* ; done
{code}

For reasons unknown, however, the cached response for some of the URLs is an HTTP 404 even with valid JSON in the (gzipped) `hashstring.1` file:
{code}
{"message":"No commit found for the ref refs/heads/4.2.0-FTY","documentation_url":"https://developer.github.com/v3/repos/contents/"}
{code}

and the corresponding `hashstring.0` file looks like:
{code}
{code}

It seems that a large portion of such files appeared Jul 22 between 15:45-16:30 UTC, so maybe there was a GitHub outage at that time... there were a few this year. The few other short files apparently point to scans/builds of recently merged PRs, so the ephemeral branch really is not there.

For reasons unknown, the "Date:" timestamp in the .0 header file is fresh, probably from the last scan; the result and content on disk remain unchanged. Manually submitted requests through the same proxy do return the expected contents of the Git branch (wrapped in GitHub's REST API JSON markup). Probably the client did submit the cached ETag, maybe with the object timestamp, and GitHub confirmed the cached value is still valid (except that, due to that hiccup, it isn't).

Possibly the sort-of-fix would be to set up an optional timeout for cached (negative-only?) responses so that eventually they are retried. Or add a forced option to branch indexing / MBP rescan / SCM polling / ... so that the manually issued request is made without the cache (for all cached replies, or only negative ones), updating the cache with real current replies as if from scratch.

I previously tried forcing the job configs to be not-disabled (via the on-disk XMLs and a reload of the Jenkins configuration); this got the jobs no longer marked with gray balls in the dashboard... but then they were re-marked, probably due to this cache issue. For our OrgFolders producing MultiBranch pipelines, the half-successful magic looked like this:
{code}
:; for D in /var/lib/jenkins/jobs/*/jobs ; do ( cd "$D" && for F in */branches/*/config.xml ; do sed 's,<disabled>true</disabled>,<disabled>false</disabled>,' -i "$F" ; done ); done
{code}

jimklimov@gmail.com (JIRA)

Nov 27, 2019, 9:03:42 PM
to jenkinsc...@googlegroups.com
Jim Klimov edited a comment on Bug JENKINS-54126
+1 for the issue that has been annoying for so long now, although it is most often seen on real branches (masters and releases of our project that really do provide a Jenkinsfile)

jimklimov@gmail.com (JIRA)

Nov 27, 2019, 9:16:06 PM
to jenkinsc...@googlegroups.com
Jim Klimov edited a comment on Bug JENKINS-54126
+1 for the issue that has been annoying for so long now, although it is most often seen on real branches (masters and releases of our project that really do provide a Jenkinsfile)
UPDATE: https://t.co/cFs8GfdpVV marks it at 15:46 ;)

For reasons unknown, the "Date:" timestamp in the .0 header file is fresh, probably from the last scan; the result and content on disk remain unchanged. Manually submitted requests through the same proxy do return the expected contents of the Git branch (wrapped in GitHub's REST API JSON markup). Probably the client did submit the cached ETag, maybe with the object timestamp, and GitHub confirmed the cached value is still valid (except that, due to that hiccup, it isn't).

Possibly the sort-of-fix would be to set up an optional timeout for cached (negative-only?) responses so that eventually they are retried. Or add a forced option to branch indexing / MBP rescan / SCM polling / ... so that the manually issued request is made without the cache (for all cached replies, or only negative ones), updating the cache with real current replies as if from scratch.

I previously tried forcing the job configs to be not-disabled (via the on-disk XMLs and a reload of the Jenkins configuration); this got the jobs no longer marked with gray balls in the dashboard... but then they were re-marked, probably due to this cache issue. For our OrgFolders producing MultiBranch pipelines, the half-successful magic looked like this:
{code}
:; for D in /var/lib/jenkins/jobs/*/jobs ; do ( cd "$D" && for F in */branches/*/config.xml ; do sed 's,<disabled>true</disabled>,<disabled>false</disabled>,' -i "$F" ; done ); done
{code}

bitwiseman@gmail.com (JIRA)

Jan 17, 2020, 8:25:15 PM
to jenkinsc...@googlegroups.com

We have determined that this issue is being caused by a bug in the GitHub API. The problem is described in https://github.com/github-api/github-api/pull/669.

The linked PR (https://github.com/github-api/github-api/pull/665) now shows a workaround that should work for all scenarios. The workaround will only execute when actually needed and will occur without the caller of the github-api library knowing about it.

I'll do the work to upgrade this dependency (already in progress) and this problem will go away. This is my top priority for the coming week.

jglick@cloudbees.com (JIRA)

Jan 18, 2020, 8:19:21 AM
to jenkinsc...@googlegroups.com

bitwiseman@gmail.com (JIRA)

Jan 18, 2020, 5:06:05 PM
to jenkinsc...@googlegroups.com

Jesse Glick Yes, I've reported it to GitHub via a support ticket. I have not heard back from them beyond an automated response.

bitwiseman@gmail.com (JIRA)

Jan 29, 2020, 5:57:09 PM
to jenkinsc...@googlegroups.com
Liam Newman updated Bug JENKINS-54126
 

The fix for this issue has been merged and will be released in the next day or two in github-branch-source v2.6.0.

If you want to try it out now:
install github-api-plugin 1.6.0
and then install the hpi from: https://repo.jenkins-ci.org/incrementals/org/jenkins-ci/plugins/github-branch-source/2.5.9-rc1028.3059575bf1cc/

Change By: Liam Newman
Status: In Progress → Fixed but Unreleased
Resolution: Fixed

raihaan.shouhell@autodesk.com (JIRA)

Jan 30, 2020, 10:56:04 PM
to jenkinsc...@googlegroups.com
Raihaan Shouhell commented on Bug JENKINS-54126
 
Re: Jenkinsfile not found in PR on GitHub -- Does not meet criteria

Liam Newman AFAICT, with just github-api-plugin 1.106 the problem should go away? Of course, it might do something less optimal.

bitwiseman@gmail.com (JIRA)

Feb 3, 2020, 5:07:08 PM
to jenkinsc...@googlegroups.com
 

Released in github-branch-source-plugin v2.6.0.

Change By: Liam Newman
Status: Fixed but Unreleased → Closed

pascal@pwiddershoven.nl (JIRA)

Feb 4, 2020, 2:30:06 AM
to jenkinsc...@googlegroups.com
Pascal Widdershoven commented on Bug JENKINS-54126
 
Re: Jenkinsfile not found in PR on GitHub -- Does not meet criteria

Thanks for fixing this, Liam Newman! It's been a long-standing frustration.
