Is this a BUG REPORT or FEATURE REQUEST?:
/kind bug
/priority failing-test
What happened:
https://k8s-testgrid.appspot.com/sig-network-gce#gci-gce-latest-upgrade-kube-proxy-ds&width=20 has been failing since 11/13/2017. Based on the timestamps and commit hashes, the failures appear to have started after #55457 (the cherry-pick for release-1.8) merged.
The first failed run is: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/ci-kubernetes-e2e-gci-gce-latest-upgrade-kube-proxy-ds/1691
The kube-apiserver logs contain many stack traces, which looks suspicious. A sample error is below:
I1114 03:06:28.518192 5 wrap.go:42] GET /healthz: (641.99µs) 500
goroutine 1701 [running]:
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/httplog.(*respLogger).recordStatus(0xc422691340, 0x1f4)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/httplog/httplog.go:207 +0xdd
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/httplog.(*respLogger).WriteHeader(0xc422691340, 0x1f4)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/httplog/httplog.go:186 +0x35
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*baseTimeoutWriter).WriteHeader(0xc42aa74a00, 0x1f4)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:188 +0xac
net/http.Error(0x7f85bd73d5f0, 0xc429b52988, 0xc42aa8c000, 0x302, 0x1f4)
/usr/local/go/src/net/http/server.go:1930 +0xda
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/healthz.handleRootHealthz.func1(0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/healthz/healthz.go:121 +0x508
net/http.HandlerFunc.ServeHTTP(0xc427b61d20, 0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/usr/local/go/src/net/http/server.go:1918 +0x44
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux.(*pathHandler).ServeHTTP(0xc429d97700, 0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux/pathrecorder.go:241 +0x55a
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux.(*PathRecorderMux).ServeHTTP(0xc422753030, 0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux/pathrecorder.go:234 +0xa1
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x3915c88, 0xf, 0xc4216be240, 0xc422753030, 0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:161 +0x6ad
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.(*director).ServeHTTP(0xc42505dae0, 0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
<autogenerated>:1 +0x75
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:51 +0x37d
net/http.HandlerFunc.ServeHTTP(0xc42549c280, 0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/usr/local/go/src/net/http/server.go:1918 +0x44
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithMaxInFlightLimit.func1(0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/maxinflight.go:96 +0x318
net/http.HandlerFunc.ServeHTTP(0xc4238504c0, 0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/usr/local/go/src/net/http/server.go:1918 +0x44
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:49 +0x203a
net/http.HandlerFunc.ServeHTTP(0xc42549c2d0, 0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/usr/local/go/src/net/http/server.go:1918 +0x44
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAudit.func1(0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/audit.go:53 +0x74c
net/http.HandlerFunc.ServeHTTP(0xc42549c320, 0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/usr/local/go/src/net/http/server.go:1918 +0x44
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthentication.func1(0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:79 +0x2b1
net/http.HandlerFunc.ServeHTTP(0xc42549c3c0, 0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/usr/local/go/src/net/http/server.go:1918 +0x44
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/request.WithRequestContext.func1(0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/request/requestcontext.go:110 +0xcb
net/http.HandlerFunc.ServeHTTP(0xc42505db00, 0x7f85bd73d5f0, 0xc429b52988, 0xc42aa69100)
/usr/local/go/src/net/http/server.go:1918 +0x44
k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc42505dba0, 0x88b7b20, 0xc429b52988, 0xc42aa69100, 0xc426a54360)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x8d
created by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:92 +0x1ab
logging error output: "[+]ping ok\n[-]etcd failed: reason withheld\n[+]poststarthook/generic-apiserver-start-informers ok\n[+]poststarthook/start-apiextensions-informers ok\n[+]poststarthook/start-apiextensions-controllers ok\n[-]poststarthook/bootstrap-controller failed: reason withheld\n[-]poststarthook/rbac/bootstrap-roles failed: reason withheld\n[-]poststarthook/ca-registration failed: reason withheld\n[+]poststarthook/start-kube-apiserver-informers ok\n[+]poststarthook/start-kube-aggregator-informers ok\n[+]poststarthook/apiservice-registration-controller ok\n[+]poststarthook/apiservice-status-available-controller ok\n[+]poststarthook/apiservice-openapi-controller ok\n[+]poststarthook/kube-apiserver-autoregistration ok\n[-]autoregister-completion failed: reason withheld\nhealthz check failed\n"
[[curl/7.38.0] 35.188.77.130:46178]
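For context on what the aggregated output in the log above means, here is a minimal sketch (not the actual k8s.io/apiserver/pkg/server/healthz code) of how a /healthz handler can run a set of named checks and emit the `[+]name ok` / `[-]name failed: reason withheld` lines, answering 500 as long as any single check fails. The check names and the failing etcd check below are illustrative assumptions, not taken from the cluster under test.

```go
// Minimal sketch of an aggregated /healthz handler; not the k8s.io/apiserver
// implementation, just an illustration of the output format seen in the log.
package main

import (
	"bytes"
	"errors"
	"fmt"
	"net/http"
)

// HealthChecker is a named check; a non-nil error means the check failed.
type HealthChecker struct {
	Name  string
	Check func(*http.Request) error
}

// handleRootHealthz runs every check, collects one status line per check, and
// reports HTTP 500 with the aggregated output if at least one check failed.
func handleRootHealthz(checks []HealthChecker) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		var buf bytes.Buffer
		failed := false
		for _, c := range checks {
			if err := c.Check(r); err != nil {
				// The real endpoint withholds the failure reason in this summary view.
				fmt.Fprintf(&buf, "[-]%s failed: reason withheld\n", c.Name)
				failed = true
			} else {
				fmt.Fprintf(&buf, "[+]%s ok\n", c.Name)
			}
		}
		if failed {
			http.Error(w, buf.String()+"healthz check failed", http.StatusInternalServerError)
			return
		}
		fmt.Fprint(w, buf.String()+"ok")
	}
}

func main() {
	checks := []HealthChecker{
		{Name: "ping", Check: func(*http.Request) error { return nil }},
		// Hypothetical failing check, standing in for the "[-]etcd failed" line above.
		{Name: "etcd", Check: func(*http.Request) error { return errors.New("connection refused") }},
	}
	http.HandleFunc("/healthz", handleRootHealthz(checks))
	http.ListenAndServe(":8080", nil) // try: curl -v localhost:8080/healthz
}
```

With one failing check registered, this returns a 500 whose body matches the "[+]/[-]" summary in the logged error output, which is why the upgrade test's health probe (the curl client above) treats the apiserver as unhealthy even though most checks pass.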
Anything else we need to know?:
@dixudx @DirectXMan12
@kubernetes/sig-api-machinery-test-failures
cc @jpbetz
@dixudx Thanks for looking. Yeah, your cherry-pick might not be relevant. I just saw that https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/ci-kubernetes-e2e-gci-gce-latest-upgrade-kube-proxy-ds/1690 passed with the same commit.
The error is not occurring anymore.
/close
Closed #55726.