Upon upgrading to v2.2.x we are experiencing status 401 responses with over 500 concurrent users per simulation

Ori Cohen

Jan 18, 2017, 9:31:55 AM
to Gatling User Group
Running the same simulation on Gatling v2.1.7 with the same (or even a much higher) load, on the same machine(s), we do not get these 401 responses.

The requests that get these responses are heartbeat requests sent once every 2 seconds per user.
The users are pre-authenticated, and requests are sent with OAuth from that point on.
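
For context, the heartbeat step looks roughly like the sketch below (simplified; the path and the session attribute name are placeholders, not our exact code):

import io.gatling.core.Predef._
import io.gatling.http.Predef._
import scala.concurrent.duration._

// Simplified heartbeat chain: one GET every 2 seconds per virtual user,
// authenticated with the OAuth bearer token captured during login.
val heartbeat = exec(
  http("heartbeat")
    .get("/api/heartbeat")                            // placeholder path
    .header("Authorization", "Bearer ${accessToken}") // token saved at login
    .check(status.is(200))
).pause(2.seconds)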

This is also not a result of load on the worker Linux machine; the load average on the machine is around 0.3 at 500 concurrent users.

To verify, I ran three copies of the same simulation on the same machine, as separate processes at the same time, reaching a combined 1500 concurrent users, and did not get any 401 responses.

This is also not a result of a specific test duration: I can ramp the users up much faster or much slower and still hit this problem once this scenario reaches ~510 concurrent users in the same simulation.
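
For illustration, the injection profiles we vary are along these lines (a sketch continuing the snippet above; "scn" and "httpProtocol" are placeholders for our actual scenario and protocol definitions, and the numbers are illustrative):

// Fast and slow ramps both start failing around the same ~510-user mark.
setUp(
  scn.inject(rampUsers(600) over (60.seconds))      // fast ramp
  // scn.inject(rampUsers(600) over (600.seconds))  // slow ramp, same threshold
).protocols(httpProtocol)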

This reproduces every time on both Gatling v2.2.2 and v2.2.3.

In addition, the requests that fail look exactly the same as the ones that return the expected response, and nothing in the scenario changes when we reach ~500 users that could possibly cause the backend to behave differently. Even if there were such a thing, as I said before, this works without a problem under a much higher load on Gatling v2.1.7.

We are not sure what else could have changed; possibly Gatling configuration defaults or something else internal to the new version.
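
If it is relevant, these are the kind of protocol-level settings we would compare between the two versions (a sketch only; the method names are taken from the 2.2 DSL, and the base URL and values are placeholders, not confirmed causes):

// HTTP protocol settings worth double-checking against the 2.1.7 behaviour.
val httpProtocol = http
  .baseURL("https://be1.server")    // placeholder base URL
  .acceptHeader("application/json")
  .shareConnections                 // share one connection pool across virtual users
  .maxConnectionsPerHost(10)        // otherwise a per-user connection limit applies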

Below are two example requests: the first returned the expected response, while the second (sent only a couple of seconds later) failed just after the ~510 concurrent users mark was reached.

Please help urgently.
Any assistance would be greatly appreciated.


Non failing request:

=========================
HTTP request:
headers=
accept: application/json
AUTHORIZATION: Bearer b74c97783f825b85f2d094870ef069970ce6e82cbc69bfb5d208da1c238da20b
Connection: keep-alive
Accept: */*
Content-Type: application/json; charset=UTF-8
Accept-Language: en-US,en;q=0.8,ru;q=0.6,he;q=0.4
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2939.0 Safari/537.36
Host: be1.server
X-Requested-With: XMLHttpRequest
Cookie: sessionKey=abcdefghij-klmnopqrstusessionKey=abcdefghij-klmnopqrstu; idpLastDomain=be1.server; idpLastSiteId=123456789
=========================
HTTP response:
status=
200 OK
headers= 
Content-Type: application/json
Server: WS
Set-Cookie: sessionKey=714116790-1484566265037;Version=1;Path=/api/account/123456789/session/216071500;Secure
P3P: CP="NON BUS INT NAV COM ADM CON CUR IVA IVD OTP PSA PSD TEL SAM"
Date: Mon, 16 Jan 2017 11:35:05 GMT
Transfer-Encoding: chunked
Content-Encoding: gzip

body= [...]
<<<<<<<<<<<<<<<<<<<<<<<<<


Failing request:

=========================
HTTP request:
headers=
accept: application/json
AUTHORIZATION: Bearer b74c97783f825b85f2d094870ef069970ce6e82cbc69bfb5d208da1c238da20b
Connection: keep-alive
Accept: */*
Content-Type: application/json; charset=UTF-8
Accept-Language: en-US,en;q=0.8,ru;q=0.6,he;q=0.4
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2939.0 Safari/537.36
Host: be1.server
X-Requested-With: XMLHttpRequest
Cookie: sessionKey=abcdefghij-klmnopqrstusessionKey=abcdefghij-klmnopqrstu; idpLastDomain=be1.server; idpLastSiteId=123456789
=========================
HTTP response:
status=
401 Unauthorized
headers= 
Content-Type: application/json
Server: WS
WWW-Authenticate: OAuth realm=be3.server
P3P: CP="NON BUS INT NAV COM ADM CON CUR IVA IVD OTP PSA PSD TEL SAM"
Date: Mon, 16 Jan 2017 11:35:07 GMT
Transfer-Encoding: chunked
Content-Encoding: gzip

body=
oauth_problem=token_rejected&oauth_problem_advice=the oauth_token is unacceptable to the Service Provider
<<<<<<<<<<<<<<<<<<<<<<<<<


Regards,
Ori