Running Cloud Taurus Test on Blazemeter Always Returns Status Code 0


tkir...@clickboardingdemo.com

Jan 31, 2020, 11:44:32 AM
to codename-taurus
I am running a Taurus Test that is wrapping a JMeter test using Blazemeter's cloud offering. Everything is working great except that when tests fail, the Taurus process is still exiting with status code 0 instead of status code 3. 

Additional details: I am running Taurus as part of an Azure DevOps pipeline, and my goal is to get that pipeline run to fail if any of the tests fail, which should happen if the Taurus process exits with a status code != 0.
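For reference, the pipeline side of this is straightforward; the problem is only that Taurus itself returns 0. A minimal sketch of how a Bash step would propagate a non-zero Taurus exit code (using a stand-in function for `bzt`, since the real binary isn't part of this sketch):

```shell
#!/bin/sh
# Stand-in for 'bzt taurus.yml' -- Taurus exits with code 3 when
# pass/fail criteria are violated, so we simulate that here.
fake_bzt() { return 3; }

if fake_bzt; then
  echo "Taurus run passed"
else
  code=$?
  echo "Taurus exited with code $code"
  # In a real pipeline step, propagate it: exit "$code"
fi
```

Azure DevOps marks a script step failed whenever it exits non-zero, so propagating the code is all that is needed, once Taurus actually returns it.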

Here is my Taurus.yml file.

settings:
  default-executor: jmeter

provisioning: cloud

execution:
- concurrency: 30
  hold-for: 25m
  ramp-up: 0m
  scenario: test1
  locations:
    us-west-1: 30
  files:
  - config.csv

scenarios:
  test1:
    script: Test.jmx

modules:
  blazemeter:
    test: "Test Name"
    project: "Project Name"
    token:
    timeout: 7s
    check-interval: 10s
    send-interval: 30s
    timeout: 30s

I have also tried adding the below "passfail" snippet to the bottom, but when Taurus runs it says "WARNING: Passfail has no effect for cloud, skipped"

reporting:
- module: passfail
  run-at: cloud
  criteria:
  - fail>0, stop as failed

In BlazeMeter I have Failure Criteria configured, and the test properly shows as failed there.

Even the end-of-run stats show that errors were reported.

15:18:40 INFO: Test duration: 0:22:44
15:18:40 INFO: Samples count: 2835, 0.21% failures
15:18:40 INFO: Average times: total 1.185, latency 1.098, connect 0.000
...
15:18:40 INFO: Done performing with code: 0
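Until the underlying bug is fixed, one possible workaround (an assumption on my part, not an official Taurus feature) is to capture the console output, e.g. with `bzt taurus.yml | tee bzt-run.log`, and parse the final summary line yourself, failing the CI job when the failure percentage is non-zero:

```shell
#!/bin/sh
# Assumption: bzt's console output was captured to a log file.
# Here we seed the log with the summary line quoted above.
log="bzt-run.log"
printf '15:18:40 INFO: Samples count: 2835, 0.21%% failures\n' > "$log"

# Extract the failure percentage from the "Samples count" line.
failures=$(sed -n 's/.*Samples count: [0-9]*, \([0-9.]*\)% failures.*/\1/p' "$log")
echo "failure rate: ${failures}%"

# In a real CI step you would 'exit 3' here instead of echoing.
awk -v f="$failures" 'BEGIN { exit (f > 0) ? 1 : 0 }' || echo "would fail the pipeline"
```

This is fragile (it depends on the exact log format), but it does not rely on the broken cloud pass/fail reporting at all.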


Mark Asbury

Jan 31, 2020, 3:33:17 PM
to codename-taurus
If you have not done so already, take a look at the Pass/Fail Criteria page in the Taurus User Manual to see some examples of how to set pass/fail criteria appropriate for your requirements.

grey....@gmail.com

Feb 2, 2020, 12:21:26 AM
to codename-taurus
Hi.
It sounds like a bug on our side; Taurus might not be getting results back from the cloud in that case. I'll check.

---
Taras 

grey....@gmail.com

Feb 3, 2020, 4:58:31 AM
to codename-taurus
You're right, 'reporting from cloud' has some problems. It will be fixed, but I doubt it will be soon.

---
Taras



tkir...@clickboardingdemo.com

Feb 3, 2020, 8:54:40 AM
to codename-taurus
Thanks for the reply. Good to know I'm not making a simple mistake.

Nick Gover

Jun 8, 2020, 11:05:09 AM
to codename-taurus
Hi Taras,

Do you know if this has been resolved? I appear to be having the same issue. I've added the cloud acceptance criteria to the reporting module and can verify that the test does fail on BlazeMeter, but the test still finishes with an exit code of 0. So my pipeline will always show as passed, since the test never causes the job to produce a non-zero exit code.

My Taurus Config:

execution:
  - concurrency: 3
    ramp-up: 90s
    hold-for: 2m
    scenario: simple
    locations:
      us-west-1: 1


scenarios:
  simple:
    default-address: https://nekcchamber.com/
    think-time: 2s  # global
    requests:
      - /

provisioning: local

reporting:
  - module: passfail
    run-at: cloud
    criteria:
      - avg-rt>100ms for 10s, stop as failed
  - module: passfail
    run-at: local
    criteria:
      - avg-rt>1s for 10s, stop as failed

modules:
  blazemeter:
    token: '******:**************'  # API id and API secret divided by :
    timeout: 10s  # BlazeMeter API client timeout
    project: demo
    test: taurus-demo
    report-name: taurus-demo-001
    browser-open: none  # auto-open browser on test start/end/both/none
    check-interval: 5s  # interval which Taurus uses to query test status from BlazeMeter
    public-report: false  # make test report public, disabled by default
    send-report-email: false  # send report email once test is finished, disabled by default
    request-logging-limit: 10240 # use this to dump more of request/response data into logs, for debugging

If I execute it locally it will fail and exit with a code of 3, but if I execute through BlazeMeter it does not.

Here's a screenshot of the results.

grey fenrir

Jun 10, 2020, 4:40:08 PM
to codename-taurus
Hi Nick, thanks for the detailed report. We're working on it, but the resolution time is uncertain..)

---
Taras

Nick Gover

Jun 10, 2020, 4:42:49 PM
to codename-taurus
Alright, thanks! Just figured I'd check to make sure it wasn't something on my side.

Smita Dutta

Apr 2, 2021, 1:19:35 AM
to codename-taurus
Hi Team,
Can anyone please let me know if this has been resolved? I am also facing the same issue. I've added the pass/fail acceptance criteria to the reporting section and can verify that the test does fail on BlazeMeter, but the test still finishes with an exit code of 0, so my pipeline always shows as passed.
Screen Shot 2021-04-02 at 4.16.21 pm.png

grey....@gmail.com

Apr 6, 2021, 1:48:21 AM
to codename-taurus
Hello.
It's a known problem with the BlazeMeter API and can't be resolved on our side.
Please contact BlazeMeter support to report the problem.

---
Taras
