
Run Tests not working


PERALDI Olivier

Jul 9, 2018, 2:14:53 PM
to RDM Testing
Hello,

Strange problem. Here is my configuration:

- Ubuntu 18.04 on VMware,
- ola, ola-python and ola-rdm-tests installed via Synaptic, along with all dependent packages,
- Robe Universal Interface.

Rebooted... with the USB Robe interface connected...

1/ OLA Admin is working

2/ RDM Responder Tests & Publisher loads normally.
  - It sees the connected device.
  - The RDM Responder Publisher tab collects responder information normally.
  - The RDM Responder Tests tab seems to work, but the Run Tests button DOES NOTHING...

Thank you. Regards.

Olivier




Peter Newman

Jul 9, 2018, 10:48:23 PM
to RDM Testing
Hi Olivier,

Possibly a basic thing: have you added some tests to run? If you click the Add All button on the right, it will add all the tests to the run list.

At a guess, this is probably a JavaScript error of some sort. Which web browser are you using to run the tests? Can you try another one? Can you open the web console (or equivalent), reload the page, and then either copy/paste or screenshot any error messages?

In the short term, you could use the CLI tool to run the tests instead.
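For example, something like this should run them from a terminal (a sketch; the universe number and UID below are placeholders, and rdm_responder_test.py --help lists the full set of options):

# Run the responder tests against the responder with UID 7a70:00000001
# on universe 1 (both values are examples; substitute your own):
rdm_responder_test.py --universe 1 7a70:00000001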

PERALDI Olivier

Jul 10, 2018, 8:44:30 AM
to RDM Testing
Hello Peter,

1/ Yes, I have added tests to the list. Several times...

2/ I am using Firefox. I have several virtual machines, and Firefox works on the other ones. I have also used Chrome, directly from my PC (not from a VM), with the same result.

I get no error messages...

3/ When I launch rdm_responder_test.py from a terminal, I get this message:
Traceback (most recent call last):
  File "/usr/bin/rdm_responder_test.py", line 28, in <module>
    from TimingStats import TimingStats
ImportError: No module named TimingStats

I guess I am missing something.

As I told you, I used Synaptic to install the latest running version of OLA: 0.10.5.nojsmin-3, for the Bionic Beaver, normally :-)

Thank you.

Olivier

Peter Newman

Jul 10, 2018, 10:24:49 AM
to RDM Testing
Hi Olivier,

Can you change line 28 of /usr/bin/rdm_responder_test.py to:
from ola.testing.rdm.TimingStats import TimingStats

And then try running rdm_responder_test.py again.

If that works, the same change to line 22 of /usr/lib/python2.7/dist-packages/ola/testing/rdm/TestRunner.py will probably fix the test server webpage version too.
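If you'd rather script the edit, something like this should apply the fix to both files (just a sketch, assuming each file still contains the bare import on the lines above; it keeps .bak backups):

# Rewrite the bare import in both files; sed leaves .bak copies behind.
sudo sed -i.bak \
  's/^from TimingStats import TimingStats$/from ola.testing.rdm.TimingStats import TimingStats/' \
  /usr/bin/rdm_responder_test.py \
  /usr/lib/python2.7/dist-packages/ola/testing/rdm/TestRunner.py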

PERALDI Olivier

Jul 11, 2018, 9:18:52 AM
to RDM Testing
Hello Peter,
1/ rdm_responder_test.py is now working. I have been able to run a test; see the screenshot below.

[screenshot of the test run]

2/ Changing line 22 in TestRunner.py does not change anything... The web version is still not working.

I have noticed a couple of strange things:

- I have OLA 0.10.5 on Mint 18.3. When I run it, I get 580 tests, but the rdm_responder_test.py above runs only 426 tests.

- When I load the web page in Mint, I see:

[screenshot from Mint]

but on the Ubuntu machine with the problem, I see the UID but not the company name:

[screenshot from Ubuntu]

Hope this helps. Thank you and best regards.


Olivier

Peter Newman

Jul 11, 2018, 10:08:56 PM
to RDM Testing
Excellent. The fix for rdm_responder_test.py was just in time to make it into 0.10.7, which should be released soon.

So the Mint page will actually be from our master branch of git, rather than a release tag or a release branch.

Do the web based tests fail on both Mint and Ubuntu?

Can we please see any console output from rdm_test_server.py while it's running, or failing to run, the tests?

Finally, don't forget to use the RDM Model Collector and publish to http://rdm.openlighting.org/ when you've got a finished/released product.
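A typical collector run might look something like this (a sketch; verify the script name and flags with --help on your install):

# Discover the responders on universe 1 and collect their model data,
# ready to publish via http://rdm.openlighting.org/ (universe is an example):
rdm_model_collector.py --universe 1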

PERALDI Olivier

Jul 12, 2018, 9:19:09 AM
to RDM Testing
Hello Peter,

The web page works perfectly on Mint, but there I installed everything by hand, from a master branch, without the help of Synaptic or a similar program, and with some trouble...

We already had that discussion about how difficult it is for an end user to know which version he is really downloading when using "git clone https://code.google.com/p/open-lighting/ ola"... At that time, I ended up with a non-stable version. That is why I chose Synaptic this time, hoping it would be easier to install (which is definitely the case), but I ended up with what is probably an older release and trouble running it... Nothing is perfect...

I don't have a trace of the web page because, I guess, with the Synaptic version rdm_test_server.py as well as olad are started at login. Therefore I don't have a terminal displaying the trace.

If I try to launch rdm_test_server.py, I get a message saying "Address already in use". This confirms the above paragraph, I guess.

I guess there is a log file somewhere, but I don't know where that file is...
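(I suppose olad logs to syslog with the packaged setup; on Ubuntu, something like this might show it:)

# Assuming the default Ubuntu syslog setup:
grep olad /var/log/syslog | tail -n 50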

When I run rdm_responder_test.py, everything is normal.

The last time I tried the RDM Model Collector (in 2015), the system had problems loading an image of the product as well as recognizing and recording manufacturer PIDs. Is that solved?

Regards.

Olivier

PERALDI Olivier

Jul 12, 2018, 12:19:36 PM
to RDM Testing
I have tried something else in a terminal (see also my other response):

ps -ef | grep olad
olad       1436      1  0 05:35 ?        00:00:03 /usr/bin/olad --syslog --log-level 3 --config-dir /etc/ola
olad       1448      1  0 05:35 ?        00:00:05 /usr/bin/python /usr/bin/rdm_test_server.py --world-writeable
olivier   14685  14603  0 09:11 pts/0    00:00:00 grep --color=auto olad
 
sudo kill -9 1448

rdm_test_server.py
Checking olad status
Running RDM Tests Server on 127.0.0.1:9099
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /static/rdmtests.html HTTP/1.1" 200 15598
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /static/jquery-ui.css HTTP/1.1" 200 37555
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /static/ui.multiselect.css HTTP/1.1" 200 1886
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /static/jquery.min.js HTTP/1.1" 200 136062
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /static/ui.multiselect.js HTTP/1.1" 200 11024
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /static/rdm_tests.js HTTP/1.1" 200 29800
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /static/jquery-ui.min.js HTTP/1.1" 200 312014
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /static/images/logo.png HTTP/1.1" 200 1551
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /static/images/discovery.png HTTP/1.1" 200 651
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /static/images/external.png HTTP/1.1" 200 163
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /GetUnivInfo?_=1531411970232 HTTP/1.1" 200 78
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /GetTestDefs?c=0&_=1531411970233 HTTP/1.1" 200 11693
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /static/images/ui-icons_777777_256x240.png HTTP/1.1" 404 0
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /GetDevices?u=0&_=1531411970234 HTTP/1.1" 200 43
127.0.0.1 - - [12/Jul/2018 09:12:50] "GET /static/images/ui-icons_444444_256x240.png HTTP/1.1" 404 0
127.0.0.1 - - [12/Jul/2018 09:12:51] "GET /static/images/favicon.ico HTTP/1.1" 200 1406
127.0.0.1 - - [12/Jul/2018 09:12:58] "GET /static/images/ui-icons_555555_256x240.png HTTP/1.1" 404 0
127.0.0.1 - - [12/Jul/2018 09:12:59] "GET /static/images/ui-icons_ffffff_256x240.png HTTP/1.1" 404 0

It seems to freeze here...

Olivier


Peter Newman

Jul 12, 2018, 11:27:24 PM
to RDM Testing
Ah yes, I remember that discussion now. Personally I'd say (and I suspect I said the same before) that running the extra ~150 tests against your responder is probably beneficial, so I'd stick with git master (on Mint or Ubuntu). There's generally no explicit testing of the RDM tests between a commit and the release, so the only benefit of a release is that it may have been run a few more times beforehand. I generally run master rather than a release build myself when I'm testing, to get all the latest tests available.

I think we've fixed a few image-loading bugs now, and we're working on continuous deployment for our RDM website, which will mean future bug fixes go live on the website sooner.

Manufacturer PIDs have been gathered by the model collector since 2013, although there's currently no automated process to convert this data and load it into the PID store (if nothing else because the current E1.20 PARAMETER_DESCRIPTION doesn't give enough info to do so; there is a task group for E1.37-5 which, when finished, may enable such automated conversion, although the data from manufacturers would probably still want sanity checking first).

There is some basic info on our PID store here:

There is also some technical info on the PID format, including how to add them yourself:

Or if you can give us some info on them, one of us can write the definitions.

Peter Newman

Jul 13, 2018, 12:18:43 AM
to RDM Testing
Yes, that will do the trick. Alternatively you can use "sudo /etc/init.d/ola-rdm-tests stop" to do the same thing.
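In other words, to see the console output:

# Stop the packaged service, then run the test server in the foreground
# so its console output is visible (paths as installed by the package):
sudo /etc/init.d/ola-rdm-tests stop
/usr/bin/rdm_test_server.py --world-writeable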

I didn't get a chance to double-check against mine earlier, but this still looks like a browser/JS error. When you click the Run Tests button, it validates the form before hitting the /RunTests URL.

So just to confirm, Chrome's Developer tools->Console shows no messages, even if you do Ctrl+F5 to force the page to reload? Can you try recording network traffic from within the browser?

When you think it's frozen, can you click the magnifying glass? That should cause it to do a discovery, which will log something in the RDM test server again.

It's probably unrelated, but I'm a bit confused that you're getting 404s for icons that don't exist in our source (though we have similarly named ones); I don't really get what's going on there.

Peter Newman

Jul 13, 2018, 5:03:26 AM
to RDM Testing
Okay, so I've just tested on mine; here's my log.

./tools/rdm/rdm_test_server.py
Checking olad status
Running RDM Tests Server on http://127.0.0.1:9099
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/rdmtests.html HTTP/1.1" 200 15633
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/jquery-ui-1.8.21.custom.css HTTP/1.1" 200 33311
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/ui.multiselect.css HTTP/1.1" 200 1886
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/jquery-1.7.2.min.js HTTP/1.1" 200 94840
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/jquery-ui-1.8.21.custom.min.js HTTP/1.1" 200 206923
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/ui.multiselect.js HTTP/1.1" 200 11024
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/rdm_tests.js HTTP/1.1" 200 30097
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/images/logo.png HTTP/1.1" 200 1579
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/images/discovery.png HTTP/1.1" 200 860
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/images/external.png HTTP/1.1" 200 165
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/images/ui-bg_inset-hard_75_999999_1x100.png HTTP/1.1" 200 114
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/images/ui-bg_glass_60_eeeeee_1x400.png HTTP/1.1" 200 110
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /GetUnivInfo?_=1531471617344 HTTP/1.1" 200 730
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /GetTestDefs?c=0&_=1531471617346 HTTP/1.1" 200 17011
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/images/ui-icons_70b2e1_256x240.png HTTP/1.1" 200 4369
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /GetDevices?u=0&_=1531471617386 HTTP/1.1" 200 145
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/images/ui-bg_glass_35_dddddd_1x400.png HTTP/1.1" 200 109
127.0.0.1 - - [13/Jul/2018 09:46:57] "GET /static/images/ui-bg_inset-soft_50_c9c9c9_1x100.png HTTP/1.1" 200 96
127.0.0.1 - - [13/Jul/2018 09:46:58] "GET /static/images/favicon.ico HTTP/1.1" 200 1406
127.0.0.1 - - [13/Jul/2018 09:47:15] "GET /static/images/ui-bg_glass_100_f8f8f8_1x400.png HTTP/1.1" 200 105
127.0.0.1 - - [13/Jul/2018 09:47:15] "GET /static/images/ui-icons_3383bb_256x240.png HTTP/1.1" 200 4369
127.0.0.1 - - [13/Jul/2018 09:47:23] "GET /static/images/ui-icons_454545_256x240.png HTTP/1.1" 200 4369
127.0.0.1 - - [13/Jul/2018 09:47:24] "GET /static/images/ui-bg_flat_0_eeeeee_40x100.png HTTP/1.1" 200 180
127.0.0.1 - - [13/Jul/2018 09:47:24] "GET /RunTests?u=0&uid=...

I clicked the button when the f8f8f8 image got loaded; we seem to have slightly different sets of pictures. The final thing to check: does http://127.0.0.1:9099/RunTests work for you (change the IP as necessary)? Run like that, it should throw an error about a missing parameter. If it does at least that, then it's definitely a client/JS issue of some sort.
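From a terminal, that check is equivalent to:

# With no parameters, /RunTests should return an error about a missing
# parameter rather than hanging (adjust the host/port as needed):
curl -i "http://127.0.0.1:9099/RunTests"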