Bugs and weird LIONESS behavior


Thomas Schilling

May 4, 2024, 4:35:15 AM
to LIONESS Lab help and discussion
I opened this thread so that everyone can report bugs and strange behaviors of the LIONESS platform in one place, giving the LIONESS team an overview to work from when they have time to fix things.
I haven't checked recently whether any of this has already been reported, or whether it has since been addressed in the documentation (maybe because some of it simply can't be programmed differently?).


The JavaScript element in LIONESS Lab often loses keyboard focus while I am typing in it. This is a bit difficult to explain: while I type JS code, LIONESS suddenly behaves as if I had clicked outside the element. The blinking cursor is still visible and the element looks active, but further keystrokes are no longer entered into the element, and pressing the space bar scrolls the window instead. To continue typing I have to click back into the JS element with the mouse or press one of the arrow keys.

When I try to record a string literal directly to the database (e.g., record("varString", "this is a string");), LIONESS does not accept it and never creates the variable. To record a string, I first have to declare a variable holding the text and then write that variable to the database; then it works. This is a bit odd.
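Spelled out as a block (the variable name is arbitrary, just for illustration):

    // rejected: the variable is never created
    // record("varString", "this is a string");

    // works: declare the string in a variable first, then record that variable
    var myText = "this is a string";
    record("varString", myText);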


Thomas Schilling

May 10, 2024, 3:29:17 AM
to LIONESS Lab help and discussion
Hi LIONESS team,

I'm not sure whether this has been reported before or whether others have the same problem, but I think LIONESS sometimes has issues when saving variables.
I recently ran two pilot studies, and in both of them some variables remained unsaved, empty, or stuck at their default values (i.e., the way I declared and recorded them on the first stage).

To give you a bit more info: I record a variable, e.g. with record('varName', 99), and include a discrete choice element on a different stage whose selection should automatically overwrite this variable. I set that element up to save the information directly on click, because I use it for a conditional display on the same stage.
So I know the choice has been registered in some form, because the stage moves on and the conditional display works exactly as I set it up. But when I later look at the final data, some of these variables are empty or still hold their default values. (The Prolific participants in the pilot told me they completed everything, and they saw the correct completion code at the very end of the study, which I never show anywhere earlier.)

Could you perhaps test this and try to find out why it happens? It costs quite a bit of money (payments lost because the participants were not at fault) and a lot of time to figure out what happened, why information is (seemingly randomly) missing, and to deal with the affected participants.
You could try my study with the ID 37418. If you do, you may need to test with multiple participants at once: I never noticed the problem when testing on my own over multiple runs. Maybe LIONESS has trouble handling multiple simultaneous participants? That was never an issue before; in earlier pilots with 30-40 people participating (almost) simultaneously, everything worked fine and all answers and choices were recorded correctly unless I had made a coding mistake, which is not the case here.

I hope my description helps. If not, or if you need any further information, please let me know.

Cheers
Thomas

Lucas Molleman

May 10, 2024, 2:30:59 PM
to Thomas Schilling, LIONESS Lab help and discussion
Hi Thomas,

I see you use a lot of custom code with quite a lot of conditional display (and conditional recording of responses). To track down the issue efficiently, it would be useful to have a bit more information. When variables are not recorded as expected, are all values for that participant missing, or just some? In other words, is the 'program' attached to the buttons not executed at all? And on which stage exactly do you think the issue occurs?

Best wishes,
Lucas



Lucas Molleman

May 10, 2024, 2:35:10 PM
to Thomas Schilling, LIONESS Lab help and discussion
By the way, I could just submit empty values in the tasks and finish without making any entries... Perhaps the issue goes away if you check whether the participant has entered valid responses before you (manually) record them?
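For example, something along these lines before the manual record (just a sketch; the element id 'answerField' and the variable name 'answer' are placeholders you would adapt to your stage):

    var raw = document.getElementById("answerField").value;
    var parsed = parseInt(raw, 10);
    // only write to the database once the entry is a valid number
    if (!isNaN(parsed)) {
        record('answer', parsed);
    }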

Thomas Schilling

Jun 27, 2024, 3:56:00 PM
to LIONESS Lab help and discussion
Hello Lucas, dear LIONESS team, 

The bug unfortunately persists. 

It looks like I never replied to your message; I'm sorry about that, and thank you for your answers. I wasn't working on this experiment for a while.
The following turned out longer than I expected when I started writing. I realize my experiment is heavy on customized code and may take a while to understand.
So I would like to offer a live chat (via video call), if you prefer that.
If you do, please email me to arrange a convenient time for next week (I am only available from Wednesday, 3 July, but from then on very flexibly for most of the following two weeks).

I have two bugs to report that I think need to be fixed. I believe they are LIONESS-related, but they could also have something to do with individual users' system settings; I only suspect this because the bugs below happen for some people but not for others, and different bugs seem to affect different users.

The first problem (already reported before) happens on stages 2 and 3 (where participants count letters or numbers). 
Sometimes, the code is executed correctly, and all data is recorded correctly. Other times, it is not. 
Whenever I test the experiment (ID 34856) myself or ask others personally to test it, we never have any issues. When Prolific participants do it, it seems like about 50% of them have problems. 
I asked participants what browser they used, and they said they used Google Chrome. So it's not the browser, because I am using Chrome, too. 

The problem is that sometimes the system records values or numbers that simply make no sense. 
I am incrementing a score based on entered values in a task where participants count numbers (or letters). 
This incremented score can only range from 0 to 20, because there are only 20 values that can be entered. But for some participants, I have score values of 30, 50, ... This makes no sense. 
The codes I am referring to below are in stage 3. 

In the code, it looks like the system correctly saves and records the participants' entered values, for example with
count_0a = parseInt(document.getElementById("0a").value);
setValue('count0a', count_0a) ;   
(in the JS code on lines 205-206). 
 
But the system seems to add up the score incorrectly sometimes. For example, 
    if(count_0a==8 && skipped_0a===0){
        scoreNumbersP1 += 1;
    }
(in the JS code on lines 212-214)
should increment scoreNumbersP1 by one, and only when count_0a equals 8 and skipped_0a equals zero.
A similar increment is done for each number in paragraph 1, for a total of 10 different numbers (see lines 204-319 for paragraph 1). 
The resulting score is then saved to the database with
setValue('scoreNumbersP1', scoreNumbersP1) ;
(in the JS code on line 315) 

Therefore, scoreNumbersP1 should range from 0 to 10. But for some participants, the system calculates a value of, for example, 24 or 30.
The weirdest thing is that this problem does not always happen, and it does not happen for every participant.

A similar thing seems to have happened on stage 4, where I increment an attention-check counter in the final LIONESS button of that stage, for example with:
if(attentionSurvey1==6) {attentionSurvey += 1;} 

Could it be that incrementing like this causes problems in some browsers, or for users with specific system settings, perhaps browser add-ons or something similar?
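In case it helps narrow things down: one thing I could imagine (purely a guess on my part) is that the increments accumulate if the block runs more than once for the same participant. A version that resets the score at the start of the same block would at least be insensitive to how often the code is executed. A sketch along the lines of my existing code (count_0a and skipped_0a are set earlier in the stage, as above):

    // reset the score at the start of the block, then accumulate within this single pass,
    // so running the block twice cannot push the value beyond 10
    var scoreNumbersP1 = 0;
    if (count_0a === 8 && skipped_0a === 0) {
        scoreNumbersP1 += 1;
    }
    // ... analogous checks for the other nine numbers of paragraph 1 ...
    setValue('scoreNumbersP1', scoreNumbersP1);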

There is yet another bug. 
On stage 4 (survey), participants may only continue after they have responded to all survey items (Likert scales).
A button appears automatically (via conditional display) once a value has been selected for every item. This does not always work: for some participants (not the same ones who had problems with the score increments), the button simply never appears, even though they answered all currently shown questions in the survey.
For other participants it worked fine and they were able to continue and finish the study.
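For context, the check is conceptually along these lines (a simplified sketch, not my literal code; the radio-group names and the button id are placeholders):

    // show the continue button only once every Likert item has a selection
    var items = ['likert1', 'likert2', 'likert3'];
    var allAnswered = items.every(function (name) {
        return document.querySelector('input[name="' + name + '"]:checked') !== null;
    });
    if (allAnswered) {
        document.getElementById('continueButton').style.display = 'block';
    }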

What is happening here? It seems like something is making the JS code in LIONESS unreliable, but what could it be?

I hope you have the time to look into this and suggest any fixes. Many thanks for your efforts! 

Best regards
Thomas

Thomas Schilling

Jul 15, 2024, 8:38:02 AM
to LIONESS Lab help and discussion
Would someone from the LIONESS team please help me here? 
Cheers
Thomas