SDMS Command Failed - ZSI-10001 - ConnectError


Sandhya Srinivasan

Jun 2, 2022, 9:58:05 AM
to schedulix
Hello, 

We are running schedulix-2.9 on Docker and it suddenly stopped working today. We can still connect to the SDMS server and Zope, but when I click on JOBS I get the error below.

ZSI-10001-Connect Error.

And when I look at the pod logs in Kubernetes, I see this:

INFO    [Thread-2]    02 Jun 2022 13:57:16 GMT Read 2, Loaded 2 rows for EVENT
ERROR   [Thread-1]    02 Jun 2022 13:57:16 GMT Duplicate id during load Object
ERROR   [Thread-1]    02 Jun 2022 13:57:16 GMT ****************** Start Stacktrace *********************
ERROR   [Thread-1]    02 Jun 2022 13:57:16 GMT de.independit.scheduler.server.util.SDMSThread.doTrace(SDMSThread.java:168)
ERROR   [Thread-1]    02 Jun 2022 13:57:16 GMT de.independit.scheduler.server.exception.FatalException.<init>(FatalException.java:50)
ERROR   [Thread-1]    02 Jun 2022 13:57:16 GMT de.independit.scheduler.server.repository.SDMSTable.loadObject(SDMSTable.java:194)
ERROR   [Thread-1]    02 Jun 2022 13:57:16 GMT de.independit.scheduler.server.repository.SDMSRunnableQueueTableGeneric.loadTable(SDMSRunnableQueueTableGeneric.java:216)
ERROR   [Thread-1]    02 Jun 2022 13:57:16 GMT de.independit.scheduler.server.repository.TableLoader.SDMSrun(SDMSRepository.java:430)
ERROR   [Thread-1]    02 Jun 2022 13:57:16 GMT de.independit.scheduler.server.util.SDMSThread.run(SDMSThread.java:225)
ERROR   [Thread-1]    02 Jun 2022 13:57:16 GMT ****************** End Stacktrace   *********************


Can you please help? What could have changed, and how can we fix it?

Thanks,
Sandhya

Ronald Jeninga

Jun 2, 2022, 10:14:49 AM
to schedulix
Hi Sandhya,

duplicate IDs shouldn't happen, and when the server encounters one it doesn't know of anything better to do than to abort.
This doesn't happen very often, and in most cases it is caused by a human mistake (like running init.sql twice, or "repairing" the system by hand with SQL).

It seems you have duplicate IDs in the RUNNABLE_QUEUE table.
Could you please have a look at that table and find out which ID occurs multiple times?
Something like

select *
  from runnable_queue
 where id in (
       select id
         from runnable_queue
        group by id
       having count(*) > 1
);

should give all the rows with duplicate IDs.
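If you only want to see which IDs are affected and how often each one occurs, a query along these lines (just a sketch against the same runnable_queue table) should also do:

select id, count(*) as occurrences
  from runnable_queue
 group by id
having count(*) > 1;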
From there we'll have to investigate further.

Best regards,

Ronald