
Does Windows use JOBFILEPREFIX?


Dan Power

Oct 19, 2024, 4:33:08 AM
to schedulix
I have a Windows jobserver and I received a complaint that all the task log files are going into the DEFAULTWORKDIR instead of the JOBFILEPREFIX directory. I have tried all sorts of variations and they still go to the DEFAULTWORKDIR; it is as if the setting is not being used at all. What am I doing wrong? To give an example of what I have tried, here is a sample directory structure and the values I have tried:
c:\program\workdir
c:\program\workdir\taskfiles
c:\logs\taskfiles - aiming for this one

DEFAULTWORKDIR = c:\program\workdir
values tried for JOBFILEPREFIX:
c:\logs\taskfiles\
\logs\taskfiles\
..\..\logs\taskfiles\
c:\program\workdir\taskfiles\
\program\workdir\taskfiles\
taskfiles\
taskfiles

Is there a step I should take after each change other than saving? I was very surprised by that last one, as I thought for sure I could at least get that prefix onto the front of all the files, but no. The jobserver is 2.10 and Windows is Windows Server 2019 Datacenter.

Ronald Jeninga

Oct 19, 2024, 4:54:32 AM
to schedulix
Hi.

first of all, Google classified your question as possible spam. I don't know why; it is an absolutely legitimate question.
That's why I couldn't answer earlier. Sorry for that.

It seems there are two kinds of files that are being mixed up here.

The so-called taskfile is created by the jobserver in order to keep information about the jobs it executes in persistent storage.
This file contains the argv array and some progress information: has the job just been received (STARTING), has the jobexecutor been started (STARTED), has the jobexecutor started the payload (RUNNING), or did it terminate and does it still need to be reported to the scheduling server (FINISHED)?
This way the state of a job is preserved even if someone kills the jobserver. And if the jobexecutor is killed, at least the information that some job was there is still present.
These files are stored using the JOBFILEPREFIX, which should ideally point to some directory that is writable by the owner and no one else.
Note that the JOBFILEPREFIX MUST contain an absolute path and some initial characters that are specific to the jobserver.
(You could have multiple jobservers active that share the same taskfile directory, and they must be able to distinguish their own files from those of the other jobservers.)
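
To make the distinction concrete, here is a hypothetical sketch of the two relevant entries in the jobserver configuration (the directories are taken from your example, but the srv01- prefix is invented):

DEFAULTWORKDIR = c:\program\workdir
JOBFILEPREFIX  = c:\logs\taskfiles\srv01-

With a value like that, the jobserver stores its taskfiles under c:\logs\taskfiles with file names starting with srv01-, while the log output of the jobs themselves is controlled by the LOGFILE and ERRLOG fields described below.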

The output written to stdout and stderr goes to the LOGFILE and the ERRLOG respectively.
These can be defined in the job definition. If those fields (you find them on the run tab) are left empty, the output to stdout and stderr is discarded.
If both fields contain the same value, the output is merged.
If the fields contain a relative path (e.g. log\$JOBID.log), the file is created (or appended to) relative to the working directory of the job.
If they contain an absolute path (e.g. c:\temp\$JOBID.log), the log output is stored there.

There is another field that plays a role here, which is the WORKDIR. If it has no value, the DEFAULTWORKDIR as defined by the executing jobserver is used.
If it contains a value, it must be an absolute path to some directory.
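
For illustration, a minimal sketch of how those three fields might be filled in on the run tab of a job definition (the values are invented):

LOGFILE : log\${JOBID}.log      (relative path, created below the working directory of the job)
ERRLOG  : log\${JOBID}.log      (same value as LOGFILE, so stdout and stderr are merged)
WORKDIR : (left empty)          (the DEFAULTWORKDIR of the executing jobserver is used)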

Best regards,

Ronald

Dan Power

Oct 22, 2024, 3:29:00 PM
to schedulix
I don't think I submitted my reply.
"The output written to stdout and stderr goes to the LOGFILE and the ERRLOG respectively."
Is there a server-wide method to affect these? Is there a folder parameter that can affect them?

Dan Power

Oct 22, 2024, 3:29:00 PM
to schedulix
Is there no server-wide control for this:
"The output written to stdout and stderr goes to the LOGFILE and the ERRLOG respectively."

Or is it just on a job-by-job basis?


Ronald Jeninga

Oct 23, 2024, 7:00:15 AM
to schedulix
Hi,

there are a few things to consider here.

First of all, if you create a Job Definition, the value of the LOGFILE and ERRLOG is usually set to something like ${JOBID}.log.
This is nice, but might not fit your ideas of what it should be.
This default value can be changed by a GUI administrator by opening the Zope management interface.
Then click on the Custom folder and open the Properties. There you find two fields DefaultLogfile and DefaultErrLogfile.
The value of those fields is used as the default values in the Job Definition.
If you change them, all new Job Definitions will have the new values in their definition.
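
As a sketch, assuming you want all new Job Definitions to default to a central log directory resolved via a Parameter called MYLOGPATH (an invented name, used again in the example further below), the two properties could be set to something like:

DefaultLogfile    = $MYLOGPATH/${JOBID}.log
DefaultErrLogfile = $MYLOGPATH/${JOBID}.log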

It is possible to use any Parameters in the definition of the LOGFILE. Those Parameters will be evaluated as usual.
The parameter resolution is pretty complex, but with a proper naming scheme it'll be easy to handle.
In any case it means that you can use Folder Parameters to determine the value of the LOGFILE. 
You can also define a Parameter at Scope/Jobserver level (which would have precedence over a Folder Parameter) or make more specific exceptions at Job level.
(Idea: a single Job is more specific than a Scope, and a Scope is more specific than a Folder).
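
As a sketch of how such a parameter hierarchy could look (all names invented):

Folder PROD                : Parameter MYLOGPATH = c:\logs\prod
Jobserver GLOBAL.WIN.SRV01 : Parameter MYLOGPATH = d:\logs\prod    (overrides the Folder value for jobs running on that jobserver)
Job PROD.SPECIAL_JOB       : LOGFILE = c:\logs\special\${JOBID}.log    (job level exception)

A job below PROD with LOGFILE = $MYLOGPATH\${JOBID}.log would then write its log to c:\logs\prod or d:\logs\prod, depending on where it runs.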

Last but not least you might want to change all existing Job Definitions and replace the definition of the LOGFILE by some new value.
That can be done using a Select Statement in the GUI Shell.
As an example, say you want to store the log files in a directory pointed to by $MYLOGPATH and you'd like to group the log files per master.
A statement like

select 'ALTER JOB DEFINITION ' as col1, id, 'WITH LOGFILE = ''$MYLOGPATH/${MASTERID}_${JOBID}.log'', ERRLOGFILE = ''$MYLOGPATH/${MASTERID}_${JOBID}.log''' as col3
  from sci_c_scheduling_entity
 where logfile = '${JOBID}.log'
   and errlogfile = '${JOBID}.log'
with id folder quoted;

gives you output that can be copied and pasted into the command window of the shell.
After executing it, all your jobs will have the new definition.
Note though that if a Job can't resolve the MYLOGPATH Parameter, it will end up in ERROR and you'll have to cancel it.
Hence, create your infrastructure before doing bulk changes.
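
Just for illustration, one line of that generated output would look roughly like this (the folder path is invented, and the exact quoting produced by "with id folder quoted" may differ in detail):

ALTER JOB DEFINITION 'MYPROJECT.DAILY.LOAD_DATA' WITH LOGFILE = '$MYLOGPATH/${MASTERID}_${JOBID}.log', ERRLOGFILE = '$MYLOGPATH/${MASTERID}_${JOBID}.log';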

Naturally you might want to filter the Job Definitions to alter somewhat more carefully than I did in the example above, but I'll leave that as an exercise (ask if you run into problems).
 
Best regards,

Ronald

Dan Power

Oct 28, 2024, 8:31:51 PM
to schedulix
"Zope management interface"

Is there Zope user documentation for schedulix? Or did I miss a section in here: https://schedulix.org/Downloads/schedulix_web_en-2.10.pdf

Ronald Jeninga

Oct 29, 2024, 8:29:23 AM
to schedulix
Hi,

changing the defaults of the application is hardly a task for ordinary users; it is more the responsibility of administrators.
Hence the Zope management interface is not documented in the schedulix_web document.

If you are a GUI admin, which I assume, you can open the Zope management interface by opening


This is described in the installation guide, although not in much detail.
You need this if you want to install the system by hand.
You also need this if you want to upgrade the schedulix release or if you need to apply a patch to the GUI.
And if you've found a bug (one I haven't seen before), you can find the Python stack trace in the error_log folder.
There are a few things you can configure in the Custom folder. Either by changing the properties, or by writing Python code.
Changing the properties is relatively easy. Writing Python code often requires knowledge of the internal processing.
Some of it is pretty easy (like extending the SDMSservers data structure), some of it is far more complex (like programming the CAN_CANCEL hook).
Hence the latter is something we usually do (not for free).

HTH

Best regards,

Ronald