How to set up jobs so they can run in different environments?


wim.ve...@billinghouse.nl

May 19, 2018, 7:57:38 AM
to schedulix
Hi,

As said before, I am looking into schedulix as it looks like a very promising tool for our batch requirements. We have multiple batch jobs that should be able to run in multiple similar environments (dev, testing, user acceptance, migration, production). A batch job can consist of a number of steps to be run successively. From what I have seen so far this tool should be capable of doing this. An environment in this context consists of several nodes where different services and applications are deployed (usually a front-end node, a routing service and different backends for orders, invoicing and finance).
Our idea is to have only one schedulix server, with the batch jobs always running on the same jobserver, but servicing all environments. (Maybe production will be separate; this is to be discussed and decided later.)

However, I am struggling with the concepts used to realize such a configuration in schedulix.

To get acquainted with the system I decided to just create a simple job that displays the value of a parameter set at the environment. For this I want to use two environments (test and aut). But I am lost as to how to define the various things in schedulix (I am using the downloaded VM for playing around). Defining two environments is easy.
But how do I connect the environments and jobs?

Is there an example of something similar in the examples that I could study?

Also if there is documentation describing the concepts used (environment, resource, footprint, job, etc) and how they interact that would be quite helpful.

Regards,
Wim Veldhuis.




Ronald Jeninga

May 20, 2018, 6:14:40 AM
to schedulix
Hi Wim,

although the schedulix documentation is pretty good (I use it myself), it's hard to learn the concepts from it.
But if you search this group for keywords like Named Resource, Environment, Footprint, you should find quite a few posts (usually by me) that give insights into the concepts.
And note that an Environment in schedulix isn't the same as an environment in your terminology. 

Why do you want a single jobserver that services all environments?
Jobservers have a pretty small footprint and you won't notice if you start a couple of them.
And if you do so, you can easily set up different run-time environments and choose between them using a single parameter.

Unfortunately I don't have a clear view of what you're trying to achieve.
That makes it somewhat difficult to give you a good answer now.

Please continue asking and explaining what you'd like to achieve.

Best regards,

Ronald

wim.ve...@billinghouse.nl

May 22, 2018, 8:16:36 AM
to schedulix
Hi Ronald,

We roughly have the following environment.
We have about 10-20 'environments' where we run our software. Each such environment has its own network segment and consists of a number of servers. Together they form an application (front-end, queuing, business process modelling and a few backends). All those environments offer the same functionality, but some of them are for testing, user acceptance, development, migration (where data of previous system is loaded and tested), pre-production and production. We have a number of batch jobs like invoicing, renewals of contracts etc. that should be able to run in each environment. Obviously not everybody is allowed to trigger batch jobs in each environment.

Being a programmer myself, I probably look at the issue the wrong way. I was assuming we would have a batch definition for each batch, and when starting it we would give it the parameters the job needs (like the invoice date for an invoice batch). However, infra-related parameters (like the fact that the job runs for the UAT03 environment) should not have to be specified by the user. So somehow I must organize things in schedulix so it will pick up the infra-related parameters 'on the fly'.

A thing I did not like initially is that it seems to imply that you copy the jobs into different folders. That would mean that if we change a job we would have quite some work fixing all the copies. Looking at it longer and reading some hints in other mails, I now think the copying should be seen as a deployment of the job to that environment, which solves the version trouble that would otherwise arise.

With the above description, what would be a logical setup for a simple job to be able to run in two environments (like dev01 and uat01)? The only parameter needed per environment is probably the name of the network segment the servers are in (the nodes always have the same names, as they are exposed through a proxy in the network segment).

Regards,
Wim.

Ronald Jeninga

May 22, 2018, 10:27:09 AM
to schedulix
Hi Wim,

thank you for your explanation. That makes more sense to me now.

You're going to need a jobserver on every node where you'd like to execute programs.
Because of the symmetrical setup that will be a pretty easy task, and with a bit of effort you can even automate it.

I don't think you need to copy entire job trees. But you will need at least a folder for each "environment". Somehow you'll have to be able to distinguish them (you want to be able to address them).
On the other hand, you might want to have different job trees for, say, development, integration test and production. I can't really recommend a specific setup as a best practice here.

But let me just assume that you indeed have a different folder for each of your environments.
You can define a parameter, let me call it JOB_ENV_NAME, for each of those folders and give it some value (dev01, uat01, UAT03, ...).

If you have a copy of the entire job tree below that folder, this parameter will be visible to all the jobs within that tree.

If you don't have a copy of the entire job tree, you'll need a batch that has the job tree as a child and defines a parameter JOB_ENV_NAME of type IMPORT.
(Just look at job trees as functions in a programming language; the same way a function can be called from different places, a job tree can be a child of several batches.)
If the batch is submitted, it'll import the value of JOB_ENV_NAME from the folder. And because of the submit hierarchy the value will be visible to all jobs below that batch.

So regardless of the exact setup, the value of the parameter JOB_ENV_NAME is visible to each job.
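As a plain-shell analogy (a sketch only; the function names are illustrative and this is not schedulix syntax), the folder or IMPORT parameter behaves like a value set by the submitting scope that every job below can read:

```shell
#!/bin/bash
# Sketch: a folder (or IMPORT) parameter behaves like a value set by the
# submitting scope that every job below can read. The names are
# illustrative only -- this is a shell analogy, not schedulix syntax.

run_job() {
    # Any job in the submitted tree can resolve the parameter.
    echo "running for environment: $JOB_ENV_NAME"
}

submit_batch() {
    # The folder defines JOB_ENV_NAME; submitting the batch below that
    # folder makes the value visible to every child job.
    JOB_ENV_NAME="$1" run_job
}

submit_batch dev01    # -> running for environment: dev01
submit_batch uat01    # -> running for environment: uat01
```

The same job definition runs unchanged for dev01 and uat01; only the value supplied by the enclosing scope differs.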

The second step is to define a parameter at each appropriate scope.
Let me just assume you have a scope GLOBAL.DEVELOPMENT and below that scope you have another scope (and/or jobserver) for each of the servers that belong to your development environment.
Within the scope GLOBAL.DEVELOPMENT you again define a parameter, let me call it SCOPE_ENV_NAME, and give it the name of the environment as a value (e.g. dev01).
This way the value of this parameter is visible for everything below GLOBAL.DEVELOPMENT.

In the Environments (ours, not yours) you can now add a condition like '$JOB_ENV_NAME == $SCOPE_ENV_NAME'.

Done.

I think this method is the most flexible and at the same time pretty easy to understand.

Does this make sense to you?

Best regards,

Ronald

wim.ve...@billinghouse.nl

May 23, 2018, 11:07:01 AM
to schedulix
This works indeed.

I do not need to run the jobs in different schedulix Environments, only in our own environments. It means I can do with one jobserver for now. But it is nice to know how to use different jobservers.
When you add the condition to the environment, it requires a named resource. Can I just create a dummy one?

Another question:

I am still using the VM for testing / POC.
It looks like the scripts that are specified must be located in the same folder where SDSMpopup.sh script is located. When I specify an absolute or relative path I get permission denied errors.
How can I specify a different location for my script files?

Regards,
Wim.

Ronald Jeninga

May 23, 2018, 11:45:16 AM
to schedulix
Hi Wim,

an Environment is mandatory when creating jobs. Using an empty Environment is allowed, but it matches every execution environment and is therefore an element of the set of bad ideas.
So yes, you create an Environment with a single, for now dummy, resource request and then specify the condition.
Note that the resource you are requesting must be present in your jobserver.

The scripts you want to execute must be executable (and in case of scripts readable) by the user that runs the jobserver.
If you enable the "use_path" option in the jobserver's configuration, the jobexecutor will do an execvp() instead of an execv().
Hence fully qualified paths, relative paths (which take the working directory into account) and executables that are in your PATH should all work.
If it doesn't work, I suspect one of the intermediate directories blocks access, or the executable itself isn't executable by the user that runs the jobserver.
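The difference can be reproduced in a plain shell, independent of schedulix (a sketch; the script name is made up): a fully qualified path always resolves, while a bare name only resolves once its directory is on PATH, which mirrors execv() versus execvp():

```shell
#!/bin/bash
# Sketch: PATH lookup vs. an explicit path, mirroring execvp() vs execv().
# The script name is made up for the demonstration.
set -e

dir=$(mktemp -d)
printf '#!/bin/bash\necho hello\n' > "$dir/schedulix_demo_job.sh"
chmod +x "$dir/schedulix_demo_job.sh"

# Fully qualified path: always works, like execv() with a full path.
by_path=$("$dir/schedulix_demo_job.sh")

# Bare name: fails until the directory is on PATH, like execvp().
by_name_before=$(schedulix_demo_job.sh 2>/dev/null || echo "not found")
PATH="$dir:$PATH"
by_name_after=$(schedulix_demo_job.sh)

echo "$by_path / $by_name_before / $by_name_after"   # -> hello / not found / hello
rm -rf "$dir"
```

With "use_path" disabled the jobserver behaves like the "before" case: only an exact path it can resolve will start.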

Regards,

Ronald 

wim.ve...@billinghouse.nl

May 23, 2018, 3:39:28 PM
to schedulix
Hi Ronald,

Yes, I use one environment which is tied to one jobserver. (I use the SERVER@LOCALHOST from the examples in the demo VM).
At the folder level I specify the host to be used, so the batches and jobs inherit that. At the job level I specify the environment.
This works fine, at least for demonstration purposes.

I created the folder for the new scripts as the schedulix user, and also defined a symbolic link from the default location to that folder. I did not try a relative path, only absolute.
I did notice later that when I echoed $0 from the script, it showed /opt/schedulix/schedulix/scripts/scriptname.sh (notice that schedulix appears twice; the actual path is /opt/schedulix/scripts/scriptname.sh). Could be something strange in the VM.

I looked at the LOCALHOST.SERVER configuration, it specifies true for USEPATH:

USEPATH  GLOBAL.EXAMPLES  true 

That seems to be correct; I might investigate later whether there is another issue.

There is a strange thing I noticed.

I defined the jobs in the folder ONTWIKKEL.SUBJOBS.
The batches are created in the ONTWIKKEL.BATCH1 folder, which contains a batch. The batch has the required jobs added as children (so linked, not copied).

For development it was easier to define a dummy host at the ONTWIKKEL folder level.

Now when I copied the batch to a UAT folder, with the correct host specified at that folder level, and submitted the batch, it took the value from the ONTWIKKEL folder. Only when I removed the parameter from that folder was the correct one found.
Not a big deal, but it was a bit surprising.

The comments were also not copied, I posted a separate thread for that.

Regards,
Wim Veldhuis.

wim.ve...@billinghouse.nl

May 23, 2018, 3:51:19 PM
to schedulix
I fixed the issue of the jobs not starting in other directories.

It turned out the first line was
#/bin/bash
and not
#!/bin/bash

This also caused the double folder name to be printed.
Linux, it is like magic :)
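This class of mistake can be caught mechanically (a sketch; check_shebang is a made-up helper): a script must start with the two bytes "#!", otherwise execve() fails with ENOEXEC and the invoking shell falls back to interpreting the file itself, which can make $0 misleading:

```shell
#!/bin/bash
# Sketch: detect a broken shebang. With "#/bin/bash" the first line is a
# mere comment; execve() then fails with ENOEXEC and the invoking shell
# interprets the file itself. check_shebang is a made-up helper.
set -e

check_shebang() {
    if [ "$(head -c 2 "$1")" = '#!' ]; then
        echo "$1: ok"
    else
        echo "$1: missing or broken shebang"
    fi
}

dir=$(mktemp -d)
printf '#!/bin/bash\necho good\n' > "$dir/good.sh"
printf '#/bin/bash\necho bad\n'  > "$dir/bad.sh"

good=$(check_shebang "$dir/good.sh")
bad=$(check_shebang "$dir/bad.sh")
echo "$good"
echo "$bad"
rm -rf "$dir"
```

Running such a check over a scripts folder before deploying jobs would have flagged the missing '!' immediately.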

Regards,
Wim.

Ronald Jeninga

May 23, 2018, 4:09:07 PM
to schedulix
Hi Wim,

I'm happy we're not the only ones that make little mistakes ;-)
But I actually don't understand why the directory name was printed twice.

Never mind. It's not the first time I notice that things work if you do everything correctly.

Regards,

Ronald

Ronald Jeninga

May 23, 2018, 4:19:29 PM
to schedulix
Hi Wim,

this time the parameter resolution behaved a little bit differently than I expected.
It is a very complex piece of code where the submit hierarchy, the folder hierarchy and the scope hierarchy are evaluated.
And sometimes it behaves a little differently from what I expect.

In case of name clashes the most specific value is returned (and from the job's perspective, a value within the ONTWIKKEL folder is more specific than a value within the folder where the parent resides).
But I must admit that I expected the definition of an IMPORT parameter at parent level to be regarded as more specific than a folder parameter.
Obviously I was wrong here.
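The resolution rule described here can be pictured with a simplified lookup model (an illustration only, not schedulix's actual resolver): scopes are searched from most specific to least, and the first scope that defines the parameter supplies the value:

```shell
#!/bin/bash
# Simplified model of "the most specific value wins" (an illustration,
# not schedulix's actual resolver): scopes are searched in order and the
# first one that defines the parameter supplies the value.

lookup() {
    name=$1; shift
    for scope in "$@"; do              # ordered most specific first
        var="${scope}_${name}"
        eval "val=\${${var}:-}"
        if [ -n "$val" ]; then
            echo "$val"
            return 0
        fi
    done
    echo "(undefined)"
}

# The job lives under ONTWIKKEL; the submitting batch was copied to UAT.
ONTWIKKEL_HOST="dummyhost"
UAT_HOST="uat01-proxy"

# The job's own folder chain is consulted before the parent's folder:
lookup HOST ONTWIKKEL UAT              # -> dummyhost (the surprise above)

# Removing the parameter from ONTWIKKEL lets the UAT value through:
unset ONTWIKKEL_HOST
lookup HOST ONTWIKKEL UAT              # -> uat01-proxy
```

Deleting the dummy value from the ONTWIKKEL folder, as Wim did, matches the second call: the next scope in line supplies the value.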

If you look at the ONTWIKKEL folder as a kind of library, it is acceptable that it shouldn't contain a configuration.

Regards,

Ronald