I'd be interested to see whether anyone else has tackled this problem in a
different way. I'd like to chain together a couple of shell and Perl
scripts, passing the output from one to the next. Using a JSON store
appears to be an option, but for me, writing a wrapper script would
probably be the easier route.
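Roughly, the wrapper I have in mind would look something like this (a
minimal sketch; the script names and the single argument are placeholders,
not anything from a real setup):

    #!/bin/sh
    # Run the first script, capture its stdout, and hand it to the next one.
    output1=$(./square_input.sh "$1") || exit 1
    ./replicate_value.pl "$output1"

The wrapper could then be the single command that a Rundeck job step runs,
so all of the output passing happens inside one step.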
-Paul
On Oct 18, 8:48 am, Soumyak Bhattacharyya <bhattacharyya.soum...@gmail.com> wrote:
> Hi Brian,
>
> This is just an idea for the problem statement you mentioned:
>
> Jobs can receive their parameters from a JSON store, i.e. a URL, say in
> the form of http://localhost:8080/jsonstore/options.json
>
> Now, as a last step of a job, if it is able to post its outcome to the
> JSON store, the following job can extract it from there.
>
> I must say that I have not experimented with this option myself, but I
> will try it and confirm.
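>
> Roughly, what I am imagining is something like this (untested; the
> jsonstore endpoint and the "output1" field name are only placeholders):
>
>     # Last step of Job 1: post the outcome so the next job can read it.
>     # $RESULT stands for whatever the step computed; it is a placeholder.
>     curl -s -X POST -H 'Content-Type: application/json' \
>          -d "{\"output1\": \"$RESULT\"}" \
>          http://localhost:8080/jsonstore/options.json
>
>     # First step of Job 2: pull the value back out (using jq here)
>     OUTPUT1=$(curl -s http://localhost:8080/jsonstore/options.json | jq -r '.output1')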
>
> Regards
> Soumyak
>
> On Friday, October 12, 2012 8:46:23 AM UTC+5:30, Brian Clowers wrote:
>
> > RunDeckers:
>
> > I am struggling with how to pass the output of one job on to another.
> > After searching the documentation and the forum using the terms I would
> > consider relevant, I still have not found an example that explains how
> > to make this happen, either via the command line or via the web UI. If
> > someone has a simple example, I would greatly appreciate the help.
>
> > Ideally, it would be nice to have the output of one job pass a string
> > variable pointing to another path for a continuation of execution.
> > Granted, this example could be done in a single job; I was looking for
> > something simple to demonstrate the principle.
>
> > For example, some pseudocode for a workflow would be:
> > User Input Option: user_option = 2
> > Job 1:
> >     Square Input:
> >         return ${user_option} * ${user_option} as output1
>
> > Job 2:
> >     Replicate New Value:
> >         for i in range(${output1}):
> >             print(i * i)
>
> > Please Help,
>
> > Brian