That is, with Python running in the background, I want to send one batch
of commands to Python and read the result back, send another batch and
read the result, and so on. Python should not terminate between
batches, because I want to access data from previous calls.
I've tried something like
python <fifo.in >fifo.out &
echo "print 2.0+3.0" > fifo.in
cat fifo.out
I get the result, but Python terminates as soon as EOF is reached in the
<stdin> stream.
My embedded Python will stay alive between batches. But this solves
the problem only for Python. Any insight into solving this problem for any
external program (i.e. Perl, Ruby, etc.) would be greatly appreciated.
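The EOF behavior described above can also be worked around without fifos, by keeping one child process alive and only closing its stdin after the last batch. A hedged Python 3 sketch (the one-command-per-line protocol and the child loop are assumptions for illustration, not part of the original setup):

```python
import subprocess
import sys

# The child stays alive until stdin hits EOF; it exec()s one command
# per line into a persistent namespace, so state survives between commands.
CHILD = r"""
import sys
ns = {}
for line in sys.stdin:
    exec(line, ns)
    sys.stdout.flush()
"""

proc = subprocess.Popen([sys.executable, "-u", "-c", CHILD],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                        text=True)
proc.stdin.write("import math\n")        # batch 1: create state
proc.stdin.write("print(math.pi)\n")     # batch 2: state is still there
proc.stdin.close()                       # only now does the child see EOF
output = proc.stdout.read().strip()
proc.wait()
print(output)
```

The point is the same as with the fifos: the child terminates at EOF, so the trick is to defer EOF (or, as below, to re-open the input) between batches.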
--
William Park, Open Geometry Consulting, <openge...@yahoo.ca>
Linux solution for data management and processing.
Sounds very interesting :-)
-gustavo
Instead of running -just- Python, run a Python program. It can attach
itself to a couple fifos (or a unix socket, or a tcp socket, or ...), read
until EOF, process the string (easy, with exec/eval), write the results out,
then re-open the input file.
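The read-exec-reopen loop works because exec can run every batch in the same dictionary, so names survive between batches. A minimal Python 3 sketch of just that piece (the `ns` dict stands in for the long-running process's state; names here are illustrative):

```python
# State survives between exec() calls as long as they share a namespace.
ns = {}
exec("import math", ns)           # batch 1: create some state
exec("result = math.pi * 2", ns)  # batch 2: still sees `math`
print(ns["result"])
```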
There's also a project that keeps a Python interpreter running and lets
you use a mini-interpreter (for the #!) to connect to it, but I forget the
name. It was mentioned on this group a week or two ago, so maybe you can
find it w/ a little searching.
Jp
--
It is practically impossible to teach good programming style to
students that have had prior exposure to BASIC: as potential
programmers they are mentally mutilated beyond hope of
regeneration. -- Dijkstra
Well, if you are asking about my extension of Bash with Python
embedded... The syntax mirrors normal Python, i.e.
embeddedpython scriptfile
embeddedpython -c "string"
except that Python doesn't terminate between calls, so you can access
data from previous calls.
For example,
embeddedpython -c "import math"
embeddedpython -c "print math.pi" --> 3.14159265359
If file 'aaa' has content,
A = [1,2]
B = ['a','b']
Then, you can do
embeddedpython aaa
embeddedpython -c "print math.pi, A+B" --> 3.14159265359 [1, 2, 'a', 'b']
With the normal Python, each call is a separate invocation. With embedded
Python, however, each call is a continuation of the previous calls, so
you get access to previous data.
Interestingly, if you run Python in a subshell, then any modifications
don't percolate back up to the parent Python. This is normal for a
shell, though. For example,
( embeddedpython -c "B=[11,12]"
embeddedpython -c "print A, B" ) --> [1, 2] [11, 12]
But, back in the main shell,
embeddedpython -c "print A, B" --> [1, 2] ['a', 'b']
Thanks Jp. I overlooked the 'exec' function in Python.
I was initially thinking about writing a C wrapper program, which would
read the batch commands from 'stdin' (redirected from a FIFO), pass them to
PyRun_SimpleString(), and repeat. The result would be in 'stdout'
(redirected to a FIFO), and reading it would be up to the calling program.
But, somehow, I ended up with full-blown embedded Python within Bash
shell. :-)
>
> There's also a project that keeps a Python interpreter running and
> lets you use a mini-interpreter (for the #!) to connect to it, but I
> forget the name. It was mentioned on this group a week or two ago, so
> maybe you can find it w/ a little searching.
--
Finally found 'exec' solution, thanks to Jp. I've come up with the
following:
import sys

fifo_in = sys.argv[1]
fifo_out = sys.argv[2]

while 1:
    # Block until the other side opens the fifos for this batch.
    fin = open(fifo_in, "r")
    fout = open(fifo_out, "w")
    sys.stdout = fout    # 'print' statements go to the out-fifo
    exec fin             # run the batch; names persist across iterations
    sys.stdout.flush()
    fout.close()
    fin.close()
If this is called 'coprocess.py', then you can do
mkfifo in out
python coprocess.py in out &
echo "print 1.0+2.0" > in
cat out
echo "import math" > in
cat out
echo "print math.pi" > in
cat out
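For modern Python, exec is a function and print needs parentheses, so the loop above doesn't run as-is. A Python 3 adaptation of the same idea, demonstrated end-to-end with fifos in a temporary directory and a thread standing in for the background coprocess (function and variable names are illustrative, not from the original):

```python
import contextlib
import os
import tempfile
import threading

def serve_once(fifo_in, fifo_out, namespace):
    # One batch: read the commands, exec them with stdout redirected to
    # the out-fifo; `namespace` keeps state alive between batches.
    with open(fifo_in) as fin, open(fifo_out, "w") as fout:
        source = fin.read()
        with contextlib.redirect_stdout(fout):
            exec(source, namespace)

tmp = tempfile.mkdtemp()
fin_path = os.path.join(tmp, "in")
fout_path = os.path.join(tmp, "out")
os.mkfifo(fin_path)
os.mkfifo(fout_path)

ns = {}
def server():
    serve_once(fin_path, fout_path, ns)   # batch 1
    serve_once(fin_path, fout_path, ns)   # batch 2

t = threading.Thread(target=server, daemon=True)
t.start()

# Batch 1: create some state ...
with open(fin_path, "w") as f:
    f.write("import math\n")
with open(fout_path) as f:
    f.read()
# Batch 2: ... and it is still visible
with open(fin_path, "w") as f:
    f.write("print(math.pi)\n")
with open(fout_path) as f:
    output = f.read().strip()
t.join()
print(output)
```

As in the shell transcript, each side must read the out-fifo after every batch, since the server blocks opening it for writing until a reader appears.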
Another good candidate for our dear Python Cookbook?
--
Carlos Ribeiro
crib...@mail.inet.com.br
That's IPython:
http://www-hep.colorado.edu/~fperez/ipython/
Very nice. :^)