If done well, these would be great!
The above would have made a past project I wrote using RPyC much, much simpler.
- Tal
well, it is all nice, but that's not related to rpyc -- it's called rsync -- and i don't see a reason to reinvent the wheel here. the main reason i hate twisted (i love the idea, but it's a horrible piece of code) is its bloatedness -- rpyc will remain slim and to-the-point.
i do agree that it would be better to use a temp dir and add it to sys.path instead of writing to site-packages... but writing a dependency solver that traverses imports in a dynamic language like python -- that sounds like too much.
instead of pushing everything into the library, why not have external tools? for instance,
* rpysync - like rsync over rpyc
* rpydist - distribution of python packages over rpyc
and remove upload()/download(), etc. from the library altogether.
of course we'll publish these tools on the site so if users want to use a certain functionality, they'll just download the "add-on" they want.
obtain() pickles a remote object and unpickles it locally. you basically get a copy of the remote object instead of a reference to it:
>>> c.modules.sys.path
['', '/home/tomer/p4client/tools/pypackages', ..... ]
>>> type(c.modules.sys.path) # this is a netref
<netref class '__builtin__.list'>
>>> x = rpyc.classic.obtain(c.modules.sys.path)
>>> x
['', '/home/tomer/p4client/tools/pypackages', ..... ]
>>> type(x) # x is a real list, a copy of the remote one
<type 'list'>
>>>
changes made to x will not propagate to the remote list, as x is only a copy of the original list.
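Since obtain() is essentially a pickle round-trip, the copy semantics can be demonstrated locally. This sketch substitutes a plain local pickle round-trip for an actual rpyc connection:

```python
import pickle

original = ['', '/home/tomer/p4client/tools/pypackages']

# obtain() pickles on the remote side and unpickles locally;
# a local pickle round-trip has the same copy semantics.
copy = pickle.loads(pickle.dumps(original))

copy.append('/tmp/extra')  # mutate the copy only
print(original)            # the original list is unchanged
print(copy)                # the copy has the extra entry
```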
Python searches the directories on sys.path (which typically includes the local directory) whenever it loads a module. So you can connect to the server, find the .py files that are newer than your local copies, download them (read remote, write local), and bingo.
As a side note, Python compiles every module it loads to bytecode (.pyc). These files are cross-platform, but they are tied to the Python version that produced them, so you can only substitute .pyc files for .py files on client machines running the same interpreter version.
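For reference, the stdlib can produce those bytecode files explicitly rather than waiting for an import. A small sketch (the module name and contents are made up; remember the resulting .pyc is interpreter-version specific):

```python
import os
import py_compile
import tempfile

# write a trivial module, then byte-compile it explicitly
src = os.path.join(tempfile.mkdtemp(), 'mymod.py')
with open(src, 'w') as f:
    f.write('ANSWER = 42\n')

# py_compile.compile returns the path of the generated .pyc;
# cfile= places it next to the source instead of __pycache__
pyc_path = py_compile.compile(src, cfile=src + 'c')
print(pyc_path)
```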
Timothy
--
Two wrights don't make a rong, they make an airplane. Or bicycles.
Something I'd like to see in "rpycu" is the ability to load classic
servers with communication via popen. Something like
import rpycu
rpycu.classic.server_per_cpu()
sub_cpu1 = rpycu.classic.cpus[0]
sub_cpu1.modules.foo()
Excellent, this way I can bring remote functions over and execute them locally!
For example:
>>> import rpyc
>>> conn = rpyc.classic.connect('10.200.0.120')
>>> a = rpyc.classic.obtain(conn.modules.os.listdir)
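One caveat worth noting: pickle serializes a module-level function by its qualified name, not its code, so obtain() on conn.modules.os.listdir hands back your own local os.listdir (which lists the local filesystem), rather than shipping the remote function's implementation over. The same pickle behavior can be seen locally:

```python
import os
import pickle

# module-level functions pickle by reference (their qualified name),
# so the round-trip yields the importing side's own function object
f = pickle.loads(pickle.dumps(os.listdir))
print(f is os.listdir)  # the very same local function
```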
I think the option to send modules to the server would be very useful; I raised a request for improved code-uploading functions the other day, but I don't think Tomer was too keen on including these in the core library.
Perhaps what we need, then, is a split like OpenGL's. In OGL land we have gl.h, which contains the functions of the core library, and glu.h, which contains "utility" functions that expand upon the core GL.