Executing Remote Function As Local.


apat...@gmail.com

Nov 21, 2008, 11:48:17 AM
to rpyc
Is there any way to execute a remote (exposed) function on a server as
if it were local to the client?

For example, let's suppose that I have a library with a lot of functions I
want to use in a lot of clients, but I don't want that code to live in
the clients; I want that code to be accessed and executed locally (in
the client) using RPyC.

Thanks in advance,

tomer filiba

Nov 23, 2008, 10:14:34 AM
to rp...@googlegroups.com
you would have to ask your question more specifically.
it's very hard to understand what you're trying to accomplish.

if you want to be able to execute a remote function (that "lives" on the server) from your client machines,
then see the tutorial on services (part 3 i believe)
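
for reference, here's a rough sketch of what such a service might look like (the service name, method and port below are made up for illustration, not taken from the tutorial):

# server side (e.g. a myservice_server.py) -- exposed_* methods become callable by clients
import rpyc
from rpyc.utils.server import ThreadedServer

class MyService(rpyc.Service):
    def exposed_add(self, x, y):
        return x + y

ThreadedServer(MyService, port=18861).start()

# client side -- the call runs on the server
import rpyc
conn = rpyc.connect("server-host", 18861)
print(conn.root.add(1, 2))    # -> 3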


-tomer
--
An NCO and a Gentleman

Asa Ayers

Nov 29, 2008, 12:20:51 PM
to rp...@googlegroups.com
Am I understanding you correctly that you want to have all of the code on the server, but when it's executed it needs to run on the client's hardware? The server just needs to distribute the program without allowing users access to the code? The API documents mention upload, upload_package, and update_module; I'm not sure how to use them, but from the descriptions it sounds to me like you want the server to send the compiled code to the client.
--
Always remember: You're unique, just like everybody else.

tomer filiba

Nov 30, 2008, 7:17:11 AM
to rp...@googlegroups.com
rpyc.remoting is used to copy files and modules between servers. if you have a farm of computers running
the classic server, you can thus deploy modules and packages between them with ease.

for example,
import rpyc
import my_local_module

c = rpyc.classic.connect("foobar")
rpyc.classic.upload_package(c, my_local_module)

which will copy the package/module my_local_module to the server.
note: i haven't tested the remoting utils, they might have some silly bugs/name errors

another thing is the execute method, which executes code on the other side

# this code will be executed on the server
c.execute("""
def foobar(x,y):
    return x+y
""")

# now, from the client you can invoke it like so:
c.namespace["foobar"](1,2)


-tomer

tomer filiba

Nov 30, 2008, 3:37:06 PM
to rpyc
okay, i fixed some bugs in the remoting utilities.
btw, it's rpyc.classic.upload/etc.

the fixes are in bugfix release 3.01, along with more thorough tests


-tomer

Jamie Kirkpatrick

Nov 30, 2008, 3:57:18 PM
to rp...@googlegroups.com
I found one issue when doing this myself, which led me to write my own remoting utils with a little more control, but while we are on the subject I could make a couple of suggestions:

• It would be good to be able to configure the remote directory where code is uploaded on the server.  In my case I wanted a temporary directory to be set up each time and for that path to be added to sys.path.  Perhaps just offering a way to pass a directory name to these functions would help.
• I also implemented some more advanced logic that could work out whether a given module had changed by comparing the MD5 sums of each file involved, and only upload the changed files.
• It would be interesting if the modulefinder code in Python could be reused to find all dependent code (within reason... hidden imports can't be coped with, etc.) to allow a seamless and easy way to upload a script and have all its deps uploaded automatically.  This seems like quite a common use case.
Thoughts appreciated.




--
Jamie Kirkpatrick
07818 422311

Tal Einat

Nov 30, 2008, 5:48:23 PM
to rp...@googlegroups.com

If done well, these would be great!

The above would have made a past project I wrote using RPyC much, much simpler.

- Tal

tomer filiba

Dec 1, 2008, 9:30:15 AM
to rp...@googlegroups.com
well, it is all nice, but that's not related to rpyc -- it's called rsync -- and i don't see a reason to reinvent the wheel here. the main reason i hate twisted (i love the idea, but it's a horrible piece of code) is its bloatedness -- rpyc will remain slim and to-the-point.

i do agree that it would be better to use a temp dir and add it to sys.path instead of writing to site-packages... but writing a dependency-solver that traverses imports in a dynamic language like python -- that sounds too much.

instead of pushing everything into the library, why not have external tools? for instance,
* rpysync - like rsync over rpyc
* rpydist - distribution of python packages over rpyc
and remove upload()/download(), etc. from the library altogether.

of course we'll publish these tools on the site so if users want to use a certain functionality, they'll just download the "add-on" they want.

btw, if anyone's interested, i tried to use rpyc-over-ssh. setting up ssh tunnels is a headache you must deal with every time. tlslite et al are cute, but you have to mess with certificates and all that overhead. it seems a waste to redo what ssh already does for you.

i thought of running the classic_server over stdio, something like
p = Popen("ssh %s classic_server.py -m std")
and then working with p.stdin/p.stdout, but it didn't seem to work well. if someone wants to take it on as a pet project, it could be cool.
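
a very rough sketch of the direction i mean (the server script name, the "-m stdio" flag and the commented-out stream helpers are all assumptions -- the stream wiring is exactly the part that still needs figuring out):

from subprocess import Popen, PIPE

# launch a classic server on the remote box, talking over ssh's stdin/stdout
proc = Popen(["ssh", "somehost", "python", "classic_server.py", "-m", "stdio"],
             stdin=PIPE, stdout=PIPE)

# the missing piece: wrap proc.stdout/proc.stdin in an rpyc stream and build a
# connection from it, e.g. (if such helpers exist in your rpyc version):
# stream = rpyc.core.stream.PipeStream(proc.stdout, proc.stdin)
# conn = rpyc.classic.connect_stream(stream)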


-tomer

Jamie Kirkpatrick

Dec 1, 2008, 9:44:40 AM
to rp...@googlegroups.com
Tomer


Does this really add more bloat?  It's just another "util" (which you already provide for) and it's totally optional.  I would not suggest you write a dependency-solver at all: I would suggest that you leverage the built-in dependency-solver code to your advantage.  I would want this routine to be a max of 20 lines or so (which I think should be possible), so not bloated; and as I said, and others subsequently agreed, it would be a very useful addition and a common use case.

In the end it's about preventing each of us from reinventing the wheel.  If every user of rpyc in "classic" mode ends up doing this themselves, I would say that's a failure on the part of the project: it should let people do what they most commonly want to do with such a tool, and I think if you polled people they would probably want to be able to throw a script at a remote server and have it run without having to do anything complex.

Dunno, but perhaps open it up for a vote?  See what the community feels about it...

Jamie Kirkpatrick

Dec 1, 2008, 10:00:18 AM
to rp...@googlegroups.com
The module you need for this is here:

http://docs.python.org/library/modulefinder.html#module-modulefinder

As I say, in two lines you can find out what files are required (excluding hidden imports as previously mentioned).  Perhaps you might reconsider based on this...
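
A quick sketch of what I mean (the script path is just an example):

import modulefinder

finder = modulefinder.ModuleFinder()
finder.run_script("my_script.py")      # the script you want to deploy
# finder.modules maps module names to Module objects; __file__ is None for built-ins
needed_files = [m.__file__ for m in finder.modules.values() if m.__file__]
print(needed_files)                    # these are the files that would need uploading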

apat...@gmail.com

Dec 2, 2008, 8:19:03 PM
to rpyc
That's exactly what I meant! Is it possible to do that with the
"obtain" function of rpyc?

I was trying to use the obtain function, and also did some tests with
the copy function, to copy remote objects as local ones, but I always
got the following error:

conn = rpyc.classic.connect('10.200.0.120')
In [19]: conn.modules.loads
Out[19]: <module 'loads' from '/root/reports/loads.py'>  (loads is the
module I want to execute locally)
In [20]: local_loads = rpyc.classic.obtain(conn.modules.loads)

TypeError: can't pickle module objects

And the same happens with copy():

import copy
local_loads = copy.copy(conn.modules.loads)

TypeError: can't pickle module objects


Thanks


tomer filiba

Dec 3, 2008, 8:41:37 AM
to rp...@googlegroups.com
what you are asking for is called deployment, and obtain/deliver are not what you need. pickle doesn't support serializing code/modules/classes.

you need to use the upload/upload_package functions of rpyc.classic to move the code over to the other side.

on the other hand, rpyc has been designed so that this wouldn't be necessary: you can access the entire remote machine as if it were local, using remote modules like os and remote builtins like open, etc. this approach is better than deploying code, because it's stateless and doesn't require copying files around -- the "application logic" sits on the client, while the server runs it.

however, there are two reasons you might want to use deployment:
* performance - for example, iterating over a large iterator (e.g., a DB query result, a long list, etc.) locally is much faster than iterating over it from the client, because it avoids the network overhead of every iteration (calling .next() on the remote object). although if this is your problem, see rpyc.buffiter -- a small sketch follows below
* compiled/platform-specific modules - not all code can be pure python -- sometimes it's compiled extension modules, etc., that require direct OS interfaces, for example a binding library for a driver/program. in this case you'll have to upload the code to the remote machine, where it would run natively.
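
a quick buffiter sketch (assuming a classic connection; the remote list here is just a stand-in for a big query result):

import rpyc

c = rpyc.classic.connect("somehost")
big = c.eval("range(100000)")      # a netref to a large list living on the server
total = 0
# plain iteration would cost one network round-trip per item;
# buffiter fetches the items in batches instead
for item in rpyc.buffiter(big):
    total += item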

hope this helps,
-tomer

apat...@gmail.com

Dec 3, 2008, 9:02:50 AM
to rpyc
Ok, just a question now: in what cases do you use the obtain function
of utils in rpyc?
Could you give me a small real-world example? I haven't seen any piece
of code on the page or on Google that uses the obtain function, and it
seems I'm using that function the wrong way.

I just want to have all the pieces of code on the server, and minimal
client code to obtain those functions/objects from the server and
execute them locally. That way, if I want to make some modification to
those functions, I can modify them on the server and the clients just
download them every time they need to execute them. And I need the
CLIENT to connect to the SERVER and download those functions, because
I have a pool of devices working behind firewalls and I have no
control over those firewalls to open a bi-directional connection.

But it would be great to see a small piece of code that uses the
obtain function, so I can tell whether it's what I need or not.

The upload_package example I saw in this thread uploads a piece of
code from the client to the server, and what I need is to download a
piece of code from the server TO the client.

I'm truly sorry if I misspell some words; English is not my native
language.

Thanks in advance again for your support. I did recommend rpyc to a
lot of friends of mine; I think it's a great RPC solution for Python --
the best and yet the simplest.

tomer filiba

Dec 3, 2008, 9:21:39 AM
to rp...@googlegroups.com
obtain() pickles a remote object and unpickles it locally. you basically get a copy of the remote object instead of a reference to it:

>>> c.modules.sys.path
['', '/home/tomer/p4client/tools/pypackages', ..... ]
>>> type(c.modules.sys.path)  # this is a netref
<netref class '__builtin__.list'> 
>>> x = rpyc.classic.obtain(c.modules.sys.path)
>>> x 
['', '/home/tomer/p4client/tools/pypackages', ..... ]
>>> type(x) # x is a real list, a copy of the remote one
<type 'list'>
>>>

changes made to x will not propagate to the remote list, as x is only a copy of the original list.


hope this helps
-tomer

Tal Einat

Dec 3, 2008, 9:29:10 AM
to rp...@googlegroups.com

If I'm not mistaken, when Tomer says that changes will not propagate to the remote list, he is talking about changes made to 'x' in his example, i.e. the obtained object, which is an actual local list and not a netref.

This, as opposed to changes made through the netref itself (c.modules.sys.path in his example); such changes will actually be made on the real list which resides on the remote host.
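
A tiny illustration of the difference (assuming a classic connection c; note that the first append really does modify the server's sys.path):

import rpyc

c = rpyc.classic.connect("somehost")
remote_path = c.modules.sys.path                 # a netref to the server's list
local_copy = rpyc.classic.obtain(remote_path)    # a plain local list

remote_path.append("/tmp/added-via-netref")      # happens on the server
local_copy.append("/tmp/added-locally")          # stays on the client only

print("/tmp/added-via-netref" in c.modules.sys.path)   # True
print("/tmp/added-locally" in c.modules.sys.path)      # False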

- Tal

apat...@gmail.com

Dec 3, 2008, 9:33:54 AM
to rpyc
Excellent, this way I can bring remote functions to me and execute
them locally!

For example:

>>> import rpyc
>>> conn=rpyc.classic.connect('10.200.0.120')
>>> a=rpyc.classic.obtain(conn.modules.os.listdir)
>>> a("/")
Local directory root listing .....

I tried this, and I can list the local directory using the listdir
function of the remote os module, executed locally!
I think I can do the same for the remote functions of my own objects,
can't I?

I will give it a try...

It seems to work OK for functions like os.listdir, but not for the
whole os module; it fails if I try to do:

a = rpyc.classic.obtain(conn.modules.os)

But with functions I'm OK for now.

Thanks again

Timothy Baldridge

Dec 3, 2008, 9:48:42 AM
to rp...@googlegroups.com
If I'm not mistaken I think our reader wants to do something like MMO
games do. You connect to a remote server, check for patches, then load
the game. If this is what you're looking for, then it should be rather
easy.

Python will check the local directory whenever it wants to load a
module. So you can connect to the server, search for .py files that
are newer than your local versions, download them (read remote, write
local), and bingo.
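
A rough sketch of that flow over a classic connection (the directory names are made up, and it assumes the shared modules live in one flat directory on the server):

import os
import rpyc

conn = rpyc.classic.connect("server-host")
remote_os = conn.modules.os

REMOTE_DIR = "/srv/shared_code"     # where the server keeps the modules (example)
LOCAL_DIR = "./cache"               # local cache dir, assumed to be on sys.path

if not os.path.exists(LOCAL_DIR):
    os.makedirs(LOCAL_DIR)

for name in remote_os.listdir(REMOTE_DIR):
    if not name.endswith(".py"):
        continue
    rpath = REMOTE_DIR + "/" + name
    lpath = os.path.join(LOCAL_DIR, name)
    # copy only files that are newer on the server than in the local cache
    if not os.path.exists(lpath) or remote_os.path.getmtime(rpath) > os.path.getmtime(lpath):
        data = conn.eval("open(%r, 'rb').read()" % rpath)   # read remote
        open(lpath, "wb").write(data)                        # write local

# with LOCAL_DIR on sys.path, a plain "import some_module" now picks up the fresh copy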

As a side note, Python creates compiled bytecode files (.pyc) of all
modules it loads. These files are platform-independent but tied to the
Python version, so you could ship the .pyc files instead of the .py
files to the client machines, as long as the clients run the same
Python version as the server.

Timothy


--
Two wrights don't make a rong, they make an airplane. Or bicycles.

apat...@gmail.com

unread,
Dec 3, 2008, 9:59:46 AM12/3/08
to rpyc
Yeah, kind of, but I just wanted to know if I could do this with rpyc.

Jamie Kirkpatrick

Dec 3, 2008, 10:01:50 AM
to rp...@googlegroups.com
Modules cannot be pickled: that is, they cannot be serialized and sent in this manner.  I think the option to send modules to the server would be very useful, and I raised a request for improved code-uploading functions the other day; I don't think Tomer was too keen on including these in the core library though.
He suggested that we could perhaps start an optional utilities library for users who want this kind of functionality.  I think it should exist somewhere and I'm keen to see it happen: I'll bet that if you surveyed the users of this library, a good few of us have implemented things like this already.

Timothy Baldridge

Dec 3, 2008, 10:29:21 AM
to rp...@googlegroups.com
Perhaps what we need then is a lib split like OpenGL's. In OGL land we
have gl.h, which contains the functions of the main lib, and glu.h,
which contains "utility" functions that expand upon the core GL.

Something I'd like to see in "rpycu" is the ability to launch classic
servers with communication via popen. Something like

import rpycu
rpycu.classic.server_per_cpu()
sub_cpu1 = rpycu.classic.cpus[0]
sub_cpu1.modules.foo()
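
A rough approximation with what exists today (the classic server script name and its "-p" port flag are assumptions about its command line, and cpu_count needs Python 2.6+):

import subprocess
import sys
import time
from multiprocessing import cpu_count

import rpyc

ports = []
for i in range(cpu_count()):
    port = 19000 + i                      # arbitrary base port
    # assumption: the classic server script accepts -p <port>
    subprocess.Popen([sys.executable, "classic_server.py", "-p", str(port)])
    ports.append(port)

time.sleep(1)                             # crude: let the servers start listening
cpus = [rpyc.classic.connect("localhost", p) for p in ports]
# cpus[0].modules.math.sqrt(2) would now run inside the first server process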

tomer filiba

Dec 3, 2008, 5:54:26 PM
to rp...@googlegroups.com
On Wed, Dec 3, 2008 at 16:33, apat...@gmail.com <apat...@gmail.com> wrote:

> Excelent, this way I can bring remote functions to me and execute them
> locally !
>
> For example:
>
> >>> import rpyc
> >>> conn=rpyc.classic.connect('10.200.0.120')
> >>> a=rpyc.classic.obtain(conn.modules.os.listdir)

no, you've got it all wrong. obtain creates a copy of the remote object in your local process using pickle. it turns out that pickle on functions just saves the function's name and module, and unpickling it just looks it up in your interpreter's modules. so in this case, what you've got in `a` is not the remote function -- it's the local function!

as i said before, obtain and deliver will not move code/functions/classes/modules. pickle doesn't support that. what you want is called deployment and is done by upload() and download(). i hope this settles it.
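
for the original question (getting code from the server down to the client), here's a sketch using download -- assuming download takes (conn, remote_path, local_path); the temp-dir approach and the function name at the end are just for illustration:

import os
import sys
import tempfile
import rpyc

conn = rpyc.classic.connect("10.200.0.120")

tmpdir = tempfile.mkdtemp()
sys.path.insert(0, tmpdir)                    # make the downloaded code importable

remote_file = conn.modules.loads.__file__     # e.g. '/root/reports/loads.py'
if remote_file.endswith(".pyc"):
    remote_file = remote_file[:-1]            # prefer the .py source
rpyc.classic.download(conn, remote_file, os.path.join(tmpdir, "loads.py"))

import loads                                  # now a genuinely local module
loads.some_function()                         # runs entirely on the client (made-up name)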


On Wed, Dec 3, 2008 at 17:01, Jamie Kirkpatrick <j...@kirkconsulting.co.uk> wrote:
> I think the option to send modules to the server would be very useful and raised a request for improved code uploading functions the other day:  I don't think Tomer was too keen on including these in the core library though.

seeing this is a repeating pattern, i will add this to the next release. i'm not going to implement the whole of rsync, but it will allow you to upload code to a temporary directory where it would be importable. something like

import my_lovely_module
rpyc.classic.deploy(conn, my_lovely_module)

and then
conn.modules.my_lovely_module

or perhaps
conn.deployed.my_lovely_module

if you need more complex behavior (i.e., uploading only changed files, etc.), i really do think that an (external) tool like rsync is better at this job... simply invoke it from your python process upon startup.
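
e.g., something like this on client startup (host, paths and the remote temp dir are placeholders; it assumes rsync/ssh access to the server and that /tmp/rpyc_deploy already exists there):

import subprocess
import rpyc

# push the package with plain rsync-over-ssh before connecting
subprocess.check_call(["rsync", "-az", "my_lovely_module/",
                       "user@foobar:/tmp/rpyc_deploy/my_lovely_module/"])

conn = rpyc.classic.connect("foobar")
conn.modules.sys.path.append("/tmp/rpyc_deploy")   # make it importable on the server
conn.modules.my_lovely_module                      # imported and now usable remotely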


On Wed, Dec 3, 2008 at 17:29, Timothy Baldridge <tbald...@gmail.com> wrote:
> Perhaps what we need then is a lib like OpenGL does. In OGL land we have gl.h that contains the functions for the main lib, and glu.h that contains "utility" functions, that expand uppon the core GL.
> Something I'd like to see in "rpycu" is the ability to load classic servers with communication via popen. Something like

yes, this seems like a great idea. i'd love to see community-inspired projects. you can all collaborate on project ideas on the wiki at:

http://rpyc.wikidot.com/community:projects



-tomer

yai...@gmail.com

Dec 3, 2008, 8:49:31 PM
to rpyc
modulefinder - thanks for referring me to this module, Jamie.
it will be a great help for my scons build script.
thanks!
