>>> run("cmd")
were "cmd" is a command with its arguments to pass them to the shell
and run it, i.e.
>>> run("pwd")
or
>>> run("ls /home")
Does anybody know of a library that would help me avoid the quotes and
brackets?
I would like to use something like:
$ ls /home => run("ls /home")
or, at least
run pwd => run("pwd")
Check the docs on os.system().
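A minimal run() along those lines could be little more than this (a sketch;
the subprocess module is the more flexible alternative):

import os

def run(cmd):
    # Pass the string straight to the shell; returns the exit status.
    return os.system(cmd)

run("ls /home")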
--
Josh "dutchie" Holland <j...@joshh.co.uk>
http://www.joshh.co.uk/
http://twitter.com/jshholland
http://identi.ca/jshholland
So rewrite the script to read the commands from a file and execute them
one by one?
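Something like this, say (just a sketch, assuming run() is your existing
wrapper and the file holds one command per line):

def run_script(path):
    # Execute the commands in 'path', one per line, via the run() wrapper.
    with open(path) as f:
        for line in f:
            cmd = line.strip()
            if cmd and not cmd.startswith('#'):
                run(cmd)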
regards
Steve
--
Steve Holden +1 571 484 6266 +1 800 494 3119
PyCon is coming! Atlanta, Feb 2010 http://us.pycon.org/
Holden Web LLC http://www.holdenweb.com/
UPCOMING EVENTS: http://holdenweb.eventbrite.com/
I'm suspicious of your original intent. Essentially, you want to write
code in which a literal string, such as ...
ls -l
... is *not* enclosed in quotes. Why run the risk of creating confusion
(in other people who look at your code, in syntax-checking tools, etc.)
between variables and literals?
But I'm in sympathy with your desire to make the code as clean as
possible and to minimize the number of times you have to type a quote
character. My suggestions:
1. Create a function (say, "Run") that encapsulates as much of the
syntax as possible: os.system(), subprocess.call(), string-splitting,
whatever. So an invocation would look like this:
Run("ls -l *.txt")
(I think you've already done this step.)
2. Find a text editor that supports keyboard macros, so that a single
keystroke turns this text line:
ls -l *.txt
... into this one:
Run("ls -l *.txt")
HTH,
John
>
> But I'm in sympathy with your desire to make the code as clean as
> possible and to minimize the number of times you have to type a quote
> character. My suggestions:
>
> 1. Create a function (say, "Run") that encapsulates as much of the
> syntax as possible: os.system(), subprocess.call(), string-splitting,
> whatever. So an invocation would look like this:
>
> Run("ls -l *.txt")
>
> (I think you've already done this step.)
Yes, I made a very handy function for calling system commands; it works
well with pipes and passes variables (e.g. "LANG=C grep -e 'foo' /home").
> 2. Find a text editor that supports keyboard macros, so that a single
> keystroke turns this text line:
>
> ls -l *.txt
>
> ... into this one:
>
> Run("ls -l *.txt")
This is not what I'm looking for. I suppose this could be solved with a
DSL or a macro library; is there any?
Python is not perl.
Thank God/Guido.
I can't see you avoiding quotes etc, but extending on John's comment,
the obvious next step would be to run everything in a loop i.e. place
all the commands into a list and create a loop that ran each command
in the list.
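Something like this (sketch, reusing the run() wrapper discussed above):

commands = [
    "pwd",
    "ls /home",
    "LANG=C grep -e 'foo' /home",
]
for cmd in commands:
    run(cmd)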
Almost all editors support macros - most editors support some form of
language sensitive editing (NOT the prompt call parameters style but
rather help with the syntax via a 'form' style of fill-in) that will
allow you to reduce typing effort. But macros would be the first and
easiest choice for this activity.
Peter
> Almost all editors support macros - most editors support some form of
> language sensitive editing (NOT the prompt call parameters style but
> rather help with the syntax via a 'form' style of fill-in) that will
> allow you to reduce typing effort. But macros would be the first and
> easiest choice for this activity.
The goal of my program is to substitute for bash scripts, so editor macros
are irrelevant here.
I think the best solution I have is to build a program that parses the
script and converts *$ command* into run("command") before it is handed to
Python.
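A rough sketch of such a preprocessor (hypothetical; commands containing
quotes would need more care):

import re

_CMD = re.compile(r'^(\s*)\$ (.+)$')

def preprocess(source):
    # Rewrite lines of the form "$ command" as run("command").
    out = []
    for line in source.splitlines():
        m = _CMD.match(line)
        if m:
            indent, cmd = m.groups()
            out.append('%srun(%r)' % (indent, cmd))
        else:
            out.append(line)
    return '\n'.join(out)

The rewritten text could then be exec'd or saved as an ordinary .py file.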
def speed(float dist, float time):
    return dist/time
Then the compiler would generate code to first check the parameter types (or even do some casts where appropriate, say cast an int into a float) at the beginning of this function, and the rest of the function would then be compiled on the assumption that 'dist' and 'time' are of type float.
Of course, the dynamically-typed behaviour stays the same as before. Python is well known for providing multiple programming paradigms; I wonder if we could sneak this one in nicely too.
Any thoughts?
There are various attempts to achieve this.
The most generic one, and the most promising in the long run, is PyPy,
the implementation of Python in itself, with the added benefit of making
code generators that emit e.g. C based on Python code.
Then there is Cython, which blends Python with C & integrates very nicely.
Last but not least, for your actual example, psyco is the easiest thing
to use; it's a JIT aimed especially at optimizing numeric operations like
the one you present.
Diez
How about this?
def pwd(): return run("pwd")
pwd()
def ls(l=False, files=()):
    # Build the argument list, then hand it to the run() wrapper.
    args = []
    if l:
        args.insert(0, '-l')
    args.extend(files)
    return run("ls", args)

ls(l=True, files=("/foo",))
Python already has the /syntax/, e.g.
>>> def speed( dist: float, time: float ) -> float:
... return dist/time
...
>>> print( speed.__annotations__ )
{'dist': <class 'float'>, 'return': <class 'float'>, 'time': <class 'float'>}
>>> _
However, this syntax, while exploitable, is by default nothing but an annotation
device, like doc strings.
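They can be put to work at run time, though. A toy sketch (positional
arguments only, and no speed gain, it just checks):

import functools

def check_annotations(f):
    # Enforce parameter annotations when the function is called.
    hints = f.__annotations__
    names = f.__code__.co_varnames[:f.__code__.co_argcount]

    @functools.wraps(f)
    def wrapper(*args):
        for name, value in zip(names, args):
            expected = hints.get(name)
            if expected is not None and not isinstance(value, expected):
                raise TypeError("%s should be %s, got %r"
                                % (name, expected.__name__, value))
        return f(*args)
    return wrapper

@check_annotations
def speed(dist: float, time: float) -> float:
    return dist / time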
I'm not sure I like your idea of introducing static typing to increase speed,
but it could be done without introducing new syntax simply by defining a special
meaning to such annotation expressions that are 'type' invocations, say, then like
def speed( dist: type( float ), time: type( float ) ) -> type( float )
Since there are umpteen projects to increase speed of Python this idea may
already have been explored...
Cheers & hth.,
- Alf (who has some other ideas)
> I'm not sure I like your idea of introducing static typing to increase
> speed, but it could be done without introducing new syntax simply by
> defining a special meaning to such annotation expressions that are
> 'type' invocations, say, then like
>
> def speed( dist: type( float ), time: type( float ) ) -> type(
> float )
>
That would be particularly useless:
>>> type(float) is type
True
So your declaration is identical to:
def speed(dist: type, time: type) -> type:
Much better just to stick to something like:
def speed( dist: float, time: float ) -> float:
where at least you can tell from the annotations what types were actually
used.
That's the point.
> Much better just to stick to something like:
>
> def speed( dist: float, time: float ) -> float:
>
> where at least you can tell from the annotations what types were actually
> used.
No, you do not want to redefine the meaning of existing (actually used) constructs.
That would be particularly useless, to use your own words. :-)
That would mean making a function for each system command, so it would be
too inefficient, and the quoting problem would remain.
The best option is to write a parser in a compiled language.
I believe you're working on Linux, so how about using "sed"? Here's a
(prettified) BASH transcript of a sed script (edit.sed) transforming a
6-line text file (myprog.py). The text file has both Python statements
and "special commands", which have "$ " at the beginning of the line.
>>> cat myprog.py
print "hello"
$ ls -l
r = range(10)
$ grep foo bar.data
pass
print "bye"
>>> cat edit.sed
s/^\$ \(.*\)/Run("\1")/
>>> sed -f edit.sed myprog.py
print "hello"
Run("ls -l")
r = range(10)
Run("grep foo bar.data")
pass
print "bye"
-John
Yeah, you could do that. Or you can simply rely on /bin/sh to do the
parsing and everything else for you. No need to re-invent the wheel. I
don't think Python will ever beat sh as a shell replacement.
When people say that Python is great for some situations but not so much
for others, I think running commands like this is exactly what they had in
mind as "other".
It's not a library, but IPython[1] provides a lot of what you're
after:
IPython 0.9.1 -- An enhanced Interactive Python.
? -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help -> Python's own help system.
object? -> Details about 'object'. ?object also works, ?? prints more.
In [1]: ls /home
bb/ dcallan/ ehornsby/ ftp/ uqamartl/ uqckorte/ uqmtrev2/
In [2]: path = '/home'
In [3]: ls $path
bb/ dcallan/ ehornsby/ ftp/ uqamartl/ uqckorte/ uqmtrev2/
In [4]: output = !ls $path
In [5]: output
Out[5]: SList (.p, .n, .l, .s, .grep(), .fields(), sort() available):
0: bb
1: dcallan
2: ehornsby
3: ftp
4: uqamartl
5: uqckorte
6: uqmtrev2
In [6]: output[6]
Out[6]: 'uqmtrev2'
You'll need to run your scripts with IPython, so this may not be a
solution if you plan on distributing them.
> I shall blaspheme, and suggest that maybe the language you want
> to use is REXX (ooREXX or Regina).
Heh. That isn't blasphemy, because no true Pythonista [0] would claim
Python to be the god of that domain.
It's no sin to say that Python isn't a good choice for specific things;
and “I want to write programs by indistinguishably mixing statements
with external system calls” is one of them, IMO.
[0] fully aware of <URL:http://rationalwiki.com/wiki/No_True_Scotsman>,
thanks in advance.
--
\ “Natural catastrophes are rare, but they come often enough. We |
`\ need not force the hand of nature.” —Carl Sagan, _Cosmos_, 1980 |
_o__) |
Ben Finney
Sounds like the REXX designers already got the blaspheming covered
when they came up with such an inelegant-sounding feature...
> By default, ANY statement that can not be confused for a REXX
> language statement is sent to the currently defined command handler
> (Which on most OSs is equivalent to Python's os.system() call; the late
> Amiga, and IBM's mainframe OS had features that support defining other
> applications as command handlers).
>
> A common practice is to put quotes about the first word of the
> command to ensure it gets treated as external command.
Cheers,
Chris
--
http://blog.rebertia.com
That's mostly a problem with the CPython interpreter, which is a naive
interpreter. Many dynamically typed languages have implementations which
optimize out much of the run-time type handling. Shed Skin does this
for Python. LISP has been doing this better for decades. The JIT
systems for Javascript also do this.
Psyco has some explicit typing capability, but it doesn't do much about
eliminating redundant attribute lookups.
The two big wins that Python needs for performance are 1) at least
recognizing when a variable can be represented as "long", "double", or
"char", and 2) recognizing when an object or module doesn't need dynamic
attribute lookup, so that the attribute slots can be nailed down at compile
time.
(Plus, of course, something so that multithreaded programs don't
suck so bad on multicore CPUs.)
John Nagle
I started working on this project (Scripy [1]) because I wanted to hack on
cryptsetup in my Ubuntu. The functions that manage its initialization are
in bash, and it has become unmaintainable code, cryptic and very hard to
debug (as is any bash script of medium size). Here you have the beast:
Using Scripy I can easily debug the commands run from the shell, and log
everything if I want to.
Now, thanks to Scripy, I've created a script for easy disk partitioning
[2] using a simple configuration in YAML [3]. The great thing is that I
can add volumes and encrypted partitions :)
The only problem is that it's too verbose, but I could rename *run()* to a
simpler function such as *_()* or *r()*.
[1] http://bitbucket.org/ares/scripy/src/
[2] http://bitbucket.org/ares/scripypartition/src/tip/lib/scripy/part/disk.py#cl-22
[3] http://bitbucket.org/ares/scripypartition/src/tip/bin/init_crypto.py#cl-46
Yes, that would be a good solution. Simple and fast to build.
How is that different from bash scripting?
--
Aahz (aa...@pythoncraft.com) <*> http://www.pythoncraft.com/
import antigravity
Best is to have a text file outside your program, in which you define
commands and symbolic names for those commands.
Then you have a python module which reads these commands and names, and
creates functions that invoke them via the specified name.
This way you get concise syntax, don't have to type as much boilerplate,
and don't add line noise.
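One possible sketch of that idea (the "name = command" file format here is
just an assumption):

import subprocess

def _make(cmd):
    def runner(extra=''):
        # Hand the stored command (plus any extra arguments) to the shell.
        return subprocess.call((cmd + ' ' + extra).strip(), shell=True)
    return runner

def load_commands(path, namespace):
    # Each non-comment line of the file looks like:  ll = ls -l
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            name, _, cmd = line.partition('=')
            namespace[name.strip()] = _make(cmd.strip())

load_commands('commands.txt', globals())
ll('/home')   # runs "ls -l /home", assuming commands.txt defines ll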
Mixing two languages as though they were the same language greatly
increases the complexity of the original language with little gain.
Consider PowerShell - it's almost like two different languages smushed
together, and you have to know how a command was implemented to know how
to read it.
Also, what if someday The Powers That Be (the python language core
designers) decide they need to use $ for something? I hope they won't,
but if they do, your preprocessor might make quite a mess of it.
A quick note on terminology: open() is typically a system call.
fopen is probably never a system call - instead, it is a function in
the C library that wraps open(), making open() easier to use. Then
there's the system() function - like fopen(), it isn't really a
system call, despite its name. Rather, it is a C library function
that typically will wrap the fork() and exec*() system calls.
> Ben Finney wrote:
> > It's no sin to say that Python isn't a good choice for specific
> > things; and “I want to write programs by indistinguishably mixing
> > statements with external system calls” is one of them, IMO
> > From
> http://stromberg.dnsalias.org/~dstromberg/debugging-with-syscall-tracers.html#terminology
>
> A quick note on terminology: open() is typically a system call.
[…]
That is a different use of “system call”. I used it in the ‘os.system’
sense: meaning “invoke an external OS command as a child of this
process”.
--
\ “I was in a bar the other night, hopping from barstool to |
`\ barstool, trying to get lucky, but there wasn't any gum under |
_o__) any of them.” —Emo Philips |
Ben Finney
But in bash scripting, you'd just use rsync or cp or rm -- maybe an
example would make clearer how REXX differs from bash.
IOW, kinda like AppleScript?