How can I use a global variable as the default value for a procedure parameter? I.e.,

proc foo {{bar $var}} {
    puts $bar
}

set var "hello world!"
foo

When I do this, stdout prints $var instead of "hello world!".
thx
Dean
How about something like...

proc foo {{bar ""}} {
    if {$bar eq ""} {
        set bar $::var
    }
    puts $bar
}
Jeff
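A quick check of that approach at the prompt (a sketch; ::var plays the role of the global from the original question):

```tcl
proc foo {{bar ""}} {
    if {$bar eq ""} {
        set bar $::var
    }
    puts $bar
}

set var "hello world!"
foo           ;# prints: hello world!
foo goodbye   ;# prints: goodbye
```

Note that this cannot distinguish an explicit empty-string argument from a missing one; both fall through to the global.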
Roland
proc foo [list [list x var]] {
    upvar $x z
    puts $z
}

OR

proc foo [list [list bar $::var]] {
    do_the_real_work_of_foo $bar
}
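The two forms behave differently: the upvar form links the parameter to a variable by name, so the proc always sees its current value, while the $::var form bakes in whatever the global held when [proc] was evaluated. A minimal sketch of the first form (it assumes foo is called from a scope where the named variable is visible, here the global scope):

```tcl
proc foo [list [list x var]] {
    upvar $x z   ;# z is an alias for the caller's variable named by $x
    puts $z
}

set var "first"
foo   ;# prints: first
set var "second"
foo   ;# prints: second -- the current value, not a copy made at definition time
```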
Tricky!
The first step is to choose a default value which will never be used
in the actual application; then you test the passed-in value, and if
it equals the "default", use the global.
Some of the above examples only use the value of the global at the
time the proc was defined; here is an example which uses the current
value of the global variable:
proc ::myproc {arg1 {arg2 "I WILL NEVER USE THIS VALUE FOR ARG2"}} {
    if {$arg2 eq "I WILL NEVER USE THIS VALUE FOR ARG2"} {
        set arg2 $::arg2
    }
    puts $arg2
}
% global arg2
% set arg2 8
8
% myproc a
8
% myproc a ttt
ttt
Of course, this also works with non-global variables; they just have
to be reachable through the :: namespace. The problem with using global
or namespaced variables is that you might change the value of these
variables inside the proc, which you might not intend to do.
>The first step is to choose a default value which will never be used
>in the actual application, then you test the value of the passed in
>value. If it is equal to the "default" value, use the global.
Yuck.
Why do people recommend this kludgy and fragile approach,
when you can do it so much more robustly by using the
"args" variadic-argument list?
proc myproc {arg1 args} {
    switch [llength $args] {
        0       {set arg2 $::var}
        1       {set arg2 [lindex $args 0]}
        default {error "bad args: should be\n  myproc arg1 ?arg2?"}
    }
    .... now use $arg1 and $arg2 ....
}
Default arguments are good for "in-band" defaults, like the "1"
default for [incr]. They are no good for procs that need to
behave differently when called with different numbers of arguments.
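Exercised at the prompt, the variadic version gives distinct behavior per arity and a clean error otherwise (a self-contained sketch, assuming a global ::var):

```tcl
proc myproc {arg1 args} {
    switch [llength $args] {
        0       {set arg2 $::var}
        1       {set arg2 [lindex $args 0]}
        default {error "bad args: should be\n  myproc arg1 ?arg2?"}
    }
    puts $arg2
}

set var 8
myproc a                    ;# prints: 8
myproc a ttt                ;# prints: ttt
catch {myproc a b c} msg    ;# msg now holds the "bad args" error text
```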
--
Jonathan Bromley, Consultant
DOULOS - Developing Design Know-how
VHDL * Verilog * SystemC * e * Perl * Tcl/Tk * Project Services
Doulos Ltd., 22 Market Place, Ringwood, BH24 1AW, UK
jonathan...@MYCOMPANY.com
http://www.MYCOMPANY.com
args is for an unknown number of additional arguments, and if there
ever was a kludge it would be using args to simulate overloaded
commands/functions.
The current example is already ugly. I would never recommend my
solution, but I would first off, never recommend the situation which
required such a solution.
The "args" argument should usually be used with a high-level branching
command to feed into a more specific command. But it also works when
all remaining arguments are members of a group (like [lappend x a b
c ...]).
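The "group" use of args can be shown with a minimal sketch (sum is a hypothetical helper, not from any of the posts above):

```tcl
proc sum {args} {
    # every remaining argument is one member of the same group: a number
    set total 0
    foreach n $args {
        incr total $n
    }
    return $total
}

sum 1 2 3   ;# → 6
sum         ;# → 0 (an empty group is fine too)
```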
The problem with your "args" solution is that it is more ambiguous
than my hack. Both solutions require a hidden interpretation of the
arguments, but the "args" solution is the most ambiguous. With "args"
there are no limits on the ambiguity.
IMHO, use args when you intend to expose a complex set of related
commands (like [file ...], [string ...], etc.), but in this instance,
I don't see any need to use args (which I agree is very useful in
certain circumstances and I can provide working examples if you need
'em).
Personally I never use globals, which are stupid and extremely short-sighted,
but some exist, and the concept exists. Just because
something exists does not mean that it can be used everywhere. This is
true of both global vars and args.
Better:

proc foo {{bar ""}} {
    if {[llength [info level 0]] == 1} {
        set bar $::var
    }
    puts $bar
}

[info level 0] returns the command as it was actually invoked, so a
length of 1 means foo was called with no argument at all; an explicit
empty string is left alone.
>args is for an unknown number of additional arguments
Eh? Since I can *inspect* the number of arguments
passed as $args, and throw an informative error if
it's inappropriate, I cannot see why you try to
make that prescription.
> if there ever was a kludge it would be using
> args to simulate overloaded commands/functions.
I just don't understand you here. What's kludgy about
being able to check the number of arguments you've
received, and using that value to determine how to
proceed or how to error-out?
>The problem with your "args" solution is that
> it is more ambiguous than my hack.
I completely fail to see its ambiguity. It throws
an error if abused. It gives distinct behavior in
the two distinct cases, much in the same way that
the built-in [set] does. Apart from the fact that
I didn't get around to writing a man page for it,
what's ambiguous about that?
>Both solutions require a hidden interpretation of the
>arguments
What's "hidden" about it? Every proc does "hidden"
interpretation of its arguments - that's called
the body of the proc :-)
set ::Defaults(procA) [list items 22 levels 5]

proc procA args {
    upvar ::Defaults(procA) def
    foreach {opt val} $def {
        set $opt $val
    }
    foreach {opt val} $args {
        set [string trimleft $opt -] $val
    }
    # body of proc
    ....
}

procA -items 11
uwe
set ::Defaults(procA) [list -items 22 -levels 5]

proc procA args {
    array set opt $::Defaults(procA)
    array set opt $args
    # body of proc, using $opt(-...)
}
-Alex
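For completeness, how the array-based version resolves a call (a sketch using the ::Defaults entry above, with a puts standing in for the real body):

```tcl
set ::Defaults(procA) [list -items 22 -levels 5]

proc procA args {
    array set opt $::Defaults(procA)   ;# load the defaults
    array set opt $args                ;# caller-supplied options override them
    puts "items=$opt(-items) levels=$opt(-levels)"
}

procA -items 11   ;# prints: items=11 levels=5
procA             ;# prints: items=22 levels=5
```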
uwe
The one downside (and your version had it too ;) is that it silently
ignores invalid options. So if I typo an option name in the call, e.g.

procA -ietms 5

instead of an error, I just get the default value and have no idea
where I screwed up.
Bruce
But I use it on occasion to hand down options
to procs deeper in the stack.
> Bruce
uwe
This code doesn't flag an error, but it won't set the variable if it
isn't named properly by the caller. Also, it removes the leading "-",
which is optional:
proc ::tnt::setArgs { args } {
    foreach arg $args {
        set __option [lindex $arg 0]
        upvar $__option $__option
        set $__option [lindex $arg 1]
    }
    uplevel {
        foreach {__option __value} $args {
            set __option [string trimleft $__option "-"]
            set $__option $__value
        }
    }
}
Use setArgs like this in proc:
proc ::tnt::cookie::setCookie { name value args } {
    variable SetCookie2Version
    variable SendSetCookie2
    ::tnt::setArgs \
        {Comment ""} \
        {CommentURL ""} \
        {Discard "true"} \
        {Domain ""} \
        {Max_Age ""} \
        {Path "/"} \
        {Port "any"} \
        {Secure "false"}
    ....
}
Call the proc like this:

::tnt::cookie::setCookie SessionID ABCD123... -Path /images -Max_Age 600
Note that you can't change the "default" value with each call, which
makes no sense anyway.
Another downside is that $opt(-item1) is slightly slower
than $item1.
If one sets defaults for every arg, then you could check via
[::llength [::array names opt]] if there is an invalid option.
koyama
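That count check can be made concrete like this (a sketch; it assumes a default exists for every legal option, and the error message is just illustrative):

```tcl
set ::Defaults(procA) [list -items 22 -levels 5]

proc procA args {
    array set opt $::Defaults(procA)
    set valid [lsort [array names opt]]
    array set opt $args
    # an unknown option key grows the array past the default set
    if {[llength [array names opt]] != [llength $valid]} {
        error "invalid option; valid options are: $valid"
    }
    puts "items=$opt(-items) levels=$opt(-levels)"
}

procA -items 11            ;# prints: items=11 levels=5
catch {procA -ietms 5}     ;# the typo from the earlier post now raises an error
```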
foreach arg $args {
    upvar [lindex $arg 0] var
    set var [lindex $arg 1]
}
orig: 71.013 microseconds per iteration
faster: 57.967 microseconds per iteration
slightly faster
I'm not impressed by the speed gains (one million requests and I save
users 13 seconds), but the code is a little cleaner, so I'll update
the proc. I'm more concerned with the following uplevel polluting the
caller with __option and __value variables. The point of the proc is
simple documentation (note that the defaults show up as part of the
proc body) and removing the args handling code from the procedure.
I know it is a personal preference, but I really don't like options.
They usually document the desire to reuse code by making it more
complicated and fragile. The basic Tcl API mostly avoids this, and all
the complexity is handled in C code anyway, where there are lots of
good examples of option handling.
It would be ideal if options could be boiled down into subcommands.
This isn't possible with every command, but I was able to reduce the
new 8.5+ [switch] command into exactly 12 subcommands. If [switch] had
been written this way, more opportunities would exist for compiling
switch bodies.
18% is quite a good improvement, FWIW, especially as it is really not
much more than a peephole optimization. Small scale stuff usually only
wins a percent or so...
Donal.
Various levels of error checking are easy to add. Here is what I use:
###################################################
# Set up default arguments
array set opts {
    -user     m...@mymail.com
    -password *****
}

# Minimal error checking: odd number of args,
# or first of pair not a valid option name
set optsLength [llength [array get opts]]
set optsNames  [array names opts]
if {[llength $args] % 2 != 0} {
    puts "Unpaired option request. Valid options are:\n$optsNames"
    return
}
array set opts $args
if {[llength [array get opts]] != $optsLength} {
    puts "Illegal option. Valid options are:\n$optsNames"
    return
}
###################################################
It checks for an odd number of arguments and for the first element of a
pair not being a valid option. In either case it lists the valid options.
In many situations it would make no sense to run with no options being
set. It would be trivial to add another test or modify the first one to
check for that.
Showing what invalid option was used would not be hard, but I decided it
would not be useful enough to add it.
Gerry
I'm more surprised that nobody noticed I didn't use [upvar 1 otherVar
myVar].
The original reason for using individual variable names was that I was
under the mistaken impression that "myVar" must not exist. From the
manpage: "There must not exist a variable by the name myVar at the
time upvar is invoked." Apparently myVar doesn't count as a local
variable, as the faster version reuses "var". The manpage does explain
this later on: "it is possible to retarget an upvar variable by
executing another upvar command."
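The retargeting behavior the manpage describes can be seen directly (a minimal sketch; setEach is a hypothetical helper in the spirit of the loop above):

```tcl
proc setEach {args} {
    foreach arg $args {
        # after the first iteration "var" already exists locally;
        # the next upvar simply retargets it to a different caller variable
        upvar [lindex $arg 0] var
        set var [lindex $arg 1]
    }
}

setEach {x 1} {y 2}
puts "$x $y"   ;# prints: 1 2
```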
I still think the context for profiling code is at least at the proc
level. The speed gains in the foreach loop may depend on the number of
times the loop runs. Should I profile for 1,2,3...10 options? Should I
profile at 10, 100, 10000 reps? From my experience you sometimes get
ambiguous results the more testing you do. But if your algorithm only
spends a small fraction of the overall time in a section of code, the
gains are not 18%, but much less.