[Python-ideas] Default arguments in Python - the return


Pascal Chambon

May 8, 2009, 4:31:57 PM5/8/09
to python-ideas

Hello,


I'm surely not original in any way there, but I'd like to put back on the table the matter of "default argument values".
Or, more precisely, the "one shot" handling of default values, which means that the same mutable objects, given once as default arguments, come back again and again at each function call.
They thus become a kind of "static variable", polluted by the previous calls, whereas many, many Python users still believe that they get a fresh new value at each function call.
I think I understand how default arguments are currently implemented (and so "why", technically, they behave this way), but I'm still unsure of "why", semantically, it must be so.
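For readers who haven't hit it yet, the behaviour described above can be shown in a few lines (a minimal sketch; the names append_to and bucket are illustrative):

```python
def append_to(item, bucket=[]):
    # The [] is evaluated once, when "def" runs; every call that omits
    # "bucket" then mutates that single shared list.
    bucket.append(item)
    return bucket

append_to(1)   # [1]
append_to(2)   # [1, 2] - the "default" list remembers the previous call
```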

I've browsed lots of Google entries on that subject, but as far as I'm concerned, I've found nothing in favor of the current semantics.
I've rather found dozens, hundreds of posts of people complaining that they got bitten by this gotcha, many of them finishing with a "Never put mutable values in default arguments, unless you're very very sure of what you're doing!".

And no one seemed to enjoy the possibilities of getting "potentially static variables" this way. Static variables are imo a rather bad idea, since they create "stateful functions", which make debugging and maintenance more difficult; but when such static variables are, furthermore, potentially non-static (i.e. when the corresponding function argument is supplied), I guess they become totally useless and dangerous - a perfect way to get hard-to-debug behaviours.

On the other hand, when people write "def func(mylist=[]):", they basically DO want a fresh new list at each call, be it given by the caller or the default argument system.
So it's really a pity to need tricks like
> def f(a, L=None):
>     if L is None:
>         L = []
to get what we want (and what if None were also a possible value? What other value should we use as a placeholder for "I'd like None or a fresh new list, but I can't say so directly"?).

So I'd like to know: are there other "purely intellectual" arguments for/against the current semantics of default arguments? (I might have missed some discussions on this subject; feel free to point me to them.)

Currently, this default argument handling looks like a huge gotcha for newcomers and, I feel, like an embarrassing wart to most pythonistas. Couldn't it be worth finding a new way of doing it?
Maybe there are strong arguments against a change at that level; for example, performance issues (I'm not well versed in those matters). But I need to be sure.

So here are my rough ideas on what we might do - if, after having the suggestions of expert people, it looks like it's worth writing a PEP, I'll be willing to participate in it.
Basically, I'd change the Python system so that, when a default argument expression is encountered, instead of being executed, it's wrapped in some kind of zero-argument lambda expression, which gets pushed into the "func_defaults" attribute of the function.
And then, each time a default argument is required in a function call, this lambda expression gets evaluated and gives the expected value.
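For what it's worth, this zero-argument-wrapper idea can already be emulated with a decorator, without any language change; this is only a sketch of the proposed semantics (the late_defaults helper below is hypothetical, written in modern Python 3 syntax):

```python
import inspect

def late_defaults(**factories):
    # Hypothetical helper: maps parameter names to zero-argument callables,
    # evaluated anew on every call in which the caller omits that argument.
    def decorator(func):
        sig = inspect.signature(func)
        def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            for name, factory in factories.items():
                if name not in bound.arguments:
                    bound.arguments[name] = factory()
            return func(*bound.args, **bound.kwargs)
        return wrapper
    return decorator

@late_defaults(L=list)
def f(a, L=None):
    L.append(a)
    return L

f(1)   # [1]
f(2)   # a fresh list each call: [2], not [1, 2]
```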

I guess this will mean some overhead during function calls, so this might become another issue.
It's also a backwards-incompatible change, so I assume we'd have to use a "from __future__ import XXX" until Python4000.
But I think the change is worth trying, because it's a trap that lies in wait for all Python beginners.

So, if this matter hasn't already been marked somewhere as a no-go, I eagerly await the feedback of users and core developers on the subject. :)

By the way, I'm becoming slightly allergic to C-like languages (too much hassle for too little gain, compared to high-level dynamic languages), but if this proposal goes ahead, and no one wants to handle the implementation details, I'll put my hands in the engine ^^

Regards,
Pascal


Chris Rebert

May 8, 2009, 5:12:28 PM5/8/09
to Pascal Chambon, python-ideas
On Fri, May 8, 2009 at 1:31 PM, Pascal Chambon
<chambon...@wanadoo.fr> wrote:
> Hello,
>
> I'm surely not original in any way there, but I'd like to put back on the
> table the matter of "default argument values".
> Or, more precisely, the "one shot" handling of default values, which makes
> that the same mutable objects, given once as default arguments, come back
> again and again at each function call.
> They thus become some kinds of "static variables", which get polluted by the
> previous calls, whereas many-many-many python users still believe that they
> get a fresh new value at each function call.
> I think I understand how default arguments are currently implemented (and
> so, "why" - technically - it does behave this way), but I'm still unsure of
> "why" - semantically - this must be so.
<snip>

> So I'd like to know : are there other "purely intellectual" arguments
> for/against the current semantic of default arguments (I might have missed
> some discussion on this subject, feel free to point them ?

Point-point: http://mail.python.org/pipermail/python-ideas/2007-January/000121.html
And see also the links below.

> Currently, this default argument handling looks, like a huge gotcha for
> newcomers, and, I feel, like an embarrassing wart to most pythonistas.
> Couldn't it be worth finding a new way of doing it ?
> Maybe there are strong arguments against a change at that level ; for
> example, performance issues (I'm not good in those matters). But I need to
> ensure.
>
> So here are my rough ideas on what we might do - if after having the
> suggestions from expert people, it looks like it's worth writting a PEP,
> I'll be willing to particpateon it.
> Basically, I'd change the python system so that, when a default argument
> expression is encountered, instead of being executed, it's wrapped in some
> kind of zero-argument lambda expression, which gets pushed in the
> "func_defaults" attribute of the function.
> And then, each time a default argument is required in a function call, this
> lambda expression gets evaluated and gives the expected value.
>
> I guess this will mean some overhead during function call, so this might
> become another issue.
> It's also a non retrocompatible change, so I assume we'd have to use a "from
> __future__ import XXX" until Python4000.
> But I think the change is worth the try, because it's a trap which waits for
> all the python beginners.
>
> So, if this matters hasn't already been marked somewhere as a no-go, I
> eagerly await the feedback of users and core developpers on the subject. :)

It's basically been rejected. See GvR Pronouncement:
http://mail.python.org/pipermail/python-3000/2007-February/005715.html
regarding the pre-PEP "Default Argument Expressions":
http://mail.python.org/pipermail/python-3000/2007-February/005704.html

Unless your exact idea somehow differs significantly from my pre-PEP
(sounds like it doesn't IMHO), it's not gonna happen. It's basically
too magical.

Cheers,
Chris
--
http://blog.rebertia.com
_______________________________________________
Python-ideas mailing list
Python...@python.org
http://mail.python.org/mailman/listinfo/python-ideas

George Sakkis

May 8, 2009, 6:06:43 PM5/8/09
to python-ideas
On Fri, May 8, 2009 at 5:12 PM, Chris Rebert <pyi...@rebertia.com> wrote:

> On Fri, May 8, 2009 at 1:31 PM, Pascal Chambon

>> So, if this matters hasn't already been marked somewhere as a no-go, I
>> eagerly await the feedback of users and core developpers on the subject. :)
>
> It's basically been rejected. See GvR Pronouncement:
> http://mail.python.org/pipermail/python-3000/2007-February/005715.html
> regarding the pre-PEP "Default Argument Expressions":
> http://mail.python.org/pipermail/python-3000/2007-February/005704.html
>
> Unless your exact idea somehow differs significantly from my pre-PEP
> (sounds like it doesn't IMHO), it's not gonna happen. It's basically
> too magical.

FWIW I don't find the dual semantics, with explicit syntax for the new
semantics ("def foo(bar=new baz)") mentioned in the PEP too magical.
If even C, a relatively small language, affords two calling semantics,
why would it be too confusing for Python? Perhaps that PEP might have
had better luck if it didn't propose replacing the current semantics
with the new.

George

Terry Reedy

May 8, 2009, 7:34:42 PM5/8/09
to python...@python.org
Pascal Chambon wrote:

> I'm surely not original in any way there, but I'd like to put back on
> the table the matter of "default argument values".

There have been two proposals:
1. Evaluate the expression once, store the result, and copy on each
function call.
- Expensive.
- Nearly always not needed.
- Not always possible.
2. Store the expression and evaluate on each function call (your
re-proposal).
- Expensive.
- The result may be different for each function call, and might raise an
exception.
- This is the job of the suite!
- Which is to say, run-time code belongs in the function body, not the
header.

> And no one seemed to enjoy the possibilities of getting "potentially
> static variables" this way.

You did not search hard enough.

> Static variables are imo a rather bad idea,

So you want to take them away from everyone else. I think *that* is a
rather bad idea ;-). No one is forcing you to use them.

> On the other hand, when people write "def func(mylist=[]):", they
> basically DO want a fresh new list at each call,

Maybe, maybe not.

> be it given by the caller or the default argument system.
> So it's really a pity to need tricks like

> > def f(a, L=None):
> >     if L is None:
> >         L = []

Or don't supply a default arg if that is not what you really want.
Putting call-time code in the function body is not a trick.

> to get what we want (and if None was also a possible value ?

__none = object()
def f(par=__none):
    if par is __none: ...

as had been posted each time this question has been asked.

> I guess this will mean some overhead during function call,

I absolutely guarantee that it will. Function calls are expensive.
Adding a function call for each default arg (and many functions have
more than one) multiplies the calling overhead.

> so this might become another issue.

Is and always has been.

Terry Jan Reedy

spir

May 9, 2009, 6:42:56 AM5/9/09
to python...@python.org
Le Fri, 08 May 2009 22:31:57 +0200,
Pascal Chambon <chambon...@wanadoo.fr> s'exprima ainsi:

> And no one seemed to enjoy the possibilities of getting "potentially
> static variables" this way. Static variables are imo a rather bad idea,
> since they create "stateful functions", that make debugging and
> maintenance more difficult ; but when such static variable are,
> furthermore, potentially non-static (i.e when the corresponding function
> argument is supplied), I guess they become totally useless and dangerous
> - a perfect way to get hard-to-debug behaviours.

If we want static vars, there are better places than default args for this. (See also the thread about memoizing). E.g. on the object when it's a method, or even on the func itself.

def squares(n):
    square = n * n; print square
    if square not in squares.static_list:
        squares.static_list.append(square)
squares.static_list = []

squares(1); squares(2); squares(1); squares(3)
print squares.static_list

Denis
------
la vita e estrany

Pascal Chambon

May 9, 2009, 6:56:20 AM5/9/09
to Terry Reedy, python...@python.org
Thanks everyone for the feedback and the links (I was obviously too
confident in Google's first pages, to have missed such things >_<)

Terry Reedy a écrit :


>> And no one seemed to enjoy the possibilities of getting "potentially
>> static variables" this way.
>
> You did not search hard enough.
>

Well, for sure some people here and there used that semantic to have,
for example, a "default cache" handling the requests for which a
specific cache isn't provided.
But that behavior can just as easily be obtained in a much more explicit
way, which furthermore lets you access your default cache easily from
inside the function code, even when a specific cache is provided :

class a:
    cache = [1, 2, 3]
    def func(self, x, newcache=cache):
        print "Current cache state :", newcache
        newcache.append(x)
        print "The static, default cache is ", a.cache

So I don't see the default argument trick as a "neat feature", rather as
a way of making simple things obscure.

>> Static variables are imo a rather bad idea,
>
> So you want to take them away from everyone else. I think *that* is a
> rather bad idea ;-). No one is forcing you to use them.
>

I don't want to annihilate all traces of static variables :p ;
I just find them ugly, because they create stateful functions whose
state is hidden in them (like some do with free variables, too), and
that's imo not a "robust code best practice".

But what kills me with current default arguments is that those aren't
even real static variables : they're "potentially static variables", and
as far as I've seen, you have no easy way to check whether, for
instance, the argument value that you've gotten is the default, static
one, or a new one provided by the caller (of course, you can store the
default value somewhere else for reference, but it's lamely redundant).
If people want static variables in python, for example to avoid OO
programming and still have stateful functions, we can add an explicit
"static" keyword or its equivalent. But using the ambiguous value given
via a default-valued argument is not pretty, imo.
Unless we have a way to access, from inside a code block, the function
object to which this code block belongs.

Does it exist ? Do we have any way, from inside a call block, to browse
the default arguments that this code block might receive ?


>
>> I guess this will mean some overhead during function call,
>
> I absolutely guarantee that this will. Functions calls are expensive.
> Adding a function call for each default arg (and many functions have
> more than one) multiplies the calling overhead.
>
> > so this might become another issue.
>
> Is and always has been.
>

Well, if, as was proposed in previous threads, the expression is
only reevaluated in particular circumstances (i.e. if the user asks for it
with a special syntax), it won't take more time than the usual "if myarg
is None: myarg = []";
but I agree that alternate syntaxes have led to infinite and complex
discussions, and that the simpler solution I provided is likely to be
too CPU intensive, more than I expected...


>
> to get what we want (and if None was also a possible value ?
>
> __none = object()
> def f(par=__none):
>     if par is __none: ...
>
> as had been posted each time this question has been asked.

Well, it seems I didn't phrase my rhetorical question properly ^^.
I wholly agree that you can always use another object as a placeholder,
but I don't quite like the idea of creating new instances just to
signify "that's not a valid value that you can use, create one brand new"

On the other hand, would anyone support my alternative wish, of having a
builtin "NotGiven", similar to "NotImplemented", and dedicated to this
rather common task of "placeholder"?
There would be two major pros for this, imo :
- giving programmers a handy object for all unwanted "mutable
default argument" situations, without having to think "is None a value I
might want to get ?"
- *Important* : by appearing in the beginning of the doc near True
and False, this keyword would be much more visible to beginners than the
deep pages on "default argument handling" ; thus, they'd have much more
chances of coming across warnings about this gotcha than they currently have (and
seeing "NotGiven" in tutorials would force them to wonder why it's so,
it's imo much more explicit than seeing "None" values instead)

So, since reevaluation of arguments actually *is* a no-go, and
forbidding mutable arguments is obviously a no-go too, would you people
support integrating "NotGiven" (or any other name) into the
builtins ? It'd sound to me like a good practice.

Regards,
Pascal

Pascal Chambon

May 9, 2009, 7:16:46 AM5/9/09
to spir, python...@python.org
spir a écrit :
Well, I've just realized I'd sent a semi-dumb question in my previous answer :p

I'd never quite realized it was possible to store stuff inside the function object, by retrieving it from inside the code object.

And it works pretty well...
In classes you can access your function via self; in closures it gets caught in cells... it's only at global scope that there are problems: here, if you rename squares (newsquares = squares; squares = None), you'll get an error when calling newsquares, because it searches for "squares" in the global scope, without the help of "self" or closures.
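The rename problem can be made concrete (a sketch in Python 3 syntax; the seen attribute is an illustrative name):

```python
def squares(n):
    # State stored on the function object; the body looks up the
    # *global name* "squares" at call time.
    squares.seen.append(n * n)
    return squares.seen

squares.seen = []

first = list(squares(2))   # [4]
newsquares = squares
squares = None             # rebind the global name the body relies on
try:
    newsquares(3)          # the body still looks up "squares", now None
    failed = False
except AttributeError:
    failed = True          # 'NoneType' object has no attribute 'seen'
```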

Still, an explicit way of targeting "the function I'm in" would be sweet imo, but retrieving it that way is not far from being as handy.

Thanks for the tip that opened my eyes,
regards,
Pascal

Steven D'Aprano

May 9, 2009, 7:32:35 AM5/9/09
to python...@python.org
On Sat, 9 May 2009 08:56:20 pm Pascal Chambon wrote:

> But what kills me with current default arguments is that those aren't
> even real static variables : they're "potentially static variables",
> and as far as I've seen, you have no easy way to check whether, for
> instance, the argument value that you've gotten is the default,
> static one, or a new one provided by the caller (of course, you can
> store the default value somewhere else for reference, but it's lamely
> redundant).

I'm not really sure why you would want to do that. The whole point of
default values is to avoid needing to care whether or not the caller
has provided an argument or not.

[...]


> Does it exist ? Do we have any way, from inside a call block, to
> browse the default arguments that this code block might receive ?

>>> def spam(n=42):
...     return "spam "*n
...
>>> spam.func_defaults
(42,)


dir(func_object) is your friend :)


[...]


> but I agree that alternate syntaxes have led to infinite and complex
> discussions, and that the simpler solution I provided is likely to be
> too CPU intensive, more than I expected...

I would support... no, that's too strong. I wouldn't oppose the
suggestion that Python grow syntax for "evaluate this default argument
every time the function is called (unless the argument is given by the
caller)". The tricky part is coming up with good syntax and a practical
mechanism.

[...]


> On the other hand, would anyone support my alternative wish, of
> having a builtin "NotGiven", similar to "NotImplemented", and
> dedicated to this somehow usual taks of "placeholder" ?

There already is such a beast: None is designed to be used as a
placeholder for Not Given, Nothing, No Result, etc.

If None is not suitable, NotImplemented is also a perfectly good
built-in singleton object which can be used as a sentinel. It's already
used as a sentinel for a number of built-in functions and operators.
There's no reason you can't use it as well.


> There would be two major pros for this, imo :
> - giving programmers a handy object for all unvanted "mutable
> default argument" situations, without having to think "is None a
> value I might want to get ?"

But then they would need to think "Is NotGiven a value I might want to
get, so I can pass it on to another function unchanged?", and you would
then need to create another special value ReallyNotGiven. And so on.


> - *Important* : by appearing in the beginning of the doc near
> True and False, this keyword would be much more visible to beginners
> than the deep pages on "default argument handling" ; thus, they'd
> have much more chances to cross warnings on this Gotcha, than they
> currently have (and seeing "NotGiven" in tutorials would force them
> to wonder why it's so, it's imo much more explicit than seeing "None"
> values instead)

Heh heh heh, he thinks beginners read manuals :-)


> So, since reevaluation of arguments actually *is* a no-go, and
> forbidding mutable arguments is obviously a no-go too, would you
> people support this integrating of "NotGiven" (or any other name) in
> the builtins ? It'd sound to me like a good practice.

-1 on an extra builtin. There's already two obvious ones, and if for
some reason you need to accept None and NotImplemented as valid data,
then you can create an unlimited number of sentinels with object(). The
best advantage of using object() is that because the sentinel is unique
to your module, you can guarantee that nobody can accidentally pass it,
or expect to use it as valid data.
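That object() technique can be sketched as follows (illustrative names):

```python
_NOT_GIVEN = object()   # module-private sentinel: unique and unforgeable

def f(value=_NOT_GIVEN):
    # Distinguishes "caller passed nothing" from "caller passed None".
    if value is _NOT_GIVEN:
        return "no argument"
    return value

f()       # "no argument"
f(None)   # None is perfectly valid data here, not the sentinel
```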

--
Steven D'Aprano

Steven D'Aprano

unread,
May 9, 2009, 7:36:12 AM5/9/09
to python...@python.org
On Sat, 9 May 2009 09:16:46 pm Pascal Chambon wrote:
> Still, an explicit way of targetting "the function I'm in" would be
> sweet imo, but retrieving it that way is not far from being as handy.

There's a rather long discussion on the comp.lang.python newsgroup at
the moment about that exact question. Look for the recent thread
titled "Self function".

If you can't get Usenet and don't like Google Groups, the c.l.py
newsgroup is also available as a python mailing list, and a gmane
mailing list.


--
Steven D'Aprano

Steven D'Aprano

May 9, 2009, 7:39:53 AM5/9/09
to python...@python.org
On Sat, 9 May 2009 09:36:12 pm Steven D'Aprano wrote:
> There's a rather long discussion on the comp.lang.python newsgroup at
> the moment about that exact question

Er, to be precise, by "at the moment" I actually mean "over the last few
days". The thread seems to have more-or-less finished now.

Of course, no Usenet thread is ever *completely* finished. Please feel
free to resurrect it if you have any good ideas, questions or insight
into the issue.

spir

May 9, 2009, 8:34:08 AM5/9/09
to python...@python.org
Le Sat, 09 May 2009 12:56:20 +0200,

Pascal Chambon <chambon...@wanadoo.fr> s'exprima ainsi:

> If people want static variables in python, for example to avoid OO

> programming and still have stateful functions, we can add an explicit
> "static" keyword or its equivalent.

This is far from being pythonic anyway, I guess. Ditto for storing data on the func itself (as shown in another post). It provides a way of linking together data and behaviour; similar techniques are used e.g. in Lisp, so that many Lisp people find OO pretty useless.
But Python has OO built in, even as the mainstream paradigm. Data related to behaviour should be set on an object.

> But using the ambiguous value given
> via a default-valued argument is not pretty, imo.
> Unless we have a way to access, from inside a code block, the function
> object in which this code block belongs.
>
> Does it exist ? Do we have any way, from inside a call block, to browse
> the default arguments that this code block might receive ?

This is a feature of much more reflexive/meta languages like Io (or again Lisp), that were indeed designed from scratch with this capacity in mind and intended as a major programming feature.
In Io you can even access the 'raw' message _before_ evaluation, so that you get the expression of the argument, not only the resulting value.

Denis
------
la vita e estrany

Georg Brandl

May 9, 2009, 11:38:12 AM5/9/09
to python...@python.org
Pascal Chambon schrieb:

> Still, an explicit way of targetting "the function I'm in" would be
> sweet imo, but retrieving it that way is not far from being as handy.

You could use e.g.

from functools import wraps

def selffunc(func):
    @wraps(func)
    def newfunc(*args, **kwds):
        return func(func, *args, **kwds)
    return newfunc

@selffunc
def foo(func, a):
    func.cache = a

to avoid the "ugly" lookup of the function in the global namespace.
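A self-contained check of this approach (Python 3; result and bar are illustrative names). Note that attributes stored this way land on the inner function object, and that rebinding the global name is harmless, since the wrapper closes over the function object:

```python
from functools import wraps

def selffunc(func):
    @wraps(func)
    def newfunc(*args, **kwds):
        # Hand the function object to itself as the first argument, so the
        # body never looks its own name up in the global namespace.
        return func(func, *args, **kwds)
    return newfunc

@selffunc
def foo(func, a):
    func.cache = a          # stored on the inner function object
    return func.cache

result = foo(42)
bar = foo
foo = None                  # rebinding the global name...
result2 = bar(43)           # ...does not break the call
```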

Georg

--
Thus spake the Lord: Thou shalt indent with four spaces. No more, no less.
Four shall be the number of spaces thou shalt indent, and the number of thy
indenting shall be four. Eight shalt thou not indent, nor either indent thou
two, excepting that thou then proceed to four. Tabs are right out.

Tennessee Leeuwenburg

May 9, 2009, 8:19:01 PM5/9/09
to Pascal Chambon, python-ideas
Hi Pascal,

Taking the example of 

def foo(bar = []):
  bar.append(4)
  print(bar)

I'm totally with you in thinking that what is 'natural' is to expect to get a new, empty list every time. However this isn't what happens. As far as I'm concerned, that should more or less be the end of the discussion in terms of what should ideally happen.

The responses to the change in behaviour which I see as more natural are, to summarise, as follows:
  -- For all sorts of technical reasons, it's too hard
  -- It changes the semantics of the function definition being evaluated at compile time
  -- It's not what people are used to

With regards to the second point, it's not like the value of arguments is set at compile time, so I don't really see that this stands up. I don't think it's intuitive, it's just that people become accustomed to it. There is indeed, *some sense* in understanding that the evaluation occurs at compile-time, but there is also a lot of sense (and in my opinion, more sense) in understanding the evaluation as happening dynamically when the function is called. 

With regards to the first point, I'm not sure that this is as significant as all of that, although of course I defer to the language authors here. However, it seems as though it could be no more costly than the lines of code which most frequently follow to initialise these variables. 

On the final point, that's only true for some people. For a whole lot of people, they stumble over it and get it wrong. It's one of the most un-Pythonic things which I have to remember about Python when programming -- a real gotcha. I don't see it as changing one way of doing things for another equally valid way of doing things, but changing something that's confusing and unexpected for something which is far more natural and, to me, Pythonic.

For me, Python 3k appears to be a natural place to do this. Python 3 still appears to be regarded as a work-in-progress by most people, and I don't think that it's 'too late' to change for Python 3k. Perhaps, given the timing, the people involved, the complexity of change etc, then for pragmatic reasons this may have to be delayed, but I don't think that's a good thing. I'd much rather see it done, personally. I think that many people would feel the same way.

Regards,
-Tennessee





--
--------------------------------------------------
Tennessee Leeuwenburg
http://myownhat.blogspot.com/
"Don't believe everything you think"

Greg Ewing

May 9, 2009, 9:23:42 PM5/9/09
to python...@python.org
Steven D'Aprano wrote:

> Of course, no Usenet thread is every *completely* finished.

Unless Nazis have been mentioned, of course. (Oops,
looks like I just ended this thread -- sorry about
that!)

--
Greg

Steven D'Aprano

May 9, 2009, 9:23:36 PM5/9/09
to python...@python.org
On Sun, 10 May 2009 10:19:01 am Tennessee Leeuwenburg wrote:
> Hi Pascal,
> Taking the example of
>
> def foo(bar = []):
>     bar.append(4)
>     print(bar)
>
> I'm totally with you in thinking that what is 'natural' is to expect
> to get a new, empty, list every time.

That's not natural to me. I would be really, really surprised by the
behaviour you claim is "natural":

>>> DEFAULT = 3
>>> def func(a=DEFAULT):
... return a+1
...
>>> func()
4
>>> DEFAULT = 7
>>> func()
8

For deterministic functions, the same argument list should return the
same result each time. By having default arguments be evaluated every
time they are required, any function with a default argument becomes
non-deterministic. Late evaluation of defaults is, essentially,
equivalent to making the default value a global variable. Global
variables are rightly Considered Harmful: they should be used with
care, if at all.


> However this isn't want
> happens. As far as I'm concerned, that should more or less be the end
> of the discussion in terms of what should ideally happen.

As far as I'm concerned, what Python does now is the ideal behaviour.
Default arguments are part of the function *definition*, not part of
the body of the function. The definition of the function happens
*once* -- the function isn't recreated each time you call it, so
default values shouldn't be recreated either.


> The responses to the change in behaviour which I see as more natural
> are, to summarise, as follows:
> -- For all sorts of technical reasons, it's too hard
> -- It changes the semantics of the function definition being
> evaluated at compile time
> -- It's not what people are used to

And it's not what many people want.

You only see the people who complain about this feature. For the
multitude of people who expect it or like it, they have no reason to
say anything (except in response to complaints). When was the last time
you saw somebody write to the list to say "Gosh, I really love that
Python uses + for addition"? Features that *just work* never or rarely
get mentioned.


> With regards to the second point, it's not like the value of
> arguments is set at compile time, so I don't really see that this
> stands up.

I don't see what relevance that has. If the arguments are provided at
runtime, then the default value doesn't get used.


> I don't think it's intuitive,

Why do you think that intuitiveness is more valuable than performance
and consistency?

Besides, intuitiveness is a fickle thing. Given this pair of functions:

import time

def expensive_calculation():
    time.sleep(60)
    return 1

def useful_function(x=expensive_calculation()):
    return x + 1

I think people would be VERY surprised that calling useful_function()
with no arguments would take a minute *every time*, and would complain
that this slowness was "unintuitive".


> it's just that people become
> accustomed to it. There is indeed, *some sense* in understanding that
> the evaluation occurs at compile-time, but there is also a lot of
> sense (and in my opinion, more sense) in understanding the evaluation
> as happening dynamically when the function is called.

No. The body of the function is executed each time the function is
called. The definition of the function is executed *once*, at compile
time. Default arguments are part of the definition, not the body, so
they too should only be executed once. If you want them executed every
time, put them in the body:

def useful_function(x=SENTINEL):
    if x is SENTINEL:
        x = expensive_calculation()
    return x + 1

> With regards to the first point, I'm not sure that this is as
> significant as all of that, although of course I defer to the
> language authors here. However, it seems as though it could be no
> more costly than the lines of code which most frequently follow to
> initialise these variables.
>
> On the final point, that's only true for some people. For a whole lot
> of people, they stumble over it and get it wrong. It's one of the
> most un-Pythonic things which I have to remember about Python when
> programming -- a real gotcha.

I accept that it is a Gotcha. The trouble is, the alternative behaviour
you propose is *also* a Gotcha, but it's a worse Gotcha, because it
leads to degraded performance, surprising introduction of global
variables where no global variables were expected, and a breakdown of
the neat distinction between creating a function and executing a
function.

But as for it being un-Pythonic, I'm afraid that if you really think
that, your understanding of Pythonic is weak. From the Zen:

The Zen of Python, by Tim Peters

Special cases aren't special enough to break the rules.
Although practicality beats purity.
If the implementation is hard to explain, it's a bad idea.

(1) Assignments outside of the body of a function happen once, at
compile time. Default values are outside the body of the function. You
want a special case for default values so that they too happen at
runtime. That's not special enough to warrant breaking the rules.

(2) The potential performance degradation of re-evaluating default
arguments at runtime is great. For practical reasons, it's best to
evaluate them once only.

(3) In order to get the behaviour you want, the Python compiler would
need a more complicated implementation which would be hard to explain.


> I don't see it as changing one way of
> doing things for another equally valid way of doing things, but
> changing something that's confusing and unexpected for something
> which is far more natural and, to me, Pythonic.

I'm sorry, while re-evaluation of default arguments is sometimes useful,
it's more often NOT useful. Most default arguments are simple objects
like small ints or None. What benefit do you gain from re-evaluating
them every single time? Zero benefit. (Not much cost either, for simple
cases, but no benefit.)

But for more complex cases, there is great benefit to evaluating default
arguments once only, and an easy work-around for those rare cases that
you do want re-evaluation.
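The once-only evaluation is easy to observe directly. A small sketch (the `make_default` helper is hypothetical, standing in for any expensive default expression):

```python
calls = []

def make_default():
    # hypothetical stand-in for an expensive default expression
    calls.append(1)
    return []

def f(x=make_default()):
    return x

f(); f(); f()
assert len(calls) == 1   # the default expression ran once, at def time
assert f() is f()        # and every call shares the same list object
```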


> For me, Python 3k appears to be a natural place to do this. Python 3
> still appears to be regarded as a work-in-progress by most people,
> and I don't think that it's 'too late' to change for Python 3k.

Fortunately you're not Guido, and fortunately this isn't going to
happen. I recommend you either accept that this behaviour is here to
stay, or if you're *particularly* enamoured of late evaluation
behaviour of defaults, that you work on some sort of syntax to make it
optional.

--
Steven D'Aprano

Carl Johnson

unread,
May 9, 2009, 10:06:06 PM5/9/09
to python...@python.org
I think this is a case where there are pros and cons on both sides.
There are a lot of pros to the current behavior (performance,
flexibility, etc.), but it comes with the con of confusing newbies and
making people go through the same song and dance to set a "sentinel
value" when they want the other behavior and they can't ensure that
None won't be passed. The newbie problem can't be fixed from now until
Python 4000, since it would break a lot of existing uses of default
values, but we could cut down on the annoyance of setting and checking a
sentinel value by introducing a new keyword, e.g.

def f(l=fresh []):
   ...

instead of

__blank = object()
def f(l=__blank):
   if l is __blank:
       l = []
   ...

The pros of a new keyword are saving 3 lines and being more clear
upfront about what's going on with the default value. The con is that
adding a new keyword bloats the language. We could try reusing an
existing keyword, but none of the current ones seem to fit:

and                 elif                import              return
as                  else                in                  try
assert              except              is                  while
break               finally             lambda              with
class               for                 not                 yield
continue            from                or
def                 global              pass
del                 if                  raise

(I copied this from Python 3.0's help, but there seems to be a
documentation error: nonlocal, None, True, and False are also keywords
in Python 3+.)

The best one on the current list it seems to me would be "else" as in

def f(l else []):
   ...

But I dunno… It's just not quite right, you know?

So, I'm -0 on changing the current behavior, but I'm open to it if
someone can find a way to do it that isn't just an ad hoc solution to
this one narrow problem but has a wider general use.

Tennessee Leeuwenburg

unread,
May 9, 2009, 10:06:50 PM5/9/09
to Steven D'Aprano, python...@python.org

> For me, Python 3k appears to be a natural place to do this. Python 3
> still appears to be regarded as a work-in-progress by most people,
> and I don't think that it's 'too late' to change for Python 3k.

Fortunately you're not Guido, and fortunately this isn't going to
happen. I recommend you either accept that this behaviour is here to
stay, or if you're *particularly* enamoured of late evaluation
behaviour of defaults, that you work on some sort of syntax to make it
optional.


Thank you for the rest of the email, which was (by and large) well-considered and (mostly) stuck to the points of the matter. I will get to them in proper time when I have been able to add to the argument in a considered way after fully understanding your points.

However, this last section really got under my skin. It seems completely inappropriate to devolve any well-intentioned email discussion into an appalling self-serving ad-hominem attack. Your assertion of your ethical viewpoint (use of "Fortunately" without a backing argument) and your attempt to bully me out of my position ("recommend you accept this behaviour is here to stay") are not appreciated. You have *your* view of what is fortunate, right and appropriate. I took every care NOT to assert my own viewpoint as universally true; you have not done so.

Guido is just a person, as you are just a person, as I am just a person. Can we not please just stick to a simple, civilised discussion of the point without trying to win cheap debating points or use the "Zen" of Python to denigrate people who have either genuinely failed to grasp some aspect of a concept, or whose intuition is simply different. Without people whose intuition is different, no advancement is possible. Without debate about what constitutes the "Zen" of Python, the "Zen" of Python must always be static, unchanging, unchallenged and therefore cannot grow. I do not think that is what anyone meant when they were penning the "Zen" of Python.

This list is not best-served by grandstanding. It may not even be best-served by the now effectively personal debate which you have drawn me into through your personalisation of the issue (I quote: "your understanding is weak"). Terms such as weak and strong are inherently laden with ethical and social overtones -- incomplete, misplaced, or any number of other qualifiers could have kept the debate to the factual level.

Regards,
-Tennessee

Scott David Daniels

unread,
May 10, 2009, 1:28:13 AM5/10/09
to python...@python.org

Any argument for changing to a more "dynamic" default scheme had better
have a definition of the behavior of the following code, and produce a
good rationale for that behavior:

x = 5
def function_producer(y):
    def inner(arg=x+y):
        return arg + 2
    return inner

x = 6
f1 = function_producer(x)
x = 4.1
y = 7
f2 = function_producer(3)
print x, y, f1(), f2()
del y
x = 45
print x, f1(), f2()
del x
print f1(), f2()
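For reference, under the *current* rules the behaviour of code like this is well defined: each default is evaluated exactly once, when the inner `def` executes. A simplified sketch with integer values substituted to keep the arithmetic exact:

```python
x = 5
def function_producer(y):
    def inner(arg=x + y):   # evaluated now, with the current x
        return arg + 2
    return inner

x = 6
f1 = function_producer(x)   # default becomes 6 + 6 = 12
x = 4
f2 = function_producer(3)   # default becomes 4 + 3 = 7
del x                       # the stored defaults survive deleting x
assert f1() == 14
assert f2() == 9
```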

--Scott David Daniels
Scott....@Acm.Org

Terry Reedy

unread,
May 10, 2009, 1:47:12 AM5/10/09
to python...@python.org
Tennessee Leeuwenburg wrote:
>
> > For me, Python 3k appears to be a natural place to do this. Python 3
> > still appears to be regarded as a work-in-progress by most people,
> > and I don't think that it's 'too late' to change for Python 3k.

Sorry, it *is* too late. The developers have been very careful about
breaking 3.0 code in 3.1 only with strong justification. 3.1 is in
feature freeze as of a few days ago.

> Fortunately you're not Guido, and fortunately this isn't going to
> happen. I recommend you either accept that this behaviour is here to
> stay, or if you're *particularly* enamoured of late evaluation
> behaviour of defaults, that you work on some sort of syntax to make it
> optional.
> Thank you for the rest of the email, which was (by and large)
> well-considered and (mostly) stuck to the points of the matter. I will
> get to them in proper time when I have been able to add to the argument
> in a considered way after fully understanding your points.
>
> However, this last section really got under my skin. It seems completely
> inappropriate to devolve any well-intentioned email discussion into an
> appalling self-service ad-hominem attack.

I do not see any attack whatsoever, just advice which you took wrongly.

> ...se of Fortunately without a backing argument),

'Fortunately' as is clear from the context, was in respect to your
expressed casual attitude toward breaking code. Some people have a
negative reaction to that. In any case, it is a separate issue from
'default arguments'.

> attempt to bully me out of my position (recomment you accept this
behaviour
> is here to stay) are not appreciated.

He recommended that you not beat your head against a brick wall because
of a misconception about what is currently socially possible. He then
suggested something that *might* be possible. If that advice offends
you, so be it.

Terry Jan Reedy

George Sakkis

unread,
May 10, 2009, 2:32:07 AM5/10/09
to python-ideas
On Sun, May 10, 2009 at 1:28 AM, Scott David Daniels
<Scott....@acm.org> wrote:
>
> Any argument for changing to a more "dynamic" default scheme had better
> have a definition of the behavior of the following code, and produce a
> good rationale for that behavior:
>
>    x = 5
>    def function_producer(y):
>        def inner(arg=x+y):
>            return arg + 2
>        return inner

I don't think the proposed scheme was ever accused of not being
well-defined. Here's the current equivalent dynamic version:

x = 5
def function_producer(y):
    missing = object()
    def inner(arg=missing):
        if arg is missing:
            arg = x+y
        return arg + 2
    return inner

-1 for changing the current semantics (too much potential breakage),
+0.x for a new keyword that adds dynamic semantics (and removes the
need for the sentinel kludge).

George

Larry Hastings

unread,
May 10, 2009, 6:14:32 AM5/10/09
to python-ideas
George Sakkis wrote:
+0.x for a new keyword that adds dynamic semantics (and removes the
need for the sentinel kludge).


We don't need new syntax for it.  Here's a proof-of-concept hack showing that you can do it with a function decorator.
import copy

def clone_arguments(f):
  default_args = list(f.func_defaults or ())
  if len(default_args) < f.func_code.co_argcount:
    delta = f.func_code.co_argcount - len(default_args)
    default_args = ([None] * delta) + default_args
  def fn(*args):
    if len(args) < len(default_args):
      args = args + tuple(copy.deepcopy(default_args[len(args):]))
    return f(*args)

  return fn

@clone_arguments
def thing_taking_array(a, b = []):
  b.append(a)
  return b

print thing_taking_array('123')
print thing_taking_array('abc')
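(For readers on Python 3, a sketch of the same hack with the renamed attributes `__defaults__` and `__code__`:)

```python
import copy

def clone_arguments(f):
    default_args = list(f.__defaults__ or ())
    delta = f.__code__.co_argcount - len(default_args)
    default_args = [None] * delta + default_args
    def fn(*args):
        if len(args) < len(default_args):
            # deep-copy the unused defaults so each call gets fresh objects
            args = args + tuple(copy.deepcopy(default_args[len(args):]))
        return f(*args)
    return fn

@clone_arguments
def thing_taking_array(a, b=[]):
    b.append(a)
    return b

assert thing_taking_array('123') == ['123']
assert thing_taking_array('abc') == ['abc']   # default list was cloned, not shared
```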

-1 on changing Python one iota for this,


/larry/

Carl Johnson

unread,
May 10, 2009, 6:34:32 AM5/10/09
to python...@python.org
Larry Hastings wrote:
> George Sakkis wrote:
>
> +0.x for a new keyword that adds dynamic semantics (and removes the
> need for the sentinel kludge).
>
> We don't need new syntax for it.  Here's a proof-of-concept hack that you
> can do it with a function decorator.

Your decorator only works for mutables where you just want a deep
copy. It doesn't work for cases where you want a whole expression to
be re-evaluated from scratch. (Maybe for the side effects or
something.) That said, it couldn't be that hard to work out a similar
decorator using lambda thunks instead. The internals of the decorator
would be something like:

for n, arg in enumerate(args):
    if arg is defaults[n]:  # If you didn't get passed anything
        args[n] = defaults[n]()  # Unthunk the lambda

The usage might be:

@dynamicdefaults
def f(arg=lambda: dosomething()):

It cuts 3 lines of boilerplate down to one line, but makes all your
function calls a little slower.
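A runnable version of that sketch in Python 3 syntax (the decorator name and its treat-any-callable-default-as-a-thunk rule are assumptions for illustration; a real version would need to mark thunks explicitly):

```python
import inspect

def dynamicdefaults(f):
    # Assumption: every callable default is a thunk to re-evaluate per call.
    spec = inspect.getfullargspec(f)
    defaults = (dict(zip(spec.args[-len(spec.defaults):], spec.defaults))
                if spec.defaults else {})
    def wrapper(*args, **kwargs):
        bound = set(spec.args[:len(args)]) | set(kwargs)
        for name, default in defaults.items():
            if name not in bound and callable(default):
                kwargs[name] = default()   # unthunk the lambda
        return f(*args, **kwargs)
    return wrapper

@dynamicdefaults
def f(arg=lambda: []):
    arg.append(1)
    return arg

assert f() == [1]
assert f() == [1]        # a fresh list every call
assert f([7]) == [7, 1]  # an explicit argument is left alone
```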

-- Carl

spir

unread,
May 10, 2009, 7:19:29 AM5/10/09
to python...@python.org
On Sat, 9 May 2009 16:06:06 -1000,
Carl Johnson <cmjohnson....@gmail.com> wrote:

> I think this is a case where there are pros and cons on both sides.
> There are a lot of pros to the current behavior (performance,
> flexibility, etc.), but it comes with the con of confusing newbies and
> making people go through the same song and dance to set a  "sentinel
> value" when the want the other behavior and they can't ensure that
> None won't be passed. The newbie problem can't be fixed from now until
> Python 4000, since it would break a lot of existing uses of default
> values, but we could cut down on the annoyance of setting and check a
> sentinel value by introducing a new keyword, eg.
>
> def f(l=fresh []):
>    ...
>
> instead of
>
> __blank = object()
> def f(l=__blank):
>    if l is __blank:
>        l = []
>    ...

[...]

Maybe the correctness of the current behaviour can be checked by a little mental experiment.

=======
Just imagine Python didn't have default arguments yet, and they were the object of a PEP. An implementation similar to the current one is proposed. Then, people realise that in the case where the given value happens to be mutable _and_ updated in the function body, it acts as a static variable... What do you think should/would be decided?

-1- Great, we get static variables for free. It is a worthful feature we expected for a while. In numerous use cases they will allow easier and much more straightforward code. Let's go for it.

-2- Well, static variables may be considered useful, in which case there should be a new PEP for them. Conceptually, they are a totally different feature, we shall certainly not mess up both, shall we?
=======

I bet on n°2, for the reasoning of people stating it's a major gotcha would be hard to ignore. But I may be wrong. Still, default arguments actually *are* called "default arguments", which means they should be considered as such, while they do not behave as such in all cases.
Now, we must consider the concrete present situation, in which their real behaviour is used as a common workaround. I do not really understand why default args are used as static vars while at least one other possibility exists in Python which is semantically much more consistent:

### instead of "def callSave(number, record=[])"
### just set record on the func:
def callSave(value):
    callSave.record.append(value)
    return callSave.record
callSave.record = []
print callSave(1) ; print callSave(2) ; print callSave(3)
==>
[1]
[1, 2]
[1, 2, 3]

Also, func attributes are an alternative for another common (mis)use of default arguments, namely the case of a function factory:

def paramPower(exponent):
    ### instead of "def power(number, exponent=exponent)"
    ### just set exponent on the func:
    def power(number):
        return number**power.exponent
    power.exponent = exponent
    return power
power3 = paramPower(3) ; power5 = paramPower(5)
print power3(2) ; print power5(2)
==>
8
32

In both cases, the notion of a func attribute rather well matches the variable value's meaning. As a consequence, I find this solution much nicer for a func factory as well as for a static variable.

Denis
------
la vita e estrany

Tennessee Leeuwenburg

unread,
May 10, 2009, 8:12:25 AM5/10/09
to Terry Reedy, python...@python.org

However, this last section really got under my skin. It seems completely inappropriate to devolve any well-intentioned email discussion into an appalling self-service ad-hominem attack.

I do not see any attack whatsoever, just advice which you took wrongly.

...se of Fortunately without a backing argument),

'Fortunately' as is clear from the context, was in respect to your expressed casual attitude toward breaking code.  Some people have a negative reaction to that.  In any case, it is a separate issue from 'default arguments'.


> attempt to bully me out of my position (recomment you accept this behaviour
> is here to stay) are not appreciated.

He recommended that you not beat your head against a brick wall because of a misconception about what is currently socially possible.  He then suggested something that *might* be possible.  If that advice offends you, so be it.

It's not the content of the advice (don't push stuff uphill) which got to me at all, it was the tone and manner in which it was conveyed. Much of the email was well-balanced, which I fully acknowledged. Maybe you're just more inclined to overlook a few bits of innuendo, and probably most of the time so am I. However, it's actually not okay, and the implied personal criticism was very clearly present. It wasn't severe, and perhaps my reaction was quite forceful, but it's just not okay to be putting people down.

I don't have a casual attitude towards breaking code, just an open mind towards discussions on their merits. I don't really appreciate the negative tones, and I'm sure that if anyone else were in the firing line, they wouldn't appreciate it either, even if to some extent it's all a bit of a storm in a teacup. Unless someone who is happy to cop a bit of flak stands up and says that's not on, then maintaining a "thick skin" -- i.e. putting up with people putting you down, be it through a clear and direct put-down, or through a more subtle implication -- becomes the norm. It becomes acceptable, perhaps indeed even well-regarded, to take a certain viewpoint and then suggest that anyone who doesn't share it is doing something wrong.

Well nuts to that. Emails are, as everyone should know, an unclear communication channel. I've found myself on the wrong side of this kind of debate before, and I've heard plenty of stories of people who were put down, pushed out or made to feel stupid -- and for what? There are just so many stories, many of which I have heard first-hand, of people who have felt alienated on online lists where prowess and insight are so highly regarded that they become means by which others are put down. It's that larger problem which people need not to put up with.

However, I'm just about to go offline for 12 hours or so, and I know the US will be waking up to their emails shortly, so I just wanted to take this opportunity before the sun rotates again to say to the list and the original author that I'd really like to avoid a continued shouting contest or make anyone upset. I've obviously ruffled some feathers already, and I guess probably this email may ruffle some more, but really I just want to make clear that :
  (a) It's not okay to put myself or anyone else down, claiming some personal superiority
  (b) That attitude is all this email is about. It doesn't need to be any bigger than that.

Regards,
-Tennessee

Georg Brandl

unread,
May 10, 2009, 10:58:30 AM5/10/09
to python...@python.org
spir wrote:

> Also, func attributes are an alternative for another common (mis)use of default arguments, namely the case of a function factory:
>
> def paramPower(exponent):
>     ### instead of "def power(number, exponent=exponent)"
>     ### just set exponent on the func:
>     def power(number):
>         return number**power.exponent
>     power.exponent = exponent
>     return power
> power3 = paramPower(3) ; power5 = paramPower(5)
> print power3(2) ; print power5(2)
> ==>
> 8
> 32

You don't need a function attribute here; just "exponent" will work fine.

The problem is where you define multiple functions, e.g. in a "for" loop,
and function attributes don't help there:

adders = []

for i in range(10):
    def adder(x):
        return x + i
    adders.append(adder)
This will fail to do what it seems to do; you need to have some kind of
binding "i" in a scope where it stays constant. You could use a "make_adder"
factory function, similar to your paramPower, but that is more kludgy than
necessary, because it can easily be solved by a default argument.
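The default-argument fix binds each loop value at definition time, which is exactly what the once-only semantics are good for:

```python
adders = []
for i in range(10):
    def adder(x, i=i):   # i=i captures the current loop value
        return x + i
    adders.append(adder)

assert adders[3](10) == 13
assert adders[7](10) == 17
```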

Georg

--
Thus spake the Lord: Thou shalt indent with four spaces. No more, no less.
Four shall be the number of spaces thou shalt indent, and the number of thy
indenting shall be four. Eight shalt thou not indent, nor either indent thou
two, excepting that thou then proceed to four. Tabs are right out.


Georg Brandl

unread,
May 10, 2009, 10:59:11 AM5/10/09
to python...@python.org
Tennessee Leeuwenburg wrote:

> I don't have a casual attitude towards breaking code, just an open mind
> towards discussions on their merits. I don't really appreciate the
> negative tones, and I'm sure that if anyone else is in the firing line,
> they wouldn't appreciate either, even if it to some extent it's all a
> bit of a storm in a teacup. Unless someone who is happy to cop a bit of
> flak stands up and says that's not on, then maintaining a "thick skin"
> -- i.e. putting up with people putting you down, be it through a clear
> and direct put-down, or through a more subtle implication -- becomes the
> norm. It becomes acceptable, perhaps indeed even well-regarded, to take
> a certain viewpoint then suggest that anyone who doesn't share it is
> doing something wrong.

The problem is that people whose proposal immediately meets negative
reactions usually feel put down no matter what exactly you say to them.
If there was a polite way of saying "This will not change, please don't
waste more of our time with this discussion." that still gets the point
across, I would be very grateful.

Georg

--
Thus spake the Lord: Thou shalt indent with four spaces. No more, no less.
Four shall be the number of spaces thou shalt indent, and the number of thy
indenting shall be four. Eight shalt thou not indent, nor either indent thou
two, excepting that thou then proceed to four. Tabs are right out.


Arnaud Delobelle

unread,
May 10, 2009, 12:10:33 PM5/10/09
to Larry Hastings, python-ideas

On 10 May 2009, at 11:14, Larry Hastings wrote:

> George Sakkis wrote:
>>
>> +0.x for a new keyword that adds dynamic semantics (and removes the
>> need for the sentinel kludge).
>
>
> We don't need new syntax for it. Here's a proof-of-concept hack
> that you can do it with a function decorator.
> import copy

Note that we don't need syntax for default arguments either :) Here is
a decorator that does it:

def default(**defaults):
    defaults = defaults.items()
    def decorator(f):
        def decorated(*args, **kwargs):
            for name, val in defaults:
                kwargs.setdefault(name, val)
            return f(*args, **kwargs)
        return decorated
    return decorator


Here it is in action:


>>> z=1
>>> @default(z=z)
... def foo(a, z):
...     print a + z
...
>>> z=None
>>> foo(3)
4
>>>
>>> @default(history=[])
... def bar(x, history):
...     history.append(x)
...     return list(history)
...
>>> map(bar, 'spam')
[['s'], ['s', 'p'], ['s', 'p', 'a'], ['s', 'p', 'a', 'm']]


Let's get rid of default arguments altogether, and we will have solved
the problem!


Furthermore, by removing default arguments from the language, we can
let people choose what semantics they want for default arguments.
I.e., if they want them to be reevaluated each time, they could write
the default decorator as follows (it is exactly the same as the one
above except for a pair of parentheses that have been added on one line):

def dynamic_default(**defaults):
    defaults = defaults.items()
    def decorator(f):
        def decorated(*args, **kwargs):
            for name, val in defaults:
                kwargs.setdefault(name, val())
                # ^^
            return f(*args, **kwargs)
        return decorated
    return decorator


Example:


>>> @dynamic_default(l=list)
... def baz(a, l):
...     l.append(a)
...     return l
...
>>> baz(2)
[2]
>>> baz(3)
[3]


;)

--
Arnaud

George Sakkis

unread,
May 10, 2009, 1:00:15 PM5/10/09
to python-ideas
On Sun, May 10, 2009 at 12:10 PM, Arnaud Delobelle
<arn...@googlemail.com> wrote:

> Furthemore, by removing default arguments from the language, we can let
> people choose what semantics they want for default arguments.  I.e, if they
> want them to be reevaluated each time, they could write the default
> decorator as follows (it is exactly the same as the one above except for a
> pair of parentheses that have been added on one line.

Cute, but that's still a subset of what the dynamic semantics would
provide; the evaluated thunks would have access to the previously
defined arguments:

def foo(a, b, d=(a*a+b+b)**0.5, s=1/d):
    return (a,b,d,s)

would be equivalent to

missing = object()
def foo(a, b, d=missing, s=missing):
    if d is missing:
        d = (a*a+b+b)**0.5
    if s is missing:
        s = 1/d
    return (a,b,d,s)

George

Scott David Daniels

unread,
May 10, 2009, 2:30:49 PM5/10/09
to python...@python.org
George Sakkis wrote:
> On Sun, May 10, 2009 at 1:28 AM, Scott David Daniels
> <Scott....@acm.org> wrote:
>> Any argument for changing to a more "dynamic" default scheme had better
>> have a definition of the behavior of the following code, and produce a
>> good rationale for that behavior:
>>
>> x = 5
>> def function_producer(y):
>>     def inner(arg=x+y):
>>         return arg + 2
>>     return inner
>
> I don't think the proposed scheme was ever accused of not being
> well-defined. Here's the current equivalent dynamic version:
>
> x = 5
> def function_producer(y):
>     missing = object()
>     def inner(arg=missing):
>         if arg is missing:
>             arg = x+y
>         return arg + 2
>     return inner

So sorry.

def function_producer(y):
    def inner(arg=x+y):
        return arg + 2
    y *= 10
    return inner

I was trying to point out that it becomes much trickier building
functions with dynamic parts, and fluffed the example.

--Scott David Daniels
Scott....@Acm.Org

Larry Hastings

unread,
May 10, 2009, 3:27:15 PM5/10/09
to Arnaud Delobelle, python-ideas

Arnaud Delobelle wrote:
> Not that we don't need syntax for default arguments either :) Here is
> a decorator that does it: [...] Let's get rid of default arguments
> altogether, and we will have solved the problem! Furthemore, by
> removing default arguments from the language, we can let people choose
> what semantics they want for default arguments. [...] ;)

Comedy or not, I don't support getting rid of default arguments. Nor do
I support changing the semantics of default arguments so they represent
code that is run on each function invocation.

As I demonstrated, people can already choose what semantics they want
for default arguments, by choosing whether or not to decorate their
functions with clone_arguments or the like. We don't need to remove
default arguments from the language for that to happen.


/larry/

George Sakkis

unread,
May 10, 2009, 4:06:53 PM5/10/09
to python-ideas
On Sun, May 10, 2009 at 1:00 PM, George Sakkis <george...@gmail.com> wrote:

> On Sun, May 10, 2009 at 12:10 PM, Arnaud Delobelle
> <arn...@googlemail.com> wrote:
>
>> Furthemore, by removing default arguments from the language, we can let
>> people choose what semantics they want for default arguments.  I.e, if they
>> want them to be reevaluated each time, they could write the default
>> decorator as follows (it is exactly the same as the one above except for a
>> pair of parentheses that have been added on one line.
>
> Cute, but that's still a subset of what the dynamic semantics would
> provide; the evaluated thunks would have access to the previously
> defined arguments:
>
> def foo(a, b, d=(a*a+b+b)**0.5, s=1/d):
>    return (a,b,d,s)
>
> would be equivalent to
>
> missing = object()
> def foo(a, b, d=missing, s=missing):
>    if d is missing:
>        d = (a*a+b+b)**0.5
>    if s is missing:
>        s = 1/d
>    return (a,b,d,s)


Just for kicks, here's a decorator that supports dependent dynamically
computed defaults; it uses eval() to create the lambdas on the fly:

@evaldefaults('s','d')
def foo(a, b, d='(a*a+b*b)**0.5', t=0.1, s='(1+t)/(d+t)'):
    return (a,b,d,t,s)

print foo(3,4)

#=======

import inspect
import functools
# from http://code.activestate.com/recipes/551779/
from getcallargs import getcallargs

def evaldefaults(*eval_params):
    eval_params = frozenset(eval_params)
    def decorator(f):
        params,_,_,defaults = inspect.getargspec(f)
        param2default = dict(zip(params[-len(defaults):], defaults)) if defaults else {}
        param2lambda = {}
        for p in eval_params:
            argsrepr = ','.join(params[:params.index(p)])
            param2lambda[p] = eval('lambda %s: %s' % (argsrepr, param2default[p]),
                                   f.func_globals)
        @functools.wraps(f)
        def wrapped(*args, **kwargs):
            allkwargs, missing = getcallargs(f, *args, **kwargs)
            missing_eval_params = eval_params.intersection(missing)
            f_locals = {}
            for i, param in enumerate(params):
                value = allkwargs[param]
                if param in missing_eval_params:
                    allkwargs[param] = value = param2lambda[param](**f_locals)
                f_locals[param] = value
            return f(**allkwargs)
        return wrapped
    return decorator

Tennessee Leeuwenburg

unread,
May 10, 2009, 8:45:31 PM5/10/09
to Steven D'Aprano, python...@python.org
On Sun, May 10, 2009 at 11:23 AM, Steven D'Aprano <st...@pearwood.info> wrote:
On Sun, 10 May 2009 10:19:01 am Tennessee Leeuwenburg wrote:
> Hi Pascal,
> Taking the example of
>
> def foo(bar = []):
>   bar.append(4)
>   print(bar)
>
> I'm totally with you in thinking that what is 'natural' is to expect
> to get a new, empty, list every time.

That's not natural to me. I would be really, really surprised by the
behaviour you claim is "natural":

>>> DEFAULT = 3
>>> def func(a=DEFAULT):
...     return a+1
...
>>> func()
4
>>> DEFAULT = 7
>>> func()
8
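(For contrast, here is what the current semantics actually do with that example; the default captured 3 once, when the `def` ran:)

```python
DEFAULT = 3
def func(a=DEFAULT):
    return a + 1

assert func() == 4
DEFAULT = 7
assert func() == 4        # rebinding DEFAULT has no effect on the stored default
assert func(DEFAULT) == 8 # passing it explicitly does
```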

Good example! If I may translate that back into the example using a list to make sure I've got it right...

default = []

def func(a=default):
   a.append(5)

func()
func()

default will now be [5,5]
 
For deterministic functions, the same argument list should return the
same result each time. By having default arguments be evaluated every
time they are required, any function with a default argument becomes
non-deterministic. Late evaluation of defaults is, essentially,
equivalent to making the default value a global variable. Global
variables are rightly Considered Harmful: they should be used with
care, if at all.


If I can just expand on that point somewhat...

In the example I gave originally, I had in mind someone designing a function, whereby it could be called either with some pre-initialised term, or otherwise it would use a default value of []. I imagined a surprised designer finding that the default value of [] was a pointer to a specific list, rather than a new empty list each time.

e.g.

def foo(bar = []):
    bar.append(5)
    return bar

The same argument list (i.e. no arguments) would result in a different result being returned every time. On the first call, bar would be [5], then [5,5], then [5,5,5]; yet the arguments passed (i.e. none, use default) would not have changed.

You have come up with another example. I think it is designed to illustrate that a default argument doesn't need to specify a default value for something, but could be a default reference (such as a relatively-global variable). In that case, it is modifying something above its scope. To me, that is what you would expect under both "ways of doing things". I wonder if I am missing your point...

I'm totally with you on the Global Variables Are Bad principle, however. I don't design them in myself, and where I have worked with them, usually they have just caused confusion.

> However this isn't want
> happens. As far as I'm concerned, that should more or less be the end
> of the discussion in terms of what should ideally happen.

As far as I'm concerned, what Python does now is the idea behaviour.
Default arguments are part of the function *definition*, not part of
the body of the function. The definition of the function happens
*once* -- the function isn't recreated each time you call it, so
default values shouldn't be recreated either.

I agree that's how you see things, and possibly how many people see things, but I don't accept that it is a more natural way of seeing things. However, what *I* think is more natural is just one person's viewpoint... I totally see the philosophical distinction you are trying to draw, and it certainly does help to clarify why things are the way they are. However, I just don't know that it's the best way they could be.

> The responses to the change in behaviour which I see as more natural
> are, to summarise, as follows:
>   -- For all sorts of technical reasons, it's too hard
>   -- It changes the semantics of the function definition being
> evaluated at compile time
>   -- It's not what people are used to

And it's not what many people want.

You only see the people who complain about this feature. For the
multitude of people who expect it or like it, they have no reason to
say anything (except in response to complaints). When was the last time
you saw somebody write to the list to say "Gosh, I really love that
Python uses + for addition"? Features that *just work* never or rarely
get mentioned.

:) ... well, that's basically true. Of course there are some particular aspects of Python which are frequently mentioned as being wonderful, but I see your point.  However, I'm not sure we really know one way or another about what people want then -- either way.
 
> With regards to the second point, it's not like the value of
> arguments is set at compile time, so I don't really see that this
> stands up.

I don't see what relevance that has. If the arguments are provided at
runtime, then the default value doesn't get used.

I think this is the fundamental difference -- to me it speaks worlds :) ... I think you just have a different internal analogy for programming than I do. That's fine. To me, I don't see why a line of code should not be dynamically evaluated just because it's part of the definition. I just don't see why default values shouldn't be (or be able to be) dynamically evaluated. Personally I think that doing it all the time is more natural, but I certainly don't see why allowing the syntax would be bad. I'd basically do that 100% of the time. I'm not sure I've ever used a default value other than None in a way which I wouldn't want dynamically evaluated.
 
> I don't think it's intuitive,

Why do you think that intuitiveness is more valuable than performance
and consistency?

Because I like Python more than C? I'm pretty sure everyone here would agree that in principle, elegance of design and intuitive syntax are good. Agreeing on what that means might involve some robust discussion, but I think everyone would like the same thing. Well, consistency is pretty hard to do without... :)
 
Besides, intuitiveness is a fickle thing. Given this pair of functions:

def expensive_calculation():
   time.sleep(60)
   return 1

def useful_function(x=expensive_calculation()):
   return x + 1

I think people would be VERY surprised that calling useful_function()
with no arguments would take a minute *every time*, and would complain
that this slowness was "unintuitive".

That seems at first like a good point. It is a good point, but I don't happen to side with you on this issue, although I do see that many people might. The code that I write is not essentially performance-bound. It's a lot more design-bound (by which I mean it's very complicated, and anything I can do to simplify it is well-worth a bit of a performance hit).

However, when the design options are available (setting aside what default behaviour should be), it's almost always possible to design things how you'd like them.

e.g.

def speed_critical_function(x=None):
    if x is None:
        time.sleep(60)

    return 1

def handy_simple_function(foo=5, x=[]):  # or maybe: (foo=5, x = new [])
    for i in range(5):
        x.append(i)

    return x

Then, thinking about it a little more (and bringing back a discussion of default behaviour), I don't really see why the implementation of the dynamic function definition would be any slower than using None to indicate it wasn't passed in, followed by explicit default-value setting.
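[For what it's worth, the design being described can be sketched in today's Python with a hypothetical decorator -- all names below are invented -- that re-runs a zero-argument factory for any argument the caller omits:]

```python
import functools
import inspect

def dynamic_defaults(**factories):
    """Hypothetical sketch: call each factory afresh whenever the
    corresponding argument is not supplied by the caller."""
    def decorator(func):
        sig = inspect.signature(func)
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            for name, factory in factories.items():
                if name not in bound.arguments:
                    bound.arguments[name] = factory()  # fresh value per call
            return func(*bound.args, **bound.kwargs)
        return wrapper
    return decorator

@dynamic_defaults(x=list)          # x defaults to a *new* list each call
def handy_simple_function(foo=5, x=None):
    for i in range(foo):
        x.append(i)
    return x
```

The per-call overhead is one signature bind plus the factory calls, which is in the same ballpark as the usual explicit `if x is None: x = []` preamble.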

 
> it's just that people become
> accustomed to it. There is indeed, *some sense* in understanding that
> the evaluation occurs at compile-time, but there is also a lot of
> sense (and in my opinion, more sense) in understanding the evaluation
> as happening dynamically when the function is called.

No. The body of the function is executed each time the function is
called. The definition of the function is executed *once*, at compile
time. Default arguments are part of the definition, not the body, so
they too should only be executed once. If you want them executed every
time, put them in the body:

def useful_function(x=SENTINEL):
   if x is SENTINEL:
       x = expensive_calculation()
   return x+1

I agree that's how things *are* done, but I just don't see why it should be that way, beyond it being what people are used to. It seems like there is no reason why it would be difficult to implement CrazyPython which does things as I suggest. Given that, it also doesn't seem like there is some inherent reason to prefer the design style of RealActualPython over CrazyPython. Except, of course that RealActualPython exists and I can use it right now (thanks developers!), versus CrazyPython which is just an idea.



Your logic is impeccable :) ... yet, if I may continue to push my wheelbarrow uphill for a moment longer, I would argue that is an implementation detail, not a piece of design philosophy.
 
(2) The potential performance degradation of re-evaluating default
arguments at runtime is great. For practical reasons, it's best to
evaluate them once only.

Maybe that's true. I guess I have two things to say on that point. The first is that I'm still not sure that's really true in a problematic way. Anyone wanting efficiency could continue to use sentinel  values of None (which obviously don't need to be dynamically evaluated) while other cases would surely be no slower than the initialisation code would be anyway. Is the cost issue really that big a problem?

The other is that while pragmatics is, of course, a critical issue, it's also true that it's well worth implementing more elegant language features where possible. It's always a balance. The fastest languages are always less 'natural', while the more elegant and higher-level languages are somewhat slower. Where a genuine design improvement is found, I think it's worth genuinely considering including that improvement, even if it is not completely pragmatic.
 
(3) In order to get the behaviour you want, the Python compiler would
need a more complicated implementation which would be hard to explain.

Yes, that's almost certainly true.
 
> I don't see it as changing one way of
> doing things for another equally valid way of doing things, but
> changing something that's confusing and unexpected for something
> which is far more natural and, to me, Pythonic.

I'm sorry, while re-evaluation of default arguments is sometimes useful,
it's more often NOT useful. Most default arguments are simple objects
like small ints or None. What benefit do you gain from re-evaluating
them every single time? Zero benefit. (Not much cost either, for simple
cases, but no benefit.)

But for more complex cases, there is great benefit to evaluating default
arguments once only, and an easy work-around for those rare cases that
you do want re-evaluation.

Small ints and None are global pointers (presumably!) so there is no need to re-evaluate them every time. The list example is particularly relevant (ditto empty dictionary) since I think that would be one of the most common cases for re-evaluation. Presumably a reasonably efficient implementation could be worked out such that dynamic evaluation of the default arguments (and indeed the entire function definition) need only occur where a dynamic default value were included.

I agree that the workaround is not that big a deal once you're fully accustomed to How Things Work, but it just seems 'nicer' to allow dynamic defaults. That's all I really wanted to say in the first instance, I didn't think that position would really get anyone's back up.
 
Regards,
-Tennessee

Tennessee Leeuwenburg

unread,
May 10, 2009, 9:09:50 PM5/10/09
to Georg Brandl, python...@python.org


On Mon, May 11, 2009 at 12:59 AM, Georg Brandl <g.br...@gmx.net> wrote:
Tennessee Leeuwenburg schrieb:

> I don't have a casual attitude towards breaking code, just an open mind
> towards discussions on their merits. I don't really appreciate the
> negative tones, and I'm sure that if anyone else is in the firing line,
> they wouldn't appreciate either, even if it to some extent it's all a
> bit of a storm in a teacup. Unless someone who is happy to cop a bit of
> flak stands up and says that's not on, then maintaining a "thick skin"
> -- i.e. putting up with people putting you down, be it through a clear
> and direct put-down, or through a more subtle implication -- becomes the
> norm. It becomes acceptable, perhaps indeed even well-regarded, to take
> a certain viewpoint then suggest that anyone who doesn't share it is
> doing something wrong.

The problem is that people whose proposal immediately meets negative
reactions usually feel put down no matter what exactly you say to them.
If there was a polite way of saying "This will not change, please don't
waste more of our time with this discussion." that still gets the point
across, I would be very grateful.

I understand that. I think there is a good way to do it. First of all, I would recognise that this is the python-ideas list, not the python-dev list, and that this is *exactly* the place to discuss ideas on their merits, and potentially put aside pragmatics to engage in a discussion of design philosophy.

For bad ideas, I would suggest:
   "Thanks for your contribution. However, this has been discussed quite a lot before, and the groundswell of opinion is most likely that this is not going to be a good addition to Python. However, if you'd like to discuss the idea further, please consider posting it to comp.lang.py."

For good/okay ideas that just won't get up, I would suggest:
  "Thanks for your contribution. I see your point, but I don't think it's likely to get enough traction amongst the developer community for someone else to implement it. However, if you'd like more feedback on your ideas so that you can develop a PEP or patch, please feel free to have a go. However, please don't be disappointed if it doesn't get a lot of support unless you are happy to provide some more justification for your position.".

I don't really think anyone on a mailing list really needs to waste any more time than they want to -- just ignore the thread.

I would definitely avoid things like:
  "You clearly have no idea what you are talking about"
  "If you only knew what I knew, you'd know differently"

It's probably not possible to avoid people with an idea feeling deflated if their ideas are not popular, but on an ideas list such as this, I think that having conversations should be encouraged. Certainly that's what got under my skin. If I was chatting in person, or with friends, or in a meeting, the appropriate thing to do would be to say "Hey, that's a bit rough!" and then probably the attitude would be wound back, or the person would respond with "Oh, that's not what I meant, I just meant this..." and the misunderstanding would be quickly resolved.

Unfortunately, email just *sucks* for telling the difference between someone with a chip on their shoulder, or someone who is just being helpful and made a bad choice of words.

Cheers,
-T

Terry Reedy

unread,
May 11, 2009, 12:05:26 AM5/11/09
to python...@python.org
George Sakkis wrote:
> On Sun, May 10, 2009 at 1:28 AM, Scott David Daniels
> <Scott....@acm.org> wrote:
>> Any argument for changing to a more "dynamic" default scheme had better
>> have a definition of the behavior of the following code, and produce a
>> good rationale for that behavior:
>>
>> x = 5
>> def function_producer(y):
>>     def inner(arg=x+y):
>>         return arg + 2
>>     return inner

In this version, x is resolved each time function_producer is
called, which could be different each time. The x = 5 line is possibly
irrelevant.

> I don't think the proposed scheme was ever accused of not being
> well-defined. Here's the current equivalent dynamic version:
>
> x = 5
> def function_producer(y):
>     missing = object()
>     def inner(arg=missing):
>         if arg is missing:
>             arg = x+y
>         return arg + 2
>     return inner


In this version, x is resolved each time each of the returned inners is
called, which could be different each time and which could be different
from any of the times function_producer was called. Not equivalent.
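[The difference can be checked directly in current Python; the values below are chosen only for illustration:]

```python
x = 5

def function_producer(y):
    def inner(arg=x + y):      # x + y evaluated NOW, when inner is defined
        return arg + 2
    return inner

def function_producer2(y):
    missing = object()
    def inner(arg=missing):
        if arg is missing:
            arg = x + y        # x looked up at CALL time
        return arg + 2
    return inner

frozen = function_producer(10)    # default computed as 5 + 10 = 15
dynamic = function_producer2(10)
x = 100                           # rebind the global afterwards
print(frozen())                   # 17: still uses the x seen at definition
print(dynamic())                  # 112: sees the new x
```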

Given how often this issue is resuscitated, I will consider raising
my personal vote from -1 to -0.

My friendly advice to anyone trying to promote a change is to avoid
categorical (and false or debatable) statements like "Newbies get
confused" (Some do, some do not) or "The intuitive meaning is" (to you
maybe, not to me). Also avoid metaphysical issues, especially those
that divide people.*

Do focus on practical issues, properly qualified statements, and how a
proposal would improve Python.

Terry Jan Reedy

* The initial argument for changing the meaning of int/int amounted to
this: "integers are *merely* a subset of reals, therefore....". Those
with a different philosophy dissented and the actual proposal was
initially lost in the resulting fire.

Stephen J. Turnbull

unread,
May 11, 2009, 4:57:49 AM5/11/09
to Tennessee Leeuwenburg, Georg Brandl, python...@python.org
> On Mon, May 11, 2009 at 12:59 AM, Georg Brandl <g.br...@gmx.net> wrote:

> > The problem is that people whose proposal immediately meets
> > negative reactions usually feel put down no matter what exactly
> > you say to them. If there was a polite way of saying "This will
> > not change, please don't waste more of our time with this
> > discussion." that still gets the point across, I would be very
> > grateful.

I thought that the way to do that is to say "This proposal is
un-Pythonic".[1]

Now, a claim that something is "un-Pythonic" is basically the end of
discussion, since in the end it's a claim that the BDFL would reject
it, not something based entirely on "objective criteria". Even the
BDFL sometimes doesn't know what's Pythonic until he sees it! So a
bare claim of "un-Pythonic" is borrowing the BDFL's authority, which
should be done very cautiously.

IMO, the "problem" arose here because Tennessee went outside of
discussing the idea on its merits. Rather, he wrote "the current
situation seems un-Pythonic," but didn't point to violations of any of
the usual criteria. OTOH, Steven did discuss the idea on its Pythonic
merits, showing that it arguably violates at least three of the tenets
in the Zen of Python.

In turn, Steven could have left well-enough alone and avoided
explicitly pointing out that, therefore, Tennessee doesn't seem to
understand what "Pythonic" is. FWIW, early in my participation in
Python-Dev I was told, forcefully, that I wasn't qualified to judge
Pythonicity. That statement was (and is) true, and being told off was
a very helpful experience for me. How and when to say it is another
matter, but avoiding it entirely doesn't serve the individual or the
community IMO.

Note that this whole argument does not apply to terms like "natural,"
"intuitive," "readable," etc., only to the pivotal term "Pythonic".
The others are matters of individual opinion. AIUI, "Pythonic" is a
matter of *the BDFL's* opinion (if it comes down to that, and
sometimes it does).


Footnotes:
[1] If it's a fundamental conceptual problem. "This proposal is
incompatible" can be used if it's a matter of violating the language
definition according to the language reference and any relevant PEPs.

Stephen J. Turnbull

unread,
May 11, 2009, 5:29:40 AM5/11/09
to python...@python.org
Terry Reedy writes:

> Given how often this issue is resuscitated, I will consider raising
> my personal vote from -1 to -0.

> Do focus on practical issues, properly qualified statements, and how a
> proposal would improve Python.

One thing I would like to see addressed is use-cases where the
*calling* syntax *should* use default arguments.[1] In the case of
the original example, the empty list, I think that requiring the
argument, and simply writing "foo([])" instead of "foo()" is better,
on two counts: EIBTI, and TOOWTDI. And it's certainly not an
expensive adjustment.

In a more complicated case, it seems to me that defining (and naming)
a separate function would often be preferable to defining a
complicated default, or explicitly calling a function in the actual
argument. Ie, rather than

def consume_integers(ints=fibonacci_generator()):
    for i in ints:
        # suite and termination condition

why not

def consume_integers(ints):
    for i in ints:
        # suite and termination condition

def consume_fibonacci():
    consume_integers(fibonacci_generator())

or

def consume_integers_internal(ints):
    for i in ints:
        # suite and termination condition

def consume_integers():
    consume_integers_internal(fibonacci_generator())

depending on how frequent or intuitive the "default" Fibonacci
sequence is? IMO, for both above use-cases EIBTI applies as an
argument that those are preferable to a complex, dynamically-
evaluated default, and for the second TOOWTDI also applies.

Footnotes:
[1] In the particular cases being advocated as support for dynamic
evaluation of default arguments, not in general. It is clear to me
that having defaults for rarely used option arguments is a good thing,
and I think that is a sufficient justification for Python to support
default arguments.

Aahz

unread,
May 11, 2009, 8:29:54 AM5/11/09
to python...@python.org
On Mon, May 11, 2009, Tennessee Leeuwenburg wrote:
> On Sun, May 10, 2009 at 11:23 AM, Steven D'Aprano <st...@pearwood.info>wrote:
>>
>> (1) Assignments outside of the body of a function happen once, at
>> compile time. Default values are outside the body of the function. You
>> want a special case for default values so that they too happen at
>> runtime. That's not special enough to warrant breaking the rules.
>
> Your logic is impeccable :) ... yet, if I may continue to push my
> wheelbarrow uphill for a moment longer, I would argue that is an
> implementation detail, not a piece of design philosophy.

I'm not going to look it up, but in the past, Guido has essentially
claimed that this behavior is by design (not so much by itself but in
conjunction with other deliberate decisions). I remind you of this:

"Programming language design is not a rational science. Most reasoning
about it is at best rationalization of gut feelings, and at worst plain
wrong." --GvR, python-ideas, 2009-3-1
--
Aahz (aa...@pythoncraft.com) <*> http://www.pythoncraft.com/

"It is easier to optimize correct code than to correct optimized code."
--Bill Harlan

Pascal Chambon

unread,
May 11, 2009, 3:49:14 PM5/11/09
to python...@python.org

Well well well,

lots of interesting points have flowed there, I won't have any chance of reacting to each one ^^


>> And no one seemed to enjoy the possibilities of getting "potentially static variables" this way.
> You did not search hard enough.

Would anyone mind pointing me to people that have made a sweet use of mutable default arguments ?
At the moment I've only run into "memoization" thingies, which looked rather awkward to me : either the "memo" has to be used from elsewhere in the program, and having to access it by browsing the "func_defaults" looks not at all "KISS", or it's really some private data from the function, and having it exposed in its interface is rather error-prone.
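[For reference, the memoization idiom in question looks something like this; the Fibonacci function is just an illustrative example:]

```python
def fib(n, _cache={}):
    # The {} is created once, at definition time, so it persists across
    # calls -- here the "gotcha" is exploited deliberately as a cache.
    if n not in _cache:
        _cache[n] = n if n < 2 else fib(n - 1) + fib(n - 2)
    return _cache[n]
```

As Pascal notes, the cache then leaks into the public interface: it is reachable (and clobberable) through the function's default-argument attribute.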


[...]
>> Does it exist ? Do we have any way, from inside a call block, to
>> browse the default arguments that this code block might receive ?
[...]
> dir(func_object) is your friend :)

Whoups, I meant "from inside a *code* block" :p
But that's the "self function" thread you noted, I missed that one... thanks for pointing it ^^


>> On the other hand, would anyone support my alternative wish, of
>> having a builtin "NotGiven", similar to "NotImplemented", and
>> dedicated to this somehow usual task of "placeholder" ?

> There already is such a beast: None is designed to be used as a
> placeholder for Not Given, Nothing, No Result, etc.
>
> If None is not suitable, NotImplemented is also a perfectly good
> built-in singleton object which can be used as a sentinel. It's already
> used as a sentinel for a number of built-in functions and operators.
> There's no reason you can't use it as well.

Well, to me there was a kind of ternary semantics for arguments :
*None -> make without this argument / don't use this feature
*"NotGiven" -> create the parameter yourself
*"Some value" -> use this value as a parameter
But after reflection, it's quite rare that the three meanings have to be used in the same place, so I guess it's ok like it is...

Even though, still, I'd not be against new, more explicit builtins. "None" has too many meanings to be "self documenting", and I feel "NotImplemented" doesn't really fit where we mean "notGiven"
That's the same thing for exceptions : I've seen people forced to make ugly workarounds because they got "ValueError" where they would have loved to get "EmptyIterableError" or other precise exceptions.
But maybe I'm worrying on details there :p
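[As an illustration of that ternary semantic, here is a sketch using NotImplemented as the "not given" marker; the function and values are invented:]

```python
def make_widget(style=NotImplemented):
    # Three-way semantics, with NotImplemented as the "not given" marker:
    #   omitted        -> build a fresh default style
    #   style=None     -> no styling at all
    #   style=<value>  -> use the caller's style
    if style is NotImplemented:
        style = {"colour": "blue"}   # fresh default, built per call
    if style is None:
        return "unstyled widget"
    return "widget with " + str(sorted(style))
```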

> Heh heh heh, he thinks beginners read manuals :-)

 ^^ I'm maybe the only one, but I've always found the quickest way to learn a language/library was to read the doc.
Wanna learn python ? Read the language reference, then the library reference. Wanna know what's the object model of PHP5 versus PHP4 ? Read the 50 pages chapter on that matter. Wanna play with Qt ? Read the class libraries first. :p
Good docs get read like novels, and it's fun to cross most of the gotchas and implementation limitations without having to be bitten by them first. People might have the feeling that they gain time by jumping straight to the practice; I've rather the feeling that they lose a hell of a lot of it that way.



Back to the "mutable default arguments" thingy :
I think I perfectly see the points in favor of having them as attributes of the function object, as it's the case currently.
It does make sense in many ways, even if less sense than "class attributes", for sure (the object model of Python is rock solid to me, whereas default arguments are on a thin border that few languages had the courage to explore - most of them forbidding non-literal constants).

But imo, it'd be worth having a simple and "consensual way" of obtaining dynamic evaluation of default arguments, without the "if arg is None:" pattern.
The decorators that people have submitted have sweet uses, but they either deepcopy arguments, use dynamic evaluation of code strings, or force you to lambda-wrap all arguments ; so they're not equivalent to what newbies would most of the time expect - a reevaluation of the python code they entered after '='.

So the best, imo, would really be a keyword or some other form that reproduces with an easy syntax the "lambda-wrapping" we had.

If adding keywords is too violent, what would you people think of some notation similar to what we already have in the "function arguments world", i.e. stars ?

def func(a, c = *[]):
    pass

Having 1, 2 or 3 stars in the "default argument" expression, wouldn't it be OK ? I guess they have no meaning there at the moment, so we could give them one : "keep that code as a lambda function and evaluate it at each function call".
Couldn't we ?
The risk would be confusion with the other "*" and "**", but in this case we might put 3 stars (yeah, that's much but...).

Any comment on this ?

Regards,
Pascal

PS : Has anyone read Dale Carnegie's books here ? That guy is a genius of social interactions, and I guess that if 1/3 of posters/mailers had read him, there would never be any flame anymore on forums and mailing lists.
Contrarily to what his titles might suggest, he doesn't promote hypocrisy or cowardice ; he simply points out lots of attitudes (often unconscious) that ruin discussions without helping the matter go forward in any way.
I'm myself used to letting scornful or aggressive sentences pass by ; but others don't like them, and I fully understand why.
So couldn't we smoothen the edges, in order to keep the discussion as it's supposed to be - a harmless sharing of pros and cons arguments, which endangers no one's life - instead of having it randomly turned into a confrontation of egos, pushing each other down as in an attempt not to drown ?
http://en.wikipedia.org/wiki/How_to_Win_Friends_and_Influence_People



Chris Rebert

unread,
May 11, 2009, 4:32:21 PM5/11/09
to Pascal Chambon, python...@python.org
On Mon, May 11, 2009 at 12:49 PM, Pascal Chambon
<chambon...@wanadoo.fr> wrote:
<snip>

> So the best, imo, would really be a keyword or some other form that
> reproduces with an easy syntax the "lambda-wrapping" we had.
>
> If adding keywords is too violent, what would you people think of some
> notation similar to what we already have in the "function arguments world ",
> i.e stars ?
>
> def func(a, c = *[]):
>     pass
>
> Having 1, 2 or 3 stars in the "default argument" expression, wouldn't it be
> OK ? I guess they have no meaning there at the moment, so we could give them
> one : "keep that code as a lamda functio nand evaluate it at each function
> call".
> Couldn't we ?
> The risk would be confusion with the other "*" and "**", but in this case we
> might put 3 stars (yeah, that's much but...).
>
> Any comment on this ?

Seems unnecessarily confusing and sufficiently unrelated to the
current use of stars in Python. -1 on this syntax.
I'd look for a different punctuation/keyword.

Cheers,
Chris
--
http://blog.rebertia.com

Stephen J. Turnbull

unread,
May 11, 2009, 11:36:09 PM5/11/09
to Pascal Chambon, python...@python.org
Pascal Chambon writes:

> So couldn't we smoothen the edges, in order to keep the discussion as it's
> supposed to be - a harmless sharing of pros and cons arguments, which
> endangers no one's life -

In discussions about Python development, misuse of the term "Pythonic"
to support one's personal preference is not harmless. It leads to
confusion of newbies, and ambiguity in a term that is already rather
precise, and becoming more so with every PEP (though it is hard to
express in a few words as a definition). The result is that the BDFL
may use that term at his pleasure, but the rest of us risk being
brought up short by somebody who knows better.

> instead of having it randomly turned into a confrontation of egos,

This was not a random event. It was triggered by, *and responded only
to*, the misuse of the word "Pythonic".

> pushing each other down as in an attempt not to drown ?

Hey, I haven't seen one claim that "this feature would look good in
Perl" yet. The gloves are still on.<wink>

Tennessee Leeuwenburg

unread,
May 12, 2009, 12:22:31 AM5/12/09
to Stephen J. Turnbull, python...@python.org
On Tue, May 12, 2009 at 1:36 PM, Stephen J. Turnbull <ste...@xemacs.org> wrote:
Pascal Chambon writes:

 > So couldn't we smoothen the edges, in order to keep the discussion as it's
 > supposed to be - a harmless sharing of pros and cons arguments, which
 > endangers no one's life -

In discussions about Python development, misuse of the term "Pythonic"
to support one's personal preference is not harmless.  It leads to
confusion of newbies, and ambiguity in a term that is already rather
precise, and becoming more so with every PEP (though it is hard to
express in a few words as a definition).  The result is that the BDFL
may use that term at his pleasure, but the rest of us risk being
brought up short by somebody who knows better.

 > instead of having it randomly turned into a  confrontation of egos,

This was not a random event.  It was triggered by, *and responded only
to*, the misuse of the word "Pythonic".


I guess it's never occurred to me, and I wouldn't have thought it would be immediately clear to everyone, that Pythonic simply means "Whatever BDFL thinks". I've always thought it meant "elegant and in keeping with the design philosophy of Python", and up for discussion and interpretation by everyone. I never thought that it would be used as a means of *preventing* discussion about what was or was not 'Pythonic'. *Obviously*, BDFL's opinions on the language are authoritative, but that doesn't make them beyond discussion.

This is the Python Ideas list, not the dev-list, and I was discussing my own interpretation, not trying to force anyone on anything. To recall a quote I heard once, "You are entitled to your own opinion, but not your own facts". I would have thought that expressing ones' opinion about what is or is not Pythonic is a wonderful thing to encourage. It's like encouraging people to discuss what elegant code looks like, or the merits of a piece of writing.

I thought I was very clear that I was talking about my interpretation of what was Pythonic, and clear that I was in no way trying to claim authority. I feel a bit like I've been targeted by the thought police, truth be told, although that overstates matters. I didn't think I was in any way saying "My way is absolutely more Pythonic, you should all think like me", but much more along the lines of, "Hey, I think my solution captures something elegant and Pythonic, surely that's worth talking about even if there are some practical considerations involved".  I just thought I'd be clear in saying "seems to me to be more Pythonic" rather than "is more Pythonic".

Where are people going to talk freely about their interpretation of what is and isn't Pythonic, if not the ideas list? I'm also subscribed to the python-dev list, and I've never attempted to force an opinion there. Isn't *this* list the right place to have conversations about these concepts? I don't think people should be pulled short for talking about Pythonicity, just for trying to impose their world-view. That's what rubs wrongly -- being told you're not even supposed to *talk* about something, or not be entitled to an opinion on something. I would have thought that getting involved in discussing the Zen of Python is something that should be a part of everyone's learning and growth, rather than something which is delivered like a dogma. That's not to say there isn't a right answer on many issues, but it has to be acceptable to discuss the issues and to hold personal opinions.

Cheers,
-T

Aahz

unread,
May 12, 2009, 12:42:56 AM5/12/09
to python...@python.org
On Tue, May 12, 2009, Tennessee Leeuwenburg wrote:
>
> I thought I was very clear that I was talking about my interpretation
> of what was Pythonic, and clear that I was in no way talking about
> trying to claim authority. I feel a bit like I've been targetted by
> the thought police, truth be told, although that overstates matters. I
> didn't think I was in any way saying "My way is absolutely more
> Pythonic, you should all think like me", but much more along the lines
> of, "Hey, I think my solution captures something elegant and Pythonic,
> surely that's worth talking about even if there are some practical
> considerations involved". I just thought I'd be clear in saying
> "seems to me to be more Pythonic" rather than "is more Pythonic".

That may have been your intent, but it sure isn't what I read in your
original post. I suggest you re-read it looking for what might get
interpreted as obstreperous banging on the table:

http://mail.python.org/pipermail/python-ideas/2009-May/004601.html

If you still don't see it, I'll discuss it with you (briefly!) off-list;
that kind of tone discussion is really off-topic for this list.

"It is easier to optimize correct code than to correct optimized code."
--Bill Harlan

Tennessee Leeuwenburg

unread,
May 12, 2009, 1:06:52 AM5/12/09
to Aahz, python...@python.org


On Tue, May 12, 2009 at 2:42 PM, Aahz <aa...@pythoncraft.com> wrote:
On Tue, May 12, 2009, Tennessee Leeuwenburg wrote:
>
> I thought I was very clear that I was talking about my interpretation
> of what was Pythonic, and clear that I was in no way talking about
> trying to claim authority. I feel a bit like I've been targetted by
> the thought police, truth be told, although that overstates matters. I
> didn't think I was in any way saying "My way is absolutely more
> Pythonic, you should all think like me", but much more along the lines
> of, "Hey, I think my solution captures something elegant and Pythonic,
> surely that's worth talking about even if there are some practical
> considerations involved".  I just thought I'd be clear in saying
> "seems to me to be more Pythonic" rather than "is more Pythonic".

That may have been your intent, but it sure isn't what I read in your
original post.  I suggest you re-read it looking for what might get
interpreted as obstreperous banging on the table:

http://mail.python.org/pipermail/python-ideas/2009-May/004601.html

If you still don't see it, I'll discuss it with you (briefly!) off-list;
that kind of tone discussion is really off-topic for this list.

Agreed. Anyone else who wants to chime in, feel free to email me off-list. Regardless of the rights and wrongs, I'll of course be extra-careful in future to be crystal clear about my meaning.

As far as I can tell, my original email is littered with terms like 'seems to me', 'in my opinion', 'personally', etc., which I would think would convey to anyone that I'm talking about a personal opinion and not trying to discredit anyone or anything. Not that it's about counting, but I count no fewer than seven occasions where I point out that I am advancing a personal opinion rather than making a universal statement. I'm not sure what else I should have done.

Cheers,
-T

Stephen J. Turnbull

unread,
May 12, 2009, 5:42:15 AM5/12/09
to Tennessee Leeuwenburg, python...@python.org
Tennessee Leeuwenburg writes:

> Where are people going to talk freely about their interpretation of
> what is and isn't Pythonic, if not the ideas list?

comp.lang.python seems more appropriate for free discussion.

Pascal Chambon

unread,
May 12, 2009, 3:56:28 PM5/12/09
to Stephen J. Turnbull, python...@python.org
Well, since adding new keywords or operators is a very sensitive matter, and the existing ones are rather exhausted, it won't be easy to propose a new syntax...

One last idea I might have : what about something like

    def myfunc(a, b, c = yield []):
        pass


I'm no expert in English, but I'd say the following "equivalents" of yield (according to WordWeb) are in a rather good semantic area:
*Be the cause or source of
*Give or supply
*Cause to happen or be responsible for
*Bring in

Of course the behaviour of this yield is not so close to the one we know, but there is no interpretation conflict for the parser, and we might quickly get used to it:
* yield in default argument => reevaluate the expression each time
* yield in function body => return value and prepare to receive one

How do you people feel about this ?
Regards,
Pascal

PS : Some months ago I heard someone compare the new high-level languages to new religions - with the appearance of notions like "py-evangelists" and such - whereas (in his opinion) it had never been so for older languages :p
That's imo somehow true (and rather a good sign for those languages) ; I feel phrases like "pythonic" or Perl's "TIMTOWTDI" have gained some kind of sacred aura ^^

Stephen J. Turnbull

unread,
May 13, 2009, 3:34:02 AM5/13/09
to Pascal Chambon, python...@python.org
Pascal Chambon writes:

> One last idea I might have : what about something like
>

> def myfunc(a, b, c = yield []):
>     pass

As syntax, I'd be -0.5. But my real problem is that the whole concept
seems to violate several of the "Zen" tenets. Besides those that
Steven D'Aprano mentioned, I would add two more. First is "explicit
is better than implicit" at the function call. True, for the leading
cases of "[]" and "{}", "def foo(bar=[])" is an easy mistake for
novice Python programmers to make with current semantics. And I agree
that it is very natural to initialize an unspecified list or
dictionary with the empty object. But those also have a convenient
literal syntax that makes it clear that the object is constructed at
function call time: "myfunc([])" and "myfunc({})". Since that syntax
is always available even with the proposed new semantics, I feel this
proposal also violates "there's one -- and preferably only one --
obvious way to do it".
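As a refresher for readers joining the thread, the behaviour everyone is debating fits in a few lines. This is a minimal sketch (the name append_to is mine, not from the thread): the default "[]" is evaluated once, when the def statement executes, and that single list is shared by every call that omits the argument.

```python
def append_to(item, target=[]):
    # "target=[]" is evaluated once, at function definition time;
    # every call that omits "target" reuses that same list object.
    target.append(item)
    return target

first = append_to(1)
second = append_to(2)
assert second == [1, 2]   # the "fresh" default already contained 1
assert first is second    # both calls returned the very same object
```

Passing a list explicitly, as in append_to(4, []), sidesteps the sharing entirely, which is exactly the "myfunc([])" point made above.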

I understand the convenience argument, but that is frequently rejected
on Python-Dev with "your editor can do that for you; if it doesn't,
that's not a problem with Python, it's a problem with your editor." I
also see the elegance and coherence of Tennessee's proposal to
*always* dynamically evaluate, but I don't like it. Given that always
evaluating dynamically is likely to have performance impact that is as
surprising as the behavior of "def foo(bar=[])", I find it easy to
reject that proposal on the grounds of "although practicality beats
purity".

I may be missing something, but it seems to me that the proponents of
this change have yet to propose any concrete argument that it is more
Pythonic than the current behavior of evaluating default expressions
once, at function definition time, and expressing dynamic evaluation
by explicitly invoking the expression conditionally in the function's
suite, or as an argument.

Regarding the syntax, the recommended format for argument defaults is

def myfunc(a, b, c=None):
    pass

Using a keyword (instead of an operator) to denote dynamically
evaluated defaults gives:

def myfunc(a, b, c=yield []):
    pass

which looks like a typo to me. I feel there should be a comma after
"yield", or an index in the brackets. (Obviously, there can't be a
comma because "[]" can't be a formal parameter, and yield is a keyword
so it can't be the name of a sequence or mapping. So the syntax
probably 'works' in terms of the parser. I'm describing my esthetic
feeling, not a technical objection.)

As for "yield" itself, I would likely be confused as to when the
evaluation takes place. "Yield" strikes me as an imperative, "give me
the value *now*", i.e., at function definition time.

Jacob Holm

unread,
May 13, 2009, 5:00:41 AM5/13/09
to Pascal Chambon, python...@python.org
Pascal Chambon wrote:
> One last idea I might have : what about something like
>
> def myfunc(a, b, c = yield []):
>     pass
>

> [...], but there is no interpretation conflict for the parser, and we
> might quickly get used to it


I am surprised that there is no conflict, but it looks like you are
technically right. The parentheses around the yield expression are
required in the following (valid) code:


>>> def gen():
...     def func(arg=(yield 'starting')):
...         return arg
...     yield func
...
>>> g = gen()
>>> g.next()
'starting'
>>> f = g.send(42)
>>> f()
42


I would hate to see the meaning of the above change depending on whether
the parentheses around the yield expression were there or not, so -1 on
using "yield" for this.

I'm +0 on the general idea of adding a keyword for delayed evaluation of
default argument expressions.

- Jacob

spir

unread,
May 13, 2009, 6:25:23 AM5/13/09
to python...@python.org
On Wed, 13 May 2009 16:34:02 +0900,
"Stephen J. Turnbull" <ste...@xemacs.org> wrote:

> But those also have a convenient
> literal syntax that makes it clear that the object is constructed at
> function call time: "myfunc([])" and "myfunc({})".

I totally agree with you here. And as you say, "explicit is better than implicit"...

Then the argument must not be 'defaulted' at all in the function def. This is a semantic change, and a pity when the argument has an "obvious & natural" default value.
While we're at removing default args and requiring them to be explicit instead, why have default args at all in Python?
Especially, why should immutable defaults be OK while mutable ones are wrong and must be replaced with an explicit value at call time? This distinction makes no sense conceptually: the user just wants a default; it is rather a Python implementation detail that raises the issue.

I would rather require the contrary, namely that static vars should not be mixed up with default args. That is not only confusing in code, but also a design flaw. Here's how I see the various possibilities (with some amount of exaggeration ;-):

def f(arg, lst=[]):
    # !!! lst is no default arg,      !!!
    # !!! it's a static var instead   !!!
    # !!! that will be updated in code !!!
    <do things>

def f(arg):
    <do things>
f.lst = []  # init static var

or

def f(arg):
    # init static var
    try:
        assert f.lst
    except AttributeError:
        f.lst = []
    <do things>

Denis
------
la vita e estrany

spir

unread,
May 13, 2009, 6:38:09 AM5/13/09
to python...@python.org
On Wed, 13 May 2009 16:34:02 +0900,
"Stephen J. Turnbull" <ste...@xemacs.org> wrote:

> I also see the elegance and coherence of Tennessee's proposal to
> *always* dynamically evaluate, but I don't like it. Given that always
> evaluating dynamically is likely to have performance impact that is as
> surprising as the behavior of "def foo(bar=[])", I find it easy to
> reject that proposal on the grounds of "although practicality beats
> purity".

I do not understand why defaults should be evaluated dynamically (at runtime, I mean) in the proposal.

This can happen, I guess, only for defaults that depend on the non-local scope:

def f(arg, x=a):
    <body>

If we want this to change at runtime, then we'd better be very clear and explicit:

def f(arg, x=UNDEF):
    # case x is not provided, use current value of a
    if x is UNDEF:
        x = a
    <body>

(Also, if x is not intended as a user defined argument, then there is no need for a default at all. We simply have a non-referentially transparent func.)
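The UNDEF trick sketched above is also the usual answer when None itself is a legitimate argument value. A runnable sketch, assuming a fresh list as the real default (the sentinel name _UNDEF follows spir's snippet; the rest is my illustration):

```python
_UNDEF = object()   # a private object no caller can produce by accident

def f(arg, x=_UNDEF):
    # Build the real default at call time, so each call that omits x
    # gets a fresh list, while an explicit None passes through intact.
    if x is _UNDEF:
        x = []
    return (arg, x)

assert f(1) == (1, [])            # fresh default
assert f(2, None) == (2, None)    # None survives, unlike with "x=None"
assert f(1)[1] is not f(1)[1]     # a new list on every call
```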

Denis
------
la vita e estrany

Stephen J. Turnbull

unread,
May 13, 2009, 8:01:43 AM5/13/09
to spir, python...@python.org
spir writes:

> Especially, why should immutable defaults be OK, and mutable ones
> be wrong and replaced with explicit value at call time?

"Although practicality beats purity." Consider a wrapper for
something like Berkeley DB which has a plethora of options for
creating external objects, but which most users are not going to care
about. It makes sense for those options to be optional. Otherwise
people will be writing

def create_db(path):
    """Create a DB with all default options at `path`."""
    create_db_with_a_plethora_of_options(path,a,b,c,d,e,f,g,h,i,j,k,l,m)

def create_db_with_nondefault_indexing(path,indexer):
    """Create a DB with `indexer`, otherwise default options, at `path`."""
    create_db_with_a_plethora_of_options(path,indexer,b,c,d,e,f,g,h,i,j,k,l,m)

etc, etc, ad nauseam. No, thank you!

> I would rather require the contrary, namely that static vars should
> not be messed up with default args. This is not only confusion in
> code, but also design flaw. here's how I see it various
> possibilities (with some amount of exageration ;-):
>
> def f(arg, lst=[]):
>     # !!! lst is no default arg,      !!!
>     # !!! it's a static var instead   !!!
>     # !!! that will be updated in code !!!
>     <do things>
>
> def f(arg):
>     <do things>
> f.lst = []  # init static var

Don't generators do the right thing here?

def f(arg):
    lst = []
    while True:
        <do things>
        yield None
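Filling in the <do things> placeholder makes the point concrete. A small runnable sketch (the accumulator example is mine, not Stephen's) in which the generator's local lst plays the static-variable role without touching default arguments:

```python
def accumulator():
    # lst persists across yields, so it behaves like a static
    # variable scoped to this one generator instance.
    lst = []
    while True:
        item = yield lst
        lst.append(item)

acc = accumulator()
next(acc)                    # prime the generator up to the first yield
assert acc.send(1) == [1]
assert acc.send(2) == [1, 2]
```

Each call to accumulator() makes an independent generator with its own lst, which is exactly what the shared-mutable-default does not give you.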

MRAB

unread,
May 13, 2009, 10:28:48 AM5/13/09
to python...@python.org
There's the suggestion that Carl Johnson gave:

def myfunc(a, b, c else []):
    pass

or there's:

def myfunc(a, b, c def []):
    pass

where 'def' stands for 'default' (or "defaults to").
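Neither spelling exists in the language, but the call-time behaviour both of them describe can be emulated today with a decorator. A rough sketch - the decorator name and the keyword-only handling are my own illustration, not part of MRAB's proposal:

```python
import functools

def call_time_defaults(**factories):
    # For each named argument the caller omitted, call its
    # zero-argument factory to build a fresh default on every call.
    def decorate(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for name, factory in factories.items():
                if name not in kwargs:
                    kwargs[name] = factory()
            return func(*args, **kwargs)
        return wrapper
    return decorate

@call_time_defaults(c=list)
def myfunc(a, b, c=None):
    c.append((a, b))
    return c

assert myfunc(1, 2) == [(1, 2)]
assert myfunc(3, 4) == [(3, 4)]   # a fresh list each call, no leftovers
```

In this sketch the managed arguments must be passed by keyword; handling positional overrides would need inspection of the function signature.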

Jeremy Banks

unread,
May 13, 2009, 10:52:57 AM5/13/09
to MRAB, python...@python.org
To someone who's a novice at this: could someone explain to me why it
has to be an existing keyword at all? Since identifiers are not valid
in that context anyway, why couldn't it be a new keyword that can
still be used as an identifier in valid contexts? For example (not
that I advocate this choice of keyword at all):

def foo(bar reinitialize_default []): # <-- it's a keyword here
    reinitialize_default = "It's an identifier here!"

That would be a syntax error now and if it were defined as a keyword
only in that context it wouldn't introduce backwards compatibility
problems and wouldn't force us to reuse an existing keyword in a
context that may be a bit of a stretch.

Is there a reason that this wouldn't be a viable approach?

George Sakkis

unread,
May 13, 2009, 11:29:44 AM5/13/09
to Jeremy Banks, python-ideas
On Wed, May 13, 2009 at 10:52 AM, Jeremy Banks <jer...@jeremybanks.ca> wrote:

> To someone who's a novice to this, could someone explain to me why it
> has to be an existing keyword at all? Since not identifiers are valid
> in that context anyway, why couldn't it be a new keyword that can
> still be used as an identifier in valid contexts? For example (not
> that I advocate this choice of keyword at all):
>
>    def foo(bar reinitialize_default []): # <-- it's a keyword here
>        reinitialize_default = "It's an identifier here!"
>
> That would be a syntax error now and if it were defined as a keyword
> only in that context it wouldn't introduce backwards compatibility
> problems and wouldn't force us to reuse an existing keyword in a
> context that may be a bit of a stretch.
>
> Is there a reason that this wouldn't be a viable approach?

Traditionally, keywords are recognized at the lexer level, which then
passes tokens to the parser. Lexers are pretty simple (typically
constants and regular expressions) and don't take the context into
account. In principle what you're saying could work, but given the
significant reworking of the lexer/parser it would require, it's quite
unlikely to happen, for better or for worse.

George