On 08/13/2014 12:44 PM, Guido van Rossum wrote:
+0 on the proposal as a whole. It is not something I'm likely to use, but I'm not opposed to it, so long as it stays optional.
> Nevertheless, it would be good to deprecate such alternative uses of annotations.

-1 on deprecating alternative uses of annotations.
I'm strongly opposed to this, for a few reasons.
First, I think that standardizing on a syntax without a semantics is
incredibly confusing, and I can't imagine how having *multiple* competing
implementations would be a boon for anyone.
This proposal seems to be built around the idea that we should have a syntax,
and then people can write third party tools, but Python itself won't really do
anything with them.
Fundamentally, this seems like a very confusing approach. How we write a type,
and what we do with that information are fundamentally connected. Can I cast a
``List[str]`` to a ``List[object]`` in any way? If yes, what happens when I go
to put an ``int`` in it? There's no runtime checking, so the type system is
unsound; on the other hand, disallowing this rejects many programs that would
in fact run correctly.
Both solutions have merit, but the idea of some implementations of the type
checker having covariance and some contravariance is fairly disturbing.
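To make the variance worry concrete, here is a small sketch (mine, not from the thread) of why letting a ``List[str]`` be treated as a ``List[object]`` without runtime checks is unsound:

```python
def append_int(objects):
    # Suppose a checker accepted this parameter as List[object]; appending
    # an int is then perfectly legal from its point of view.
    objects.append(42)

names = ["spam", "eggs"]   # conceptually a List[str]
append_int(names)          # fine if List[str] converts to List[object]...
# ...but now a supposed List[str] contains an int, and nothing complained.
assert names == ["spam", "eggs", 42]
```

A checker that forbids the conversion avoids this; a checker that allows it doesn't. That is exactly the covariance choice that two independent implementations could make differently.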
Another concern I have is that analysis based on these types is making some
pretty strong assumptions about static-ness of Python programs that aren't
valid. While existing checkers like ``flake8`` also do this, their assumptions
are basically constrained to the symbol table, while this is far deeper. For
example, can I annotate something as ``six.text_type``? What about
``django.db.models.sql.Query`` (keep in mind that this class is redefined based
on what database you're using (not actually true, but it used to be))?
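A hedged illustration of why this is tricky (the class names here are invented for the example): an annotation is evaluated once, at definition time, and simply stores whatever object the name was bound to then, so a class that a framework rebinds later leaves stale annotations behind.

```python
class Query:
    """Stand-in for a class that a framework might redefine at import time."""
    pass

def run(q: Query) -> None:
    pass

OldQuery = Query

class Query:  # the framework rebinds the name, e.g. for another backend
    pass

# run()'s annotation still points at the object captured at definition time.
assert run.__annotations__['q'] is OldQuery
assert run.__annotations__['q'] is not Query
```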
Python's type system isn't very good. It lacks many features of more powerful
systems such as algebraic data types, interfaces, and parametric polymorphism.
Despite this, it works pretty well because of Python's dynamic typing. I
strongly believe that attempting to enforce the existing type system would be a
real shame.
On 08/13/2014 01:19 PM, Guido van Rossum wrote:
My script argument parser [1] uses annotations to figure out how to parse the cli parameters and cast them to appropriate values (copied the idea from one of Michele Simionato's projects... plac [2], I believe).
On Wed, Aug 13, 2014 at 12:59 PM, Ethan Furman wrote:
-1 on deprecating alternative uses of annotations.
Do you have a favorite alternative annotation use that you actually use (or are likely to)?
I could store the info in some other structure besides 'annotations', but it's there and it fits the bill conceptually. Amusingly, it's a form of type info, but instead of saying what it has to already be, says what it will become.
[1] https://pypi.python.org/pypi/scription (due for an overhaul now that I've used it for a while ;)
[2] https://pypi.python.org/pypi/plac/0.9.1
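The idea behind this kind of annotation use is easy to sketch. The following is a hypothetical illustration only (``parse_cli`` and ``copies`` are invented here; this is not scription's or plac's actual API): the annotations double as converters for the command-line strings.

```python
def parse_cli(func, argv):
    # Cast each positional argument using the function's annotations,
    # defaulting to str when a parameter is unannotated.
    code = func.__code__
    params = code.co_varnames[:code.co_argcount]
    hints = func.__annotations__
    args = [hints.get(name, str)(raw) for name, raw in zip(params, argv)]
    return func(*args)

def copies(filename: str, count: int):
    return [filename] * count

result = parse_cli(copies, ["data.txt", "3"])
assert result == ["data.txt", "data.txt", "data.txt"]
```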
I agree with Alex that leaving the actual semantics of what these things mean up to a third party, which can possibly be swapped out by individual end users, is terribly confusing. I don't think I agree, though, that this is a bad idea in general; I think that we should just add it for real and skip the indirection.
IOW I'm not sure I see the benefit of defining the syntax but not the semantics when it seems this is already completely possible given the fact that mypy exists. The only real benefits I can see from doing it are that the stdlib can use it, and the ``import typing`` aspect. I don't believe that the stdlib benefits are great enough to get the possible confusion of multiple different implementations, and I think that the typing import could easily be provided as a project on PyPI that people can depend on if they want to use this in their code.

So my vote would be to add mypy semantics to the language itself.
2014-08-14, 0:19, Guido van Rossum <gu...@python.org> wrote:
+1. I'm a developer of the code analysis engine of PyCharm. I have discussed this idea with Jukka Lehtosalo and recently with Dave Halter, the author of Jedi code completion library. Standardized type annotations would be very useful for code analysis tools and IDEs such as PyCharm, Jedi and pylint. Type annotations would be especially great for third-party libraries. The idea is that most Python programmers don't have to write annotations in order to benefit from them. Annotated libraries are often enough for good code analysis.
> Yesterday afternoon I had an inspiring conversation with Bob Ippolito (man of many trades, author of simplejson) and Jukka Lehtosalo (author of mypy: http://mypy-lang.org/). Bob gave a talk at EuroPython about what Python can learn from Haskell (and other languages); yesterday he gave the same talk at Dropbox. The talk is online (https://ep2014.europython.eu/en/schedule/sessions/121/) and in broad strokes comes down to three suggestions:
>
> (a) Python should adopt mypy's syntax for function annotations
We (PyCharm) and Jukka have made some initial steps in this direction, including thoughts on semantics of annotations (https://github.com/pytypes/pytypes). Feedback is welcome.
Here are the slides from my talk about optional typing in Python, which show how Mypy types can be used in both static and dynamic type checking (http://blog.pirx.ru/media/files/2013/python-optional-typing/); the Mypy-related part starts from slide 14.
We are interested in getting type annotations standardized and we would like to help develop and test type annotation proposals.
--
Andrey Vlasovskikh
Web: http://pirx.ru/
On Aug 13, 2014 9:45 PM, "Guido van Rossum" <gu...@python.org> wrote:
> (1) A change of direction for function annotations
>
> PEP 3107, which introduced function annotations, is intentionally non-committal about how function annotations should be used. It lists a number of use cases, including but not limited to type checking. It also mentions some rejected proposals that would have standardized either a syntax for indicating types or a way for multiple frameworks to attach different annotations to the same function. AFAIK in practice there is little use of function annotations in mainstream code, and I propose a conscious change of course here by stating that annotations should be used to indicate types and to propose a standard notation for them.
>
> (We may have to have some backwards compatibility provision to avoid breaking code that currently uses annotations for some other purpose. Fortunately the only issue, at least initially, will be that when running mypy to type check such code it will produce complaints about the annotations; it will not affect how such code is executed by the Python interpreter. Nevertheless, it would be good to deprecate such alternative uses of annotations.)
I watched the original talk and read your proposal. I think type annotations could be very, very useful in certain contexts.
However, I still don't get this bit. Why would allowing type annotations automatically imply that no other annotations would be possible? Couldn't we formalize what would be considered a type annotation while still allowing annotations that don't fit this criterion to be used for other things?
On 13.08.2014 21:44, Guido van Rossum wrote:
> Yesterday afternoon I had an inspiring conversation with Bob Ippolito
> (man of many trades, author of simplejson) and Jukka Lehtosalo (author
> of mypy: http://mypy-lang.org/). Bob gave a talk at EuroPython about
> what Python can learn from Haskell (and other languages); yesterday he
> gave the same talk at Dropbox. The talk is online
> (https://ep2014.europython.eu/en/schedule/sessions/121/) and in broad
> strokes comes down to three suggestions:
>
> (a) Python should adopt mypy's syntax for function annotations
> (b) Python's use of mutable containers by default is wrong
> (c) Python should adopt some kind of Abstract Data Types

I was at Bob's talk during EP14 and really liked the idea. A couple of
colleagues and other attendees also said it's a good and useful
proposal. I also like your proposal to standardize the type annotations
first without a full integration of mypy.
In general I'm +1, but I'd like to discuss two aspects:
1) I'm not keen on the naming of mypy's typing classes. The visual
distinction between e.g. dict() and Dict() is too small and IMHO
confusing for newcomers. How about an additional 'T' prefix to make
clear that the objects are referring to typing objects?
from typing import TList, TDict

def word_count(input: TList[str]) -> TDict[str, int]:
    ...
2) PEP 3107 only specifies arguments and return values but not
exceptions that can be raised by a function. Java has the "throws"
syntax to list possible exceptions:
public void readFile() throws IOException {}
May I suggest that we also standardize a way to annotate the exceptions
that can be raised by a function? It's a very useful and commonly
requested piece of information on the Python user mailing list. It
doesn't have to be a new syntax element; a decorator in the typing
module would suffice. For example:
from typing import TList, TDict, raises

@raises(RuntimeError, (ValueError, "is raised when input is empty"))
def word_count(input: TList[str]) -> TDict[str, int]:
    ...
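Such a ``raises`` decorator needs no interpreter support at all; here is a minimal sketch of the idea (hypothetical, not part of any existing typing module) that just records the declaration for tools to inspect:

```python
def raises(*exceptions):
    """Attach declared exceptions to the function for tools to inspect."""
    def decorator(func):
        func.__raises__ = exceptions
        return func
    return decorator

@raises(RuntimeError, (ValueError, "is raised when input is empty"))
def word_count(input):
    if not input:
        raise ValueError("input is empty")
    result = {}
    for line in input:
        for word in line.split():
            result[word] = result.get(word, 0) + 1
    return result

# The declaration is now ordinary, introspectable data:
assert word_count.__raises__[0] is RuntimeError
assert word_count(["a b a"]) == {"a": 2, "b": 1}
```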
Regards,
Christian
_______________________________________________
Python-ideas mailing list
Python...@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/
I'm fine to have a discussion on things like covariance vs. contravariance, or what forms of duck typing are acceptable, etc.
The type checking algorithm might evolve over time, but by including typing.py in the stdlib, the syntax for annotations would be almost frozen, and that will be a limitation. In other projects such as TypeScript (http://www.typescriptlang.org/), the syntax usually evolves alongside the algorithms.
Is the syntax specified in typing.py mature enough to put it in the stdlib and expect users to start annotating their projects without worrying too much about future changes?
Is there enough feedback from users using mypy in their projects?

I think that rushing typing.py into 3.5 is not a good idea. However, it'd be nice to add some notes in PEP 8, encourage its use as an external library, and let some projects and tools (e.g. PyCharm) use it. It's not that bad if mypy lives 100% outside the Python distribution for a while, just as TypeScript does for JavaScript.
After getting some user base, part of it (typing.py) could be moved to the stdlib.
> 1) I'm not keen with the naming of mypy's typing classes. The visual
> distinction between e.g. dict() and Dict() is too small and IMHO
> confusing for newcomers. How about an additional 'T' prefix to make
> clear that the objects are referring to typing objects?
To this reader, ‘dict’ and ‘list’ *are* “typing objects” — they are
objects that are types. Seeing code that referred to something else as
“typing objects” would be an invitation to confusion, IMO.
You could argue “that's because you don't know the special meaning of
“typing object” being discussed here”. To which my response would be,
for a proposal to add something else as meaningful Python syntax, the
jargon is poorly chosen and needlessly confusing with established terms
in Python.
If there's going to be a distinction between the types (‘dict’, ‘list’,
etc.) and something else, I'd prefer it to be based on a clearer
terminology distinction.
--
\ “Simplicity and elegance are unpopular because they require |
`\ hard work and discipline to achieve and education to be |
_o__) appreciated.” —Edsger W. Dijkstra |
Ben Finney
On Aug 13, 2014, at 6:05 PM, Guido van Rossum <gu...@python.org> wrote:
> On Wed, Aug 13, 2014 at 1:53 PM, Donald Stufft <don...@stufft.io> wrote:
>> So my vote would be to add mypy semantics to the language itself.
>
> What exactly would that mean? I don't think the Python interpreter should reject programs that fail the type check -- in fact, separating the type check from run time is the most crucial point of my proposal.
I don't know exactly :) Some ideas:

1) Raise a warning when the type check fails, but allow it to happen. This would have the benefit of possibly catching bugs, but it's still opt-in in the sense that you have to write the annotations for anything to happen. This would also enable people to turn on enforced type checking by raising the warning level to an exception.

Even if this was off by default it would make it easy to enable during test runs and also enable easier/better quickcheck-like functionality.

2) Simply add a flag to the interpreter that turns on type checking.

3) Add a stdlib module that would run the program under type checking, like ``python -m typing myprog`` instead of ``python -m myprog``.

Really I think a lot of the benefit is likely to come in the form of linting and during test runs. However if I have to run a separate Python interpreter to actually do the run then I risk getting bad results through varying things like interpreter differences, language level differences, etc.

Although I wouldn't complain if it meant that Python had actual type checking at run time if a function had type annotations :)
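Option (1) can already be sketched as a decorator today. This is a rough illustration of the idea (invented here, handling only plain classes as annotations), not a proposed implementation:

```python
import functools
import warnings

def checked(func):
    """Warn, rather than fail, when a call violates the annotations."""
    hints = func.__annotations__
    names = func.__code__.co_varnames[:func.__code__.co_argcount]

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        for name, value in list(zip(names, args)) + list(kwargs.items()):
            expected = hints.get(name)
            if isinstance(expected, type) and not isinstance(value, expected):
                warnings.warn("%s=%r is not a %s" % (name, value, expected.__name__))
        return func(*args, **kwargs)
    return wrapper

@checked
def double(x: int) -> int:
    return x * 2

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = double("ab")   # still runs, but a warning is recorded

assert result == "abab"
assert len(caught) == 1
assert double(2) == 4       # well-typed calls warn about nothing
```

Raising the warning level to an error (``-W error``) would then turn this into Donald's "enforced" mode.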
> I'm fine to have a discussion on things like covariance vs. contravariance, or what forms of duck typing are acceptable, etc.

I'm not particularly knowledgeable about the actual workings of a type system and covariance vs contravariance and the like. My main concern there is having a single reality. The meaning of something shouldn't change because I used a different interpreter/linter/whatever. Beyond that I don't know enough to have an opinion on the actual semantics.
Still, different linters exist and I don't hear people complain about that. I would also be okay if PyCharm's interpretation of the finer points of the type checking syntax was subtly different from mypy's. In fact I would be surprised if they weren't sometimes in disagreement. Heck, PyPy doesn't give *every* Python program the same meaning as CPython, and that's a feature. :-)
[There is no TL;DR other than the subject line. Please read the whole thing before replying. I do have an appendix with some motivations for adding type annotations at the end.]
Yesterday afternoon I had an inspiring conversation with Bob Ippolito (man of many trades, author of simplejson) and Jukka Lehtosalo (author of mypy: http://mypy-lang.org/). Bob gave a talk at EuroPython about what Python can learn from Haskell (and other languages); yesterday he gave the same talk at Dropbox. The talk is online (https://ep2014.europython.eu/en/schedule/sessions/121/) and in broad strokes comes down to three suggestions:
(a) Python should adopt mypy's syntax for function annotations
(b) Python's use of mutable containers by default is wrong
(c) Python should adopt some kind of Abstract Data Types
Proposals (b) and (c) don't feel particularly actionable (if you disagree please start a new thread, I'd be happy to discuss these further if there's interest) but proposal (a) feels right to me.
So what is mypy? It is a static type checker for Python written by Jukka for his Ph.D. thesis. The basic idea is that you add type annotations to your program using some custom syntax, and when running your program using the mypy interpreter, type errors will be found during compilation (i.e., before the program starts running).
The clever thing here is that the custom syntax is actually valid Python 3, using (mostly) function annotations: your annotated program will still run with the regular Python 3 interpreter. In the latter case there will be no type checking, and no runtime overhead, except to evaluate the function annotations (which are evaluated at function definition time but don't have any effect when the function is called).
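The "evaluated at function definition time" behavior can be seen directly (a small illustration of mine, not from the thread):

```python
log = []

def note(tp):
    # Record each annotation expression as it is evaluated.
    log.append(tp.__name__)
    return tp

def greet(name: note(str)) -> note(str):
    return "hello, " + name

# Both annotation expressions ran exactly once, when the def executed...
assert log == ["str", "str"]

greet("world")
greet("again")
# ...and calling the function does not evaluate them again.
assert log == ["str", "str"]
```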
In fact, it is probably more useful to think of mypy as a heavy-duty linter than as a compiler or interpreter; leave the type checking to mypy, and the execution to Python. It is easy to integrate mypy into a continuous integration setup, for example.
To read up on mypy's annotation syntax, please see the mypy-lang.org website. Here's just one complete example, to give a flavor:
from typing import List, Dict

def word_count(input: List[str]) -> Dict[str, int]:
    result = {}  #type: Dict[str, int]
    for line in input:
        for word in line.split():
            result[word] = result.get(word, 0) + 1
    return result
Note that the #type: comment is part of the mypy syntax; mypy uses comments to declare types in situations where no syntax is available -- although this particular line could also be written as follows:
result = Dict[str, int]()
Either way the entire function is syntactically valid Python 3, and a suitable implementation of typing.py (containing class definitions for List and Dict, for example) can be written to make the program run correctly. One is provided as part of the mypy project.
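To illustrate what such a typing.py has to do at run time, here is a deliberately minimal sketch (mine, and nothing like mypy's real implementation): it only needs to make ``List[str]`` and ``Dict[str, int]`` valid, evaluable expressions.

```python
class _GenericAlias:
    """Make List[str] and Dict[str, int] legal expressions at run time."""
    def __init__(self, origin, params=()):
        self.origin = origin
        self.params = params

    def __getitem__(self, params):
        if not isinstance(params, tuple):
            params = (params,)
        return _GenericAlias(self.origin, params)

    def __call__(self, *args, **kwargs):
        # Dict[str, int]() just builds an ordinary dict at run time.
        return self.origin(*args, **kwargs)

List = _GenericAlias(list)
Dict = _GenericAlias(dict)

def word_count(input: List[str]) -> Dict[str, int]:
    result = Dict[str, int]()
    for line in input:
        for word in line.split():
            result[word] = result.get(word, 0) + 1
    return result

assert Dict[str, int]() == {}
assert word_count(["a b a"]) == {"a": 2, "b": 1}
```

The static checker is the component that actually interprets the parameters; at run time they are just stored.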
I should add that many of mypy's syntactic choices aren't actually new. The basis of many of its ideas goes back at least a decade: I blogged about this topic in 2004 (http://www.artima.com/weblogs/viewpost.jsp?thread=85551 -- see also the two followup posts linked from the top there).
I'll emphasize once more that mypy's type checking happens in a separate pass: no type checking happens at run time (other than what the interpreter already does, like raising TypeError on expressions like 1+"1").
There's a lot to this proposal, but I think it's possible to get a PEP written, accepted and implemented in time for Python 3.5, if people are supportive. I'll go briefly over some of the action items.
(1) A change of direction for function annotations
PEP 3107, which introduced function annotations, is intentionally non-committal about how function annotations should be used. It lists a number of use cases, including but not limited to type checking. It also mentions some rejected proposals that would have standardized either a syntax for indicating types or a way for multiple frameworks to attach different annotations to the same function. AFAIK in practice there is little use of function annotations in mainstream code, and I propose a conscious change of course here by stating that annotations should be used to indicate types and to propose a standard notation for them.
(We may have to have some backwards compatibility provision to avoid breaking code that currently uses annotations for some other purpose. Fortunately the only issue, at least initially, will be that when running mypy to type check such code it will produce complaints about the annotations; it will not affect how such code is executed by the Python interpreter. Nevertheless, it would be good to deprecate such alternative uses of annotations.)
(2) A specification for what to add to Python 3.5
There needs to be at least a rough consensus on the syntax for annotations, and the syntax must cover a large enough set of use cases to be useful. Mypy is still under development, and some of its features are still evolving (e.g. unions were only added a few weeks ago). It would be possible to argue endlessly about details of the notation, e.g. whether to use 'list' or 'List', what either of those means (is a duck-typed list-like type acceptable?) or how to declare and use type variables, and what to do with functions that have no annotations at all (mypy currently skips those completely).
I am proposing that we adopt whatever mypy uses here, keeping discussion of the details (mostly) out of the PEP. The goal is to make it possible to add type checking annotations to 3rd party modules (and even to the stdlib) while allowing unaltered execution of the program by the (unmodified) Python 3.5 interpreter. The actual type checker will not be integrated with the Python interpreter, and it will not be checked into the CPython repository. The only thing that needs to be added to the stdlib is a copy of mypy's typing.py module. This module defines several dozen new classes (and a few decorators and other helpers) that can be used in expressing argument types. If you want to type-check your code you have to download and install mypy and run it separately.
The curious thing here is that while standardizing a syntax for type annotations, we technically still won't be adopting standard rules for type checking. This is intentional. First of all, fully specifying all the type checking rules would make for a really long and boring PEP (a much better specification would probably be the mypy source code). Second, I think it's fine if the type checking algorithm evolves over time, or if variations emerge. The worst that can happen is that you consider your code correct but mypy disagrees; your code will still run.
That said, I don't want to completely leave out any specification. I want the contents of the typing.py module to be specified in the PEP, so that it can be used with confidence. But whether mypy will complain about your particular form of duck typing doesn't have to be specified by the PEP. Perhaps as mypy evolves it will take options to tell it how to handle certain edge cases. Forks of mypy (or entirely different implementations of type checking based on the same annotation syntax) are also a possibility. Maybe in the distant future a version of Python will take a different stance, once we have more experience with how this works out in practice, but for Python 3.5 I want to restrict the scope of the upheaval.
Appendix -- Why Add Type Annotations?

The argument between proponents of static typing and dynamic typing has been going on for many decades. Neither side is all wrong or all right. Python has traditionally fallen in the camp of extremely dynamic typing, and this has worked well for most users, but there are definitely some areas where adding type annotations would help.
- Editors (IDEs) can benefit from type annotations; they can call out obvious mistakes (like misspelled method names or inapplicable operations) and suggest possible method names. Anyone who has used IntelliJ or Xcode will recognize how powerful these features are, and type annotations will make such features more useful when editing Python source code.
- Linters are an important tool for teams developing software. A linter doesn't replace a unittest, but can find certain types of errors better or quicker. The kind of type checking offered by mypy works much like a linter, and has similar benefits; but it can find problems that are beyond the capabilities of most linters.
- Type annotations are useful for the human reader as well! Take the above word_count() example. How long would it have taken you to figure out the types of the argument and return value without annotations? Currently most people put the types in their docstrings; developing a standard notation for type annotations will reduce the amount of documentation that needs to be written, and running the type checker might find bugs in the documentation, too. Once a standard type annotation syntax is introduced, it should be simple to add support for this notation to documentation generators like Sphinx.
- Refactoring. Bob's talk has a convincing example of how type annotations help in (manually) refactoring code. I also expect that certain automatic refactorings will benefit from type annotations -- imagine a tool like 2to3 (but used for some other transformation) augmented by type annotations, so it will know whether e.g. x.keys() is referring to the keys of a dictionary or not.
- Optimizers. I believe this is actually the least important application, certainly initially. Optimizers like PyPy or Pyston wouldn't be able to fully trust the type annotations, and they are better off using their current strategy of optimizing code based on the types actually observed at run time. But it's certainly feasible to imagine a future optimizer also taking type annotations into account.
--
--Guido "I need a new hobby" van Rossum (python.org/~guido)
>I'm strongly opposed to this, for a few reasons.
[...]
>Python's type system isn't very good. It lacks many features of more powerful
>systems such as algebraic data types, interfaces, and parametric polymorphism.
>Despite this, it works pretty well because of Python's dynamic typing. I
>strongly believe that attempting to enforce the existing type system would be a
>real shame.
This is my main concern, but I'd phrase it very differently.
First, Python's type system _is_ powerful, but only because it's dynamic. Duck typing simulates parametric polymorphism perfectly, disjunction types as long as they don't include themselves recursively, algebraic data types in some but not all cases, etc. Simple (Java-style) generics, of the kind that Guido seems to be proposing, are not nearly as flexible. That's the problem.
On the other hand, even though these types only cover a small portion of the space of Python's implicit type system, a lot of useful functions fall within that small portion. As long as you can just leave the rest of the program untyped, and there are no boundary problems, there's no real risk.
On the third hand, what worries me is this:
> Mypy has a cast() operator that you can use to shut it up when you (think you) know the conversion is safe.
Why do we need casts? You shouldn't be trying to enforce static typing in a part of the program whose static type isn't sound. Languages like Java and C++ have no choice; Python does, so why not take advantage of it?
The standard JSON example seems appropriate here. What's the return type of json.loads? In Haskell, you write a pretty trivial JSONThing ADT, and you return a JSONThing that's an Object (which means its value maps String to JSONThing). In Python today, you return a dict, and use it exactly the same as in Haskell, except that you can't verify its soundness at compile time. In Java or C++, it's… what? The sound option is a special JSONThing that has separate getObjectMemberString and getArrayMemberString and getObjectMemberInt, which is incredibly painful to use. A plain old Dict[String, Object] looks simple, but it means you have to downcast all over the place to do anything, making it completely unsound, and still unpleasant. The official Java json.org library gives you a hybrid between the two that manages to be neither sound nor user-friendly. And of course there are libraries for many poor static languages (especially C++) that try to fake duck typing as far as possible for their JSON objects, which is of course nowhere near as far as Python gets for free.
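For comparison, here is what the duck-typed side of that JSON example looks like in Python today (a small illustration of mine, not from the thread):

```python
import json

doc = json.loads('{"name": "mypy", "tags": ["types", "static"]}')

# No casts and no JSONThing wrapper: the result is plain dicts, lists and
# strings, and duck typing lets us index straight into the structure.
assert doc["name"] == "mypy"
assert doc["tags"][0] == "types"
```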
> def word_count(input: List[str]) -> Dict[str, int]:
>     result = {}  #type: Dict[str, int]
>     for line in input:
>         for word in line.split():
>             result[word] = result.get(word, 0) + 1
>     return result
I just realized why this bothers me.
This function really, really ought to be taking an Iterable[String] (except that we don't have a String ABC). If you hadn't statically typed it, it would work just fine with, say, a text file—or, for that matter, a binary file. By restricting it to List[str], you've made it a lot less usable, for no visible benefit.
And, while this is less serious, I don't think it should be guaranteeing that the result is a Dict rather than just some kind of Mapping. If you want to change the implementation tomorrow to return some kind of proxy or a tree-based sorted mapping, you can't do so without breaking all the code that uses your function.
And if even Guido, in the motivating example for this feature, is needlessly restricting the usability and future flexibility of a function, I suspect it may be a much bigger problem in practice.
This example also shows exactly what's wrong with simple generics: if this function takes an Iterable[String], it doesn't just return a Mapping[String, int], it returns a Mapping of _the same String type_. If your annotations can't express that, any value that passes through this function loses type information.
And not being able to tell whether the keys in word_count(f) are str or bytes *even if you know that f was a text file* seems like a pretty major loss.
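Andrew's point is easy to demonstrate: the same body, typed loosely (or not at all), works on an open text file just as well as on a list (illustration mine):

```python
import io

def word_count(lines):
    # Annotated as Iterable[str] (or left unannotated), this accepts any
    # iterable of strings: a list, a generator, or an open text file.
    result = {}
    for line in lines:
        for word in line.split():
            result[word] = result.get(word, 0) + 1
    return result

expected = {"to": 2, "be": 2, "or": 1, "not": 1}
assert word_count(["to be", "or not to be"]) == expected
assert word_count(io.StringIO("to be\nor not to be")) == expected
```

With the ``List[str]`` annotation, the ``StringIO`` (or file) call is exactly the kind of use a checker would reject.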
> Both solutions have merit, but the idea of some implementations of the type checker having covariance and some contravariance is fairly disturbing.

Why can't we have both? That's the only way to properly type things, since immutable-get-style APIs are always going to be covariant, set-only style APIs (e.g. a function that takes 1 arg and returns None) are going to be contravariant, and mutable get-set APIs (like most Python collections) should really be invariant.
Actually, mypy already has a solution. There's a codec (https://github.com/JukkaL/mypy/tree/master/mypy/codec) that you can use which transforms Python-2-with-annotations into vanilla Python 2. It's not an ideal solution, but it can work in cases where you absolutely have to have state of the art Python 3.5 type checking *and* backwards compatibility with Python 2.
> In Java or C++, it's… what? The sound option is a special JSONThing that
> has separate getObjectMemberString and getArrayMemberString and
> getObjectMemberInt, which is incredibly painful to use.
That's mainly because Java doesn't let you define your own
types that use convenient syntax such as [] for indexing.
Python doesn't have that problem, so a decent static type
system for Python should let you define a JSONThing class
that's fully type-safe while having a standard mapping
interface.
--
Greg
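A sketch of the kind of class Greg describes (hypothetical, and any real design would need much more than this): a wrapper that keeps Python's standard ``[]`` indexing while funnelling every access through one place a checker could reason about.

```python
class JSONThing:
    """Wrap a decoded JSON value while keeping a standard mapping/sequence
    interface, so every access goes through one checkable choke point."""
    def __init__(self, value):
        self._value = value

    def __getitem__(self, key):
        # str keys index objects; int indexes address arrays.
        return JSONThing(self._value[key])

    def unwrap(self):
        return self._value

doc = JSONThing({"users": [{"name": "guido"}]})
assert doc["users"][0]["name"].unwrap() == "guido"
```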
On Aug 13, 2014, at 6:39 PM, Andrew Barnert <abar...@yahoo.com.dmarc.invalid> wrote:
> On Wednesday, August 13, 2014 12:45 PM, Guido van Rossum <gu...@python.org> wrote:
>> def word_count(input: List[str]) -> Dict[str, int]:
>>     result = {}  # type: Dict[str, int]
>>     for line in input:
>>         for word in line.split():
>>             result[word] = result.get(word, 0) + 1
>>     return result
> I just realized why this bothers me.
> This function really, really ought to be taking an Iterable[String]
You do realize String also happens to be an Iterable[String], right?
One of my big dreams about Python is that one day we'll drop support for strings being iterable. Nothing of value would be lost and that would enable us to use isinstance(x, Iterable) and more importantly isinstance(x, Sequence). Funny that this surfaces now, too.
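The pitfall being alluded to is easy to demonstrate: a string passes exactly the isinstance checks you would use to detect "a collection of strings", so every recursive traversal needs a special-case guard.

```python
from collections.abc import Iterable, Sequence

# A str is itself an Iterable (and a Sequence), so isinstance alone
# cannot tell "one string" apart from "a collection of strings":
string_is_iterable = isinstance("abc", Iterable)   # True
string_is_sequence = isinstance("abc", Sequence)   # True

def flatten(items):
    """Recursively flatten; without the str guard this recurses forever,
    because iterating a one-character string yields that same string."""
    for item in items:
        if isinstance(item, Iterable) and not isinstance(item, str):
            yield from flatten(item)
        else:
            yield item

flat = list(flatten([["a", "b"], "cd"]))  # "cd" survives intact
```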
_______________________________________________
Python-ideas mailing list
Python...@python.org
https://mail.python.org/mailman/listinfo/python-ideas
Code of Conduct: http://python.org/psf/codeofconduct/
On Wednesday, August 13, 2014 12:45 PM, Guido van Rossum <gu...@python.org> wrote:
> def word_count(input: List[str]) -> Dict[str, int]:
>     result = {}  # type: Dict[str, int]
>     for line in input:
>         for word in line.split():
>             result[word] = result.get(word, 0) + 1
>     return result
I just realized why this bothers me.
This function really, really ought to be taking an Iterable[String] (except that we don't have a String ABC). If you hadn't statically typed it, it would work just fine with, say, a text file—or, for that matter, a binary file. By restricting it to List[str], you've made it a lot less usable, for no visible benefit.
And, while this is less serious, I don't think it should be guaranteeing that the result is a Dict rather than just some kind of Mapping. If you want to change the implementation tomorrow to return some kind of proxy or a tree-based sorted mapping, you can't do so without breaking all the code that uses your function.
And if even Guido, in the motivating example for this feature, is needlessly restricting the usability and future flexibility of a function, I suspect it may be a much bigger problem in practice.
This example also shows exactly what's wrong with simple generics: if this function takes an Iterable[String], it doesn't just return a Mapping[String, int], it returns a Mapping of _the same String type_. If your annotations can't express that, any value that passes through this function loses type information.
And not being able to tell whether the keys in word_count(f) are str or bytes *even if you know that f was a text file* seems like a pretty major loss.
On Wed, Aug 13, 2014 at 6:39 PM, Andrew Barnert <abar...@yahoo.com.dmarc.invalid> wrote:
> This example also shows exactly what's wrong with simple generics: if this function takes an Iterable[String], it doesn't just return a Mapping[String, int], it returns a Mapping of _the same String type_. If your annotations can't express that, any value that passes through this function loses type information.
In most cases it really doesn't matter though -- some types are better left concrete, especially strings and numbers. If you read the mypy docs you'll find that there are generic types, so that it's possible to define a function as taking an Iterable[T] and returning a Mapping[T, int]. What's not currently possible is expressing additional constraints on T such as that it must be a String. When I last talked to Jukka he explained that he was going to add something for that too (@Jukka: structured types?).
> And not being able to tell whether the keys in word_count(f) are str or bytes *even if you know that f was a text file* seems like a pretty major loss.
On this point one of us must be confused. Let's assume it's me. :-) Mypy has a few different IO types that can express the difference between text and binary files. I think there's some work that needs to be done (and of course the built-in open() function has a terribly ambiguous return type :-( ), but it should be possible to say that a text file is an Iterable[str] and a binary file is an Iterable[bytes]. So together with the structured (?) types it should be possible to specify the signature of word_count() just as you want it. However, in most cases it's overkill, and you wouldn't want to do that for most code.
Also, it probably wouldn't work for more realistic examples -- as soon as you replace the split() method call with something that takes punctuation into account, you're probably going to write it in a way that works only for text strings anyway, and very few people will want or need to write the polymorphic version. (But if they do, mypy has a handy @overload decorator that they can use. :-)
Anyway, I agree it would be good to make sure that some of these more advanced things can actually be spelled before we freeze our commitment to a specific syntax, but let's not assume that just because you can't spell every possible generic use case it's no good.
It’s great to see this finally happening!
I did some research on existing optional-typing approaches [1]. What I learned in the process was that linting is the most important use case for optional typing; runtime checks are too little, too late.
That being said, having optional runtime checks available *is* also important. Used in staging environments and during unit testing, they can cover cases obscured by meta-programming. Implementations like “obiwan” and “pytypedecl” show that providing a runtime type checker is absolutely feasible.
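A toy version of such a runtime checker fits in a dozen lines, which is part of why the obiwan/pytypedecl approach is feasible. This is only a sketch: it handles plain-class annotations and nothing else (no generics, no unions), and real tools are far more capable.

```python
import functools
import inspect

def checked(func):
    """Enforce simple class annotations on call -- a minimal sketch."""
    sig = inspect.signature(func)
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            ann = func.__annotations__.get(name)
            # only plain classes are checkable this naively
            if isinstance(ann, type) and not isinstance(value, ann):
                raise TypeError('%s must be %s' % (name, ann.__name__))
        return func(*args, **kwargs)
    return wrapper

@checked
def register(age: int, email: str) -> None:
    pass

register(30, 'a@example.com')              # passes
try:
    register('thirty', 'a@example.com')    # rejected at call time
    rejected = False
except TypeError:
    rejected = True
```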
The function annotation syntax currently supported in Python 3.4 is not well-suited for typing. This is because users expect to be able to operate on the types they know. This is currently not feasible because:
1. forward references are impossible
2. generics are impossible without custom syntax (which is the reason Mypy’s Dict exists)
3. optional types are clumsy to express (Optional[int] is very verbose for a use case this common)
4. union types are clumsy to express
All those problems are elegantly solved by Google’s pytypedecl via moving type information to a separate file.
Because for our use case that would not be an acceptable approach, my intuition would be to:
1. Provide support for generics (understood as an answer to the question: “what does this collection contain?”) in Abstract Base Classes. That would be a PEP in itself.
2. Change the function annotation syntax so that it’s not executed at import time but rather treated as strings. This solves forward references and enables us to…
3. Extend the function annotation syntax with first-class generics support (most languages use something like "list<str>”)
4. Extend the function annotation syntax with first-class union type support. pytypedecl simply uses “int or None”, which I find very elegant.
5. Speaking of None, possibly further extend the function annotation syntax with first-class optionality support. In the Facebook codebase in Hack we have tens of thousands of optional ints (never mind other optional types!); this is a case that’s going to be used all the time. Hack uses ?int, that’s the most succinct style you can get. Yes, it’s special, but None is a special type, too.
All in all, I believe Mypy has the highest chance of becoming our typing linter, which is great! I just hope we can improve on the syntax, which is currently lacking. Also, reusing our existing ABCs where applicable would be nice. With Mypy’s typing module I feel like we’re going to get a new, orthogonal set of ABCs, which will confuse users to no end. Finally, the runtime type checker would make the ecosystem complete.
This is just the beginning of the open issues I was juggling with and the reason my own try at the PEP was coming up slower than I’d like.
First, I am really happy that you are interested in this and that your point (2) of what you want to see done is very limited and acknowledges that it isn't going to specify everything! Because that isn't possible. :)
Unfortunately I feel that adding syntax like this to the language itself is not useful without enforcement, because that leads to code being written with unintentionally incorrect annotations, which wind up deployed in libraries and later become a problem as soon as an actual analysis tool attempts to run over something that uses the incorrectly specified code in a place where it cannot be easily updated (like the standard library).
At the summit in Montreal earlier this year Łukasz Langa (cc'd) volunteered to lead writing the PEP on Python type hinting based on the many existing implementations of such things (including mypy, cython, numba and pytypedecl). I believe he has an initial draft he intends to send out soon. I'll let him speak to that.
Looks like Łukasz already responded, I'll stop writing now and go read that. :)
Personal opinion from experience trying: you can't express the depth of types for an interface within the Python language syntax itself (assuming hacks such as specially formatted comments, strings or docstrings do not count). Forward references to things that haven't even been defined yet are common. You often want an ability to specify a duck-type interface rather than a specific type. I think he has those points covered better than I do.
You could use AnyStr to make the example work with bytes as well:
def word_count(input: Iterable[AnyStr]) -> Dict[AnyStr, int]:
    result = {}  # type: Dict[AnyStr, int]
    for line in input:
        for word in line.split():
            result[word] = result.get(word, 0) + 1
    return result
Again, if this is just a simple utility function that you use once or twice, I see no reason to spend a lot of effort in coming up with the most general signature. Types are an abstraction and they can't express everything precisely -- there will always be a lot of cases where you can't express the most general type. However, I think that relatively simple types work well enough most of the time, and give the most bang for the buck.
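For what it's worth, the AnyStr in Jukka's example is exactly the "constraint on T" Guido mentions: as the typing module later standardized it (an assumption relative to this 2014 thread), AnyStr is a value-restricted type variable, so the signature above is runnable today:

```python
from typing import Dict, Iterable, TypeVar

# AnyStr is a type variable restricted to str or bytes; whichever one the
# argument supplies is the one that flows through to the result type.
AnyStr = TypeVar('AnyStr', str, bytes)

def word_count(input: Iterable[AnyStr]) -> Dict[AnyStr, int]:
    result = {}  # type: Dict[AnyStr, int]
    for line in input:
        for word in line.split():
            result[word] = result.get(word, 0) + 1
    return result

text_counts = word_count(["spam eggs spam"])    # keys are str
byte_counts = word_count([b"spam eggs spam"])   # keys are bytes
```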
def word_count[AnyStr <: String](input: Iterable[AnyStr]): Dict[AnyStr, Int]
FWIW, I was strongly encouraging folks at SciPy that were interested
in static typing to look at mypy as an example of a potentially
acceptable syntax for standardised optional static typing.
Aside from being +1 on the general idea of picking *something* as
"good enough" and iterating from there, I don't have a strong personal
opinion, though.
(And yes, my main interest is in improving the ability to do effective
linting for larger projects. "pylint -E" is a lot better than nothing,
but it has its limits in the absence of additional static hints)
Cheers,
Nick.
--
Nick Coghlan | ncog...@gmail.com | Brisbane, Australia
On Wed, Aug 13, 2014 at 3:26 PM, Manuel Cerón <cero...@gmail.com> wrote:
The type checking algorithm might evolve over time, but by including typing.py in the stdlib, the syntax for annotations would be almost frozen, and that will be a limitation. In other projects such as TypeScript (http://www.typescriptlang.org/), the syntax usually evolves alongside the algorithms.
What kind of evolution did TypeScript experience?
> One interesting feature of TypeScript is that it allows you to annotate
> existing code without modifying it, by using external definition files. In
> the JavaScript world, many people have contributed TypeScript annotation
> files for popular JS libraries (http://definitelytyped.org/).
>
> I think this is possible in Python as well doing something like this:
>
> @annotate('math.ceil')
> def ceil(x: float) -> int:
> pass
>
> I think this should be the way to go for annotating the stdlib. It has the
> advantage that if the type syntax changes, it's possible to provide new type
> annotations without changing the libraries at all, and even supporting older
> versions. In this way the code and type annotations can evolve separately.
>
> Does mypy support something like this?
We use something quite similar to TypeScript's repository of
annotations in PyCharm. Here is our external annotations proposal and
stubs for some stdlib modules
(https://github.com/JetBrains/python-skeletons) that uses a custom
type syntax in docstrings due to lack of a better standard, see
README. We state in our proposal that we would like the standard to
emerge. The idea being discussed here about using Mypy's type system
is not new to us. As I've mentioned in the original thread, we have
discussed it with Jukka Lehtosalo, the author of Mypy. Some initial
ideas are listed here (https://github.com/pytypes/pytypes).
--
Andrey Vlasovskikh
Web: http://pirx.ru/
Guido, as requested, I read your whole post before replying. Please do the same. This response is both critical and supportive.
My main concern with static typing is that it tends to be anti-duck-typing, while I consider duck-typing to be a major *feature* of Python. The example in the page above is "def fib(n: int):". fib should get a count (non-negative integer) value, but it need not be an int, and 'half' the ints do not qualify. Reading the tutorial, I could not tell if it supports numbers.Number (which should approximate the domain from above).
On 8/13/2014 3:44 PM, Guido van Rossum wrote:
Yesterday afternoon I had an inspiring conversation with Bob Ippolito
(man of many trades, author of simplejson) and Jukka Lehtosalo (author
of mypy: http://mypy-lang.org/).
Now consider an extended version (after Lucas).
def fib(n, a, b):
    i = 0
    while i <= n:
        print(i, a)
        i += 1
        a, b = b, a+b
The only requirement on a and b is that they be addable. Any numbers should be allowed, as in fib(10, 1, 1+1j), but so should fib(5, '0', '1'). Addable would be approximated from below by Union[Number, str].
I am proposing that we adopt whatever mypy uses here, keeping discussion of the details (mostly) out of the PEP. The goal is to make it possible to add type checking annotations to 3rd party modules (and even to the stdlib) while allowing unaltered execution of the program by the (unmodified) Python 3.5 interpreter.
What do I fear? I think it is that Python be transformed into a programming language different from the one that now makes me so productive.
I studied Ruby, and I don't like it. I've been studying Go, and I don't like it. One must like the concepts and the power, sure, but the syntax required for some day-to-day stuff stinks like trouble; simple stuff is so complicated to express and so easy to get wrong...
I hate "List[str]" and "Dict[str:int]". Where did those come from? Shouldn't they (as others have proposed) be "[str]" and "{str:int}"? What about tuples?
Why not write a similar, but different programming language that targets the Cython runtime and includes all the desired features?
The PSF could support (even fund) MyPy and similar projects, promoting their maturity and their convergence. The changes in 3.5 would be limited but enough to enable those efforts, and those of the several IDE tool-smiths (changes in annotations, and maybe in ABCs). Basically, treat MyPy as PyPy or NumPy (which got '::'). It's in Python's history to enable third-party developments and then adopt what's mature or become the de-facto standard.
On 8/13/2014 3:44 PM, Guido van Rossum wrote:
Now consider an extended version (after Lucas).
def fib(n, a, b):
    i = 0
    while i <= n:
        print(i, a)
        i += 1
        a, b = b, a+b
The only requirement on a and b is that they be addable. Any numbers should be allowed, as in fib(10, 1, 1+1j), but so should fib(5, '0', '1'). Addable would be approximated from below by Union[Number, str].
Unless MyPy added some sort of type classes...
It seems to me that adding a module which is so far only used by one project to the standard library is a bit premature.
I support optional typing, but why push this to stdlib now? Wouldn't
it be better to wait until most IDEs/linters all agree on this syntax,
until freezing it in stdlib? So far typing seems to be a part of mypy,
shouldn't it spend some time on PyPI first?
I'm also not sure about there not being other uses of annotations -- clize
aside, there are not many widely used Python3-only 3rd party
libraries, so it's no surprise that nothing big is built around Python
3 features.
Maybe the way from PEP 3107's "here's a feature, use it for whatever
you like" to "annotations are for typing declarations, using
mypy/typing syntax" should include a step of "if you use annotations
for typing, use mypy/typing syntax for it". (And perhaps it should end
there.)
On Wed, Aug 13, 2014 at 12:44:21PM -0700, Guido van Rossum wrote:
> (a) Python should adopt mypy's syntax for function annotations
[...]
I'm very excited to see function annotations being treated seriously,
I think the introduction of static typing, even optional, has the
potential to radically change the nature of Python language and I'm not
sure if that will be good or bad :-) but it is reassuring to hear that
the intention is that it will be treated more like an optional linter
than as a core part of the language.
On the other hand, are you aware of Cobra, which explicitly was modelled
on Python but with optional static typing?
[...]
> *(1) A change of direction for function annotations*
> [...] I propose a conscious change of course here by stating
> that annotations should be used to indicate types and to propose a standard
> notation for them.
And in a later email, Guido also stated:
> I want to eventually phase out other uses of function annotations
That disappoints me and I hope you will reconsider.
I've spent some time thinking about using annotations for purposes other
than type checking, but because most of my code has to run on Python 2,
there's nothing concrete. One example is that I started exploring ways
to use annotations as documentation for the statistics module in 3.4,
except that annotations are banned from the standard library. (Naturally
I haven't spent a lot of time on something that I knew was going to be
rejected.) I came up with ideas like this:
def mean(data) -> 'μ = ∑(x)/n':
def pvariance(data) -> 'σ² = ∑(x - μ)² ÷ n':
which might have been a solution to this request:
http://bugs.python.org/issue21046
had annotations been allowed in the stdlib. Regardless of whether this
specific idea is a good one or not, I will be disappointed if
annotations are limited to one and only one use. I don't mind if there
is a standard, default, set of semantics so long as there is a way to
opt-out and use something else:
@use_spam_annotations
def frobnicate(x: spam, y: eggs)->breakfast:
...
for example. Whatever the mechanism, I think Python should not prohibit
or deprecate other annotation semantics.
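The opt-out mechanism sketched above need not be heavy: a decorator that tags the function is enough for checkers to skip it. This is a hypothetical sketch of the idea in Steven's message; the decorator name, the marker attribute, and the annotation strings are illustrative, not any real API.

```python
# A sketch of an opt-out decorator: it records that this function's
# annotations carry non-type semantics, so a type checker can skip it.
def use_spam_annotations(func):
    func.__non_type_annotations__ = True   # hypothetical marker attribute
    return func

@use_spam_annotations
def mean(data) -> 'μ = ∑(x)/n':            # annotation is documentation
    return sum(data) / len(data)

# A checker (or linter) would consult the marker before interpreting
# the annotation as a type:
skip_type_check = getattr(mean, '__non_type_annotations__', False)
```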
> *(2) A specification for what to add to Python 3.5*
>
> There needs to be at least a rough consensus on the syntax for annotations,
> and the syntax must cover a large enough set of use cases to be useful.
> Mypy is still under development, and some of its features are still
> evolving (e.g. unions were only added a few weeks ago). It would be
> possible to argue endlessly about details of the notation, e.g. whether to
> use 'list' or 'List', what either of those means (is a duck-typed list-like
> type acceptable?) or how to declare and use type variables, and what to do
> with functions that have no annotations at all (mypy currently skips those
> completely).
It doesn't sound to me like the mypy syntax is mature enough to bless,
let alone to start using it in the standard library.
> I am proposing that we adopt whatever mypy uses here, keeping discussion of
> the details (mostly) out of the PEP. The goal is to make it possible to add
> type checking annotations to 3rd party modules (and even to the stdlib)
> while allowing unaltered execution of the program by the (unmodified)
> Python 3.5 interpreter. The actual type checker will not be integrated with
> the Python interpreter, and it will not be checked into the CPython
> repository. The only thing that needs to be added to the stdlib is a copy
> of mypy's typing.py module.
What happens when the typing.py module in the standard library gets out
of sync with the typing.py module in mypy?
[...]
> *Appendix -- Why Add Type Annotations?*
> The argument between proponents of static typing and dynamic typing has
> been going on for many decades. Neither side is all wrong or all right.
> Python has traditionally fallen in the camp of extremely dynamic typing,
> and this has worked well for most users, but there are definitely some
> areas where adding type annotations would help.
Some people have probably already seen this, but I have found this
article to be very useful for understanding why static and dynamic type
checking can be complementary rather than opposed:
http://cdsmith.wordpress.com/2011/01/09/an-old-article-i-wrote/
--
Steven
On Thu Aug 14 2014 at 10:53:37 AM Petr Viktorin <enc...@gmail.com> wrote:
It seems to me that adding a module which is so far only used by one project to the standard library is a bit premature.
I support optional typing, but why push this to stdlib now? Wouldn't
it be better to wait until most IDEs/linters all agree on this syntax,
until freezing it in stdlib? So far typing seems to be a part of mypy,
shouldn't it spend some time on PyPI first?
Because as you have noticed in this thread there are already a ton of competing solutions and no consensus has been reached. Sometimes Guido and/or python-dev have to step in and simply say "there is obvious need and the community is not reaching consensus, so we will make the decision ourselves".
I'm also not sure about there not being other uses of annotations -- clize
aside, there are not many widely used Python3-only 3rd party
libraries, so it's no surprise that nothing big is built around Python
3 features.
Maybe the way from PEP 3107's "here's a feature, use it for whatever
you like" to "annotations are for typing declarations, using
mypy/typing syntax" should include a step of "if you use annotations
for typing, use mypy/typing syntax for it". (And perhaps it should end
there.)
That's a possibility. Another thing to support this approach is that if something like List[str] is used over `[str]`, then the returned object can subclass some common superclass which can be typechecked for, to know that the annotation is from typing.py and not clize/scription, and continue to function. That way you can avoid any decorators adding some attribute on functions to know that types have been specified, while allowing function annotations to be used for anything. Otherwise a @typing.ignore decorator could also exist for alternative formats to use (since typing has been the most common use case and decorating your single main() function with @typing.ignore is not exactly heavy-handed).
Frankly, I'd just really like to get all of this noise out of the function declaration. Any reasonable, readable and consistent documentation format is fine with me. I chose the sphinx format because it is already well supported in pycharm and that was mentioned in the first few responses.
I actually don't like the colon syntax very much (it's awkward and unnatural to type), so if anyone has a different suggestion I'd be very open to that.
Mainly I want to ensure that Python doesn't sacrifice readability and line length (which is part of readability) just because annotations are already built in.
I suggest we decide on a standard format that can be used in documentation strings and also used with type checking.
Let's enhance our documentation with types, not obfuscate function declarations.
Sunjay
On 14 Aug 2014 17:02, "Sunjay Varma" <varma....@gmail.com> wrote:
> Here's a taste of what that looks like:
> class SimpleEquation(object):
>
>     def demo(self, a, b, c):
>         """
>         This function returns the product of a, b and c
>         @type self: SimpleEquation
>         :param a: int - The first number
>         :param b: int
>         :param c: int - The third number should not be zero and should also
>             only be -1 if you enjoy carrots (this comment spans 2 lines)
>         :return: int
>         """
>         return a * b * c
There are at least three existing, popular, standardized syntaxes for these kinds of docstring annotations in use: plain ReST, Google's docstring standard, and numpy's docstring standard. All are supported by Sphinx out of the box. (The latter two require enabling the "napoleon" extension, but this is literally a one line config file switch.)
Would you suggest that python-dev should pick one of these and declare it to be the official standard, or...?
-n
Would it be possible, and desirable, to modify the built-in types so
that we could re-use them in the type annotations?
def word_count(input: list[str]) -> dict[str, int]:
Since types are otherwise unlikely to be indexable like that, I think
that might work.
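Making builtins indexable like that would need interpreter changes, but the shape of the machinery is small. A sketch under stated assumptions: Indexable and GenericAlias here are made-up stand-ins showing what `dict[str, int]` would have to produce — an object that remembers its origin type and parameters without affecting runtime behavior — not the real builtins or the typing module.

```python
class GenericAlias:
    """Records a parameterized type; purely descriptive, no runtime effect."""
    def __init__(self, origin, params):
        self.origin = origin
        self.params = params
    def __repr__(self):
        names = ', '.join(p.__name__ for p in self.params)
        return '%s[%s]' % (self.origin.__name__, names)

class Indexable:
    """Hypothetical shim giving a builtin type an indexing syntax."""
    def __init__(self, origin):
        self.origin = origin
    def __getitem__(self, params):
        if not isinstance(params, tuple):
            params = (params,)
        return GenericAlias(self.origin, params)

List = Indexable(list)   # stand-ins, not the typing module's List/Dict
Dict = Indexable(dict)

annotation = Dict[str, int]   # an object usable in annotations
```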
See responses inline
On Aug 14, 2014 3:28 PM, "Dennis Brakhane" <brak...@googlemail.com> wrote:
> 1. Annotations can be used to communicate additional restrictions on
> values that must be checked on run time
>
> Let's assume a simple Web service that is called via HTTP to register a
> user, and the underlying framework decodes
> the request and finally calls a simple controller function, it could
> look like this
>
> (Java code, @ signifies an annotation)
>
> public Response register(@Range(18,100) int age, @ValidEmail String
> email) { ... }
>
> The framework would check the range of the age parameter and the
> validity of the email and if there are validation errors,
> refusing the request with a suitable error message without ever calling
> our function.
>
This is exactly the kind of syntax I want to avoid. Python should not attempt to just blindly become like Java or any other language. Though Dennis was just illustrating his point (this was not a suggestion of an alternate syntax), I feel like Python is moving further and further towards code like this. Python programmers should not be forcing as much as possible into each line. Explicit is better than implicit.
> Even if we assume that mypy's type system will be incredibly complex and
> allowing to specify all kinds of restrictions on a type,
> it won't help because those checks have to be done at run time, and are
> not optional.
This is a great point. Regardless of the nature of the annotation, we can't let this become a way out for lazy programmers. Having annotations is no excuse for poor error checking.
> 2. They can give meta information to a framework
>
> An admittedly contrived example, let's expand our register function:
>
> public Response register( int age, @Inject @Scope("session") UserService
> userService, @Authenticated User user) ....
This is too much to put on one line in Python. We should be aiming to make code cleaner and promote proper error checking and documentation.
> The flexibility annotations give in Java makes programming much less
> frustrating. It also took quite some time before
> annotations were widely used (they were only available starting in Java
> 5) and people started finding more uses for them.
> I think this will also be true for Python, it will take time before
> people find useful ways for them. Redefining them now to
> be "not much more" than static type information feels wrong to me.
I agree that annotations can be useful, but I don't think they should be used for type checking at this scale.
Sunjay
> On Wed, Aug 13, 2014 at 10:29:48PM +0200, Christian Heimes wrote:
>
> > 1) I'm not keen with the naming of mypy's typing classes. The visual
> > distinction between e.g. dict() and Dict() is too small and IMHO
> > confusing for newcomers. […]
>
> Would it be possible, and desirable, to modify the built-in types so
> that we could re-use them in the type annotations?
That would address my concern. With that change, when the programmer who
reads the code encounters mention of a type, it means precisely the same
type object as it appears to be and not some special beast.
--
\ “Science embraces facts and debates opinion; religion embraces |
`\ opinion and debates the facts.” —Tom Heehler, _The Well-Spoken |
_o__) Thesaurus_ |
Ben Finney
+1 This is definitely something to consider.
One of the many benefits of Python is that you can use objects with equivalent interfaces in functions that may not have expected that type while they were being written.
One thing to note: if we make this syntax too verbose, it may lose its purpose altogether.
For example: one (bad) solution to support what I described above is to define some sort of convoluted syntax for specifying the exact interface a function supports. At this point, any addition to the language would do nothing but hinder it. Anything that verbose is too complex to be valuable.
We have to be careful with this. If we do accept it (or any of the many alternatives suggested so far), then we should choose a few use cases and focus on solving them as best as possible.
On Thu, Aug 14, 2014 at 12:28:26PM +0200, Manuel Cerón wrote:
I'm afraid I don't understand what the annotate decorator is doing here.
> One interesting feature of TypeScript is that it allows you to annotate
> existing code without modifying it, by using external definition files. In
> the JavaScript world, many people have contributed TypeScript annotation
> files for popular JS libraries (http://definitelytyped.org/).
>
> I think this is possible in Python as well doing something like this:
>
> @annotate('math.ceil')
> def ceil(x: float) -> int:
> pass
Can you explain please?
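One plausible reading of the decorator, as a sketch: the stub body is never executed; the decorator just records the stub's annotations in a registry keyed by the dotted name of the real function (here `math.ceil`, spelling the intended target), for an external checker to consult. The registry name is made up for illustration.

```python
# Hypothetical registry mapping dotted names to externally supplied
# annotations, in the spirit of TypeScript's definition files:
EXTERNAL_ANNOTATIONS = {}

def annotate(dotted_name):
    def decorator(stub):
        # record the stub's annotations against the real function's name;
        # the stub itself is never called
        EXTERNAL_ANNOTATIONS[dotted_name] = stub.__annotations__
        return stub
    return decorator

@annotate('math.ceil')
def ceil(x: float) -> int:
    pass

recorded = EXTERNAL_ANNOTATIONS['math.ceil']
```

A checker would then look up 'math.ceil' in the registry instead of (or in addition to) reading annotations off the live math.ceil object.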