An application can simply count the number of parses and take whatever
action is desired, as sketched below. Why, in that case, have a
max_parses named argument at all? max_parses is intended mainly as a
fallback for testing and debugging, to stop "run-away" parses. In those
contexts, where the application's logic is still in a formative stage,
the Draconian solution (a fatal error that simply stops the parse) is
the best form of "stop loss". Production applications will probably
avoid max_parses.
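Here is a minimal sketch of the application-level alternative. The loop
idiom (repeated calls to value() iterating over the parse results of an
ambiguous parse, returning undef when they are exhausted) is from the
Marpa::R2 documentation; the grammar, the input, the limit of 10, and
what is done when the limit is hit are all illustrative choices, not
part of the API.

```perl
use strict;
use warnings;
use Marpa::R2;

# A deliberately ambiguous grammar: 'aaa' has more than one parse.
my $dsl = <<'END_OF_DSL';
:default ::= action => ::array
:start ::= S
S ::= S S
S ::= 'a'
END_OF_DSL

my $grammar = Marpa::R2::Scanless::G->new( { source => \$dsl } );
my $recce   = Marpa::R2::Scanless::R->new( { grammar => $grammar } );
$recce->read( \'aaa' );

# Application-level "stop loss": count the parses ourselves and take
# whatever action we want when the count exceeds our own limit.
my $limit       = 10;    # illustrative cap, chosen by the application
my $parse_count = 0;
while ( defined( my $value_ref = $recce->value() ) ) {
    $parse_count++;

    # Unlike max_parses, which is a fatal error, the application can
    # warn, stop quietly, rank the results, etc.
    last if $parse_count > $limit;
}
print "Found $parse_count parse(s)\n";
```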
By the way, this question prompted me to think through the usage of
max_parses more carefully. I've clarified the documentation in commit
ae5d941.
high_rule_only?