Add verbosity option to manage.py checks that outputs which checks are run


Gordon

Jun 10, 2020, 10:00:36 AM
to Django developers (Contributions to Django itself)
This is a discussion for https://code.djangoproject.com/ticket/31681 - the feature would be useful to me and seems like something generally useful as well since it is implemented in the testing framework.  The closing comment mentions that copying a documentation page into command output isn't valuable.  That isn't what I meant to suggest so I will attempt to explain more clearly below:

It would be helpful if `checks` would output what checks it is running.  Currently you don't actually know if a check runs or not.  Perhaps it isn't registered correctly?  Or it was excluded by mistake when you expected it to run?  I like to keep this information around in build history for quick inspection purposes as needed.


For example, when running a check it would be nice to (optionally) see something of the form:
  • path.to.my.checkA ... ok
  • path.to.my.checkB ... ERROR
  • path.to.my.checkC ... ok
  • path.to.my.checkD ... WARN
System check ran 100 checks and identified 2 issues (0 silenced).
 

It would be helpful if the `checks` verbosity flag worked in a similar fashion to the testing framework. I am thinking that `-v 1` (or unspecified) would maintain the current behavior, `-v 2` would output all checks run that are not under the django namespace (due to the comment in the ticket about not repeating a documentation page), and `-v 3` would output all checks. It might also be helpful to add a verbosity level that only prints checks under the current working directory (i.e. project-specific checks), but this might be difficult to determine correctly given varying project layouts and installation methods.

If there is general consensus in favor of the feature and the issue can be re-opened, I will investigate. The ticket says it would not be feasible to output this information, but since the checks are run generically via a registration approach, I expect it should be straightforward to log the dotted path of each check.
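Roughly, I mean something like this (a pure-Python sketch; the helper names here are mine, not Django API):

```python
def check_label(check):
    """Dotted path identifying a check function, e.g. 'path.to.my.checkA'."""
    return f"{check.__module__}.{check.__qualname__}"

def report_line(check, messages):
    """One line of the proposed verbose output: label plus a status."""
    status = "ok" if not messages else "ERROR"
    return f"  • {check_label(check)} ... {status}"
```

The runner already has each check function in hand when it calls it, so building the label costs nothing extra.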

Please let me know what you think!

Adam Johnson

Jun 10, 2020, 11:59:39 AM
to django-d...@googlegroups.com
I am with Mariusz. Displaying the names of the check functions goes somewhat against the intention of the checks framework. The check IDs are intended to be enough information to reference the problem. This is different from unit tests, where the function names are intended to carry information.

Django doesn't really have a useful structure to its check function names. For example, a single check function, check_all_models(), is responsible for all model-level checks (https://github.com/django/django/blob/678c8dfee458cda77fce0d1c127f1939dc134584/django/core/checks/model_checks.py#L12). It has its own call tree through Model.check() and then each field's Field.check(). Outputting a list entry like "check_all_models WARN" would not give any actionable information.

If you want to debug your own checks:
  1. Write tests for them. For example, use override_settings() in a test case to set a setting to the value that would cause a warning, and check that it returns the correct list of CheckMessage objects.
  2. You can inspect which check functions are registered with the undocumented get_checks() function of the registry:
In [1]: from django.core.checks.registry import registry

In [2]: registry.get_checks()
Out[2]:
[<function django.core.checks.urls.check_url_config(app_configs, **kwargs)>,
 <function django.core.checks.urls.check_url_namespaces_unique(app_configs, **kwargs)>,
 ...
 <function django.contrib.auth.checks.check_user_model(app_configs=None, **kwargs)>]

If you're not confident that your checks.register() lines always run, perhaps because they're conditional, you can also write unit tests that call get_checks() and assert that your check functions are registered.

--
You received this message because you are subscribed to the Google Groups "Django developers (Contributions to Django itself)" group.
To unsubscribe from this group and stop receiving emails from it, send an email to django-develop...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/django-developers/85259884-52fa-42b2-822f-f0537f5b2f0bo%40googlegroups.com.


--
Adam

Gordon

Jun 10, 2020, 4:16:56 PM
to Django developers (Contributions to Django itself)
The particular case that spurred this: a reusable app had checks to make sure its URLs were reversible. The checks module wasn't imported, so the check command passed without running the app's checks. There is no indication that the checks were not run, so you have to assume they did run and were successful.

The app's checks themselves were covered by tests. Since those tests import the checks, the checks would show as registered in your `get_checks()` test, no? Seems kind of tricky...

Since the checks framework is meant to be "extensible so you can easily add your own checks", some sort of indication of which checks were run just seems to make sense. Checks are a great way to validate runtime conditions against production settings, but not being able to confirm they actually ran somewhat negates the peace of mind that all is well.

Thanks,
Gordon


Adam Johnson

Jun 10, 2020, 6:40:16 PM
to django-d...@googlegroups.com
Perhaps there could be better guidance in the documentation about registering custom checks. Nothing in the "writing your own checks" guide really describes a recommended code structure. We'd accept a docs PR there, I'm sure.

I normally have a checks submodule in my app and register the checks in AppConfig.ready(). For example: https://github.com/adamchainz/django-mysql/blob/2cf93c771f16011bc754a61e5aa95be4871f0363/src/django_mysql/apps.py#L17 . I don't like the decorator form since that's an import side effect and as you've discovered it's not always easy to ensure your module gets imported.
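Concretely, the pattern looks roughly like this (names are illustrative, not copied from the linked code):

```python
# myapp/apps.py
from django.apps import AppConfig

class MyAppConfig(AppConfig):
    name = "myapp"

    def ready(self):
        # Import inside ready() so the checks module is loaded once the
        # app registry is set up, and register explicitly rather than
        # relying on an import side effect.
        from django.core import checks
        from myapp.checks import check_urls_reversible

        checks.register(check_urls_reversible, checks.Tags.urls)
```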



--
Adam

Gordon

Jun 12, 2020, 10:33:22 AM
to Django developers (Contributions to Django itself)
The underlying problem I want to solve is failing a CI job if an app's checks don't run. The only approach I currently see that would accomplish that is using a custom tag in every project, which could fail with the invalid-tag error - but this feels wrong.

Would you be in favor of a flag for the check command that errors if checks aren't run from a particular namespace?

Like `python manage.py check --require-checks foo` which would exit with a non-zero exit code if zero checks for `foo` are run?

Adam Johnson

Jun 12, 2020, 11:50:31 AM
to django-d...@googlegroups.com
This really feels like a "who watches the watchmen" problem.

Where does this problem stem from? Are you conditionally registering a check function? It would be better in that case to move the conditionality into the check body.
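That is, instead of wrapping checks.register() in an if, let the check itself decide it has nothing to report. A sketch (feature_enabled() is a stand-in for whatever condition guarded the registration; a real check would return django.core.checks message objects):

```python
def feature_enabled():
    # Stand-in for the original condition, e.g. a setting being set or an
    # optional dependency being importable.
    return False

def check_feature(app_configs=None, **kwargs):
    # Always registered, always runs. When the feature is off there is
    # simply nothing to validate, so the check reports no messages.
    if not feature_enabled():
        return []
    problems = []
    # ... real validation here, appending a message per problem found ...
    return problems
```

This way the check appears in any future verbose output either way, and "feature off" is distinguishable from "check never ran".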




--
Adam

1337 Shadow Hacker

Jul 24, 2020, 8:43:58 AM
to django-d...@googlegroups.com
Absolutely agree that a verbosity or debug option should print ... debug info.