[Django] #37066: Async generator `aclose()` deferred to loop shutdown at `async for` call sites that don't use `contextlib.aclosing`

Django

Apr 25, 2026, 9:27:56 AM
to django-...@googlegroups.com
#37066: Async generator `aclose()` deferred to loop shutdown at `async for` call
sites that don't use `contextlib.aclosing`
-------------------------------------+-------------------------------------
     Reporter:  Thomas Grainger      |          Type:  Cleanup/optimization
       Status:  new                  |     Component:  Uncategorized
      Version:  6.0                  |      Severity:  Normal
     Keywords:                       |  Triage Stage:  Unreviewed
    Has patch:  0                    |  Needs documentation:  0
  Needs tests:  0                    |  Patch needs improvement:  0
Easy pickings:  0                    |         UI/UX:  0
-------------------------------------+-------------------------------------
= Component =

Core (Other) — affects `django.http`, `django.db.models`,
`django.core.paginator`, `django.contrib.auth`, `django.shortcuts`,
`django.utils.text`, `django.test.client`.

= Description =

Several places in Django consume a (possibly user-supplied) async iterable
with a bare `async for` (or `[... async for ... in ...]` comprehension)
without wrapping it in `contextlib.aclosing()`. When the consumer exits
the loop early — via `break`, `return`, an exception raised in the
consumer body, or cancellation that lands while the generator is suspended
at a `yield` (i.e. between `__anext__` calls, on a user-body `await`) —
the underlying async generator is not closed at that point. Its `aclose()`
is deferred until asyncio's asyncgen finalizer hook fires it: that happens
at the next loop iteration after the generator becomes unreachable (the
hook calls `call_soon` to schedule `aclose()` as a Task) or, if the
generator is still reachable when the loop is winding down, at
`loop.shutdown_asyncgens()` during `asyncio.run()` teardown. Any
`try`/`finally` (or `async with`) inside the generator — database
transactions, connection releases, file handles, locks — therefore runs at
a non-deterministic later time, and in the worst case (loop already torn
down, or generator held by a module-level reference) does not run at all.
The leak window varies by site: long-lived ASGI loops are unbounded;
short-lived `async_to_sync` wrappers (e.g. `to_list` in
`StreamingHttpResponse.__iter__`) tear down their own loop promptly so the
gap is small, but the close is still non-deterministic and `aclose()`
errors are still swallowed by the loop's exception handler.

`django/core/handlers/asgi.py:325` already uses `aclosing(aiter(...))` for
exactly this reason and links to
[https://github.com/python/cpython/commit/6e8dcdaaa49d4313bf9fab9f9923ca5828fbb10e
CPython 6e8dcdaa]. The same care is needed at every call site that
consumes an async iterable Django doesn't own.

= Steps to reproduce =

{{{
#!python
import asyncio

async def gen():
    try:
        yield "first"
        yield "second"
    finally:
        print(" -> gen finally ran")

async def consume_and_break():
    async for x in gen():
        print(f"got {x!r}, breaking")
        break  # gen is still suspended at the first yield; finally is pending

async def main():
    await consume_and_break()
    print("after consume_and_break() returned")
    await asyncio.sleep(0)
    print("after one loop tick")
    await asyncio.sleep(0.1)
    print("after another loop tick")

asyncio.run(main())
# Observed output on CPython 3.12:
# got 'first', breaking
# after consume_and_break() returned
# after one loop tick
# -> gen finally ran
# after another loop tick
#
# The user generator's `finally` does not run when the consuming
# `async for` exits. asyncio's asyncgen finalizer hook schedules
# `aclose()` on the loop (via `call_soon`) when the generator is
# collected; the cleanup therefore runs at some later loop tick,
# interleaved with whatever else the loop is doing.
}}}
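For contrast, wrapping the same consumer in `contextlib.aclosing` (the pattern this ticket proposes at each call site) makes the `finally` run synchronously, before the consuming coroutine returns:

{{{
#!python
import asyncio
from contextlib import aclosing

async def gen():
    try:
        yield "first"
        yield "second"
    finally:
        print(" -> gen finally ran")

async def consume_and_break():
    # aclosing() calls aclose() in __aexit__, so the generator's finally
    # runs deterministically when the async with block is left.
    async with aclosing(gen()) as it:
        async for x in it:
            print(f"got {x!r}, breaking")
            break
    print("after the async with exited")

asyncio.run(consume_and_break())
# got 'first', breaking
#  -> gen finally ran
# after the async with exited
}}}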

In a long-lived loop (an ASGI server) the gap between "consumer exits" and
"generator's finally runs" is bounded only by when the next loop iteration
drains pending callbacks, so the `finally` runs concurrently with
unrelated work rather than synchronously at the consumer's exit point. Two
pathological cases push the gap further: if the generator is kept
reachable past consumer return (a module-level reference, or held by an
outer object such as a `StreamingHttpResponse`), the finalizer hook only
fires when that reference goes away; if the loop is closed before the
generator is collected, the hook's `if not self.is_closed()` guard skips
the `aclose` schedule entirely and the `finally` never runs.

In a Django application this surfaces as e.g. a streaming response where
the user-supplied async generator's `finally` block (releasing a DB
connection, closing a file) doesn't run promptly after a client
disconnect, and any exception raised by that `finally` is delivered to the
loop's exception handler rather than to the response code path that would
have handled it.

= Expected behaviour =

`aclose()` runs deterministically when the consuming `async for` exits
(normally, via exception, or via cancellation), just as a sync generator's
`close()` runs promptly (via CPython's reference counting) once a sync
`for` abandons it.

= Actual behaviour =

`aclose()` is deferred to asyncio's asyncgen finalizer hook, which
schedules it via `call_soon` once the generator becomes unreachable — so
the `finally` runs at some later loop tick (interleaved with unrelated
work) if the generator is collected promptly, or at
`loop.shutdown_asyncgens()` if it is still reachable when the loop tears
down. Resources held in the generator's `try`/`finally` leak for that
window, and may not be released at all if the generator outlives the event
loop.

= Affected call sites =

Found via `grep -rn ' async for ' django/` (the leading/trailing spaces
filter out the genexp at `django/views/generic/base.py:74` which iterates
over a variable named `is_async`). The only call site that attempts the
pattern is `asgi.py:325`, and even there only the outermost layer is
closed (see the note below the table); every other site iterates without
`aclosing` at all:

The "Source" column says whether the iterable is guaranteed to come from a
Django-internal `async def … yield …` (use plain `aclosing`) or may be a
caller-supplied object that only implements `__aiter__`/`__anext__` (use
`maybe_aclosing`). The inner `aiter(...)` is shown only when the iterable
expression is not already an async iterator; the two `_iterator` rows in
`response.py` skip it because `_iterator = aiter(value)` was already done
in `_set_streaming_content` (`response.py:505`).

Legend: each "Replacement" cell `<expr>` is applied as `async with <expr>
as it: async for item in it: ...` (or the comprehension equivalent shown
in "Proposed fix" below).

||='''File''' =||='''Line''' =||='''Context''' =||='''Source''' =||='''Replacement''' =||
||`django/contrib/auth/backends.py` ||143 ||`_aget_permissions()` set-comp over `perms` ||internal ||`aclosing(aiter(perms))` ||
||`django/shortcuts.py` ||156 ||`aget_list_or_404()` list-comp over filtered queryset ||internal ||`aclosing(aiter(queryset.filter(*args, **kwargs)))` ||
||`django/http/response.py` ||488 ||`StreamingHttpResponse.streaming_content` getter (`awrapper`) ||caller-supplied ||`maybe_aclosing(_iterator)` ||
||`django/http/response.py` ||524 ||`StreamingHttpResponse.__iter__` / `to_list` ||caller-supplied ||`maybe_aclosing(_iterator)` ||
||`django/http/response.py` ||532 ||`StreamingHttpResponse.__aiter__` ||internal ||`aclosing(aiter(self.streaming_content))` ||
||`django/db/models/query.py` ||595, 612 ||`QuerySet.aiterator()` (single hoisted wrap, see note) ||internal ||`aclosing(aiter(iterable))` ||
||`django/core/paginator.py` ||373 ||`AsyncPage.__aiter__` ||caller-supplied ||`maybe_aclosing(aiter(self.object_list))` ||
||`django/core/paginator.py` ||417 ||`AsyncPage.aget_object_list()` list-comp ||caller-supplied ||`maybe_aclosing(aiter(self.object_list))` ||
||`django/test/client.py` ||131 ||`aclosing_iterator_wrapper()` body (see note below) ||caller-supplied ||`maybe_aclosing(aiter(iterable))` ||
||`django/utils/text.py` ||400 ||`acompress_sequence()` ||caller-supplied ||`maybe_aclosing(aiter(sequence))` ||

Note the `StreamingHttpResponse` chain: even though `asgi.py` wraps
`aiter(response)` in `aclosing(...)`, the inner `async for part in
_iterator` in `response.py:488` still doesn't deterministically close
`_iterator` (the user-supplied async iterator), because closing the
wrapper doesn't propagate `aclose()` through a bare `async for`. With the
fix, the chain has three nested `aclosing` contexts (`asgi.py:325` →
`__aiter__` at `response.py:532` → `awrapper` at `response.py:488`); close
propagation runs outer-to-inner and `aclose()` is idempotent on a closed
asyncgen, so the nesting is correct, not redundant.

For `StreamingHttpResponse.__aiter__` (`response.py:532`),
`self.streaming_content` returns `map(...)` for the sync-iterable case.
`aiter(map(...))` raises `TypeError`, which is caught by the existing
`except TypeError` block immediately after the `async for` — so wrapping
in `aclosing(aiter(...))` does not break the sync fallback.
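That fallback is easy to confirm in isolation: `aiter()` rejects a sync-only iterable immediately, before any wrapping happens.

{{{
#!python
# aiter() requires an object with __aiter__; a sync map object has only
# __iter__, so the call raises TypeError at once -- which is exactly what
# the existing `except TypeError` fallback in `__aiter__` relies on.
try:
    aiter(map(bytes.title, [b"chunk"]))
except TypeError as exc:
    print(f"TypeError: {exc}")
}}}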

A subtle behaviour change in `StreamingHttpResponse`: each call to the
`streaming_content` getter creates a fresh `awrapper` async generator
closing over the same `self._iterator`, so two consumers race for one
underlying iterator. Today, if the first consumer exits early without
exhausting the stream, a second `streaming_content` call can resume
`_iterator` from wherever the first consumer happened to stop and observe
the remaining chunks. With the fix in place, the first consumer's exit
closes `_iterator`, and a second `streaming_content` call returns a new
`awrapper()` that iterates an already-closed asyncgen — which yields zero
items and exits normally (a closed asyncgen raises `StopAsyncIteration` on
`__anext__`, terminating the `async for` cleanly). Consuming
`streaming_content` more than once is already broken today (the second
consumer sees a non-deterministic suffix of the stream); the fix makes the
second consumer's stream deterministically empty. Worth a release-note
mention.
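The "deterministically empty" claim rests on a CPython guarantee worth pinning down: a closed async generator raises `StopAsyncIteration` from `__anext__`, so a later consumer's loop terminates immediately with zero items. A minimal check:

{{{
#!python
import asyncio

async def chunks():
    yield b"a"
    yield b"b"

async def main():
    agen = chunks()
    await agen.aclose()  # stands in for the first consumer's exit closing it
    # A second consumer iterating the already-closed asyncgen sees no
    # items: __anext__ raises StopAsyncIteration straight away.
    return [c async for c in agen]

print(asyncio.run(main()))  # []
}}}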

`django/test/client.py:131`'s helper is already named
`aclosing_iterator_wrapper`, but it does not actually use
`contextlib.aclosing` — its `try`/`finally` covers only `request_finished`
signalling, and it relies on the same deferred-aclose behaviour this
ticket is about. With the fix below the name becomes accurate (the wrapper
itself wraps the inner iteration in `maybe_aclosing`), so no rename is
needed.

In practice today both call sites of `aclosing_iterator_wrapper`
(`client.py:198,250`) pass `response.streaming_content`, which is always
Django's `awrapper()` asyncgen, so plain `aclosing` would suffice.
`maybe_aclosing` is used to keep the helper's contract permissive for
future callers.

`AsyncPage.aget_object_list()` (`paginator.py:408`) remains idempotent
under the fix: after the first call, `self.object_list` is rebound to a
`list` (which has no `__aiter__`), so the second call falls through the
`hasattr(self.object_list, "__aiter__")` check and never enters the `async
for` path. Closing the iterator on the first call therefore cannot affect
later calls.

Plain `aclosing` (rather than `maybe_aclosing`) is safe at the two
`query.py` rows because `iterable = self._iterable_class(self, ...)` is
always one of `ModelIterable`, `RawModelIterable`, `ValuesIterable`,
`ValuesListIterable`, `NamedValuesListIterable`, or
`FlatValuesListIterable` — all of which subclass `BaseIterable` and none
of which override `__aiter__`. The only `__aiter__` in the hierarchy is
`BaseIterable.__aiter__` (`query.py:84`), which returns
`self._async_generator()` — a real `async def … yield …` generator that
has `aclose`. So `aiter(iterable)` is guaranteed to return an async
generator, and `aclosing(aiter(iterable))` cannot fail at `__aexit__`.
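A stand-in class (hypothetical, mirroring only the shape described above, not Django's actual `BaseIterable`) shows why the guarantee holds: `aiter()` on such an object returns the async generator itself, which always carries `aclose`.

{{{
#!python
import asyncio
from contextlib import aclosing

class BaseIterableSketch:
    # Hypothetical stand-in: __aiter__ returns a real `async def ... yield`
    # generator, the same shape the description above attributes to
    # BaseIterable.__aiter__.
    def __aiter__(self):
        return self._async_generator()

    async def _async_generator(self):
        for row in (1, 2, 3):
            yield row

async def main():
    it = aiter(BaseIterableSketch())
    # Plain aclosing() is safe here: the iterator has aclose, so
    # __aexit__ cannot raise AttributeError.
    assert hasattr(it, "aclose")
    async with aclosing(it) as managed:
        return [row async for row in managed]

print(asyncio.run(main()))  # [1, 2, 3]
}}}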

The recommended fix at `query.py:595/612` is a single hoisted `async with`
around the whole `if self._prefetch_related_lookups:` block in
`QuerySet.aiterator()` (`query.py:579`), rather than two separate per-
branch wraps:

{{{
#!python
iterable = self._iterable_class(
    self, chunked_fetch=use_chunked_fetch, chunk_size=chunk_size
)
async with aclosing(aiter(iterable)) as iterator:
    if self._prefetch_related_lookups:
        results = []
        async for item in iterator:
            results.append(item)
            if len(results) >= chunk_size:
                await aprefetch_related_objects(
                    results, *self._prefetch_related_lookups
                )
                for result in results:
                    yield result
                results.clear()

        if results:
            await aprefetch_related_objects(
                results, *self._prefetch_related_lookups
            )
            for result in results:
                yield result
    else:
        async for item in iterator:
            yield item
}}}

This is equivalent to a per-branch form (one `async with` inside the `if`, another
inside the `else`) because the two `async for` loops are in mutually-
exclusive branches (so the shared `iterator` binding is iterated exactly
once), and `aclose()` fires at the same moment in either form (when the
outer `aiterator` async generator itself closes). The hoist is preferred
because it makes the lifetime of `iterable` syntactically obvious, avoids
duplicating the wrap, and keeps the resource manager next to where
`iterable` is constructed at line 589. The per-branch form is acceptable
as an alternative if the surrounding code later evolves so the two
branches iterate different sources.

`acompress_sequence` (`utils/text.py:392`) is itself an async generator
whose body is wrapped in `with GzipFile(...) as zfile:`. Adding `async
with maybe_aclosing(aiter(sequence)) as it:` inside that `with` nests
cleanly. There are two possible suspension points where `GeneratorExit`
can land: the bare `yield buf.read()` at `text.py:399` (header, before the
inner `async with` is entered) and the `yield data` at `text.py:405` (mid-
loop, inside both the `async with` and the `async for`). In the header
case, the inner `async with` has not been entered yet, so unwinding skips
it and the outer `with GzipFile(...)` runs `zfile.close()` directly. In
the mid-loop case, the `async for` unwinds, the inner `async with` runs
`aclose()` on `sequence`, and only then does the outer `with
GzipFile(...)` exit and call `zfile.close()` (which writes the gzip
trailer to `buf`). In either case, the trailing `yield buf.read()` at
`text.py:406` does not run during the unwind, so the trailer is generated
but not delivered — that is the correct behaviour for an early-exit
consumer, who can no longer receive bytes. Order in the mid-loop case is:
user generator's `finally` → gzip trailer written to `buf` → outer unwind.
No exception is raised and no double-close occurs.
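The mid-loop unwind order described above can be checked with a stripped-down model (hypothetical names throughout; a recording list stands in for the gzip trailer write and the user generator's cleanup):

{{{
#!python
import asyncio
from contextlib import aclosing

order = []

async def user_sequence():
    try:
        yield b"data"
    finally:
        order.append("user finally")  # stands in for the user generator's cleanup

class FakeGzip:
    # Stand-in for `with GzipFile(...)`: records when the outer sync
    # context exits (where the real code writes the gzip trailer).
    def __enter__(self):
        return self
    def __exit__(self, *exc):
        order.append("gzip close")

async def compressor(sequence):
    with FakeGzip():
        async with aclosing(aiter(sequence)) as it:
            async for data in it:
                yield data  # GeneratorExit lands here on early exit

async def main():
    async with aclosing(compressor(user_sequence())) as c:
        async for _ in c:
            break  # early-exit consumer

asyncio.run(main())
print(order)  # ['user finally', 'gzip close']
}}}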

= Proposed fix =

Add a public helper `maybe_aclosing` to `django.utils.asyncio` (next to
the existing `async_unsafe`). It handles both real async generators (which
have `aclose`) and plain `__aiter__`/`__anext__` adapters (which don't):

{{{
#!python
import contextlib

def maybe_aclosing(iterator):
    return (
        contextlib.aclosing(iterator)
        if hasattr(iterator, "aclose")
        else contextlib.nullcontext(iterator)
    )
}}}

The capability check is required, not stylistic:
`contextlib.aclosing(obj)` constructs successfully whether or not `obj`
has `aclose`, and only fails at `__aexit__` with `AttributeError`.
Wrapping unconditionally would therefore turn every clean exit through a
non-`aclose`-having iterator into an exception. `hasattr` is sufficient
for the in-tree call sites here because every iterator passed in is either
a real async generator (which exposes `aclose` as an instance attribute on
the asyncgen object) or an object that genuinely has no `aclose` attribute
at all — none rely on `__getattr__` to synthesise it.
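The "fails only at `__aexit__`" behaviour is observable directly (the `PlainAiter` class here is a hypothetical minimal `__aiter__`/`__anext__` adapter with no `aclose`):

{{{
#!python
import asyncio
import contextlib

class PlainAiter:
    # Hypothetical adapter: async-iterable protocol only, no aclose().
    def __aiter__(self):
        return self
    async def __anext__(self):
        raise StopAsyncIteration

async def main():
    # Construction succeeds even without aclose()...
    ctx = contextlib.aclosing(PlainAiter())
    try:
        async with ctx:
            pass
    except AttributeError as exc:
        # ...the failure only surfaces in __aexit__, on an otherwise
        # clean exit.
        return f"only at __aexit__: {exc}"

print(asyncio.run(main()))
}}}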

`maybe_aclosing` is exported as public API because third-party Django apps
that consume caller-supplied async iterables (custom views, middleware,
streaming adapters) face exactly the same hazard as the in-tree call sites
this ticket fixes. Hiding the helper would either push every third-party
author to re-derive the same `hasattr`/`aclosing`/`nullcontext` snippet,
or — more likely — leave the leak unfixed in their code. Public placement
carries the usual `docs/internals/release-process.txt` commitments
(deprecation shim across at least two feature releases for any
rename/remove, plus a `docs/ref/utils.txt` entry under
"django.utils.asyncio"); both are accepted as the cost of giving
downstream code a supported way to write the same fix. For
non-`aclose`-having iterators, `maybe_aclosing` is intrinsically a no-op —
the leak in those cases is a property of the caller's iterator type, not
something Django can fix from the outside.

Note that the public-vs-private decision is independent of every other
change in this ticket. If reviewers prefer to keep the helper internal,
renaming it to `_maybe_aclosing` (still in `django.utils.asyncio`) and
dropping the `docs/ref/utils.txt` entry is the only edit required — the
call-site rewrites, the `aiter(...)` wrap, the `query.py` hoist, and the
test sketches are all unchanged. The choice can therefore be made (or
revisited) in isolation during review without unwinding the rest of the
patch.

Each call site is then rewritten as:

{{{
#!python
async with maybe_aclosing(aiter(iterable)) as it:
    async for item in it:
        ...
}}}

…or, for comprehensions:

{{{
#!python
async with maybe_aclosing(aiter(iterable)) as it:
    result = [item async for item in it]
# `result` is used here, after the `async with` exits and `aclose()` has run.
}}}

The comprehension fully materialises the list before `__aexit__` runs, so
`aclose()` fires on a drained iterator; `result` is consumed after the
block (`with` does not introduce a scope).

A pure consuming comprehension like `[item async for item in it]` has no
user body that could `break`, `return`, or `raise`, and its only `await`
points are the `__anext__` calls themselves (where cancellation propagates
into the generator promptly). Such comprehensions don't leak today; the
wrap at the three list-comp sites (`auth/backends.py:143`,
`shortcuts.py:156`, `paginator.py:417`) is defensive — kept for
consistency with the `async for` sites, and to survive future edits that
convert a comprehension into a loop with a `break`-able body.

The `aiter(...)` is required: if `iterable` is a `QuerySet` (or any object
whose `__aiter__` returns a fresh async generator each call),
`aclosing(iterable)` would fail at exit (`QuerySet` has no `aclose`), and
`as it` would re-bind to the original `iterable` so the `async for` would
create a new, unmanaged generator. Extracting the iterator with
`aiter(...)` guarantees the `async with` and the `async for` operate on
the same async-generator object.
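A minimal model of that failure mode (the `FreshAiter` class is hypothetical, playing the role of a `QuerySet`-like object whose `__aiter__` mints a new generator per call):

{{{
#!python
import asyncio
from contextlib import aclosing

class FreshAiter:
    # Hypothetical QuerySet-like object: no aclose(), and each __aiter__
    # call returns a brand-new async generator.
    def __aiter__(self):
        return self._gen()
    async def _gen(self):
        yield 1
        yield 2

async def wrong():
    # aclosing(FreshAiter()) binds `it` to the FreshAiter itself; the
    # async for then iterates a new, unmanaged generator, and __aexit__
    # raises AttributeError because FreshAiter has no aclose().
    try:
        async with aclosing(FreshAiter()) as it:
            async for _ in it:
                break
    except AttributeError:
        return "broken"

async def right():
    # aiter() extracts the iterator first, so the async with and the
    # async for manage the same async-generator object.
    async with aclosing(aiter(FreshAiter())) as it:
        async for _ in it:
            break
    return "closed deterministically"

print(asyncio.run(wrong()))
print(asyncio.run(right()))
}}}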

`aiter()` and `contextlib.aclosing` are Python 3.10 builtins; Django main
targets Python 3.12+, so no compatibility shim is needed.

= Tests =

A regression test for the `StreamingHttpResponse` chain looks roughly
like:

{{{
#!python
async def test_streaming_response_closes_user_generator_promptly(self):
    finally_ran = False

    async def body():
        nonlocal finally_ran
        try:
            yield b"chunk"
        finally:
            finally_ran = True

    response = StreamingHttpResponse(body())
    async with aclosing(aiter(response)) as content:
        async for _ in content:
            break
    # `response` is held by this frame, which keeps `response._iterator`
    # alive, which keeps the `body()` async generator alive — so GC cannot
    # opportunistically run body's `finally` between the `break` and the
    # assertion. The outer `aclosing(aiter(response))` mirrors what
    # asgi.py:325 does. It propagates `aclose()` into `__aiter__`, which
    # under the fix propagates through `awrapper` and into `body()`,
    # running its `finally` synchronously. Without the fix, finally_ran is
    # False here and only becomes True after loop.shutdown_asyncgens().
    self.assertTrue(finally_ran)
}}}

Equivalent shapes are needed for `acompress_sequence`,
`AsyncPage.__aiter__`, `aclosing_iterator_wrapper`, and the `_aiterator`
prefetch/no-prefetch branches (a queryset whose `_iterable_class` yields
from a `try`/`finally` is the natural test fixture). In each case the test
consumer must wrap the iteration in `aclosing(aiter(...))`; bare `async
for` + `break` does not propagate `aclose()` and would leave the test
failing for the wrong reason.

= Backwards incompatibility / release note =

This is a coordinated edit across several modules but each call site is
mechanical. There is no observable behaviour change for callers whose
async generators have correct, prompt-completing `aclose()` semantics.
Generators that previously relied on cleanup being deferred (or whose
`aclose` blocks, raises, or hangs) will see that surfaced sooner — this is
the intended fix, but is worth a release-note mention.

Two specific surfacing changes worth calling out in the release notes:

* Exceptions raised by a user generator's `aclose()` (or by `__aexit__`
of an `async with` inside it) now propagate out of the consuming `async
for` site at the moment the consumer exits, rather than being absorbed by
asyncio's asyncgen finalizer hook at loop shutdown. Previously such errors
were typically logged via the loop's exception handler and otherwise
swallowed; now they reach the caller.
* Consuming `StreamingHttpResponse.streaming_content` more than once
(already broken today: the second consumer sees a non-deterministic suffix
of `_iterator`) becomes deterministically empty: the first consumer's exit
closes `_iterator`, and any later `streaming_content` access returns a new
`awrapper()` that iterates a closed asyncgen and yields zero items. No
exception is raised in either form — the failure is silent both before and
after — but the post-fix behaviour is at least predictable. The supported
pattern is one consumer per `streaming_content` access; the release note
should state this explicitly so that callers who were unknowingly relying
on consuming the leftover suffix can update their code.
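The first point can be demonstrated in miniature: with `aclosing` in place, an exception raised by the generator's cleanup reaches the consumer's `except`, instead of the loop's exception handler at some later tick.

{{{
#!python
import asyncio
from contextlib import aclosing

async def gen():
    try:
        yield 1
    finally:
        raise RuntimeError("cleanup failed")

async def consume():
    try:
        async with aclosing(gen()) as it:
            async for _ in it:
                break
    except RuntimeError as exc:
        # Pre-fix, this error would have been absorbed by the loop's
        # exception handler; post-fix, the caller sees it here.
        return str(exc)

print(asyncio.run(consume()))  # cleanup failed
}}}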

= Additional context =

* Background: PEP 533 (Deferred, but the rationale is the canonical
description of this hazard).
* Related Python docs:
[https://docs.python.org/3/library/contextlib.html#contextlib.aclosing
contextlib.aclosing].
* Existing in-tree precedent: `django/core/handlers/asgi.py:325`.

= AI assistance disclosure =

Per Django's
[https://docs.djangoproject.com/en/dev/internals/contributing/writing-code/submitting-patches/#note-for-ai-tools
AI-Assisted Contributions policy]:

* '''Tool''': Anthropic Claude (model `claude-opus-4-7`), via Claude Code
CLI.
* '''Usage''': drafted the ticket text (description, call-site table,
proposed fix, test sketch). All file paths, line numbers, method names,
and behavioural claims were re-verified against the current `main`
checkout (commit `526b548cfb`) by reading the cited source and running
`grep -rn 'async for\|aiter('` over `django/`.
* '''Verified independently''': call-site list is exhaustive against
`grep` output; `BaseIterable.__aiter__` returns a real async generator (so
`aclosing` cannot fail at exit); `_set_streaming_content` already calls
`aiter(value)` at `response.py:505` (so the two `_iterator` rows correctly
omit a second `aiter`); `aiter(map(...))` `TypeError` is caught by the
existing `except TypeError` block in `__aiter__`.
* '''Items reviewers should still sanity-check''':
* `maybe_aclosing` is proposed as public API in `django.utils.asyncio`
(alongside `async_unsafe`), with a `docs/ref/utils.txt` entry. Reviewers
who prefer keeping it private can downgrade to a leading-underscore symbol
in the same module without any other change to this ticket; the call-site
fixes are unaffected.
* The hoisted-vs-per-branch choice in `QuerySet.aiterator()` is a style
call; both forms are behaviourally equivalent (justified inline).
* The `StreamingHttpResponse` multi-consumer behaviour change (second
consumer goes from non-deterministic suffix of `_iterator` to
deterministically empty stream — silent in both cases) is flagged for a
release-note line — confirm whether that warrants more than a note.
* No fictitious APIs are referenced. `contextlib.aclosing`,
`contextlib.nullcontext`, `aiter`, and `aclose` are all standard-library;
the only new symbol is the public `maybe_aclosing` helper proposed in this
ticket.
--
Ticket URL: <https://code.djangoproject.com/ticket/37066>
Django <https://code.djangoproject.com/>
The Web framework for perfectionists with deadlines.

Django

Apr 25, 2026, 10:27:58 AM
to django-...@googlegroups.com
#37066: Async generator `aclose()` deferred to loop shutdown at `async for` call
sites that don't use `contextlib.aclosing`
-------------------------------------+-------------------------------------
     Reporter:  Thomas Grainger      |         Owner:  (none)
         Type:  Cleanup/optimization |        Status:  new
    Component:  Uncategorized        |       Version:  6.0
     Severity:  Normal               |    Resolution:
     Keywords:                       |  Triage Stage:  Unreviewed
    Has patch:  1                    |  Needs documentation:  0
  Needs tests:  1                    |  Patch needs improvement:  0
Easy pickings:  0                    |         UI/UX:  0
-------------------------------------+-------------------------------------
Changes (by Thomas Grainger):

* has_patch: 0 => 1
* needs_tests: 0 => 1

--
Ticket URL: <https://code.djangoproject.com/ticket/37066#comment:1>

Django

Apr 28, 2026, 10:56:19 AM
to django-...@googlegroups.com
#37066: Async generator `aclose()` deferred to loop shutdown at `async for` call
sites that don't use `contextlib.aclosing`
--------------------------------------+------------------------------------
Reporter: Thomas Grainger | Owner: (none)
Type: Cleanup/optimization | Status: new
Component: Uncategorized | Version: 6.0
Severity: Normal | Resolution:
Keywords: | Triage Stage: Accepted
Has patch: 1 | Needs documentation: 0
Needs tests: 1 | Patch needs improvement: 1
Easy pickings: 0 | UI/UX: 0
--------------------------------------+------------------------------------
Changes (by Simon Charette):

* needs_better_patch: 0 => 1
* stage: Unreviewed => Accepted

Comment:

Note that there are also known problems with some sync iterators holding
resources until garbage collection such as `QuerySet.iterator`
[https://code.djangoproject.com/ticket/37000#comment:5 when using server-
side cursors] which might also warrant some documentation adjustments.

I think it's worth treating these as separate issues but I figured I'd
point out that it's not only an async/IO problem.
--
Ticket URL: <https://code.djangoproject.com/ticket/37066#comment:2>