Quarkus 3.15.3 LTS release preparation + branch freeze


Guillaume Smet

Jan 7, 2025, 8:22:44 AM
to Quarkus Platform Coordination
Hi,

We are working on a new process to release LTS maintenance micros on a regular cadence (every 2 months).

It involves some changes to how we release these, as we need to make sure we don't break the Platform members with a micro.

We recently set up Ecosystem CI for Quarkus Platform 3.15 (https://github.com/quarkusio/quarkus/issues/45119) - we encourage you to subscribe to the issue.

This new process includes some buffer between the core release and the Platform release (similar to what we do for .0 releases) so that we can solve potential compatibility problems. They should be extremely rare, but given we want a regular cadence, we have to make room for them.

The new process is explained here: https://github.com/quarkusio/quarkus/wiki/3.15-LTS-Release-Planning together with the dates for 3.15.3 (which are set in stone) and the dates for the next releases, which we will tune depending on how this new release process goes.

Today is the branch freeze for 3.15.3, meaning we consider the payload fully merged to the 3.15 branches, both on the Core side and the Platform side.

Now is the time to make sure we don't have any compatibility issues - we should be fine since Ecosystem CI runs successfully for 3.15, but as this is a new process, let's double-check.

The schedule for 3.15.3 LTS is as follows:

Date          Action
Dec 9, 2024   Backport meetings begin
Jan 7, 2025   Upstream branches freeze (Platform can begin compatibility testing)
Jan 14, 2025  Upstream core release
Jan 21, 2025  Upstream platform release

The buffers are extremely large for now; we are aiming to reduce them once we are more used to the process.

Another change is that Jan Martiska (you might know him from his work on SmallRye GraphQL, Quarkus LangChain4j...) will handle this release, as we are evolving our process to scale our releases even further.

Let us know if you have any questions or comments.

Thanks.

--
Guillaume