Hi,
> We have only a few issues marked for 1.3
> <https://github.com/crawler-commons/crawler-commons/milestone/9>
> and quite a few dependency updates
I've added all open dependency upgrades to the 1.3 milestone:
there are now six open issues or PRs, or four if we delay #351 and #375
until a later release.
> Shall we release 1.3 soon?
+1
What about delaying the following issues / PRs?
#375 - it's still not August 2022, and (following the discussion in the
issue) it would first require implementing a validating mode of the
sitemap parser.
#351 - yes, this PR is approved by me. However, since it's a significant
change to how robots.txt rules are handled, we could move it to a 1.4
release and make "IETF RFC draft compliance" a major feature. This should
include at least #192, #351, #360, and #362.
As of today, the draft has reached version 12 [1], so it may still change
in minor details. Nevertheless, #192 and #351 would make the
crawler-commons robots.txt parser behave more like the way major search
engines handle robots.txt rules.
Any thoughts? I plan to spend some time during summer to bring this
feature forward.
Best,
Sebastian
[1]
https://datatracker.ietf.org/doc/draft-koster-rep/history/
On 7/7/22 10:07, Richard Zowalla wrote: