As mentioned, we'll be sending out a monthly newsletter describing the state of the repo, top relevant issues, and the road-map going forward. A big reason for this is to boost community input, so please comment on the information below if you have any thoughts!
Summary:
Haven't sent a newsletter in a couple of months, so I'll do my best to summarize what's been going on. We've released a 0.9 version of Addons and will soon release 0.10, pinned to TF 2.2 with support for Python 3.8. We've completed our migration to pytest and are actively working on distributed testing. Another high priority is removing duplication between what's available in tensorflow/models and tensorflow/addons: we've begun meetings with the model garden team and hope to improve user experience across the ecosystem. Lastly, we're in ongoing discussions about how to best support TPU users by ensuring as many addons as possible are capable of XLA compilation.
User adoption has dramatically increased, and with that we're seeing many new contributors. Wanted to extend a thank you to all the continued and new contributors! Also wanted to remind everyone in our SIG that PR reviews are greatly appreciated and are vital to keeping our repository sustainable. Anyone is welcome to review, and we encourage multiple reviews per PR.
Daily downloads for the tensorflow-addons (TFA) package:
Below are some of the landmark issues that we're aiming to address in the coming month(s):
- Remove custom-op activations. We've determined that the maintenance burden and incompatibilities have made our custom-op activations obsolete. This issue describes the possible steps we'll take to deprecate them; we're looking forward to fully compatible activations in the near future!
- Migrate GELU to TensorFlow core. This issue has lagged for too long, but the good news is we've discussed a plan to formalize up-streaming to tf-core and will be submitting the RFC for GELU this month. Keep an eye out for an RFC template on tensorflow/community which will describe the process.
- Enable testing with distribution strategies. Since our pytest refactor we now have the capability to cleanly test distribution strategies across several virtual workers. This is especially important for Addons which, like some of our optimizers, are likely to have issues under distribution strategies if they are not thoroughly tested. Thank you Gabriel de Marmiesse for all your work on this!
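As a rough sketch of what a virtual-worker test can look like (the helper and test names here are illustrative, not taken from the actual Addons test suite), splitting a single physical CPU into two logical devices lets a MirroredStrategy exercise multi-replica code paths on one machine:

```python
import tensorflow as tf

# Split the single physical CPU into two logical devices. This must run
# before TensorFlow initializes its devices (i.e. early in the process).
physical = tf.config.list_physical_devices("CPU")
tf.config.set_logical_device_configuration(
    physical[0],
    [tf.config.LogicalDeviceConfiguration(),
     tf.config.LogicalDeviceConfiguration()])

strategy = tf.distribute.MirroredStrategy(["CPU:0", "CPU:1"])

def test_runs_on_two_replicas():
    # Each replica returns its replica id; the cross-replica SUM
    # should therefore see contributions from both virtual workers.
    @tf.function
    def step():
        def replica_fn():
            ctx = tf.distribute.get_replica_context()
            return tf.cast(ctx.replica_id_in_sync_group, tf.float32)
        per_replica = strategy.run(replica_fn)
        return strategy.reduce(
            tf.distribute.ReduceOp.SUM, per_replica, axis=None)
    assert strategy.num_replicas_in_sync == 2
    assert step().numpy() == 1.0  # replica ids 0 + 1
```

A real test would run an addon (e.g. an optimizer) inside `strategy.scope()` instead of the toy replica function, but the device setup is the same.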
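To illustrate why composite-op activations help the custom-op deprecation and the TPU/XLA effort above, here is a hedged sketch of an activation built purely from core TF ops (not the actual Addons implementation). Because it involves no custom C++ kernel, XLA can JIT-compile it; the flag is `jit_compile` on recent TF releases (it was `experimental_compile` around TF 2.2):

```python
import tensorflow as tf

# Hypothetical composite-ops activation: mish(x) = x * tanh(softplus(x)).
# Only core TF ops are used, so XLA can compile the whole function.
@tf.function(jit_compile=True)
def mish(x):
    return x * tf.math.tanh(tf.math.softplus(x))

print(mish(tf.constant([0.0, 1.0])).numpy())
```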
Maintainership:
As was originally mentioned in our sustainability RFC, we want to send out the status of our maintainership so submodules and subpackages can more readily be adopted. To become a maintainer with write access on tensorflow/addons, we ask that you sign up to support some submodules first; then we can grant access as defined on our landing page. If you would like to be a submodule maintainer, please feel free to submit a PR adding yourself to the CODEOWNERS file.
Below we list submodules without ownership... though we also encourage becoming a submodule maintainer for an already covered submodule!
- /tensorflow_addons/image/interpolate_spline*.py
- /tensorflow_addons/image/sparse_image_warp*.py
- /tensorflow_addons/losses/metric_learning*.py
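As a sketch, claiming one of the submodules above means adding a line like the following to the CODEOWNERS file (the GitHub handle is a placeholder):

```
/tensorflow_addons/losses/metric_learning*.py @your-github-handle
```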
Best Regards,