Wouldn't it be great if we could take solutions others have developed, mix and match components, and use their scope as our search space?
With the volume of new research arriving every month, each paper claiming superiority, it's impossible to keep up to date and retain the "best on the market" title.
Given a standard way of sharing architectures, and a standard way of searching for them (e.g., custom PyPI classifier(s) and/or a zoo.tensorflow.org), we would have an expanding search space (optimisers, losses, transfer-learning models, etc.) with ensemble glue and linkage to other solutions (like TFCO). That space could then be intelligently re-searched and automatically fine-tuned with genetic, Bayesian, Ray Tune, NN-based, and other approaches; then we really could hold that title.
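To make the idea concrete, here's a toy sketch of what searching over such a shared-component space could look like. Everything here is hypothetical: the registry, component names, and scoring function are made up for illustration, and a real system would discover components via PyPI classifiers or a model zoo and score candidates by actually fine-tuning them.

```python
import random

# Hypothetical registry of shared components; in practice these would be
# discovered via e.g. PyPI classifiers or a community model zoo.
SEARCH_SPACE = {
    "optimiser": ["sgd", "adam", "lamb"],
    "loss": ["cross_entropy", "focal"],
    "backbone": ["resnet50", "efficientnet_b0", "mobilenet_v3"],
}

def sample_config(rng):
    """Draw one candidate configuration from the search space."""
    return {key: rng.choice(options) for key, options in SEARCH_SPACE.items()}

def score(config):
    """Stand-in for training + validation: a real system would fine-tune
    the candidate and return its validation metric."""
    bonus = {"adam": 0.2, "focal": 0.1, "efficientnet_b0": 0.3}
    return sum(bonus.get(choice, 0.0) for choice in config.values())

def random_search(trials=20, seed=0):
    """Sample candidates and keep the best-scoring one."""
    rng = random.Random(seed)
    candidates = [sample_config(rng) for _ in range(trials)]
    return max(candidates, key=score)

best = random_search()
print(best)
```

A genetic or Bayesian searcher would slot in where `random_search` is, re-using the same registry and scoring interface, which is exactly why a shared, standard search space would pay off.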
Here's the issue, but it's gotten no activity:
Would be great to get your thoughts here and there.
PS: Was great on the Addons videoconference the other day. Again, feel free to @ tag me for any of the topics brought up =)