Would you possibly be interested in accepting a donation to the Keras core of hypercomplex NN code (Keras + TensorFlow and Keras + PyTorch) that allows creating Dense and Convolutional (1D, 2D, 3D) layers based on hypercomplex algebras, as well as more general algebras?
There is some code for quaternions and other selected hypercomplex algebras scattered around the web; however, there is no library for general (hypercomplex) algebras. We could extend Keras with this functionality.
All the best,
Radek
Dear Francois,
thank you for the interesting questions. Here are the answers:
1. What are the use cases for this?
Several recent papers have shown that neural networks based on hypercomplex algebras achieve the same performance as standard real-valued networks (the ones currently in Keras) while usually having far fewer trainable parameters. Thanks to this they are easier to train and run faster. For instance, a hypercomplex 2D convolutional layer outperforms a standard (real-valued) 2D convolutional layer in terms of trainable parameters. For images this was shown in https://ui.adsabs.harvard.edu/abs/2021arXiv211206685A/abstract using 4D hypercomplex algebras, with RGBA data encoded as an element of a simple 4D algebra. We also showed it for time series analysis in our recent manuscript: https://arxiv.org/abs/2401.04632
There are many more recent results of this kind. At first sight this contradicts intuition, since a hypercomplex algebra introduces additional constraints on the trainable parameters and should therefore be less optimal than classical layers; experiments, however, show otherwise.
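To make the parameter saving concrete, here is a minimal NumPy sketch (purely illustrative, not the KHNN implementation) of a quaternion-valued dense layer. Each weight is a quaternion with 4 real parameters, but through the Hamilton product it acts as a shared 4x4 real block, so the equivalent real weight matrix has 4 times more entries than there are trainable parameters:

```python
import numpy as np

def hamilton_block(a, b, c, d):
    """Real 4x4 block implementing left multiplication by the quaternion a+bi+cj+dk."""
    return np.array([[a, -b, -c, -d],
                     [b,  a, -d,  c],
                     [c,  d,  a, -b],
                     [d, -c,  b,  a]])

rng = np.random.default_rng(0)
n_in, n_out = 8, 16                      # quaternion-valued units
# 4 real parameters per (output, input) quaternion pair
params = rng.normal(size=(n_out, n_in, 4))

# assemble the equivalent real weight matrix (4*n_out x 4*n_in) from shared blocks
W = np.block([[hamilton_block(*params[o, i]) for i in range(n_in)]
              for o in range(n_out)])

x = rng.normal(size=4 * n_in)
y = W @ x                                # acts like a real dense layer of shape (64, 32)

print(params.size)                       # 512 trainable parameters
print(W.size)                            # 2048 entries in the equivalent real matrix
```

A real-valued dense layer of the same shape would need all 2048 weights to be independent; the quaternion constraint ties them together in groups of four, which is exactly the reduction discussed above.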
Some implementations for specific hypercomplex algebras exist as independent packages, e.g., for quaternions. Our approach is quite simple and allows defining dense and convolutional layers for all hypercomplex and even more general algebras. The implementation can be a starting point for rewriting all standard layers in a hypercomplex way, as we explain in our recent theoretical manuscript (see below).
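The idea of handling a general algebra can be sketched as follows (again an illustrative sketch under my own naming, not the KHNN API): the algebra is specified by its structure constants, and left multiplication by a weight element becomes a real matrix that a layer can apply to its input. Here complex numbers serve as the simplest example:

```python
import numpy as np

# structure constants C[i, j, k]: e_i * e_j = sum_k C[i, j, k] e_k
# example: the complex numbers, with basis (1, i)
C = np.zeros((2, 2, 2))
C[0, 0, 0] = 1   # 1*1 = 1
C[0, 1, 1] = 1   # 1*i = i
C[1, 0, 1] = 1   # i*1 = i
C[1, 1, 0] = -1  # i*i = -1

def left_mult_matrix(w, C):
    """Real matrix of the map x -> w*x in the algebra defined by C."""
    # L[k, j] = sum_i w[i] * C[i, j, k]
    return np.einsum('i,ijk->kj', w, C)

w = np.array([3.0, 4.0])             # 3 + 4i  (a "weight")
x = np.array([1.0, 2.0])             # 1 + 2i  (an "input")
print(left_mult_matrix(w, C) @ x)    # (3+4i)(1+2i) = -5 + 10i -> [-5., 10.]
```

Swapping in the structure constants of quaternions, or of any other finite-dimensional algebra, changes the layer's arithmetic without changing the surrounding code, which is what makes a single library for general algebras feasible.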
Our motivation is simply to add the library to a popular NN framework (such as Keras) as a summary of our research, opening this area to practical applications by anyone who can use Keras. The field is starting to mature, so standardization is needed.
2. Do you have any papers about it?
In fact, we have recently submitted two manuscripts:
a) A theoretical one, explaining the mathematical part of the implementation:
https://arxiv.org/abs/2407.00449
b) A practical one, with a description of the library:
https://arxiv.org/abs/2407.00452
c) Here is the library we want to include in Keras (possibly with adjustments):
https://github.com/rkycia/KHNN
We wanted to name this package 'HyperKeras'; however, we wanted to publish it quickly, so its working name is KHNN. Our second thought was to ask the community to include it in Keras – that is the motivation for this thread.
If you have more questions, please do not hesitate to ask.
All the best,
Radek