How to add an "Attention Mechanism" in Keras


Ajwa aslam

Mar 22, 2021, 4:10:32 AM
to Keras-users
Hello Everyone,

I want to add an attention layer to my Conv-LSTM Keras model for text classification.
How can I do this? Can anybody please help me out in this regard?

Thank you!

Sayak Paul

Mar 22, 2021, 4:17:51 AM
to Ajwa aslam, Keras-users
You can refer to this example: https://www.tensorflow.org/tutorials/text/image_captioning

Additionally, there are two core attention layers built into TensorFlow: tf.keras.layers.Attention (dot-product, Luong-style) and tf.keras.layers.AdditiveAttention (Bahdanau-style).
Both come with minimal code examples to get you started. There's also a MultiHeadAttention layer.
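
For instance, a rough sketch along these lines shows how the built-in Attention layer can sit on top of a Conv1D + LSTM text classifier (the vocabulary size, sequence length, and layer sizes below are just placeholders, not anything specific to your model):

import tensorflow as tf

# Hypothetical sizes -- placeholders, not values from this thread.
VOCAB_SIZE = 20000
MAX_LEN = 200
NUM_CLASSES = 3

inputs = tf.keras.Input(shape=(MAX_LEN,), dtype="int32")
x = tf.keras.layers.Embedding(VOCAB_SIZE, 128)(inputs)
x = tf.keras.layers.Conv1D(64, kernel_size=5, padding="same", activation="relu")(x)
x = tf.keras.layers.LSTM(64, return_sequences=True)(x)  # keep per-timestep outputs for attention

# Self-attention: the LSTM outputs act as both query and value.
context = tf.keras.layers.Attention()([x, x])            # dot-product (Luong-style) attention
# tf.keras.layers.AdditiveAttention() is a drop-in swap for Bahdanau-style scoring.
x = tf.keras.layers.GlobalAveragePooling1D()(context)    # collapse the attended sequence
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

The key point is return_sequences=True on the LSTM, so the attention layer has a full sequence of timestep vectors to attend over before pooling down to a single document vector.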

Sayak Paul | sayak.dev




Ajwa aslam

Mar 22, 2021, 4:50:03 AM
to Keras-users
Thank you, I'll check them out.

Sai Durga Kamesh Kota

May 16, 2021, 10:31:38 AM
to Keras-users
Hi,

If you want to use a self-attention layer, you can use the keras-self-attention package ( https://pypi.org/project/keras-self-attention/ ). It is also very easy to get started with; a minimal sketch is below.
If you have any further doubts, feel free to discuss them.
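
For example, based on the usage shown in the package README (the vocabulary size, layer sizes, and class count are placeholders, and I am assuming the TF_KERAS flag to make the package build on tf.keras):

import os
os.environ["TF_KERAS"] = "1"  # per the package docs, use tf.keras instead of standalone keras

import tensorflow as tf
from keras_self_attention import SeqSelfAttention

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=20000, output_dim=128),  # assumed vocabulary size
    tf.keras.layers.Conv1D(64, 5, padding="same", activation="relu"),
    tf.keras.layers.LSTM(64, return_sequences=True),             # attention needs the full sequence
    SeqSelfAttention(attention_activation="sigmoid"),            # additive self-attention over timesteps
    tf.keras.layers.GlobalMaxPooling1D(),                        # reduce to one vector per document
    tf.keras.layers.Dense(3, activation="softmax"),              # assumed number of classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

SeqSelfAttention returns a sequence of the same length as its input, so a pooling (or Flatten) layer is still needed before the final Dense classifier.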

With Regards,
Sai Durga Kamesh Kota
Data Science Intern
Sony Research India