Combining Annif and LLMs


Enrico Laloli

Jul 16, 2024, 7:06:16 AM7/16/24
to Annif Users
Hello,

Some of you have evidently been experimenting with large language models (LLMs), but they fundamentally lack the RDF/SKOS support that Annif has out of the box.
Do the developers of Annif see any opportunity to link an LLM with Annif, that is, to combine both for better associative matching?

Enrico

anna.k...@googlemail.com

Jul 16, 2024, 7:28:53 AM7/16/24
to Annif Users
We have been working on a way to integrate X-Transformer into Annif -- would that help you?
Best
Anna

Jim Hahn

Jul 17, 2024, 3:16:02 AM7/17/24
to anna.k...@googlemail.com, Annif Users
This is a really interesting thread, I'm looking into that transformers repo.

I have found that some BERT models for text machine learning (named entity recognition) are better in some cases than GPT-like services; the GLiNER tool in particular has some impressive NER capabilities -- https://huggingface.co/gliner-community/gliner_large-v2.5

I use GLiNER in part of an Apache Airflow project to steer outputs from Annif into the facets of the vocabulary we use at my library, the FAST subject vocabulary. You can read about that approach, which combines Annif, LLMs and named entity recognition along with web search, here: https://repository.upenn.edu/handle/20.500.14332/60308 -- page 14 discusses aligning the NER outputs to FAST schema terms for our Annif services.
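The steering step described above can be sketched in plain Python. This is not Jim's actual pipeline: the facet mapping, the suggestion format, and the entity format are illustrative assumptions; in a real setup the `entities` list would come from a NER call such as GLiNER's `predict_entities`.

```python
# Sketch: filter Annif subject suggestions using NER entity labels,
# keeping topical terms plus named-entity terms whose FAST facet is
# supported by an entity actually found in the text.

# Hypothetical mapping from NER entity types to FAST facets.
NER_LABEL_TO_FAST_FACET = {
    "person": "Personal Name",
    "organization": "Corporate Name",
    "location": "Geographic",
    "event": "Event",
}

def steer_suggestions(annif_suggestions, ner_entities):
    """Keep suggestions whose facet matches an entity type found in
    the text; topical subjects always pass through."""
    found_facets = {
        NER_LABEL_TO_FAST_FACET[e["label"]]
        for e in ner_entities
        if e["label"] in NER_LABEL_TO_FAST_FACET
    }
    return [
        s for s in annif_suggestions
        if s["facet"] == "Topical" or s["facet"] in found_facets
    ]

suggestions = [
    {"term": "Machine learning", "facet": "Topical", "score": 0.8},
    {"term": "Philadelphia (Pa.)", "facet": "Geographic", "score": 0.6},
    {"term": "Smith, John", "facet": "Personal Name", "score": 0.5},
]
entities = [{"text": "Philadelphia", "label": "location"}]  # e.g. from a NER model
kept = steer_suggestions(suggestions, entities)
print([s["term"] for s in kept])  # ['Machine learning', 'Philadelphia (Pa.)']
```

The personal-name suggestion is dropped because no person entity was detected in the text, while the topical term passes unconditionally.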

-Jim


Enrico Laloli

Jul 17, 2024, 6:44:21 AM7/17/24
to Annif Users
Thanks Anna and Jim. I am not much of an expert in LLMs, but these approaches look interesting. Jim, good paper/presentation and references there.
I have mainly been using lexical algorithms combined with the associative ones in Annif. I am interested in subject indexing, with a vocabulary of course, for articles with constantly new subjects. Training algorithms for such articles would always be one step behind, which often leads to false subject assignments, and lexical approaches can never come up with semantically related subjects. One first step I have taken is testing the relevance of subjects assigned by Annif by comparing each one to the rest of the assigned terms and to the vocabulary context of the subject term. This would greatly benefit from having that context set of subjects checked by an LLM. The same goes for suggesting semantically related subjects that do not appear in the text.
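The relevance check described above can be sketched as a small graph walk over the vocabulary: a suggested subject is plausible if many of the other assigned terms sit within a few broader/related hops of it. The toy vocabulary and the two-hop radius are assumptions for illustration only.

```python
# Sketch: score each Annif-assigned subject by how much of the rest of
# the assigned set lies in its vocabulary neighbourhood (SKOS-style
# broader links, followed in both directions, up to two hops).

VOCAB = {  # subject -> set of broader subjects (invented toy vocabulary)
    "neural networks": {"machine learning"},
    "machine learning": {"artificial intelligence"},
    "artificial intelligence": {"computer science"},
    "gardening": {"horticulture"},
}

def context_score(subject, other_subjects):
    """Fraction of the other assigned subjects reachable from `subject`
    within two hops of broader links, in either direction."""
    def neighbours(s):
        out = set(VOCAB.get(s, set()))
        out |= {t for t, broader in VOCAB.items() if s in broader}
        return out
    hood = neighbours(subject)
    for n in list(hood):
        hood |= neighbours(n)
    hood.add(subject)
    if not other_subjects:
        return 0.0
    return len(hood & set(other_subjects)) / len(other_subjects)

assigned = ["neural networks", "machine learning",
            "artificial intelligence", "gardening"]
for subj in assigned:
    others = [s for s in assigned if s != subj]
    print(subj, round(context_score(subj, others), 2))
```

Here "gardening" scores 0.0 because it shares no vocabulary context with the other three subjects, flagging it as a likely false assignment; an LLM could then be asked to adjudicate exactly those outliers.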

Enrico


juho.i...@helsinki.fi

Jul 17, 2024, 7:11:02 AM7/17/24
to Annif Users
We don't have any concrete plans for using language models in Annif, but it certainly is an interesting topic.

As Enrico pointed out, using LLMs for subject indexing requires them to be aware of the vocabulary, and when the number of vocabulary terms is large, this is a big challenge.

Similarly to Jim's filtering approach (thanks for sharing your project and the info about GLiNER!), instead of doing subject indexing from scratch, I have tried just refining the scores of Annif suggestions with OpenAI's GPT models: basically getting 100 suggestions for a text with the NN ensemble, then putting them into a prompt together with the text and an instruction to "score the keywords with a value between 0.0 and 1.0". However, this has not given the quality gain I hoped for, partly because the LLM quite often does not score all the given subjects, or hallucinates new ones (I guess llamafile with its grammar restriction could help here).
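The two failure modes mentioned above can be handled defensively when merging the LLM's answer back into the suggestion list. This is only a sketch of that merging step, not the actual experiment: the prompt wording and the "keyword: score" response format are assumptions, and any LLM client would slot in where `llm_response` comes from.

```python
# Sketch: rescore Annif suggestions with an LLM, dropping hallucinated
# subjects and keeping Annif's original score where the LLM omitted one.

def build_prompt(text, suggestions):
    keywords = "\n".join(term for term, _ in suggestions)
    return ("Score each keyword for the text below with a value between "
            "0.0 and 1.0, one 'keyword: score' per line.\n\n"
            f"Text:\n{text}\n\nKeywords:\n{keywords}")

def merge_scores(suggestions, llm_response):
    """Parse 'keyword: score' lines from the LLM response. Keywords not
    in the original suggestions (hallucinations) are ignored; suggestions
    the LLM failed to score fall back to Annif's score."""
    llm_scores = {}
    for line in llm_response.splitlines():
        if ":" in line:
            term, _, value = line.rpartition(":")
            try:
                llm_scores[term.strip()] = float(value)
            except ValueError:
                pass  # unparsable score, ignore this line
    return [(term, llm_scores.get(term, annif_score))
            for term, annif_score in suggestions]

suggestions = [("machine learning", 0.9), ("libraries", 0.4), ("cats", 0.2)]
llm_response = "machine learning: 0.95\ncats: 0.1\nquantum computing: 0.8"
print(merge_scores(suggestions, llm_response))
# [('machine learning', 0.95), ('libraries', 0.4), ('cats', 0.1)]
```

The hallucinated "quantum computing" is silently discarded, and the unscored "libraries" keeps its Annif score, so the output list always has exactly the original subjects.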

Another issue has been the processing time, which increases by a factor of five when using OpenAI's GPT-3.5 (and in the long run the costs could also be quite large).

One idea we have had is to use LLMs indirectly to produce training data that the existing Annif algorithms could in turn be trained on.
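That indirect approach could end by writing the LLM-generated (text, subjects) pairs into Annif's TSV corpus format, which the existing backends can already train on. A minimal sketch, assuming the pairs have been obtained elsewhere; the example text and URI are invented:

```python
# Sketch: serialize (text, [subject_uri, ...]) pairs into Annif's
# short-text TSV corpus format: text TAB space-separated URIs in
# angle brackets, one document per line.
import csv
import io

def to_annif_tsv(documents):
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
    for text, uris in documents:
        writer.writerow([text, " ".join(f"<{u}>" for u in uris)])
    return buf.getvalue()

docs = [
    ("A study of neural subject indexing",   # e.g. an LLM-written abstract
     ["http://example.org/subjects/s42"]),   # illustrative subject URI
]
print(to_annif_tsv(docs), end="")
# A study of neural subject indexing	<http://example.org/subjects/s42>
```

Such a file could then be fed to `annif train` for a project using one of the existing backends.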

-Juho

anna.k...@googlemail.com

Oct 7, 2024, 5:07:42 AM10/7/24
to Annif Users
Regarding this conversation, my colleague Lakshmi Rajendram Bashyam has created a PR to integrate the transformer model X-Transformer into Annif: https://github.com/NatLibFi/Annif/pull/798
Feel free to join the discussion!

Best wishes
Argie (fka Anna)

MJ Suhonos

Dec 11, 2024, 1:36:53 AM12/11/24
to Annif Users
Hi all,

I am working on a research project implementing Annif using the X-Transformer (PECOS) PR linked above, and am extremely interested in helping to develop support for this classifier.

I have been able to get the code working with excellent results on my (rather small) dataset: roughly 1.5-2x higher scores than the LFO/NN ensemble (including weights) as used in this recent Annif paper, e.g.:

LFO/nn:

F1@5:                          0.1393
NDCG:                          0.1928
NDCG@5:                        0.2003
NDCG@10:                       0.1938
Precision@1:                   0.3233
Precision@3:                   0.2016
Precision@5:                   0.1502
True positives:                391
False positives:               3859
False negatives:               2070

X-Transformer:

F1@5:                          0.2472
NDCG:                          0.3204
NDCG@5:                        0.2867
NDCG@10:                       0.3216
Precision@1:                   0.2860
Precision@3:                   0.2698
Precision@5:                   0.2633
True positives:                845
False positives:               3455
False negatives:               1616

However, there were a couple of modifications I had to make to the current code in the PR to make it work:
  1. This commit is required for updated Keras support (/annif/backend/nn_ensemble.py) -- more on this below.
  2. In /annif/backend/xtransformer.py:L90, "distilbert-base-multilingual-uncased" is no longer available on Hugging Face, so I used "distilbert-base-uncased" instead.
  3. In /annif/backend/fasttext.py:L70, I had to add "fasttext.FastText.eprint = print" to suppress 'missing method' errors.
There is an excellent discussion within the pull request thread, in particular about the conflict (?) between PyTorch and TensorFlow.  I don't know enough to speak to this, although Keras seems to be an attempt to abstract/unify these frameworks.  What I can say is that I am unable to feed the results of the xtransformer backend into the nn_ensemble backend due to these compatibility issues -- a simple ensemble does work, although it hasn't provided any scoring benefit with the other backends I've tried.
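For anyone wanting to try the simple-ensemble workaround mentioned above, a minimal projects.cfg sketch might look like the following. The project names and vocabulary are placeholders, and the xtransformer backend itself is only available from the PR branch:

```ini
# Sketch of a simple (non-NN) ensemble combining an X-Transformer
# project with another backend; "xtransformer-en", "omikuji-en" and
# "my-vocab" are placeholder names.
[ensemble-en]
name=Simple ensemble English
language=en
backend=ensemble
vocab=my-vocab
sources=xtransformer-en:1.0,omikuji-en:1.0
```

The `sources` weights can be tuned per project; unlike the nn_ensemble backend, this simple ensemble just combines the source scores and needs no training of its own.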

My current development environment is an Apple M2 / 24GB RAM, and xtransformer uses about 10-12GB of memory on my dataset, but it definitely pushes the hardware to its limit.  The next environment I'll be working with is an Intel i5-13600K / 64GB RAM, with an RTX 4070 (about 5000 CUDA cores) in an x86_64 VM.  CUDA support for this backend would be very helpful!

However, using xtransformer on its own is a big enough step forward that I'm likely to use it exclusively and try to improve scoring on my dataset with hyperparameter tuning, and possibly by varying transformer models.  PECOS is a very exciting architecture, and I'd like to learn as much about it as possible.  Again, there is a great discussion in the PR thread about how best to incorporate it into Annif, but I believe a higher-level discussion of transformer-based backends makes more sense here.

Best,
MJ