Fwd: How to tell (guess?) if a LLM hallucinates?

Gianluca Miscione

Jun 21, 2024, 3:12:18 PM6/21/24
to ml-...@googlegroups.com, mlds-africa

How to tell (guess?) if an LLM hallucinates? Semantic entropy may be an answer (besides cross-checking with external sources):

“Our method works by sampling several possible answers to each question and clustering them algorithmically into answers that have similar meanings, which we determine on the basis of whether answers in the same cluster entail each other bidirectionally. That is, if sentence A entails that sentence B is true and vice versa, then we consider them to be in the same semantic cluster.”

https://www.nature.com/articles/s41586-024-07421-0
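To make the quoted idea concrete, here is a minimal sketch of the clustering-by-bidirectional-entailment step and the resulting entropy over semantic clusters. This is not the paper's implementation: the authors use a learned natural-language-inference model to decide entailment, whereas the `entails` function below is a hypothetical stand-in (simple string equality) just to show the control flow.

```python
import math

def entails(a: str, b: str) -> bool:
    # Hypothetical stand-in for an NLI model's entailment judgment.
    # The paper uses a learned entailment classifier here.
    return a.strip().lower() == b.strip().lower()

def semantic_clusters(answers):
    """Group sampled answers into clusters whose members
    entail each other bidirectionally (A -> B and B -> A)."""
    clusters = []
    for ans in answers:
        for cluster in clusters:
            rep = cluster[0]  # compare against the cluster's first member
            if entails(ans, rep) and entails(rep, ans):
                cluster.append(ans)
                break
        else:
            clusters.append([ans])  # no match: start a new cluster
    return clusters

def semantic_entropy(answers):
    """Entropy over the empirical distribution of semantic clusters.
    Many samples landing in one cluster -> low entropy -> likely
    confident; samples scattered across clusters -> high entropy."""
    clusters = semantic_clusters(answers)
    n = len(answers)
    probs = [len(c) / n for c in clusters]
    return -sum(p * math.log(p) for p in probs)

# Example: four sampled answers to the same question.
samples = ["Paris", "paris", "Lyon", "Paris"]
print(len(semantic_clusters(samples)))  # two semantic clusters
print(round(semantic_entropy(samples), 3))
```

A high semantic entropy across samples is the signal the authors propose for flagging likely confabulations.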

Good weekend,