AI Hallucinations in Arabic: Which term is most accurate?


Amina EL GANADI

Jun 9, 2025, 3:33:52 AM
to SIGARAB: Special Interest Group on Arabic Natural Language Processing

Assalamo 'Alaikom,

Dear Community,

I’m currently writing my PhD dissertation on the nature and impact of AI hallucinations, with a particular focus on how this phenomenon is defined, interpreted, and addressed across different languages and disciplinary contexts.

In English-language research, the term hallucination is widely used, though increasingly debated. Scholars have proposed alternative terms such as confabulation, delusion, or even bullshit to better capture the epistemological dimensions of the issue.

As part of this work, I’m also exploring how the concept is rendered in Arabic.

Which term is most widely used (or most appropriate) in Arabic to refer to what is known in English as “AI hallucination”?
I’ve come across translations such as هلوسة الذكاء الاصطناعي and هلاوس الذكاء الاصطناعي, but I’m also considering whether terms like اختلاق (fabrication) or تلفيق (falsification) might offer more semantic precision depending on the context.

I’d greatly appreciate any insights, especially from colleagues working in Arabic NLP, translation studies, or digital humanities.

With thanks in advance,

Amina

Nizar Habash

Jun 9, 2025, 5:02:05 AM
to Amina EL GANADI, SIGARAB: Special Interest Group on Arabic Natural Language Processing
Hi Amina - my immediate reaction is that اختلاق أو فبركة (fabrication) or تلفيق (falsification) both imply intent (specifically, intent to deceive)... which risks anthropomorphizing the machine...
Hallucination feels equally out of control for humans and machines. Another term in English is confabulation: سرد تخيلي (imaginative narration) or استرسال وهمي (illusory rambling) >> توهمات (delusions)?...
Perhaps اختلاق غير متعمد (unintentional fabrication) can work... but it is unnecessary... The word هلوس/هلوسة/مهلوس is already in the Arabic dictionary: https://www.almaany.com/ar/dict/ar-ar/%D9%87%D9%84%D9%88%D8%B3/
Best
Nizar



--
You received this message because you are subscribed to the Google Groups "SIGARAB: Special Interest Group on Arabic Natural Language Processing" group.
To unsubscribe from this group and stop receiving emails from it, send an email to sigarab+u...@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/sigarab/72bf3dde-909c-49c6-be83-c69d9e232128n%40googlegroups.com.


--
Nizar Habash
Professor of Computer Science
New York University Abu Dhabi
https://www.nizarhabash.com/ 

Eric Atwell

Jun 9, 2025, 5:21:40 AM
to Nizar Habash, Amina EL GANADI, SIGARAB: Special Interest Group on Arabic Natural Language Processing
An alternative is that "hallucination" is an AI/NLP technical term, distinct in meaning from the general English language word, and so, like many other Computer Science terms, can be used in other languages without translation.


Eric Atwell, Professor of Artificial Intelligence for Language
School of Computer Science, University of Leeds, LS2 9JT, UK



Emad Nawfal (عمـ نوفل ـاد)

Jun 9, 2025, 6:31:37 AM
to Eric Atwell, Nizar Habash, Amina EL GANADI, SIGARAB: Special Interest Group on Arabic Natural Language Processing
Further to what Nizar and Eric said, the word هلوسة may have some other justification. While I don't think they're etymologically related, the Arabic root ه ل س means to become weak, physically or mentally.
In Lisan al-Arab: a man described as مَهْلُوسُ العقل is one whose mind has been stripped from him; a man who is مهتلس العقل is one whose mind is gone.
In Maqayis al-Lugha: الْمَهْلُوسُ is the one weak of mind.



--
-------------------------------------
Emad Soliman Mohamed
University of Bradford

Kareem Darwish

Jun 9, 2025, 10:50:09 AM
to Emad Nawfal (عمـ نوفل ـاد), Eric Atwell, Nizar Habash, Amina EL GANADI, SIGARAB: Special Interest Group on Arabic Natural Language Processing
Though هلوسة may have a specific dictionary meaning, I have heard it used repeatedly for hallucination in the LLM context, so much so that I think it is becoming the jargon term for the phenomenon.



Mustafa Jarrar

Jun 9, 2025, 11:58:54 AM
to Kareem Darwish, Emad Nawfal, Eric Atwell, Nizar Habash, Amina EL GANADI, SIGARAB: Special Interest Group on Arabic Natural Language Processing
Eid Mubarak to all!
Check out the translation of “hallucination” as هلوسة in Qabas:

I think it’s a solid translation; the two terms also sound similar.



For those outside the AI world: when people say LLMs “hallucinate,” they mean the model generates outputs that sound confident but are factually wrong, much as a person might hallucinate in a psychological sense. For those inside the AI field: the term (and its translation) is not always comfortable, because LLMs aren’t actually hallucinating; they’re simply making incorrect predictions due to limitations in their training data, their probabilistic architecture, and so on.


Best, Mustafa

Hamdy S. Hussein

Jun 9, 2025, 12:22:27 PM
to Mustafa Jarrar, Kareem Darwish, emadn...@gmail.com, Eric Atwell, Nizar Habash, Amina EL GANADI, SIGARAB: Special Interest Group on Arabic Natural Language Processing
Eid Mubarak, dears!

We used the term Halwasa هلوسة in our paper, and we shared 10K sentences labeled for factuality and correctness.
These sentences were generated by ChatGPT and GPT-4 in June 2023.

At that time, the models hallucinated in almost 25% of cases. We have observed consistent improvements in LLM outputs over time.
I hope this is useful to colleagues working on LLM hallucinations.

Best,
Hamdy

Hamdy S. Hussein
Principal Software Engineer
Qatar Computing Research Institute
+974 445 41679






Rick Rejeleene

Jun 10, 2025, 7:57:26 AM
to Amina EL GANADI, SIGARAB: Special Interest Group on Arabic Natural Language Processing
Hi Amina, I work in NLP.
I would suggest gibberish as the term.
You may also consult other Arabic speakers; one option is كلام غير مفهوم (incomprehensible speech).
I hope this helps,
Rick



Omer Said

Jun 10, 2025, 8:07:46 AM
to Rick Rejeleene, Amina EL GANADI, SIGARAB: Special Interest Group on Arabic Natural Language Processing
Hello everyone,
Could someone please explain why the plain term for this condition, خطأ (error), was set aside in favor of the name هلوسة?
If the term هلوسة, as I understand it, denotes the concept of error, then it repackages a case of error under a name that makes it an acceptable notion, one that can be overlooked or at least accepted as it is; whereas if it were called خطأ (error), it would have to be acknowledged as an unacceptable condition for which a solution must be sought.

Since my specialization is not in the technical field, my impression was that هلوسة denotes an acceptable degree of error, rather like the 0.05 percent margin of error accepted in scientific research.

Regards,
Omar Al-Shahri (عمر الشحري)

Nizar Habash

Jun 10, 2025, 8:41:51 AM
to Omer Said, Amina EL GANADI, SIGARAB: Special Interest Group on Arabic Natural Language Processing
Hello Omar - I think all hallucinations are errors, but not all errors are hallucinations. AI may err because the information it was trained on is wrong, or at odds with society or with the truth (which is itself open to question); in that case I would not consider the result a hallucination. Take gender, racial, or cultural bias, for example, which can be explained as modeling bias and may count as an error for some people but not for others. A hallucination, by contrast, is typically an arbitrary output presented confidently as fact to fill a linguistic or logical gap. And of course, by this definition, a hallucination may occasionally turn out to be correct ("the hallucinators lied, even if they happened to be right" 😄).

Nizar

Abdusalam Alfitouri Ahmed Nwesri (عبد السلام الفيتوري أحمد النويصري)

Jun 10, 2025, 11:28:12 AM
to Nizar Habash, Omer Said, Amina EL GANADI, SIGARAB: Special Interest Group on Arabic Natural Language Processing
Peace be upon you,

Even if الهلوسات is the wrong word according to Arabic dictionaries, it cannot be changed, simply because it has become the accepted term in use.
There is a saying we use here: "invent it, then name it." The term was carried over from English as-is, regardless of its literal meaning in the language, and we now know what it means in Arabic even where that conflicts with what the dictionaries say.

We work hard to deal with what actually exists on the ground, whether right or wrong, and the many mistranslated terms inherited from other languages cannot be changed.

Regards,

Dr Abdusalam Nwesri,


Associate Professor,

Faculty of Information Technology,

University of Tripoli,

P.O.Box: 5760 Hai Alandalus,

Tripoli - Libya.

Tel: +218922307021

Email: a.nw...@uot.edu.ly




sondos krouna

Jun 10, 2025, 11:37:10 AM
to Nizar Habash, Omer Said, Amina EL GANADI, SIGARAB: Special Interest Group on Arabic Natural Language Processing
Good evening,
Eid Mubarak to everyone. As I understand it, the concept of hallucination in artificial intelligence refers to generating content that appears coherent and genuine but rests on false information the AI fabricates to fill its gaps. In this it imitates the human brain, which sometimes offers information or analyses that seem logical and coherent but are in fact based on guesswork. I therefore think terms such as الاختلاق (fabrication), الإيهام (creating an illusion), or التوهم (delusion) could be suitable translations for this concept, or for the related concepts the researcher Amina mentioned.
A humble opinion from outside the field.
With my regards,
Sondos Krouna

Abdusalam Alfitouri Ahmed Nwesri (عبد السلام الفيتوري أحمد النويصري)

Jun 10, 2025, 11:42:57 AM
to sondos krouna, Nizar Habash, Omer Said, Amina EL GANADI, SIGARAB: Special Interest Group on Arabic Natural Language Processing
Google Translate renders the word "hallucination" as:

هلوسة (hallucination)
هذيان (delirium)
اهتلاس








Muhammad Helmy

Jun 10, 2025, 4:35:37 PM
to SIGARAB: Special Interest Group on Arabic Natural Language Processing
As far as I know, in medical usage a hallucination results from a defect in the brain and nerves. Accordingly, if we look at the causes of error in language models, as Dr. Nizar indicated, we will be able to choose the most suitable word.

That is, if the error is due to a flaw in the design and the neural networks of deep learning, and if we grant for the sake of argument that these networks resemble human neural networks, then هلوسة (hallucination) is an appropriate word (internal factors).

But if the flaw comes from the data, and the error disappears once the data is corrected and expanded, then we can say the model was immature or was misled (external factors).

A hallucinating model will keep producing wrong results even after the data is corrected and expanded, unlike an immature model.

That said, the remedy usually involves fixing both the internal and external factors together, so there may be room for flexibility here.

Moutaz Alkhatib

Jun 10, 2025, 5:21:09 PM
to SIGARAB: Special Interest Group on Arabic Natural Language Processing
Salam Amina

According to Wikipedia, AI hallucinations are defined as follows:
In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting,[1][2] confabulation,[3] or delusion)[4] is a response generated by AI that contains false or misleading information presented as fact.[5][6] This term draws a loose analogy with human psychology, where hallucination typically involves false percepts. However, there is a key difference: AI hallucination is associated with erroneously constructed responses (confabulation), rather than perceptual experiences.[6]
> See: https://en.wikipedia.org/wiki/Hallucination_(artificial_intelligence)

When you look at this definition, you find that Arabic words such as تلفيق imply malicious intent as well as a level of consciousness, which LLMs lack. The same applies to اختلاق.
Using تخريف and its plural تخريفات might also be acceptable, and anecdotally sufficient, to describe the phenomenon.
Seriously, though, I think هلوسة as the Arabic counterpart is widespread enough in both academic research and popular use, and I don't believe a different term is needed.

Kind regards;
Moutaz

Guessoum, Ahmed

Jun 11, 2025, 5:19:38 AM
to SIGARAB: Special Interest Group on Arabic Natural Language Processing, Amina EL GANADI
assalaamu 3alaykum,

Let me start by saying that I commend Amina El Ganadi's efforts to correct a term if it is indeed not the most appropriate.  

My understanding is that hallucination refers to the situation where an LLM, based on whatever data it was trained on and the well-known mechanisms of transformers, generates some text in response to a prompt within a certain context (the latter being longer or shorter depending on the model). The result, as usual, is delivered with confidence, because LLMs are not like expert systems, which produce results along with certainty factors; the latter can help the user get a feel for how confident the system is about its reasoning. Hallucination, in my opinion, is not the most appropriate term, although it is the most widely used one for the phenomenon.
I underline in passing that I never adhere to using the same English words (or words from other languages) in Arabic; I believe that Arabic is rich enough and we can either find appropriate words for the concepts we have in mind, or even use the language generative mechanisms to come up with reasonably appropriate words.
In the Qur'an, the word اختلاق is used in verse 7 of Surah Sad (Chapter 38):

مَا سَمِعْنَا بِهَذَا فِي الْمِلَّةِ الْآخِرَةِ إِنْ هَذَا إِلَّا اخْتِلَاقٌ {7}

[Shakir 38:7] We never heard of this in the former faith; this is nothing but a forgery.
[Yusuf Ali 38:7] "We never heard (the like) of this among the people of these latter days: this is nothing but a made-up tale!"
[Pickthal 38:7] We have not heard of this in later religion. This is naught but an invention.


I feel this word اختلاق is more appropriate than hallucination, since it describes what happens when an LLM generates an answer (and it IS programmed to do so systematically, without being allowed to say "I do not know" or to state a confidence level).


As to the "risks of anthropomorphizing the machine" by using this word اختلاق, I feel it is less so than hallucination which I see as more strongly giving the machine some human characteristics.


Thank you for an interesting discussion.


Ahmed



Prof. Ahmed Guessoum

Full Professor

ISE Department

ENSIA

ahmed.g...@ensia.edu.dz

Pôle technologique de Sidi Abdellah

ensia.edu.dz

http://lria.usthb.dz/TALAATeam/index.html





Abdulrahman Alosaimy

Jun 11, 2025, 4:33:02 PM
to Guessoum, Ahmed, SIGARAB: Special Interest Group on Arabic Natural Language Processing, Amina EL GANADI
Peace be upon you,

In the Dictionary of Artificial Intelligence Terms (معجم مصطلحات الذكاء الاصطناعي), produced in partnership between SDAIA and the King Salman Global Academy for the Arabic Language, the word هلوسة was chosen as the equivalent of hallucination.

In my view, what matters in an Arabic translation is that it be adopted by a large segment of specialists, and that it not be confusable with another term. In my opinion, both apply to "هلوسة".

With many thanks,




Amina EL GANADI

Jun 22, 2025, 6:58:16 AM
to SIGARAB: Special Interest Group on Arabic Natural Language Processing
Dear all,

Many thanks to all who took the time to engage with my question and share their thoughtful insights. Your contributions have provided essential perspectives that I’m integrating into my broader research on AI-generated errors.

Building on this exchange, I’ve been reflecting more closely on the term at the centre of our discussion. While hallucination has become a widely used and technically convenient label for AI-generated errors, particularly in NLP, its usage is far from neutral. The term is both semantically and ethically charged, and its application to LLMs raises several concerns that deserve thoughtful consideration.

First, hallucination originates in psychological and clinical contexts, where it denotes perceptual errors experienced by sentient beings, typically the sensing of stimuli that do not exist. When applied to machines, even metaphorically, the term risks anthropomorphizing systems that neither perceive nor misperceive. It subtly suggests the presence of cognitive malfunction or sensory distortion, where in fact LLMs are generating outputs based solely on token prediction within statistical patterns derived from training data. They possess no perception, no experience, and no internal state.

This point is underscored in both the academic literature (e.g., Edwards, 2023; Ji et al., 2022) and in contributions to this discussion, which emphasize the importance of avoiding the attribution of agency or intent to non-sentient computational models.

Second, the term can obscure questions of accountability. AI hallucinations arise not from psychological malfunction but from well-known technical limitations: training data gaps, architecture constraints, decoding errors, or prompt ambiguities. To say a model “hallucinates” may inadvertently deflect attention from system design flaws, shifting blame away from developers, evaluators, or deployment contexts. Framing such outputs as “hallucinations” can normalize or excuse them, whereas describing them as fabrications, ungrounded responses, or generative errors would signal a need for correction and transparency.

There are also linguistic and cultural implications, particularly in Arabic. The term هلوسة is closely tied to clinical or pathological connotations, often evoking mental illness or cognitive disorder. Introducing it into AI discourse can inadvertently stigmatize or sensationalize the phenomenon, especially for non-specialist audiences who may not share the technical framing. As has been noted in this discussion, metaphors imported from English into Arabic do not come empty-handed: they reshape local semantic fields and bring with them new assumptions, tensions, and interpretive layers.

Compounding this is the semantic drift of the term hallucination across AI subfields. In computer vision, it once referred to constructive inference, adding plausible detail to degraded images (e.g., face hallucination). In NLP, the term has acquired a sharply negative meaning: confident but factually incorrect output. This shift introduces ambiguity for interdisciplinary researchers and challenges efforts to maintain consistent terminology across linguistic and technical domains.

From the perspective of digital humanists, this ambiguity is especially problematic. We often work at the intersection of language, knowledge, and cultural authority, engaging with sources and traditions that demand careful framing. Calling an AI-generated false citation a hallucination risks reducing complex epistemological failures to a technical shorthand that flattens interpretive nuance and obscures institutional accountability.

None of this is to suggest that the term hallucination/ هلوسة  should be discarded. As many of you have pointed out, it is now firmly embedded in both English and Arabic AI discourse. Its use is also expanding into other languages (this year, Treccani included “allucinazione” in its Neologismi list, marking its growing presence in Italian contexts as well). While the term remains a convenient shorthand within technical communities, I believe it is especially important in interdisciplinary, educational, or general-audience discussions, to clarify both what is meant and what is not meant when describing a model as “hallucinating.”

It may therefore be helpful to retain هلوسة as the established term of art in Arabic NLP, while also introducing more mechanism-focused alternatives such as فبركة (fabrication), خرابيط (informal for nonsense/gibberish), or اختلاق (fabrication/invention). These may more accurately emphasize the generative and synthetic nature of such outputs while avoiding clinical or anthropomorphic connotations. I have been exploring possible substitutes, and these seem among the more acceptable options currently available.

At the same time, I recognize that terms like fabrication can themselves be problematic, as they may suggest intentionality or deception, concepts inapplicable to computational systems. This only underscores the broader difficulty of identifying language that captures the nature of these outputs without resorting to misleading human analogies.

This remains an open inquiry, and I truly appreciate all the insights shared so far. I welcome any further thoughts as I continue my research.

Best,

Amina El Ganadi
Visiting PhD Student, University of St Andrews
Doctoral Researcher, Universities of Modena–Reggio Emilia & Palermo
