LLMs, by themselves, are not intelligent


John F Sowa

Sep 29, 2025, 3:19:36 PM
to ontolog-forum, CG
LLMs, by themselves, are stupid.  Claims that they can achieve Artificial General Intelligence have not been supported by any evidence.

Many people who had jobs speaking to customers on the phone have been replaced by AI systems.  One service that I had used for years replaced their friendly human operator with an equally friendly computer voice.  I made the usual arrangements, and the friendly computer assured me that all the details had been arranged.

Today was the day when the services were to be provided.  But nothing happened.  I called to complain, and got an actual human on the phone.  He said that there was no record of any request for that service at that location on that date.  It wasn't a total disaster, but it caused an inconvenient delay in the schedule.  

Apple tried to replace Siri with a new LLM-based system.  The new version supported a wider range of services and options.  Unfortunately, it often  made mistakes, sometimes serious enough to damage the TV or other equipment attached to it.  Apple canceled two years of development -- a huge waste of time and money.

Basic point:  LLMs are extremely valuable for two kinds of services:  (1) translating languages, natural or artificial; (2) finding useful information that may answer a question or solve a problem.

But point #2 is only a good guess, technically known as an abduction.   To verify the guess, a logic-based process of deduction is followed by testing that involves some kind of action or observation.  Then the verified results are added to the knowledge base by induction.  And the cycle continues with more abductions, deductions, testing, and inductions.

Without those repeated cycles of abduction, deduction, testing, and induction, there is nothing that remotely resembles true intelligence -- human or artificial.

John

Alex Shkotin

Sep 30, 2025, 5:17:06 AM
to ontolo...@googlegroups.com, CG

JFS:"One service that I had used for years replaced their friendly human operator with an equally friendly computer voice.  I made the usual arrangements, and the friendly computer assured me that all the details had been arranged.


Today was the day when the services were to be provided.  But nothing happened.  I called to complain, and got an actual human on the phone.  He said that there was no record of any request for that service at that location on that date.  It wasn't a total disaster, but it caused an inconvenient delay in the schedule."


John, what you described is more of a detective story. If the IT company keeps logs of its service activity, and during an investigation a person convincingly claims there was no call, then whom did you call?

If they lied to you, that means (by abduction) they don't keep logs. You can start an investigation by looking at your own phone logs and billing records.


Intriguing story, 


Alex



Mon, Sep 29, 2025 at 22:19, John F Sowa <so...@bestweb.net>:
--
All contributions to this forum are covered by an open-source license.
For information about the wiki, the license, and how to subscribe or
unsubscribe to the forum, see http://ontologforum.org/info
---
You received this message because you are subscribed to the Google Groups "ontolog-forum" group.
To unsubscribe from this group and stop receiving emails from it, send an email to ontolog-foru...@googlegroups.com.
To view this discussion visit https://groups.google.com/d/msgid/ontolog-forum/19c208f5ec1348998821e6e3f7ee1dcc%4093b4684c87564e4aa630ad8a3d9405a3.

John F Sowa

Sep 30, 2025, 3:27:33 PM
to ontolo...@googlegroups.com, CG
Alex,

What I did was call the company that used that service.  I explained that the service company they were using has a bot that made a serious mistake.  They said that they would look into the matter.

They also said that the next time I used that service I should ask for a human agent.   In any case, examples like these show that these AI Bots have a long way to go before they become as intelligent as a typical clerk with only a high-school education.

Anybody who claims that superintelligence is coming soon is spouting what is technically called Phony Baloney or even Bovine Fecal Matter.

John
 


From: "Alex Shkotin" <alex.s...@gmail.com>

Avril Styrman

Oct 1, 2025, 2:54:55 AM
to ontolo...@googlegroups.com, CG
Dear John, Alex, and all,

In my experience, a shop bot has never been helpful.

It has always gone like this: I go to the chat of an online shop; the bot gives useless answers; finally the chat is transferred to a human agent.

I believe that one day they'll work just fine, but until then they'll waste a lot of time.

Cheers,

Avril




--

Kind regards,

Avril Styrman, PhD
+358 40 7000 589

Alex Shkotin

Oct 1, 2025, 7:04:34 AM
to ontolo...@googlegroups.com, CG
JFS:"They said that they would look into the matter."
It's a shame they didn't say "...and report back to you."
In any case, we're talking about a bad IT decision.
Even if superintelligence were already with us, some people would still make bad IT decisions.

Alex

Tue, Sep 30, 2025 at 22:27, John F Sowa <so...@bestweb.net>:

Alex Shkotin

Oct 1, 2025, 8:39:18 AM
to ontolo...@googlegroups.com, CG
Dear Avril, 

Of course we expect more from a robot advisor. But JFS describes a much more advanced situation, one that Tim Berners-Lee began dreaming about 25 years ago: a chatbot committed to performing some actions on JFS's behalf beyond search.
And suddenly, it turns out that not only was the action never performed, but there's no trace of any interaction with the chatbot.

Alex

Wed, Oct 1, 2025 at 09:54, Avril Styrman <avril....@gmail.com>: