Reading text and looking at images


mrsi...@sasktel.net

Nov 29, 2025, 6:02:14 PM
to echovisio...@agiga.ai

Hi List,

 

This message is for Michael Hingston.

Apologies if I have your last name wrong.

 

Michael, you have a beta version of the EchoVision glasses.

 

When you use the glasses to look at images, or to read text, do you experience some “guesswork,” as many of us have encountered with the Meta glasses?

 

Do the EchoVision glasses use terms like:

“it appears to be”?

 

Do you get false responses like:

 

“This is a Kirkland Signature brand”

 

when in fact it is really a Ziploc or generic house brand?

 

I’m just trying to sort some of the wheat from the chaff before the grand reveal.

 

Thanks,

 

 

Monte Single

 

Sieghard Weitzel

Nov 29, 2025, 6:10:02 PM
to echovisio...@agiga.ai

I’m obviously not Michael, but I’d be surprised if there were absolutely zero AI hallucinations happening, since that does seem to be something all AI/LLM models do at times.

--
You're receiving this message because you're subscribed to the EchoVision Community Forum.
 
To view the archive or catch up on discussions:
https://groups.google.com/a/agiga.ai/g/echovision-discuss
 
To unsubscribe, send an email to:
echovision-disc...@agiga.ai

info.mich...@gmail.com

Nov 30, 2025, 12:57:32 PM
to echovisio...@agiga.ai

There are some AI hallucinations. AI is by no means perfect yet. However, what I see is that EchoVision is improving. This is a good thing.

 

 

Best Regards,

 

 

Michael Hingson

amandainp...@gmail.com

Nov 30, 2025, 10:04:31 PM
to echovisio...@agiga.ai

Also, AI is dependent on the AI platform and, of course, the interface between the device and said platform.

 

I have seen some strange results from a popular Screen Reader V2026 and Picture Smart yielding some results that aren’t in the interest of the general public.

Gene Warner

Dec 1, 2025, 9:02:07 AM
to echovisio...@agiga.ai
lol! Despite trying to hide the screen reader you were referring to, you gave it away by using the actual name of a feature in JAWS!

Gene...
I am friends with the monster under my bed
I get along with the voices inside of my head

Buddy Brannan

Dec 1, 2025, 10:04:43 AM
to echovisio...@agiga.ai
Hi Monte,

I recently got a “You’re holding a Kirkland Signature…” …some bottled water. … When in fact, I was holding a 12-pack of Coke Zero in cans. Oops. It happens, though I’d say that, apart from the wrong directions (up vs. down, left vs. right) in live AI, hallucinations, while they do happen, aren’t nearly so egregious as that. Mostly.

I don’t think we’ll ever get to 100% no hallucinations; after all, sighted humans can’t always agree on what they’re seeing. Even color can be somewhat subjective. So yeah, I think 100% no hallucinations is an unrealistic expectation, though it’s absolutely reasonable that we aim for as few and as minor hallucinations as possible. I also think that as LLMs, visual language models, and all the rest improve, and they will, fewer hallucinations will be a natural outcome. Remember, we’re witnessing a huge paradigm shift (OMG, I can’t believe I just said that) in its infancy. Crawl before you walk, walk before you run. We’re still crawling, people.

--
Buddy Brannan
Customer Support And Training Specialist
Agiga



Gene Warner

Dec 1, 2025, 10:14:01 AM
to echovisio...@agiga.ai
Definitely! AI has been around for a long time; just look at the natural language program called ELIZA, developed back in 1964 through 1967 at MIT. AI is not new, and yet AI is still very much in its infancy. Time and experience are what is needed here, and there are no shortcuts for either.

Gene...
I am friends with the monster under my bed
I get along with the voices inside of my head


Buddy Brannan

Dec 1, 2025, 1:12:14 PM
to echovisio...@agiga.ai
Hi,

So there are two books that I think are interesting. These are by no means an endorsement by Agiga, and they’re very different from each other in tone and content. Both are on BARD and on Audible. First, *Scary Smart*. It’s about AI, how we got here, what we can expect, the paths we could go down with AI development, and so forth. … And what our responsibilities are in teaching and interacting with AI. The other book is *The Singularity Is Nearer* by Ray Kurzweil. Very academic and such. You might think his theories are out there, but so far, his timelines have tracked pretty closely to reality.


--
Buddy Brannan
Customer Support And Training Specialist
Agiga

