Best Regards,
Kevin Chao
📧 kevin.ev...@agiga.ai
📱 415.226.5026
🕶️ EchoVision AI Smart Glasses – built by, with, and for the blind community: Providing real-time audio descriptions of the visual world.
🔗 Facebook Group
https://www.facebook.com/groups/echovision/
🌐 Website
https://echovision.agiga.ai
📺 Watch EchoVision in Action / Media Coverage
https://echovision.agiga.ai/#news-insights
Hi Kevin,
I have high hopes for the EchoVision glasses. However, I have 2 major concerns which have been expressed by others elsewhere.
The video you note at https://youtu.be/ZkXCEx8MSRs is an example of one of my concerns. At least in the first few minutes, the lady in the video expresses her amazement at how well the glasses read a book. However, we never hear the glasses reading, and we do not know whether the book was provided by EchoVision staff or by the lady herself.
A major problem with all OCR-based reading devices for the blind is that their sales teams select demo books with high-quality paper, crisp, unsmudged ink, uniform line widths, and a font with few if any serifs. Letters in such a book can be readily recognized by just about any OCR software and will be read aloud flawlessly. This practice is truly criminal because it implies that the purchaser can expect clear, clean reading on any material. And I, as a 46-year user of OCR (beginning with the first huge Kurzweil Reading Machine) and a 52-year user of an Optacon, know that these perfect print conditions seldom hold true.

First, the video you note needs to play the reading output from the glasses. Second, we need examples of books, flyers, and pamphlets provided by users being read by the glasses. I am heartily sick of hearing the glasses read The Magic Tree House, which I suspect exhibits the "perfect" print conditions I just described. I need to hear them read a wide variety of books, flyers, documents, ads, and addresses on envelopes.
I have a similarly strong criticism of the scene description and live description examples shown in the EchoVision webinars. I can absolutely confirm that the glasses describe Zharon's (sorry about misspelling her name) apartment beautifully. I could probably get a perfect description of my house under the best lighting conditions. However, I don't plan to use these glasses to describe my home under perfect lighting. I plan to walk down a street and hope to identify and read the signs on the buildings I pass so that I can easily find the location I want. I plan to use the glasses indoors to read the office names and numbers I pass so that I can locate the office I need to find. Nowhere have I seen a demo of these activities.
I don’t expect these or any glasses to be perfect. I don’t expect them to perform 100% in glare or in poor lighting conditions. I don’t expect them to read poor quality print on poor quality paper flawlessly. But I, as an end user of these glasses, and a purchaser who has made a $99 deposit, do expect to see the range of quality, from good to bad, that these glasses provide under a variety of document quality and lighting conditions.
Kevin, I beg you to do everything you can to have a demo show "the good, the bad, and the ugly" with these glasses. My hope is that the bad and the ugly will perform substantially better than the Envision Glasses and the Ray-Ban ones do. No device that our present technology can create will read and describe scenes flawlessly under all conditions. I want to know how the glasses respond under bad conditions as well as under good ones.
Thank you for reading this and passing my email along to the rest of the development team. I would gladly help by using a prototype of the glasses under a variety of print and lighting conditions and recording how they perform. However, I imagine my offer is unnecessary, because this is one of the roles you play in the glasses' development. So I again beg you and the other developers to show the range of performance we can expect from the glasses, and not just the extremely positive outcomes.
With best regards,
Mary Teresa (Terrie) Terlau, Ph.D.
2347 Payne Street
Louisville, KY 40206
Phone: (502) 551-6382
Email: terr...@gmail.com
On 9 Aug 2025, at 5:43 AM, Kevin Chao <kevin.ev...@agiga.ai> wrote:
--
You're receiving this message because you're subscribed to the EchoVision Community Forum.
To view the archive or catch up on discussions:
https://groups.google.com/a/agiga.ai/g/echovision-discuss
To unsubscribe, send an email to:
echovision-disc...@agiga.ai