Book Reading Demo


Kevin Chao

unread,
Aug 8, 2025, 3:43:43 PM
to EchoVision Community Forum
A lady trying book reading demo using EchoVision during ACB: https://youtu.be/ZkXCEx8MSRs

Best Regards,
Kevin Chao

📧 kevin.ev...@agiga.ai
📱 415.226.5026

🕶️ EchoVision AI Smart Glasses – built by, with, and for the blind community: Providing real-time audio descriptions of the visual world.

🔗 Facebook Group
https://www.facebook.com/groups/echovision/

🌐 Website
https://echovision.agiga.ai

📺 Watch EchoVision in Action / Media Coverage
https://echovision.agiga.ai/#news-insights

❓ FAQ
https://echovision.agiga.ai/faq/

terr...@gmail.com

unread,
Aug 9, 2025, 11:15:14 AM
to echovisio...@agiga.ai

Hi Kevin,

I have high hopes for the EchoVision glasses. However, I have two major concerns, which have also been expressed by others elsewhere.

The video you note at https://youtu.be/ZkXCEx8MSRs is an example of one of my concerns. At least in the first few minutes, the lady in the video expresses her amazement at how well the glasses read a book. However, we do not hear the glasses reading, and we do not know whether the book was provided by EchoVision staff or by the lady herself.

A major problem with all OCR-based reading devices for the blind is that their sales teams select demo books that have high-quality paper, high-quality unsmudged ink, print with lines of uniform width, and a font with few if any serifs. Letters in such a book can be readily recognized by just about any OCR software and will be read aloud flawlessly. This practice is truly criminal because it implies that the purchaser can expect clear, clean reading on any material. And I, as a 46-year user of OCR (beginning with the first huge Kurzweil Reading Machine) and as a 52-year user of an Optacon, know that these perfect print conditions seldom hold true. First, the video you note needs to play the reading output from the glasses. Second, we need examples of books, flyers, and pamphlets that users themselves provide being read by the glasses. I am heartily sick of hearing the glasses read The Magic Tree House, which I suspect of featuring the "perfect" print conditions I just described. I need to hear them read a wide variety of books, flyers, documents, ads, and addresses on envelopes.

I have a similar strong criticism of the scene description and live description examples shown in the EchoVision webinars. I can absolutely confirm that the glasses describe Zharon's (sorry about misspelling her name) apartment beautifully. I could probably get a perfect description of my house under the best lighting conditions. However, I don't plan to use these glasses to describe my home under perfect lighting conditions. I plan to walk down a street and hope to identify and read the signs on buildings that I pass so that I can easily find the location I want. I plan to use the glasses indoors to read office names and numbers that I pass so that I can locate the office I need to find. I have seen no demo anywhere that demonstrates these activities.

I don’t expect these or any glasses to be perfect. I don’t expect them to perform 100% in glare or in poor lighting conditions. I don’t expect them to read poor quality print on poor quality paper flawlessly. But I, as an end user of these glasses, and a purchaser who has made a $99 deposit, do expect to see the range of quality, from good to bad, that these glasses provide under a variety of document quality and lighting conditions.

Kevin, I beg you to do everything that you can to have a demo show “the good, the bad, and the ugly” with these glasses. My hope is that the bad and the ugly will perform substantially better than the Envision Glasses and the Ray-Ban ones do. No device that our present technology can create will read and describe scenes flawlessly under all conditions. I want to know how the glasses respond under bad conditions as well as under good ones.

Thank you for reading my email and passing it along to the rest of the development team. I would gladly help by using a prototype of the glasses under a variety of print and lighting conditions and recording how they perform in each circumstance. However, I imagine my offer is unnecessary because this is one of the roles you play in the glasses’ development. So I again beg you and the other developers to show the range of performance we can expect from the glasses, and not just the extremely positive outcomes.

With best regards,

Mary Teresa (Terrie) Terlau, PH.D.

2347 Payne Street

Louisville, KY 40206

Phone: (502) 551-6382

Email: terr...@gmail.com

Amanda Heal

unread,
Aug 9, 2025, 6:56:38 PM
to echovisio...@agiga.ai
Quick question. In all the book demos, are the glasses reading both the left- and right-hand pages, or one page at a time? Just to clarify: if a double page is presented to the glasses, will they read both, or do we need to scan each page?
Regards
Amanda Heal
Sent from my iPhone

On 9 Aug 2025, at 5:43 AM, Kevin Chao <kevin.ev...@agiga.ai> wrote:


--
You're receiving this message because you're subscribed to the EchoVision Community Forum.
 
To view the archive or catch up on discussions:
https://groups.google.com/a/agiga.ai/g/echovision-discuss
 
To unsubscribe, send an email to:
echovision-disc...@agiga.ai

Kevin Chao

unread,
Sep 4, 2025, 8:53:00 AM
to echovisio...@agiga.ai
Q: In all the book demos, are the glasses reading both the left- and right-hand pages, or one page at a time? If a double page is presented to the glasses, will they read both, or do we need to scan each page?
A: Assuming both pages are visible to the camera, they'll read both the left and right pages.

Best Regards,
Kevin Chao



Suzanne Erb

unread,
Sep 4, 2025, 9:13:02 AM
to echovisio...@agiga.ai
Further question.
Do the glasses “understand” where each page begins and ends? How is the column detection?
Thanks.
Best,
Suzanne Erb

Kevin Chao

unread,
Sep 4, 2025, 9:16:03 AM
to echovisio...@agiga.ai
Column detection works in scene description.

Best Regards,
Kevin Chao


