NVDA and accuracy on the web


James AUSTIN

unread,
Oct 23, 2025, 5:33:33 PM
to nvda-...@nvaccess.org
Hi folks

I hope everyone is well?

I work in the accessibility field and have long believed that NVDA provides a more accurate picture of webpages than JAWS does. I think this is due to JAWS' machine learning. From a user perspective, the ability to guess at form labels and other content is extremely helpful, but not so from a testing perspective.

I don't wish to cause offence by my remarks and I am happy to be corrected, but I sincerely hope that NVDA does not follow suit and continues to provide the accuracy that I and others depend on.

Thank you 

Warmest wishes
James 

Quentin Christensen

unread,
Oct 23, 2025, 7:54:24 PM
to nvda-...@nvaccess.org
Hi James,

Indeed, we know that a lot of our users - end users and testers alike - value having NVDA echo exactly what is on a web page or application, rather than trying to guess and make assumptions which may be wrong.

The only AI feature we're working on is an on-device, completely offline image description feature. This would work similarly to the OCR function: when you get to an image, NVDA will tell you whatever it normally would - any alt text it has, etc. If you WANT a description, you press the keystroke and it generates one. The feature isn't enabled by default, and even when enabled, it doesn't activate except on request.

Thanks for your kind words!

Kind regards

Quentin

--
***
Please note: the NVDA project has a Citizen and Contributor Code of Conduct.
NV Access expects that all community members will read and abide by the rules set out in this document while participating in this group.
https://github.com/nvaccess/nvda/blob/master/CODE_OF_CONDUCT.md
 
You can contact the group owners and moderators via nvda-user...@nvaccess.org.
---
You received this message because you are subscribed to the Google Groups "NVDA Screen Reader Discussion" group.
To unsubscribe from this group and stop receiving emails from it, send an email to nvda-users+...@nvaccess.org.
To view this discussion visit https://groups.google.com/a/nvaccess.org/d/msgid/nvda-users/CAHTne5K-mRkCTuS8-iSodcoZzQ%2Bu-iARRVTc2ewuYxg%3DQ-Uqmw%40mail.gmail.com.


--

Quentin Christensen
Training and Support Manager

NV Access

Subscribe to email updates (blog, new versions, etc): https://eepurl.com/iuVyjo

Andrew Downie

unread,
Oct 24, 2025, 5:41:59 AM
to nvda-...@nvaccess.org
"From a user perspective, the ability to guess at form labels and other content is extremely helpful, ..." - but only if the guess is correct. If the guess is incorrect, which is by no means unheard of, the consequences can be severe.


Andrew



James AUSTIN

unread,
Oct 24, 2025, 6:17:03 AM
to nvda-...@nvaccess.org, nvda-...@nvaccess.org
Hi Andrew,

Yes, as an end user as well as a tester, I agree with you. If I didn’t say that in my original post, I apologise, I should have.

Warmest wishes,

James.
Sent from my iPhone

On 24 Oct 2025, at 10:41 am, Andrew Downie <doveta...@gmail.com> wrote:



joseph....@gmail.com

unread,
Oct 24, 2025, 7:52:03 AM
to nvda-...@nvaccess.org

Hi,

NVDA does not rely on AI to guess web control labels. Rather, it uses the information coming from the authors themselves, together with some heuristics and expectations from web accessibility standards, to announce web content.
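For anyone curious what "author-provided information plus standards-based heuristics" looks like in practice, here is a simplified, hypothetical Python sketch loosely following the priority order of the W3C accessible name computation. The function and field names are illustrative only - this is not NVDA's actual code:

```python
# Hypothetical sketch of accessible-name resolution for a form control,
# loosely following the W3C Accessible Name computation order.
# NOT NVDA's real implementation; names and structure are illustrative.

def accessible_name(control, document):
    """Return the label a screen reader would announce, or None."""
    # 1. aria-labelledby: the author points at other elements by id.
    ref_ids = control.get("aria-labelledby")
    if ref_ids:
        parts = [document[i]["text"] for i in ref_ids.split() if i in document]
        if parts:
            return " ".join(parts)
    # 2. aria-label: the author supplies the name directly.
    if control.get("aria-label"):
        return control["aria-label"]
    # 3. An associated <label for="..."> element's text.
    if control.get("label-text"):
        return control["label-text"]
    # 4. No machine-learning guess: report the control as unlabelled.
    return None

doc = {"billing": {"text": "Billing address"}}
field = {"aria-labelledby": "billing"}
print(accessible_name(field, doc))  # -> Billing address
```

The key point is step 4: when the author has not provided a name, the control is announced as unlabelled rather than a guessed label, which is exactly what makes the output trustworthy for testing.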

Cheers,

Joseph

Mujtaba Merchant

unread,
Oct 24, 2025, 8:10:43 AM
to nvda-...@nvaccess.org

Sorry to jump in - this is an interesting topic, and I encounter this kind of situation almost daily while doing user testing with NVDA on many websites. I do not know how valuable my contribution is to the topic being discussed, but looking up "Get image descriptions on Chrome" might help you.

 

I am not sure whether this is applicable to other browsers too, but the chances are good with Edge and Chrome.

 

Sincerely,

 

Mujtaba Merchant

Bangalore | INDIA

Mail: mujt...@gmail.com

Website: The Somebody, Nobody, Anybody & Everybody Blog!

Sent from Outlook ® for Windows 10

James AUSTIN

unread,
Oct 24, 2025, 9:30:24 AM
to nvda-...@nvaccess.org, nvda-...@nvaccess.org
Hi Mujtaba

There’s no need to apologise. Of course your contribution is valuable. Thanks for the tip; I will check this out. Where I work, testing takes place in Chrome 99% of the time, so I am sure this will be extremely useful.

Thank you.

Warmest wishes,

JAMES 





Sent from my iPhone

On 24 Oct 2025, at 1:10 pm, Mujtaba Merchant <mujt...@gmail.com> wrote:



James AUSTIN

unread,
Oct 24, 2025, 9:48:41 AM
to nvda-...@nvaccess.org, nvda-...@nvaccess.org
Hi Joseph,

Thank you for the clarification on how NVDA gets its information. Really interesting to know.

Warmest wishes,

JAMES.
Sent from my iPhone

On 24 Oct 2025, at 12:52 pm, joseph....@gmail.com wrote:

