Displaying the text from NVDA with a larger font size


Kostadin Kolev

Aug 28, 2025, 5:47:54 AM
to NVDA Screen Reader Discussion

Hello all,

I'll be doing a presentation with NVDA a few weeks from now.

What I need NVDA to do is display the spoken text.

The Speech Viewer is a solution, but not a perfect one, because the text in it is not very large, and at my monitor resolution of 2560x1440 it's even smaller. And as far as I know, there is no way to increase the text size in the NVDA Speech Viewer.

So, is there another way to display the text that NVDA is speaking? TalkBack on Android can display it as captions. The same can be done with VoiceOver on iOS and macOS. And the captions from VoiceOver on macOS are large and easy to see. Isn't there an add-on for NVDA that can do the same?

Thanks in advance.

____________
Best wishes,
Kostadin Kolev


Quentin Christensen

Aug 28, 2025, 7:01:19 AM
to nvda-...@nvaccess.org
I'm not sure of an add-on - I'll be keen to see if there is though.  But from the NV Access side, we're aware that there are a couple of visual limitations with things like speech viewer - lack of ability to adjust font size and lack of light / dark mode support being the main two which spring to mind.
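(As a rough idea of what such an add-on could look like: the sketch below wraps NVDA's speak function and mirrors the spoken strings into a read-only window with a large font. It assumes NVDA's globalPluginHandler, speech and bundled wx modules; the exact attribute to wrap and the speak() signature vary between NVDA versions - newer builds route speech through speech.speech.speak - so treat it as a starting point rather than a finished add-on.)

# Sketch of an NVDA global plugin that mirrors spoken text into a window
# with a large font. The speak-wrapping is the same idea used by
# speech-capture add-ons, but the attribute to wrap and the speak()
# signature differ between NVDA versions (newer builds route speech
# through speech.speech.speak), so treat this as a starting point only.
import globalPluginHandler
import speech
import wx

class GlobalPlugin(globalPluginHandler.GlobalPlugin):
    def __init__(self):
        super().__init__()
        # A plain read-only text window using a 36 point font (adjust to taste).
        self._frame = wx.Frame(None, title="Large speech viewer")
        self._text = wx.TextCtrl(self._frame, style=wx.TE_MULTILINE | wx.TE_READONLY)
        self._text.SetFont(wx.Font(wx.FontInfo(36)))
        self._frame.Show()
        # Wrap the speak function so every spoken sequence is also displayed.
        self._origSpeak = speech.speak
        speech.speak = self._speakAndShow

    def _speakAndShow(self, sequence, *args, **kwargs):
        # Speech sequences mix plain strings with command objects; keep the strings.
        parts = [item for item in sequence if isinstance(item, str)]
        if parts:
            wx.CallAfter(self._text.AppendText, " ".join(parts) + "\n")
        return self._origSpeak(sequence, *args, **kwargs)

    def terminate(self):
        # Put the original speak function back and close the window.
        speech.speak = self._origSpeak
        self._frame.Destroy()
        super().terminate()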

In the meantime, I'm not sure if it's the easiest solution for a presentation, but in general what you can do is adjust the Windows "Display" settings (search for "display" and open the settings from there) - there is a "Scale" option.  It defaults to 100%, which broadly means that a letter in a 12 point font is about 12 pixels high.  You can bump that number up - how high depends on your graphics card and monitor; on mine I can go to 225%, which seems to be about as large as Windows is happy to go without starting to cause problems (and even then it sometimes causes problems with things not displaying on screen properly).  For most things, though, that does make the fonts, icons and everything else larger.  The other option you've got is "Text size" or "Make text bigger" (search for either) - a slider which lets you change just the size of text without changing graphics.  Similarly, I find that taking that up to its upper limit often causes problems with some programs, so somewhere in between works well.
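(If you would rather script the "Make text bigger" setting before the presentation than dig through Settings, the sketch below sets the value that, to the best of my knowledge, backs that slider: the TextScaleFactor DWORD under HKCU\Software\Microsoft\Accessibility. Treat the key name and the 100-225 range as assumptions to verify; some applications only pick the new size up after they are restarted.)

# Hedged sketch: set the Windows "Make text bigger" slider from a script.
# Assumption: the slider is backed by the TextScaleFactor DWORD (a percentage,
# roughly 100-225) under HKCU\Software\Microsoft\Accessibility. Some programs
# only pick the new size up after they are restarted.
import winreg

def set_text_scale(percent: int) -> None:
    percent = max(100, min(225, percent))  # clamp to the range the slider offers
    key = winreg.CreateKey(winreg.HKEY_CURRENT_USER, r"Software\Microsoft\Accessibility")
    try:
        winreg.SetValueEx(key, "TextScaleFactor", 0, winreg.REG_DWORD, percent)
    finally:
        winreg.CloseKey(key)

set_text_scale(150)  # for example, bump text to 150% before the presentation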

Kind regards

Quentin



--

Quentin Christensen
Training and Support Manager

NV Access

Subscribe to email updates (blog, new versions, etc): https://eepurl.com/iuVyjo

Mike B

Aug 28, 2025, 8:45:03 AM
to nvda-...@nvaccess.org
You can try lowering your screen resolution; article below.
 
Simple explanation:
Screen resolution is usually quoted as width × height, with the units in pixels: for example, "1024 × 768" means the width is 1024 pixels and the height is 768 pixels.
This example would normally be spoken as "ten twenty-four by seven sixty-eight" or "ten twenty-four by seven six eight".
 
 
 
Screen resolution? Aspect ratio? What do 720p, 1080p, 1440p, 4K and 8K mean?
Tutorial by Codrut Neagu, published on 05/20/2016
Screen resolution
 
 In days gone by, screen resolution (also called display resolution) wasn’t much of an issue. Windows came with a few preset options and to get higher
resolution or more colors (or both) you would install a driver for your video card. As time went on, you could choose better video cards and better monitors
as well. Today we have lots of options when it comes to displays, their quality and the supported resolutions. In this article I would like to take you
through a bit of history and explain all the important concepts, including common acronyms like 1080p or 4K.
 
 It all started with IBM & CGA
 
The color graphics technology was first developed by IBM: CGA (color graphics adapter) came first, followed by EGA (enhanced graphics adapter) and VGA (video graphics array). Regardless of the capability of your monitor, you'd still have to choose from one of the few options available through your graphics card's drivers. For the sake of nostalgia, here's a look at a once well-known CGA display.
 
[image: a once well-known CGA display]
 
 With the advent of high definition video and the increased popularity of the 16:9 aspect ratio (we’ll explain more about aspect ratios in a bit) selecting
a screen resolution is not the simple affair it once was. However, this also means that there are a lot more options to choose from, with something to
suit almost everyone’s preferences. Let’s look at what today’s terminology is, and what it means:
 
 The screen is what by what?
 
 I am sure some of you already know that the term "resolution" isn’t correct when it’s used to refer to the number of pixels on a screen. That says nothing
about how densely the pixels are clustered. "Resolution" is technically the number of pixels per unit of area, rather than the total number of pixels.
Here, we’ll be using the term as it’s commonly understood, rather than the absolutely technologically correct usage.
 
 Since the beginning, resolution has been described (accurately or not) by the number of pixels arranged horizontally and vertically on a monitor, for
example 640 x 480 = 307200 pixels. The choices available were determined by the capability of the video card, and they differed from manufacturer to manufacturer.
 

 The resolutions built into Windows were very limited, so if you didn’t have the driver for your video card you’d be stuck with the lower-resolution screen
that Windows provided. If you’ve watched Windows Setup or installed a newer version of a video driver, you may have seen the 640 x 480 low resolution screen
for a moment or two. It was ugly even on CGA screens, but that was the Windows default.
 
 As monitor quality improved, Windows began offering a few more built-in options, but the burden was still mostly on the graphics card manufacturers, especially
if you wanted a really high resolution display. The more recent versions of Windows can detect the default screen resolution for your monitor and graphics
card and adjust accordingly. This doesn’t mean that what Windows chooses is always the best option, but it will work, and you can change it if you wish,
after you see what it looks like. If you need guidance on doing that, check this tutorial:
Change your display's screen resolution and make text and icons bigger.
 
 
 Mind your P’s and I’s
 
 You may have seen the screen resolution described as something like 720p or 1080i. What does that mean?
 
To begin with, the letters tell you how the picture is "painted" on the monitor. A "p" stands for progressive, and an "i" stands for interlaced.
 
The interlaced scan is a holdover from television and from early CRT monitors. The monitor or TV screen has lines of pixels arranged horizontally across it. The lines were fairly easy to see if you got up close to an older monitor or TV, but nowadays the pixels on the screen are so small that they are very hard to see even with magnification. The monitor's electronics "paint" each screen line by line, too quickly for the eye to see. An interlaced display paints all the odd lines first, then all the even lines.
Since the screen is being painted in alternate lines, flicker has always been a problem with interlaced scans. Manufacturers have tried to overcome this problem in various ways. The most common way is to increase the number of times a complete screen is painted in a second, which is called the refresh rate. The most common refresh rate was 60 times per second, which was acceptable for most people, but it could be pushed a bit higher to get rid of the flicker that some people perceived.
 
As people moved away from the older CRT displays, the terminology changed from refresh rate to frame rate, because of the difference in the way an LED monitor works. The frame rate is the speed with which the monitor displays each separate frame of data. The most recent versions of Windows set the frame rate at 60 Hertz, or 60 cycles per second, and LED screens do not flicker. The system also changed from interlaced scan to progressive scan because the new digital displays were so much faster. In a progressive scan, the lines are painted on the screen in sequence rather than first the odd lines and then the even lines. To translate: 1080p, for example, is used for displays characterized by 1080 horizontal lines of vertical resolution and a progressive scan.
 
 There’s a rather eye-boggling illustration of the differences between progressive and interlaced scans on Wikipedia here:
Progressive scan.
For another interesting history lesson, also read
Interlaced video.
 
 What about the numbers: 720p, 1080p, 1440p, 4K and 8K?
 
 When high-definition TVs became the norm, manufacturers developed a shorthand to explain their display resolution. The most common numbers you will see
are 720p, 1080p and 2160p or 4K. As we’ve seen, the "p" and "i" tell you whether it’s a progressive-scan or interlaced-scan display. And these shorthand
numbers are sometimes used to describe computer monitors as well, even though in general a monitor is capable of a higher-definition display than a TV.
The number always refers to the number of horizontal lines on the display.
 
 Here’s how the shorthand translates:
 
•  720p = 1280 x 720 - usually known as HD or "HD Ready" resolution.
•  1080p = 1920 x 1080 - usually known as FHD or "Full HD" resolution.
•  1440p = 2560 x 1440 - commonly known as QHD or Quad HD resolution, and typically seen on gaming monitors and on high-end smartphones. 1440p is four times the resolution of 720p HD or "HD Ready".
•  2160p = 3840 x 2160 - commonly known as 4K, UHD or Ultra HD resolution. It's a very large display resolution and it's found on high-end TVs and monitors. 2160p is called 4K because it offers four times the resolution of 1080p FHD or "Full HD".
•  4320p = 7680 x 4320 - known as 8K, it offers 16 times more pixels than the regular 1080p FHD or "Full HD" resolution. Although you're not going to see TVs or computer monitors with this resolution too soon, you can test whether your computer can render such a large amount of data with an 8K video sample.
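(The "four times" and "16 times" figures in the list above are just pixel-count arithmetic; a quick sketch to check them:)

# Pixel-count arithmetic behind the "four times" and "16 times" claims above.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
print(pixels["1440p"] / pixels["720p"])   # 4.0  - 1440p has 4x the pixels of 720p
print(pixels["4K"] / pixels["1080p"])     # 4.0  - 4K has 4x the pixels of 1080p
print(pixels["8K"] / pixels["1080p"])     # 16.0 - 8K has 16x the pixels of 1080p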
 
 What is the Aspect Ratio?
 
 At the beginning we mentioned the term aspect ratio. This was originally used in motion pictures, indicating how wide the picture was in relation to its
height. Movies were originally in 4:3 aspect ratio, and this carried over into television and early computer displays. Motion picture aspect ratio changed
much more quickly to a wider screen, which meant that when movies were shown on TV they had to be cropped or the image manipulated in other ways to fit
the TV screen.
 
 As display technology improved, TV and monitor manufacturers began to move toward widescreen displays as well. Originally "widescreen" referred to anything
wider than the common 4:3 display, but it quickly came to mean a 16:10 ratio and later 16:9. Nowadays, nearly all computer monitors and TVs are only available
in widescreen, and TV broadcasts and web pages have adapted to match.
 
Until 2010, 16:10 was the most popular aspect ratio for widescreen computer displays. But with the rise in popularity of high-definition televisions, which used high-definition resolutions such as 720p and 1080p and made these terms synonymous with high definition, 16:9 has become the high-definition standard aspect ratio. Today, finding 16:10 displays is almost impossible.
 
 Depending on the aspect ratio of your display, you are able to use only resolutions that are specific to its width and height. Some of the most common
resolutions that can be used for each aspect ratio are the following:
 
•  4:3 aspect ratio resolutions: 640×480, 800×600, 960×720, 1024×768, 1280×960, 1400×1050, 1440×1080, 1600×1200, 1856×1392, 1920×1440, and 2048×1536.
•  16:10 aspect ratio resolutions: 1280×800, 1440×900, 1680×1050, 1920×1200 and 2560×1600.
•  16:9 aspect ratio resolutions: 1024×576, 1152×648, 1280×720, 1366×768, 1600×900, 1920×1080, 2560×1440 and 3840×2160.
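(If you want to check which family a given mode in the list above belongs to, reducing width:height by their greatest common divisor recovers the aspect ratio; a small sketch:)

# Reduce width:height to its simplest form to recover the aspect ratio.
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect_ratio(1024, 768))   # 4:3
print(aspect_ratio(2560, 1440))  # 16:9
print(aspect_ratio(1920, 1200))  # 8:5, which is the reduced form of 16:10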
 
 How does the size of the screen affect resolution?
 
Although a 4:3 TV's display can be adjusted to show black bars at the top and bottom of the screen while a widescreen movie or show is being displayed, this doesn't make sense with a monitor, so on a 4:3 monitor you'll find that Windows will not even offer you a widescreen resolution as a choice. You can still watch movies with black bars, as you would on a TV screen, but that letterboxing is done by your media player.
 
 The most important thing is not the monitor size, but its ability to display the higher resolution images. The higher you set the resolution, the smaller
the images on the screen will be, and there comes a point at which the text on the screen becomes so small it’s not readable. On a larger monitor it is
possible to push the resolution very high indeed, but if that monitor’s pixel density is not up to par, you won’t get the maximum possible resolution before
the image becomes unreadable. In many cases the monitor will not display anything at all, if you tell Windows to use a resolution the monitor cannot handle.
In other words, don’t expect miracles out of a cheap monitor. When it comes to high-definition displays, you definitely get what you pay for.
 
 Conclusion
 
If you are not very technical, it is very likely that you are confused by so many technicalities. Hopefully this article has managed to help in your understanding of the most important characteristics of a display: aspect ratio, resolution and type.
[end of quoted article]

Take care.  Mike.  Sent from my iBarstool.  Go Dodgers!

Quentin Christensen

Aug 28, 2025, 9:15:58 PM
to nvda-...@nvaccess.org
While that article gives a good explanation of the terms, I would strongly recommend changing your scale (at one point Windows called it "Make everything bigger" - highlighting its main use) rather than lowering your resolution.

At the default 100% scale, a 12 point letter is 12 pixels high.  If you halve the resolution, then that letter becomes twice as high, but still only using 12 squares.  That works fine for I or T - but starts to become problematic for rounder letters like O or S.

Instead of lowering the resolution, if you increase the scale, then your letter still becomes twice as high BUT with awareness of the actual pixel size - so your 12 point letter now actually uses 24 pixels to make itself twice as high. Which means your O and your S have twice as many pixels to make those rounded edges (think of making a letter using square Lego bricks: lowering the resolution is like using larger Duplo blocks, while increasing the scale is like using the normal small bricks, but twice as many).

Beyond that, many programs simply won't work if you lower your resolution too far, but they can compensate for a higher scale (to a point, after a few hundred percent it still becomes problematic).
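(Putting Quentin's numbers into a tiny sketch, using his simplified "a 12 point letter is about 12 pixels tall at 100% scale" figure:)

# Rows of real pixel detail available to a letter that appears twice as tall.
glyph_rows_at_100_percent = 12      # simplified: a 12 point letter ~ 12 pixels tall

# Halving the resolution: the letter looks twice as tall, but it is still
# drawn from only 12 rows of detail, each row just shown twice as large.
rows_resolution_halved = glyph_rows_at_100_percent

# Doubling the scale: the letter is re-rendered at twice the height in real
# pixels, so curves like O and S get twice as many rows to work with.
rows_scale_doubled = glyph_rows_at_100_percent * 2

print(rows_resolution_halved, rows_scale_doubled)  # 12 24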
