When I draw a GDI line on a DirectX surface, I find the quality is not
as good as drawing a GDI line to a device context that isn't sitting on
a DirectX surface. Why is this? Is it a trade-off, with DirectX giving
me better performance but poorer line quality, or is there some setting
I can use to make it comparable with GDI?
Alternatively, is it something to do with my graphics card perhaps?
Anyone got suggestions/answers?
Tony
listechtony <liste...@discussions.microsoft.com> spake the secret code
<A77F2481-53B6-4BBC...@microsoft.com> thusly:
>Anyone got suggestions/answers?
To get the best quality lines in D3D, you need to make sure your gamma
is configured properly. Otherwise the antialiasing won't look quite
right. For more details on this, see Steve Smith's excellent
presentation from GameFest 2007:
<http://legalizeadulthood.wordpress.com/2007/08/28/gamefest-2007-picture-perfect-gamma-through-the-rendering-pipeline/>
Also, with D3D you have to make sure your back buffer matches the size
of your window, or when you call Present you'll get a StretchBlt which
will just introduce more aliasing.
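A sketch of keeping the back buffer matched to the window on resize,
assuming a D3D9 device; the handler name and the idea of keeping the
creation-time D3DPRESENT_PARAMETERS around are my own conventions, not
anything prescribed by the SDK:

```c
#include <d3d9.h>

/* Hypothetical WM_SIZE handler: resize the back buffer to the client
   area so Present() is a 1:1 copy instead of a filtered StretchBlt.
   Real code must release D3DPOOL_DEFAULT resources before Reset and
   recreate them afterwards. */
void OnResize(HWND hwnd, IDirect3DDevice9 *device,
              D3DPRESENT_PARAMETERS *pp)
{
    RECT rc;
    GetClientRect(hwnd, &rc);
    pp->BackBufferWidth  = rc.right - rc.left;
    pp->BackBufferHeight = rc.bottom - rc.top;
    IDirect3DDevice9_Reset(device, pp);
}
```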
Finally, you're more likely to get the best quality lines by drawing a
textured quad (which is what D3DX does in the most recent SDK) instead
of trying to draw line primitives with antialiasing enabled. When you
draw line primitives with antialiasing enabled, you're relying on the
IHV to provide you with good line quality. When you draw a textured
quad, you're relying on your texture and the gamma ramp to provide you
with good line quality. The latter you can control, while the former
you cannot.
--
"The Direct3D Graphics Pipeline" -- DirectX 9 draft available for download
<http://legalizeadulthood.wordpress.com/the-direct3d-graphics-pipeline/>
Legalize Adulthood! <http://legalizeadulthood.wordpress.com>