DirectX Renderer


Maria

Aug 5, 2024, 10:21:49 AM
to blasorteli
I am creating a game engine for fun and learning. I have made an engine before with a DirectX 11 renderer; this time I want to implement a DirectX 12 renderer. I am using Frank Luna's Introduction to 3D Game Programming with DirectX 12 to learn DX12, and I am trying to improve how rendering works. Currently my renderer works like this:

Is this approach okay? It doesn't feel right. In my DirectX 11 renderer, the model class has the render code directly inside it, but I want to make my DX12 renderer more abstract. (It's probably better to learn DX12 before attempting abstraction, but where's the fun in that?) Are there any good examples that can help me? Most of the examples I see only allow meshes to be created and passed to the renderer before the game loop.


In my DX11 forward renderer (not a PBR renderer), meshes are responsible only for binding themselves to a shader and calling the drawIndexed function. Each mesh also stores its own transformation matrix. This transformation matrix never changes, as it is defined when the model file is created (i.e. it's a model transform authored in Blender, not actually a world transform).
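The design above can be sketched roughly like this. This is a hypothetical illustration, not the poster's actual code: Matrix4, Context, and the class layout are stand-ins (a real implementation would bind D3D11 vertex/index buffers and a constant buffer):

```cpp
#include <cstdint>
#include <vector>

// Illustrative sketch of the described DX11 design: the mesh only binds
// itself and issues drawIndexed, and carries the fixed model transform
// baked in by the exporter. Context here records draws instead of
// talking to a real graphics API.
struct Matrix4 { float m[16]; };

struct Context {
    std::vector<uint32_t> draws;  // records drawIndexed calls for illustration
    void drawIndexed(uint32_t indexCount) { draws.push_back(indexCount); }
};

class Mesh {
public:
    Mesh(uint32_t indexCount, const Matrix4& modelTransform)
        : indexCount_(indexCount), modelTransform_(modelTransform) {}

    // The mesh's only rendering job: bind its buffers/constants, then draw.
    void render(Context& ctx) const {
        // (binding of vertex/index buffers and the transform cbuffer elided)
        ctx.drawIndexed(indexCount_);
    }

    const Matrix4& modelTransform() const { return modelTransform_; }

private:
    uint32_t indexCount_;
    Matrix4 modelTransform_;  // fixed at export time; not a world transform
};
```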


Other shaders have other optional cbuffers. The logic of how to bind data to these is handled in DrawableMixins. Every object in my game has a pointer to an IDrawableMixin, an object that defines a single draw method:
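A minimal sketch of what such a mixin interface might look like. IDrawableMixin is the name from the post, but the method signature, RenderContext, and the TintMixin example are assumptions for illustration:

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Sketch of the mixin idea described above. RenderContext records
// cbuffer bindings here purely for illustration.
struct RenderContext {
    std::vector<std::string> boundCBuffers;
    void bindCBuffer(const std::string& name) { boundCBuffers.push_back(name); }
    void drawIndexed(uint32_t count) { (void)count; }
};

// Every object holds a pointer to one of these; the mixin knows how to
// bind that object's optional cbuffers and issue the draw.
struct IDrawableMixin {
    virtual ~IDrawableMixin() = default;
    virtual void draw(RenderContext& ctx, uint32_t indexCount) const = 0;
};

// Hypothetical mixin for a shader that needs an extra "Tint" cbuffer.
struct TintMixin : IDrawableMixin {
    void draw(RenderContext& ctx, uint32_t indexCount) const override {
        ctx.bindCBuffer("Tint");  // the optional cbuffer this shader needs
        ctx.drawIndexed(indexCount);
    }
};
```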


By no means would I say that what I'm doing is typical, or that you should emulate me. I'm just showing an example of how I did it so you can compare it with yours. If you think my system is stupid and you hate it, that's fine. If you like some aspects of it, feel free to use those as well. My intention was to design a system where the rendering objects more closely mirror the hierarchy of objects in glTF (the model format I'm using).


I can see a major design flaw: because you have models/meshes draw themselves, there is no way to do any sorting or filtering of the draw calls. You might need to sort back-to-front for transparency, front-to-back for opaque objects to reduce overdraw, and might also want to sort by material/shader/texture to reduce state changes. You might want to only draw certain objects in certain render passes (e.g. shadow or reflection pass). You also don't seem to be doing any culling, which will hurt performance.


The solution to this is to introduce a RenderQueue class. Each model/mesh adds the necessary information to the queue for drawing itself, but doesn't actually submit anything to the graphics API. Then, the renderer can sort the queue once all draw items are collected, and finally submit the draw items in sorted order. RenderQueue can also store things like lights of different types, etc.
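A minimal sketch of the RenderQueue idea, assuming a packed sort key (pass in the high bits, then shader, then material) so that one sort groups state changes. All names here are illustrative, not from any real engine:

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Hypothetical draw item: a real one would carry buffer handles,
// constants, and depth for back-to-front / front-to-back ordering.
struct DrawItem {
    uint64_t sortKey;
    const void* mesh;  // opaque handle for illustration
};

class RenderQueue {
public:
    // Meshes add items here instead of calling the graphics API directly.
    void submit(const DrawItem& item) { items_.push_back(item); }

    // Sort once per frame after all items are collected; the renderer
    // then walks the list in order and submits the actual draw calls.
    void sort() {
        std::stable_sort(items_.begin(), items_.end(),
                         [](const DrawItem& a, const DrawItem& b) {
                             return a.sortKey < b.sortKey;
                         });
    }

    const std::vector<DrawItem>& items() const { return items_; }
    void clear() { items_.clear(); }

private:
    std::vector<DrawItem> items_;
};

// One way to pack a key: higher bits = coarser state (render pass),
// lower bits = finer state, so sorting minimizes state changes.
inline uint64_t makeSortKey(uint32_t pass, uint32_t shader, uint32_t material) {
    return (uint64_t(pass) << 48) | (uint64_t(shader) << 24) | material;
}
```

Depth can be packed into the low bits too, inverted for opaque passes (front-to-back) and not for transparent ones (back-to-front).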


You shouldn't be loading assets inside your drawing functions. That should be done upfront before starting to render a frame. Function-local statics are a bad code smell and have potential issues with thread safety and multiple instances. What happens if you need to have 2 renderers with different graphics contexts, such as in a multi-window game editor? There might be a conflict if you have a single static shader. Why can't you use a class member variable instead? That will be more flexible since the shader is not fixed at compile time.
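To make the contrast concrete, here is a sketch of the member-variable alternative. Shader and Renderer are made-up stand-ins, not D3D wrappers; the point is only that each renderer instance owns its shader state instead of sharing a hidden function-local static:

```cpp
#include <memory>
#include <string>
#include <utility>

// Hypothetical shader handle for illustration.
struct Shader {
    std::string name;
};

class Renderer {
public:
    explicit Renderer(std::shared_ptr<Shader> shader)
        : shader_(std::move(shader)) {}

    // Each renderer binds its own shader; two renderers with different
    // graphics contexts no longer collide on shared static state, and
    // the shader can be swapped at runtime.
    const Shader& currentShader() const { return *shader_; }
    void setShader(std::shared_ptr<Shader> shader) { shader_ = std::move(shader); }

private:
    std::shared_ptr<Shader> shader_;  // member, not a function-local static
};
```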


Culling is handled at the model level by comparing the AABB of each model against the frustum in an octree; it doesn't occur at the mesh level (meshes are not usually big enough to warrant their own culling mechanics)
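The per-model AABB-vs-frustum check can be sketched as below. The plane and AABB layouts are assumptions (not tied to any particular math library); the test is the standard conservative one, which may keep a borderline box but never wrongly culls:

```cpp
#include <array>
#include <cmath>

struct Plane { float nx, ny, nz, d; };           // n.p + d >= 0 means "inside"
struct AABB  { float cx, cy, cz, ex, ey, ez; };  // center + half-extents

// A box is outside if it lies fully behind any one frustum plane.
inline bool aabbInFrustum(const AABB& b, const std::array<Plane, 6>& planes) {
    for (const Plane& p : planes) {
        // Projected radius of the box onto the plane normal.
        float r = b.ex * std::fabs(p.nx) + b.ey * std::fabs(p.ny)
                + b.ez * std::fabs(p.nz);
        float dist = p.nx * b.cx + p.ny * b.cy + p.nz * b.cz + p.d;
        if (dist < -r) return false;  // fully behind this plane -> culled
    }
    return true;  // intersecting or inside -> keep
}
```

In an octree, whole nodes whose bounds fail this test can be skipped without visiting their models.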


Browsing the DirectX wiki, 8.0a (2001) was the last version to include the software renderer, so I figured VirtualBox + a fast current CPU + the software renderer = DirectX games up to 2001 (DX1-8) at possibly decent speed thanks to a present-day CPU, which is what I wanted to check out. I installed 98SE on the latest VirtualBox with the generic VESA driver and DirectX 8.0a, ready to go! Except programs refuse to install and complain about there being no 3D accelerator instead of falling back to the supposedly included software renderer. Has anyone used the software renderer before? How do I enable/force/use it? I've been googling around without success; such old information seems to have fallen off the internet.


The article references -graphics-accelerators/, which still seems to be accessible via the Google Cache, but it doesn't seem to make any reference to 8.0a having "software rendering support". Unless there's another reference, I would think that Wikipedia is inaccurate here.


Are there any good examples that show how to render the IMFSample output from the H.264 decoder? My scenario uses a 4K resolution H.264 stream and the PC that I am targeting will only accept 1080p using the DXGI buffers. But the H.264 decoder will handle 4K so I need to find a way to feed that NV12 IMFSample directly to the DirectX 11 renderer. I have already tried using the DX11VideoRenderer sample but it fails due to this particular IMFSample not having an IMFDXGIBuffer interface.


It looks like in the DX11VideoRenderer the input IMFDXGIBuffer is NV12 type and that can be rendered successfully in hardware. So it seems logical that a non-DXGI buffer of NV12 type should be acceptable too?


Perhaps I need to create an ID3D11Texture2D texture or resource with an NV12 type? I found examples of how to create a texture from a file but none for how to create a texture from a sample, which would seem to be even more useful. And if I can create an NV12 texture, how do I figure out the SysMemPitch and SysMemSlicePitch values in the D3D11_SUBRESOURCE_DATA structure for NV12?
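For the pitch question, the NV12 memory-layout arithmetic works out as below. This is a sketch under the usual assumptions (1 byte per luma sample, no row padding in the source buffer; real decoder output may have a larger, driver-chosen pitch), not verified against a specific driver:

```cpp
#include <cstdint>

// NV12: a full-resolution Y plane (width x height bytes) followed by a
// half-height plane of interleaved U,V bytes (2x2 chroma subsampling).
struct Nv12Layout {
    uint32_t rowPitch;     // bytes per row of the Y plane -> SysMemPitch
    uint32_t yPlaneSize;   // width * height
    uint32_t uvPlaneSize;  // width * height / 2 (interleaved U,V pairs)
    uint32_t totalSize;    // yPlaneSize + uvPlaneSize = width * height * 3 / 2
};

inline Nv12Layout nv12Layout(uint32_t width, uint32_t height) {
    Nv12Layout l;
    l.rowPitch    = width;               // assuming a tightly packed source
    l.yPlaneSize  = width * height;
    l.uvPlaneSize = width * height / 2;  // half height, same row pitch
    l.totalSize   = l.yPlaneSize + l.uvPlaneSize;
    return l;
}
```

With a tightly packed source buffer, SysMemPitch would be rowPitch and the UV rows are expected immediately after the Y plane at that same pitch; SysMemSlicePitch only matters for 3D textures, though setting it to totalSize is harmless.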


I was able to find a complete example that renders an NV12 sample to the screen. Although there are some simple stride-calculation errors in how it renders its own example image, the actual rendering code does work correctly. It appears to be an old Microsoft sample that I cannot find any other information about.


Hello everyone,

The latest versions of the DirectX 11 renderer were removed from ModDB.

The reason is reportedly the embedded PayPal link, which hadn't caused any trouble for the last 3 years. But I think that maybe Epic Games wants to erase Unreal Tournament 99 and Unreal from players' memory completely and suggested removing the renderer from the website (it was the most downloaded UT mod on ModDB), as they cancelled sales of their games three months ago. I suggest everyone grab the latest version from gamepressure before it disappears from the internet completely. On the other hand, if some player didn't like free content and reported to the moderators that the PayPal link might be against ModDB rules, then I'm sorry for that.

Making this mod cost me a lot of effort: learning the DirectX 11 API, solving non-standard programming problems, bypassing engine limitations, programming difficult shaders, etc. This wasn't about modding the games with tools released by the developers, or upscaling textures with one-click tools, as most mods on ModDB are made. It was writing in C++ a new part of the game that the developers 20 years ago could only dream about. I made it free so everyone could enjoy playing these old games with modern graphics. I often read comments complaining about different things with a bitter taste and sorrow; the positive comments were a minority. If I worked on this again, I would upload my work directly to Patreon.


Possible fix for "initialization failed": make sure you have installed the Microsoft Visual Studio 2010 Redistributable x86 (vcredist_x86.exe).

Possible fix for micro stuttering: increase the number of prerendered frames or adjust the 'Max Frame Rate' setting to your desired FPS for a given game.

ReShade in DeusEx, Undying: disable Antialiasing; enable "copy depth buffers before clear" and check the 2nd or 3rd clear.


version 1.6.1 hotfix

1.Fixed screen freezing on some machines when SSR=False and Tessellation=True.

2.Fixed some skybox decorations being visible through world geometry (mostly in custom mods).

3.Fixed transparent object stretching in DeusEx during cutscenes.

4.Fixed some water not having ssrs (like in DOM-WolfsBay).

5.Added new option SSRNonTransparentWater: when set to true, all water with SSR=True becomes non-transparent and reflective.

6.ASSAOContrast can be lowered to -0.5 to further lower ASSAO.

7.Updated tessellation.cfg for Unreal Tournament to cover all decals on NW3UltraGore mod.

8.Normal map panning in small water pools has been accelerated.


version 1.4:

1.Added support for Star Trek: Klingon Honor Guard

2.Added support for Harry Potter and the Philosopher's Stone

3.Fixed a bug in DeusEx with CCTV camera interface not displaying properly

4.Fixed a bug in DeusEx with ASSAO where characters had black shadows on their faces.

5.Added new option SSRModelIntensity to customize reflections affecting models only.

6.Fixed a bug in Unreal Gold where maps with fullscreen fog maps displayed no textures.

7.Potentially fixed a memory leak causing lens flare corruption and crashes in drawTile().

8.Added option FastAltTab for players who frequently switch between the game and the desktop.

Unfortunately, turning this option on via the preferences menu causes a black screen bug and requires you to restart the game.

You will also get this bug if you press ALT+TAB and then change any settings in the preferences.

9.Fixed tessellation causing flickering screen in WOTS mod for UG and UT.


version 1.3:

1.New vastly improved ray-tracing algorithm for SSR. Now reflections also affect far distances from reflective surfaces.

Turned on reflections on models like weapons and non-transparent water bodies.

2.Added new option SSRIntensity to customize reflections intensity.

3.Fixed a bug in DeusEx where SSR and ASSAO worked only during cutscenes.

4.Fixed a bug where, with Tessellation disabled, decals like blood appeared pixelated.

5.Unbound process affinity from the 1st core to an arbitrary one.

6.Fixed a bug with screen brightness when wallmark is being drawn.


version 1.2:

1.New ambient occlusion algorithm ASSAO (on highest quality preset) instead of the old HBAO. It is simply faster and much better.

More information here: Software.intel.com

2.Added full support for patch 227i for Unreal Gold.

Almost everything works here except for high-res shadow support. You can also use the UnrealHD 2.1 skins mod from Lightning Hunter with tessellation enabled.

3.Added a Screen Space Reflection shader, which is disabled by default because too many surfaces reflect light and the reflections sometimes get distorted.

It is still better than using ReShade or some post-processing program. You can enable it by changing SSR=True in the preferences menu. It will be improved in the future.

4.Added HDRFilmicTonemapping, which produces better contrast/color warmth in HDR using a complex interpolation polynomial instead of the simple Reinhard approximation.

5.New option to give brighter-looking environments and skyboxes.

I recommend defaulting HDRLuminance back to 0.5 and HDRBloom to 0.2, and enabling HDRFilmicT, to get the best image quality, then experimenting from there.

Or you can go back to the version 1.1 HDR look via the HDRFilmicT setting.

6.New option FullMeshLOD. It is equivalent to typing "mlmode 0" in the console: it disables mesh level of detail, so meshes are drawn with full detail even at large distances.

It may have negative impact on performance and can get rid of some tessellation artifacts.

7.TessellateOnlyInCFG: tessellation only affects meshes listed in the .cfg file; other meshes won't get tessellated. I recommend disabling it unless you play on a heavily customized map with many bad-looking models.

8.The tessellation.cfg file got updated to support mods like OperationNaPali, ChaosUT, and NaliWeapons. Weapon models also have fewer "mesh morphing" artifacts.

9.Fixed the display of decals like bullet holes and blood splats, bleeding skyboxes, ambient-occlusion-bugged coronas and explosions, and many more.
