1440p Video Converter


Imke

Aug 5, 2024, 4:45:45 AM8/5/24
to breaksectiper
Like many of you on this forum, I have a collection of vintage PC hardware that I like to use and tinker with incessantly. Last year I bought an AGON AOC 1440p 144Hz monitor for my main rig, driven by my two 1080Tis in SLI. I made sure that it supported not only the latest standards but also VGA, so that I could connect my vintage DOS/Win98/WinXP era hardware natively, without adapters or dongles. The monitor has 2 DP, 2 HDMI and a VGA port. So far so good on the VGA side, as long as I stay at or below the 1920x1080 maximum the monitor supports in that mode.

Before I describe my issue, I should say that my target is 1440p 60Hz on any GPU capable of it. The problem is that I have a bunch of cool cards from the 2005 to 2012 era that only have DVI ports. Some of the newer ones have one HDMI or mini-HDMI port, but it gets disabled in Quad SLI or 4-Way Crossfire modes. To solve the problem I bought one of those passive DVI to HDMI cables, and at first glance it seemed to work, but it doesn't, not fully. It tops out at 1080p 60Hz, and will only do 1440p 30Hz after forcing a custom resolution. By default it shows 1080p 60Hz.


Now to the thing that completely baffles me: there is one instance where the cable does work with a 1440p 60Hz display, and it's only with one of my GTX 295s. Installing the drivers autodetects the monitor's native resolution and displays it straight away. No custom resolutions, no tinkering, it just works. I have 3 GTX 295s, and the other two display nothing when presented with this resolution. Mind you, it is the default resolution, so simply installing the drivers on either of those two cards nets you a black screen and a "no signal" message shortly afterwards. To use either of those cards with the adapter at all, I have to install the drivers on the one that works, downgrade to 1080p, shut down, and finally connect one of the other cards.


There is one other instance where I got this to work, though, by forcing a custom resolution with the CVT - Reduced Blank option selected: my two GTX 590s. Nothing works with any of my other cards. Here's a list of the behaviour with different cards when attempting 1440p 60Hz:


That's all I've tested for now, which seems enough. This situation leads me to believe that the cable might work, but I sincerely have no clue as to why it only does so in a single model of GTX 295. I've tried numerous driver versions, and it's always the same behaviour.


I think I've been able to pinpoint the problem: the Single Link DVI 165 MHz pixel clock limit. I tested this by forcing custom resolutions with Custom Resolution Utility 1.4.2. Anything beyond a 165 MHz pixel clock makes the image blurry and text unreadable. The connector is Dual Link, but I've since learned that half of the pins are dummies: HDMI is always Single Link, and those extra pins are not connected to anything.
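The arithmetic lines up with the symptoms above. A minimal sketch (the 1080p totals are the standard CEA-861 timing; the 1440p totals are the usual CVT Reduced Blanking figures, stated here as assumptions rather than measurements):

```python
# Pixel clock = horizontal total x vertical total x refresh rate.
# 1080p60 and 1440p30 fit under the 165 MHz single-link DVI limit;
# 1440p60 does not, even with CVT Reduced Blanking.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a mode with the given blanking totals."""
    return h_total * v_total * refresh_hz / 1e6

SINGLE_LINK_LIMIT_MHZ = 165.0  # DVI single-link TMDS maximum

modes = {
    "1920x1080 @ 60 Hz (CEA-861)": (2200, 1125, 60),
    "2560x1440 @ 30 Hz (CVT-RB)":  (2720, 1481, 30),
    "2560x1440 @ 60 Hz (CVT-RB)":  (2720, 1481, 60),
}

for name, timing in modes.items():
    clk = pixel_clock_mhz(*timing)
    verdict = "fits" if clk <= SINGLE_LINK_LIMIT_MHZ else "exceeds single link"
    print(f"{name}: {clk:.1f} MHz ({verdict})")
```

That puts 1080p60 at 148.5 MHz and 1440p30 at roughly 121 MHz, both under the limit, while 1440p60 needs about 242 MHz even with reduced blanking, which only HDMI's higher TMDS clock (up to 340 MHz from HDMI 1.3 onward) can carry over the same four signal pairs.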




It's possible that that particular card has circuitry that knows how to output a high-clock HDMI signal on the single-link DVI connector pins, whereas the others do not. The card detects the monitor's capabilities during the handshake and switches on the fly to the required signal type. It's a similar idea to outputting an HDMI signal via the DisplayPort++ connector or an analog RGB (VGA) on the DVI-A connector. At least that's my guess.
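The "handshake" here is the sink presenting its EDID block to the source. As a hedged illustration of what the card reads, here is a minimal sketch that pulls the preferred mode's pixel clock out of a raw EDID blob, using the fixed VESA EDID layout (first 18-byte detailed timing descriptor at offset 54; the sysfs path mentioned in the comment is an assumption about a Linux setup):

```python
# Sketch: read the preferred-mode pixel clock from a raw 128-byte EDID block.
# On Linux the raw EDID is typically exposed at /sys/class/drm/card*-*/edid.

EDID_HEADER = b"\x00\xff\xff\xff\xff\xff\xff\x00"  # fixed 8-byte magic

def preferred_mode_pixel_clock_mhz(edid: bytes) -> float:
    """Pixel clock (MHz) of the first detailed timing descriptor."""
    if edid[:8] != EDID_HEADER:
        raise ValueError("not an EDID block")
    dtd = edid[54:72]                   # first 18-byte timing descriptor
    clk_10khz = dtd[0] | (dtd[1] << 8)  # little-endian, units of 10 kHz
    return clk_10khz / 100.0
```

If a GPU's output stage honours this value and can drive its TMDS links that fast in HDMI mode, it just works; if the driver clamps the output at the 165 MHz single-link limit regardless, you get the black screen described above.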


Thank you for your prompt and informative response. Your take on the Gigabyte GTX 295's "mysterious" capabilities makes a lot of sense; my best guess would have been a VBIOS version (or witchcraft), but yours is definitely better. It makes me wish some of the other cards could do the same thing, though, as it would make matters much easier. If you'll allow me to request a little more of your time, would you happen to know any vendors or manufacturers that sell those kinds of active adapters? The only things I can find are the reverse converters (DP or HDMI to DVI) and terribly misleading (or outright lying) ads for passive cables or dongles like the one I have. Amazon is especially riddled with the latter, and they always come up first when googling.


I still find it odd that the other two GTX 295 cards don't default to 1080p when first installing the drivers, but instead try to display 1440p and fail miserably. I mean, if I hadn't been lucky enough to have the Gigabyte one, I wouldn't be able to use those cards at all. None of the other cards I've tested behave that way: they too try to do 1440p, and it's blurry and unreadable, but they don't just give up.


edit:

I see what you mean by the Amazon ratings... that is weird, though; there should be no difference between the digital output on a DVI-I and a DVI-D port.

Maybe they tried using it on cards that don't support 1440p?


It's most likely a typo - they either meant to write DVI-A or don't know the differences between DVI-I and DVI-A. In any case, I think their meaning is clear - the adapter can convert a digital signal, not an analog one.


Ironically, in my case it did work just well enough on the *digital* signal from a DVI-I port to make me think there was a software issue: it did allow a 2560x1440 monitor to run at 1080p (from a video card capable of 2560x1600). An email to Visiontek's tech support went unanswered for several days, which led to many hours of troubleshooting, trying to determine whether different video card driver software was needed to get the monitor's native 2560x1440 resolution (without any success). I finally got the answer from Visiontek: this cable does not work correctly with a DVI-I *digital* signal either.


Which sounds exactly like the problem I'm having with the passive cable. Is there even such a thing as an "active cable"? Don't active converters have a separate cable that draws power from a USB port or directly from the wall? This one just looks like the cable I bought from Amazon Basics, but 5 times more expensive. Now that I think of it, I don't believe I've tried my cable on a GPU with a DVI-D connector. I'm going to test it with a 780Ti I have lying around and report back.


I know that is what it says... but this should not be possible unless they really messed something up in the design process and somehow made it so that anything on pin 8 of the connector breaks the device. That is the only pin in the DVI-D part of the connector that carries an analogue signal... but even then, there should be nothing on it when the card is set to digital output.


But now that I think of it... I remember having a weird issue with an LCD before that would not work in digital mode over a DVI-I cable. It probably had some detection method that recognized a DVI-I connection and did not switch to digital. Why they would build a DVI to DP adapter like that is beyond me; in any case, I think this is a mess-up on their part and they don't want to admit it.


OK, so I've tested the 780Ti and... it works out of the box. It works using both the DVI-D port and the DVI-I port, so now I'm really confused. If it were a port incompatibility, it should not work when connected to the DVI-I port. If it were a bandwidth issue, it should never be able to output 1440p 60Hz, but it's happy to do it. Damn, this shouldn't be so difficult, but I'm completely puzzled. Another "special" card like the Gigabyte GTX 295, or are we missing something here?


I've been conducting further testing for most of the afternoon and I've made some progress. At first I was trying to make the conversion work with an HD 3870 X2. No success there. Then I remembered that during my countless searches I had stumbled upon two little patchers for AMD and Nvidia drivers that claimed to remove the pixel clock limit. It seemed like a new line of investigation. So I plugged in one of my HD 7990s (DVI-I) and got to work. Default behaviour was as expected: 1080p 60Hz maximum resolution and the inability to show 1440p 60Hz correctly. But after running the patcher and rebooting, voilà, 1440p 60Hz by default with a flawless, crisp image!


So I think this rules out the cable. With the 7990, it's now doing 1440p in 4 different configurations: one GTX 295, both my GTX 590s, my 780Ti and both my 7990s. That leads me to believe that the passive cable can actually do as advertised just fine, and that we might be dealing with a software issue. Now, there are a few caveats here. Firstly, these patchers only work on HD 5000 series or newer cards from AMD, or GTX 400 series or newer from Nvidia. Secondly, I only have a limited selection of cards fitting those requirements, so I cannot confirm that this works for every card or combination of cards. And lastly, there are some inconsistencies in the original behaviour of the cards, and we still have one card, the Gigabyte GTX 295, that works out of the box when it shouldn't if this were a software-only issue.


The ideal solution is still a DVI-I to HDMI or DP 1440p adapter/converter/scaler. Were that possible, there would be a consistent, out-of-the-box way of achieving 1440p across all GPUs from the pre-HDMI 2.0 and DP era. This workaround will at most allow someone with a setup like mine to use a small selection of generations: GTX 400, 500 and 600 series from Nvidia, and HD 5000, 6000 and 7000 from AMD. Anything newer than that has ample support for newer standards; anything older is not supported by the patch. I've tried the patch on the 3870 X2s and it doesn't work. The selection could be even more constrained, as both my 7990s sport 4 mini-DP connectors and I'm just using the DVI port for convenience, to avoid juggling 700 different cables.
