Video Test 1440p Vs 1080p


Solveig Lichtenberg

Dec 22, 2023, 6:24:07 PM
to ConhecimentoProfissionaldeTI

I think you can run games at 1440p even though it's a 1080p monitor (the GPU renders at 1440p and downsamples), just to see what it's capable of. If you want a sense of what a higher res looks like, move your head about 1.5x farther back from your monitor.
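As a quick sanity check on that rule of thumb, here's a back-of-the-envelope sketch in Python (my own illustration, not from the thread; the 13.2" panel height and 24" viewing distance are assumed values). Angular pixel density scales with viewing distance, so a 1080p panel viewed at 1440/1080 ≈ 1.33x the distance shows roughly the same pixels per degree as a same-size 1440p panel; "about 1.5x" is a generous round-up of that ratio.

    import math

    def pixels_per_degree(v_res, panel_height_in, distance_in):
        # Pixels per degree of visual angle at the center of the screen.
        pixel_pitch = panel_height_in / v_res  # inches per pixel
        angle = math.degrees(2 * math.atan(pixel_pitch / (2 * distance_in)))
        return 1 / angle

    H27 = 13.2  # assumed visible height of a 27" 16:9 panel, in inches

    print(pixels_per_degree(1440, H27, 24))                # 1440p panel viewed at 24"
    print(pixels_per_degree(1080, H27, 24 * 1440 / 1080))  # 1080p at ~32", i.e. 1.33x back

Both prints come out to roughly 46 pixels per degree, which is the point: viewing distance can stand in for resolution when you're just eyeballing sharpness.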

I'm looking into upgrading my current PC, so naturally I need to figure out what resolution I want to run at. Best as I can tell, it comes down to a 3070 if I want 1080p for the next 5+ years and a 3080 if I want 1440p. To make that decision, however, I want to be able to judge the difference between the two resolutions to see whether I consider 1440p worth the upgrade.



Download https://t.co/UL4lnTapbb



I need help with 1440p on the PS5... last time I checked it wasn't working, but I realised I wasn't using the proper HDMI cable. This time I used the HDMI 2.1 cable that came with the PS5, and it still isn't working even though my monitor is a 1440p 165 Hz display (LG 27GR75Q).

The GeForce RTX 4070 delivers exceptional 1440p gaming performance in even the most strenuous games, with best-in-class ray tracing performance if you want to turn those cutting-edge lighting features on.

In addition to traditional gaming tests, we separately benchmark how graphics cards perform with ray tracing enabled at the highest possible settings (though we stick to Ultra, not Psycho, in Cyberpunk 2077).

A 1440p resolution is one of the most common for monitors. These displays are meant for a variety of uses. They tend to include gaming features like high refresh rates and variable refresh rate (VRR) support, and they can also include productivity features like ergonomic stands and USB hubs. While many 1440p monitors are focused on gaming, you can still use them for simple office work, so there's no single solution that's perfect for everyone.

The most common size for a 1440p display is 27 inches, as it results in high pixel density, but you can still find them with larger 32-inch, or even ultrawide, screens. You can also find them at different prices, with the highest-end models having the most features. Once you know your budget, consider the monitor's performance for what you need; for example, high peak brightness and wide viewing angles are useful for office use, while smooth motion and low input lag are important for gaming.
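To put a number on that pixel-density point (my own arithmetic, not from the article), pixels per inch follows directly from the resolution and the diagonal size:

    import math

    def ppi(width_px, height_px, diagonal_in):
        # Pixels per inch from resolution and diagonal screen size.
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(2560, 1440, 27), 1))  # ~108.8 PPI: the 27" 1440p sweet spot
    print(round(ppi(2560, 1440, 32), 1))  # ~91.8 PPI: the same resolution at 32"
    print(round(ppi(1920, 1080, 27), 1))  # ~81.6 PPI: 1080p at the same 27" size

The jump from roughly 82 PPI (27" 1080p) to roughly 109 PPI (27" 1440p) is the sharpness gain being discussed, and it's also why the same resolution looks noticeably coarser on a 32" panel.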

We've bought and tested more than 300 monitors, and below are our recommendations for the best 1440p monitors available to buy. Check out our picks for the best 1440p 144Hz monitors, the best 1440p gaming monitors, or, if you want a higher resolution, the best 4k monitors.

The best 1440p monitor we've tested is the ASUS ROG Swift OLED PG27AQDM. It's an excellent overall monitor that's focused on gaming, as it offers a high 240Hz refresh rate and fantastic motion handling, so there's minimal motion blur with fast-moving objects. On top of that, it delivers stunning picture quality thanks to its OLED panel, which displays deep blacks without blooming. HDR also looks excellent, as the monitor gets bright enough to make highlights pop, so you'll enjoy it even when watching movies.

If you're looking for something on a budget, there are a few good 1440p monitors to choose from. The Gigabyte M27Q P is a great gaming monitor similar to the LG 27GP850-B/27GP83B-B because it has a max 170Hz refresh rate, but its motion handling isn't as good as the LG's. Regardless, it's still great for gaming, so you'll be happy with it if you're on a tight budget. It's a newer model than the popular Gigabyte M27Q (rev. 1.0), but it adds extra features like built-in speakers and is easier to find.

If you're looking for the best 1440p monitors at a low cost, there's a good number of monitors you can choose from, but most are very basic and don't perform all that well. If you want decent overall performance, check out the Dell S2722DGM. It's a big step down in overall performance from the Gigabyte M27Q P, but that's the downside of looking for a cheap monitor. It has a different panel type than the Gigabyte and worse viewing angles, so it isn't ideal for sharing your screen with others. If you want a cheap monitor with wider viewing angles, you can also consider the HP OMEN 27q when it goes on sale, but the Dell is easier to find at a low cost.

We are using DDR5 memory on the Ryzen 9 7950X3D and the other Ryzen 7000 series processors we've tested, as well as on Intel's 13th and 12th Gen processors. We tested the aforementioned platforms with the following settings:

At 1440p, things become more GPU limited than CPU limited. In titles that benefit from the large pool of L3 cache, the Ryzen 9 7950X3D does very well. Its biggest win at 1440p is in Red Dead Redemption 2, where it convincingly clears the rest of the field in average frame rates, though it's not as strong as the others in the 5% lows.

Stepping up to 1440p might be asking a bit too much of the RX 6600, depending on the game and settings used. As before, we'll start with our legacy test suite running at medium quality (which is one of the inputs used for the GPU benchmarks hierarchy), and then we'll move on to the expanded test suite and 1440p ultra settings, and wrap up with some ray tracing charts.

1440p at medium quality often ends up performing about the same as 1080p at ultra quality, which means the games in our slightly older test suite still run quite well. Better than that, really, as the RX 6600 averaged 106 fps, 15% behind the RX 6600 XT and 9% slower than the RTX 3060. Also, none of the games tested here fell below 60 fps.
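That rule of thumb has a simple basis (again my own arithmetic, not from the review): 1440p shades roughly 78% more pixels per frame than 1080p, which, to a first approximation, is a similar extra GPU load to stepping from medium up to ultra settings in many games.

    pixels_1080p = 1920 * 1080  # 2,073,600 pixels per frame
    pixels_1440p = 2560 * 1440  # 3,686,400 pixels per frame
    print(pixels_1440p / pixels_1080p)  # ~1.78x the pixels to shade, all else equal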

The smaller Infinity Cache size generally means less benefit at higher resolutions, but even at 1440p the Infinity Cache clearly helps a lot. Considering the RTX 3060 has 50% more memory and memory bandwidth, the fact that the RX 6600 is even in the same ballpark shows how beneficial that large L3 cache is.

Using the RX 6600 at 1440p ultra wasn't nearly as compelling. All of the games were still playable, provided you think 30 fps or more is "playable," but multiple games came in far short of 60 fps. You can probably get by with high quality settings, or some mix of medium and high settings, and 1440p will still run fine, but don't be surprised if more games arrive in the next few years where 1080p is a far better choice for the RX 6600.

The RX 6600 again loses to the RTX 3060 by about 9%, though flipping through the individual games shows some variation. Far Cry 5 and Strange Brigade favor the RTX 3060 at 1440p ultra, even though they were formerly AMD-promoted games, while newer releases like Valhalla are strongly in AMD's camp, and most Nvidia-promoted games still favor Nvidia hardware years after release. Considering the added memory and generally faster performance, it's only the street price of the RTX 3060 that keeps it from being a clear victor.

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.

The EIZO monitor test consists of various test scenarios that your monitor can handle to a greater or lesser extent, depending on the model. For example, gaming monitors are distinguished by particularly short response times, whereas graphics monitors impress with a particularly homogeneous image display and smooth gradients. You should therefore always assess your monitor within the context of its respective device category. For this reason, please note the manufacturer specifications (especially for the defective pixel test). It is recommended that you carry out the monitor test in a dark room. This allows you to precisely assess even dark image areas.

In order to ensure meaningful test results, your monitor should already be warmed up prior to testing (ideally for 30 minutes). You should also clean the display prior to testing, since reflected light could cause dust particles to look like defective pixels.

Hogwarts Legacy is an open-world action RPG set in the Harry Potter universe. Having launched to very enthusiastic user reviews, it's time we benchmarked it, and we have a ton of data for you. We have 53 GPUs to test in this game, which is built on Unreal Engine 4 and supports DirectX 12, ray tracing effects, and all the latest upscaling technologies.

We benchmarked two sections of the game: one pass took place on the Hogwarts grounds as you exit, and the second at Hogsmeade as you arrive. We used a test system powered by the Ryzen 7 7700X with 32GB of DDR5-6000 CL30 memory and the latest Intel, AMD, and Nvidia display drivers. Let's get into it...

For those seeking a 60 fps experience at 1440p, that should be possible with the Intel Arc A750, Radeon 5700, RTX 2060 Super, or anything faster than those. Just falling short, we have the Radeon 6600, 5600 XT, Vega 64, and RTX 2060. Anything from the GTX 1660 down is going to deliver a pretty rough experience, so I'd recommend lowering the resolution and/or quality settings.

Under these more CPU limited test conditions at 1080p we see that the Radeon GPUs perform exceptionally well. The 7900 XTX beat the RTX 4090 by an 8% margin while the 7900 XT was 6% faster than the RTX 4080. Then we see the 6950 XT and 4070 Ti trading blows.

This is less obvious with the RX 6800 and RTX 3070 Ti comparison, though, where performance is basically identical. There's loads more data here, but I think for 1440p ultra gaming you're not going to want to go below the RTX 3060 Ti and 6750 XT, so we'll move on.

Something we noticed after our initial wave of GPU testing was some strange scaling behavior in the Hogsmeade town: for whatever reason, the game appeared extremely CPU bound here, despite low CPU utilization across all cores. We're not entirely sure what's going on, and it will take more time and a lot more benchmarking to work it out.

Starting with the 1080p medium results, we see that we run into a system limitation in the Hogsmeade area at around 150 fps on average, with 1% lows of around 130 fps on AMD hardware. That's not a drastic difference from the Hogwarts grounds test, but we are clearly hitting a limitation here. With GeForce GPUs, we're seeing a hard cap on performance at 130 fps in Hogsmeade, not radically different to the initial test, but it's interesting to see the GeForce RTX 3060 and 3070 performing better here, despite everything else hitting an fps wall a little sooner.
