Ghost Recon Breakpoint is a bit of a perplexing game. The latest installment of the series feels less like Ghost Recon and more like yet another take on an open-world Ubisoft game full of microtransactions. It's not just the world and the game mechanics that feel familiar, however. The AnvilNext 2.0 engine is back, with a few tweaks that, oddly enough, make the game run quite a bit faster at maximum quality compared to Ghost Recon Wildlands. 4K ultra at more than 60 fps is actually possible, though it still requires a powerful graphics card.
Ghost Recon Breakpoint's PC feature checklist is pretty typical of recent Ubisoft games. Resolution and aspect ratio support is good, field of view can be adjusted (though the range is a bit limited), and there are 18 advanced graphics settings to tweak (or five presets to choose from). The only real omission is mod support, but you weren't really expecting Ubisoft to embrace mods, were you? Not when the game has such amazing microtransaction potential! I'm just waiting for Ubisoft to start selling microtransactions for improved performance and settings options. (That's a joke, Ubisoft. No more MTX, please.)
In terms of hardware, let me start with the official system requirements. Ubisoft goes above and beyond most games in listing five different sets of system requirements. From 1080p low all the way up to 4K ultra, Ubisoft has recommendations. It doesn't specify what sort of performance you'll get from the recommended hardware, but that's why I'm here with the benchmarks.
I can confirm that Breakpoint really doesn't like some 2GB graphics cards, which is why the 960 4GB is listed as the minimum spec. On a GTX 1050, performance in the built-in benchmark waffled between nearly 50 fps average on a good run and 25 fps on a subsequent run, and 1080p medium was effectively unplayable. A 1060 3GB, on the other hand, did remarkably well, handling up to 1080p very high at more than 60 fps. Finally, that 4K ultra recommendation isn't coming anywhere near 60 fps, especially not on the Radeon VII. The 2080 almost gets there, and dropping a few settings should do the trick, but as we'll see in the benchmarks, the Radeon VII struggles quite a bit.
Ghost Recon Breakpoint has five different presets along with 18 individual advanced settings that can be tweaked, not including resolution or resolution scaling. Here's the thing: on most graphics cards, the difference between minimum and maximum quality isn't very large, and some of the settings behave in unexpected ways. Performance can nearly double going from ultra to minimum quality on GPUs like the 1060 3GB, but part of that is the lack of VRAM. On a GTX 1070, low is only 70 percent faster than ultra, and on the Vega cards it's only 60 percent faster. Everything beyond the high preset looks pretty much the same as well, as you can see in the above image gallery.
The one setting I want to call out in advance is anti-aliasing. There are two settings: anti-aliasing and temporal injection. The latter can't be enabled if AA is off, but on many GPUs it appears temporal injection gives a decent 15 percent boost to performance. If you turn off AA (and thus temporal injection), the low preset actually runs about as fast as the medium preset with AA and injection enabled.
What is this so-called temporal injection, though? The description says, "Temporal Injection represents the technology of using samples from many frames to provide crisp high resolution images while keeping smooth framerate." That's fine, except it's basically a description of temporal AA.
"Temporal Injection is a system that draws less pixels (causing the performance boost) and uses a temporal upscaling shader that temporally reconstructs a high resolution image. It can produce slightly blurry backgrounds in some cases, which may show up more when using a large screen or a low resolution. On high resolution (1440p or greater), however, the effect becomes barely noticeable and is a must have to achieve great performances."
My own experience more or less agrees with that. At 1080p, temporal injection is more noticeable than at 1440p or 4K (and it's possibly even worse at 720p). But it doesn't appear to be doing a strict lower-resolution render followed by upscaling. It seems to affect foliage and people the most, and distance may also play a role.
Regardless, I started testing with both AA and temporal injection enabled, and to keep the results consistent across all hardware, I left temporal injection enabled for all the testing that follows. It improves performance, and you can always turn it off if you prefer. It's the wrong choice as far as image quality goes, but the right choice if you're trying to boost performance, and it's almost required at 1440p and 4K. Anyway, I'm not going to go back and rerun several hundred benchmarks at this point. Sorry. (Subtract 10-15 percent from the results for an estimate of the non-TI numbers.)
That leaves Sun Shadows (detailed shadows cast by the sun) as the only other setting with a significant impact, and dropping it from ultra to low can boost performance by 9-10 percent. Not surprisingly, turning it down also tends to cause the most visible change in image quality, as dynamic shadows are quite noticeable.
Settings tweaks can sometimes stack up to slightly larger improvements, so I checked that as well. Dropping the nine settings that individually caused less than a 3 percent change in performance from the ultra preset to their minimum values yielded just a 2.4 percent improvement in framerates; those settings really don't matter. In contrast, dropping the seven settings that each give a modest fps increase from ultra to high boosts performance by around 22 percent, nearly the same as going from the ultra preset to the high preset.
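As a rough sanity check on how those per-setting gains combine, here's a minimal Python sketch assuming (hypothetically) that each setting's improvement stacks multiplicatively; the per-setting percentages are illustrative guesses, not measured values from my testing.

```python
# Hedged sketch: combine individual fractional fps gains multiplicatively.
# The per-setting values below are assumptions for illustration only.

def stacked_gain(per_setting_gains):
    """Return the combined fractional gain if each tweak compounds."""
    total = 1.0
    for gain in per_setting_gains:
        total *= 1.0 + gain
    return total - 1.0

# Nine near-irrelevant settings at well under 1 percent each barely register...
print(f"{stacked_gain([0.003] * 9):.1%}")  # ~2.7%, in the ballpark of the measured 2.4%
# ...while seven settings worth roughly 3 percent apiece compound to ~23%.
print(f"{stacked_gain([0.03] * 7):.1%}")   # close to the ~22% gain from ultra to high
```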
Ghost Recon Breakpoint includes its own benchmark tool, which was used for all of the benchmark data. Each GPU is tested multiple times and I use the best result, though outside of cards with less than 3GB VRAM, the variability between runs was largely meaningless (1 percent margin of error). The built-in benchmark doesn't represent every aspect of the game. Some areas will perform better, some worse, but it does give a good baseline summary of what sort of performance you can expect.
One thing to note is that a few scenes in the benchmark are very light (e.g., starting at the snowy train tracks for several seconds), which may artificially boost the overall performance. As noted earlier, performance is actually better than in Ghost Recon Wildlands, but some of that may simply be due to differences in the benchmark sequence, plus the new temporal injection setting. Pay attention to the 97th percentile minimums (calculated as the average fps of the highest 3 percent of frametimes), which give a fair representation of worst-case performance (e.g., in a firefight).
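For clarity, here's a minimal Python sketch of that calculation, using hypothetical frametime data rather than anything captured from the game; the function name and sample values are illustrative only.

```python
# Hedged sketch of a "97th percentile minimum" fps figure: average the
# slowest 3 percent of frames (the highest frametimes) and convert to fps.

def percentile_min_fps(frame_times_ms, worst_fraction=0.03):
    """Average fps of the slowest `worst_fraction` of frames."""
    worst = sorted(frame_times_ms, reverse=True)       # slowest frames first
    count = max(1, int(len(worst) * worst_fraction))   # keep the worst 3 percent
    avg_worst_ms = sum(worst[:count]) / count
    return 1000.0 / avg_worst_ms

# Hypothetical run: mostly ~10 ms frames (about 100 fps) with a few 25 ms hitches.
sample = [10.0] * 97 + [25.0] * 3
print(round(percentile_min_fps(sample), 1))  # ~40.0 fps worst-case figure
```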
All of the GPU testing is done using an overclocked Intel Core i7-8700K with an MSI MEG Z390 Godlike motherboard, using MSI graphics cards unless otherwise noted. AMD Ryzen CPUs are tested on MSI's MEG X570 Godlike, except the 2400G, which uses an MSI B350I board since I needed something with a DisplayPort connection (which most of the X370/X470/X570 boards omit). MSI is our partner for these videos and provides the hardware and sponsorship to make them happen, including three gaming laptops (the GL63, GS75, and GE75).
For these tests, I've checked all five presets at 1080p (yes, some of the results on different presets are nearly identical), plus 1440p and 4K using the ultra preset. Anti-aliasing and Temporal Injection were left on for all of the tests, as the combination improves performance (at the cost of image fidelity). Minimum fps also tended to fluctuate quite a bit, and sometimes I'd get a several-second stall during a benchmark run, so multiple tests were run at each setting for each component, and I'm reporting the best result.
I used the latest AMD and Nvidia drivers available at the time for testing: AMD 19.9.3 and Nvidia 436.48, both of which are game ready for Ghost Recon Breakpoint. I also tried to test Intel and AMD integrated graphics at 720p low, but the Intel GPU refused to run the game. (I'm told an updated driver is in the works.) That was using the most recent 26.20.100.7212 drivers, and it would crash to desktop while attempting to load the main menu.
Ghost Recon Breakpoint can look a bit washed out at low quality, which again makes the relatively middling performance concerning. The GTX 1050 and RX 560 fail to clear 60 fps, but more critically, the GTX 1050's performance was prone to serious fluctuations. At one point, performance dropped to around 25 fps in the benchmark, and I ended up having to restart the PC to "fix" things. A second run right after the first test (shown in the chart) got 35 fps. I would be extremely cautious about buying Ghost Recon Breakpoint if you don't have a dedicated GPU with at least 4GB of VRAM (or the 1060 3GB).
Minimum fps tends to be more variable than in other games, which is pretty typical of the AnvilNext 2.0 engine. You'll need at least a GTX 1650 to get a steady 60 fps (i.e., minimums above 60). The GTX 970 also performs quite poorly, dropping below even the GTX 1050 Ti.
As for integrated graphics, Intel is out of the picture, but AMD's Vega 11 did pretty well at 720p low, and even managed a playable 33 fps at 1080p low. It's not going to be the best experience for a shooter, but it's better than crashing to desktop. Interestingly, Vega 11 actually did better than the GTX 1050 at medium quality, though that's not saying much as it stayed below 30 fps.
CPU limits are also present, at least on the fastest GPUs. The RTX 2070 and above all end up effectively tied at around 160 fps, with a bit of fluctuation. Meanwhile, AMD's GPUs seem to hit a lower ceiling of around 130 fps. The result is that AMD's GPUs underperform relative to Nvidia's, despite this being an AMD-promoted game. The RX 5700 XT ends up being AMD's fastest GPU at 1080p low, beating even the Radeon VII, but Nvidia's GTX 1080, GTX 1660 Ti, and RTX 2060 all outperform it.