Have you ever loaded up a new PC title, run the in-game benchmark, tweaked settings for optimal performance, then discovered that actual gameplay throws up much lower frame-rates, intrusive stutter or worse? It's a particular frustration for us here at Digital Foundry, and it leads to a couple of very obvious questions: firstly, if benchmark modes are not indicative of real-life performance, what use are they? And secondly, if their use is limited, how representative of real-life gaming are the graphics card reviews that use them, including ours?

Before we go on, it's fair to point out that not every benchmark mode out there is useless beyond redemption. In fact, there is a range of great examples that set you up reasonably well for tweaking towards optimal performance. And then there are others which actually drain system resources more than the game itself - which we'd argue is of more use than those that inflate their performance figures. However, there are some particularly striking examples we have to highlight, simply because the delta between benchmark mode and real-world performance is absolutely massive.

Perhaps the most notorious example we can muster is Tomb Raider 2013. It tests just one scene - the initial shipwreck scene from the beginning of the game - and it shows the camera panning around the Lara Croft character model. It's a scene that's easy to replicate in-game, where we find that the same hardware running the same scene at the same settings produces anything up to a 21 per cent performance deficit. Same hardware, same settings, same resolution. We knew that Tomb Raider 2013's benchmark produced much higher frame-rates than the same content in-game, but the difference is much wider than we thought it would be.

The follow-up - Rise of the Tomb Raider - is an improvement of sorts, but still has issues. Three scenes are rendered, taken from the snow peak at the beginning of the game, a beautiful run through the opening of The Prophet's Tomb and finally a fly-through of the Geothermal Valley. It's a benchmark that should be immensely useful to users, because the game has a very specific problem - performance is great through all the early levels until you hit the hub areas like the Geothermal Valley, at which point performance drops significantly. However, the benchmark representation plays out with no issues at all - actual gameplay sees performance dip by anything up to 35 per cent compared to the benchmark mode for this area!

Possibly the worst example of a genuinely useless benchmark mode is the notorious Batman: Arkham Knight. The content chosen for benchmarking showcases some environments and effects work, but all it's actually doing is organising some pretty cinematic shots for the user to look at, with some fairly meaningless numbers coughed up at the end. The game itself is notorious for its very poor background streaming and generally bad open-world performance, particularly evident during fast traversal - such as driving in the Batmobile. Stutter is present to some degree on every PC configuration we've ever tested this title with, yet there is none of it on display in the benchmark.

As things stand, in-game benchmark modes are often not reflective of actual performance realities or variability, and they tend to focus overly on the GPU as the sole limiting factor in game performance - as Rise of the Tomb Raider and Tomb Raider demonstrate. This is problematic because not every PC owner is playing on a powerful i7, and the reality is that game performance is not entirely defined by its rendering - though to be fair, that is typically the first bottleneck you'll encounter. But that intrusive stuttering that often represents gaming performance at its worst? CPU and storage are the main culprits there, with simulation work, animation, AI and generating draw calls for the GPU often causing big dips to in-game performance. And even CPU-related performance can bottleneck in different ways, as there are systems which are heavily parallelised - like graphics, dispatch, physics or animation - while AI and gameplay logic are often more single-threaded in nature. There's also the load incurred by streaming and decompressing new data, which can have profound implications - as we saw in Arkham Knight. So, essentially, we need things to change.
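The comparisons above boil down to simple arithmetic on captured frame-times, so here's a minimal sketch of how you might quantify the benchmark-versus-gameplay gap yourself. It assumes you've already captured per-frame times in milliseconds from a frame-time logging tool; the function names and the two-times-median stutter threshold are our own illustrative choices, not anything prescribed by the games or tools discussed here.

```python
# Illustrative sketch: comparing a benchmark-mode capture against an in-game one.
# All frame times are in milliseconds; thresholds and names are our own choices.

def average_fps(frame_times_ms):
    """Average frame-rate over a capture: total frames / total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def percentile_fps(frame_times_ms, pct=99):
    """Worst-case frame-rate: fps at the given frame-time percentile."""
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return 1000.0 / ordered[idx]

def deficit_pct(benchmark_fps, gameplay_fps):
    """How far real gameplay falls short of the benchmark, as a percentage."""
    return (benchmark_fps - gameplay_fps) / benchmark_fps * 100.0

def stutter_spikes(frame_times_ms, factor=2.0):
    """Count frames taking over `factor` times the median - a crude stutter proxy."""
    ordered = sorted(frame_times_ms)
    median = ordered[len(ordered) // 2]
    return sum(1 for t in frame_times_ms if t > factor * median)
```

For instance, `deficit_pct(60.0, 47.4)` reports a 21 per cent shortfall - the kind of gap we measured in Tomb Raider 2013 - while `stutter_spikes` would flag the frame-time spikes an average-fps benchmark figure hides entirely.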