Last month, I put Ghost of Tsushima through a benchmark meat grinder, testing all graphics presets across multiple CPUs and GPUs. Upscaling and frame generation were also examined, to see how good a job Nixxes Software had done transitioning the PS4 exclusive to the PC. For the most part, everything ran very well, making the game a very enjoyable experience on PC. Well, apart from on two of the GPUs used for the analysis: the AMD Radeon RX 7800 XT and the Intel Arc A770.
The first would only run for three or four seconds of gameplay before crashing to the desktop, and while the second worked, it produced some odd visual glitches and its overall performance could have been far better.
Intel kindly reached out and offered to help, lending me an Arc A770 Limited Edition, as I had previously used an Acer Predator version. For some reason, that particular graphics card was a bit temperamental at times, especially during the boot phase in the BIOS. I wasn't too sure a new GPU would help, but I gave it a try. Or tried to, at least.
Intel's drivers refused to install, despite my cleaning out all traces of previous versions. Even reinstalling Windows didn't help. After much head-scratching, the solution turned out to be quite simple, if a bit old-school: unzip the driver package manually and then install it via Device Manager. Not a method you'd expect to need today, but at least it worked.
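If you ever need to do the same, the manual route is easy to script. Here's a minimal sketch of the idea in Python – the file paths and archive name are hypothetical, and pnputil is simply the command-line equivalent of pointing Device Manager's "Update driver" dialog at the extracted folder:

```python
# Minimal sketch: unpack a downloaded driver package and register its
# .inf files with Windows. Paths and the archive name are hypothetical.
import subprocess
import zipfile
from pathlib import Path

package = Path(r"C:\Downloads\arc_driver.zip")   # hypothetical download
target = Path(r"C:\Drivers\arc_unpacked")

target.mkdir(parents=True, exist_ok=True)
with zipfile.ZipFile(package) as archive:
    archive.extractall(target)

# pnputil ships with Windows; /subdirs searches the whole tree for
# .inf files and /install stages and installs them in one step.
subprocess.run(
    ["pnputil", "/add-driver", str(target / "*.inf"), "/subdirs", "/install"],
    check=True,
)
```

It does need to be run from an elevated prompt, as pnputil won't install drivers otherwise.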
Anyway, with everything finally installed, I could get on with testing. The good news is that the previous rendering glitch was gone, and the new Arc card ran through all of the upscaling and frame generation tests without any major issues, unlike before.
The bad news is that the results for the new Arc weren't really any different from the old one's: slightly faster at 1080p but worse at 1440p and 4K. At least everything ran properly, though, and I could finally put Intel's upscaler through its paces in the game.
XeSS produced a handy performance bump, of course, but pairing it with AMD's FSR 3 Frame Generation wasn't as successful on the A770 as DLSS upscaling plus FSR 3 FG was on the RTX 3060 Ti. Running Ghost of Tsushima at 4K Ultra High, with XeSS Ultra Performance and frame generation enabled, only resulted in an average frame rate of 64 fps.
That might sound reasonable, but the Ampere-powered RTX card pulled 107 fps on a Ryzen 7 5700X3D machine, using DLSS Ultra Performance and FSR 3 frame generation.
No other graphics card tested in the game showed such a large change in frame rate when going from the High preset to Very High as the Arc A770 did. So I went back and tested all the quality settings individually, to see if I could pin down exactly what the issue was. The volumetric fog option turned out to be the biggest culprit.
At 1080p, with the graphics options set to Very High, the Arc A770 ran at 40 fps with very high quality fog and 61 fps with high quality fog – a performance jump of more than 50%! The other cards also gained from dropping to high quality fog, but nowhere near as much.
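For clarity, here's how those percentage gaps fall out of the raw frame rates – a trivial check using the numbers quoted above:

```python
# Percentage frame-rate uplift, using the averages from the tests above.
def uplift(base_fps: float, new_fps: float) -> float:
    """How much faster new_fps is than base_fps, as a percentage."""
    return (new_fps / base_fps - 1) * 100

print(f"Fog very high -> high on the A770: {uplift(40, 61):.1f}%")   # 52.5%
print(f"RTX 3060 Ti vs A770, 4K upscaled: {uplift(64, 107):.1f}%")   # 67.2%
```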
So why is the A770 performing so poorly compared to all the other cards used in the analysis? The first thing I did was run some GPU traces – using Intel's GPA software – to compare the rendering workloads between the two fog modes, but there was nothing to suggest that the GPU itself was bouncing off any particular boundary.
And it's not as though the card is lacking in hardware stats, as the table below shows:
On paper, the Arc A770 should be as fast as, if not faster than, the Radeon RX 6750 XT and RTX 3060 Ti in Ghost of Tsushima. But as my tests have shown, it's so far behind them that there must be something about the architecture and/or drivers that doesn't like this rendering workload.
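That on-paper claim is easy to sanity-check: peak FP32 throughput is just shader count times two operations per clock (a fused multiply-add) times clock speed. Here's a quick sketch using the vendors' publicly listed figures – sustained game clocks vary, so treat these as ballpark numbers:

```python
# Peak FP32 throughput = ALUs x 2 ops/clock (FMA) x clock speed (GHz).
# Shader counts and clocks are the publicly listed specs; sustained
# in-game clocks will differ, so these are ballpark figures only.
cards = {
    "Arc A770":    (4096, 2.40),   # 32 Xe-cores x 128 ALUs, ~2.4 GHz boost
    "RTX 3060 Ti": (4864, 1.665),  # 38 SMs x 128 ALUs, 1.665 GHz boost
    "RX 6750 XT":  (2560, 2.60),   # 40 CUs x 64 ALUs, ~2.6 GHz boost
}

for name, (alus, ghz) in cards.items():
    tflops = alus * 2 * ghz / 1000
    print(f"{name}: {tflops:.1f} TFLOPS FP32 peak")
# Arc A770: 19.7 / RTX 3060 Ti: 16.2 / RX 6750 XT: 13.3
```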
It could be a coding issue in the game itself, of course, but as we noted in our review of the A770, lackluster performance isn't limited to just one game (though to be fair, those results were taken with elderly drivers). Sometimes Intel's Alchemist GPU works exactly as expected; sometimes it's a complete mystery as to what's going on.
To try to investigate this further, I turned to the Vulkan performance tool created by Nemez on X, which evaluates GPU capabilities by running multi-threading and cache tests. While the results can’t be used to directly analyze why the A770 struggles so much in Ghost of Tsushima, they do show that the Alchemist’s performance is a bit of an enigma.
FP32 multiply instructions are extremely common in graphics routines, and the Alchemist chip is not only well off the pace of the RDNA 2 and Ampere GPUs here, but also well below its own peak throughput. Peak rates can never quite be reached, even in synthetic tests like these, but the A770's result is still much lower than it should be.
However, in the other throughput tests, the A770 fares very well. It isn't lacking in internal bandwidth and there's no sign of high cache latencies, yet it suffers much more than the competition when dealing with high resolutions or heavy rendering in Ghost of Tsushima.
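For anyone unfamiliar with how such throughput tests work, the principle is simple: issue a long stream of multiply-add operations, time it, and divide the operation count by the elapsed time. This isn't Nemez's tool – just a CPU-side Python analogue of that measurement logic:

```python
# CPU-side illustration of a throughput microbenchmark: time a long
# stream of multiply-adds and divide the op count by the elapsed time.
# GPU tools run the same idea inside a compute shader.
import time
import numpy as np

N = 10_000_000        # elements per array
ITERATIONS = 20       # repeat to amortise timing noise

a = np.random.rand(N).astype(np.float32)
b = np.random.rand(N).astype(np.float32)
c = np.random.rand(N).astype(np.float32)

start = time.perf_counter()
for _ in range(ITERATIONS):
    c = a * b + c     # one FP32 multiply + one FP32 add per element
elapsed = time.perf_counter() - start

# Each element performs 2 floating-point operations per iteration.
flops = 2 * N * ITERATIONS / elapsed
print(f"Achieved throughput: {flops / 1e9:.2f} GFLOPS")
```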
Intel is fully committed to issuing regular driver updates for its Arc graphics cards, but I guess drivers can only go so far – after all, support for Ghost of Tsushima was added in the 5518 driver set, and two more have been released since then (5522 and 5534).
Ultimately, whatever the issues are, they are almost certainly rooted in the Alchemist architecture. The Battlemage GPU in Lunar Lake chips looks very promising, and some of its changes should help a lot. The only problem is that the competition is already well ahead: the $500 AMD Radeon RX 7800 XT is a perfect example of what Battlemage will be up against.
Ghost of Tsushima has been patched a few times since release, and one of the updates improved stability for Radeon GPUs. Running the full benchmarks again showed that Nixxes had definitely fixed the crashing issue, and the RDNA 3-powered GPU had no problems running the game at 1080p and 1440p.
It was only at 4K that it started to struggle, but even then it wasn't too slow, and upscaling comfortably took care of the rest. And it's been like that in every game I've tested so far with that graphics card.
It can rightly be pointed out that the Navi 32 chip in the RX 7800 XT is much more capable than the ACM-G10 in the Arc A770, but only because it has dual-ALU shaders (thus doubling the FP32 throughput), plus more cache and VRAM bandwidth. At 1440p, the advantage those extras offer is nowhere near as great as it is at 4K.
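Extending the earlier peak-throughput sketch with AMD's published Navi 32 figures shows how lopsided that on-paper matchup is – the extra factor of two for dual-issue is the key difference:

```python
# RX 7800 XT peak FP32, per AMD's published specs. RDNA 3 shaders can
# dual-issue FP32 instructions, hence the extra factor of 2 versus the
# earlier cards; sustained game clocks will be lower than the boost figure.
alus, boost_ghz = 3840, 2.43                # 60 CUs x 64 ALUs, ~2.43 GHz boost
tflops = alus * 2 * 2 * boost_ghz / 1000    # dual-issue x FMA
print(f"RX 7800 XT: {tflops:.1f} TFLOPS FP32 peak")   # ~37.3
```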
Nvidia dominates the discrete GPU market, so Intel needs to look to steal some of AMD's share, no matter how small. But with the RX 7800 XT being 132% faster than the Arc A770 at 1440p High, and the RX 6750 XT 68% faster, it makes me wonder whether Battlemage's jump in performance over Alchemist will be big enough.
One game certainly doesn't represent overall GPU performance, but it does suggest that Intel has a real mountain of GPU gains to climb.