Nvidia recently took the gaming world by storm at its September conference, revealing, as millions of PC gamers had anticipated, its newest graphics cards. The new RTX 3000 series (the RTX branding refers to the cards' raytracing support) includes the 3070, 3080, and 3090. In typical, refreshingly straightforward Nvidia fashion, each ascending number marks a step up in performance: the RTX 3070 sits at the bottom of the lineup, while the RTX 3090 is now the most powerful GPU on the market.
The RTX 3090, in particular, arrives as a far more powerful successor to the former champion, the RTX 2080 Ti. The card packs 10,496 CUDA cores, 328 Tensor cores, and 24 GB of VRAM, making it roughly 40% faster than the 2080 Ti with more than double the memory. At $1,500, it effectively renders the 2080 Ti's $1,200 price tag obsolete.
However, the RTX 3090 seems to meet its match in Ubisoft’s new game, Watch Dogs Legion
The RTX 3090 was marketed at the conference as the ultimate 4K gaming card. While previous high-end GPUs aimed for 4K gaming at 60 FPS, this one promised much more: its higher core counts and faster base and boost clockspeeds back that claim up, and advertisements emphasized 4K, and even 8K, gaming at high framerates. Yet just a few weeks after the card's release, Watch Dogs Legion reportedly has it stumbling.
As reported by YouTuber RajmanGamingHD, the in-game benchmark tool for Watch Dogs Legion showed that his RTX 3090 could not sustain 60 FPS at 4K resolution. The benchmark averaged between 44 and 54 FPS, with the deepest dips coming during intense gunfights and explosions. Even more disappointingly, this was with raytracing turned off entirely.
Even if the claim proves unreliable, it is also possible that games like Watch Dogs Legion are simply too graphically taxing
Ubisoft generally optimizes its games well, but Legion is likely just a very heavy game. It features a large open world set in a highly detailed recreation of London, and it prides itself on letting the player recruit almost any NPC as an ally, which means every NPC needs its own identity and routine; that cannot be an easy system to run smoothly. Still, a GPU that costs $1,500 and bills itself as the best card available should perhaps do better.
Granted, the internet is the internet, and we cannot fully verify how reliable that single user's video is. It is also possible the settings had been modified, since features like DLSS 2.0 can alter performance significantly. Or, to use the classic disclaimer, the hardware may simply have been faulty and not representative of all 3090s.
However, there is usually at least some truth to such controversial discoveries. Suppose the source is right, and Legion cannot hold the 60 FPS baseline at 4K. After all, the Xbox Series X and PS5 versions were confirmed to run at only 30 FPS at 4K, and those consoles sport much weaker custom AMD GPUs. What is to stop more upcoming AAA titles from being just as graphically intensive? If this finding becomes a pattern rather than an isolated incident, the RTX 3090 should no longer be viewed as the definitive 4K card.
While the RTX 3090 may have several unmatchable specs, the upcoming AMD GPUs could potentially compete with it
Those who watched AMD's recent reveal of the Big Navi GPUs know that raytracing is no longer an Nvidia-exclusive feature. The RDNA 2-based RX 6800 XT competes with the RTX 3080 at a lower price, and AMD cards have historically served as more power-efficient, budget-friendly alternatives. So if the RTX 3090 keeps the title of most powerful card yet still fails to deliver a consistent 60 FPS at 4K, buyers may turn to a less powerful but good-enough, cheaper option.
Stay tuned for the latest on AMD, Nvidia, and Watch Dogs Legion!