So with the actual reviews out of the way, now is a good time to jump into pure, unabashed speculation, rumormongering, and looking ahead to the future of PC gaming.
The window around a console release is generally the most exciting time in modern PC gaming tech. CPU and GPU technology jumps up sharply, games start leveraging increased graphical fidelity and putting that CPU power to better use, and if you can secure a gaming PC that meets or exceeds the consoles at that point in time, it is the one moment where future-proofing makes sense, because you'll be set for years of gaming.
There are two problems, which I discussed in my recent post about upgrading or buying a new machine now but left implicit there until commenters called them out and made them explicit:
- The consoles, this generation, are more powerful than most gaming PCs as of their launches.
- It is possible to meet or exceed the consoles today, but doing so costs 4-5x as much as a console.
What the Radeon RX 6000 cards make possible is a future state in which you can get something precisely matched to meet or exceed the current consoles, as the RDNA 2 architecture at the heart of these Radeon cards is also the GPU tech powering both the PlayStation 5 and Xbox Series X. However, if you want to precisely match the consoles so you can save some scratch while still getting an awesome gaming PC, it means waiting.
So what does the future hold?
On the AMD side, we expect there to be lower-end entries to the Radeon RX 6000 lineup, once the RX 6900 XT launches. This is pure speculation on my part, but if we match CU counts from the RX 5000 series, I would expect to see this:
- RX 6700 XT: 40 CUs
- RX 6700: 36 CUs
- RX 6600: 32 CUs
- RX 6500 and below: 22 CUs
These numbers would, in theory, mean that if you wait, an RX 6700 would neatly match the PS5's CU count while likely running at a similar boost clock, and an RX 6700 XT would exceed it slightly, with a CU count sitting between the PS5 and Xbox Series X, where I would expect the graphics card's clock speed to be high enough to brute-force its way to parity with the Series X GPU. If AMD launches an RX 6600 XT similar to the current 5600 XT, it would likely also have 36 CUs and be cheaper than a 6700, which would be a good deal. In the RX 5000 series, that card launched at $279 US, a pretty reasonable price.
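As a rough sanity check on this CU matching, RDNA 2's theoretical FP32 throughput scales as CUs × 64 stream processors × 2 FLOPs per clock × clock speed. The console figures below are the public specs; the desktop clock speeds are my own placeholder guesses, not leaks:

```python
# Rough theoretical FP32 throughput for an RDNA 2 GPU:
# each CU has 64 stream processors doing 2 FLOPs (one FMA) per clock.
def rdna2_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

# Known console specs (public figures):
print(f"PS5 (36 CUs @ 2.23 GHz):  {rdna2_tflops(36, 2.23):.2f} TFLOPS")   # ~10.28
print(f"XSX (52 CUs @ 1.825 GHz): {rdna2_tflops(52, 1.825):.2f} TFLOPS")  # ~12.15

# Speculated desktop parts -- 2.3 GHz boost is a placeholder assumption:
print(f"RX 6700 XT? (40 CUs @ 2.3 GHz): {rdna2_tflops(40, 2.3):.2f} TFLOPS")
print(f"RX 6700?    (36 CUs @ 2.3 GHz): {rdna2_tflops(36, 2.3):.2f} TFLOPS")
```

TFLOPS is a crude metric that ignores memory bandwidth and cache differences, but it illustrates why a 40-CU part at desktop boost clocks should land between the PS5 and the Series X, and a 36-CU part just above the PS5.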
However, that is all speculation on the AMD side.
We do know that Nvidia is preparing lower-end Ampere GPUs for launch in the RTX 3000 series. Currently, the biggest rumor is the pending launch of an RTX 3060 Ti, which would have 8 GB of VRAM and a further cut-down version of the GA104 die from the RTX 3070, and would offer performance exceeding a 2080 Super (the level of performance, in Nvidia-speak, that the consoles offer). Pricing has not yet leaked or even been credibly rumored that I've seen, short of an absurd placeholder in Euros (higher than the 3070!), so it is difficult to tell where this card would fall, but I would expect it around $400, with a non-Ti 3060 at $350 to match the equivalent Turing pricing, and then a 3050 eventually in the sub-$300 bracket. Of these, I expect the RTX 3060 Ti to be the last card in Nvidia's Ampere lineup to match or exceed the current consoles, as a cut-down 3060 would likely just barely match a 2080 Super at best, and everything below that pushes lower.
To wrap up on the consoles, let's talk about CPUs really fast. Today, any 8-core or higher Ryzen 3000 or 5000 part can match or beat the consoles, and on the Intel side, the i9-9900K, i7-10700K, i9-10850K, and i9-10900K all meet or exceed the PS5/XSX level of performance. However, keeping in mind that the CPU in a PC is more encumbered by Windows and also has more responsibilities for other aspects of system performance, I would limit my recommendations to the parts with more than 8 cores – the 10th-gen i9 parts from Intel and the 12- or 16-core parts from either the Ryzen 3000 or 5000 era. Next year, Intel's Rocket Lake cuts back to 8 cores at the high end, but should offer enough overall performance uplift to be fine, and once DirectStorage launches for Windows 10, offloading storage decompression to the GPU should ease the CPU's load enough for the 8-core parts mentioned above to match or beat the consoles.

That being said, to be perfectly honest, right now is simply not a great time to be building if longevity is your goal – waiting until Ryzen 5000 is regularly in stock at MSRP or waiting for Intel's 11th-gen Core CPUs is a better bet. The same goes for GPUs – with Ampere remaining short-stocked and AMD's RX 6800 launch today selling out in seconds, it is just a lousy time to try to chase a new PC. Don't let that stop you – I'm still working on mine all the same! – but if you're on the fence, I'd say get off that fence for now and watch through it until things look brighter.
But what led me to split out this post is actually an evaluation and a bit of speculation about what the future holds for the Radeon RX 6000 parts, including those launched today.
AMD has been known for iffy drivers as of late, and RX 6000 thankfully dodged a lot of this. However, AMD drivers are also known for a meme term – "fine wine." Why is that? Simply put: AMD's driver team is slower to optimize the core functions of their drivers for new hardware, meaning that over time, as the driver ages (like a fine wine), performance actually improves. I know it sounds a smidge silly, but it has held true for a long time – Radeon cards consistently improve such that day-1 reviews understate the cards' long-term performance by as much as 15-20%. And that's without mentioning that AMD often maintains driver support far longer than Nvidia, with their driver package keeping cards 10+ years old working with new titles and new game optimizations!
I broke this out from the review roundup because it is pure theorizing on my part, but I think we have to evaluate the possible ramifications of AMD’s driver strategy coupled with a few factors to see what long-term value the RX 6000 series Radeon cards hold.
Fine Wine Means More Performance Long Term
If we assume that AMD continues to tweak and optimize drivers for RDNA 2 long past launch, we can reasonably expect the cards to continue improving in performance as new driver revisions arrive. How much remains to be seen – the launch drivers appear to be in good shape as-is, so I don't know if we can expect a similar 15-20% outlook. However, I will say that I fully expect an uplift of around 10% over the useful life of these cards – low-level driver tweaks coupled with game-specific driver optimizations should squeeze that much performance out. But I think there are a few other factors to look at.
Smart Access Memory, Nvidia, and Application Support
When AMD announced Smart Access Memory as their proprietary technology, I feared it would see little or no support from software developers outside of AMD's partner ecosystem. However, I've shifted my view on that for a few reasons. First and foremost, AMD being competitive means there is reason to optimize for this technology in the first place. In the past, when only 1 out of every 10 graphics cards was an AMD one, it made sense to just not bother. Now, with AMD competitive and Ampere graphics cards a rarity on shelves, developer support should grow – provided AMD can also get their supply chain in check. A second factor helps too: since SAM is actually just the PCI-SIG standard technology of Resizable BAR, Nvidia has confirmed that they have it working in their internal drivers and will be launching it for Ampere cards as well. Alongside this, they're working with Intel and motherboard vendors to ensure Resizable BAR support is built into the UEFI of Z490-based Intel boards, meaning that modern Intel and AMD systems should both support this technology. That (theoretically) means an Intel system could also support SAM on Radeon cards in the near future, and with Nvidia putting their weight behind it as well, application developers should have every reason to include support for it across the board – all of which is good news!
Consoles Mean Optimization
This is a point that gets beaten into the ground by AMD fanboys, and sorry, I'm here to do it too. Today, games are slowly moving to next-generation development with the PS5 and XSX as the lead platforms. As that work happens, developers are likely to be writing more RDNA 2-optimized code and are also more likely to lean on AMD's FidelityFX features, which work on Nvidia and Intel GPUs as well. For example, WoW has already implemented AMD's FidelityFX ambient occlusion technology, which gives a modest performance improvement and runs better on AMD hardware. Multiple next-gen titles are already boasting about using AMD features, like Godfall, Dirt 5, and Far Cry 6. So, in theory, you'll have developers using the open-source FidelityFX feature set, since it runs well on all hardware and works especially well on RDNA 2 (including the consoles), and you'll have next-gen console titles launching with tweaks that make them run better on AMD hardware.
Now of course, all of this is theoretical – I don't want to claim certainty that any of these things will happen, much less all of them. But they do make sense in the current marketplace – Nvidia supporting Resizable BAR means more software optimization and benefit for AMD, RDNA 2 in the consoles means long-term optimizations, and AMD's driver team's track record suggests we will see improvements simply from driver tweaks to how the hardware is engaged, improving efficiency and squeezing out more performance.
The next year and change is going to be exciting for gaming hardware either way, and with solid reviews and minimal reports of driver issues, my choice of the Radeon RX 6900 XT is looking more and more settled!