Sidenote: PS4, Xbox One, and AMD – The Dark Decade of Gaming Hardware

This is a post I’ve been thinking about for a while!

Through the Sidenote series, a recurring theme is how gaming hardware for much of the 2010s was basically stagnant – small or no per-clock performance improvements, little to no growth in CPU core counts, sandbagging in the absence of competition, and more. What I find fascinating is looking at why.

As a pre-teen, I gravitated towards technology because understanding more about how games got made appealed to me, and I found analyzing hardware fascinating. In the mid-nineties, the console war of the era was defined by three very different systems – the Saturn, the original PlayStation, and the Nintendo 64 – all of which had unique selling points and made very different use of their hardware budgets. Even more thrilling, they all had advantages over PC hardware in their own ways, at least at launch.

This continued into the next generation, where I knew the specs of all four competing systems offhand and could even recite the polygons-per-second rating of the PlayStation 2’s graphics processor (it’s 66 million, in case you were curious). Once I started building PCs as a young adult, the thing I loved most was that performance nearly doubled every year, so there was always some amount of boundary-pushing happening. My first full, from-scratch build was a dual-core Intel Core 2 Duo E6300 system, and within a year of that, I had a Core 2 Quad Q6600.

The game consoles of this era were more powerful than ever, but the old approach of bespoke, fully custom hardware was overcomplicating things for developers and manufacturers alike. Sony’s PS3 shipped with the Cell processor, which in theory was a vastly more powerful solution than the 3-core, 6-thread “Xenon” CPU in the Xbox 360 (both were designed and prototyped in the same IBM building, which is a fun fact!). In practice, however, the Cell’s weird asymmetrical design, with cores of unequal processing capability, made it difficult to leverage fully, and given that the Xbox 360 also had a better GPU and an easier CPU to develop for, the choice of lead platform was a no-brainer. Very few titles, if any, made use of the full breadth of the Cell microarchitecture, and the other bad choices in the PS3’s design left it the runner-up that generation (off the top of my head – a rigid split of vertex and pixel shaders versus the Xbox 360’s unified shader model, the split RAM design that made late-era games like Skyrim run like ass on the PS3, a Blu-ray drive that holds tons of data but reads it slowly, the Cell itself, which deserves a second entry on this list, and the launch price).

When 2013 rolled around, consoles took a new approach, enabled by a change in business strategy at AMD. For the first time since the original Xbox, both the Xbox One and the PS4 were built from commodity PC hardware. However, unlike the original Xbox (which was pretty much PC components with slight modification), these systems were what AMD calls “semi-custom.” AMD supplied their intellectual property – CPU and GPU designs – and then did a unique layout for each console maker. Somewhat interestingly (and this is also the case in the upcoming generation), both systems used nearly identical hardware portfolios – Jaguar CPU cores, with the only difference between the systems being clock speed, and Radeon GPU tech from the HD 7xxx era, with Sony shipping 150% of the compute units Microsoft had at launch. Microsoft customized further by adding an ESRAM cache to the die (a role the Xbox 360 had filled with an eDRAM daughter die, which worked really well), while Sony stuck with the design as it stood. Both then paired their sort-of-bespoke system-on-chip designs with the memory and storage configurations of their choice – Microsoft used slower but lower-latency DDR3 system memory, while Sony opted for much higher-bandwidth but higher-latency GDDR5. Since the GPU is arguably more important for gaming, Sony’s decision to accept a slower CPU clock in exchange for a much larger GPU fed by GDDR5 bandwidth won out over the Xbox One on hardware specs, while Microsoft’s clumsy announcement of platform features requiring online connectivity (a sore spot for gamers at the time) cost them their Xbox 360-era lead and made them the de facto number-two system, a position that has persisted even as the Xbox One X refresh handed them the performance crown once more.

But that is just me getting nostalgic and talking a lot about hardware. What is more interesting here is why technological innovation slowed down in this era, and why AMD was even in the position where their business led them to make these chips in the first place.

In 2010, to start the decade, AMD and Intel were fairly competitive on the PC hardware front. Both had their strengths and weaknesses, but while Intel held the overall lead, AMD was worth buying at multiple price points for the CPU in your gaming system. In 2011, however, both companies released the products that defined their respective decades – Intel’s Sandy Bridge architecture in the Core i7-2600K, and AMD’s FX series CPUs. Sandy Bridge delivered a substantial IPC jump over the previous-generation Nehalem architecture and offered that performance at an accessible $300 price tag, coupled with the then-new differentiation of K-SKU CPUs with unlocked multipliers for overclocking. AMD’s FX, on the other hand, bet that applications would be deeply multi-threaded in integer workloads: every two integer cores shared a single floating-point unit, and AMD called that combination a “module” and marketed it as two cores. FX was disastrously slow compared to the Sandy Bridge parts, and the only other designs in the hopper for AMD were the A-series APUs (which, to be fair, were better than FX in a lot of ways) and the Zen core, which would take five years to complete and which we’ll talk more about later.

So the stage is set early on – in 2011, AMD bombed in the CPU market with a part that was, in some cases, even slower than the Phenom CPUs that came before it, while Intel had taken a big leap forward with Sandy Bridge, and AMD was stuck with a development pipeline that had over-committed resources to the Bulldozer (the codename for FX) lineup. In the GPU space, AMD was doing much better, but there they made another strategic blunder that cost them – rather than rolling out a robust top-to-bottom GPU lineup, the company committed to building parts only where demand peaked, which in most cases meant just three GPU dies for a full product stack, with the top-end part being a small, efficient mainstream design. Nvidia, on the other hand, was having trouble of its own, but once they smoothed out the GTX 480’s Fermi disaster with the better GTX 580, the writing was on the wall for AMD. Knowing AMD wasn’t going to compete with a halo product, Nvidia immediately began sandbagging – the GTX 680 was built on a mid-sized Kepler die rather than the biggest one (the top-end Kepler chip would later debut as the first Titan card).

This market condition on PC was a self-inflicted wound for the much smaller AMD, and had they simply plugged away at the desktop market in that state, the company might very well have gone belly-up. Under then-CEO Rory Read, AMD built out their semi-custom business and pitched Sony and Microsoft for a lifeline. The income from consoles was a sure bet – both manufacturers were likely to sell close to 100 million units over the systems’ lifetimes, and if each of those units meant a silicon sale and a licensing payment to AMD, that guaranteed cash flow would sustain the company while the Zen CPU core was completed and they could compete again in the desktop PC market.

The problem is that at the time the PS4 and Xbox One launched in late 2013, their hardware was already outdated. The Jaguar CPU cores were mobile-optimized, low-IPC, and ran at clock speeds well under 2 GHz. Meanwhile, the GPU design was already two years old at that point, and AMD would go on to run the GCN architecture into the ground for years afterward. It didn’t help that GCN, while okay for graphics work, ended up being better at raw compute (which is why AMD’s Radeon Vega cards were perpetually sold out during the cryptocurrency boom), and so the consoles – the lead platforms for most game development – were already holding back the PC market.

However, with AMD failing to show up to the fight in the desktop space, that didn’t matter much either. With little to no competition from AMD, both Intel and Nvidia coasted, offering smaller and smaller generational improvements while pushing prices higher. Nvidia’s top-end xx80 GPU crept up from $500, to $600, to $700, until the launch of the RTX 2080 saw the non-Ti card selling for around $800! Intel, likewise, kept 4-core, 8-thread designs at the top of their consumer lineup from 2009 until 2017, and from 2015 on there was essentially zero IPC gain in Intel’s desktop designs – slightly faster clock speeds, the occasional new feature, but nothing to write home about.

All of this led to a lost decade of sorts in gaming technology – the consoles were slow, commodity PC hardware re-engineered into semi-custom SoCs that could be bought cheaply and sold at a profit to the consumer. AMD used that revenue to take their time with the Zen design, and without Zen in the market, Intel developed a fierce complacency that is costing them a bit these days.

However, that era has ended, and what we have now is promising in a way technology hasn’t been since I was much younger and less jaded. In 2017, AMD finally launched that Zen core via the Ryzen and Epyc lineups, and while the first generation wasn’t fully competitive with Intel yet, refinements since have closed the gap, and Zen 2 is in both next-gen consoles thanks to that same semi-custom business Rory Read built in the early 2010s. AMD’s GPU lineup has improved thanks to the cash flow and success of Zen, with RDNA being a much better design that didn’t fundamentally change the GCN thought process so much as tweak and optimize it for gaming graphics, and the RDNA2 architecture is likewise part of both next-gen consoles and looking great! Intel has been spurred to action, upping their core counts three times in the last four years, cutting their high-end desktop prices in half, and generally looking like they are starting to realize they need to innovate again. Nvidia, likewise rumored to be spooked by what RDNA2 is bringing to consoles, has worked to drastically improve their desktop GPU lineup, and Ampere seems likely to bring much better real-time raytracing performance along with a bigger rasterization uplift than the RTX 2080 Ti offered over the prior-gen 1080 Ti.

Lastly – and this is something I plan to get into in more detail in its own post – storage technology is finally getting a moment in the sun. The next-gen consoles’ move to SSD storage means we can finally see games optimized specifically for that hardware, rather than taking a hard-drive-optimized game installation and putting it on a much faster drive, which eventually hits diminishing returns. In this way, the consoles are actually ahead of the PC for now – thanks to fixed-function hardware, the PS5 in particular, but also the Xbox Series X, can stream assets in real time from its SSD, using it almost as an extension of RAM rather than just a slow library to be accessed in downtime. This optimization should also mean game installs get smaller again, as storing multiple copies of an asset to reduce seek times won’t be a concern in this era.

Overall, I’m actually excited to have consoles that clearly have advantages over the PC for now. It means that, as a PC-first gamer, my games are going to get more detailed, more interesting and feature-rich, and make better use of the investment I’ve made in my system than they currently do. Those advancements also mean we can get newer games that push more boundaries, getting closer to fully realized virtual worlds with fewer or even no compromises.

That is exciting, and a good light at the end of the dark decade of gaming hardware.
