While I work on putting together some more game entries for my games of the decade, I felt like doing a single-post, condensed list of the best gaming tech of the last decade.
The 2010s were an interesting dichotomy for me: I was an avid annual upgrader until 2011, then that dropped off and I stopped following the market almost completely until 2018, when the landscape was suddenly drastically different. It speaks to the modern nature of PC gaming as a hobby, though – consoles lead, and with modern consoles being lightly customized PC hardware stuffed into a small box and sold, once a new console generation launches (as happened in 2013), things sit in stasis for a while. That stasis held this last decade for another reason too, which we will get into.
For the sake of brevity (and some recent commentary on listicles and ideal sizes!) I’m going to narrow this to my personal top 3.
3. Real-Time Virtual Reality Hardware
This is included because I think it merits discussion – I don’t currently own any such hardware (I do have my eyes on a Valve Index though!) but the fact that technology has grown to the point of managing real-time 3D rendering in VR (meaning high refresh rates and dual rendering for the left and right eyes) is worth examining on its own. VR headsets for home use require a lot of considerations – motion tracking, 3D space management, display tech, audio tech, and managing all of this in a tiny visor that rides tight to your face. VR has not reached a point of popularity where game support is assured or mainstream, but there are a handful of major releases (Skyrim VR, Doom VFR, Hellblade: Senua’s Sacrifice VR) and a ton of indie titles. On top of that, Sony made a play for VR on the PS4 with its own headset and breakout box, and while title support is similarly limited there (doubly so because of the processing power limitations), the fact that it works at all is impressive.
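To put numbers on why that dual-display, high-refresh rendering is such a feat, here is a back-of-the-envelope comparison – a minimal sketch assuming roughly Index-class panels (1440×1600 per eye at 90 Hz; swap in your own headset’s figures) stacked against an ordinary 1080p/60 monitor:

```python
# Rough pixel-throughput math for why real-time VR rendering is demanding.
# Panel numbers are illustrative (roughly Valve Index-class hardware).

def pixels_per_second(width, height, refresh_hz, views=1):
    """Raw pixels the GPU must shade per second for a given display."""
    return width * height * refresh_hz * views

flat_1080p60 = pixels_per_second(1920, 1080, 60)           # a typical monitor
vr_90hz      = pixels_per_second(1440, 1600, 90, views=2)  # one panel per eye

print(f"1080p @ 60 Hz : {flat_1080p60 / 1e6:7.1f} Mpix/s")
print(f"VR    @ 90 Hz : {vr_90hz / 1e6:7.1f} Mpix/s")
print(f"VR is roughly {vr_90hz / flat_1080p60:.1f}x the raw pixel work")
```

And that is before accounting for the extra supersampling headroom headsets render to compensate for lens distortion.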
I don’t know that I’d say the 2020s will see VR become a breakout technology, but with a new console generation launching in 2020, the available horsepower in new systems, graphics cards, and CPUs should all enable better VR experiences and start to bring over more developers. Couple that with the lower prices of entry-level VR kits, and we are nearing a point where more people will at the very least try it. Having said that, I feel like unless there is a clear breakout hit with mass adoption, VR is going to be more like 3D TVs and less like high-refresh-rate monitors (probably the actual beneficiary of the increase in raw processing power available to the average consumer, IMO). Which is a shame, because we haven’t yet had the amazing VR MMOs that anime promised, and if that ends up never happening, then Sword Art Online just ends up being a bad piece of work instead of a prescient media experience!
2. SSDs
Yes, technically, the first consumer-level SSDs were available prior to the 2010s; however, their proliferation into gaming is worth discussing as part of the 2010s. When the technology first launched, it was a fantastic boost to Windows boot times and not much else – my first SSD was 60GB and barely big enough for that, but I marveled at the speed with which Windows Vista launched from that drive. Ever since then, the SSD market has been a fierce one, with new flash memory technologies driving prices down while keeping performance high, achieving a better balance of the two. Super-fast SLC as the sole storage medium gave way to MLC and then TLC, and fast modern consumer drives will often use an SLC cache to speed up everyday operation while storing the actual data in the slower flash once written. Samsung has since come out with QLC flash, which sacrifices some write speed but remains fast enough for consumer use while being cheaper per GB than other flash tech.
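If you want to see that SLC-cache behavior on your own drive, a crude benchmark will do it – a minimal sketch (the scratch file name, chunk size, and 32 GiB total are arbitrary choices of mine) that writes sustained data and reports throughput per chunk; on many TLC/QLC drives you can watch the speed drop sharply once the cache fills:

```python
# Watch a consumer TLC/QLC drive's SLC write cache fill up: write a large
# amount of incompressible data and report throughput chunk by chunk.
# Point TEST_FILE at a scratch location on the drive you want to test.
import os, time

TEST_FILE = "scratch.bin"             # arbitrary scratch file name
BLOCK = os.urandom(64 * 1024 * 1024)  # 64 MiB of incompressible data
BLOCKS_PER_CHUNK = 16                 # 16 x 64 MiB = 1 GiB per sample
TOTAL_CHUNKS = 32                     # 32 GiB total; size to taste

with open(TEST_FILE, "wb") as f:
    for i in range(TOTAL_CHUNKS):
        start = time.perf_counter()
        for _ in range(BLOCKS_PER_CHUNK):
            f.write(BLOCK)
        f.flush()
        os.fsync(f.fileno())          # force data out of the OS page cache
        elapsed = time.perf_counter() - start
        mb_per_s = BLOCKS_PER_CHUNK * len(BLOCK) / elapsed / 1e6
        print(f"chunk {i:2d}: {mb_per_s:7.1f} MB/s")

os.remove(TEST_FILE)
```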
SSDs have drastically improved my gaming experience as an MMO player – loading into instances in WoW, changing zones in FFXIV, all of these are faster than is possible with spinning drives. If you told me in 2009 that a fast 7200 RPM hard drive wouldn’t be good enough for a game, I would have laughed. The difference is plainly obvious for a game like WoW at this point in time, and while many other games are still designed to work around slow storage, with both new consoles rumored to use SSDs as their primary storage (there are even rumors of Samsung NVMe drives!), we’re likely to see an increase in performance that begins to mandate SSDs.
The good news in all of the above, though, is that first part – the focus on cheaper, consumer-grade flash memory means that getting an SSD big enough to hold your Windows installation and a few games is no longer a difficult endeavor, provided you are comfortable cracking the side panel of your machine to make the upgrade. With 250 GB drives starting at $30, they make sense for most users, and if you’re building a new PC anytime soon, you may be able to switch fully to SSD storage. (Unless you are a digital packrat like me, with a 3.1 TB Steam library fully downloaded and installed, along with 200 GB of Blizzard games and an FFXIV install!)
1. Sandy Bridge Microarchitecture from Intel
This one is a bit contentious, but I think nothing shaped the decade in overall hardware quite like this CPU launch from Intel. Sandy Bridge, the microarchitecture behind the Core i5-2500k and Core i7-2600k CPUs, absolutely defined the competitive race between Intel and AMD and was the point at which Intel clearly took over, a performance lead they maintained until just this year.
Sandy Bridge came from a more aggressive Intel, two to three generations deep into their Core series CPUs. Intel, still remembering the big wins AMD scored the prior decade – wins Intel was just starting to claw back in the datacenter space – wanted a part that completely reclaimed their dominance. Sandy Bridge was it, and boy, was it a massive leap forward: huge instructions-per-clock gains and easy overclocking let the parts reach very high clock speeds with rock-solid stability. I consider myself an AMD CPU fan, but during the early Core era, I was all-in with Intel, and Sandy Bridge made damn near everyone an Intel fan. Rightfully so, too.
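The arithmetic behind that leap is simple – single-threaded throughput is roughly IPC times clock speed, so an IPC gain and a big overclock multiply together. A quick sketch with purely illustrative numbers (not measured benchmarks):

```python
# Back-of-the-envelope model of single-threaded throughput:
#   performance ~ IPC x clock speed
# Numbers are illustrative only; the point is how an IPC uplift
# and an easy overclock compound.

def relative_perf(ipc, ghz):
    return ipc * ghz

baseline = relative_perf(ipc=1.00, ghz=3.3)   # stand-in for an older part
sandy    = relative_perf(ipc=1.15, ghz=3.4)   # ~15% IPC uplift at stock
sandy_oc = relative_perf(ipc=1.15, ghz=4.8)   # the famous easy overclocks

print(f"stock uplift : {sandy / baseline - 1:+.0%}")
print(f"overclocked  : {sandy_oc / baseline - 1:+.0%}")
```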
It is easy to dump on modern Intel as a company that rests on its laurels, using brand recognition coupled with sweetheart deals with system manufacturers and datacenters to coast ahead, but a lot of those deals have their origins in this era. AMD’s response to Sandy Bridge was the immensely disappointing Bulldozer FX series, which underperformed not only Sandy Bridge but also AMD’s own prior Phenom lineup. AMD was forced to spend the next five years, from 2012 to 2017, iterating on Bulldozer with newer designs while Intel continued to increase performance and clock speed. Although no subsequent Intel generation matched the huge jump Sandy Bridge offered, it didn’t matter – they were already far ahead of AMD, and the tweaks to Bulldozer just didn’t bring enough extra performance.
This also, however, was great timing for Intel for another reason. With the launch of the current console generation (Xbox One and PS4) and those systems using low-clocked, power-efficient AMD CPU cores, even a game made to take full advantage of the consoles could run easily on even the cheapest Sandy Bridge parts, and it is only fairly recently that the use of all 8 console cores has led to AAA ports running better on more modern, higher-core-count hardware.
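Amdahl’s law captures why this played out the way it did – as engines were tuned for the consoles’ eight slow cores, the parallel fraction of each frame’s work rose, and extra PC cores finally started paying off. A minimal sketch (the parallel fractions are illustrative):

```python
# Amdahl's-law sketch of why console-first engine design matters.
# If a game only parallelizes a fraction p of its frame work, extra cores
# stop helping quickly; once engines targeted eight slow console cores,
# p rose and core count began to matter on PC too.

def speedup(p, cores):
    """Classic Amdahl's law: p is the parallel fraction of the workload."""
    return 1 / ((1 - p) + p / cores)

for p in (0.50, 0.80, 0.95):
    row = "  ".join(f"{n} cores: {speedup(p, n):4.2f}x" for n in (2, 4, 8))
    print(f"parallel fraction {p:.0%} -> {row}")
```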
Lastly, Sandy Bridge serves as an interesting tale for the current state of the CPU market. Intel had a few generations where they were beating or within a hair of AMD’s Athlon and Phenom parts, but Sandy Bridge became the blowout generation, after which an AMD CPU only really made sense at low price points. With the 2019 launch of Ryzen 3xxx CPUs, AMD is starting to pull ahead of Intel, and if rumors about the 4000-series Ryzen parts hold true, by this time next year AMD will hold a lead much like the one it once suffered under at the hands of Sandy Bridge. There’s no denying that Sandy Bridge was, in all likelihood, the best CPU launch of the decade. Ryzen 3000 was exciting, but Sandy Bridge was an unprecedented event – a massive increase in performance at a mainstream cost, and a generation of hardware that remains in service for many even today.
With that, a few runner-up spots merit brief discussion!
Ryzen: Sandy Bridge was the defining launch of the 2010s, but while Ryzen’s 2017 launch was good, it didn’t quite set the world on fire in the same way, as mainstream performance was just enough below Intel’s parts of the era to push Ryzen to a solid runner-up. However, with 2018’s Zen+ and this year’s Zen 2 architectures, AMD has a solid foundation, and with another rumored 15%+ IPC increase for next year’s Zen 3 based parts (along with a small potential clockspeed jump!), the CPU of the 2020s could very well turn out to be Ryzen. I certainly hope it is – Intel is basically tapping out of the market and refocusing on datacenter and mobile until they can spin up 7nm (so the rumors go, at least).
RTX Technology: I’m of two minds on Nvidia’s RTX tech for real-time raytracing and the companion tech that makes it work well (DLSS for upscaling lower-resolution renders and tensor-core-powered de-noising to make the real-time aspect of RTX feasible). On the one hand, few titles support any of the RTX suite technologies (DLSS has been the most commonly adopted, surprisingly), and when the ray-tracing component is supported, it tends to kneecap framerates in games that benefit most from running fast. However, the simple fact is that RTX is technologically quite impressive – Nvidia gave up a huge amount of silicon real estate in the Turing GPU design to make RTX possible, real estate that could have been used to simply deliver a huge generational increase in rasterization performance. Instead, we got a 15-30% gain from the previous top-end Ti card to the new top-end (the main reason I am still using my 1080 Ti!) but also a feature that, when used well, looks impressive. Sure, it isn’t yet at the level most of us would want – games use too many baked-in lightmaps, both for practical lighting and as palette-setters, and until the art style of modern games evolves to take real-time, fully ray-traced lighting into account, it will remain a bolt-on feature – better looking in some ways but less true to the developer’s artistic vision.
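The logic behind DLSS is easy to put numbers on: render fewer pixels internally, then upscale. A rough sketch, with an assumed 1440p internal resolution (actual DLSS input resolutions vary by game and quality mode):

```python
# Rough math on why DLSS helps ray-traced framerates: shade fewer pixels,
# then upscale. The internal resolution below is illustrative.

def mpix(w, h):
    return w * h / 1e6

native_4k = mpix(3840, 2160)
internal  = mpix(2560, 1440)   # an assumed "quality mode"-style input

print(f"native 4K shading load : {native_4k:.1f} Mpix/frame")
print(f"1440p internal render  : {internal:.1f} Mpix/frame")
print(f"pixels actually shaded : {internal / native_4k:.0%} of native")
```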
Game Console Mid-Cycle Refreshes: I was loath to give them real consideration, but the PS4 Pro and Xbox One X are both interesting in the grander scheme of things. In the past, mid-cycle hardware refreshes were limited to silicon process shrinks allowing smaller heatsinks and power supplies, which in turn allowed the physical dimensions of the consoles to shrink and enabled price cuts as everything in the BOM got cheaper. This generation, we sort of got that (PS4 Slim and Xbox One S), but then both Sony and Microsoft took steps to iterate on their hardware and enable features that could have been done from the beginning, albeit not as well or as well supported. 4k output was the big one, although it remains fascinating that Sony took the less impressive way out – building the Pro and upcharging for it when it adds little more than clockspeed for the CPU and GPU and a GPU improvement that still only reaches 4k via scaling in most cases. Microsoft took a much better approach with the Xbox One X refresh, using the opportunity to ship a substantially beefier system with a higher-clocked CPU, a higher-clocked and more fully enabled GPU, and more, faster system memory. Some of this is down to the fact that while both systems use, more or less, the same APU from AMD, Microsoft disabled substantially more of the GPU and ran the CPU at a lower clockspeed in the launch Xbox One than Sony did in the PS4, while also using slower DDR3 system memory instead of the PS4’s GDDR5. Both refreshed systems deliver a better experience than their launch counterparts, although I am not a big fan of a mid-cycle refresh bringing more power (meaning that what was once a safe 7-10 year investment for a console, barring malfunctions, is now 3-5 years and then perhaps another 3-5 years within the same ecosystem). However, both companies shored up their weaknesses for the most part, and today the PS4 Pro and Xbox One X represent a fantastic out-of-box experience that works better with your stuff – proper 4k support, a better PS VR experience in the case of the PS4, software and UI updates already present, and larger, faster built-in storage.
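Those GPU gaps become concrete with the standard GCN throughput formula – FP32 FLOPS = compute units × 64 shaders × 2 ops per clock × clock speed – using the widely published figures for each box:

```python
# The GPU gaps described above, made concrete. For AMD's GCN parts:
#   FP32 TFLOPS = compute units x 64 shaders x 2 ops/clock x clock (GHz)
# CU counts and clocks are the widely published figures for each console.

def gcn_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

consoles = {
    "Xbox One":   (12, 0.853),
    "PS4":        (18, 0.800),
    "PS4 Pro":    (36, 0.911),
    "Xbox One X": (40, 1.172),
}

for name, (cus, ghz) in consoles.items():
    print(f"{name:11s}: {cus:2d} CUs @ {ghz:.3f} GHz = "
          f"{gcn_tflops(cus, ghz):.2f} TFLOPS")
```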
Wireless Technology in General: A decade ago, if you told me a wireless controller was good enough for even most latency-sensitive games, I would have said no way. Now, the Xbox One and PS4 controllers are both pretty damn well tuned for the systems in question, and the Xbox One controller even supports Bluetooth for use with PCs! Sure, there are still a few applications where it gets bad (I can’t play Fire Pro Wrestling World well unless I wire up my Xbox One pad), but overall they work far better than the wireless tech of the last generation, and one of the most versatile add-ons I have for my PC is an Xbox One controller.
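For the curious, getting an Xbox One pad talking to a PC script takes very little code these days – a minimal sketch using pygame (my library choice for illustration, not anything from the post), which works the same whether the pad is wired or paired over Bluetooth:

```python
# Minimal sketch of reading an Xbox One pad on PC via pygame.
# Works the same over USB or Bluetooth; Ctrl+C to quit.
import pygame

pygame.init()
pygame.joystick.init()

if pygame.joystick.get_count() == 0:
    raise SystemExit("No controller found -- pair or plug one in first.")

pad = pygame.joystick.Joystick(0)
print(f"Using: {pad.get_name()}")

while True:
    for event in pygame.event.get():
        if event.type == pygame.JOYBUTTONDOWN:
            print(f"button {event.button} pressed")
        elif event.type == pygame.JOYAXISMOTION and abs(event.value) > 0.5:
            print(f"axis {event.axis} -> {event.value:+.2f}")
```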
And with that, we’ve discussed most of the major gaming tech I would look back on this decade with reverence!