Thoughts on using AMD graphics in 2023

It’s been some time since I had my last AMD/ATI graphics card. In fact, the last one I had was a Radeon HD 7970 back in 2012. I had some longstanding issues with that card, as well as with the HD 4970 CrossFireX setup I ran even further back. I became so frustrated that I spent roughly a decade exclusively using Nvidia graphics cards, starting with a GTX 470 and ending with an RTX 2080 Ti.

Honestly, during that time I had some stinkers, like the GTX 970, and some real winners, like the GTX 1080 and RTX 2080 Ti. Something that didn’t change much was how I interacted with my graphics drivers. Nvidia’s drivers look, and behave, almost identically to when I first used Nvidia with the 7800 GTX. In fact, the last major change to Nvidia’s driver applications was the introduction of GeForce Experience back in 2013. Since then, the user experience has largely stayed the same. And while that can be good, it can also be a sign of stagnation. It keeps things simple from a user’s perspective, since you never have to relearn where features live in the driver applications, but it also means the GeForce drivers in particular feel absolutely ancient. Those seemingly endless lists of dropdown menus for tweaking settings feel old, awkward, and unpolished at this point.

Having said that, that familiarity also made me complacent. Despite years passing and a decade of innovation from AMD, I kept having flashbacks to the miserable experiences I’d had with their graphics division back in the early 2010s. So I stuck with Nvidia and managed to luck out buying a 2080 Ti back in 2019. It was stupidly expensive, but I really did love that card. It’s particularly bittersweet that it was also the last EVGA card I owned. I had come to count on their cards whenever I bought Nvidia, and the shakeup that ended with EVGA and Nvidia parting ways left a particularly sour taste in my mouth on top of my already shrinking opinion of Nvidia.

So, when the GeForce 40 series and RX 7000 series began to roll out last year, I started paying close attention to both offerings. I had already decided that I wanted a change from Nvidia and wanted to see where the current generation of cards would stand on performance. And while the top-tier 40 series cards were powerhouses, they were also plagued by early issues such as melted power adapters on some 4080 and 4090 cards. There was also the 4070, which looked honestly lacklustre given its price point and spec sheet. They just didn’t look good.

AMD, on the other hand, had hit back with the previous RX 6000 series and had competitive, if not high-end, offerings on the table. AMD had also spent that decade completely revamping their drivers, bringing them into the modern era both in features and in user experience. During this time I had also moved back to an AMD processor: after nearly a decade on the Core i7 2600K, I swapped it for a Ryzen 7 2700X in early 2019. What I found was a rock-solid product with competitive performance and a whopping 8 cores and 16 threads to play with, on a consumer platform that had been languishing for years with quad-core chips, or hexa-core if you were particularly spendy.

My computer as it stands now has been upgraded dramatically from its original build. I went from the aging 2700X, 64GB of 2400MHz DDR4, and the 2080 Ti to quite the jump in performance: I stuck with the same board and went for a 5800X3D, 64GB of 3200MHz DDR4 in two sticks instead of four, and a PowerColor Radeon RX 7900 XT Red Devil. My aging Acer 4K 60Hz monitor was also slowly causing more issues as my secondary display, and I moved from my Pixio 1440p 165Hz IPS to my first VA monitor since my Dell 2407WFP-HC from 2007. I chose another Pixio, this time a 1440p ultrawide, going from 2560x1440 to 3440x1440. And while it does overclock to 165Hz, I’ve kept it at 144Hz because the overdrive got far too aggressive and produced shadowing and ghosting that was frankly unusable for me. It’s not as responsive or clear as the IPS monitor, but it’s more than enjoyable in the games I play, which are primarily single-player focused: Fallout, Skyrim, Elden Ring, and the like, along with a heavy dose of Forza Horizon 5.

And what has it been like using this card since January? It’s been great. Drivers have been updated steadily, at least once a month, and day-one drivers have been solid, such as the most recent one for Starfield. In a somewhat ironic twist of fate, it seems Nvidia was caught on the back foot when it came to drivers for that title. That said, they were certainly further along than poor Intel, whose Arc drivers for it were abysmal.

When it comes down to it, I’ve been very happy with the card. Amusingly for me, it also means I’ve now owned a card from every AMD/ATI 7000 series. One of my first cards was a Radeon 7000, which was, frankly, awful as their bottom-of-the-barrel card at the time. I’m not quite sure how naming is supposed to work for graphics cards, or any computer components really, as the industry is rather chaotic about naming regardless of company. What I’ve found is that in my games I’ve gotten roughly a two-fold increase in performance while increasing my resolution by 34 percent. Forza, for instance, would typically hover in the 80-100 FPS range at 1440p before, but now averages 120-160 FPS at 3440x1440. And that’s with the raytraced self-reflections enabled in-game too.
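For reference, that 34 percent figure is just pixel count: 3440x1440 works out to roughly 4.95 million pixels, versus about 3.69 million at 2560x1440.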

Raytracing is an area where, because I started with first-generation raytracing on the RTX 2080 Ti, I’m still impressed even though it remains AMD’s Achilles heel. Would the 40 series give me better framerates under raytraced loads? Oh, absolutely! Strongly so, I imagine. But I’m also seeing greater performance than I ever saw with my 20 series, so I’m still happy with my card. If there’s one thing that would be nice to still have, it’s Nvidia’s AI upscaling, which is simply more mature than AMD’s FSR implementation. In fact, going back to Starfield, I had to disable FSR to get the game to run beyond the introduction mission; otherwise I’d get a crash to desktop, always at the same point. As to whether that’s down to AMD or Bethesda, I think it’s likely a combination of factors, and even without the performance boost FSR offers I still have a great experience in Starfield.

And that’s really what I’ve learned through all of this. AMD has absolutely worked hard over the last decade, and it shows in the performance of their current CPUs and GPUs and in the much more streamlined graphics drivers in AMD Adrenalin. At this point I have no qualms about recommending AMD graphics or processors. They’ve brought some great competition to Nvidia and Intel, and we’re finally seeing the results of those labours. Right now is a stark contrast to the awful malaise of the GeForce 30 series era and the pandemic shortages, and it’s great to see. I’d love to see pricing continue to fall, but I also know that none of these companies are incentivised to let that happen and are more than happy to carry the increased margins to the bank. But that gets into a whole slew of other issues beyond the scope of this article. So, instead, go with what is best for your wallet. There are plenty of options for the first time in years.

