Thoughts on Graphics and the Domination of Nvidia

When it comes to personal computers, two names dominate: Intel and Nvidia. Intel still retains the lion’s share of processor sales and Nvidia maintains a massive lead in dedicated graphics chip sales. One of the largest hardware surveys, Steam’s Hardware Survey, shows just how dramatic that reality is even now. As of late January 2023, Nvidia accounts for just under 76 percent of installed graphics cards and Intel for around 67 percent of the processors in systems surveyed. Both companies have held those massive leads over competing manufacturers for years. There has been a slow but steady erosion of Intel’s CPU dominance in personal computers since the release of AMD’s Ryzen processor line, starting with the 1000 series in March 2017, nearly six years ago.

When it comes to graphics, however, Nvidia has held fairly steady at roughly 75% of the market for years now. AMD consistently accounts for approximately 15% of the computers surveyed, and Intel for 9-10% of graphics chips. That figure isn’t quite fair to Intel, as it only began producing discrete graphics cards this past year, and those cards aren’t even quantifiable yet in the hardware survey. Regardless, Nvidia dominates the field and has for years. There are countless reasons for this, from mismanagement and poor performance in AMD’s graphics division, to ethically questionable decisions by Nvidia to wrest more control over the market, to simple long-standing misconceptions about AMD graphics cards. There are many more reasons beyond those but, ultimately, that’s not what this article is about.

No, instead, I want to address some of those concerns as well as explain what is going on in the graphics market at the start of 2023, and why now is an excellent moment for AMD and Intel to push their discrete graphics solutions hard to market, an opportunity that hasn’t existed for quite some time.

Firstly, 2022 was rough for Nvidia in terms of negative press. The RTX 4000 series, which launched to rave performance reviews, also came with more than a month of speculation about fire hazards associated with the new 16-pin 12VHPWR power connector on the RTX 4090. Ultimately, that was a mess that could be traced back to Nvidia, to board partners, and even to the PCI-Express industry group itself for a particularly finicky and fragile design. It stung all the more given the eye-watering prices commanded by the RTX 4090 and 4080, both coming in north of $1000 retail. In fact, they’re still upwards of $1200 for the 4080 and $1800 for the 4090. That they dropped the “12 GB 4080” and reintroduced it as the 4070 Ti for $799 (though you’re likely to find them in the $900-1100 range instead) was a slap in the face given the debacle surrounding that card.

But, for me, the biggest news was when EVGA announced they were dropping out of the graphics card market entirely, largely because of how they had been treated by Nvidia leadership for years. I fully admit I had a strong bias towards EVGA graphics cards. The first Nvidia card I chose was the venerable 7800 GTX, in a special edition by EVGA with an excessively long name: the EVGA e-GeForce 7800 GTX KO ACS³ Edition. Long story short, it was the crown jewel of EVGA’s 7000 series at release, featuring a custom all-aluminum cooler in a single-slot design with an overclocked graphics core and memory. Mine was in a sleek blue color and it was a sight to behold in 2005, not to mention a beast in the system it was installed in. I chose it not just because it put massive performance into a slim package but because EVGA offered a lifetime warranty at the time. Despite being in high school, I saw that as a huge win in case something went wrong. And nothing did go wrong. Instead, the next year I, along with most of the PC gaming community, was blown away by the performance of the GeForce 8 series. Again, I went with EVGA and got the 8800 GTX. That loyalty to EVGA carried through to 2019, when I bought the RTX 2080 Ti Black Edition from EVGA. I had always been impressed with EVGA’s quality, and their name meant a lot to me. In fact, I daresay it meant more to me than Nvidia’s name.

When EVGA announced they were ending more than two decades of partnership with Nvidia, it really set off alarm bells for me. They were Nvidia’s largest board partner and one known for being sent the highest-binned (most stringently tested) GPUs, which frequently put them into the world records for performance. They consistently offered great service too, something I rarely needed but always deeply appreciated having. Having them suddenly drop from the market was shocking to me, and I ended up reading and watching a great deal of content about the messy breakup between the two companies. As important as it was as a singular event, I feel it was also a bellwether for Nvidia for the foreseeable future. These aren’t small companies tussling over a few dollars. These are companies that measure their sales in billions of dollars. For EVGA to say they couldn’t continue shaving more and more of their margins for Nvidia, on top of all their other misgivings and complaints, wasn’t so much shocking (Nvidia has been known to be a problematic partner in the best of times) as it was damning of Nvidia’s business culture as a whole.

I try to consider a lot of factors when deciding which companies are worthy of my business, from the products themselves, to business practices, to the way they treat their employees, to many other factors large and small. I had been having misgivings about Nvidia for years, but this boiling over spilled so much more dirt on the company that it left a deeply bitter taste. So, I determined then to pay far closer attention to both Intel’s and AMD’s offerings. Intel’s first generation of cards is good, but nothing particularly interesting or appealing. They’re great as mid-range cards but not reasonable for the higher end. They’re certainly not an appealing replacement for a high-end card, and they’re particularly shaky when it comes to backwards compatibility with older games, which I am wont to play quite frequently.

So, Intel Arc was off the table. I had a feeling that would be the case given the market they were targeting and that Arc is a first-generation product with plenty of teething problems to solve. The cards are great for where they came in, but Intel has a great deal of progress to make to really compete beyond the budget to mid-tier line. I’m going to keep watching them as they, hopefully, continue to make headway in the graphics market. They’re arriving at a time when we desperately need more competition, and I hope they are wise enough to realize that, even with poor initial adoption, they absolutely have the capability to alter the graphics market dramatically for the better.

And that leaves AMD. My last AMD graphics card was an XFX Radeon HD 7970 from 2011. I’ll not mince words; I didn’t have a good experience with that card. I had stability issues stemming from both the drivers and the card itself and found the experience unpleasant. It drove me away from AMD cards for the next eight years’ worth of upgrades, culminating in my choice of the RTX 2080 Ti. I muddled my way past the HD 7970 with a GTX 970, a GTX 1070, and finally the RTX 2080 Ti in January 2019. It became easy to choose Nvidia almost out of habit. They had become a known entity and a safe choice. I could simply pop out the old card, put in the new one, and be off gaming in a matter of minutes without even having to install a new driver.

To say I had trepidation about switching to AMD would be an understatement. Again, my last experience with one of their cards was not a positive one. But I also kept in mind the years of good experiences I had had with ATI and AMD cards alongside the troubles. And I kept in mind the troubles I had had with Nvidia at several points with driver incompatibility, notably a conflict with photo editing software in the late 2000s that dragged on for the better part of a year. So, looking at what was available, I pulled the trigger and went with the PowerColor 7900 XT Red Devil, their top-end 7900 XT card. Amusingly, for me at least, this means I’ve now owned cards from each of ATI/AMD’s 7000 series: a Radeon 7000, a Radeon 7500 All-In-Wonder (a bizarre mishmash of graphics card and analog capture card that was incredibly useful back in the day), and then the two newer cards, the HD 7970 and the 7900 XT Red Devil.

Most amusing to me about the Red Devil versus my 2080 Ti is the sheer size difference. I thought the 2080 Ti was a big card when I got it, but it is absolutely dwarfed by the Red Devil. In fact, it is so large I admit I broke into a bit of a cold sweat when getting ready to install it in my Thermaltake View 71, a behemoth of a full tower case that is stupidly heavy to move. But, much to my relief, it does fit without my having to remove the drive cage, even if only by about 1.5 cm. It’s a fabulously quiet card, understandable when it hardly needs to spin its three fans given the leviathan of a heatsink. I really appreciate the industry’s move from smaller, higher-RPM fans to larger, slower fans to reduce noise while improving airflow. It makes the cards themselves bigger, but the result is cooler cards that can even turn their fans off entirely during normal desktop usage, which is really awesome for bringing down sound levels.
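That zero-RPM behavior is easy to illustrate. Here is a minimal sketch, entirely my own illustration rather than AMD’s or PowerColor’s actual firmware logic, of a fan curve with a fan-stop threshold and hysteresis (all the temperature and duty values are made-up examples):

```python
# Minimal sketch of a zero-RPM ("fan stop") fan curve with hysteresis.
# Thresholds are made up for illustration; real cards tune these in firmware.

FAN_START_C = 55   # fans spin up once the GPU reaches this temperature
FAN_STOP_C = 45    # fans stop again once it cools back below this

def fan_duty(temp_c: float, fans_running: bool) -> tuple[float, bool]:
    """Return (duty cycle 0.0-1.0, fans_running) for the current temperature."""
    if fans_running:
        if temp_c < FAN_STOP_C:
            return 0.0, False          # cool enough again: shut the fans off
    else:
        if temp_c < FAN_START_C:
            return 0.0, False          # idle on the desktop: stay silent
    # Above the start threshold, ramp linearly from 30% duty at 55 C
    # up to 100% duty at 90 C.
    duty = 0.3 + 0.7 * min(max((temp_c - FAN_START_C) / 35.0, 0.0), 1.0)
    return duty, True

# Example: light desktop use stays at 0 RPM, a gaming load ramps the fans.
running = False
for t in (38, 50, 60, 80, 50, 44):
    duty, running = fan_duty(t, running)
    print(f"{t:3d} C -> {duty:.0%} duty, fans {'on' if running else 'off'}")
```

The gap between the start and stop temperatures is the important part: without that hysteresis the fans would rapidly toggle on and off as the card hovered around a single threshold.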

But, aside from the construction quality of the card, my greatest concern in going back to AMD was software and driver quality. That old bias I had been carrying for years has also proven to be unfounded today. AMD’s drivers and software are now clearly far more advanced and featureful. In fact, they are better than Nvidia’s latest drivers and GeForce Experience in some respects, such as how the AMD Adrenalin software keeps track of game time and framerates, with suggestions for improving framerate that can be applied on a game-by-game basis. I also appreciate that AMD’s settings are all contained within a single application, Adrenalin, whereas Nvidia splits theirs between the driver control panel and GeForce Experience, two separate pieces of software with some overlapping capabilities as well as some unique to each.

It's especially handy as I am running an AMD processor as well. Now I can see in-depth information on temperatures, utilization, clock speeds, and even voltages for my system as a whole, graphics card and main processor alike. It’s a very cohesive experience that can be used for far more than just system monitoring. It’s a powerful tool for spotting irregularities when troubleshooting both software and hardware issues, and AMD has clearly put a great deal of time and effort into building it. That I can even log that information to files without resorting to additional software is another win for AMD. The built-in overlay, which you can customize to show exactly the information you want, is also incredibly useful. And, again, the seamless pairing makes a compelling case for running AMD graphics alongside an AMD CPU.
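Adrenalin’s logging is a Windows GUI feature, but the underlying idea of polling GPU sensors and writing them out is straightforward. As a rough sketch of the same concept on a mechanism I can point to concretely, the amdgpu driver on Linux exposes these sensors as sysfs hwmon files; the card index and exact sensor file names vary by system, so treat the paths below as assumptions to verify on your own machine:

```python
# Minimal sketch: poll amdgpu sensor files on Linux and log them to CSV.
# Assumes the GPU is card0 and the amdgpu hwmon interface is present;
# both can differ between systems, so check the paths before relying on this.
import csv
import glob
import time
from pathlib import Path

HWMON = Path(glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0])

def read_int(path: Path) -> int:
    return int(path.read_text().strip())

with open("gpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_s", "temp_c", "power_w", "sclk_mhz", "fan_rpm"])
    for _ in range(60):                                # one sample per second
        writer.writerow([
            round(time.time(), 1),
            read_int(HWMON / "temp1_input") / 1000,    # millidegrees C -> C
            read_int(HWMON / "power1_average") / 1e6,  # microwatts -> watts
            read_int(HWMON / "freq1_input") / 1e6,     # Hz -> MHz
            read_int(HWMON / "fan1_input"),            # RPM
        ])
        time.sleep(1)
```

The resulting CSV can be graphed or compared across driver versions, which is essentially what Adrenalin’s built-in logging gives you without any extra tooling.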

If anything, I think it shows how much Nvidia’s software has stagnated over the past several years. They don’t offer nearly as in-depth diagnostic tools, even solely for their own graphics cards. There are third-party tools, of course, that go further in depth than either AMD’s or Nvidia’s software, but what AMD provides out of the box is night-and-day more capable than what Nvidia offers. Many of the features that come stock in Adrenalin are left to board partners to develop for Nvidia hardware. Software such as MSI Afterburner, which is in a rocky state of support for geopolitical reasons (the lead developer is Russian and has stopped receiving payment due to the ongoing war in Ukraine), does cover some of them but also requires yet another piece of software, RivaTuner, for on-screen display capability. It’s honestly a bit of a mess. There are other options for simple monitoring, like GPU-Z, but few offer straightforward tuning options the way Afterburner does. Another option is EVGA’s Precision X1, but given that EVGA no longer produces graphics cards, that software is in limbo as well. Nvidia should have produced their own, given they already have the tools on their end for development purposes, but they’re always so closed off with their tooling that I doubt they’d ever release it under their current leadership under Jensen Huang.

At the very least, maintaining this status quo of a single chip maker holding three-quarters of the graphics market isn’t healthy for innovation. It also leads people to simply default to the name they know, for years on end. I should know; it did for me. We’re seeing what healthy competition brings in the processor market. AMD and Intel are both quickly developing and evolving their platforms, there is a real tit-for-tat in performance between the two, and buyers get strong discounts. There’s real incentive to choose either AMD’s platforms or Intel’s. It’s been wonderful to watch, and I hope both continue to fight it out for performance in the x86 space and in the development of potential successor platforms such as ARM or RISC-V. I want to see that happen in the graphics world too. And I think we are closer now than we have been in more than a decade. AMD’s modernization and strong development have led to their third round of Radeon 7000 GPUs (I still laugh that the numbering has looped back around for both them and Nvidia). Intel’s much-anticipated entry into discrete graphics finally came after years of rumors and delays. And despite Intel’s lack of high-end offerings right now, in the low to mid tier they provide a serious option, especially for more casual gamers or people who rarely go back to older games built on DirectX 10 or earlier.

