Having two or more GPUs used to be the pinnacle of PC gaming, but now it’s a thing of the past. What happened?
Up until the mid-2010s, the fastest gaming PCs used multiple graphics cards, usually two but sometimes up to four. Back then, some of the best gaming graphics cards even packed two GPU chips onto a single board, consolidating a multi-GPU setup into one card. Nvidia's SLI and AMD's CrossFire multi-GPU technologies were seen as the pinnacle of any high-end gaming PC, promising to take your gaming experience to the next level.
Today, multi-GPU is a thing of the past — practically a relic in the computer world. The fact that most new GPUs don’t even support SLI or CrossFire certainly plays a part, but the popularity of multi-GPU was dropping off well before Nvidia and AMD effectively discontinued those technologies. Here’s the history of multi-GPU gaming and why it didn’t stand the test of time.
A brief history of multi-GPUs, from 3dfx to its decline
While modern graphics cards emerged in the early 2000s out of the rivalry between Nvidia and AMD, there were many more players during the ’90s. One of those companies was 3dfx Interactive, which produced the nostalgic Voodoo line of graphics cards. In order to gain a competitive edge, the company decided that two graphics cards could be better than one, and in 1998, it introduced its Scan-Line Interleave technology (SLI). It was a pretty genius move on 3dfx’s part since it encouraged more GPU sales and dissuaded Voodoo owners from switching to another card.
However, SLI was introduced right as 3dfx was heading toward bankruptcy, and the company was ultimately acquired by Nvidia, which obtained the intellectual property rights to everything 3dfx owned. Multi-GPU briefly stopped existing after the 3dfx acquisition, but Nvidia reintroduced SLI (changing the official name to Scalable Link Interface) in 2004 with its GeForce 6 series. It essentially worked the same way as it did before: Add more GPUs, get more performance. But there were some innovations with Nvidia’s take.
While 3dfx’s old SLI had each GPU render alternating lines of pixels (the “scan line” in SLI), Nvidia’s new SLI introduced two new rendering methods: split-frame rendering (SFR) and alternate-frame rendering (AFR). With SFR, each GPU renders a portion of a single frame, not by splitting the frame down the middle but by giving each GPU an equally intensive chunk to render. AFR, on the other hand, has each GPU produce a whole frame in turn. While SFR is great for reducing latency, AFR tends to deliver the most performance, albeit with much worse frame pacing and noticeable stuttering.
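To make the difference concrete, here’s a minimal, purely illustrative Python sketch — not actual driver code. The Gpu class, the region names, and the simple top/bottom split are made up for demonstration; real SFR implementations balance the split dynamically and composite the results. The point is just that AFR hands whole frames to the GPUs in turn, while SFR makes every GPU work on part of every frame.

```python
from dataclasses import dataclass


@dataclass
class Gpu:
    """Hypothetical stand-in for a physical GPU in an SLI/CrossFire pair."""
    name: str

    def render_region(self, frame: int, region: str) -> str:
        return f"{self.name} renders {region} of frame {frame}"


gpus = [Gpu("GPU0"), Gpu("GPU1")]


def alternate_frame_rendering(frames: int) -> None:
    """AFR: each GPU produces whole frames in turn (frame 0 -> GPU0, frame 1 -> GPU1, ...)."""
    for frame in range(frames):
        gpu = gpus[frame % len(gpus)]
        print(gpu.render_region(frame, "all"))


def split_frame_rendering(frames: int) -> None:
    """SFR: every frame is split so each GPU renders a (roughly equal-cost) portion of it."""
    for frame in range(frames):
        for gpu, region in zip(gpus, ("the top half", "the bottom half")):
            print(gpu.render_region(frame, region))


if __name__ == "__main__":
    alternate_frame_rendering(4)  # GPU0, GPU1, GPU0, GPU1 each render a full frame
    split_frame_rendering(2)      # both GPUs contribute to every frame
```

The sketch also hints at why AFR stutters: because consecutive frames come from different GPUs, any mismatch in how long each GPU takes shows up as uneven frame delivery.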
Similarly, in 2005, ATI (soon to be acquired by AMD) introduced its own multi-GPU technology, called CrossFire, but it was kind of a mess at first. With 3dfx and Nvidia cards, all you needed were two of the same GPU and a cable or bridge to connect them, but CrossFire required you to buy a special “master” card in addition to a regular graphics card. Then, instead of using a bridge, you used a weird DVI cable that plugged into both cards. Suffice it to say the first generation of CrossFire was poorly executed. It didn’t help that ATI’s GPUs at the time weren’t amazing either.
But CrossFire really came into its own with the introduction of AMD’s (formerly ATI’s) Radeon 3000 series, which featured the Radeon HD 3870 X2, the world’s first graphics card with two GPU chips on it. AMD went really far with this whole dual-GPU concept; its Radeon 4000 and 5000 series chips were actually quite small, so dual-GPU graphics cards made lots of sense. The HD 5970 in 2009, one of AMD’s best GPUs of all time, was often described as almost too fast to be practical. After this, Nvidia also began making its own dual-GPU cards.
After this point, however, the popularity of multi-GPU began to decline. Nvidia dropped the dual-GPU concept for its mainstream GPUs after the GTX 690 in 2012 and dropped it altogether after the GTX Titan Z in 2014. Just two years later, Nvidia made SLI exclusive to its GTX 1070, 1080, and 1080 Ti GPUs and reduced support from four graphics cards down to two. SLI was on life support after that, and it was finally axed in 2020 with the launch of the RTX 30 series, of which only the RTX 3090 supported it. Not that it mattered much, since Nvidia ceased SLI driver support from 2021 onwards.
Meanwhile, AMD kept making dual-GPU cards for years, only stopping with the Pro Vega II in 2019, a card exclusive to Apple Macs. AMD even claimed in 2016 that two RX 480s in CrossFire were a good alternative to Nvidia’s GTX 1080. However, AMD eventually gave up on CrossFire too: RX Vega in 2017 was the last AMD card to support it, and AMD appears to have stopped shipping per-game CrossFire optimizations in its drivers around the same time.
The many reasons why multi-GPU died out
Multi-GPU gaming came and went pretty quickly, all things considered. It only became a significant force after 2004 with SLI and CrossFire, and by the 2010s it was already in decline. Ultimately, the direction the graphics industry was heading and the growing appeal of single-GPU solutions rang its death knell.
GPUs were getting bigger each generation and eventually outgrew multi-GPU
When 3dfx introduced SLI, graphics cards were tiny devices with really low power draw, nothing like the behemoths we see today. Graphics chips tended to measure around 100 mm² in the ’90s and early 2000s, but this all changed when ATI launched its Radeon 9000 series, which featured a chip over 200 mm², double the size of anything the world had seen before. This started a GPU arms race that ATI/AMD and Nvidia kept escalating with each generation.
The thing is, larger chips require more power and better cooling, and while increased power draw didn’t really impact multi-GPU setups at first, it eventually proved to be a significant problem. By the time of the GTX 480, graphics cards had reached the 250W mark, and two 480s in SLI consumed an unbelievable amount of power. While AMD put significant emphasis on multi-GPU with its HD 4000 and 5000 series, that was really only because it needed something high-end to pit against Nvidia’s 480 and 580, since AMD’s own graphics chips were more midrange.
From the late 2000s on, almost every flagship made by Nvidia and AMD consumed at least 200W, often 250W. It might not be a coincidence that Nvidia’s last mainstream dual-GPU card, the 690, used two GTX 680 chips, which had a TDP of only 195W. The simple fact that single GPUs were getting bigger and better made SLI and CrossFire more difficult and less appealing to users, who usually didn’t want their gaming PC to also be a space heater and a jet engine.
Multi-GPU was buggy and required devs, Nvidia, and AMD to invest resources into it
Hardware trends were a problem for multi-GPU’s feasibility, and so were software trends. Back when Nvidia reintroduced SLI, games were much simpler, and even 2004’s best titles, such as Half-Life 2, look pretty unremarkable next to today’s games, however great they were at launch. SLI and CrossFire required Nvidia and AMD to bake per-game multi-GPU optimizations into their drivers to achieve good performance, and back then, that wasn’t a big deal.
But over time, games (and, by extension, GPUs) got more complicated, and optimizing for multi-GPU got harder every year. Even in titles with official multi-GPU support, the experience was often subpar due to poorer-than-normal performance or bugs. For a brief time in 2016, I ran two Radeon R9 380s, and when I played The Witcher 3, I often saw weird graphical glitches that sometimes covered up important features like cave entrances, making the game not just quirky but buggy to the point of being unplayable.
The only glimmer of hope for better multi-GPU software support was DX12 and Vulkan, which boasted such powerful multi-GPU support that you could even combine GPUs from different vendors in a single game. However, this just shifted the work Nvidia and AMD used to do onto developers, who stood to gain little from supporting multi-GPU, especially as Nvidia and AMD were phasing it out. So the software side of things didn’t pan out for multi-GPU gaming either.
Gamers just didn’t need high-end multi-GPU setups
Even if things on the hardware and software sides of the equation had worked out, multi-GPU gaming might have been doomed simply because it was overkill. Even the HD 5970 was described as overkill, and that was just with two midrange GPU chips. Still, multi-GPU stayed just popular enough to keep going for years, but I think its fate was sealed by a single event: the launch of the GTX 1080 in 2016.
Nvidia’s GTX 10 series was really just the GTX 900 series on TSMC’s brand-new 16nm node, but that alone was a big deal, since Nvidia had spent three whole generations on 28nm due to the slowing of Moore’s Law. Going from 28nm to 16nm made the GTX 1080 over 50% faster than the GTX 980 and 30% faster than the GTX 980 Ti. The 1080 also supported SLI, and its TDP was relatively low at 180W, but the raw performance of a single 1080 was insane in 2016.
This was pushed even further with the GTX 1080 Ti the next year, which boosted performance by nearly another 30%. A single 1080 Ti was nearly twice as fast as a 980 Ti and would certainly have been the superior solution over two 980 Tis in SLI. Nobody in their right mind would really want two 1080 Tis in SLI, not only because such a setup would have been hot and loud but also because twice the performance of a 1080 Ti would have been complete overkill (and not feasible in most games, even those with official SLI support). Imagine how crazy it would be to have two RTX 4090s in SLI.
Multi-GPU gaming could make a comeback
While PC gaming with multiple graphics cards seemingly isn’t coming back, the door for multi-GPU is actually still open. If you’re familiar with AMD’s CPUs, you’ll know that its higher-end desktop chips and all of its workstation and server CPUs use multiple CPU chips together instead of one big die. Using lots of smaller chips (also known as chiplets) is a technology AMD started using back in 2019, though only in 2022 did it bring chiplets to its GPUs with the introduction of the high-end RX 7000 series.
RX 7000 cards like the RX 7900 XTX, however, only have multiple cache and memory chiplets, and use a single GPU chip. Still, there’s reason to believe AMD might start using multiple graphics chiplets since it would cut down development and production costs while also making it simple to make new cards (just add or remove a chiplet and bam, new GPU). Intel could also go the same direction since it, too, is transitioning to chiplets.
While it seems Nvidia has absolutely no interest in chiplets, it would be surprising if AMD and Intel weren’t interested in bringing back multi-GPU with chiplets. Perhaps we’ll see the return of multi-GPU gaming with modern technology in the coming years if it can work well enough.