Both the article and the pushback are kind of silly here — the dGPU’s heyday was over a decade ago, back when “serious gamers” had a custom built PC on their desk and upgraded their GPU every two years at a minimum.
Back in 2008, gaming on a laptop started to become a possibility, and dGPUs were part of that story — but for the most part, good luck swapping out your GPU for a newer model; it generally wasn’t so easy to do on a laptop.
THAT was the beginning of the end for dGPUs.
By 2015, I had a laptop with both an iGPU and a dGPU. eGPUs were just appearing on the market as a way around the lack of upgradeability, but these were niche, and not required for most computing tasks, including gaming.
At the same time, console hardware began to converge with desktop hardware, so game studios, which had driven the dGPU market for over 20 years, fell into a slower demand cadence that matched console release cycles. GPUs stagnated.
And then came cryptomining, a totally new driver of GPUs. And it almost destroyed the market, gobbling up the hardware so that none was available for any other compute task.
Computer designers responded by doubling down on iGPUs, making them good enough for almost all tasks you’d use a personal computer for.
Then came AI. It too was a new driver for GPUs, and like crypto, it sucked some of the oxygen out of the PC market… which responded by adding iNPUs to handle ML tasks.
So yeah: dGPUs are now for the cloud services market and niche developers; everyone else can get their hands on a “good enough” SoC with enough CPU, GPU, and NPU compute to do what they need, plus the ability to offload weightier jobs to a remote server cluster.
Show me an iGPU that will compete with my 3080ti.
Yeah, dGPUs have been for niche applications for decades. I didn’t read the article, but the parent comment is vastly overestimating iGPU capabilities.
Show me non-niche software that needs more than a modern iGPU can provide. Your 3080ti can blast two screens of 4k video at 120fps HDR… and so can my iGPU.
I play video games. Those aren’t exactly niche.
Literally Minecraft and Fortnite will become much more enjoyable with a dGPU. That seems very not niche to me.
Yeah, but an Xbox’s GPU can play games too. You don’t need a 3080ti to play any game on the market. Dedicated GPUs are almost entirely a luxury upgrade given the power of today’s iGPUs.
My 3080ti significantly outperforms an Xbox. While you can game on a console, you can game better on a PC with a dGPU. An iGPU will get the job done, but a dGPU today continues to outperform it and gives you a better experience. I can play across three 2K displays at 165Hz, or step up to native 4K, and I can smooth framerates with ray tracing on at a non-upscaled resolution.
Cool? That level of performance is incredibly niche and not required to play any game. Maximal theoretical performance is one way to play, but not even a majority of PC gamers have that kind of hardware, and no one needs it (for gaming, at least). The only area where you need power to participate is VR, and standalone headsets run on phone hardware.
The performance differences between an Xbox, a laptop with a good iGPU, and a $3k gaming rig don’t matter if your top priority is having fun playing a game rather than tinkering with specs and hardware.
Here’s a list of the GPUs recorded in the Steam hardware survey. These are what gamers are actually using. Less than 10% of them are using iGPUs.
I never said there were more than that. I said most gamers can’t afford $3,500 computer setups, which is well supported by that data. Just because you can doesn’t mean that performance is required or even needed. What’s the difference between a smooth 60fps at 1080p on max settings and 165fps at 4K on ultra? Not much in terms of gameplay, beyond the perceived “better” experience.
Nope - even Steam is only a small subset of all gamers. Most gamers are on smartphones, others are on consoles, and many more play outside of Steam. And you have to read the Steam survey correctly and count the “Laptop GPU” entries as integrated.
But IMHO the problem is something else: there is currently no sweet spot for a cheap and capable GPU like the 1050 Ti was back in the day. Some people will spend thousands of dollars on a GPU, but GPUs have been really expensive for a long time due to Bitcoin, Covid, and AI. You can see that in the Steam survey: some of the most popular cards, like the 1050 Ti or 1060, are ancient.
That puts game companies in a really bad position: they could release games with fancy graphics, but only a small subset of gamers can run them on their rigs. And producing a multimillion-dollar game for €2,000+ devices is not a good business proposition.
Hey man, I’m old, but I still have a custom-built gaming PC on my desk and a fairly recent GFX card (3070 ti). Although I’d say I only upgrade my card maybe once every 3-5 years, depending on necessity.
This is the real write up right here. Article is pretty meh.