A new graphics card used to be one of the simplest PC upgrades. It was affordable, transformative, and devoid of unsaid terms and conditions. The latest GPUs from Nvidia and AMD might seem like massive upgrades to someone using an obsolete graphics card, but things get complicated when you compare them to their predecessors. Companies now reserve true generational gains for flagship SKUs, rely too much on software gimmicks, and increasingly price out the average consumer every generation. On top of that, they don’t have the decency to offer enough VRAM, even on their high-end offerings. Modern GPUs might be marvels of technology, but they stopped being straightforward upgrades long ago.
GPU prices have broken the upgrade psychology
GPU upgrades are a rich man’s game
If you’ve been building PCs for a while, you’ll agree that buying a new GPU didn’t involve nearly as much thinking as it does today. You didn’t have to worry about cost per frame, track pricing cycles, or hunt for stock. You simply bought the card that fit your budget, and you were almost guaranteed a massive performance boost. Even mid-range and budget GPUs were enough to transform your gaming experience. I still remember switching from a GTX 1050 Ti to a GTX 1660 Ti in 2019 and enjoying a significant performance uplift; I was finally able to play my favorite games at 1440p on my then-new monitor.
Expecting a similar performance upgrade today with an affordable GPU is borderline naive. GPU prices have risen to heights that would have been unthinkable just a few years ago. The RTX 2080 Ti was already overpriced at $999, but the RTX 3090 took things to a whole new level with its $1,499 MSRP. The RTX 4090 and RTX 5090, priced at $1,599 and $1,999, respectively, continued the trend, with real-world prices rarely staying near MSRP. This "luxurification" of consumer GPUs isn’t limited to high-end SKUs either. Prices across the product stack have been going up dramatically, affecting 80-class, 70-class, and 60-class cards, too.
With most worthwhile GPUs now costing more than many people would spend on an entire PC, users are holding onto their graphics cards far longer than they used to. Naturally, blowing almost your entire budget on a graphics card leaves little for the rest of the build. This has left the average user with few good choices: either delay a GPU upgrade indefinitely or settle for a budget card that ends up being a sidegrade.
The best generational gains are limited to high-end SKUs
Most consumers are left fighting for scraps
Unaffordable prices are just one side of the coin. The value of modern GPUs is getting harder and harder to justify, since raw hardware improvements grow slimmer every generation. Nvidia’s RTX 50 series felt like little more than a refresh of the RTX 40 series. Even AMD’s RX 9000 series, which was otherwise praised for significant advancements, wasn’t a massive upgrade over the best RX 7000 GPUs. If you wait for a new GPU lineup in hopes of a true generational uplift, you’re probably not going to be impressed by anything other than the flagship and high-end cards.
GPU manufacturers are clearly pushing consumers at every price tier toward the next tier up. It’s gotten to the point that cards further down the product stack are actively nerfed relative to the flagship. For instance, the RTX 5080 features only 49% of the CUDA cores of the RTX 5090. That figure was around 59% for the RTX 4080 versus the RTX 4090, against a roughly 70% average for the 80-class cards of the GTX 900, GTX 10, RTX 20, and RTX 30 generations (each measured against its generation’s flagship). Nvidia’s 70-class cards have fared worse, with the RTX 5070 featuring only around 28% of the CUDA cores of the RTX 5090.
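That math is easy to check for yourself. Here’s a quick back-of-the-envelope calculation in Python using Nvidia’s published CUDA core counts; swap in other cards if you want to extend the comparison.

# CUDA core counts from Nvidia's published specifications
cores = {
    "RTX 5090": 21760,
    "RTX 5080": 10752,
    "RTX 5070": 6144,
    "RTX 4090": 16384,
    "RTX 4080": 9728,
}

def share_of_flagship(card: str, flagship: str) -> float:
    """Return a card's CUDA core count as a percentage of the flagship's."""
    return 100 * cores[card] / cores[flagship]

print(f"RTX 5080 vs RTX 5090: {share_of_flagship('RTX 5080', 'RTX 5090'):.0f}%")  # ~49%
print(f"RTX 5070 vs RTX 5090: {share_of_flagship('RTX 5070', 'RTX 5090'):.0f}%")  # ~28%
print(f"RTX 4080 vs RTX 4090: {share_of_flagship('RTX 4080', 'RTX 4090'):.0f}%")  # ~59%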
A new GPU doesn’t feel like a big deal anymore when the only card you can afford has a massively cut-down chip. Product segmentation has always existed, but it’s far more aggressive today than it was around 10 years ago.
Modern GPUs have gaping VRAM limitations
As if the performance and pricing concerns weren’t enough
Even when we’re lucky enough to get a modern GPU that packs decent performance for the money, there’s always a catch. In the case of Nvidia GPUs, it is often insufficient VRAM that handicaps an otherwise powerful graphics card. At higher resolutions and heavy graphical settings, games can easily exceed the video memory available on GPUs with 8GB or even 12GB of VRAM. When that happens, you’re exposed to texture pop-ins, severe glitches, and outright crashes, and some new titles will refuse to launch altogether on a GPU with insufficient VRAM. Even at 1080p, a number of newer titles are known to exceed 8GB of VRAM. Sadly, VRAM is lacking not only on budget cards but on mid-range ones too. Even with the RTX 5080, a GPU rarely available below $1,000, you’re only getting 16GB of VRAM, which is simply unacceptable.
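If you’re curious how close your own card runs to its limit, you can poll VRAM usage while a game is running. Here’s a minimal sketch using the pynvml bindings for Nvidia’s NVML library (this assumes an Nvidia card with the pynvml package installed; AMD and Intel cards need different tooling):

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

# NVML reports values in bytes; convert to GB for readability
used_gb = mem.used / 1024**3
total_gb = mem.total / 1024**3
print(f"VRAM in use: {used_gb:.1f} / {total_gb:.1f} GB")

# Little headroom left is when pop-ins, stutter, and crashes start
if mem.free / mem.total < 0.1:
    print("Warning: less than 10% VRAM headroom remaining")

pynvml.nvmlShutdown()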
How many customers are going to buy an RTX 5090 just to get more than 16GB of VRAM? AMD and Intel fare better than Nvidia in this department, but we haven’t seen a GPU with over 16GB of VRAM from either brand this generation. At least their GPUs don’t cost as much as Nvidia’s, but even a $650 RX 9070 XT has no business shipping with just a 16GB framebuffer. It’s high time GPU manufacturers stopped skimping on memory and made 16GB of VRAM the bare minimum on modern cards.
Older-generation cards didn’t have massive framebuffers either, but we didn’t really need more VRAM in those days. As memory requirements grew, GPU companies stayed stuck in the past, artificially segmenting their offerings using the VRAM lever. The ongoing DRAM crisis has put another question mark over GPU memory going forward, and we most likely won’t see things change for another generation or two.
Performance is more conditional on software than ever
What are you paying for?
Perhaps the worst part of all this is that the advertised performance numbers increasingly depend on AI-powered tricks instead of raw hardware grunt. You’re paying more than ever for products that don’t deliver worthwhile gains for that premium, at least not without resorting to crutches like upscaling and frame generation. Instead of innovating on how GPUs can provide more raw performance every generation, manufacturers are hell-bent on making software the star of the show. And modern games, instead of being more optimized than ever, are only getting harder to run on mainstream hardware.
Upscaling isn’t optional anymore for the vast majority of gamers on budget and mid-range cards. You might not notice the difference between a native and an upscaled image, but I don’t like resorting to tricks after paying top dollar for a GPU. Frame generation is even worse, since it pads the FPS counter without improving the game’s underlying responsiveness. The latency penalties are significant, making frame generation viable only when it’s not really needed.
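To see why responsiveness doesn’t improve, consider a rough model. The assumption here, a simplification on my part rather than any vendor’s documented pipeline, is that interpolation must hold back one real frame before it can insert a generated one:

# Rough latency model for frame interpolation (a simplified assumption,
# not an exact reproduction of any vendor's pipeline)
def frame_time_ms(fps: float) -> float:
    """Time between frames in milliseconds at a given frame rate."""
    return 1000 / fps

base_fps = 60      # frames the GPU actually renders
shown_fps = 120    # frames displayed after 2x frame generation

# The interpolator needs the *next* real frame before it can generate
# the in-between one, adding roughly one extra base frame of delay
native_latency = frame_time_ms(base_fps)
framegen_latency = native_latency + frame_time_ms(base_fps)

print(f"Looks like {shown_fps} fps, but input-to-photon delay grows "
      f"from ~{native_latency:.0f} ms to ~{framegen_latency:.0f} ms")

In other words, the game looks smoother but feels like it’s running at the base frame rate or worse, which is exactly why frame generation works best when the base frame rate is already high.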
GPUs used to be expensive even 10 years ago, but what we got for the money was raw hardware grunt, not software trickery. Take frame generation out of the picture today, and you realize how underwhelming the RTX 50 series really is compared to the RTX 40 lineup. As things stand, we’re headed in the same boring direction for the next few generations of GPUs.
Modern GPUs are powerful, but they’re not really attractive purchases
Upgrading to a new GPU used to be simple. Over the years, however, changing priorities and silicon limitations have made modern GPUs a complicated area of PC hardware. They’ve not only become luxury products, but consumers are also expected to be okay with underwhelming generational gains. Manufacturers hold us hostage with VRAM limitations, and AI-generated frames have suddenly become the headline. GPUs are hardly the sensible upgrades they used to be, and we’ll probably never return to that reality.