Submitted by redhatGizmo t3_115a172 in technology
mvw2 t1_j92dp8e wrote
It's the same model as cars and houses. It works great when market volume is limited: you have limited sales, so you maximize profit on that limited number by focusing almost exclusively on the high end. It's good for business, but it forces customers to buy nothing, buy old, or buy into the high end at exorbitant prices. ALL the in-between stuff is gone.
Video card makers face a unique hurdle. They are pushing hard into edge-case tech, going as far as the physics of the materials allows. Outside of architectural redesigns and good software optimization, this is really a game of producing at the edge of science without massive scrap losses. Worse yet, all the costs are tied up in the process itself, so cards from low end to high end barely differ in manufacturing cost. It costs them hardly anything less to produce a bottom-tier video card, but they make a lot less from it. Plus they have to compete with other brands and with older generations of their own products. The push for performance and the push for higher prices go hand in hand, making sure the new stuff remains just barely the better value.
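To put a rough number on those scrap losses, here's a back-of-the-envelope sketch using the standard Poisson die-yield model. Every figure in it (wafer cost, wafer area, die size, defect densities) is an assumed illustrative number, not real foundry data:

```python
import math

# Poisson yield model: fraction of good dies ~= exp(-die_area * defect_density).
# Every number here is an assumed illustration, not real foundry data.
WAFER_COST = 15_000.0   # assumed cost of one leading-edge 300 mm wafer, USD
WAFER_AREA = 70_000.0   # rough usable wafer area, mm^2
DIE_AREA = 600.0        # a big high-end GPU die, mm^2

def cost_per_good_die(defect_density):
    """Cost of one working die once scrapped (defective) dies are paid for."""
    yield_rate = math.exp(-DIE_AREA * defect_density)   # share of dies that work
    good_dies = (WAFER_AREA / DIE_AREA) * yield_rate    # ignores edge losses
    return WAFER_COST / good_dies

for label, d0 in [("mature process", 0.001), ("bleeding-edge process", 0.005)]:
    print(f"{label} (D0={d0}/mm^2): ~${cost_per_good_die(d0):,.0f} per good die")
```

At these assumed numbers, the same big die costs roughly ten times as much per working chip on the immature process, which is why producing at the edge without eating scrap losses is the whole game.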
For raw cost, crypto miners and scalpers have unfortunately played a big part in the scarcity, driving real prices well above MSRP. That in turn makes older cards more valuable and pushes up the entire market as a whole. Everything is expensive just for the sake of being expensive, and a few people exist simply to make a pile of money from it. For these dumb reasons, a 10-year-old card can sell for as much as it cost new a decade ago.
Now, our saving grace as PC gamers is that games have long since become very flexible in their hardware needs. The move to Steam, and the hardware analytics it gave developers, showed quite starkly how ancient people's hardware had become. Game developers are forced to temper their designs to work on much, much older hardware. That has given customers a LOT of breathing room to hold onto very old tech, or make only minor upgrades, and still play brand-new games.
This wasn't always the case. In the early days, games drove hardware development, to the point where every new game was nearly unplayable on anything but the absolute newest hardware. You HAD to upgrade your entire computer every 2 to 4 years just to play anything new. You couldn't simply set the graphics to low and magically make an older PC run it; the tech and software advanced way too fast. You could literally spend $3k on a brand-new PC and still have a chance it couldn't play the game coming out 6 months later. The technology race was that aggressive.
That ground to a halt in the late 90s and early 2000s, when the model was no longer viable. It shifted the burden of flexibility onto game developers and forced them to slow down and accommodate. Today you can play current-year titles on a 15-year-old PC. Not long ago that was unfathomable.
However, this has also created a weird vacuum in the hardware world. There isn't much driving the next generations. You do have commercial and professional users wanting faster simulations, rendering, and so on, but outside of that, PC gamers haven't had much driving demand beyond high-resolution monitors. There isn't much actually pushing graphics cards these days. Many developers are also bound by the huge console market, which is stuck at one point in time, and all these new titles have to work there too. For PC folks, the push is pretty much solely monitor resolution, stepping from 1080p to 4K and now toward 8K. The need for that isn't even worthwhile; it's just there, and it's about the only thing pushing video cards forward. Sure, there's ray tracing too, but that's a singular use case. Developers still cater to decade-old tech, so the games themselves can never again push the boundaries like they used to. And things like 4K, 8K, and multi-monitor are just throughput equations. So the game in today's graphics card world is just that: throughput of pixel volume.
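To make that "throughput of pixel volume" point concrete, here's a quick arithmetic sketch: plain pixel counts times refresh rate, ignoring everything a real renderer does per pixel.

```python
# Raw pixel throughput = width * height * refresh rate.
# Pure arithmetic, ignoring everything a real renderer does per pixel.
MODES = {
    "1080p @ 60 Hz":  (1920, 1080, 60),
    "1440p @ 144 Hz": (2560, 1440, 144),
    "4K @ 120 Hz":    (3840, 2160, 120),
    "8K @ 60 Hz":     (7680, 4320, 60),
}

BASE = 1920 * 1080 * 60  # 1080p at 60 Hz as the baseline workload
for name, (w, h, hz) in MODES.items():
    rate = w * h * hz
    print(f"{name}: {rate / 1e9:.2f} Gpix/s ({rate / BASE:.1f}x the 1080p/60 load)")
```

By raw pixel count alone, 8K at 60 Hz is 16x the 1080p/60 workload, and 4K at 120 Hz is 8x. That multiplier, not the games themselves, is about all that's left pulling cards forward.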
thenrepent t1_j92e36w wrote
> For raw cost, crypto miners and scalpers have unfortunately played a big part in the scarcity, driving real prices well above MSRP. That in turn makes older cards more valuable and pushes up the entire market as a whole. Everything is expensive just for the sake of being expensive, and a few people exist simply to make a pile of money from it. For these dumb reasons, a 10-year-old card can sell for as much as it cost new a decade ago.
The demand for GPUs for crypto mining should be fairly low nowadays: it was primarily Ethereum being mined on GPUs, and Ethereum switched to proof of stake with the Merge in September 2022, which eliminated mining. So that demand has fallen away entirely.