DRARCOX t1_j90ips1 wrote

I guess where I'm confused is... So what? My 2080 graphics card in my Lenovo desktop from 2019 runs everything I've ever thrown at it.

To me, this sounds like a problem for people who buy a new smartphone every year, not for anyone with enough sense not to "need" a card or other hardware that came out less than six months ago.

Maybe I'm just missing something.

20

Gargenville t1_j90p38s wrote

Maybe this is an age thing, but as an old-timer I do think it's kind of wild that we have multiple people in this thread talking about their crappy old 2080. My brother in Christ, that's a $700 GPU; it's insane for that to be the "barely hanging in there" card.

53

hurfery t1_j90ruwf wrote

Yes, lol. The window has been moved, and by a lot. The mining craze, the pandemic, the supply crisis, scalping, Nvidia taking maximum advantage of it all... All of this has normalized paying more than $500 for a GPU. People now go up to $2,000+.

19

NuTeacher t1_j91y5o5 wrote

I've also noticed that the window for acceptable in-game specs has changed too. It used to be that the target was 1080p at 60 fps, which even the lowest-end cards can hit now; the 1060 can generally get 50-60 fps at 1080p. But now people want to play at 1440p 120+ fps, or even 4K 60 fps. We've seen a huge leap in power over these last three generations, and card prices are reflecting that.

7

hurfery t1_j92e4q8 wrote

4K 120 fps is where it's at. 😎

Well, yes, but tech progress isn't really progress if you pay 3x more for 3x the performance. New tech is supposed to deliver more performance per dollar than the old.

7

Agreeable-Meat1 t1_j91rbhk wrote

Meanwhile I bought an MSI laptop in 2018 on Black Friday and other than the case falling apart, there have been 0 issues.

2

redbrick5 t1_j90t49q wrote

I think for decades we were accustomed to buying the latest, maxed-out components when buying or building a new computer, primarily because there was a huge improvement over the tech from two years prior. That's no longer the situation, and our lust for the best can't be rationalized against the cost anymore.

4

stu54 t1_j910sts wrote

The tech is still hugely better over two generations, but the power consumption keeps rising. Somehow a 120-watt last-gen GPU became entry level.

3

Overall-Business-624 t1_j90tnoo wrote

It is a problem when a game like Hogwarts Legacy barely holds 1080p 60 fps on an RTX 4090: https://www.youtube.com/watch?v=zofJ5yFvajA

1

Spot-CSG t1_j913u09 wrote

The ray tracing is broken, that's why. I'm playing at 120 fps at 1440p ultra settings with RT off. 5800X/3070 Ti.

11

eosh t1_j924ago wrote

Yep, I have zero issues with ray tracing off. All other settings on high with my 1080.

2

Nevermorre t1_j91b7ut wrote

That's why I upgraded to my current GPU. To be fair, though, I do think my RX 580 was starting to show its age, and I knew it wasn't going to handle a lot of what would be coming out fairly soon. I could still play on mid settings mostly fine, but this game specifically had some graphics bugs, mostly WILD water textures getting stretched all over the place and making vision impossible. There was also the area around where I customize my wand; I'm guessing it was the "hitbox," even though it was not an active item. Anyway, after poking around online, I found others with the same issues, and we all had the same card (I think I saw a 570, but close enough).

I'm not nearly as much of a gamer as I used to be; hell, I dropped gaming for almost two years and only really got back in when Spider-Man and Days Gone dropped on Steam. Not long after I finished Days Gone, Hogwarts Legacy was just a couple of months on the horizon. Not sure what I'll pick up next, but I like to keep my system ready all the same. Also, when I built my PC I planned to upgrade my main components one piece at a time, every few years. I'm not entirely sure where my Ryzen 5 3600 sits currently; from what I understand it's still a humble but competitive part that I hope to get a few more years out of.

1

jaakers87 t1_j91snvz wrote

That's not a hardware problem. It doesn't matter what GPU you have; that game stutters regardless. So why pay $1,500 when you'll have the same experience as with a $500 card?

1

Iron_on_reddit t1_j90mqxx wrote

Exactly! This is more about the hype/marketing strangling the weak-minded.

Also, PC gaming is much, much more than AAA games. In the last 10-15 years, only a few triple-A titles caught my attention, while I fully enjoyed many indie games, which require much more modest hardware.

PC gaming will be fine. The 34th installment of the same regurgitated, boring concept covered in beautiful but otherwise meaningless graphics? Maybe not so much, but it won't be missed anyway.

9

doneandtired2014 t1_j91frgl wrote

The gist of the article is that NVIDIA and AMD are focusing on the halo tier to the exclusion of all else and pricing their cards in an effort to maintain cryptoboom margins in the face of crypto collapsing like a neutron star (for the third time).

If you need a sub-$900 card today, your options are: 1) pay $100-$200 over MSRP for remaining RTX 30 stock, 2) try to snag RDNA2 products before the remaining stock pool evaporates, or 3) hope most of your games use DX12 or Vulkan on Arc.

Adding to that, performance gains have basically flatlined below the $400 price point. $380 in 2016 got you a GTX 1070. $300 in 2023 gets you... overclocked GTX 1070 performance from AMD and NVIDIA. Intel shines here when a game gets along well with their drivers; when a game doesn't, you get performance on par with a GTX 1050 Ti.

$200 gets you cards that aren't even as performant as the $200 options from seven years ago.
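To put that flatline in perf-per-dollar terms, here's a quick back-of-the-envelope sketch. Performance is normalized so a stock GTX 1070 = 100; the 2023 figure (~105, i.e. roughly overclocked-1070 level) is an illustrative assumption from the comment above, not a benchmark:

```python
# Back-of-the-envelope perf-per-dollar comparison.
# Performance is on an arbitrary scale where a stock GTX 1070 = 100;
# the 2023 number is an illustrative assumption, not measured data.
cards = {
    "GTX 1070 (2016)": {"price": 380, "perf": 100},
    "$300 card (2023)": {"price": 300, "perf": 105},  # ~OCed 1070 level
}

for name, c in cards.items():
    ppd = c["perf"] / c["price"]
    print(f"{name}: {ppd:.3f} perf units per dollar")
```

Seven years for roughly a 30% perf-per-dollar improvement at this tier, versus the near-doubling per generation that used to be common.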

6

mrturret t1_j985acb wrote

I mean, I'm still rocking the 1080 I got back in 2016, even after upgrading my motherboard, CPU, and RAM. It still runs new games great.

2