LewAshby309 t1_itpko2t wrote
I don't even know what was wrong with the old pins.
Sure, you'd need 4 of them, but that leads to the question of why the power draw is so high in the first place.
The 4090 at 70% of its power limit performs in games on average only about 7% below what it does at the max power limit. They released a GPU that ships stock already past the efficiency limit; the graphs clearly show that. That's usually the OC range. If you raise the wattage from stock you gain maybe 2%. Once you're past the efficiency point you hit a wall where you need exponentially more power for a slight performance increase.
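To put a rough number on that (my own back-of-envelope math from those two figures, nothing more):

    # ~93% of the performance for ~70% of the power, normalized to stock
    perf, power = 0.93, 0.70
    print(perf / power)  # ~1.33, i.e. roughly a third better performance per watt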
It's a waste of energy that hands you barely any performance, with the risk of burning the power port and cable.
The only reason to do this is that the last few percent might give them the edge over the competition. And even then, we know that if AMD beats them they'll just come out with new GPUs.
It's stupid. For casual users stock should sit right at the end of the efficiency range, and everything past that should be territory for people who are into OC. There are tons of people who don't touch a thing and run it stock. They often don't even know they could get way better efficiency if they simply lowered a slider a little bit.
The upside for the consumer is that the coolers are built for the high TDP, which means if you lower it the GPU runs really cool, comparable to undervolting 30-series GPUs.
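If you'd rather do that slider move in code, here's a minimal sketch using the pynvml bindings (the nvidia-ml-py package). The 70% figure is just the example from above, not a recommendation for every card, and actually setting the limit needs admin/root rights:

    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    # Limits are reported in milliwatts
    default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

    # Target 70% of the default limit, clamped to what the card allows
    target_mw = max(min_mw, int(default_mw * 0.70))
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)  # needs elevated rights

    print(f"power limit set to {target_mw / 1000:.0f} W (default {default_mw / 1000:.0f} W)")
    pynvml.nvmlShutdown()

Same effect as dragging the power target slider in Afterburner, just scriptable.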
Tehnomaag t1_itpuarz wrote
The point was the data pins, so that PSUs built for the new standard can tell the card how much power they can supply. Then, on top of that, someone figured: wouldn't it be better if there were only a single cable?
Some optimistic engineer optimized it to the bone, and it was decided that yep, it can handle 600W just fine, as long as the stars align and the user doesn't bend the cable anywhere within visual range of the card.