Comments
xSwyftx t1_j844xor wrote
For a lot of people it is as basic as which do I want to do more..eat or upgrade my pc
2160dreams t1_j84tlzd wrote
Situation I'm in for sure. Food + housing + pet supplies come before a PC build, no matter how old my current desktop is.
[deleted] t1_j85fsog wrote
[deleted]
kwamby t1_j85jivz wrote
Just had to pay $600 just to put my dog down. Couldn’t even give my baby peaceful rest without shelling out
Omegalazarus t1_j85t6w9 wrote
Damn, that has gone way, way up since I last needed it.
FSMFan_2pt0 t1_j8621ft wrote
Eh, that price is high, IMO. We put our dog down in 2021 and it was $270, and that included cremation and a fancy wooden memorial box for the ashes, and his name engraved on it.
Omegalazarus t1_j862aks wrote
Oh I was thinking just for the shot. Okay all the other stuff too. We never did funerary type stuff.
Edit: Sorry you had to do that
kwamby t1_j86gi2q wrote
Cremation was only $150. But still $450 for an easy death is a ton
Raiden115X t1_j87kd7r wrote
The place I had to take my dog to wanted an extra $150 just for me to be in the room with her when they did it. I couldn't afford it. Assholes.
DizzyCommunication92 t1_j86g9n2 wrote
I don’t buy into that ashes stuff…our landfill uses fire anyways. The memories never die
Stock_Regular8696 t1_j87mlle wrote
Would you throw your own mother into the landfill? Just memories, right.
upper_crust07 t1_j886ak3 wrote
Basic as making omelettes. Lol
MarkyDeSade t1_j87sq4o wrote
After reading so much about crazy scalper GPU prices in the past few years I’ve just gotten used to the idea that I can’t afford a new one so I’m more psychologically equipped to just stick with what I’ve got until it dies. I can’t be the only one who feels like that.
CrayziusMaximus t1_j8aphij wrote
You are not alone.
braveNewWorldView t1_j87a2db wrote
Somewhat divergent opinion from my personal experience. The scarcity caused first by cryptominers and then by the pandemic took a lot of the joy out of the PC building experience for me. It stopped being the fun experience of optimizing for performance and instead became a rush to get a complete set, even if it was subpar. Eventually found a next-gen console at a good price to wait things out and, damn, if I don't love the monthly game pass. Miss the keyboard and mouse but getting used to the controller. Thinking of getting a Steam Deck to clear my Steam backlog and take a step back into PC gaming. Though I'm really enjoying this monthly pass and haven't felt the urge.
cad216 t1_j87ur9w wrote
If you have an Xbox: There is a game pass for Windows, which you can access with your same Microsoft account that’s tied to the Xbox. Tons of PC exclusive games on there, my go-to when I’m in a gaming rut is AoE 2
MosesZD t1_j89kk9a wrote
It's more than that. Technology advances along a sigmoid curve until there's an entirely new breakthrough, and we're at the low-growth end of that curve after going through the high-growth phase.
Now it's a much more static, mature market with merely incremental changes. And while it may be a shock to the tech manufacturers, in these mature-technology markets sales will fluctuate far more with the economy than with the tech-driven replacement rate we went through during the 90s and 00s.
And I'm a great example of it. From 1988 through about 2008 I had to replace/upgrade my work and personal computers every two years to keep up with the changes in programs that got more powerful (or just bloated) to take advantage of all the power increases.
Now I get a new computer every 7 years on average. I got my first i7-based computer in 2009. I'm now on my third i7, which I bought just 3 months ago. That's about 7 years per PC now. Not two.
Clemario t1_j85yqm3 wrote
No one said it’s surprising
Tshoe77 t1_j8aiwic wrote
Woah you mean that after record sales of something you don't need to buy every year that there might be a downturn?
That's fuckin crazy.
xSwyftx t1_j844z97 wrote
For a lot of people it is as basic as which do I want to do more..eat or upgrade my pc
AnBearna t1_j841lvt wrote
The fact that people are forced into smaller and smaller apartments these days is a factor as well. There isn’t the space to have 4 big box machines for your home lab, so you buy cloud storage and a couple NUC’s instead.
Strais t1_j847mmg wrote
Is that a problem for literally anyone cause it sounds made up on the spot.
AnBearna t1_j889173 wrote
I pay over a grand a month in Dublin for a box bedroom in a house with 3 others. No, I am not an outlier, no I cannot magic more space for my homelab. It’s NUC’s/Pi’s, or nothing.
This is a common problem in Ireland. Too many people in Dublin, and the houses/apartments are small.
itb206 t1_j84ienf wrote
Clearly you don't live in NYC or SF, where even if you wanted to be a tech enthusiast unless you're shelling out 4+ grand every month you won't have the space.
Strais t1_j855vgj wrote
No, big cities suck and suburbs suck worse. Go find a better place to live (rent a full-size 2+ bed detached house for less than $500 a month) with some country or mountains around you. Quit trying to keep up with the Joneses; it's not built to be possible.
[deleted] t1_j85684i wrote
[removed]
argv_minus_one t1_j8657z8 wrote
Then how do you get non-horrible Internet access? They don't run fiber out there. They barely even run cell service out there.
[deleted] t1_j85g2ju wrote
[deleted]
argv_minus_one t1_j84wou1 wrote
Reddit: “Is [lack of space in your tiny apartment] a problem for literally anyone cause it sounds made up on the spot.”
Also Reddit: “Fuck cars! Fuck houses! Everyone except the rich should live in high-density housing, own nothing, walk to work, and be happy about it!”
HiCanIPetYourCat t1_j84x5nl wrote
It’s almost like there’s no such thing as Reddit and every single account here is an individual with different opinions
[deleted] t1_j84xi28 wrote
[removed]
lswins t1_j84y8zj wrote
I feel like greenman is just out to light straw men on fire
AnBearna t1_j889a9i wrote
But that’s not true entirely. There’s echo chambers on Reddit as much as their is on FB or Twitter, and there’s a reason why people occasionally use the term ‘Reddit Hive Mind’.
ThrowAway4564468 t1_j84l3uk wrote
Gamers propped the market up. GPU manufacturers betrayed gamers by inflating prices when crypto mining became popular. Console manufacturers are putting out high-quality hardware, even selling at a loss, and are getting PC gamers to convert. Fewer PC gamers, fewer CPUs being sold. That's my guess.
HYPERBOLE_TRAIN t1_j84r9kq wrote
I’m a multi platform gamer. For the past several years I have been enjoying more and more games that don’t require high-end hardware. AAA publishers have been fucking over gamers for years and are churning out garbage. Meanwhile there are small devs out there making games because they enjoy games. I can play those games on hardware that is 10 years old.
Acualux t1_j84ryue wrote
This is the way
Doggleganger t1_j8k16km wrote
I've put over 100 hours into a random indie game called Into the Breach that could probably run on a SNES.
KingArthas94 t1_j84ulvs wrote
Which AAA game hurt you?
sampat6256 t1_j84y2eo wrote
Anthem
[deleted] t1_j85gh91 wrote
[deleted]
KingArthas94 t1_j85u5im wrote
It just means you rely on other people's opinions instead of forming your own. You should try AAAs made since 2015, much better than whatever we had from 2010 to 2015, starting with The Witcher 3.
Dave4lexKing t1_j85yet0 wrote
> It just means you rely on other people's opinions instead of forming your own
proceeds to give an opinion
KingArthas94 t1_j864jww wrote
I did say “play the games and make your own”, there’s a difference.
Accomplished-Ad3250 t1_j84q36d wrote
If I remember correctly every console ever produced has sold at a loss until years later. Many never turn a profit on the console itself, just the games.
futureygoodness t1_j84v7e4 wrote
Nintendo uses less advanced hardware and sells its consoles with a positive margin.
[deleted] t1_j85gnte wrote
[deleted]
futureygoodness t1_j85nrhy wrote
That’s like saying Disney would be dead if not for their massive IP + coming up with new animated princess musicals. Yeah, that’s why the company is really good at sustaining its core franchises and coming up with new mechanics every generation, because it’s profitable and no one else does it as well.
KsnNwk t1_j84zv0r wrote
Except PS4 and Nintendo.
At this point the PS5 and XSX are probably already selling at a profit too.
But PS4 and Nintendo usually profit or break even from the start. That was the only reason I did not get a PS4. Skipped PS for the first time ever, but I was also frustrated with the lack of an SSD in the PS4. It was actually a good turnaround for me; that's how I got into PC gaming and MMOs.
Now I've enjoyed the PS4 library on PS5 and PC. Gave away my PS5 to my cousin, since PS5 games are coming out on PC now too. I'd rather spend more money on upgrading my PC than spend money twice on hardware.
Albeit I may come back to PS5 gaming if PSVR2 becomes big and gets a nice library of games.
Arentanji t1_j87gyvy wrote
You do not remember correctly. That model was never the truth. Most consoles at least break even.
Rain1dog t1_j883y6n wrote
PS4 was not sold at a loss, and the ps5 disk version became profitable a few months after being sold.
sketchysuperman t1_j868f0u wrote
I’ve got a 12600k/RTX 3070, (had GPU for 2 years now) and except for games I prefer with KB/M, I have to admit I’m on my Series X and I prefer it.
On top of what you mentioned, I think this is also a function of where the current gen consoles are in their life cycle. Give it another GPU generation or another 1-2 years in the consoles lifespan, I’ll start moving towards the PC again.
n0oo7 t1_j8dbkhg wrote
Exactly this. Even the PS5 is becoming easy to buy. PC gamers are going back to console.
GachaSheep t1_j84xc2k wrote
I want to build a new PC for my husband to replace his old 4790k/GTX970 build, but just the graphics card we’re aiming for alone is nearly the cost of a PS5. Sure, the costs of high-end cards came down once crypto crashed, but it feels like that whole thing barely made a dent in the prices of mid-tier cards.
glaive1976 t1_j86ssf9 wrote
>I want to build a new PC for my husband to replace his old 4790k/GTX970 build, but just the graphics card we’re aiming for alone is nearly the cost of a PS5.
You might consider a mitigating step and just do the graphics card. Even stepping to an RTX 3060 might be enough to go another 2-3 years on that rig especially if he's rolling 32GB of ram.
PseudonymIncognito t1_j8j7mdy wrote
If you're looking in that price tier, I'd go AMD where the deals are much better. A 6700XT is basically the same price as a 3060 and is a substantially more powerful card.
glaive1976 t1_j8k2miu wrote
Good point, I was thinking from a mixed-use standpoint where Photoshop stability and performance are very important, whilst OP and her hubby are likely focused on gaming. Quality suggestion, friend.
Rain1dog t1_j884fh1 wrote
The PS5 is an outstanding console (same with the Series S and X). I have great VR, outstanding titles from PS1 to PS5, and a fantastic controller.
It works all the time without any tinkering and it takes 15 seconds from sitting down to turning TV on, Ps5, to be in game playing.
Can play games from 120, 60, 45, 30 fps.
It is basically like a mid-tier PC running something similar to an RTX 2070.
I play online all kinds of games, single player, etc.
Definitely something to consider. Outstanding generation of consoles.
pixel_of_moral_decay t1_j86m2uc wrote
This is where I am right now.
Would love to upgrade, but graphics cards are still way overpriced.
Those of us old enough to remember pre-crypto pricing know nobody paid the insane MSRPs; there were always sales or rebates 60-90 days after a launch. So going back to MSRP is still way over what we'd normally pay.
Ninety8Balloons t1_j87kipb wrote
>Sure, the costs of high-end cards came down once crypto crashed
Pretty sure the MSRP of cards has actually increased since crypto crashed.
Northern-Canadian t1_j87yuuz wrote
Sorry, just chiming in here…
Can someone explain to me how graphics cards have anything to do with crypto? Wouldn’t RAM and CPU make the most sense?
Logpile98 t1_j8a2y7j wrote
When crypto was brand new, mining started off using CPUs. However, that quickly gave way to the creation of ASICs (application-specific integrated circuits), which are purposely designed and built for just crypto mining, and often one specific hashing algorithm (like the one bitcoin uses).
There was a big belief in the community at the time that the intention of crypto should be "one PC, one vote". The ideal network was secured by a shitload of people using their PCs to mine and each person would have a vote in the next block chosen. ASICs eliminated that possibility because they were orders of magnitude better at mining than even a crazy high-end desktop PC. Regular people couldn't contribute to mining power significantly anymore, it was all about who had the best/most ASICs. That's why although bitcoin is the largest cryptocurrency by far, it doesn't directly affect consumer PC components when its mining becomes more/less profitable.
Enter Ethereum. This shook up crypto in many ways, one of which was making the network less centralized, by having a shitload of miners rather than a few with mega expensive mining rigs. Now I don't know enough about the technicals here, but it was intended to be ASIC-resistant, and for that to happen it doesn't use just one single hashing algorithm. So you can't build a circuit specifically optimized for one type of math problem; you need to be good at a bunch of them. Basically you'd need an all-around better processor. And for whatever reason, GPUs tend to be better at crunching large amounts of numbers like you'd need for hashing algorithms. The analogy I've heard is that a CPU is like a fighter jet and a GPU is like a cargo ship. The jet will get there much faster, but because it can't carry much, the ship will get 10,000 tons of cargo to its destination much sooner.
Of course, it didn't take long before people started buying a bunch of GPUs and hooking them together for mining. But at the very least, if you wanted to have 10x the hashing rate of the best GPU available, you had to spend ~10x as much on GPUs. Compared to bitcoin, where 10x the spend could get you thousands of times more power.
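The "guess a shitload of hashes" loop described above can be sketched in a few lines. This is a toy single-algorithm (bitcoin-style) proof-of-work search, not Ethereum's multi-algorithm scheme; the block data and difficulty are made up for illustration:

```python
import hashlib

def mine(block_data: bytes, difficulty_prefix: str, max_nonce: int = 2**20):
    """Brute-force a nonce until the hash starts with the target prefix.
    Every nonce check is independent of every other one, which is exactly
    why this kind of search maps so well onto thousands of GPU cores."""
    for nonce in range(max_nonce):
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce, digest
    return None, None

# Very low difficulty ("00" prefix) so the demo finishes instantly;
# real networks require far more leading zeros.
nonce, digest = mine(b"example block", "00")
print(nonce, digest)
```

An ASIC is essentially this loop baked into silicon for one specific hash function, which is why it crushes general-purpose hardware on that one function and is useless for anything else.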
Northern-Canadian t1_j8aoevi wrote
Thank you for taking the time to elaborate on this.
makeasnek t1_j9irsar wrote
This is an excellent explanation. Also worth adding that Ethereum moved to "proof of stake" a while ago, so nobody is buying GPUs to mine Eth at this point, whatever effect they have (if any) on current GPU prices has a time limit on it. There are other cryptos which use GPU mining, but they are not particularly popular and couldn't absorb much of the miner exodus when Ethereum switched to PoS.
Keks3000 t1_j88700f wrote
The operations that are needed to calculate (or rather guess) the keys required to mine new blocks on the blockchain are best run on graphics cards, hence the demand created by crypto, and the price hikes that came with it.
I’m not sure why that is the case though, maybe someone can explain how a GPU is better suited for the job than a CPU. I think it somehow has to do with CPUs focusing on parallelization and energy efficiency in recent years, while GPUs are more like raw power work horses.
Northern-Canadian t1_j8aog83 wrote
Thanks for explaining!
NecroAssssin t1_j8blle5 wrote
It's because GPUs are optimized for math, since doing anything fancy (literally anything more than the original command-line environment) with the display is a lot of math. CPUs, however, are much more generalized, so they can give sane output in a variety of different ways for downstream processes.
KsnNwk t1_j852uzs wrote
Consoles have roughly a 2700x and an RX 5700 / RTX 2070 in them.
A 5600 and 3060Ti (2080 Super) would already run much better than what consoles can do. It would run 1440p, 60fps, high settings in newest games.
If you aim at 4K in the newest games, then I would recommend at least 4080 for Ray Tracing or 7900XT for non Ray Tracing gaming.
The 5600 CPU would still be enough for high-fps gaming. Unless you can find an AM5 board, a 7600, and DDR5 for a similar price or not much more (100-200), I would go the route of a 5600, DDR4, and B550.
GachaSheep t1_j854e9a wrote
Your economic recommendation is… purchasing GPUs that are the price of an entire console, and then for 4k, recommending me cards that cost even more?
I understand you had helpful intentions, and perhaps I should have clarified better, but the cost of pc building (and manufacturers actively choosing to reduce production/withhold stock to keep prices high) in this economic climate is the entire problem/reason why people like myself are hesitating to build even when we want to, and why PC CPU shipments are in decline.
Suggesting I purchase even more expensive parts than I was even planning for the build in the first place is kind of missing the point (though to be fair I suppose I never posted the intended build).
Wind_14 t1_j86etqd wrote
Second-hand GPUs have definitely gone down in price. A second-hand 3060 Ti is around $300 or cheaper here. Even the 20 series can be had quite cheaply (someone claims they got a 2070 Super for just $200, although that listing is a rarity). The AMD RX 6600/6650 XT is always a really nice budget GPU (they're the winners for fps/$ at 1080p and 1440p). There's also a rumor about the Intel Arc 750 being discounted to $250 now, although you'll need to check that yourself. The 16xx series is also quite cheap, though obviously the upgrade might not be as big as you hope.
Ethereum's move to PoS basically made GPU mining almost unprofitable, so there are tons of second-hand GPUs flooding the market, and if you/your hubby know how to benchmark a GPU they're usable (mining rigs usually undervolt the GPU, so they're often just as good as a normal second-hand GPU, but not every miner takes good care of their rig).
There are actually 2 listings on eBay for a 2070S under $200, but they look very sketchy.
KsnNwk t1_j855uf7 wrote
If you want a PC for cheap, one whose performance rivals consoles:
You can just get an RTX 3060 and a Ryzen 5600. That's 100 for the mobo, 100 for RAM, 120 for the CPU and 329 for the GPU. It's still a little more powerful and should handle 1440p 60fps high-settings gaming with DLSS.
That's 150 more than the consoles, but on PC you don't pay for online and games are cheaper over time.
Also, consoles, albeit advertised for 4K, run most games at 1440p-1800p dynamic resolution, and at medium settings at best, to achieve 60fps.
PlutoniumChemist t1_j86caxt wrote
Budgeting 0 dollars for you power supply is a good way to burn your house down
Consoles ship with a case, your PC doesn't
Is Windows still free?
Assuming they are going to use the kb/m and monitor they already have
Stock_Regular8696 t1_j87nglo wrote
Windows will always be free 🤫
STILLADDICT t1_j86ff6n wrote
Good call out. HD/Power/accessories/case/software/monitor.. It's already over $1k ez. Hopefully it comes with free shipping at the least.
KsnNwk t1_j881jnr wrote
She has a PSU, case, and accessories from her old PC. Don't get why she can't use those; that's why my recommendation didn't include them.
[deleted] t1_j89ozgm wrote
[removed]
PlutoniumChemist t1_j88wz05 wrote
Oh so you don't get a brand new PS5 from the store? You have to grab a screw driver and manually take everything out of your PS4 except for the power supply, then put all of the new PS5 components into the old PS4 case??
I'm a PC gamer, I even have a full custom loop that takes a bit of work to maintain. But it's simply fact for the last several years that consoles are a better value for straight gaming. PC catches up in value if you're reusing old parts, using it to multitask with productivity/WFH, and take advantage of video game discounts/sales, opposed to ps+/Xbox live which has a monthly fee. But the immediate cost is simply higher than console
KsnNwk t1_j892vtk wrote
Your comment makes no sense.
They can reuse their case and PSU, and that's normal. If they're changing the mobo and CPU anyway, it's all one job.
If you upgrade your GPU or CPU+mobo+RAM, it's not like you have to rebuy the PSU and case every time.
Arentanji t1_j87gtj4 wrote
Series X is generally considered equivalent to a 3060 Ti. So congratulations, you spent twice as much to get roughly the same performance.
KsnNwk t1_j8827di wrote
Consoles play on medium settings (at best) to get 1440p-1800p upscaled to 4K, and they utilize dynamic resolution to achieve that performance.
When you consider most new games have DLSS or FSR 2.1, a 3060 achieves the same performance as the consoles, or even better, at the same settings.
Plus she already had a PSU, case, and accessories. So that's only $150 more for a 3060 PC and around $300 more for a 3060 Ti.
Additionally, if you look at the used market you can regularly get a 3060 Ti for $300 or a 3080 for $600, in good condition with warranty left.
The difference is easily made back over the years by not paying for online, cheaper games, more indie titles, and being able to upgrade your GPU over time.
While for consoles you have to buy a brand new one every time a new gen is out, and prices of games and subscriptions for consoles are ever increasing.
Rain1dog t1_j885tud wrote
I bought a PS4 for 399.00 in 2014 and it played flawlessly until 2020 when I got a PS5.
I’ve paid on average around 3.25 a month for online access plus 3 games per month with PSPlus. You can now get over 900 games with your subscription for a few dollars per month with their expanded service.
The only time you spend 70.00 is when a game launches. After a few weeks games usually drop around 15-25%, and if you are a PS Plus member you usually get an additional 10% off sale prices.
I got Cyberpunk steel book edition for 5.99, Dying Light 2 for 25.00, Dead Cells with season pass for around 9.00, Tiny Tina WonderLands for 15.99, Witcher 3 Season Pass for 5.99. Games go on sale at insanely cheap prices every other week on the ps store. If you have disk version you can get launch titles days after launch for 1/2 from people selling after they beat the game. On average I’m spending 2.99 to 25.00 for games in Sony’s ecosystem.
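Those two discounts stack multiplicatively, not additively. A rough worked example (the prices and percentages are illustrative, not actual store data):

```python
# Hypothetical launch price and typical discounts described above.
launch_price = 69.99
sale_price = launch_price * (1 - 0.25)    # ~25% drop a few weeks after launch
member_price = sale_price * (1 - 0.10)    # extra 10% off for PS Plus members
print(round(member_price, 2))             # prints 47.24
```

So a "25% off plus 10% off" sale is a 32.5% total discount, not 35%, since the 10% applies to the already-reduced price.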
Since the consoles are all alike, devs can get some absolutely insane-looking games running on such cheap hardware. The graphics they pulled off on that shitty Jaguar CPU from 2013 were mind-blowing. Sony pulled off voodoo magic running VR as well as they did on the PS4 with that CPU. While a console will never match hardware that launches 3 years after it, it's damn close.
Then if I’m online I rarely come across people running hacks, aimbots, etc.
I’ve switched over to playing on my ps5 PSVR2 almost 100% online/single just how great it is and how convenient it can be.
I get that not everyone likes consoles, but this generation has been a massive leap. Great CPU, GPU, and SSD, all for 399/499. Absolutely outstanding.
KsnNwk t1_j895xp6 wrote
The opposite.
I'm a multi platform gamer, got PC 4K, PS5 and NS.
Edit:
But the PS4 on release was underpowered compared to the PCs of the time.
A year before its release I already had a 4770K, GTX 770, and 750GB of SSD storage.
That machine was faster in everything by a margin of two, and loading times were faster by a margin of 5x.
I agree though, the PS5 has been a massive leap and a positive outcome. It has aged way, way better than any console before it in terms of performance and features (PSVR2, VRR, HDR (HGIG), AI adaptive resolution).
ReviewImpossible3568 t1_j854vfi wrote
An RX 5700? Not at all. Those are RDNA2 graphics and Zen 2 (3700X, not 2700X). If I recall correctly (I looked it up once upon a time, but don't remember the exact SKU it was similar to), it's more like a 6700XT/6800. In raw performance, a 3060Ti would actually be pretty close.
PC is still better in my personal experience, but the consoles are unbeatable value right now and there’s no need to act like they have worse hardware than they do.
Edit: looked it up and the core count is in between 6700XT and 6800, but it’s also clocked lower so it probably performs about in the middle given the slightly lower overhead and higher optimization that you can get on a console.
KsnNwk t1_j85572g wrote
Nah, Digital Foundry said it was around 2070 performance IIRC. The 3060 Ti is equal to a 2080 Super. IDK about Radeon, but the 2070 was similar to the 5700 at the time.
ReviewImpossible3568 t1_j858roy wrote
I don’t know what DF said, but by looking at the spec sheet I can tell that the GPU in the Xbox Series X is in between the 6700XT and the 6800. Maybe real world performance is in between 3060 and 3060Ti, but it’s definitely not as low as a 2070. I’ve played on similar cards to the 5700/XT and the new consoles definitely outperform them. I’d guess it varies quite a bit based on the game, but since all I have to go on is the specs I can tell you it’s in between the 6700XT and 6800.
Broadband- t1_j8702o3 wrote
Current consoles are equal to a 3800 from AMD and a 2080 from Nvidia.
Rain1dog t1_j884muc wrote
The CPU’s in consoles are closer to 3700x.
Stock_Regular8696 t1_j87naji wrote
5800x3d would be better for cpu.
KsnNwk t1_j881eax wrote
No way, sherlock. She said she wanted a budget PC. At the moment those are her best options.
Stock_Regular8696 t1_j885z7c wrote
It's an older gen so it can be bought used. It's super good.
Quigleythegreat t1_j84pxjw wrote
I built my parents a desktop ten years ago with a 4570 and 16gb of ram. For what they do it's still all they need, and it still runs Windows 10. Gamers and businesses are the only people who regularly upgrade anymore.
ci_newman t1_j844wzj wrote
My 9 year old 4790k still plays modern games at 4k and 60fps. Why do I need anything else?
Hell, that CPU is older than my kid...
HiCanIPetYourCat t1_j84xm13 wrote
That CPU is an absolutely massive bottleneck on any recent GPU my dood
R1ddl3 t1_j84cmmw wrote
If you're pairing that with a GPU that can play demanding games at 4k/60, you've definitely got a bottleneck going on.
Skarth t1_j84r0qw wrote
High resolutions are typically more GPU limited. If he's not aiming for 120fps, an old CPU does a lot better in games than people think it will.
Plattfoot t1_j84s19c wrote
That CPU still needs to feed the GPU. In the end it depends on the game and the needs of the user. Easy rule: if the CPU is around 100%, it is the issue; if not, the GPU is.
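That rule of thumb can be sketched as a tiny helper. The utilization numbers would come from a monitoring tool (MSI Afterburner, Task Manager, etc.), and the 95/90 thresholds here are illustrative guesses, not hard numbers:

```python
def likely_bottleneck(cpu_util: float, gpu_util: float) -> str:
    """Rough version of the rule above: whichever component sits pinned
    near 100% while the other has headroom is the limiting factor."""
    if cpu_util >= 95 and gpu_util < 90:
        return "cpu"
    if gpu_util >= 95 and cpu_util < 90:
        return "gpu"
    return "balanced"

print(likely_bottleneck(99, 60))  # an old CPU starving a fast GPU -> "cpu"
```

If both sit well under 100%, something else (frame cap, vsync, engine limit) is usually in play.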
HiCanIPetYourCat t1_j84yhqi wrote
Just as an example, I just got an RTX4090 and put it in my 2021 built 5600x based PC. The 5600x is several generations newer and a whole lot more powerful than a 4790. My Timespy benchmark was 19,000ish, Cinebench was 28,000ish.
I then upgraded to a new 13900 cpu, one gen ahead of the 5600x. The same GPU then scored 30,000 in Timespy, and 40k in Cinebench. Even that one gen old CPU was a gigantic bottleneck on the new GPU.
I don’t know what card he’s on but it must be a 3080 or better if he’s running 4k 60fps on that old CPU, which means he would see a huge gain from upgrading.
If it works it works and whatever, this is just how it be ☺️
KsnNwk t1_j851rwq wrote
I don't entirely disagree, but synthetic benchmarks are different than games.
Plus, the money not spent on the CPU, mobo, and RAM can be spent on the GPU and monitor.
If it still plays smoothly for him, that is what matters, not just numbers.
I went from a 4770K + GTX 770 to a 1060, then a 1080, then a 2080, and upgraded to 1440p. In single-player games it was very good. Stupidly enough, I was completely fine in AAA titles. But it was actually old competitive games like CSGO and simracing titles that had micro stutter every so often.
Upgraded to a 5800X3D, 32GB of 3600CL16, and a B550 wifi board. Spent 510€ on that and the problems were gone.
But that 510€, plus another 300€, could have gotten me a 4070 Ti for 4K gaming, and for SP I would still have been fine with my 4770K and 16GB of DDR3.
Vanman04 t1_j8blis4 wrote
The thing is, the benchmark means nothing.
If the PC plays the game acceptably, the bottleneck means nothing beyond that they likely overspent on the card.
Chasing numbers is pretty silly. After a certain point the framerate goes up but it makes virtually no difference in the player experience.
Chasing 120 fps is for people who have nothing better to do with their money. Most of the research suggests a cap of 60 fps for the human eye to register. Some research points to maybe as high as 90, but even then most cards on the market these days can achieve that easily.
It's kind of like buying a gallon of milk when you only drink a glass a week. Sure, you have more milk, but you don't really need it. You would be much better off buying a quart.
Yup, bottlenecking is a thing to be sure, but again, after a certain point it only means you spent too much on the card, or you have room to grow in the future, assuming the experience you are getting is acceptable.
HiCanIPetYourCat t1_j8bm564 wrote
It’s a way to discuss performance. It pretty much directly translates to fps.
If a person can’t tell the diff between 60 fps and 120 they are literally brain damaged.
R1ddl3 t1_j84vyff wrote
I mean I agree, but we're talking about an 8+ year old CPU.
futureygoodness t1_j84vh5e wrote
But who cares if it’s still hitting his performance expectations
YanVoro t1_j866rvz wrote
Just got a 4070 for my 4790k and I'm getting ~100 fps at 1440p, with dips here and there. Still better than a console, and I can also max out all the settings. Though I'm not an fps chaser myself; I'd rather have a prettier picture.
Llama-Lamp- t1_j85vpku wrote
Because won’t don’t constantly need new CPU’s.
Unless you’re in a line of work that benefits massively from every last drop of performance you can get, then pretty much any CPU released in the past few years will get the job done.
There’s no reason to keep upgrading when the performance gain is minimal, so people are saving their money and sticking with what they’ve got, especially considering the world is currently an economic shitshow.
tatanka01 t1_j88639g wrote
Just hit 4 years on the "killer" i9 box I built in 2019. Still stable, still fast. There just doesn't seem to be an upgrade need.
DriftMantis t1_j861jas wrote
If Microsoft weren't locking people out of the new OS, no one would even need to upgrade. You can still hit 60fps in most games with a 10-year-old four-core CPU.
If you're not looking to game, a lot of the things you needed a desktop CPU for are now done on smartphones or Apple hardware. That leaves gaming, and the industry is such a mess there are only a few triple-A games worth a shit anyway.
chriswaco t1_j84ehc0 wrote
Other than gaming, there's little reason to upgrade. PCs plateaued - most people upgrade when something breaks or they have so much crud on the old one they want to replace it. Apple's switch to the M1 was a big step in both speed and battery life, but it isn't worth upgrading to an M2 from an M1, plus the prices for RAM and SSD upgrades are crazy.
Radiobandit t1_j84is0l wrote
Well, basing anything off the price gouging of a PC company is a bit silly. I was browsing for an upgrade for a friend the other day, and upgrading the RAM from 8GB to 16GB and the SSD from 512GB to 1TB raised the price by around 500. Meanwhile, the combined price of just buying a 2TB M.2 SSD and 16GB of RAM from Newegg was about 150 pounds. Imagine more than tripling the price for what amounts to three screws' worth of work.
ReviewImpossible3568 t1_j855lcy wrote
Apple’s components are not the same. The M series chips use on-package ultrafast (I forget what type) memory that’s accessible by the CPU and GPU and cannot physically be swapped, it’s not like before when they soldered the DIMMs for no reason, packaging it this way actually gives a performance benefit. The SSD I will give you being overpriced, but Apple’s SSDs are still super fast and not that bad.
Vanman04 t1_j8bmxn5 wrote
Nah, it's LPDDR5, and you can get it all day for around $100 for 16GB.
Apple straight up robs people and has been for decades now.
ReviewImpossible3568 t1_j8bz501 wrote
No, it actually does not work that way. I can’t speak to you on whether it’s LPDDR5 or not but it’s on the package which means there’s no way you’re gonna be swapping it. It’s an SoC.
Vanman04 t1_j8cl5xq wrote
Well, yes and no.
Yes, it's an SoC, but it is not magic memory. It's LPDDR5 with an Apple proprietary connection.
The point remains: the memory is not nearly as expensive as what Apple is charging for it. It is also a practice they had long before the new CPUs.
Apple has one of the highest profit margins in the world, and it is because they overcharge you for everything.
I would not touch one of these machines with a ten-foot pole just because of the proprietary Fort Knox they have on replacement parts. I mean, it is beyond ridiculous at this point. They now have the trackpad serialized so you can't replace it without getting the part from them.
That's predatory in my opinion.
Nice machines, but Apple can go fuck right off.
ReviewImpossible3568 t1_j8cn10n wrote
Yeah, I mostly agree with that. I’m not gonna pretend that I’m a hugely knowledgeable memory nerd, but I do know a bit and I’m pretty sure that having the memory on the package gives some form of benefit, and obviously because it’s on-package you’re not gonna be able to replace it. That said, their soldered DDR4 chips in their older (2016-2019 afaik) laptops were pretty stupid and anti-consumer.
And yes, Apple does charge extra for their machines. I’m happy to pay it because I live in the walled garden and because I’m reasonably sure that they’re not monetizing my data in return for that extra money that I pay them, but I can see why people would be upset about it. I own a custom build as well and while I love tinkering with it (I just did a full transplant into a smaller ITX case, which was a huge pain but super rewarding), my Apple devices just fit my life better. I would never want to do any form of work on my Windows machine, those little creature comforts that you get from having a full ecosystem (seriously, I own and actively use every single class of their products except a desktop Mac) are just without parallel. Nobody has created the ecosystem that they have, and that + privacy is really what you’re paying for when you spend $400 extra for RAM that costs them like, $50 to make.
JeffFromSchool t1_j85me1g wrote
Do people game on macs?
chriswaco t1_j85nia2 wrote
Some do. More on iOS. Serious gamers don't from what I can tell.
JeffFromSchool t1_j85pc5k wrote
I was going to say, I'm not sure their graphics processors compete with Nvidia or AMD right now
Dawnfreak t1_j874we6 wrote
I moved to PS5 during the graphics card shit show.
Bladedrax t1_j86cdx6 wrote
Well, the good news for CPU makers is Microsoft will force a lot of users to upgrade CPUs when Windows 10 reaches end of life in two years.
Sector__7 t1_j86g926 wrote
You don’t need to upgrade the CPU to install Windows 11 as there’s a bypass for the requirement. Microsoft even has a guide on how to bypass the requirement during the install.
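For reference, the bypass Microsoft documented for in-place upgrades is a single registry value (this still requires at least TPM 1.2, and you apply it at your own risk on unsupported hardware):

```shell
:: Run from an elevated Command Prompt before launching the Windows 11 installer.
:: Tells Setup to allow the upgrade on CPUs/TPMs Microsoft doesn't officially support.
reg add "HKLM\SYSTEM\Setup\MoSetup" /v AllowUpgradesWithUnsupportedTPMOrCPU /t REG_DWORD /d 1 /f
```

This is a one-time config tweak per machine; as noted below in the thread, major feature updates may need the bypass (or modified install media) applied again.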
ABDL-GIRLS-PM-ME t1_j8ivrri wrote
That is until you try to update Windows. I used various bypass methods on several unsupported machines and was unable to update to Windows 11 22H2 until I made ANOTHER set of modified installers.
Sector__7 t1_j8iy8c5 wrote
Yes, that makes sense. Why would you expect not to have to redo the bypass on every major update?
[deleted] t1_j88o99h wrote
The PC hardware market is largely based around the needs of PC gamers. 20 years ago you could release a new CPU or GPU and there would immediately be new games pushing the limits and taking advantage of those new chips. But today the PC gaming market is largely subservient to the much larger console gaming market. Console hardware development moves much slower, the pace of graphical development has slowed, and thus the need for faster and faster chips has slowed as well.
I built my computer in 2020 and it still plays every new game at the highest settings. 20 years ago you could build a top of the line computer and then six months later Crysis comes out and makes it feel like a potato. That just doesn't happen anymore.
unoriginalname17 t1_j88nx8e wrote
Didn’t I just read an article about one of the manufacturers purposely under shipping to make sure prices stayed high?
LuckyCharms201 t1_j8ecxso wrote
AMD
NotTooDistantFuture t1_j8d90k8 wrote
The current generation of CPUs is barely better than the last. They just cranked up the power usage for most of the gains.
Even if you don’t care about power cost, the noise can be reason enough to skip it.
SorakaWithAids t1_j84m461 wrote
I bought a full custom watercooling setup for my 11th-gen / 3000-series build. Might wait another 2 or 3. Depends on whether I get this huge pay raise; then I'll just buy.
[deleted] t1_j86ozwk wrote
[deleted]
smashkraft t1_j8792tk wrote
You're going to have to pry the Pentium G4600 @ 3.6GHz + 8GB DDR4-2133 + 1050 Ti from my cold dead hands. It might get 8GB added, but that's about it.
I'm in a proper midtown of a top-15 US city and the power keeps browning out every 2-3 weeks, so I know they're trying. I did not buy a crap PSU and I have a true surge suppressor (that breaker has tripped a few times; she might get the budget instead of the RAM).
routerg0d t1_j87pdeb wrote
Because everything that goes with the CPU has become 2-4x the cost of the CPU and unless you can dump serious cash there's very little benefit to people who already game on rigs that do just fine at 1080p. Prices need to come back down to earth, but good luck explaining that to wall street.
Biscuits4u2 t1_j89gphd wrote
Nice! Time for some great deals!
Isamu29 t1_j8ankmi wrote
The scalpers for all things tech ruined gaming for me.
rmcooper541 t1_j8b7v5s wrote
A slump in sales AND no inventory to buy? Mmmkay.
srebew t1_j8q8t6l wrote
I'll upgrade to a 5800X3D if it's $200, otherwise it'll be at least another 3 years.
dcheesi t1_j85b2b5 wrote
My laptop mobo that decided to randomly short out last week:
EDIT: ...and researching replacements suggests why things might have stagnated. Sure, I can get a faster CPU for less money these days, but it's darn hard to find a mobile GPU that's better than the 6-year-old one in my old laptop!
elementality883 t1_j86vmqq wrote
Couldn't this also be attributed to AMD artificially creating shortage by under-shipping chips?
LevelWriting t1_j862cub wrote
It's just so much more convenient to buy a laptop these days: you can carry it anywhere and play any game too. Never going back to desktop.
chriswaco t1_j84eli7 wrote
Other than gaming, there's little reason to upgrade. PCs plateaued - most people upgrade when something breaks or they have so much crud on the old one they want to replace it. Apple's switch to the M1 was a big step in both speed and battery life, but it isn't worth upgrading to an M2 from an M1, plus their prices for RAM and SSD upgrades are crazy.
HalobenderFWT t1_j852jka wrote
I mean, I didn’t mean it literally.
chriswaco t1_j84epr7 wrote
Other than gaming, there's little reason to upgrade. PCs plateaued - most people upgrade when something breaks or they have so much crud on the old one they want to replace it. Apple's switch to the M1 was a big step in both speed and battery life, but it isn't worth upgrading to an M2 from an M1, plus their prices for RAM and SSD upgrades are crazy.
HalobenderFWT t1_j852kqm wrote
Ok, calm down there….
chriswaco t1_j84f87q wrote
Other than gaming and maybe 3D or video editing, there's little reason to upgrade. PCs plateaued - most people upgrade when something breaks or they have so much crud on the old one they want to replace it. Apple's switch to the M1 was a big step in both speed and battery life, but it isn't worth upgrading to an M2 from an M1, plus their prices for RAM and SSD upgrades are crazy.
HalobenderFWT t1_j852ihi wrote
You can say that again!
Rain1dog t1_j8866o2 wrote
Lol
Harbinger2001 t1_j84lb2m wrote
My macs generally last about 8 years before becoming too obsolete. While I’m excited to get an M2, I don’t have a need just yet. I’m guessing I’ll be getting an M4 or M5.
PARANOIAH t1_j83yy0t wrote
How is this surprising? Inflation, economic downturn and the fact that almost everyone who wanted a new PC would have already gotten something fairly recent during the last few years of the pandemic/WFH era.