Submitted by imgonnarelph t3_11wqmga in MachineLearning
Educational-Net303 t1_jd051kh wrote
Reply to comment by I_will_delete_myself in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
Cyberpunk on max with psycho settings takes ~16GB at most. It's gonna be a few years before we actually see games demanding more than 24GB.
I_will_delete_myself t1_jd05atn wrote
Now try that on 2-4 monitors. You'd be surprised how much premium gamers like their hardware. It's like checking out sports cars, but for nerds like me.
Educational-Net303 t1_jd05hmc wrote
Are we still talking consumer-grade hardware or specialized GPUs made for a niche crowd?
42gether t1_jd0juau wrote
Niche supercar gamers start up the industry, which then leads to realistic VR, which then leads to high-quality consumer stuff?
Educational-Net303 t1_jd0k6p6 wrote
Which takes years
42gether t1_jd2rfb6 wrote
Okay, thank you for your input.
And?
Newsflash: everything we did started because some cunt felt like growing lungs and wanting oxygen from the air.
It all takes time. What are you trying to argue?
Educational-Net303 t1_jd2rsax wrote
My whole point is that it will take years before we get to 48GB VRAM consumer GPUs. You just proved my point again without even reading it.