Submitted by imgonnarelph t3_11wqmga in MachineLearning
currentscurrents t1_jd0f76v wrote
Reply to comment by Educational-Net303 in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
I mean, of course not; nobody would make such a game right now because there are no consumer cards with more than 24 GB of VRAM to run it on.
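(For context, a rough sketch of why 24 GB is the relevant threshold here; the numbers below are my own back-of-envelope arithmetic for a 30B-parameter model's weight memory at common precisions, not figures from the thread.)

```python
# Back-of-envelope VRAM needed just to hold a 30B-parameter model's weights.
def weight_vram_gb(n_params: float, bytes_per_param: float) -> float:
    return n_params * bytes_per_param / 1024**3

n = 30e9  # ~30B parameters (LLaMA-30B)
for label, bytes_pp in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{label}: ~{weight_vram_gb(n, bytes_pp):.0f} GB")

# Prints roughly: fp16 ~56 GB, int8 ~28 GB, int4 ~14 GB (weights only,
# before activations / KV cache), which is why only the 4-bit quantized
# version fits on a single 24 GB consumer card.
```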
frownyface t1_jd6q1qi wrote
There was an insane age of PC gaming when hardware was moving so fast that developers would ship games whose max settings didn't run on any current hardware, trying to future-proof them so the game wouldn't feel obsolete shortly after launch.