pyonsu2
pyonsu2 t1_jb3y5ps wrote
Reply to [D] Best way to run LLMs in the cloud? by QTQRQD
Maybe Colab Pro+?
pyonsu2 t1_jabzq81 wrote
Reply to [D] More stable alternative to wandb? by not_particulary
What are the potential alternatives?
MLflow, TensorBoard?
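If MLflow is an option, a minimal logging sketch might look like this (assuming `pip install mlflow` and the default local tracking directory; the run name, parameter, and metric are just placeholders):

```python
import mlflow

# Assumes a local MLflow install; logs to ./mlruns by default.
with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("lr", 3e-4)  # example hyperparameter
    for step in range(100):
        loss = 1.0 / (step + 1)  # placeholder metric value
        mlflow.log_metric("train_loss", loss, step=step)
```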
pyonsu2 t1_ja6xz70 wrote
Hottest ever. RLHF, robotics.
pyonsu2 t1_j9ds6j5 wrote
Reply to [D] Large Language Models feasible to run on 32GB RAM / 8 GB VRAM / 24GB VRAM by head_robotics
Depends on what you’re trying to do, but just use the OpenAI APIs. Your effort/time is also expensive.
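As a rough sketch of that route (assuming the pre-1.0 `openai` Python package, an `OPENAI_API_KEY` set in the environment, and `gpt-3.5-turbo` only as an example model):

```python
import os
import openai

# Assumes the pre-1.0 `openai` package and OPENAI_API_KEY in the environment.
openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # example model; swap for whatever fits your cost/quality needs
    messages=[{"role": "user", "content": "Summarize this paragraph: ..."}],
)
print(response["choices"][0]["message"]["content"])
```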
pyonsu2 t1_jcen9vb wrote
Reply to [D] What do people think about OpenAI not releasing its research but benefiting from others’ research? Should google meta enforce its patents against them? by [deleted]
Proving this is possible is already valuable.
Soon-ish, open-source communities will figure it out and build something even “better”.