[D] Choosing cloud vs. local hardware for training LLMs. What's best for a small research group? Submitted by PK_thundr (t3_11rnppe) on March 15, 2023 at 6:01 AM in MachineLearning — 11 comments
CKtalon (t1_jc9hm91) wrote on March 15, 2023 at 6:41 AM: I don't think a $40K budget can get you a machine with 256GB of VRAM. It's barely enough to buy 8x RTX 6000 Ada, and that's before accounting for the high-end workstation/server-grade CPU and motherboard you'd need to support 8 cards. (5 points)
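The budget math behind the comment can be sketched quickly. Note the per-card street price below is an assumption for illustration (early-2023 pricing varied widely); the 48GB VRAM figure per RTX 6000 Ada is the card's actual spec.

```python
# Back-of-envelope check of the comment's claim: can a $40K budget
# reach 256 GB of VRAM using RTX 6000 Ada cards?

budget = 40_000          # USD, the stated budget
card_price = 7_500       # USD per RTX 6000 Ada -- an assumed street price
vram_per_card = 48       # GB, the card's actual VRAM spec

cards_affordable = budget // card_price        # whole cards within budget
total_vram = cards_affordable * vram_per_card  # GB of aggregate VRAM

print(f"Cards within budget: {cards_affordable}, total VRAM: {total_vram} GB")
```

Under that price assumption the budget covers about 5 cards (240GB), falling short of 256GB even before counting the server-grade CPU, motherboard, PSU, and chassis needed to host them.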