Submitted by ravik_reddit_007 t3_zzitu1 in technology
rslarson147 t1_j2d06gs wrote
Reply to comment by DaffyDogModa in There's now an open source alternative to ChatGPT, but good luck running it by ravik_reddit_007
It’s just one GPU compute cluster, how much power could it consume? 60W?
Coindiggs t1_j2ekopz wrote
One A100 needs about 200-250w each. This needs 584x250w = 146,000w so approximately 146kwH. Average price of power is like 0.3$/kwh right now so running this will cost ya 43.8$ per hour, 1051.2$ per day or 32,500ish USD per month.
DaffyDogModa t1_j2d09dx wrote
One GPU, if it's a bad boy, can be hundreds of watts I think. Maybe a miner can chime in to confirm.
rslarson147 t1_j2d0cqe wrote
I actually work as a hardware engineer supporting GPU compute clusters and have access to quite a few servers, but I'm sure someone in upper management wouldn't approve of this use.
XTJ7 t1_j2fhhjd wrote
This went right over the head of most people. Brilliant comment though.