go_comatose_for_me t1_j2c6afc wrote

The article made it seem that running the AI at home would be stupid due to the hardware needs, but not completely out of reach. The new software does seem very, very reasonable for a university or company doing research into AI to build and run.

118

EternalNY1 t1_j2cd4hm wrote

They still estimate $87,000 per year on the low end to run it on AWS for the 175-billion-parameter model.

I'm assuming that is just the cost to train it, though, so it would be a "one-time" cost each time you decided to retrain it.

Not exactly cheap, but something that can be budgeted for at larger companies.
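To put that $87k/yr in perspective, here's a rough back-of-envelope in Python. The hourly rate is an assumed placeholder for a multi-GPU cloud instance, not a price quoted in the article or by AWS:

```python
# Hypothetical back-of-envelope: what does $87,000/year buy in instance time?
# ASSUMED_HOURLY_RATE_USD is an illustrative placeholder, not a real AWS price.
ANNUAL_BUDGET_USD = 87_000
ASSUMED_HOURLY_RATE_USD = 33.0  # assumed on-demand rate for one 8-GPU instance

instance_hours = ANNUAL_BUDGET_USD / ASSUMED_HOURLY_RATE_USD
hours_per_day = instance_hours / 365

print(f"~{instance_hours:.0f} instance-hours/year, ~{hours_per_day:.1f} hours/day")
```

Under that assumed rate, the budget works out to only a few hours of single-instance runtime per day, which is why the figure reads as "train occasionally" money rather than "serve continuously" money.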

I asked it specifically how many GPUs it uses, and it replied with:

>For example, the largest version of GPT-3, called "GPT-3 175B," is trained on hundreds of GPUs and requires several dozen GPUs for inference.

57

aquamarine271 t1_j2ckpo1 wrote

That’s it? Companies pay at least $100k a year for shitty business intelligence server space that is hardly ever used.

75

wskyindjar t1_j2cly2m wrote

Seriously. Chump change for any company that could benefit from it in any way.

28

aquamarine271 t1_j2cmsny wrote

This guy should put together a deck on the source of this $87k/yr figure and make it public if he wants every mid-sized+ company to be sold on the idea.

10

Tiny_Arugula_5648 t1_j2d910s wrote

It costs much less and trains in a fraction of the time when you use a TPU instead of a GPU on Google Cloud. That’s how Google trained the BERT and T5 models.

7