Submitted by vernes1978 t3_zv2cpt in singularity
GuyWithLag t1_j1q43hg wrote
Reply to comment by imlaggingsobad in Money Will Kill ChatGPT’s Magic by vernes1978
There are good indications that one can trade off training time and corpus size against model size, making the post-training, per-inference cost smaller.
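A minimal sketch of that trade-off, using the parametric scaling-law loss fit reported in the Chinchilla paper (Hoffmann et al., 2022) purely as an illustration — the constants and budget below are not OpenAI's numbers, and the real picture is more complicated:

```python
# Chinchilla-style parametric loss: L(N, D) = E + A/N**a + B/D**b,
# where N = parameter count and D = training tokens.
# Constants are the published Hoffmann et al. (2022) fit; illustrative only.
E, A, B, a, b = 1.69, 406.4, 410.7, 0.34, 0.28

def loss(N, D):
    """Predicted pretraining loss for N parameters trained on D tokens."""
    return E + A / N**a + B / D**b

# Hold training compute fixed (rule of thumb: C ~ 6 * N * D FLOPs)
# and compare a big model trained briefly vs. a smaller one trained longer.
C = 6e21  # hypothetical compute budget in FLOPs

for N in (70e9, 30e9, 10e9):      # model sizes, in parameters
    D = C / (6 * N)               # tokens affordable under the budget
    print(f"N={N/1e9:>4.0f}B params, D={D/1e9:>5.1f}B tokens, "
          f"predicted loss={loss(N, D):.3f}")
```

Under this fit, the smaller models trained on more tokens reach comparable or better loss at the same training compute, while being much cheaper to run per query afterwards — which is the point about post-training cost.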
Note that ChatGPT is already useful to a great many people, but training a new version takes time. My guess is that OpenAI is still in the iterative development phase, and each iteration needs to be short because it's still very early in the AI game.