
imlaggingsobad t1_j1oqq9c wrote

Also, these are still very early days. Computers started off expensive too, as did phones, gaming consoles, and TVs. Now we have a huge market with many affordable options.

2

GuyWithLag t1_j1q43hg wrote

There are good indications that one can trade off training time and corpus size against model size, making the post-training per-execution cost smaller.
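To make that trade-off concrete, here's a minimal sketch assuming the common approximation from the scaling-law literature that training compute C ≈ 6 × N × D (params × tokens); the compute budget and model sizes below are made-up, illustrative figures, and per-query inference cost is taken to scale roughly with N:

```python
def tokens_for_budget(compute_flops: float, n_params: float) -> float:
    """Tokens you can afford to train on, for a fixed compute budget and model size,
    under the rough approximation C ~ 6 * N * D."""
    return compute_flops / (6 * n_params)

budget = 1e24  # hypothetical fixed training compute budget, in FLOPs

for n_params in (175e9, 70e9, 20e9):
    d_tokens = tokens_for_budget(budget, n_params)
    # A smaller model trained on more tokens can spend the same total compute,
    # while costing proportionally less per execution at inference time.
    print(f"{n_params/1e9:>6.0f}B params -> {d_tokens/1e9:,.0f}B tokens, "
          f"relative inference cost ~{n_params/175e9:.2f}x")
```

The point is just that, for a fixed training budget, shrinking the model and feeding it more data is one lever for cutting the per-execution cost after training.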

Note that ChatGPT is already useful to a great many people, but training a new version takes time. My guess is that OpenAI is still in the iterative development phase, and each iteration needs to be short because it's still very early in the AI game.

1