Submitted by bikeskata t3_10r7k0h in MachineLearning
lunarNex t1_j6vgjgy wrote
So not "open" AI anymore? That greed sets in fast.
[deleted] t1_j6vhw51 wrote
You must be new here, coming from a gaming subreddit or somewhere people talk like this, rather than from an actual research field.
ChatGPT is the only free, OpenAI-hosted product they have exposed people to. Charging for access is actually the norm for OpenAI, so you would be dying on a stale hill.
Beyond that, their inference code is open. You can run a local version of GPT (e.g. GPT-2, whose code and weights OpenAI released) with your own code and locally stored weights right now, if you know what you are doing (minor caveat).
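For instance, here is a minimal sketch of local GPT-2 inference using OpenAI's openly released weights. It assumes the Hugging Face transformers and torch packages as a convenient wrapper; the original openai/gpt-2 repo ships its own inference code too.

    # Minimal sketch: local GPT-2 inference with OpenAI's released weights.
    # Assumes `pip install transformers torch`; weights are downloaded once
    # and cached locally, then everything runs on your own machine.
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    inputs = tokenizer("Open models let you", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20, do_sample=True)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))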
Same goes for their Whisper code and weights. It doesn't get more open than that. The compute required to train a multi-billion-parameter model isn't something you could afford anyway.
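A similarly minimal sketch for Whisper, assuming the openai-whisper pip package and ffmpeg are installed; "audio.mp3" is a placeholder filename.

    # Minimal sketch: local transcription with OpenAI's open-source Whisper.
    # `pip install openai-whisper`; the model weights download on first run
    # and inference happens entirely on your own hardware.
    import whisper

    model = whisper.load_model("base")
    result = model.transcribe("audio.mp3")
    print(result["text"])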
Lastly “open” doesn’t just mean free of cost. It means intellectually transparent about the code (this is always what it means). There’s no reason to confuse the two. It costs 100k per day to run these models so I’m not sure what leads you to think that risk should be part of an intellectually open philosophy when you can just deploy GPT yourself if you’re so inclined.
Welcome to the sub.