Kaarssteun OP t1_j29a9ad wrote

That's the thing - going viral like this is costing them millions. There's no product for them to sell when people now expect the service to be free.

−22

gantork t1_j29czbf wrote

Nah, they could easily shut it down or start charging. DALL-E 2 was also free at the start. I'm pretty sure they have this under control.

26

dietcheese t1_j2b8p7p wrote

They will start charging, and instead of spending millions they will make millions every month.

These people are not stupid - they have major backers funding and advising them.

10

hauntedhivezzz t1_j29cfjw wrote

They’ve already sold it; it will be integrated into Bing next year. And while this cost may be a lot for a small startup, it’s a drop in the bucket for the company paying for it: Microsoft.

23

Kaarssteun OP t1_j29cl0a wrote

Isn't that just a rumor so far? I love that Microsoft is working with OpenAI so closely, but has that been confirmed?

2

Kaarssteun OP t1_j29curp wrote

As I said, I know and love that they work together closely, but that doesn't confirm ChatGPT will be integrated into Bing.

8

hauntedhivezzz t1_j29d4o2 wrote

Sure, not confirmed, just implied. But the premise that this is a disaster because its outsized success is costing them is just shortsighted.

12

Equivalent-Ice-7274 t1_j29n998 wrote

The ChatGPT app costs money, and they can easily place ads within it and around it.

0

slashd t1_j2dhgav wrote

Finally a good reason to use Bing instead of Google 😂

1

blueSGL t1_j29d1ee wrote

> There is no product for them to sell given people now expect this service to be free.

I don't get the argument.

If they want to yoink it and put it behind a paywall where you pay for tokens, they could do that today.

If people still want to use it, they pay; otherwise they stop using it.

This has happened before (look at DALL-E 2).

13

treedmt t1_j29ty59 wrote

That would be awesome for the free competitors though.

2

blueSGL t1_j29uv5o wrote

> for the free competitors

Who are the free competitors?

3

treedmt t1_j29v2zd wrote

LUCI, for one. Not exactly a chat format, but generative single-turn question answering: http://askluci.tech/QA

1

blueSGL t1_j29xsiu wrote

but that's not what ChatGPT is offering.

Anywhere that is able to do the sorts of things ChatGPT does will be in a 'loss leader' phase to begin with to attract customers, or offer '[x] tokens per month free', or some other marketing trick.

Until inference cost is lower than the cash generated via advertising, all such services will be losing money; at that point it's either start charging or stop the service.

ChatGPT has succeeded in getting the name out. They are losing money by operating (if the training data they are getting from people is worth less than inference costs), so the solution is to start charging.

Continually running a product in the red just to prevent competitors' products, which are also in the red, from succeeding seems like poor long-term decision making.
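The break-even reasoning above can be sketched as a toy calculation. All the figures below (request volume, per-request inference cost, per-request revenue) are made up purely for illustration; OpenAI's actual costs and prices aren't public in this thread:

```python
# Toy sketch of the "charge or stop" argument: a service is in the red
# whenever revenue per request is below inference cost per request.
# Every number here is hypothetical.

def monthly_margin(requests: int, cost_per_request: float,
                   revenue_per_request: float) -> float:
    """Net cash per month: revenue minus inference cost."""
    return requests * (revenue_per_request - cost_per_request)

# Hypothetical: 100M requests/month at $0.01 inference cost each.
# Ad revenue of $0.005/request leaves the service losing money:
assert monthly_margin(100_000_000, 0.01, 0.005) < 0

# Charging users (say $0.02/request) flips the margin positive:
assert monthly_margin(100_000_000, 0.01, 0.02) > 0
```

The point of the sketch is just that the sign of `revenue_per_request - cost_per_request` decides everything; scale only multiplies the gain or the loss.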

5

treedmt t1_j29yope wrote

LUCI is also built on a fine-tuned GPT-3.5 model, so it's pretty close to ChatGPT in terms of capabilities.

They have a very different monetisation model afaik: they are tokenising the promise of future revenue instead of charging customers up front.

> if the training data is worth less than the inference cost.

The thesis is that training data could be worth much more than inference cost if it is high quality, unique, and targeted to one format (e.g. problem:solution or question:answer).

In fact, I believe they’re rolling out “ask-to-earn” very shortly, which will reward users in LUCI credits for asking high-quality questions and rating the answers. The focus appears to be solely on accumulating a massive high-quality QA database, which will have far more value in the future.

I’m not aware of any rate limits yet, but naturally they may be applied to prevent spam etc.; however, keeping the base model free is core to their data-collection strategy.

2

theRIAA t1_j2ejmws wrote

> so pretty close to chatgpt in terms of capabilities

I was impressed that it could give me generic working one-liners, but that is quite far from writing a working 100+ line program in any major language, like ChatGPT can (effortlessly) do. But thank you for the link; it's still very useful.

1

Think_Olive_1000 t1_j29bca8 wrote

They've cut the number of requests you can make per hour, which addresses the cost issue somewhat. I think they can plug the hole made by the unexpected influx with their marketing budget. It's got to be one of the most successful tech product launches of all time, with the number of unique new users reaching the million mark within a week of going live.

12

TheTomatoBoy9 t1_j29s315 wrote

The expectations for it to be free are with the current version. Subsequent versions will easily be marketed as premium and sold through subscriptions.

Then, this doesn't even address the whole business market where expensive licenses can be sold.

Finally, they are bankrolled by Microsoft, among others. Eye-watering costs are only eye-watering to small startups. It's not much of a problem when the company backing you is sitting on $110 billion in cash.

In the tech world, you can lose money for years if you can sell a good growth story. Especially with backers like Microsoft.

3

visarga t1_j2bi28f wrote

I expect that in the next 12 months we'll have an open model that can rival ChatGPT and runs on more accessible hardware, like 2-4 GPUs. There's a lot of room to optimise inference cost. Flan-T5 is a step in that direction.

I think the community trend is to make small, efficient models that rival the original but run on local hardware, in privacy. For now, the efficient versions are only about 50% as good as GPT-3 and ChatGPT.

2