Submitted by bikeskata t3_10r7k0h in MachineLearning
2blazen t1_j70vh2g wrote
Reply to comment by CowardlyVelociraptor in [N] OpenAI starts selling subscriptions to its ChatGPT bot by bikeskata
Might be just me, but I really hate how the reply is rendered in the UI. Even if the subscription solves the random interruptions during generation, the word-by-word printing kills me; I'd rather wait a bit and receive the answer in one piece.
danielbln t1_j7c9mpc wrote
I much prefer to see the tokens as they're generated; it's much better UX because you can abort the generation if you feel it isn't going in the right direction. All my GPT-3 integrations use `stream: true` and display every word as it comes in.
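The streaming pattern danielbln describes can be sketched as below. A stand-in generator plays the role of the API response (the real OpenAI call with `stream=True` yields chunks incrementally, but its exact shape varies by client library version, so this uses a plain token iterator); the key idea is printing each token as it arrives and optionally aborting mid-generation:

```python
def fake_token_stream():
    # Stand-in for an API response opened with stream=True:
    # each item yielded is one token of the completion.
    for tok in ["Hello", ",", " world", "!"]:
        yield tok

def render_stream(stream, should_abort=lambda text: False):
    """Print tokens as they arrive; stop early if should_abort(text) is True."""
    text = ""
    for tok in stream:
        text += tok
        print(tok, end="", flush=True)  # incremental display, no buffering
        if should_abort(text):
            break  # user (or heuristic) aborted the generation mid-stream
    print()
    return text

# Full generation:
full = render_stream(fake_token_stream())

# Aborted generation: stop as soon as the text contains "world".
partial = render_stream(fake_token_stream(), should_abort=lambda t: "world" in t)
```

The abort hook is the UX win danielbln is pointing at: with word-by-word display you can cut off a generation that's going wrong, which a buffered all-at-once reply can't offer.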