phamtuanminhmeo t1_jb9owvy wrote

Did you wrap the prompt inside the text "The answer for the question "&lt;prompt&gt;" would be:" and use that as the input? I think it would limit the generated text a lot, because it gives the model a fixed context. Can we please try without it?

ortegaalfredo OP t1_jbaadnz wrote

Yes, you can send raw prompts using 'raw' like this:

'@ BasedGPT raw The recipe of a chocolate cake is'

This will send whatever you write raw, without any wrapping or added text. But then you have to phrase the prompt as a continuation, the way every base LLM before ChatGPT was used.
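The difference between the two modes can be sketched roughly like this (the function name and wrapping template here are illustrative guesses, not the bot's actual code):

```python
def build_prompt(user_text: str, raw: bool) -> str:
    """Build the text sent to the model, hypothetically."""
    if raw:
        # Raw mode: send the text verbatim; the model just continues it.
        return user_text
    # Default mode: wrap the text in a question/answer template,
    # which fixes the context the model generates against.
    return f'The answer for the question "{user_text}" would be:'

# Continuation-style prompt, as in the '@ BasedGPT raw ...' example:
print(build_prompt("The recipe of a chocolate cake is", raw=True))
```

With `raw=False` the model is steered toward answering a question; with `raw=True` it simply completes whatever text you started.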
