Submitted by ortegaalfredo t3_11kr20f in MachineLearning
phamtuanminhmeo t1_jb9owvy wrote
Did you wrap the prompt in the text "The answer for the question "<prompt>" would be:" and use that as the input? I think that would limit the generated text a lot, since it gives the model a fixed context. Can we please try without it?
ortegaalfredo OP t1_jbaadnz wrote
Yes, you can send raw prompts using 'raw' like this:
'@ BasedGPT raw The recipe of a chocolate cake is'
This sends whatever you write as-is, without any wrapping or added text. But you have to phrase the prompt as a continuation, like with every LLM before ChatGPT.
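The difference between the two modes can be sketched roughly like this (a minimal illustration only — `build_prompt` is a hypothetical helper, and the wrapper template is the one quoted in the comment above, not necessarily the bot's actual code):

```python
def build_prompt(user_text: str, raw: bool = False) -> str:
    """Build the text sent to the model.

    Wrapped mode uses the question-answer template quoted in the thread,
    which fixes the context around the user's text. Raw mode passes the
    text through unchanged, so it must be phrased as a continuation
    (completion-style prompting).
    """
    if raw:
        # e.g. "The recipe of a chocolate cake is" -> model continues it
        return user_text
    return f'The answer for the question "{user_text}" would be:'
```

So with `raw=True` the model simply continues your sentence, while the wrapped form always frames it as a question to be answered.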
phamtuanminhmeo t1_jbcjjnb wrote
Thank you so much 🥺