Submitted by tmblweeds t3_zn0juq in MachineLearning
rafgro t1_j0fpg80 wrote
Do you embed some special clauses or verification to limit hallucination? In my experiences with splicing primary sources into input, sometimes it can even induce more hallucinations (which can be more believable with sources but still false!). To test it out here, I consciously asked a few questions with no obvious answers - such as "What genes cause brain cancer?" - and got nice response in the form of "there's no answer yet".