
DaffyDuck t1_jdtz90r wrote

Couldn't you largely prevent hallucinations by instructing the model to state something, like a fact, only if it is 100% confident? Anyway, interesting topic! I'm also wondering whether it could dump all of its knowledge in a structured way, essentially rebuilding human knowledge.
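As a concrete illustration of that idea, here is a minimal sketch using the OpenAI Python SDK: a system prompt asks the model to answer only when confident and to say "I don't know" otherwise. The prompt wording and model name are assumptions, and since models can't reliably assess their own confidence, this mitigates rather than prevents hallucinations.

```python
# Minimal sketch of confidence-gated prompting (an assumption, not a
# guaranteed fix). Requires the OpenAI Python SDK and an API key in
# the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "Answer only if you are highly confident the fact is correct. "
    "If you are not certain, reply exactly: I don't know."
)

def ask_fact(question: str) -> str:
    """Ask a factual question under the confidence-gated system prompt."""
    response = client.chat.completions.create(
        model="gpt-4",  # illustrative model name
        temperature=0,  # low temperature reduces output variance
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask_fact("What year was the Eiffel Tower completed?"))
```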
