Submitted by [deleted] t3_yw3ear in singularity
Kaarssteun t1_iwhtr3s wrote
Reply to comment by randomrealname in models superior to GPT-3? by [deleted]
from here: "Galactica models are trained on a large corpus comprising more than 360 million in-context citations and over 50 million unique references normalized across a diverse set of sources. This enables Galactica to suggest citations and help discover related papers."
Always remember, however, that the outputs of a language model are very prone to hallucination. I would not trust them uncritically.