Submitted by imgonnarelph t3_11wqmga in MachineLearning
royalemate357 t1_jd1stda wrote
Reply to comment by hosjiu in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
Not OP, but I imagine they're referring to the sampling hyperparameters that control the text generation process. For example, there is a temperature setting: a lower temperature makes the model sample more from the most likely choices, so its outputs would potentially be more precise/accurate but also less diverse and creative.
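To make the idea concrete, here is a minimal sketch of temperature sampling (a hypothetical helper for illustration, not code from the Alpaca/LLaMa release). Logits are divided by the temperature before the softmax, so small temperatures sharpen the distribution toward the top choice and large temperatures flatten it:

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0):
    """Sample a token index from raw logits after temperature scaling.

    Lower temperature -> sharper distribution (closer to greedy/argmax).
    Higher temperature -> flatter distribution (more diverse output).
    """
    scaled = [l / temperature for l in logits]
    # Softmax with max-subtraction for numerical stability.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index according to the resulting probabilities.
    return random.choices(range(len(logits)), weights=probs, k=1)[0]
```

With a very low temperature (e.g. 0.01), the most likely token is chosen almost every time; with a high temperature (e.g. 2.0), lower-probability tokens get sampled much more often.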