Submitted by Emergency_Apricot_77 t3_zmd6l8 in MachineLearning
prototypist t1_j0c5p2j wrote
There have been several attempts this year at building more human-like decoders for language models and testing which outputs humans prefer. Hugging Face Transformers supports typical decoding and contrastive search out of the box, and there are papers and code out for RankGen, Time Control, and Contrastive Decoding (which is totally different from contrastive search).
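For reference, both of the Transformers-supported strategies are exposed through the `generate()` API. Here's a minimal sketch; the model choice ("gpt2") and the specific hyperparameter values are illustrative assumptions, not recommendations:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("The future of language model decoding", return_tensors="pt")

# Typical decoding (Meister et al.): sample only from the "locally typical"
# subset of the vocabulary, controlled by typical_p.
typical_out = model.generate(
    **inputs,
    do_sample=True,
    typical_p=0.95,   # assumed value for illustration
    max_new_tokens=50,
)

# Contrastive search (Su et al.): deterministic decoding that trades off
# model confidence against degeneration, via penalty_alpha and top_k.
contrastive_out = model.generate(
    **inputs,
    penalty_alpha=0.6,  # assumed value for illustration
    top_k=4,
    max_new_tokens=50,
)

print(tokenizer.decode(typical_out[0], skip_special_tokens=True))
print(tokenizer.decode(contrastive_out[0], skip_special_tokens=True))
```

RankGen, Time Control, and Contrastive Decoding live in their own repos rather than in the `generate()` API, so you'd need their released code to try those.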
Emergency_Apricot_77 OP t1_j0fe4lo wrote
Thanks for this! The typical decoding paper contains really useful information, close to what I was looking for.