wordholes t1_jed6wd9 wrote

Oh my god they're using approximate data from a probabilistic model to train another even more approximate probabilistic model.

What level of generational loss is this??

34

z57 t1_jedfhgf wrote

Wasn't Stanford's Alpaca trained using GPT?

Yes, I think it was: "Researchers train a language model from Meta with text generated by OpenAI's GPT-3.5 for less than $600"

9

Orqee t1_jedubqo wrote

It’s called meta probabilistic recursion. Because I just named it.

4