PM_ME_GAY_STUF t1_j86zf4g wrote

The ability to learn without updating parameters is literally a known and intended feature of most modern models though?

−6

ElbowWavingOversight t1_j870smg wrote

No. Not until these LLMs came around, anyway. What other examples do you have of this? Even few-shot or zero-shot learning, which lets a model generalize to classes it never saw during training, is still limited to the associations between classes that it learned at training time. It can't learn new associations from new data after the fact without rerunning the training loop and updating the parameters.

20
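(To make the distinction concrete, here's a toy sketch of my own, not from either commenter: a "model" whose parameters are completely fixed, yet which handles brand-new input–label associations supplied at inference time. This is what in-context learning looks like from the outside: the forward pass over the context does the "learning", and no gradient step ever runs. All names here are illustrative.)

```python
def in_context_predict(context, query):
    """A fixed predictor: the function body (the 'parameters') never
    changes; only the context it is given at inference time varies.

    context: list of (input, label) pairs supplied in the prompt.
    query:   the input to classify.
    """
    # A trivial induction rule: recall the label paired with the query.
    for x, y in context:
        if x == query:
            return y
    return None  # association not present in the context

# Associations this "model" has never seen before, given only at inference:
context = [("bleep", "A"), ("blorp", "B")]
print(in_context_predict(context, "blorp"))  # predicts "B" with no training
```

The point of the toy: whether this counts as "learning" is exactly what the thread is arguing about, since nothing persists after the call and no parameters move.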

graham_fyffe t1_j87i6tw wrote

Look up “learning to learn” (meta-learning), e.g. the paper “Learning to Learn by Gradient Descent by Gradient Descent” (2016), for a few examples.

7
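(For readers unfamiliar with that paper: its idea is that the optimizer's update rule is itself trainable. The paper uses an LSTM as the learned optimizer; the sketch below, which is my own drastically simplified stand-in and not the paper's method, meta-trains a single scalar step size across random quadratic tasks using finite differences. All function names are mine.)

```python
import random

def inner_loss_after_steps(lr, target, steps=5):
    """Run plain gradient descent on f(w) = (w - target)^2 starting
    from w = 0, and return the final loss. Here `lr` is the learned
    optimizer's only parameter."""
    w = 0.0
    for _ in range(steps):
        grad = 2.0 * (w - target)
        w -= lr * grad
    return (w - target) ** 2

def meta_train(meta_steps=200, meta_lr=0.05, eps=1e-3, seed=0):
    """Outer loop: adjust the inner optimizer's step size so that it
    performs well on freshly sampled tasks (the 'learning to learn')."""
    rng = random.Random(seed)
    lr = 0.01  # deliberately poor initial step size
    for _ in range(meta_steps):
        target = rng.uniform(-1.0, 1.0)  # a fresh inner task
        # Finite-difference estimate of d(final loss)/d(lr).
        g = (inner_loss_after_steps(lr + eps, target)
             - inner_loss_after_steps(lr - eps, target)) / (2 * eps)
        lr -= meta_lr * g
    return lr

learned_lr = meta_train()
print(learned_lr)  # noticeably larger than the initial 0.01
```

Note the contrast with the LLM case upthread: here the inner model's parameters do get updated; what is "learned" is the update rule itself.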