pengo t1_jdt6iv2 wrote

Reply to comment by cegras in [D] GPT4 and coding problems by enryu42

> Then what you have is something that can separate content into logically similar, but orthogonal realizations.

Like a word vector? The thing every language model is based on?

1

cegras t1_jdta9mj wrote

More like, the ability to know that 'reversing a linked list' and 'linked list cycle and traversal problems' are the same concept but different problems, and to separate those into train/test. Clearly they haven't figured that out, because ChatGPT's training data is contaminated, and their (opaquely disclosed) ways of addressing that issue don't seem adequate at all.
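A minimal sketch of what that kind of concept-level dedup could look like: embed problem descriptions, cluster near-duplicates by cosine similarity, and split by *cluster* rather than by problem so that related problems never straddle the train/test boundary. The vectors below are made up for illustration; in practice they would come from a sentence-embedding model.

```python
import numpy as np

# Hypothetical embeddings for four coding-problem titles (toy values,
# not from a real model).
problems = {
    "reverse a linked list":           np.array([0.9, 0.1, 0.0]),
    "detect a cycle in a linked list": np.array([0.8, 0.2, 0.1]),
    "two-sum over an array":           np.array([0.1, 0.9, 0.0]),
    "three-sum over an array":         np.array([0.0, 0.8, 0.2]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Greedy clustering: a problem joins the first cluster containing
# a sufficiently similar problem, else it starts a new cluster.
THRESHOLD = 0.9
clusters = []
for name, vec in problems.items():
    for cluster in clusters:
        if any(cosine(vec, problems[m]) >= THRESHOLD for m in cluster):
            cluster.append(name)
            break
    else:
        clusters.append([name])

# Split by cluster, not by individual problem, so near-duplicate
# problems never end up on both sides of the train/test split.
train = [p for c in clusters[::2] for p in c]
test  = [p for c in clusters[1::2] for p in c]
```

With these toy vectors, the two linked-list problems land in one cluster and the two sum problems in another, so the split keeps each concept entirely on one side. Real contamination checks are much harder, of course, since paraphrase detection at web scale is itself an open problem.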

3