yaosio t1_je56tet wrote
There's a limit; otherwise you could ask it to self-reflect on anything and always arrive at a correct answer eventually. Finding out why it can't produce the correct answer the first time would be incredibly useful. Finding out where the limits are, and why, is also incredibly useful.
Cantareus t1_je8pk04 wrote
There's no self-reflection happening when the request to self-reflect is in the prompt. The improvement happens because the expected output, after asking it to self-reflect, is a more thought-out response. You can get a kind of reflection by pasting the output back into the prompt along with your own feedback.
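That paste-it-back loop can be sketched in a few lines. This is a hypothetical illustration, not a real API: `generate` is a stand-in for whatever LLM call you'd actually use, and the prompt template is made up for the example.

```python
def generate(prompt: str) -> str:
    """Placeholder for a real LLM call; echoes a canned response here."""
    return f"response to: {prompt[:40]}"

def reflect(question: str, feedback: str, rounds: int = 2) -> str:
    """Feed the previous answer plus feedback back in as the next prompt."""
    answer = generate(question)
    for _ in range(rounds):
        # Any "reflection" here comes from the prompt itself, not from
        # the model introspecting -- the loop just rewrites the context.
        prompt = (
            f"Question: {question}\n"
            f"Previous answer: {answer}\n"
            f"Feedback: {feedback}\n"
            "Please revise your answer."
        )
        answer = generate(prompt)
    return answer
```

The point of the sketch is that each round only changes the input context; the model does the same forward pass either way.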