NotASuicidalRobot t1_j3jza6v wrote

I mean, it's trying to get close... I guess? Maybe it can't double back over the essay the way humans do, adding and deleting phrasing until the count is accurate.

5

johntwoods t1_j3k0ebm wrote

It is so definitive, though. I tell it, "Hey bro, that's only 387 words and it needs to be 500." And instead of saying, "Oh, ok, I will attempt it again, I am sorry," it states, with authority, "My apologies, here is a version that is 500 words." And then it is 402.

It is like it can't count. Which is fucking hilarious to me.
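The fix people usually land on is to not trust the model's own count at all and verify the length locally. A minimal sketch (the `draft` string is a stand-in for model output, not real ChatGPT text):

```python
# Verify word count locally instead of trusting the model's own claim.

def word_count(text: str) -> int:
    """Count whitespace-separated words, the way a human editor would."""
    return len(text.split())

def meets_target(text: str, target: int, tolerance: int = 25) -> bool:
    """True if the text is within `tolerance` words of the target length."""
    return abs(word_count(text) - target) <= tolerance

draft = "word " * 402            # stand-in for a 402-word draft from the model
print(word_count(draft))         # 402
print(meets_target(draft, 500))  # False: 98 words short of the 500 target
```

If the check fails, you re-prompt with the measured count rather than asking the model to count for itself.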

11

alwaysuseswrongyour t1_j3k3ace wrote

I tried to have it make a meal plan with nutrition facts, 200 grams of protein and 2000 calories, and it would do the same: "okay, here's a meal plan with 200 grams of protein and 2000 calories." It had all the nutrition facts, but the numbers did not add up to anything close to what they were supposed to. I can't remember the other time, but this was not the first time I used it, and it simply could not add, something I would expect an AI to do with ease.
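The sanity check the model's plan failed is just summing the per-meal facts against the stated targets. A quick sketch with illustrative numbers (not from the actual conversation):

```python
# Sum per-meal nutrition facts and compare to the stated daily targets.
meals = [
    {"name": "breakfast", "protein_g": 40, "calories": 450},
    {"name": "lunch",     "protein_g": 55, "calories": 600},
    {"name": "dinner",    "protein_g": 60, "calories": 700},
    {"name": "snack",     "protein_g": 20, "calories": 250},
]

total_protein = sum(m["protein_g"] for m in meals)   # 175 g
total_calories = sum(m["calories"] for m in meals)   # 2000 kcal

# Targets: 200 g protein, 2000 kcal.
print(total_protein == 200)    # False: 25 g short of the protein goal
print(total_calories == 2000)  # True
```

Checking the totals yourself takes four lines; trusting the model's stated totals is what bites.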

4

johntwoods t1_j3k3snt wrote

That's a good example.

Math/counting... Not AI's strong suit I guess.

2

Syntaxosaurus t1_j3k7xxu wrote

That's a sore spot for ChatGPT specifically rather than AI in general. See this thread for more.

1

johntwoods t1_j3k8gyv wrote

Totally. I understand.

I'm not even asking for simple algebra.

I am asking for GPT to simply count the number of words it is outputting.

2

NotASuicidalRobot t1_j3k1b1e wrote

Yeah, that's fair. I find it likes repeating the words in your question back more than anything else; it's like a student answering those "write your answers in full sentences" type questions. So it's probably just compulsively doing that.

1