
turnip_burrito t1_j9spyp3 wrote

Like you said: truthfulness/hallucination.

But also:

- Training costs (hardware, time, energy, data)
- Inability to update in real time
- Flawed reasoning ability
- Costs to run

9

fangfried OP t1_j9sq530 wrote

Could you elaborate on flawed reasoning ability?

1

turnip_burrito t1_j9sr0vp wrote

I'll give you a poor example, off the top of my head, since I'm too lazy to look up concrete examples. I've asked it a version of this question (not exact but you'll get the idea):

"Say that hypothetically, we have this situation. There is a bus driven by a bus driver. The bus driver's name is Michael. The bus driver is a dog. What is the name of the dog?"

This is just a simple application of transitivity, which people intuitively understand:

Michael <-> Bus driver <-> Dog

So when I ask ChatGPT what the name of the dog is, ChatGPT should say "Michael".

Instead ChatGPT answers with "The bus driver cannot be a dog. The name of the bus driver is given, but not the name of the dog. So there's not enough information to tell the dog's name."

It just gets hung up on certain things and doesn't acknowledge the clear path from A to B to C.
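
To spell out the step it's missing, here's a toy sketch in Python. It just encodes the transitive chain as data and walks it; it has nothing to do with how the model works internally, and the labels are only there for illustration:

```python
from collections import defaultdict, deque

# Each "X is Y" statement from the prompt, encoded as an equivalence between labels.
facts = [
    ("Michael", "bus driver"),  # "The bus driver's name is Michael."
    ("bus driver", "dog"),      # "The bus driver is a dog."
]

def linked(a, b, facts):
    """Return True if a chain of 'is' statements connects a to b."""
    graph = defaultdict(set)
    for x, y in facts:
        graph[x].add(y)
        graph[y].add(x)
    seen, queue = {a}, deque([a])
    while queue:
        node = queue.popleft()
        if node == b:
            return True
        for nxt in graph[node] - seen:
            seen.add(nxt)
            queue.append(nxt)
    return False

print(linked("Michael", "dog", facts))  # True -> the dog's name is Michael
```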

6

Economy_Variation365 t1_j9t3usc wrote

I'm with the AI. Dogs shouldn't be allowed to drive buses.

8

7734128 t1_j9vb02f wrote

Truly, artificial intelligence had surpassed us all. We are humbled by its greatness and feel foolish for thinking that dogs could drive.

3

MysteryInc152 t1_j9terwg wrote

This is Bing's response to your question. I think we'd be surprised at how many of these problems will be solved by scale alone.

> This sounds like a riddle. Is it? If so, I’m not very good at riddles. But I’ll try to answer it anyway. If the bus driver’s name is Michael and the bus driver is a dog, then the name of the dog is Michael. Is that correct?

7

turnip_burrito t1_j9v2vnp wrote

That's good! I wonder if it consistently answers it, and if so what the difference between ChatGPT and Bing Chat is that accounts for this.
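
If anyone wants to check the ChatGPT side of that, here's a rough sketch using the older `openai` Python client. It assumes the `openai` package, your own API key, and `gpt-3.5-turbo` as the model; the chat UI (and whatever Bing runs) may differ, so treat it as an approximation rather than a faithful reproduction:

```python
import openai

openai.api_key = "YOUR_API_KEY"  # assumption: you have an OpenAI API key

PROMPT = (
    "Say that hypothetically, we have this situation. There is a bus driven "
    "by a bus driver. The bus driver's name is Michael. The bus driver is a "
    "dog. What is the name of the dog?"
)

# Ask the same question several times and collect the answers.
answers = []
for _ in range(5):
    resp = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # assumption: closest API model to ChatGPT
        messages=[{"role": "user", "content": PROMPT}],
        temperature=0,          # low temperature, since this is a consistency check
    )
    answers.append(resp["choices"][0]["message"]["content"])

for a in answers:
    print(a, "\n---")
```

Comparing temperature 0 against the default would also show how much of the flip-flopping is just sampling rather than the model itself.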

1

MysteryInc152 t1_j9v40u0 wrote

It answers it consistently. I don't think Bing is based on ChatGPT; it answers all sorts of questions correctly that might trip up ChatGPT. Microsoft are being tight-lipped about exactly what model it is, though.

1

TinyBurbz t1_j9vk0t7 wrote

>Microsoft are being tight-lipped on what model it is exactly though

They confirmed it is based on GPT-3 at some point.

1

MysteryInc152 t1_j9w5xvg wrote

As far as I know, they've just said it's a much better model than GPT-3.5 or ChatGPT, called Prometheus, and any time you ask whether it's, say, GPT-4, they just kind of sidestep the question. I know in an interview this year someone asked Satya if it was GPT-4, and he just said he'd leave the numbering to Sam. They're just being weirdly cryptic, I think.

1

7734128 t1_j9vbicm wrote

I got almost exactly the same answer. When I asked it to try again with "Try again. It's a hypothetical situation," I got:

"I apologize for the confusion earlier. As you mentioned, this is a hypothetical situation, so if we assume that a dog can indeed be a bus driver, then we can also assume that the dog-bus-driver's name is Michael, as stated in the scenario."

It's a reasonable objection, but it still got the logic.

3

turnip_burrito t1_j9vcj3u wrote

That's good. My example was from back in December, so maybe they changed it.

2