
marcandreewolf OP t1_jd25w8b wrote

It is not lying (it can't even lie unless it were conscious 😅), but it sometimes grabs the wrong info, especially info that humans have repeated often online, or it just hallucinates nonsense. So: yes and no 😁

0

Baprr t1_jd28c8h wrote

It's just wrong instead of lying, then. I mean, if you can't trust it to write up the very easy-to-look-up history of automation, why would you believe its predictions? This info is pretty much useless.

3

alex20_202020 t1_jd2c4m3 wrote

I don't think it's useless; it might represent the average dates at which people publicly wrote or predicted that this or that might happen.

2

Baprr t1_jd2f3bj wrote

Not really. If you read what people predicted in the past about 2023, you might believe that we already have colonies in space, fully autonomous self-driving cars, and a cure for cancer. You have to filter the chatbot's output, or it's, well, not gibberish, but extremely suspect information. It doesn't check or provide sources.

This list might be used to look up current projects that are actually in development, and with some effort it could be turned into maybe 20 points of exciting things to look forward to.

But right now it's low-effort, useless content.

1