Submitted by SupPandaHugger t3_zv6q6p in singularity
tiorancio t1_j1nwfy4 wrote
They have to get it to stop hallucinating first. And there's no easy fix.
mick_au t1_j1o023v wrote
This is a great article
enilea t1_j1poxp9 wrote
I've been testing the examples given in the article and they don't seem to happen anymore. It still gives false information sometimes, but not as often as in the article, so even if there isn't an easy fix, it seems to be getting better.
imnos t1_j1ro2e2 wrote
Why? There's no guarantee that the pages Google spits out give correct information either.
tiorancio t1_j1ruoqn wrote
There's a big difference. Google just links to the results, including some ads in the mix, and can claim the info was already there. ChatGPT is creating an answer, a derivative work based on those results. This has huge implications for copyright law and could be a huge liability for misrepresenting people, products, or companies.