Submitted by fangfried t3_11alcys in singularity
With ChatGPT, you could say it doesn't always give up-to-date, correct information, but that will be mostly solved by integrating it with Bing.
It also has a lot of bias and is very limited in its freedom to tell us what it really thinks. Maybe as computing gets cheaper and open-source LLMs become as powerful as GPT, that issue can be solved.
What I'm more interested in, though, is whether transformers and LLMs have any flaws from a technical standpoint, and what those are.
turnip_burrito t1_j9spyp3 wrote
Like you said: truthfulness/hallucination
But also: training costs (hardware, time, energy, data); a rough compute estimate is sketched after this list
Inability to update in real time
Flawed reasoning ability
Costs to run
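To make the training-cost point concrete, here's a minimal back-of-the-envelope sketch using the common ~6·N·D FLOPs rule of thumb (N = parameter count, D = training tokens). The GPT-3-scale figures, GPU throughput, and utilization below are illustrative assumptions, not official numbers:

```python
# Back-of-the-envelope training compute estimate using the common
# ~6 * N * D FLOPs heuristic (N = parameter count, D = training tokens).
# All concrete numbers below are illustrative assumptions.

def training_flops(n_params: float, n_tokens: float) -> float:
    """Approximate total training FLOPs for a dense transformer."""
    return 6 * n_params * n_tokens

def gpu_days(total_flops: float, gpu_flops_per_s: float, utilization: float) -> float:
    """Wall-clock GPU-days needed at a given sustained utilization."""
    seconds = total_flops / (gpu_flops_per_s * utilization)
    return seconds / 86_400

if __name__ == "__main__":
    # Assumed GPT-3-scale run: 175B parameters, 300B training tokens.
    flops = training_flops(175e9, 300e9)          # ~3.15e23 FLOPs
    # Assumed A100-class GPU: ~3.12e14 FLOP/s peak, 40% sustained utilization.
    days = gpu_days(flops, 3.12e14, 0.40)
    print(f"Total training compute: {flops:.2e} FLOPs")
    print(f"Single-GPU equivalent:  {days:,.0f} GPU-days")
```

Under those assumptions you get roughly 3e23 FLOPs and on the order of tens of thousands of GPU-days, which is why a single training run is a hardware, energy, and time problem, not just a software one.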