Submitted by demauroy t3_11pimea in Futurology
GerryofSanDiego t1_jbzerah wrote
ChatGPT doesn't have a moral code as far as I'm aware. Could be very dangerous for teens especially. Even a fully formed AI isn't going to be able to relate emotionally to a human experience. It's really the one thing it shouldn't be used for.
MamaMiaPizzaFina t1_jc18o0g wrote
Better than therapists I've seen that have "their" moral code.
GerryofSanDiego t1_jc2faum wrote
Hahaha, good point. I just wouldn't want an AI to advise suicide or something like that. But I have no expertise in the topic at all.
MamaMiaPizzaFina t1_jc2zhxh wrote
I tried you.com for that. That madlad recommended dosages for suicide based on the medications I have.
You cannot deny that it did give relevant advice.
GerryofSanDiego t1_jc31hl8 wrote
Yeah, I guess that's my basic point: AI has good uses, but it's never going to fully understand the human experience. Like, you ask it for suicide dosages and it gives them to you, which is not something a mental health professional would ever do.