hxckrt t1_jc90ity wrote

Empathy and nonverbal building of rapport for one, but also the judgment to intervene and take proportional action when there is an immediate threat to someone's life.

Do you want a therapist to call someone when their patient is seriously considering harming someone? Don't be too quick to wish for a machine to do that.

−3

Dziadzios t1_jc9msh2 wrote

> judgment to intervene and take proportional action when there is an immediate threat to someone's life

Recently I read a post by someone who was suicidal but refused to get help specifically to avoid this. It might actually be a feature.

2

hxckrt t1_jcaiwx4 wrote

Then you're still missing the safeguards against them harming others, and those cases can be very, very hard to separate.

−1

Pippin987 t1_jcc91hm wrote

Those are the more extreme cases, which should preferably be handled by real therapists, yeah, but much of the world's population has no access to therapy, and an AI semi-therapist that could help people with mundane therapy would seem helpful and could keep some people from needing actual therapy later.

Also, a lot of people who do need therapy never take the step of going into therapy, because it's a daunting thing to many or they think they don't need it. But being able to talk to an app on their phone about their issues could help, and if it's anything serious the AI could refer them to a real therapist.

1