
Pawneewafflesarelife t1_j17qazj wrote

This has the potential to really help with therapy and the mental health crisis, if good systems are developed. I would definitely use an AI therapist if it worked - a big issue with therapy is finding someone you feel comfortable talking to who uses a method that clicks with you. Imagine if you could just try out a new digital personality or therapy style with a button click.

5

matt_flux t1_j180764 wrote

How will it help with therapy? A healthy mind needs challenges, not constant dopamine. This will lead to addiction and dependence.

1

Pawneewafflesarelife t1_j1bwrll wrote

Therapists are overloaded in real life, and many people can't afford them. An AI that can serve that role would help a lot of people work through their issues. I don't understand the addiction and dopamine comment - we're literally talking about therapy with an AI instead of a person.

1

FapSimulator2016 t1_j183k1a wrote

I’m not sure how helpful that would be - there’s something about being understood by an AI rather than a person that makes therapy feel redundant. But maybe if the difference at that point is non-existent, it won’t really matter…

1

Pawneewafflesarelife t1_j1bwjft wrote

For me, knowing it's an AI wouldn't be a problem. I have a lot of past trauma that I don't really think about but need to work through, and I'm pretty good at analysing events and patterns in my life once I sit down and think about them. So an AI therapist would be kind of like guided journaling with different calls to action depending on the therapy style.

The lack of a human element might make people more honest and earnest about treatment, too. When I was younger, I wasn't really honest in therapy - I didn't want the therapist to think badly of me, and I was afraid I'd be locked up if I talked about dark stuff. On the other hand, AI therapy might increase paranoid avoidance of therapy, since transcription of the session would be instant. There would have to be strong privacy protocols.

2