
BloomEPU t1_jdqmifr wrote

I think the kind of chatbots that are marketed as genuine companionship and relationships should be heavily scrutinised. I don't think they're inherently bad; if a desperate person uses one of these and just gets useful life advice, I'm not going to complain. But there's always the possibility that they're telling desperate, impressionable people things that are very unhelpful.

Honestly, all chatbots should be scrutinised for that; people are probably turning to them for the same kind of support.


bobartig t1_jdsbkpw wrote

I feel like there is potentially great therapeutic value in this sort of AI companion bot for people who have certain kinds of social anxiety or neuro-atypicality, or who are working through trauma. But it needs to be implemented with all the medical rigour of a therapy program administered by trained professionals, and the technology doesn't have that kind of maturity yet.

But what this app demonstrates is that there are a lot of people out there who are deeply lonely and crave interaction, and this is an axis along which they're willing to engage. Meet them where they are, then work towards healthier interactions, whatever that may mean.
