
WhuddaWhat t1_j6daban wrote

Well, why don't you talk to your ChatGPT therapist about how to manage these feelings in a healthy way?

33

weirdgroovynerd t1_j6dmbdw wrote

Right?

The ChatGPT therapist was very helpful after Scarlett Johansson's voice broke up with me.

24

Theemuts t1_j6h1alh wrote

Yeah, don't do that.

> ChatGPT (Chat Generative Pre-trained Transformer)[1] is a chatbot launched by OpenAI in November 2022. It is built on top of OpenAI's GPT-3 family of large language models, and is fine-tuned (an approach to transfer learning)[2] with both supervised and reinforcement learning techniques.

> Nabla, a French start-up specializing in healthcare technology, tested GPT-3 as a medical chatbot, though OpenAI itself warned against such use. As expected, GPT-3 showed several limitations. For example, while testing GPT-3 responses about mental health issues, the AI advised a simulated patient to commit suicide.[51]
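As the quoted article says, a system like this is just a pre-trained language model nudged toward helpful-sounding answers by supervised and reinforcement-learning fine-tuning. As a rough illustration of the supervised part only, here is a minimal sketch using the small open GPT-2 model from Hugging Face's transformers library as a stand-in for GPT-3 (whose weights aren't public); the demonstration text and learning rate are made up for the example:

```python
# Minimal sketch of supervised fine-tuning for a chat-style model:
# show the model a human-written demonstration and nudge its weights
# toward reproducing it, token by token.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)

# One hypothetical demonstration pair, formatted as a single text.
demo = ("User: I've been feeling very low lately.\n"
        "Assistant: I'm sorry you're going through that. "
        "It may help to talk to someone you trust.")
batch = tokenizer(demo, return_tensors="pt")

# Standard causal-LM objective: predict each token from the ones before it.
loss = model(**batch, labels=batch["input_ids"]).loss
loss.backward()
optimizer.step()
```

The reinforcement step then scores whole responses with a learned reward model, but neither stage gives the system any actual understanding of what it's saying, which is presumably why the Nabla test went so wrong.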

1

WhuddaWhat t1_j6h2pa9 wrote

> the AI advised a simulated patient to commit suicide

holy shit. Can you imagine being absolutely despondently suicidal, reaching out for help, and being told by what FEELS like an all-knowing computer (but is really just the most statistically relevant response to the series of things you've said) that, on reflection, it really would be best to go ahead and end it?

That would probably be enough to deepen the crisis for anybody who is truly battling to regain a sense of control over their life.
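And "most statistically relevant" is meant literally. Purely as an illustration (again using the small open GPT-2 model, not GPT-3, and a made-up prompt), you can watch a language model rank candidate next tokens by probability alone; nothing in this computation asks whether the top-ranked continuation is safe, true, or kind:

```python
# Sketch: inspect the raw next-token distribution. The model only ranks
# continuations by likelihood; no part of this checks whether a
# continuation is good advice.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "After thinking it over, the best advice is to"
ids = tokenizer.encode(prompt, return_tensors="pt")
with torch.no_grad():
    probs = torch.softmax(model(ids).logits[0, -1], dim=-1)

# Print the five most "statistically relevant" next tokens.
top = torch.topk(probs, 5)
for p, i in zip(top.values, top.indices):
    print(f"{tokenizer.decode([i.item()]):>12}  p={p:.3f}")
```

Whatever comes out on top is just what similar text on the internet tended to say next.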

2

DasKapitalist t1_j6hwp55 wrote

If you're taking life altering advice from a probabilistic language algorithm, you're pretty well doomed to begin with.

1