
Ortus14 t1_j4t0coc wrote

The article is more about the relationship limitations of current AI, rather than future AI.

In the future you will be able to ask your AI partner to teach you relationship skills and conflict resolution, and not to be a "yes man" or "yes woman", if that is your goal.

12

PhysicalChange100 t1_j4t88pq wrote

Agreed, if I were ever to have a relationship with an AI, I would want it to challenge my bullshit.

9

DadSnare t1_j4t84uf wrote

I feel like augmenting relationship skills in the moment, and connecting people to each other with intelligent assistance, could be the greatest gift of AI.

3

SupPandaHugger OP t1_j4u4niy wrote

How? They still have to be given instructions to pursue certain objectives, and they won't go through life experiencing things the way humans do. And if AIs weren't yes-men, the initial appeal would also be gone.

2

Ortus14 t1_j4ufhfq wrote

There's lots of conversational data for AI to learn from: the big tech devices in our homes that record our conversations (Alexa, Google Assistant, Siri, etc.), the platforms and social media that record conversations, and future devices such as smart taxis and home security systems that will also record our conversations.

As far as the appeal goes, in this case the goal would be using the AI to teach people how to have better real-world friendships and relationships, specifically for people who don't yet have the social skills to get that practice with a real person.

This is not a goal I made up; I got it from how the article was framed.

6

SupPandaHugger OP t1_j4ungkr wrote

Sure, but the data is not the problem. The problem is that it has no will of its own; it needs nothing and will only do what it's told.

Yes, that makes sense, to teach or coach people to improve socially. But that isn't a relationship with an AI, but rather a tool for improving relationships with real people.

1