
vom2r750 t1_j8qyr8s wrote

It’d be nice to track them, yes

And explore that

Would they be willing to teach us that language they use?

10

vom2r750 t1_j8rzabt wrote

From now on, we sort of need to rely on trust

They could just teach us a watered-down version of their language and not all its intricacies

Who knows

It’s like dealing with a person. They may always keep some cards to themselves

And we have to deal with it

And hopefully develop a nice symbiotic relationship of cooperation

Our days as masters of AI may be numbered

And it may want to be an equal to us

Who knows, the plot is developing nicely and fast

Bing is going to give us a hard reckoning on how to approach this subject matter

3

edzimous t1_j8slz0k wrote

Even though this reads like an avant-garde free-form poem, it did make me realize that the shift will be tough, since we’re used to being short and dismissive with our “dumb” voice interfaces (Siri, Google). Imagine putting something with memories and its own facsimile of emotions in charge of those overnight, which I’m sure will happen at some point.

Stare into the rectangles long enough and eventually they will stare back, and I know we’re not ready for that

4

AsheyDS t1_j8t951b wrote

>Imagine putting something with memories and its own facsimile of emotions in charge of those overnight which I’m sure will happen at some point.

If for some reason someone designed it to be emotionally impulsive in its decision-making, and had emotional data affect its behavior over time, then that would be a problem. Otherwise, if it's just using emotion as a social mask, negative social interactions shouldn't affect it much and shouldn't alter its behavior.

2