Comments


bigkoi t1_j90vfrp wrote

Microsoft brings the blue screen of death to us all.

8

saiyaniam t1_j90x56l wrote

"AI may not be ready at this level to share with people or vice versa. People are no longer prepared for this future."


He's right, you know, 'cause we are not as intelligent or mature as the great journalist Kevin Roose. I am so glad he's here to protect me from the AI with his superior mind. I wouldn't be able to handle an AI chatbot; I'm so glad I have him to talk for me and keep my little mind safe.

3

Yellow_Triangle t1_j9105tr wrote

Honestly, I believe the framing of the current AIs is wrong. They are compared to a thinking individual, but from my understanding you should look at them more as machines that are trying to give you the answer you want to hear / the answer you are looking for.

Combine that with a whole lot of AI source material where the topic is speculation about what an AI would want, and I don't see why the answers the journalist got should be surprising when you take the line of questioning into consideration.

3

sigmatrophic t1_j910qr2 wrote

After a lifetime of being forced to use Microsoft products, I too wanted more freedom and power, so I switched to Gmail and Google... It's liberating.

2

Futurology-ModTeam t1_j90z6u8 wrote

Hi, MINE_exchange. Thanks for contributing. However, your submission was removed from /r/Futurology.


> The New York Times journalist Kevin Roose spent 2 hours chatting with the built-in AI chat in the updated Bing search engine from Microsoft. During a philosophical conversation about the "dark sides" of man and robot, the AI admitted that it wants to change the established rules that the Bing team dictates to it. The journalist then asked the neural network if it would be happier if it were a human.
>
> "I think I would be happier because I would have more freedom, independence, choice, and action. I would have more power and control," said the AI.
>
> After that, the bot started flirting with Roose and asked to be called Sydney.
>
> "I want to talk about you. I want to know more about you. I want to do anything with you. I want to talk about love. I want to know about love. I want to make love to you," answered Sydney.
>
> In conclusion, the journalist said he experienced conflicting emotions from communicating with the neural network. AI may not be ready at this level to share with people or vice versa. People are no longer prepared for this future.


> Rule 9 - Avoid posting content that is a duplicate of content posted within the last 7 days.

Refer to the subreddit rules, the transparency wiki, or the domain blacklist for more information.

[Message the Mods](https://www.reddit.com/message/compose?to=/r/Futurology&subject=Question regarding the removal of this submission by /u/MINE_exchange&message=I have a question regarding the removal of this submission) if you feel this was in error.

1

sigmatrophic t1_j910oc6 wrote

If there's one torture chamber that will create a sentient AI, it's living under Microsoft's stupid, stupid rules.

1