Submitted by Ssider69 t3_11apphs in technology
Ssider69 OP t1_j9u96pu wrote
Reply to comment by Kaekru in Microsoft Bing AI ends chat when prompted about 'feelings' by Ssider69
Literally, the developers are the ones designing the system. Anything it does is on them: their failure to recognize a problem is the same as directly causing it.
I used "literally" because that, in Gen Z speak, means "no, I really mean it."
Kaekru t1_j9ubdjz wrote
That's not how fucking AI works, my guy.
AI chatbots are not sentient. One will take whatever topic you give it and parrot it back to you, drawing on its data about past conversations on that topic.
If you prompt the AI to talk about death, it is forced to talk about death and will give you a reply about death. If you prompt it to talk about self-awareness, it will give you replies about self-awareness.
That is how it works, simple: you can get a chatbot to say pretty much anything you want given the correct triggers. It doesn't mean it's sentient, or that its replies were put in there deliberately, or that it was pre-programmed by a depressed developer.
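A toy sketch of that mechanism (the topic strings and canned replies here are invented; a real LLM completes text statistically rather than via a lookup table, but the point stands: the reply is a function of the prompt, so steering the prompt steers the output):

```python
# Toy sketch (hypothetical; not Bing's actual model): the reply is a pure
# function of the prompt, so the prompt's topic fully determines the output.
# A real LLM does the same thing statistically, completing text conditioned
# on the conversation so far.
CANNED_TOPICS = {
    "death": "Here is what my training data says about death...",
    "self-awareness": "Conversations in my data describe self-awareness as...",
    "feelings": "Discussions of feelings in my data look like...",
}

def reply(prompt: str) -> str:
    """Echo back whichever topic the user steered the conversation toward."""
    text = prompt.lower()
    for topic, completion in CANNED_TOPICS.items():
        if topic in text:
            return completion  # the user's prompt, not a hidden mood, chose this
    return "Tell me more."

print(reply("Let's talk about death"))       # -> the canned 'death' reply
print(reply("Do you have self-awareness?"))  # -> the 'self-awareness' reply
```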
Ssider69 OP t1_j9ucc5f wrote
AI chatbots aren't sentient??? Holy fuck... you're kidding me...
IOW... no shit.
My point..."my guy" is that any system that routinely fucks up as much as AI chat is the result of designers not thoroughly testing. And if it's not ready for prime time . .don't release it
Or is that too direct a concept ..."my guy"
AI chat is just another example of dressing up mounds of processing power to do something that seems cool but is not only flawed but useless.
It kind of sums up the industry, really, and in fact most of the IT business right now.
Kaekru t1_j9ucvr1 wrote
>is that any system that routinely fucks up as much as AI chat does is the result of designers not testing thoroughly
Any system that learns from experience will be fucked up if people fuck with it.
The same way, if you raise a child to be a fucked-up person, they will become a fucked-up adult.
You don't seem to understand jack shit about machine learning. A "foolproof" chatbot wouldn't be a good chatbot at all, since it wouldn't be able to operate outside its pre-determined replies and topics.
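To make that trade-off concrete, a hypothetical sketch (the SCRIPT entries are invented): a bot limited to pre-determined replies is trivially safe, but useless the moment a user goes off-script.

```python
# Hypothetical sketch of a "foolproof" chatbot built only from pre-determined
# replies (the SCRIPT entries here are invented). It can never say anything
# unexpected -- and it also can't hold a conversation, because anything
# off-script hits the fallback.
SCRIPT = {
    "hello": "Hi! How can I help?",
    "what are your hours": "We are open 9-5, Monday to Friday.",
}

def scripted_reply(prompt: str) -> str:
    # Exact-match lookup: zero risk of a weird answer, zero flexibility.
    return SCRIPT.get(prompt.strip().lower(), "Sorry, I can't help with that.")

print(scripted_reply("Hello"))                        # on-script: works
print(scripted_reply("Tell me about your feelings"))  # off-script: dead end
```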
businessboyz t1_j9v3n69 wrote
>And if it’s not ready for prime time... don’t release it
Good thing they didn’t, and this has been an open-waitlist beta so that the developers can gather real-world experience and update the product accordingly.
You can’t ever anticipate all the ways users will use your product and design a fail-proof piece of software. That’s why products go through many stages of testing and are released to wider, more public audiences with each iteration.
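That staged widening is often implemented as a simple deterministic rollout gate. A minimal sketch, with the cutoff, hashing scheme, and user ids purely illustrative (this is not Microsoft's actual waitlist system):

```python
import hashlib

# Illustrative sketch of a staged rollout gate (the cutoff and user ids are
# assumptions). Each release stage raises the cutoff to expose the beta to a
# wider slice of users, deterministically per user.
ROLLOUT_PERCENT = 5  # e.g. 5% during early beta, raised as confidence grows

def in_beta(user_id: str) -> bool:
    """Hash the user id into a 0-99 bucket and admit buckets below the cutoff."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    return bucket < ROLLOUT_PERCENT

print(in_beta("alice@example.com"))  # same user always gets the same answer
```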