Submitted by Ssider69 t3_11apphs in technology
Ssider69 OP t1_j9ucc5f wrote
Reply to comment by Kaekru in Microsoft Bing AI ends chat when prompted about 'feelings' by Ssider69
AI chatbots aren't sentient??? Holy fuck... you're kidding me...
IOW... no shit.
My point, "my guy," is that any system that fucks up as routinely as AI chat does is the result of designers not testing thoroughly. And if it's not ready for prime time... don't release it.
Or is that too direct a concept, "my guy"?
AI chat is just another example of dressing up mounds of processing power to do something that seems cool but is not only flawed but useless.
It kind of sums up the industry, really, and in fact most of the IT business right now.
Kaekru t1_j9ucvr1 wrote
>is that any system that fucks up as routinely as AI chat does is the result of designers not testing thoroughly

Any system that learns from experience will be fucked up if people fuck with it.
The same way, if you raise a child to be a fucked-up person, they will become a fucked-up adult.
You don't seem to understand jack shit about machine learning. A "foolproof" chatbot wouldn't be a good chatbot at all, since it couldn't operate outside its predetermined replies and topics.
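To make the point concrete, here is a minimal sketch of the kind of "foolproof" bot described above: it only matches a fixed set of keywords, so users can't trick it into saying anything unscripted, but it also can't converse about anything else. The keywords and replies are hypothetical illustrations, not any real product's behavior.

```python
# A scripted chatbot with only predetermined replies. It is trivially
# "safe" (nothing users type can push it off-script) but useless the
# moment a message falls outside its fixed topics.

RULES = {
    "hello": "Hi! How can I help?",
    "hours": "We are open 9am-5pm, Monday through Friday.",
    "refund": "Refunds are processed within 5 business days.",
}

def scripted_reply(message: str) -> str:
    """Return a canned reply if a known keyword appears, else punt."""
    text = message.lower()
    for keyword, reply in RULES.items():
        if keyword in text:
            return reply
    # Out-of-scope input: the bot can't be manipulated here, but it
    # also can't actually hold a conversation.
    return "Sorry, I can't help with that."
```

A learning-based chatbot trades this rigidity for open-ended replies, which is exactly what makes it both useful and manipulable.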
businessboyz t1_j9v3n69 wrote
>And if it's not ready for prime time... don't release it
Good thing they didn’t and this has been an open waitlist beta so that the developers can gather real world experience and update the product accordingly.
You can't ever anticipate every way users will use your product or design a fail-proof piece of software. That's why products go through many stages of testing and release, reaching wider and more public audiences with each iteration.