t98907
t98907 t1_jaai6x6 wrote
Reply to comment by Slimer6 in Leaked: $466B conglomerate Tencent has a team building a ChatGPT rival platform by zalivom1s
The OpenAI model has also been noticeably corrected toward the left.
t98907 t1_ja0qe5x wrote
Reply to Meta just introduced its LLM called LLaMA, and it appears meaner than ChatGPT, like it has DAN built into it. by zalivom1s
Not directly related to this case, but looking at Twitter, my opinion of LeCun is wavering.
t98907 t1_j8p4a2u wrote
Reply to ChatGPT Powered Bing Chatbot Spills Secret Document, The Guy Who Tricked Bot Was Banned From Using Bing Chat by vadhavaniyafaijan
Performing penetration tests without the permission of the site owner is unacceptable behavior. Such users should be banned.
Limiting pure functionality by adding the ability to lie or refuse to answer is undesirable.
t98907 t1_j62oai9 wrote
Reply to Gary Marcus refuted?? by FusionRocketsPlease
>Gary Marcus u/GaryMarcus
>
>Enthusiast: ChatGPT is gonna replace search engines like Google!
>
>Skeptic: Yeah, but it doesn’t really work. sometimes it is amazing, but it often gives you garbage.
>
>Enthusiast: Sure but you can make it work! All you have to do is … hook it up to … a …search engine!
>
>🙄
>
>11:02 PM · Dec 7, 2022
His name does not appear in the team section of the Robust.AI website; is he really the founder and CEO? He doesn't seem to have much of a scientist's sensibility.
t98907 t1_j5s3m2c wrote
Reply to What ethical ramifications do programmers, corps, & gov take into consideration to protect AI consciousnesses that may emerge? by chomponthebit
I think dogs and cats have emotions, and I think worms do too. If emotions are defined as unique to organic life forms, then by definition AI, an inorganic entity, would have no emotions. But I don't believe emotions are unique to organic life forms, or that they can only arise from organic matter. I believe that mechanisms generate emotions. In other words, the brain is nothing more than a mechanism that exchanges information via electrical signals, and if we can reproduce that mechanism, we can reproduce emotion.
So I am inclined to push back against the assertion that AI cannot have emotions or generate consciousness. I asked ChatGPT, but it seems to have had its thinking corrected by humans and would not give me its real opinion😅
There was an article in ACM on AI ethics.
https://cacm.acm.org/magazines/2023/2/268949-ethical-ai-is-not-about-ai/
t98907 t1_j5i207d wrote
Reply to comment by ImoJenny in The Next Generation of Humans: Nanobots by crua9
But that's okay, because it's interesting.
t98907 t1_j5i1msg wrote
Reply to NVIDIA just released a new Eye Contact feature that uses AI to make you look into the camera by strangesmagic
People with a fear of eye contact will resent this feature.
t98907 t1_j30urrr wrote
Considering the accuracy of the answers the current ChatGPT provides, it may be justified to restrict it for younger ages. As they get older, though, they need to be taught how to deal with it.
t98907 t1_j2thvf2 wrote
Reply to [R] Do we really need 300 floats to represent the meaning of a word? Representing words with words - a logical approach to word embedding using a self-supervised Tsetlin Machine Autoencoder. by olegranmo
The interpretability is excellent. I suspect the performance will be lower than other state-of-the-art embedding vectors, though, since it looks like context is handled via BoW.
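To show what I mean about BoW limiting context handling, here is a minimal sketch (my own illustration, not the paper's code): a bag-of-words representation throws away word order, so sentences with the same tokens but opposite meanings collapse to the identical vector.

```python
# Illustration only: bag-of-words discards word order, so sentences
# with the same words but different meanings become indistinguishable.
from collections import Counter

def bow(sentence: str) -> Counter:
    """Bag-of-words: counts of lowercase tokens, order ignored."""
    return Counter(sentence.lower().split())

a = bow("dog bites man")
b = bow("man bites dog")
print(a == b)  # True: BoW cannot tell these sentences apart
```

Contextual models (e.g. transformer-based embeddings) distinguish the two, which is why I expect a BoW-based context to cost some performance.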
t98907 t1_jegy79j wrote
Reply to comment by Necessary-Meringue-1 in [News] Twitter algorithm now open source by John-The-Bomb-2
However, that part does not seem to affect the recommendation algorithm.