Z1BattleBoy21 t1_ja2qcli wrote
Reply to comment by duffmanhb in Meta unveils a new large language model that can run on a single GPU by AylaDoesntLikeYou
I did some research and you're right. I made my claim based on some Reddit threads saying Apple won't bother with LLMs as long as they can't be processed on local hardware, due to privacy; I retract the "required" part of my post, but I still believe they wouldn't go for it due to [1] [2](https://www.theverge.com/2021/6/7/22522993/apple-siri-on-device-speech-recognition-no-internet-wwdc)