
agsarria t1_ja3qx0d wrote

Running LLMs on a desktop is very interesting, but running one on a phone doesn't make any sense... Just send a request and get the response; it will probably be faster and much less demanding on the battery.

3

NoidoDev t1_ja5uunb wrote

With a home server, maybe. Corporate AI, no thanks. It's also more interesting for robots, which have bigger batteries. Stay at home with your robowaifu and care less about your phone.

1