
Kujo17 t1_jdtxdjk wrote

I made a "survival bot" on Character AI beta or whatever. It was pretty cool as a novelty, and it was fun to roleplay disaster scenarios where the internet had crashed, society had collapsed, and it was my only source of "knowledge" other than my own. While truly a novelty for entertainment value, it got me thinking even then: if one had a way to run one locally on a small enough device - like a phone - it really would be a must-have/perfect survival tool, not just for info on survival specifically but for literally anything/everything available in its training data.

This was only a few months ago, and I had the thought "wow, in my lifetime there's a good chance that could be a reality". Saying that then even felt crazy, but seeing how far they've been able to come in scaling the size requirement down and getting them to run on essentially something the size of a phone while losing very little in terms of its abilities... it literally puts something like the survival bot "Hope" into tangible grasp, likely in the next year or two if not before then.

Maybe I'm being overzealous but I really don't think I am 🤷 lol

You're def not the only one to think about this, and I've absolutely been keeping an eye out for this very reason - among others lol

10

Anjz OP t1_jdtycqs wrote

I think very soon there will be ASIC (application-specific integrated circuit) low-powered devices that can run powerful language models locally.

It's within our grasp. Might be integrated into our smartphones sooner rather than later, actually.
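The main trick that already makes phone-sized models plausible is weight quantization: storing parameters in fewer bits so the model fits in a small device's memory. Here's a minimal toy sketch of symmetric int8 quantization (illustrative only - real on-device runtimes use more elaborate schemes like grouped 4-bit quantization):

```python
import random

def quantize_int8(weights):
    """Symmetric per-tensor int8 quantization: map floats into [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]  # 1 byte each vs 4 for float32
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 codes."""
    return [v * scale for v in q]

random.seed(0)
weights = [random.gauss(0.0, 1.0) for _ in range(10_000)]

q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# Worst-case rounding error is half a quantization step (scale / 2),
# which is tiny relative to the 4x memory saving from int8 storage.
err = max(abs(a - b) for a, b in zip(weights, restored))
print(f"max abs error: {err:.4f}  (scale = {scale:.4f})")
```

The 4x shrink (and 8x at 4-bit) is exactly why "only losing very little in terms of abilities" on phone-class hardware has gone from sci-fi to engineering.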

10

fluffy_assassins t1_jdvgg2n wrote

This is exactly what tensor cores/NPUs are: hardware VERY specifically suited to AI. They're bought by the boardful to run AI servers. Basically, your graphics card is already an AI ASIC.

2