Submitted by calbhollo t3_11a4zuh in singularity
ActuatorMaterial2846 t1_j9r171j wrote
I'm pretty stupid, but I just want to grasp something if it can be clarified.
A basic function can be described as an equation with a definite answer, like 1+1=2.
But what these neural networks seem to do is take a basic function and provide an approximation. That approximation seems to be based on context, perhaps on the equation preceding or following it.
I've heard it described as complex matrices with inscrutable floating-point numbers.
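If I try to picture it as code, I imagine something like this (Python/NumPy, made-up numbers, just my guess at the idea):

```python
import numpy as np

# Made-up "learned" weights -- close to the true rule (just add the two inputs),
# but not exactly it.
W = np.array([0.997, 1.004])
bias = -0.002

def approx_add(a, b):
    return np.dot(W, [a, b]) + bias

print(approx_add(1, 1))  # ~1.999, not exactly 2
```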
Have I grasped this or am I way off?
Girafferage t1_j9r3juu wrote
No, you aren't way off. They run off models, which are huge sets of parameters learned from training data that tell the network what any given thing looks like. Using that model and the rules written into the code around the neural net, it gives a result from an input. The input can be images, sounds, whatever, but the model has to be trained specifically to handle that type of input, or in some cases multiple types.
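To make "model" less abstract: under the hood it really is just stored arrays of numbers (the weights) that get applied to your input. A rough sketch of that idea in Python/NumPy (toy sizes, random stand-in values, not any real model):

```python
import numpy as np

# Stand-in for training: in a real model these numbers are learned, not random.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(3)

# Saving the model = saving those numbers to a file.
np.savez("tiny_model.npz", W1=W1, b1=b1)

# Using the model later = loading the numbers and applying them to an input.
params = np.load("tiny_model.npz")
x = np.array([0.2, -1.0, 0.5, 3.1])  # has to be the kind of input it was trained on
hidden = np.maximum(0, x @ params["W1"] + params["b1"])  # one small layer
print(hidden)
```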
After that you usually run the network a bunch of times. At the start you get pretty much garbage coming out, so you adjust the weights to see what works best and do some training where the network gives you a result, you say yes that's right or no that's wrong, and it takes that feedback into account for its future outputs. That is not the same as a person telling something like ChatGPT it is wrong or right; at that point the model is done and complete, and you aren't rewriting anything. The developers might take those conversations into account and use the corrections to improve the model later, but that's a separate process and not at all like chatting with the AI.
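The training part looks roughly like this in code: compare the output to the right answer, nudge the weights a little, repeat; once you stop, using the model doesn't touch the weights anymore. A toy sketch in Python/NumPy (logistic-regression-style update, made-up data):

```python
import numpy as np

# Toy labeled data: inputs plus the "yes that's right / no that's wrong" answers.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([0.0, 0.0, 0.0, 1.0])

rng = np.random.default_rng(0)
w = rng.normal(size=2)  # weights start out as garbage
b = 0.0
lr = 0.1

def predict(inputs, w, b):
    return 1 / (1 + np.exp(-(inputs @ w + b)))  # squash to 0..1

# Training loop: look at the error and nudge the weights a little each time.
for _ in range(5000):
    p = predict(X, w, b)
    w -= lr * (X.T @ (p - y)) / len(y)
    b -= lr * np.mean(p - y)

# Inference: the weights are frozen now; asking it questions doesn't change them.
print(predict(np.array([1.0, 1.0]), w, b))  # pushed toward 1
```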
I have mostly worked with image-related neural networks for tracking and detection (tracking works a lot differently from detection), but I also had a hobby project with one for text that determined the mood of a set of sentences (sad, happy, lonely, confused, scared, etc.). That text one is easy to do for any programmer, and not too bad for someone who isn't programming-savvy either.
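If anyone wants to try that text/mood idea, the hobby version is only a few lines with scikit-learn (made-up example sentences and labels, just to show the shape of it):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Made-up training sentences with mood labels.
texts = [
    "i miss everyone so much", "nobody ever calls me",
    "what a great day", "i love this so much",
    "i have no idea what is going on", "none of this makes sense to me",
]
labels = ["sad", "lonely", "happy", "happy", "confused", "confused"]

# Bag-of-words features plus a simple classifier.
clf = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)

print(clf.predict(["this makes no sense"]))  # most likely 'confused'
```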