Mental-Swordfish7129 t1_j2xrr7a wrote
Reply to comment by maizeq in [R] Do we really need 300 floats to represent the meaning of a word? Representing words with words - a logical approach to word embedding using a self-supervised Tsetlin Machine Autoencoder. by olegranmo
>How do you achieve something similar in your binary latent space?
All incoming data is encoded into these high-dimensional binary vectors, where each index in a vector corresponds to a relevant feature in the real world. Computing the error is then as simple as XOR(actual incoming data, prediction), which preserves the semantic detail of exactly how the prediction was wrong.
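A minimal numpy sketch of that error computation, assuming boolean feature vectors with made-up sizes and sparsities (not the actual system):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sparse binary vectors; each index stands for one real-world
# feature, so a set bit means "this feature is present".
n_bits = 2048
actual = rng.random(n_bits) < 0.02       # incoming data
prediction = rng.random(n_bits) < 0.02   # the model's prediction

# Error = XOR(actual, prediction): a bit is set exactly where prediction and
# reality disagree, so the error vector itself names the features that were
# missed or hallucinated.
error = np.logical_xor(actual, prediction)

missed = np.flatnonzero(actual & ~prediction)    # present but not predicted
spurious = np.flatnonzero(~actual & prediction)  # predicted but absent
print(f"{error.sum()} mismatched bits: {len(missed)} missed, {len(spurious)} spurious")
```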
There is no fancy activation function: a cell's activation is just the sum of its connected synapses that land on an active element.
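For illustration, that activation rule could look like this in numpy (the binary connectivity matrix and the shapes are assumptions on my part):

```python
import numpy as np

rng = np.random.default_rng(1)

n_inputs, n_cells = 2048, 256
# connections[c, i] == True means cell c has a connected synapse onto input i.
connections = rng.random((n_cells, n_inputs)) < 0.05
active_inputs = rng.random(n_inputs) < 0.02      # currently active elements

# "Activation" is just the count of connected synapses that hit an active
# element -- no weights, no nonlinearity.
activation = (connections & active_inputs).sum(axis=1)

# e.g. keep the k most-activated cells as the active set for the next step
k = 10
winners = np.argsort(activation)[-k:]
```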
Synapses are binary: connected or not. Their permanence decays over time and is increased when they're useful often enough, so connections form and dissolve.
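A rough sketch of that permanence scheme, with made-up threshold/decay/reward constants standing in for whatever the real system uses:

```python
import numpy as np

rng = np.random.default_rng(2)

n_synapses = 1000
permanence = rng.random(n_synapses) * 0.3   # scalar permanence per potential synapse
CONNECT_THRESHOLD = 0.2                     # connected only above this (hypothetical)
DECAY = 0.001                               # applied every step
REWARD = 0.05                               # applied when the synapse was useful

def update(permanence, useful_mask):
    """Decay all permanences, boost the useful ones, and return the
    resulting binary connection state."""
    permanence = np.clip(permanence - DECAY + REWARD * useful_mask, 0.0, 1.0)
    connected = permanence >= CONNECT_THRESHOLD   # binary: connected or not
    return permanence, connected

useful = rng.random(n_synapses) < 0.1   # which synapses helped this step
permanence, connected = update(permanence, useful)
print(f"{connected.sum()} of {n_synapses} synapses currently connected")
```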