Diffeologician t1_j43qshg wrote
Isn’t that the whole point of differentiable programming?
cdsmith t1_j45e09w wrote
Sort of. The promise of differentiable programming is to implement discrete algorithms in ways that are transparent to gradient descent, but in practice only the numerical values flowing through the program are transparent, not the discrete structure itself. The key idea here is the use of so-called TPRs (tensor product representations) to encode not just values but structure as well in a continuous way, so that there is an entire continuous deformation from the representation of one discrete structure to another. (Obviously, this deformation has to pass through intermediate states that are not directly interpretable as any single discrete structure, but the article argues that even these can represent valid states in some situations.)
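Here's a minimal NumPy sketch of the TPR idea (the symbols, roles, and dimensions are toy choices of mine, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: encode 3-character strings as TPRs.
# Fillers: one random vector per symbol; roles: one vector per position.
symbols = {s: rng.normal(size=8) for s in "abcd"}
roles = rng.normal(size=(3, 8))           # 3 positions, role dim 8
unbind = np.linalg.pinv(roles).T          # dual roles for decoding

def encode(string):
    """TPR of a string: sum of outer(filler, role) over positions."""
    return sum(np.outer(symbols[ch], roles[i]) for i, ch in enumerate(string))

def decode(T):
    """Unbind each position and pick the nearest symbol filler."""
    out = []
    for i in range(len(roles)):
        f = T @ unbind[i]                 # approximate filler at position i
        out.append(max(symbols, key=lambda s: symbols[s] @ f))
    return "".join(out)

A, B = encode("abc"), encode("dba")
print(decode(A), decode(B))               # 'abc' 'dba'

# The continuous deformation between two discrete structures:
mid = 0.5 * A + 0.5 * B                   # valid tensor, but no single string
print(decode(0.9 * A + 0.1 * B))          # near A; decodes to 'abc' with
                                          # high probability for random vectors
```

The point of the sketch is the last few lines: every point on the line segment between `A` and `B` is a well-formed tensor that the rest of a differentiable pipeline can consume, even though only the endpoints correspond to single discrete strings.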
Diffeologician t1_j45j7c4 wrote
So, there’s a trick where you write a differentiable program and swap out the expensive bits for a neural network, which I think is probably related to this. Looking at the article, I think you would very quickly run into some hard problems in differential geometry if you tried to make this formal.
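A hedged PyTorch sketch of that trick (the pipeline, the "expensive solver" framing, and all shapes are invented for illustration):

```python
import torch
import torch.nn as nn

# A differentiable program where one expensive, possibly non-differentiable
# step (say, a physics solver) is replaced by a small neural surrogate so
# gradients can flow end to end.
surrogate = nn.Sequential(nn.Linear(4, 64), nn.Tanh(), nn.Linear(64, 4))

def program(x, theta):
    """A hand-written differentiable program with a learned component."""
    h = torch.sin(theta * x)      # ordinary differentiable code
    h = surrogate(h)              # stands in for the expensive solver
    return h.sum(dim=-1)          # more ordinary code

theta = torch.tensor(1.0, requires_grad=True)
opt = torch.optim.Adam(list(surrogate.parameters()) + [theta], lr=1e-2)

x = torch.randn(32, 4)
target = torch.zeros(32)
for _ in range(100):
    opt.zero_grad()
    loss = ((program(x, theta) - target) ** 2).mean()
    loss.backward()               # gradients reach theta *through* the surrogate
    opt.step()
```

Note that the program's parameter `theta` only gets useful gradients because the surrogate sits in the middle of the computation graph, which is exactly why the surrogate has to be differentiable.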
currentscurrents OP t1_j43s8ki wrote
In the paper they talk about "first generation compositional systems" and I believe they would include differentiable programming in that category. It has some compositional structure, but the structure is created by the programmer.
Ideally the system would be able to create its own arbitrarily complex structures and systems to understand abstract ideas, like humans can.
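A toy illustration of what "structure created by the programmer" means (all names and values here are invented): the expression tree below is fixed in code, and gradient descent only tunes the numbers at its leaves.

```python
import torch

# "First generation" compositionality: the composition f(x) = (a*x + b)^2
# is hard-wired by the programmer as an expression tree; gradient descent
# can tune the leaf parameters a and b, but it cannot discover a
# different tree shape on its own.
a = torch.tensor(0.5, requires_grad=True)
b = torch.tensor(0.0, requires_grad=True)

def f(x):
    return (a * x + b) ** 2   # fixed structure, learnable numbers

opt = torch.optim.Adam([a, b], lr=0.05)
x, y = torch.tensor(2.0), torch.tensor(9.0)
for _ in range(500):
    opt.zero_grad()
    loss = (f(x) - y) ** 2
    loss.backward()
    opt.step()
# a*x + b converges near 3, so f(x) ≈ 9 — but the tree never changes.
```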