master3243 t1_ir94pxk wrote
Reply to comment by Ulfgardleo in [R] Discovering Faster Matrix Multiplication Algorithms With Reinforcement Learning by EducationalCicada
I don't think you're right, unless DeepMind is lying in the abstract of a Nature paper, which I highly doubt.
> Particularly relevant is the case of 4 × 4 matrices in a finite field, where AlphaTensor’s algorithm improves on Strassen’s two-level algorithm for the first time, to our knowledge, since its discovery 50 years ago
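For context, the "two-level" baseline quoted above comes from applying Strassen's 2x2 scheme recursively: one level uses 7 multiplications instead of 8, so two levels multiply 4x4 matrices with 7 * 7 = 49 multiplications, which AlphaTensor reduces to 47 over a finite field. A minimal sketch of one level (the function name is illustrative, not from the paper):

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices (given as nested lists) using
    Strassen's 7 multiplications instead of the naive 8."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

# Sanity check against the naive product:
assert strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]) == [[19, 22], [43, 50]]
```

The entries of A and B can themselves be matrix blocks, which is what makes the scheme recursive and gives the O(n^log2(7)) complexity.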
Ulfgardleo t1_ir95y3t wrote
Yeah, they are not right. The SOTA is the laser method.
They even missed the huge improvement from 1981...
https://ieeexplore.ieee.org/document/4568320
It's all behind the wiki link above, btw.
Ulfgardleo t1_ir997hv wrote
The worst thing, however, is that they do not even cite the practically relevant memory-efficient implementation of Strassen (https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.39.6887). One can argue that all matmul algorithms with better asymptotic complexity than Strassen's are irrelevant due to their constants, but not even comparing against the best memory-efficient implementation is odd, especially as they don't show an improvement in asymptotic complexity.