
hostilereplicator t1_isrzugb wrote

I would echo the others here and say that, depending on the focus of your paper, the big conferences do take maths/theory papers (NeurIPS, ICML, ICLR, COLT, also AIStats and UAI depending on your topic), and JMLR takes longer papers. But all of these conferences are very competitive and have a large random component in what gets accepted… it may also be worth looking at workshops at these conferences to see if anything fits better. Less “prestigious”, but also easier to get into and more likely to be reviewed by a suitable/friendly referee.

What’s the topic of your research?


vajraadhvan OP t1_iss3bb6 wrote

Approximation theory traditionally looks at the structure of function spaces under addition, but approximation spaces under composition are underexamined. Studying approximation spaces under composition may help explain quantitatively why neural networks perform as well as they do, reveal links to dynamical systems, and suggest related architectures.

(Edit: Following the work of Weinan E, Chao Ma, Lei Wu, Ronald DeVore, Gitta Kutyniok, et al.)
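To make the contrast concrete, here is a rough sketch in my own notation (an illustration only, not a claim about how any of the papers above set things up): classical n-term approximation measures the error of the best sum drawn from a dictionary, whereas a deep network approximates by composing simple layer maps, and the approximation space records how fast the error decays as the budget grows.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Additive (classical) n-term approximation of f from a dictionary D:
\[
  \sigma_n(f) \;=\; \inf_{\substack{g_1,\dots,g_n \in \mathcal{D}\\ c_1,\dots,c_n \in \mathbb{R}}}
  \Bigl\|\, f - \sum_{i=1}^{n} c_i\, g_i \,\Bigr\|
\]
% Compositional approximation by a depth-L network: alternate affine maps
% T_i with a fixed nonlinearity sigma, and compose rather than sum:
\[
  f \;\approx\; T_L \circ \sigma \circ T_{L-1} \circ \cdots \circ \sigma \circ T_1
\]
% An approximation space then collects the f whose error decays at a
% prescribed rate as n (or the depth/width budget) grows.
\end{document}
```

The question I want to study is what the analogue of these spaces looks like when the sum is replaced by the composition.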
