Neural networks have a reputation for being better at solving statistical or
approximate problems than at performing calculations or working with symbolic
data. In this paper, we show that they can be surprisingly good at more
elaborate tasks in...

Guillaume Lample

Our new paper, Deep Learning for Symbolic Mathematics, is now on arXiv arxiv.org/abs/1912.01412
We added *a lot* of new results compared to the original submission. With @f_charton (1/7) pic.twitter.com/GrhQRT5WRW


The code for our @iclr_conf paper, Deep Learning for Symbolic Mathematics, is now available in @PyTorch! We also provide our datasets and pretrained models
Code: github.com/facebookresear…
Paper: arxiv.org/abs/1912.01412

Amazing results applying transformers to symbolic function integration and differential equation solving by Guillaume Lample and François Charton from FAIR-Paris.
Succeeds in many cases where Mathematica fails.
Paper:...

Transformers work wonders on natural language. Given enough examples, they can translate without a dictionary. Why not consider mathematics as a language and problem solving as translation tasks?
with @GuillaumeLample
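
The "mathematics as a language" framing above rests on one concrete step: serializing an expression tree into a token sequence, which is what the paper does with prefix (Polish) notation. A minimal sketch of that serialization, using a hypothetical tree encoding of my own (not the authors' code or token vocabulary):

```python
# Sketch (illustrative, not the authors' implementation): the paper represents
# expressions as trees and writes them in prefix notation, so a seq2seq
# transformer can "translate" a problem sequence into a solution sequence.

def to_prefix(node):
    """Serialize an expression tree into a flat list of prefix tokens.

    A node is either a leaf (string) or a tuple (operator, child, ...).
    Prefix order is unambiguous, so no parentheses are needed.
    """
    if isinstance(node, tuple):
        op, *children = node
        tokens = [op]
        for child in children:
            tokens.extend(to_prefix(child))
        return tokens
    return [str(node)]

# Example: the derivative of x * cos(x) with respect to x,
# encoded as a tree (operator names here are assumptions for illustration).
expr = ("derivative", ("mul", "x", ("cos", "x")), "x")
print(to_prefix(expr))
# ['derivative', 'mul', 'x', 'cos', 'x', 'x']
```

With problems and solutions both flattened this way, integration or ODE solving becomes sequence-to-sequence prediction, exactly the setting transformers handle for natural language.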