Facebook’s AI Mathematician Can Solve University Calculus Problems

Maths is a complex and challenging subject for most human beings, and even for artificial neural networks. Most neural networks are only capable of performing simple addition and multiplication problems.

But in Paris, at Facebook AI Research, François Charton and Guillaume Lample have developed an algorithm that has learned to solve university-level calculus problems in seconds.

The pair trained an AI on tens of millions of calculus problems randomly generated by a computer.

In doing so, the Facebook researchers have, for the first time, trained a neural network to perform the symbolic reasoning needed to differentiate and integrate mathematical expressions.

Neural networks are highly capable of recognizing patterns, but they struggle with symbolic reasoning; at best, earlier neural networks could add and multiply whole numbers.

Facebook’s neural network is a significant step toward more powerful mathematical reasoning and a new way of applying neural networks beyond traditional pattern-recognition tasks.

To teach the AI to find solutions to mathematical problems, the team trained it using natural language processing (NLP), a computational tool commonly used to analyze language.

François Charton and Guillaume Lample taught a neural network to recognize the patterns of mathematical manipulation that are equivalent to integration and differentiation.

They then let the neural network loose on expressions it had never seen and compared the results with the answers derived by conventional solvers such as Mathematica and Matlab.

Lample and Charton broke the mathematical expressions into their component parts by representing them as tree-like structures. The leaves on these trees are numbers, constants, and variables like x, while the internal nodes are the operators that combine them.

This works because the mathematics in each problem can be thought of as a sentence, with variables, normally denoted x, playing the role of nouns, and operations, such as finding the square root, playing the role of verbs.
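
To make this concrete, here is a minimal Python sketch of the idea; the class, the tiny vocabulary, and the example expression are illustrative, not the researchers' code. An expression such as 3·x² + cos(2x) becomes a tree whose internal nodes are operators and whose leaves are numbers and variables, and that tree can be flattened into a sequence of tokens for a language model to read.

```python
# Illustrative sketch (not the authors' code): representing a mathematical
# expression as a tree and flattening it into a prefix-notation token sequence.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    label: str                        # operator name, variable, or number
    children: List["Node"] = field(default_factory=list)

def to_prefix(node: Node) -> List[str]:
    """Serialize the tree as a prefix (operator-first) token sequence."""
    tokens = [node.label]
    for child in node.children:
        tokens.extend(to_prefix(child))
    return tokens

# 3 * x**2 + cos(2 * x) as a tree: internal nodes are operators,
# leaves are numbers and the variable x.
expr = Node("+", [
    Node("*", [Node("3"), Node("pow", [Node("x"), Node("2")])]),
    Node("cos", [Node("*", [Node("2"), Node("x")])]),
])

print(to_prefix(expr))
# ['+', '*', '3', 'pow', 'x', '2', 'cos', '*', '2', 'x']
```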

The AI then “translates” the problem into a solution.
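
Framed this way, integration and differentiation look like machine translation: a sequence of input tokens (the expression) is mapped to a sequence of output tokens (its solution). Below is a hedged sketch of that framing using a standard PyTorch transformer; the vocabulary size, layer counts, and other hyperparameters are placeholders rather than the values used by the researchers.

```python
# Illustrative sketch (assumed architecture, not the authors' exact model):
# treating "expression -> solution" as sequence-to-sequence translation.
import torch
import torch.nn as nn

class ExpressionTranslator(nn.Module):
    def __init__(self, vocab_size: int = 200, d_model: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=8,
            num_encoder_layers=4, num_decoder_layers=4,
            batch_first=True,
        )
        self.project = nn.Linear(d_model, vocab_size)

    def forward(self, src_tokens: torch.Tensor, tgt_tokens: torch.Tensor):
        # src_tokens: prefix tokens of the input expression (batch, src_len)
        # tgt_tokens: prefix tokens of the target solution   (batch, tgt_len)
        hidden = self.transformer(self.embed(src_tokens), self.embed(tgt_tokens))
        return self.project(hidden)    # logits over the output vocabulary

model = ExpressionTranslator()
src = torch.randint(0, 200, (8, 12))   # a batch of 8 tokenized expressions
tgt = torch.randint(0, 200, (8, 15))   # their tokenized solutions (teacher forcing)
logits = model(src, tgt)               # shape: (8, 15, 200)
```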

François Charton and Guillaume Lample created their training database by randomly assembling mathematical expressions from a library of binary operators such as addition, multiplication, and so on; unary operators such as cos, sin, and exp; and a set of variables, integers, and constants, such as π and e.

They also limited the number of internal nodes to keep the equations from becoming too big.

In this way, the team generated a massive training data set consisting, for example, of 80 million examples of first- and second-order differential equations and 20 million examples of expressions integrated by parts.
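
As a rough illustration of what such random generation might look like, here is a toy Python generator. It limits tree depth rather than counting internal nodes exactly, and the operator and leaf lists are placeholders, so it should be read as a sketch of the idea rather than the researchers' actual sampling procedure.

```python
# Toy sketch of random expression generation (illustrative only; the
# researchers' actual generator samples trees more carefully).
import random

BINARY_OPS = ["+", "-", "*", "/"]           # binary operators
UNARY_OPS = ["cos", "sin", "exp", "ln"]     # unary operators
LEAVES = ["x", "1", "2", "3", "pi", "e"]    # variables, integers, constants

def random_expression(max_depth: int = 5) -> str:
    """Build a random expression string, capping depth so trees stay small."""
    if max_depth <= 0 or random.random() < 0.3:
        return random.choice(LEAVES)        # stop: emit a leaf
    if random.random() < 0.5:
        op = random.choice(UNARY_OPS)
        return f"{op}({random_expression(max_depth - 1)})"
    op = random.choice(BINARY_OPS)
    left = random_expression(max_depth - 1)
    right = random_expression(max_depth - 1)
    return f"({left} {op} {right})"

print(random_expression())   # e.g. "(cos(x) + (2 * e))"
```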

Finally, François Charton and Guillaume Lample fed the neural network 5,000 expressions it had never seen before and compared the results it produced in 500 cases with those from commercially available solvers, such as Maple, Matlab, and Mathematica.

When the pair tested the AI on 500 calculus problems, it found a solution with an accuracy of 98%. A comparable standard program for solving maths problems reached only 85% accuracy on the same problems.

In many cases, the neural network was able to find solutions within 30 seconds.

One interesting observation was that the neural network was often able to find more than one solution to a single problem, because a maths problem can be solved in many equivalent ways.
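
Checking whether two differently written answers really are the same solution can itself be done symbolically. The sketch below uses the open-source SymPy library as a stand-in for the commercial solvers mentioned in the article: a candidate antiderivative is accepted if differentiating it recovers the original integrand.

```python
# Illustrative check with SymPy (a stand-in here for Maple, Matlab, or
# Mathematica): verify a proposed antiderivative by differentiating it
# and comparing against the original integrand.
import sympy as sp

x = sp.symbols("x")
integrand = x * sp.cos(x)

# Two superficially different candidate answers for the integral of x*cos(x).
candidate_a = x * sp.sin(x) + sp.cos(x)
candidate_b = sp.cos(x) + sp.sin(x) * x      # same solution, written differently

for candidate in (candidate_a, candidate_b):
    # The candidate is correct iff its derivative minus the integrand
    # simplifies to zero (up to the constant of integration).
    assert sp.simplify(sp.diff(candidate, x) - integrand) == 0

print("Both candidates are equivalent antiderivatives of x*cos(x).")
```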

“On all tasks, we observe that our model significantly outperforms Mathematica,” say the researchers.

“On function integration, our model obtains close to 100% accuracy, while Mathematica barely reaches 85%.” And the Maple and Matlab packages perform less well than Mathematica on average.

The neural network could also correctly answer questions that confounded the other maths programs.

“The ability of the model to recover equivalent expressions, without having been trained to do so, is very intriguing,” say Lample and Charton.

That’s a significant breakthrough. “To the best of our knowledge, no study has investigated the ability of neural networks to detect patterns in mathematical expressions,” says the pair.
