Experiments using Neural Arithmetic Logic Units

Sayak Paul

Kolkata, West Bengal


This project is a reproduced implementation of Neural Arithmetic Logic Units (NALU) as proposed by Trask et al. (2018) (https://arxiv.org/pdf/1808.00508.pdf). It also shows how to construct a sequential neural network model that uses NALU for basic arithmetic functions such as addition, subtraction, and so on.

Project status: Under Development

Artificial Intelligence

Intel Technologies
Intel Opt ML/DL Framework

Overview / Usage

Neural networks can learn to represent and manipulate numerical information, but they seldom generalize well outside the range of numerical values seen during training. Even for simple operations like addition, subtraction, multiplication, division, and exponentiation, networks built from standard activation functions fail to extrapolate. To encourage more systematic numerical extrapolation, Trask et al. proposed the NALU.

This project implements NALU and shows how to train shallow neural networks that use it for the arithmetic functions mentioned above. As the authors note, even with NALU the networks fail to extrapolate well for operations like division and exponentiation.
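
For reference, below is a minimal PyTorch sketch of a single NALU cell. It follows the formulation in Section 2 of the paper, but the class name, initialisation, and epsilon value are my own choices and may differ from the code in the repository.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class NALU(nn.Module):
    """A single NALU cell, a minimal sketch after Section 2 of Trask et al. (2018)."""

    def __init__(self, in_dim, out_dim, eps=1e-7):
        super().__init__()
        self.eps = eps
        # Unconstrained parameters; the effective weight matrix is
        # W = tanh(W_hat) * sigmoid(M_hat), which biases its entries
        # towards {-1, 0, 1} so the cell learns to add/subtract inputs.
        self.W_hat = nn.Parameter(torch.empty(out_dim, in_dim))
        self.M_hat = nn.Parameter(torch.empty(out_dim, in_dim))
        self.G = nn.Parameter(torch.empty(out_dim, in_dim))
        for p in (self.W_hat, self.M_hat, self.G):
            nn.init.xavier_uniform_(p)

    def forward(self, x):
        W = torch.tanh(self.W_hat) * torch.sigmoid(self.M_hat)
        a = F.linear(x, W)                          # additive (NAC) path
        m = torch.exp(F.linear(torch.log(torch.abs(x) + self.eps), W))  # multiplicative path in log space
        g = torch.sigmoid(F.linear(x, self.G))      # learned gate between the two paths
        return g * a + (1.0 - g) * m
```

The gate g lets the cell interpolate between the purely additive path and the log-space multiplicative path, which is what allows one cell to cover addition/subtraction as well as multiplication/division.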

Methodology / Approach

As the idea of NALU is not mine, I encourage readers to take a look at Section 2 of the original paper: https://arxiv.org/pdf/1808.00508.pdf
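
In brief, Section 2 defines a neural accumulator (NAC) that computes a = Wx with W = tanh(Ŵ) ⊙ σ(M̂), and the NALU gates between this additive path and a multiplicative path m = exp(W log(|x| + ε)) using g = σ(Gx), giving y = g ⊙ a + (1 − g) ⊙ m. The sketch below shows roughly how such a cell can be trained on an addition task and probed for extrapolation; it reuses the NALU class sketched above, and the data generator and hyperparameters are illustrative rather than the ones used in this project.

```python
import torch
from torch import nn

# Assumes the NALU class from the sketch in the Overview section above.
torch.manual_seed(0)


def make_batch(n, low, high, op=lambda a, b: a + b):
    """Synthetic data: two random operands and the result of applying op."""
    x = torch.empty(n, 2).uniform_(low, high)
    y = op(x[:, 0], x[:, 1]).unsqueeze(1)
    return x, y


model = NALU(in_dim=2, out_dim=1)            # a single NALU cell as the whole model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for step in range(10_000):
    x, y = make_batch(64, 0.0, 10.0)         # training (interpolation) range
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

# Evaluate far outside the training range to probe extrapolation.
x_test, y_test = make_batch(1024, 100.0, 1000.0)
with torch.no_grad():
    print("extrapolation MSE:", loss_fn(model(x_test), y_test).item())
```

Swapping the op argument of make_batch (e.g. to multiplication or division) gives the other arithmetic tasks discussed above.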

Technologies Used

  • Python (Intel)
  • numpy, PyTorch (main libraries)

Repository

https://github.com/sayakpaul/NALU
