Mish: A Self Regularized Non-Monotonic Activation Function


Project status: Published/In Market

Artificial Intelligence

Groups
Student Developers for AI, DeepLearning, Artificial Intelligence India


Overview / Usage

Non-linearity in a neural network is introduced by an activation function, which plays an integral role in the training and performance of the network. Over years of research, many activation functions have been proposed; however, only a few are widely used across most applications, including ReLU (Rectified Linear Unit), TanH (hyperbolic tangent), Sigmoid, Leaky ReLU, and Swish. In this work, a novel neural activation function called Mish is proposed. Experiments show that Mish tends to work better than both ReLU and Swish, as well as other standard activation functions, in many deep networks across challenging datasets. For instance, in a Squeeze Excite Net-18 for CIFAR-100 classification, the network with Mish improved Top-1 test accuracy by 0.494% and 1.671% compared to the same network with Swish and ReLU, respectively. Its similarity to Swish, the boost in performance, and its simplicity of implementation make it easy for researchers and developers to use Mish in their neural network models.
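Mish is defined as f(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^x)). Below is a minimal PyTorch sketch of the activation; the module name and the sanity-check values are illustrative, not the repository's exact API:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Mish(nn.Module):
    """Mish activation: f(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^x))."""
    def forward(self, x):
        # softplus is a smooth approximation of ReLU; tanh squashes it to (-1, 1),
        # so the product is unbounded above and bounded below like Swish.
        return x * torch.tanh(F.softplus(x))

# Quick sanity check on a few values
x = torch.tensor([-2.0, 0.0, 2.0])
print(Mish()(x))  # approx. [-0.2525, 0.0, 1.9440]
```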

Methodology / Approach

Frameworks used (a TensorFlow/Keras sketch of the activation follows this list):

  • PyTorch

  • NumPy

  • TensorFlow

  • Keras
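
Since the project spans several frameworks, here is a sketch of the same definition in TensorFlow/Keras, using only the public tf.math and tf.keras APIs; it is an illustration rather than the repository's exact code:

```python
import tensorflow as tf

def mish(x):
    # Mish: x * tanh(softplus(x)); usable directly as a Keras activation.
    return x * tf.math.tanh(tf.math.softplus(x))

# Example: plug the callable into a Keras layer
layer = tf.keras.layers.Dense(64, activation=mish)
```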

Technologies Used

Models used (a sketch of swapping Mish into an existing model follows this list)

  • ResNet (all variants)

  • WRN (Wide ResNet; all variants)

  • DenseNet (all variants)

  • MobileNet (v1 and v2)

  • SENet (all variants)

  • ShuffleNet (v1 and v2)

  • SqueezeNet

  • SimpleNet

  • VGG

  • AlexNet

  • EfficientNet (B0, B1, B2)

  • Inception ResNet v2

  • Capsule Net

  • Xception

  • UNet

  • Inception v3
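
To illustrate how Mish can be dropped into architectures like these, the sketch below recursively swaps every nn.ReLU in a torchvision resnet18 for the Mish module defined earlier; the helper name and the model choice are assumptions for the example, not the project's actual tooling:

```python
import torch.nn as nn
from torchvision.models import resnet18

def replace_relu_with_mish(module: nn.Module) -> None:
    # Walk the module tree and swap each nn.ReLU for Mish (defined above).
    for name, child in module.named_children():
        if isinstance(child, nn.ReLU):
            setattr(module, name, Mish())
        else:
            replace_relu_with_mish(child)

model = resnet18()
replace_relu_with_mish(model)  # model now applies Mish wherever ReLU appeared
```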

Repository

https://github.com/digantamisra98/Mish
