Novel Activation Functions Based on TanhExp Activation Function in Deep Learning

  • Marina Adriana Mercioni, Stefan Holban

Abstract

We present a brief synthesis of the most common and recent activation functions in order to examine how the choice of activation function affects the performance of a Deep Learning model. To this end, we test popular activation functions on Deep Learning models and use the results to determine which function is the most suitable for increasing performance on a given type of task. We introduce three novel activation functions, based on the TanhExp activation function, that improve performance on object classification tasks using the MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100 datasets; we also evaluate them on an anomaly detection task over time series data. For testing we used several architectures: LeNet-5 and AlexNet for CIFAR-10 and CIFAR-100, the custom architecture used in the original TanhExp implementation for MNIST and Fashion-MNIST, and a custom architecture for the time series task. To validate our proposal, we compare the new functions against the ReLU, hyperbolic tangent (tanh), and TanhExp activation functions.
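For reference, below is a minimal NumPy sketch of the three baseline activations used for comparison. The TanhExp definition, f(x) = x · tanh(eˣ), follows the original TanhExp paper; the three novel functions proposed in this work are not reproduced here, since the abstract does not state their formulas.

```python
import numpy as np

def relu(x):
    """ReLU: max(0, x)."""
    return np.maximum(0.0, x)

def tanh(x):
    """Hyperbolic tangent."""
    return np.tanh(x)

def tanhexp(x):
    """TanhExp: f(x) = x * tanh(exp(x)).

    The input to exp is clipped to avoid overflow for large x;
    tanh(exp(x)) already saturates at 1 well before the clip matters.
    """
    return x * np.tanh(np.exp(np.minimum(x, 20.0)))

if __name__ == "__main__":
    xs = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print("relu   ", relu(xs))
    print("tanh   ", tanh(xs))
    print("tanhexp", tanhexp(xs))
```

Like ReLU, TanhExp is approximately linear for large positive inputs, but it is smooth everywhere and non-monotonic near zero, which is the behavior the compared baselines differ on.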

Published
2020-11-01
Section
Articles