The hyperbolic tangent (tanh) function is a slight variation of the sigmoid function that is zero-centered. Mathematically, it can be represented as follows:
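$$\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}} = 2\,\sigma(2x) - 1$$

where $\sigma(x) = \frac{1}{1 + e^{-x}}$ is the sigmoid function; the identity $\tanh(x) = 2\,\sigma(2x) - 1$ makes the relationship to the sigmoid explicit.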
The output of the tanh function lies in the range (-1, 1), that is, -1 < output < 1, and is centered at zero. Because its outputs are zero-centered, optimization is easier, and this activation function is often preferred over the sigmoid function. However, the tanh function also suffers from the vanishing gradient problem, similar to the sigmoid function: for inputs of large magnitude the function saturates near -1 or 1, and its gradient approaches zero. To overcome this limitation, the Rectified Linear Unit (ReLU) activation function is used.
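As a minimal sketch, the following Python snippet (assuming NumPy is available) evaluates tanh and its derivative, $1 - \tanh^2(x)$, at a few sample inputs. It illustrates both properties discussed above: the outputs stay within (-1, 1) and are centered at zero, while the gradient shrinks toward zero for large |x|, which is the cause of the vanishing gradient problem.

```python
import numpy as np

def tanh(x):
    """Hyperbolic tangent: (e^x - e^-x) / (e^x + e^-x)."""
    return np.tanh(x)

def tanh_grad(x):
    """Derivative of tanh: 1 - tanh(x)^2."""
    return 1.0 - np.tanh(x) ** 2

xs = np.array([-10.0, -2.0, 0.0, 2.0, 10.0])
print("x        :", xs)
print("tanh(x)  :", np.round(tanh(xs), 4))       # outputs lie in (-1, 1), centered at 0
print("gradient :", np.round(tanh_grad(xs), 4))  # near 0 for large |x|: vanishing gradient
```

At x = 0 the gradient is 1, its maximum, while at x = ±10 it is effectively zero, so weight updates flowing through saturated tanh units become negligible during backpropagation.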