TanH

To overcome the problems of the sigmoid function, we will introduce an activation function called the hyperbolic tangent function (TanH). The equation of TanH is given in Figure 9.34:

Figure 9.34: Tanh activation function equation (Image credit: https://cdn-images-1.medium.com/max/800/1*HJhu8BO7KxkjqRRMSaz0Gw.png)
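Since the equation in the figure is an image, the standard form of the hyperbolic tangent is reproduced here for reference; the last identity relates it to the sigmoid function σ from the previous section:

\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}} = \frac{2}{1 + e^{-2x}} - 1 = 2\,\sigma(2x) - 1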

This function squashes its input into the range [-1, 1], so its output is zero-centered, which makes optimization easier for us. However, this function also suffers from the vanishing gradient problem, so we need to look at other activation functions.
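A minimal NumPy sketch (not taken from the book's code) illustrates both properties: the outputs are centered around zero, but the derivative 1 - tanh(x)^2 shrinks towards zero for large |x|, which is where the vanishing gradient comes from.

```python
import numpy as np

def tanh(x):
    # Hyperbolic tangent activation: squashes input into (-1, 1)
    return np.tanh(x)

def tanh_derivative(x):
    # d/dx tanh(x) = 1 - tanh(x)^2; backpropagation multiplies by this value
    return 1.0 - np.tanh(x) ** 2

x = np.array([-10.0, -2.0, 0.0, 2.0, 10.0])
print(tanh(x))             # outputs are zero-centered, within (-1, 1)
print(tanh_derivative(x))  # close to 0 for large |x| -> vanishing gradients
```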
