ReLU and its variants

The Rectified Linear Unit (ReLU) is the most widely used activation function in industry. Its equation is shown in Figure 9.35:

Figure 9.35: ReLU activation function equation (Image credit: https://cdn-images-1.medium.com/max/800/1*JtJaS_wPTCshSvAFlCu_Wg.png)

If you look at the mathematical equation of ReLU, you will see that it is simply max(0, x): the output is zero when x is less than zero, and linear with a slope of 1 when x is greater than or equal to zero. Krizhevsky et al., in their paper on image classification, reported roughly six times faster convergence when using ReLU as the activation function. ...
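As a quick illustration (a minimal sketch, not code from this book), the max(0, x) definition translates directly into one line of NumPy:

```python
import numpy as np

def relu(x):
    # ReLU: element-wise max(0, x) -- zero for negative inputs,
    # identity (slope 1) for non-negative inputs
    return np.maximum(0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # negative values are clipped to 0; the rest pass through
```

Because the function is linear for positive inputs, its gradient there is a constant 1, which is one reason training tends to converge faster than with saturating functions such as sigmoid or tanh.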
