Rectified linear unit

The Rectified Linear Unit (ReLU) has been the most widely used activation function since around 2015. It is a simple thresholding operation that is cheap to compute and does not saturate for positive inputs, which gives it advantages over the sigmoid and tanh functions. The function is defined by the following formula:

f(x) = max(0, x)

The following figure shows the ReLU activation function:

The output ranges from 0 to infinity. ReLU is used extensively in deep neural networks for computer vision and speech recognition. There are various other activation functions as well, but we have covered the most important ones here.
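As a minimal sketch (the layer sizes and input shape below are illustrative assumptions, not taken from the text), ReLU can be computed directly with NumPy or applied through Keras's built-in activation:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# ReLU computed element-wise: f(x) = max(0, x)
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(np.maximum(0, x))  # [0.  0.  0.  1.5 3. ]

# ReLU applied as the activation of a hidden layer in a Keras model
model = Sequential([
    Dense(64, activation='relu', input_shape=(10,)),  # hidden layer with ReLU
    Dense(1)                                          # linear output layer
])
model.summary()
```

Negative inputs are mapped to zero, while positive inputs pass through unchanged, which is exactly the behavior described by the formula above.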
