ReLU Activation Function [with Python code]


The rectified linear unit (ReLU) is a piecewise linear activation function: if the input x is positive, the output is x; otherwise, the output is zero.

The mathematical representation of the ReLU function is f(x) = max(0, x).

The coding logic for the ReLU function is simple,

if input_value > 0:
    return input_value
else:
    return 0

A simple Python function to mimic the ReLU function is as follows,

import numpy as np

def ReLU(x):
    data = [max(0, value) for value in x]
    return np.array(data, dtype=float)
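
As a quick check (this example is not part of the original post), the function can be applied to a small sample array:

sample = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(ReLU(sample))  # negative inputs become 0, positives pass through: [0. 0. 0. 1.5 3.]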

The derivative of ReLU is 1 for inputs greater than zero and 0 otherwise (the derivative is undefined at exactly zero and is conventionally set to 0).

A simple Python function to mimic the derivative of the ReLU function is as follows,

def der_ReLU(x):
    data = [1 if value > 0 else 0 for value in x]
    return np.array(data, dtype=float)
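
Applying it to the same sample values (again, an illustrative check not in the original post) shows a gradient of 1 for positive inputs and 0 elsewhere:

print(der_ReLU(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0. 0. 0. 1. 1.]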

ReLU is widely used nowadays, but it has a drawback: for any input less than zero it outputs zero, so the gradient is also zero and the affected neurons stop updating during backpropagation. This problem is commonly known as the dying ReLU problem. To get around it, an improved variant called Leaky ReLU is often used.
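
The original post does not include Leaky ReLU code, but a minimal sketch in the same style might look like the following; the slope value 0.01 is a commonly used default and is an assumption here:

import numpy as np

def Leaky_ReLU(x, alpha=0.01):
    # positive values pass through; negative values are scaled by a
    # small slope so the gradient never becomes exactly zero
    data = [value if value > 0 else alpha * value for value in x]
    return np.array(data, dtype=float)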

Python Code

import numpy as np
import matplotlib.pyplot as plt

# Rectified Linear Unit (ReLU)
def ReLU(x):
    data = [max(0, value) for value in x]
    return np.array(data, dtype=float)

# Derivative of ReLU
def der_ReLU(x):
    data = [1 if value > 0 else 0 for value in x]
    return np.array(data, dtype=float)

# Generating data for the graph
x_data = np.linspace(-10, 10, 100)
y_data = ReLU(x_data)
dy_data = der_ReLU(x_data)

# Graph
plt.plot(x_data, y_data, x_data, dy_data)
plt.title('ReLU Activation Function & Derivative')
plt.legend(['ReLU', 'der_ReLU'])
plt.grid()
plt.show()
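
As a side note, the list comprehensions above can be replaced with NumPy's vectorized operations; this is an alternative sketch, not the code from the original post:

import numpy as np

def ReLU_vec(x):
    # element-wise maximum against 0
    return np.maximum(0.0, x)

def der_ReLU_vec(x):
    # boolean mask converted to 0.0 / 1.0
    return (x > 0).astype(float)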

To read more about activation functions, visit vidyasheela.

Originally published at vidyasheela (http://vidyasheela.com).
