ReLU Activation Function [with python code]
The rectified linear activation function (ReLU) is a piecewise linear function that, if the input is a positive value x, outputs x…
Oct 23, 2021
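A minimal Python sketch of the piecewise rule described in this teaser (the relu function name and the use of NumPy are illustrative assumptions, not code taken from the linked article):

import numpy as np

def relu(x):
    # ReLU: pass positive inputs through unchanged, clamp everything else to 0
    return np.maximum(0, x)

# Negative values map to 0, positive values are returned as-is
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]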
How to make a responsive contact us page template using HTML and CSS?
by keshav
Oct 22, 2021
Basic concepts of the K-Nearest Neighbour (KNN) Algorithm
It is probably one of the simplest yet strongest supervised learning algorithms, used for both classification and regression purposes. It is…
Oct 1, 2021
Activation Function in Deep Learning [python code included]
Deep learning needs lots of data to perform efficiently. Today the internet provides tons of data, but the issue with it is that there is no…
Sep 25, 2021