
Types of Activation Functions

Writer: Shreyansh Verma

A plain model is linear, so it can only learn straight-line relationships. But to understand many patterns we need non-linearity. Activation functions add this non-linearity to the model.

Non-linearity

In deep learning, we usually use 3 types of activation functions.


ReLU

In the ReLU (Rectified Linear Unit) function, if the input is smaller than 0, the function outputs 0.

If the input is larger than 0, the function outputs the input itself.

f(x) = max(0, x)
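The rule above can be sketched in a few lines of plain Python (a minimal sketch; the function name relu is our own choice):

```python
def relu(x):
    # ReLU: output 0 for negative inputs, the input itself otherwise
    return max(0.0, x)

print(relu(-3.0))  # 0.0 (negative input is clipped to zero)
print(relu(2.5))   # 2.5 (positive input passes through unchanged)
```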

Sigmoid

In the Sigmoid function, the output is limited to the range 0 to 1.

f(x) = 1 / (1 + e^(-x))
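A minimal sketch of the same formula in Python (the function name sigmoid is our own choice):

```python
import math

def sigmoid(x):
    # Sigmoid squashes any real input into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0.0))   # 0.5 (the midpoint of the range)
print(sigmoid(10.0))  # close to 1 for large positive inputs
print(sigmoid(-10.0)) # close to 0 for large negative inputs
```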


Softmax

In the Softmax function, each output is limited to 0 to 1, and all the outputs together sum to 1, so they can be read as probabilities over the classes. (It is the Tanh function, not Softmax, whose output is limited to -1 to +1.)
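A minimal sketch in Python (the function name softmax is our own choice; subtracting the maximum is a standard trick to avoid overflow and does not change the result):

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability, exponentiate, then normalize
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

out = softmax([1.0, 2.0, 3.0])
print(out)       # three values between 0 and 1
print(sum(out))  # the values sum to 1.0
```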






