Shreyansh Verma

Types of Activation Functions

A model built only from linear layers is itself linear, no matter how many layers it has. But to learn a real-world pattern we often need non-linearity. Activation functions are what add non-linearity to the model.

Non-linearity

In deep learning, three activation functions are used most often.


ReLU

In the ReLU (Rectified Linear Unit) function, if the input is smaller than 0, the function outputs 0.

If the input is larger than 0, the function outputs the input unchanged.

f(x) = max(0, x)
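As a quick sketch of the definition above (using NumPy; the function name `relu` is ours, not from any library):

```python
import numpy as np

def relu(x):
    # Negative inputs become 0; positive inputs pass through unchanged.
    return np.maximum(0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # negative entries are clipped to 0
```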

Sigmoid

In the Sigmoid function, the output is limited to the range 0 to 1: f(x) = 1 / (1 + e^(-x)).
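A minimal NumPy sketch of the same formula (again, `sigmoid` is our own helper name):

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(0.0))    # 0.5 (the midpoint of the range)
print(sigmoid(10.0))   # close to 1
print(sigmoid(-10.0))  # close to 0
```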


Softmax

In the Softmax function, each output is limited to 0 to 1, and all the outputs sum to 1, so they can be read as a probability distribution over classes. (The activation whose output ranges from -1 to +1 is tanh, not softmax.)
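A short sketch of softmax in NumPy (the max-subtraction trick is a standard numerical-stability detail, not something specific to this post):

```python
import numpy as np

def softmax(x):
    # Subtracting the max before exponentiating avoids overflow
    # without changing the result.
    e = np.exp(x - np.max(x))
    return e / e.sum()

p = softmax(np.array([2.0, 1.0, 0.1]))
print(p)        # each entry lies in (0, 1)
print(p.sum())  # the entries sum to 1 (up to floating point)
```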






