A model built only from linear layers stays linear, no matter how many layers we stack. But to learn a complex pattern we need non-linearity, and activation functions are what add it to the model.
Non-linearity
In deep learning, three of the most commonly used activation functions are ReLU, Sigmoid, and Softmax.
ReLU
In the ReLU function, if the input is smaller than 0, the function outputs 0;
if the input is larger than 0, the function outputs the input itself:
f(x) = max(0, x)
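As a minimal sketch of this formula (using NumPy here, an assumption since the text names no library):

import numpy as np

def relu(x):
    # ReLU: negative inputs become 0, positive inputs pass through unchanged
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.0, 3.0])))  # [0. 0. 0. 1. 3.]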
Sigmoid
In the Sigmoid function, the output is limited to the range 0 to 1:
f(x) = 1 / (1 + e^(-x))
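A minimal sketch in the same assumed NumPy style:

import numpy as np

def sigmoid(x):
    # Sigmoid: squashes any real input into the open interval (0, 1)
    return 1 / (1 + np.exp(-x))

print(sigmoid(np.array([-5.0, 0.0, 5.0])))  # approx [0.0067 0.5 0.9933]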
Softmax
In the Softmax function, each output is between 0 and 1, and all the outputs together sum to 1, so they can be read as probabilities over the classes:
f(x_i) = e^(x_i) / Σ_j e^(x_j)
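A sketch of this formula, with the standard max-subtraction trick for numerical stability (the trick is an addition; the text above does not mention it):

import numpy as np

def softmax(x):
    # Subtracting the max before exponentiating is a standard stability
    # trick; it leaves the final result unchanged
    exps = np.exp(x - np.max(x))
    # Each output lies in (0, 1) and they all sum to 1
    return exps / np.sum(exps)

print(softmax(np.array([2.0, 1.0, 0.1])))  # approx [0.659 0.242 0.099]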