Activation Functions
Sigmoid: σ(z) = 1 / (1 + e^(−z))
Tanh:    g(z) = tanh(z) = (e^z − e^(−z)) / (e^z + e^(−z))
Note: The tanh activation function generally works better than the sigmoid because its output lies between −1 and +1 with a mean near zero. Keeping the activations centered at 0 makes learning in the next layer easier.
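As a minimal sketch (not part of the original notes), the snippet below evaluates both functions on the same inputs with NumPy, showing that sigmoid outputs cluster around 0.5 while tanh outputs are centered near 0.

```python
# Minimal comparison of sigmoid and tanh on the same inputs,
# illustrating that tanh outputs are zero-centered.
import numpy as np

def sigmoid(z):
    # sigma(z) = 1 / (1 + e^(-z)); outputs lie in (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # g(z) = (e^z - e^(-z)) / (e^z + e^(-z)); outputs lie in (-1, 1)
    return np.tanh(z)

z = np.linspace(-4, 4, 9)
print("sigmoid:", np.round(sigmoid(z), 3))  # mean near 0.5
print("tanh:   ", np.round(tanh(z), 3))     # mean near 0
```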
Non-Linear Activation Functions