Activation Function

An activation function is a mathematical function that introduces non-linearity into neural networks, enabling them to learn complex patterns. Various types of activation functions include Identity, Threshold, ReLU, Sigmoid (with Binary and Bipolar variations), and Hyperbolic Tangent, each serving different purposes in neural network architectures. These functions determine whether a neuron should be activated based on the weighted sum of inputs and bias, facilitating complex decision-making in models.


Activation Function

Definition :
An activation function is a mathematical function applied to
the output of a neuron. It introduces non-linearity into the
model, allowing the network to learn and represent complex
patterns in the data. Without this non-linearity, a neural
network would behave like a linear regression model, no
matter how many layers it has.

The activation function decides whether a neuron should be
activated by computing the weighted sum of its inputs plus a
bias term and transforming the result. Introducing a
non-linearity at the output of each neuron is what lets the
model make complex decisions and predictions.
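As a minimal sketch of this computation (the two-input size, the weights, and the choice of sigmoid below are illustrative assumptions, not values from the text), a neuron computes the weighted sum of its inputs plus a bias and passes the result through an activation function:

```python
import math

def neuron_output(inputs, weights, bias):
    """Weighted sum of inputs plus bias, passed through a sigmoid activation."""
    ysum = sum(w * x for w, x in zip(weights, inputs)) + bias  # weighted sum + bias
    return 1.0 / (1.0 + math.exp(-ysum))  # non-linear activation (sigmoid)

# With zero inputs and zero bias, the sigmoid output is exactly 0.5
neuron_output([0.0, 0.0], [1.0, 1.0], 0.0)  # → 0.5
```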

Types of Activation Functions in ANN :

A) Identity Function :

The identity function is used as the activation function for
the input layer. It is a linear function of the form:

Formula :

 yout = f(x) = x, ∀x
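A one-line sketch of the identity activation in Python:

```python
def identity(x):
    """Identity activation: returns its input unchanged, for all x."""
    return x

identity(2.5)  # → 2.5
```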
B) Threshold/Step Function :

It is a commonly used activation function. As depicted in
the diagram, it gives 1 as output if the input is zero or
positive; if the input is negative, it gives 0 as output.
Expressed mathematically:

Formula :

 yout = f(ysum) = 1 if ysum >= 0; 0 if ysum < 0
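The step rule above can be sketched as:

```python
def step(ysum):
    """Threshold/step activation: 1 if the input is zero or positive, else 0."""
    return 1 if ysum >= 0 else 0

step(0)   # → 1
step(-2)  # → 0
```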

C) ReLU (Rectified Linear Unit) Function :

It is the most popularly used activation function in
convolutional neural networks and deep learning. It is of
the form:

Formula :
 f(x) = x if x >= 0; 0 if x < 0
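A sketch of ReLU; note that this piecewise rule is equivalent to max(0, x):

```python
def relu(x):
    """ReLU activation: passes non-negative inputs through, clamps negatives to 0."""
    return x if x >= 0 else 0

relu(3.5)   # → 3.5
relu(-1.2)  # → 0
```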

D) Sigmoid Function :
It is by far the most commonly used activation function in
neural networks. The need for the sigmoid function stems
from the fact that many learning algorithms require the
activation function to be differentiable, and hence
continuous.
There are two types of sigmoid function:
 Binary Sigmoid Function
 Bipolar Sigmoid Function

Binary Sigmoid Function :

 Formula :
 yout = f(x) = 1/(1 + e^(-kx))

 where k is the steepness (slope) parameter. By
varying the value of k, sigmoid functions with different
slopes can be obtained. It has a range of (0,1). The
slope at the origin is k/4. As the value of k becomes
very large, the sigmoid function approaches a
threshold function.
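The binary sigmoid and its slope at the origin can be checked numerically (the value k = 2 below is an arbitrary choice for illustration):

```python
import math

def binary_sigmoid(x, k=1.0):
    """Binary sigmoid with steepness parameter k; output range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-k * x))

# Central-difference estimate of the slope at the origin; should be close to k/4
k, h = 2.0, 1e-6
slope = (binary_sigmoid(h, k) - binary_sigmoid(-h, k)) / (2 * h)
```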

Bipolar Sigmoid Function :

 Formula :
 yout = f(x) = (1 - e^(-kx))/(1 + e^(-kx))

 The range of values of sigmoid functions can be
varied depending on the application. However, the
range of (-1,+1) is most commonly adopted.
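A sketch of the bipolar sigmoid; one standard identity (not stated in the text, but easy to verify algebraically) is that it equals tanh(kx/2):

```python
import math

def bipolar_sigmoid(x, k=1.0):
    """Bipolar sigmoid with steepness parameter k; output range (-1, +1)."""
    return (1.0 - math.exp(-k * x)) / (1.0 + math.exp(-k * x))
```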

E) Hyperbolic Tangent Function :

It is bipolar in nature. It is a widely adopted activation
function for a special type of neural network known as the
Backpropagation Network. The hyperbolic tangent function
is of the form:

Formula :
 yout = f(x) = (e^x - e^(-x))/(e^x + e^(-x))

This function is similar to the bipolar sigmoid function.
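The formula above can be computed directly from exponentials; in practice Python's math.tanh gives the same result:

```python
import math

def tanh_activation(x):
    """Hyperbolic tangent: (e^x - e^-x) / (e^x + e^-x); output range (-1, 1)."""
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

tanh_activation(0.0)  # → 0.0
```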
