Soft Comp Lab File

This document outlines a practical laboratory file for Soft Computing, detailing various experiments to be conducted using MATLAB. It includes tasks such as creating vectors, plotting functions, implementing perceptrons, designing artificial neural networks, and applying gradient descent. The document serves as a guide for students to explore and implement soft computing techniques in a structured manner.

A Practical Laboratory file for

Soft Computing (UEC823)

Submitted By:
Sharan Gakhar
Roll No.: 102115118
4NC4 - 4O14

Submitted to:
Dr. Gaganpreet Kaur
(Assistant Professor, ECED)

Department of Electronics and Communication Engineering


THAPAR INSTITUTE OF ENGINEERING & TECHNOLOGY, PATIALA, PUNJAB
July - December 2024
LIST OF PRACTICALS

1. Introduction to MATLAB
a) Familiarization with basic matrix operations in MATLAB
b) Create a vector x of all numbers between 31 and 75. Write it using:
• a WHILE loop
• a FOR loop
c) Create a vector x = [2 5 1 6]. Perform three operations:
• add 3 to each element of x
• add 16 only to elements with an odd index
• find the square of each element
d) Plot a sine wave and a cosine wave of 100 points between [0, 2π] in the same plot
window, with the sine wave in red colour with 'o' markers and the cosine wave in blue
colour with '*' markers. Add a grid, legends and axis titles to the plot.
e) Plot the following activation functions:
(i) Sigmoidal function with three different values of slope 'a' (a = 0.5, a = 1 and a = 1.5)
in the same window
(ii) Hyperbolic tangent function
(iii) Identity function
Consider x ∈ [−10, 10] for all functions.

2. Use the perceptron as a hard-limiting classifier. Create a [2, 50] input matrix of
random values and, corresponding to these 50 points, a target vector of binary values.
Plot the points and show the classification using perceptron learning.
3. Design an ANN using patternnet and feedforwardnet for the AND gate and compare
their performance with respect to the error function and training function used.
4. Using an inbuilt sample data file, build an NAR model for time-series prediction of
chaotic data.
5. Implement a gradient descent function.
6. Implement a Kohonen SOM for clustering a set of 100 points with a 2-D output layer
in a gridtop layout, using (i) Euclidean distance and (ii) Manhattan distance. Compare
the performance for the two neighbourhood criteria over 1000 epochs.
7. Design a fuzzy inference system for the sale of an ice-cream shop. The sale depends
on the price of ice cream and the temperature. Price has membership functions low,
medium and high, while temperature has freezing, cool, warm and hot. The output is
good or bad.
8. Design a Sugeno FIS using the GUI tool.
9. Use the GA toolbox (GA Optim tool) for optimizing a problem.
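The activation-function task in Practical 1 can be sketched as follows. This is a minimal illustration only; the range and slope values are taken from the list above, and the figure layout is an assumption.

```matlab
% Sketch for Practical 1(e): activation functions over x in [-10, 10]
x = -10:0.1:10;

% Sigmoid with three slopes in one window
figure; hold on;
for a = [0.5 1 1.5]
    plot(x, 1 ./ (1 + exp(-a * x)));   % sigmoidal function with slope a
end
legend('a = 0.5', 'a = 1', 'a = 1.5');
title('Sigmoidal function'); grid on;

% Hyperbolic tangent and identity functions
figure; plot(x, tanh(x)); title('Hyperbolic tangent'); grid on;
figure; plot(x, x);       title('Identity function'); grid on;
```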
Lab Experiment 4
Using an inbuilt sample data file, build an NAR model for time-series prediction of
chaotic data.

% Example time-series data (an inbuilt chaotic dataset can be
% substituted here)
data = sin(0:0.1:10);

% NAR model parameters
feedbackDelays = 1:2;    % past outputs fed back as inputs
hiddenLayerSize = 10;    % neurons in the hidden layer

% Create the NAR network (closed loop). narnet takes the feedback
% delays, not input delays, as its first argument.
net = narnet(feedbackDelays, hiddenLayerSize, 'closed');

% Prepare the data in the time-shifted form the network expects
[Xs, Xi, Ai, Ts] = preparets(net, {}, {}, con2seq(data));

% Train the network and predict
net = train(net, Xs, Ts, Xi, Ai);
Y = net(Xs, Xi, Ai);
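To judge the quality of the prediction, the network output can be compared against the target series. This sketch continues from the variables above (Ts, Y); the error measure and plot styling are illustrative choices, not part of the original experiment.

```matlab
% Compare NAR prediction with the target series
t = cell2mat(Ts);   % target values as a numeric row vector
y = cell2mat(Y);    % network predictions

figure;
plot(t, 'b-'); hold on;
plot(y, 'r--');
legend('Target', 'NAR prediction');
xlabel('Sample'); ylabel('Value');
title('NAR model: target vs prediction');

mse_err = mean((t - y).^2);   % mean squared prediction error
disp(['MSE = ', num2str(mse_err)]);
```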
Lab Experiment 5
Implement a gradient descent function.
% Example function: f(x) = x^2
f = @(x) x.^2; % Function
df = @(x) 2 * x; % Derivative of f(x)
% Gradient descent parameters
x = 10; % Initial guess
alpha = 0.1; % Learning rate
tolerance = 1e-6;
% Track the history of x for visualization
x_history = x;
f_history = f(x);
% Gradient descent loop
while abs(df(x)) > tolerance
    x = x - alpha * df(x);          % Update x
    x_history = [x_history, x];     % Save x for visualization
    f_history = [f_history, f(x)];  % Save f(x) for visualization
end
disp(['Minimum found at x = ', num2str(x)]);
% Plot the function and gradient descent process
x_vals = -10:0.1:10; % Range of x values for plotting
y_vals = f(x_vals); % Corresponding y values of the function
figure;
plot(x_vals, y_vals, 'b-', 'LineWidth', 1.5); % Plot f(x)
hold on;
scatter(x_history, f_history, 'ro', 'filled'); % Plot gradient descent steps
plot(x_history, f_history, 'r--'); % Connect the gradient descent steps
hold off;
% Annotations
title('Gradient Descent on f(x) = x^2');
xlabel('x');
ylabel('f(x)');
legend('f(x) = x^2', 'Gradient Descent Steps', 'Location', 'Best');
grid on;
Output:
Minimum found at x = 4.3136e-07
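For f(x) = x², the update x ← x − α·2x = (1 − 2α)x means the iterates decay geometrically, x_k = x₀(1 − 2α)^k. A short check (using the same x₀, α and tolerance as above) confirms the loop's result from this closed form:

```matlab
% Closed-form check of the gradient descent above: for f(x) = x^2 the
% update is x <- (1 - 2*alpha)*x, so x_k = x0 * (1 - 2*alpha)^k.
x0 = 10; alpha = 0.1; tolerance = 1e-6;
r = 1 - 2 * alpha;                              % contraction factor (0.8 here)
k = ceil(log(tolerance / (2 * x0)) / log(r));   % first k with |f'(x_k)| <= tolerance
x_k = x0 * r^k;
disp(['After ', num2str(k), ' steps, x = ', num2str(x_k)]);
% Agrees with the loop's printed result, x ≈ 4.3136e-07 (k = 76 steps)
```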
