Soft Comp Lab File
Submitted By:
Sharan Gakhar
Roll No.: 102115118
4NC4 - 4O14
Submitted to:
Dr. Gaganpreet Kaur
(Assistant Professor, ECED)
1. Introduction to MATLAB
a) Familiarization with Basic Matrix Operations in MATLAB
b) Create a vector x of all integers between 31 and 75. Write it using:
(i) a while loop
(ii) a FOR loop
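A minimal MATLAB sketch of both loop versions (variable names are illustrative):

```matlab
% while-loop version
x = [];
k = 31;
while k <= 75
    x = [x, k];        % append k to the vector
    k = k + 1;
end

% for-loop version
x = [];
for k = 31:75
    x = [x, k];
end
```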
c) Create a vector x = [2 5 1 6]. Perform 3 operations:
(i) add 3 to each element of x
(ii) add 16 only to the elements with an odd index
(iii) find the square of each element
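The three operations above can be sketched in MATLAB as follows; y1, y2 and y3 are illustrative names:

```matlab
x = [2 5 1 6];
y1 = x + 3;                       % (i) add 3 to each element
y2 = x;
y2(1:2:end) = y2(1:2:end) + 16;   % (ii) add 16 to odd-indexed elements (1st, 3rd, ...)
y3 = x.^2;                        % (iii) square of each element
```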
d) Plot a sine wave and a cosine wave of 100 points over the interval [0, 2] in the same plot
window, with the sine wave in red colour with ‘o’ markers and the cosine wave in blue
colour with ‘*’ markers. Add a grid, a legend and axis titles to the plot.
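A possible MATLAB sketch; the code assumes the intended interval is [0, 2*pi] rather than [0, 2]:

```matlab
t = linspace(0, 2*pi, 100);             % 100 points; assuming [0, 2*pi] is meant
plot(t, sin(t), 'ro', t, cos(t), 'b*'); % sine: red 'o', cosine: blue '*'
grid on;
legend('sine', 'cosine');
xlabel('t (radians)');
ylabel('Amplitude');
title('Sine and Cosine Waves');
```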
e) Plot the following activation functions:
(i) Sigmoidal function with 3 different values of slope ‘a’ as a=0.5, a=1 and a=1.5 in
same window.
(ii) Hyperbolic tan function
(iii) Identity function
Consider x in [-10, 10] for all functions.
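The three activation functions might be plotted as below; the sigmoid is taken as 1/(1 + exp(-a*x)):

```matlab
x = -10:0.1:10;

% (i) sigmoid for three slopes in one window
figure; hold on;
for a = [0.5 1 1.5]
    plot(x, 1 ./ (1 + exp(-a .* x)));
end
hold off; grid on;
legend('a = 0.5', 'a = 1', 'a = 1.5');
title('Sigmoidal function');

% (ii) hyperbolic tangent
figure; plot(x, tanh(x)); grid on; title('Hyperbolic tangent');

% (iii) identity
figure; plot(x, x); grid on; title('Identity');
```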
2. Use a perceptron as a hard-limiting classifier. Create an input matrix of size [2, 50] of random
values and, corresponding to these 50 points, create a target vector of binary values.
Plot the points and show the classification using perceptron learning.
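A hedged sketch using the Deep Learning Toolbox perceptron; the target rule below (points above the line x1 + x2 = 1) is only an illustrative way to generate binary labels:

```matlab
X = rand(2, 50);                    % 50 random 2-D points
T = double(X(1,:) + X(2,:) > 1);    % illustrative binary targets
net = perceptron;                   % hard-limit transfer function
net = train(net, X, T);
plotpv(X, T);                       % plot the labelled points
plotpc(net.IW{1}, net.b{1});        % overlay the learned decision boundary
```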
3. Design an ANN using patternnet and feedforwardnet for the AND gate and compare their
performance with respect to the error function and training function used.
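A sketch comparing the two network types on the AND gate; divideFcn is set to 'dividetrain' because four samples are too few for a validation split:

```matlab
X = [0 0 1 1; 0 1 0 1];             % AND-gate inputs
T = [0 0 0 1];                      % AND-gate targets
net1 = patternnet(4);               % defaults: trainscg, crossentropy
net1.divideFcn = 'dividetrain';
net1 = train(net1, X, T);
net2 = feedforwardnet(4);           % defaults: trainlm, mse
net2.divideFcn = 'dividetrain';
net2 = train(net2, X, T);
perform(net1, T, net1(X))           % compare the two error measures
perform(net2, T, net2(X))
```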
4. Using the inbuilt sample data file, build an NAR model for time-series prediction of
chaotic data.
5. Implement the gradient descent algorithm.
6. Implement a Kohonen SOM for clustering a set of 100 points, with a 2-D output layer in
gridtop layout, using (i) Euclidean distance (ii) Manhattan distance. Compare the
performance for the two neighbourhood criteria over 1000 epochs.
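A sketch using selforgmap, whose fifth argument selects the distance function ('dist' for Euclidean, 'mandist' for Manhattan); the 5x5 grid size is an assumption:

```matlab
P = rand(2, 100);                                        % 100 random 2-D points
netE = selforgmap([5 5], 100, 3, 'gridtop', 'dist');     % Euclidean distance
netE.trainParam.epochs = 1000;
netE = train(netE, P);
netM = selforgmap([5 5], 100, 3, 'gridtop', 'mandist');  % Manhattan distance
netM.trainParam.epochs = 1000;
netM = train(netM, P);
plotsompos(netE, P);                                     % inspect neuron positions
plotsompos(netM, P);
```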
7. Design a fuzzy inference system for the sales of an ice-cream shop. The sales depend on the
price of the ice cream and the temperature. Price has the membership functions low, medium and
high, while temperature has freezing, cool, warm and hot. The output is either good or
bad.
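A sketch using the programmatic Fuzzy Logic Toolbox API (mamfis, available from R2018b); the universe ranges, membership-function parameters and the two rules are illustrative assumptions:

```matlab
fis = mamfis('Name', 'IceCreamSale');
fis = addInput(fis, [0 10], 'Name', 'price');
fis = addMF(fis, 'price', 'trimf', [0 0 5],   'Name', 'low');
fis = addMF(fis, 'price', 'trimf', [2 5 8],   'Name', 'medium');
fis = addMF(fis, 'price', 'trimf', [5 10 10], 'Name', 'high');
fis = addInput(fis, [-10 45], 'Name', 'temperature');
fis = addMF(fis, 'temperature', 'trapmf', [-10 -10 0 5], 'Name', 'freezing');
fis = addMF(fis, 'temperature', 'trimf',  [0 10 20],     'Name', 'cool');
fis = addMF(fis, 'temperature', 'trimf',  [15 25 35],    'Name', 'warm');
fis = addMF(fis, 'temperature', 'trapmf', [30 40 45 45], 'Name', 'hot');
fis = addOutput(fis, [0 1], 'Name', 'sale');
fis = addMF(fis, 'sale', 'trimf', [0 0 0.5], 'Name', 'bad');
fis = addMF(fis, 'sale', 'trimf', [0.5 1 1], 'Name', 'good');
rules = ["price==low & temperature==hot => sale=good", ...
         "price==high & temperature==freezing => sale=bad"];
fis = addRule(fis, rules);
evalfis(fis, [3 35])                % crisp sale estimate for price 3, temperature 35
```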
8. Design a Sugeno FIS using the GUI tool.
9. Use the GA toolbox (GA Optim tool) for optimizing a problem.
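A minimal sketch of the programmatic route via ga from the Global Optimization Toolbox; the quadratic objective is only a placeholder:

```matlab
fun = @(x) x(1)^2 + x(2)^2;     % placeholder objective to minimize
nvars = 2;                      % number of decision variables
[xbest, fbest] = ga(fun, nvars) % GA search; the same problem can also be
                                % set up interactively in the Optim tool GUI
```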
Lab Experiment 4
Using the inbuilt sample data file, build an NAR model for time-series prediction of
chaotic data.
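The prediction snippet below needs a trained network first; one way to set it up, assuming the simplenar sample dataset that ships with MATLAB, is:

```matlab
T = simplenar_dataset;                        % built-in chaotic time series (cell array)
net = narnet(1:2, 10);                        % 2 feedback delays, 10 hidden neurons
[Xs, Xi, Ai, Ts] = preparets(net, {}, {}, T); % arrange data for the open-loop network
net = train(net, Xs, Ts, Xi, Ai);             % train the NAR model
```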
% Predict
Y = net(Xs, Xi, Ai);
Lab Experiment 5
Implement the gradient descent algorithm.
% Example function: f(x) = x^2
f = @(x) x.^2; % Function
df = @(x) 2 * x; % Derivative of f(x)
% Gradient descent parameters
x = 10; % Initial guess
alpha = 0.1; % Learning rate
tolerance = 1e-6;
% Track the history of x for visualization
x_history = x;
f_history = f(x);
% Gradient descent loop
while abs(df(x)) > tolerance
x = x - alpha * df(x); % Update x
x_history = [x_history, x]; % Save x for visualization
f_history = [f_history, f(x)]; % Save f(x) for visualization
end
disp(['Minimum found at x = ', num2str(x)]);
% Plot the function and gradient descent process
x_vals = -10:0.1:10; % Range of x values for plotting
y_vals = f(x_vals); % Corresponding y values of the function
figure;
plot(x_vals, y_vals, 'b-', 'LineWidth', 1.5); % Plot f(x)
hold on;
scatter(x_history, f_history, 'ro', 'filled'); % Plot gradient descent steps
plot(x_history, f_history, 'r--'); % Connect the gradient descent steps
hold off;
% Annotations
title('Gradient Descent on f(x) = x^2');
xlabel('x');
ylabel('f(x)');
legend('f(x) = x^2', 'Gradient Descent Steps', 'Location', 'Best');
grid on;
Output:
Minimum found at x = 4.3136e-07