
SOFT COMPUTING

UNIT – I

1. The structural constituent of the human brain is known as ------------------

a) Neuron b) Cells c) Chromosomes d) Genes

2. Neural networks are also known as -----------------------

a) Artificial Neural Networks b) Artificial Neural Systems

c) Both A and B d) None of the above

3. Neurons are also known as -----------------

a) Neurodes b) Processing elements c) Nodes d) All the above

4. In the neuron, attached to the soma are long, irregularly shaped filaments called --------------

a) Dendrites b) Axon c) Synapse d) Cerebellum

5. The signum function is defined as -------------------

a) φ(I) = +1 if I > 0; -1 if I <= 0

b) φ(I) = 0

c) φ(I) = +1 if I > 0

d) φ(I) = -1 if I <= 0
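
For illustration, a minimal Python sketch of the signum activation as defined in option (a), evaluated on made-up sample inputs:

    import numpy as np

    def signum(I):
        # Signum (hard-limiting) activation: +1 for I > 0, -1 for I <= 0.
        return np.where(I > 0, 1.0, -1.0)

    print(signum(np.array([-2.0, 0.0, 3.5])))  # [-1. -1.  1.]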

6. To generate the final output, the sum is passed on to a non-linear filter φ called --------------

a) Smash function b) Sum function c) Activation function d) Output function

7. The --------------- function is a continuous function that varies gradually between the asymptotic values 0 and 1, or -1 and +1.

a) Activation function b) Thresholding function c) Signum function d) Sigmoidal function

8. The ----------------- produces negative output values.

a) Hyperbolic tangent function b) Parabolic tangent function

c) Tangent function d) None of the above
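
For illustration, minimal Python sketches of the sigmoidal and hyperbolic tangent activations from questions 7 and 8, evaluated on made-up inputs to show their asymptotic ranges (0 to 1, and -1 to +1):

    import numpy as np

    def sigmoid(I):
        # Binary sigmoid: varies gradually between the asymptotes 0 and 1.
        return 1.0 / (1.0 + np.exp(-I))

    def tanh_activation(I):
        # Hyperbolic tangent: varies between -1 and +1, so it can produce negative outputs.
        return np.tanh(I)

    I = np.array([-5.0, 0.0, 5.0])
    print(sigmoid(I))          # roughly [0.007 0.5   0.993]
    print(tanh_activation(I))  # roughly [-1.    0.    1.  ]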

9. In a --------------------, the links carrying the weights connect every input neuron to the output neuron, but not vice versa.

a) Feed forward network b) Fast forward network c) Fast network d) Forward network

10. A ------------- has no feedback loops.

a) Neural network b) Recurrent network c) Multilayer network d) Feed forward network
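
For illustration, a minimal Python sketch of a single-layer feed-forward pass in the spirit of questions 9 and 10; the inputs and weights are made-up values:

    import numpy as np

    x = np.array([0.5, -1.0, 2.0])     # input neuron values (made up)
    w = np.array([0.2, 0.4, -0.1])     # weights on the forward links (made up)
    I = np.dot(w, x)                   # net input to the output neuron
    output = 1.0 / (1.0 + np.exp(-I))  # sigmoidal activation of the output neuron
    # Signals flow only from the inputs to the output; there is no feedback loop.
    print(I, output)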

11. The learning method in which the target output is not presented to the network is ----------------

a) Supervised learning b) Unsupervised learning

c) Reinforced learning d) Hebbian learning

12. A combination of a number of ADALINEs is called ----------------

a) MULTILINE b) MULTIPLE LINE c) MADALINE d) MANYLINE
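
For illustration, a minimal Python sketch of a single ADALINE trained with the LMS (delta) rule on a made-up bipolar AND problem; a MADALINE combines several such units. The learning rate and epoch count are assumed values:

    import numpy as np

    X = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], dtype=float)  # bipolar inputs
    t = np.array([1, -1, -1, -1], dtype=float)                       # bipolar AND targets

    w = np.zeros(2)
    b = 0.0
    eta = 0.1  # learning rate (assumed)

    for epoch in range(20):
        for x_i, t_i in zip(X, t):
            y_in = np.dot(w, x_i) + b   # ADALINE uses the linear net input
            err = t_i - y_in            # LMS / delta-rule error
            w += eta * err * x_i        # weight update proportional to the error
            b += eta * err

    print(w, b)  # weights and bias after training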

13. Neural network applications include -----------------

a) Pattern Recognition b) Optimization Problem c) Forecasting d) All the above

14. ------------------ is a systematic method for training multilayer artificial neural networks.

a) Back propagation b) Forward propagation c) Speed propagation d) Multilayer propagation
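
For illustration, a minimal Python sketch of back propagation training a small multilayer network on a made-up XOR problem; the hidden-layer size, learning rate and epoch count are assumed values:

    import numpy as np

    rng = np.random.default_rng(0)

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # toy inputs
    T = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)  # input -> hidden weights
    W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)  # hidden -> output weights
    eta = 0.5

    for epoch in range(10000):
        # Forward pass through hidden and output layers
        H = sigmoid(X @ W1 + b1)
        Y = sigmoid(H @ W2 + b2)
        # Backward pass: propagate the output error back through the layers
        dY = (Y - T) * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        # Gradient-descent weight updates
        W2 -= eta * H.T @ dY; b2 -= eta * dY.sum(axis=0)
        W1 -= eta * X.T @ dH; b1 -= eta * dH.sum(axis=0)

    print(np.round(Y, 2))  # typically approaches [[0], [1], [1], [0]]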

15. --------------------- is a computational model

a) Neuron b) Cell c) Perceptron d) Nucleus

16. An intermediate layer is present in the ----------------------

a) Multilayer feedforward perceptron model

b) Multilayer perceptron model

c) Multilayer feedforward model

d) None of the above

17. The linear activation operator equation is ---------------

a) O = gI, g = tan φ

b) O = gI, g = sin φ

c) O = gI, g = cos φ

d) O = gI, g = -tan φ
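
For illustration, a minimal Python sketch of the linear activation operator from option (a), O = gI with gain g = tan φ, using a made-up angle:

    import math

    def linear_activation(I, phi):
        # Linear activation: O = g * I with gain g = tan(phi).
        g = math.tan(phi)
        return g * I

    print(linear_activation(2.0, math.pi / 4))  # g = tan(45 deg) is about 1, so O is about 2.0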

18. --------------- is never assured of finding the global minimum, as in the single-layer delta rule case.

a) Back propagation b) Front propagation c) Propagation d) None of the above

19. The testing of a neural network is known as --------------

a) Inference Engine b) Checking c) Deriving d) None

20. Applications of back propagation include --------------

a) Design of journal bearings b) Classification of soil

c) Hot extrusion of steel d) All the above

21. Reinforced learning is also known as ----------------

a) Output based learning b) Error based learning

c) Back propagation learning d) None

22. --------------------- learning follows the “Winner takes all” strategy.

a) Stochastic learning b) Competitive learning c) Hebbian learning d) Back propagation learning
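
For illustration, a minimal Python sketch of one “Winner takes all” competitive-learning step; the number of competing units, the learning rate and the input stream are all made up:

    import numpy as np

    rng = np.random.default_rng(1)

    W = rng.random((3, 2))   # one weight vector per competing unit (made up)
    eta = 0.2                # learning rate (assumed)

    def competitive_step(x, W, eta):
        # Only the winning (closest) unit's weights move toward the input x.
        winner = np.argmin(np.linalg.norm(W - x, axis=1))
        W[winner] += eta * (x - W[winner])
        return winner

    for x in rng.random((50, 2)):  # stream of random input patterns
        competitive_step(x, W, eta)

    print(W)  # each weight vector has drifted toward inputs it keeps winning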

23. The ------------------ is one of the earliest neural network architectures.

a) Rosenblatt Perceptron b) Rosen Perceptron c) Roshon Perceptron d) None

24. Rosenblatt's Perceptron network has three units: the sensory unit, the association unit, and the --------------

a) Output unit b) Response unit c) Feedback unit d) Result unit

25. ADALINE stands for --------------------------

a)Adaptive Linear Neural Element Network

b)Adaptive Line Neural Network

c)Adapt Line Neural Element Network

d)Adaptive Linear Neural Network
