CSC445: Neural Networks
CHAPTER 00
INTRODUCTION
Don’t accumulate …
Do it yourself …
Medical Applications
Forecasting
Adaptive Filtering
Adaptive Control
Figure 4 Cytoarchitectural map of the cerebral cortex. The different areas are identified by
the thickness of their layers and types of cells within them. Some of the key sensory areas
are as follows: Motor cortex: motor strip, area 4; premotor area, area 6; frontal eye fields,
area 8. Somatosensory cortex: areas 3, 1, and 2. Visual cortex: areas 17, 18, and 19.
Auditory cortex: areas 41 and 42. (From A. Brodal, 1981; with permission of Oxford
University Press.)
1949 Hebb published his book The Organization of Behavior, in which the Hebbian
learning rule was proposed.
1958 Rosenblatt introduced the simple single-layer networks now called Perceptrons.
1969 Minsky and Papert's book Perceptrons demonstrated the limitations of single-layer
perceptrons, and almost the whole field went into hibernation.
1982 Kohonen developed the Self-Organising Maps that now bear his name.
2000s The power of Ensembles of Neural Networks and Support Vector Machines
becomes apparent.
That is, the bias value changes the relation between the induced local field, or
activation potential, v_k and the linear combiner output u_k, as shown in Fig. 6.

Figure 6 Affine transformation produced by the presence of a bias; note that
v_k = b_k at u_k = 0.
v_k = Σ_{j=0}^{m} w_{kj} x_j

and

y_k = φ(v_k)

where x_0 = +1 and w_{k0} = b_k.

Figure 7 Another nonlinear model of a neuron; w_{k0} accounts for the bias b_k.
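The model of Fig. 7 can be sketched in a few lines of Python. This is a minimal illustration, not code from the slides; the logistic activation used here is just one possible choice of φ.

```python
import math

def neuron_output(weights, inputs, bias):
    """Nonlinear neuron model of Fig. 7: the bias b_k is treated as the
    weight w_k0 attached to a fixed extra input x_0 = +1."""
    x = [1.0] + list(inputs)   # prepend x_0 = +1
    w = [bias] + list(weights) # prepend w_k0 = b_k
    # induced local field v_k = sum_j w_kj * x_j, j = 0..m
    v = sum(wj * xj for wj, xj in zip(w, x))
    # example activation: logistic sigmoid (illustrative choice)
    return 1.0 / (1.0 + math.exp(-v))
```

Absorbing the bias into the weight vector this way lets training algorithms treat all parameters uniformly.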
φ(v) = 1 if v ≥ 0
       0 if v < 0

This model is the McCulloch and Pitts neuron model.

Figure 8 (a) Threshold function.
That is, the neuron has an output signal only if its activation potential is
non-negative, a property known as all-or-none.
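The all-or-none behavior can be demonstrated with a short sketch. The weights and bias below are hand-picked for illustration (they implement a two-input AND gate, an example not taken from the slides):

```python
def threshold(v):
    """Threshold (Heaviside) activation of Fig. 8(a)."""
    return 1 if v >= 0 else 0

def mcp_neuron(weights, inputs, bias):
    """McCulloch-Pitts neuron: fires (output 1) only when the
    activation potential v is non-negative (all-or-none)."""
    v = sum(w * x for w, x in zip(weights, inputs)) + bias
    return threshold(v)

# Hand-chosen parameters realizing AND: fires only when both inputs are 1.
print(mcp_neuron([1, 1], [1, 1], -1.5))  # v = 0.5 >= 0, prints 1
print(mcp_neuron([1, 1], [1, 0], -1.5))  # v = -0.5 < 0, prints 0
```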
φ(v) = 1 / (1 + e^{-av})

where a is the slope parameter. This function is differentiable.

Figure 8(b) Sigmoid function for varying slope parameter a.
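A brief sketch of the sigmoid and the effect of its slope parameter (function name and sample values are illustrative):

```python
import math

def sigmoid(v, a=1.0):
    """Logistic sigmoid phi(v) = 1 / (1 + exp(-a*v)); a is the slope parameter."""
    return 1.0 / (1.0 + math.exp(-a * v))

# Larger a makes the transition around v = 0 steeper,
# approaching the threshold function of Fig. 8(a) as a grows.
print(sigmoid(1.0, a=1.0))  # moderately above 0.5
print(sigmoid(1.0, a=5.0))  # much closer to 1
```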
φ(v) =  1 if v > 0
        0 if v = 0
       -1 if v < 0
The hyperbolic tangent function yields an odd sigmoid-type function:

φ(v) = tanh(v)
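The signum function above and the hyperbolic tangent can be sketched together; the oddness property φ(-v) = -φ(v) distinguishes them from the logistic sigmoid (function names here are illustrative):

```python
import math

def signum(v):
    """Piecewise sign function: +1 for v > 0, 0 at v = 0, -1 for v < 0."""
    return 1 if v > 0 else (0 if v == 0 else -1)

def phi(v):
    """Odd sigmoid-type activation: phi(v) = tanh(v)."""
    return math.tanh(v)

# Both are odd functions: phi(-v) == -phi(v), and tanh is also
# differentiable everywhere, unlike the signum.
```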
Self-feedback refers to a situation where the output of a node is fed back into
its own input.
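A minimal sketch of a self-feedback node, assuming a tanh activation and an illustrative feedback weight (neither is specified in the slides):

```python
import math

def self_feedback(inputs, w_self=0.5):
    """Single node whose output is fed back into its own input:
    y(t) = tanh(x(t) + w_self * y(t-1)), starting from y(0) = 0."""
    y = 0.0
    outputs = []
    for x in inputs:
        y = math.tanh(x + w_self * y)  # previous output re-enters the node
        outputs.append(y)
    return outputs

# Even after the external input drops to zero, the fed-back
# output keeps the node's activity nonzero for a while.
print(self_feedback([1.0, 0.0, 0.0]))
```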
Rosenblatt’s Perceptron