Tabu Based Back Propagation Algorithm For Performance Improvement in Communication Channels
ABSTRACT
This paper presents the equalization of communication channels using Artificial Neural Networks (ANNs). A novel method of training the ANNs using a Tabu-based Back Propagation (BP) algorithm is described. The algorithm uses Tabu Search (TS) to improve the performance of the equalizer by escaping from the local minima that occur in the BP algorithm and obtaining a superior global solution. The results show that the proposed algorithm improves the capability of the ANNs in classifying the received data.
The popularity of Tabu search has grown significantly in the past few years as a global search technique. The roots of Tabu search go back to the 1970s; it was presented in its present form by Glover [5, 6]. The technique is well known for solving many combinatorial problems such as the traveling salesman problem, design optimization, and the quadratic assignment problem. In this paper we apply this technique to the more complex problem of training a neural network to work as an equalizer.
Concentration on the simpler real case allows us to highlight the basic principles and concepts. In particular, the case of binary symbols (M = 2) provides a very useful geometric visualization of the equalization process.
The task of the equalizer is to reconstruct the transmitted symbols as accurately as possible based on the noisy channel observations r(k). Equalizers can be classified into two categories, namely the symbol-decision equalizer and the sequence-estimation equalizer. The latter type is rarely used as it is computationally very expensive. Symbol-decision equalizers were initially implemented using Linear Transversal Filters. Later, the advent of ANNs marked the modeling of equalizers which can provide superior performance in terms of Bit Error Rate (BER) compared to FIR modeling.
3. Neural Network Equalizer
An equalizer of order m has m input nodes in its input layer, as shown in the figure, and a single node in its output layer. The signal, received sequentially, is allowed to propagate through the hidden layers up to the node in the output layer [11, 12].
The output y_i^l of each node is the weighted sum of the outputs of all the nodes in the previous layer, passed through the activation function, which here is the hyperbolic tangent function given by
    φ(x) = (1 − e^(−ax)) / (1 + e^(−ax))    (2)
where a represents the slope of the activation function. Mathematically, the forward propagation of the neural network is given by [12]
    v_i^l = Σ_{j=1}^{N_{l−1}} w_{ij}^{l−1} y_j^{l−1}    (3)
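To make Eqs. (2) and (3) concrete, the following sketch implements the activation and the layer-by-layer forward pass in Python; the layer sizes and weight values are illustrative examples, not values taken from the paper.

```python
import math

def phi(x, a=1.0):
    """Activation of Eq. (2): (1 - e^(-a*x)) / (1 + e^(-a*x)),
    which equals tanh(a*x/2); 'a' controls the slope."""
    return (1.0 - math.exp(-a * x)) / (1.0 + math.exp(-a * x))

def forward(weights, inputs, a=1.0):
    """Forward propagation of Eq. (3): for each layer l, node i forms
    v_i = sum_j w[i][j] * y_j over the previous layer's outputs and
    emits y_i = phi(v_i).  `weights` is a list of per-layer matrices
    (rows = nodes in layer l, columns = nodes in layer l-1)."""
    y = list(inputs)
    for W in weights:
        y = [phi(sum(w_ij * y_j for w_ij, y_j in zip(row, y)), a)
             for row in W]
    return y

# An order-3 equalizer: 3 input taps, 2 hidden nodes, 1 output node.
W_hidden = [[0.5, -0.3, 0.1], [0.2, 0.4, -0.6]]
W_out = [[1.0, -1.0]]
out = forward([W_hidden, W_out], [1.0, -1.0, 1.0])
```

The single output value lies in (−1, 1) and is sliced to ±1 to recover the binary symbol.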
… and E(W_b) = E(W_i').
(4) Generate a new solution W_ij' in the neighborhood of W_i' and evaluate E(W_ij').
… the best solution if needed.
(7) If j is less than the maximum number of neighborhood searches, go to step (4). Else finalize W_b as the best solution.
Fig. 3. SNR vs. BER plot for BP and TBBP for H1(z).
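Steps (4)–(7) above can be sketched as a single neighborhood-search round; the move generation, tabu signature, and list length below are illustrative choices, not the paper's exact procedure.

```python
import random

def tabu_search_step(W, E, n_neighbors=10, step=0.05, tabu=None, tabu_len=20):
    """One round of steps (4)-(7): sample candidate weight vectors in
    the neighborhood of W, skip those on the tabu list, and keep the
    best non-tabu candidate.  `E` is the error function to minimise."""
    tabu = tabu if tabu is not None else []
    best_W, best_E = list(W), E(W)
    for _ in range(n_neighbors):                            # step (7): bounded search count
        cand = [w + random.uniform(-step, step) for w in W]  # step (4): neighbor of W
        key = tuple(round(w, 3) for w in cand)               # coarse tabu signature
        if key in tabu:
            continue                                         # move is tabu, skip it
        e = E(cand)
        if e < best_E:                                       # update best solution W_b
            best_W, best_E = cand, e
        tabu.append(key)
        if len(tabu) > tabu_len:
            tabu.pop(0)                                      # bounded tabu tenure
    return best_W, best_E, tabu

# Repeated rounds drive a toy quadratic error toward its minimum.
random.seed(0)
E = lambda w: sum(x * x for x in w)
W, tabu = [1.0, -1.0], []
for _ in range(50):
    W, e, tabu = tabu_search_step(W, E, tabu=tabu)
```

In the paper's setting, W would be the network weight vector and E the mean-squared equalizer error, so each round gives BP a chance to escape a local minimum.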
5. Experimental Results -1
-1.5
In this section we will consider the experimental TBBP
BP
results and compare the performance of the equalizer -2
-2.5
using Microsoft VC++ 6.0. The plots have been taken -3.5
3 4
0.4201z 1.0 z -5
-1
BP 6-1 of the search, algorithm together with the neural
network in improving the performance of the
-1.5
equalizer even with a simple structure.
BER -2
-2.5
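The SNR-vs-BER comparisons above rest on Monte Carlo BER estimation. A minimal sketch for a hard-decision detector over an AWGN channel follows; the `decide`/`slicer` names and the memoryless channel are simplifying assumptions (the paper's experiments include the ISI channel H1(z) before the noise).

```python
import random, math

def ber(decide, snr_db, n_bits=20000, seed=1):
    """Monte Carlo bit-error-rate estimate: BPSK symbols (+1/-1) pass
    through an AWGN channel at the given Eb/N0 in dB, and `decide`
    maps each noisy observation to a symbol estimate."""
    rng = random.Random(seed)
    sigma = math.sqrt(0.5 * 10 ** (-snr_db / 10.0))  # noise std for unit-energy symbols
    errors = 0
    for _ in range(n_bits):
        s = rng.choice((-1.0, 1.0))
        r = s + rng.gauss(0.0, sigma)        # AWGN observation
        if decide(r) != s:
            errors += 1
    return errors / n_bits

# Hard slicer as the simplest possible "equalizer".
slicer = lambda r: 1.0 if r >= 0.0 else -1.0
b0 = ber(slicer, 0)    # BER at 0 dB
b10 = ber(slicer, 10)  # BER at 10 dB, much lower
```

Sweeping `snr_db` and plotting the estimates on a log scale reproduces the shape of an SNR-vs-BER curve such as Fig. 3.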
References
[1] S. Haykin, Neural Networks: A Comprehensive Foundation (2nd Ed., Pearson Education, 2001).
[2] R. P. Lippmann, An Introduction to Computing …