A Filterbank-Based Representation For Classification and Matching of Fingerprints
http://matlab-recognition-code.com
Step                        Corresponding M-function
                            Supercore.m
Crop                        Cropping.m
Sectorization               Whichsector.m
Normalize                   Sector_norm.m
Filter-bank (8 filters)     Gabor2d_sub.m, Sector_norm.m
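As an illustration of the Sectorization step (the role played by Whichsector.m), the sketch below assigns a pixel to a sector index from its distance and angle relative to the core point. This is a hedged Python illustration: the function name, parameter values (band width, number of wedges, radii) and signature are assumptions, not the actual M-function.

```python
import math

def which_sector(x, y, cx, cy, band_width=20, n_arcs=16,
                 inner_radius=12, n_bands=4):
    """Map pixel (x, y) to a sector index around the core (cx, cy):
    concentric bands of width band_width, each split into n_arcs
    angular wedges. Returns -1 for pixels outside the region of interest."""
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)
    if r < inner_radius or r >= inner_radius + n_bands * band_width:
        return -1                                  # outside the tessellation
    band = int((r - inner_radius) // band_width)   # which concentric band
    theta = math.atan2(dy, dx) % (2 * math.pi)     # angle in [0, 2*pi)
    arc = int(theta / (2 * math.pi / n_arcs))      # which angular wedge
    return band * n_arcs + arc
```

Each sector found this way would then be normalized independently (Sector_norm.m) before filtering.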
The pixel-wise orientation field estimation (the M-function is orientation_image_luigiopt.m) is greatly accelerated by reusing previous sum computations: the sum of the elements of a block centered at pixel (I, J) can be used to compute the sum of the block centered at pixel (I, J+1). This works as follows:
Once the sum of the values centered at pixel (I, J) has been computed (the yellow and orange pixels of Figure 1), the sum centered at pixel (I, J+1) is obtained by subtracting the yellow area from the previous sum and adding the green area (see Figure 2); this saves a great deal of computation. In other words:

SUM(I, J)   = yellow + orange
SUM(I, J+1) = SUM(I, J) - yellow + green
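A minimal Python sketch of this incremental update along one image row (the function name and interface are illustrative, not the actual M-code): each new block sum subtracts the column that leaves the window (the "yellow" area) and adds the column that enters it (the "green" area).

```python
import numpy as np

def block_sums_row(image, i, radius):
    """Sums of (2*radius+1)^2 blocks centered at row i, for every valid
    column, using SUM(I, J+1) = SUM(I, J) - leaving_col + entering_col."""
    h, w = image.shape
    sums = np.empty(w - 2 * radius)
    # full sum only for the first valid center (i, radius)
    s = image[i - radius:i + radius + 1, 0:2 * radius + 1].sum()
    sums[0] = s
    for j in range(radius + 1, w - radius):
        left = image[i - radius:i + radius + 1, j - radius - 1].sum()   # "yellow"
        right = image[i - radius:i + radius + 1, j + radius].sum()      # "green"
        s = s - left + right
        sums[j - radius] = s
    return sums
```

Only the first block in each row costs a full summation; every subsequent block costs two column sums instead of a full block sum.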
For more details concerning the implementation of this algorithm please visit
http://www.ece.cmu.edu/~ee551/Old_projects/projects/s99_19/finalreport.html
References
A. K. Jain, S. Prabhakar, and S. Pankanti, "A Filterbank-based Representation
for Classification and Matching of Fingerprints", International Joint Conference on
Neural Networks (IJCNN), pp. 3284-3285, Washington DC, July 10-16, 1999.
http://www.cse.msu.edu/~prabhaka/publications.html
matrix F is above a threshold value, the parameters for image segmentation are re-calculated and the whole process is repeated once more (the complex filtering output does not vary; only the region of interest changes).
Note: relative maxima larger than the threshold value are not considered core points, but they are kept as candidates for the core when a new image is selected for fingerprint matching.
Figure 3. The image is segmented, closed and eroded (binary closing and erosion).
Step 3
Then we perform a fast pixel-wise orientation field computation [2].
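A standard way to compute such a field is the least-squares gradient method, which averages doubled gradient angles per block. The Python sketch below is a hedged illustration of that standard technique, not necessarily the exact formulation used in [2] or in orientation_image_luigiopt.m.

```python
import numpy as np

def orientation_field(image, block=16):
    """Per-block ridge orientation via the least-squares gradient method:
    theta = 0.5 * atan2(2*sum(Gx*Gy), sum(Gx^2 - Gy^2)) + pi/2,
    shifted by pi/2 so theta is the ridge direction, not the gradient."""
    gy, gx = np.gradient(image.astype(float))  # gradients along y and x
    h, w = image.shape
    rows, cols = h // block, w // block
    field = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            sl = (slice(i * block, (i + 1) * block),
                  slice(j * block, (j + 1) * block))
            gxx = np.sum(gx[sl] ** 2 - gy[sl] ** 2)  # sum(Gx^2 - Gy^2)
            gxy = 2.0 * np.sum(gx[sl] * gy[sl])      # 2 * sum(Gx * Gy)
            field[i, j] = 0.5 * np.arctan2(gxy, gxx) + np.pi / 2
    return field
```

The block sums in this estimate are exactly the kind of quantity the incremental summation trick described above accelerates.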
References
[1] S. Chikkerur, C. Wu, and V. Govindaraju, "A Systematic Approach for Feature Extraction in Fingerprint Images", ICBA 2004.
[2] A. R. Rao, "A Taxonomy for Texture Description and Identification", Springer-Verlag.
[3] K. Nilsson and J. Bigun, "Localization of corresponding points in fingerprints by complex filtering", Pattern Recognition Letters 24 (2003), 2135-2144.