Human Face Recognition

Face Recognition with Correlation Filters



The appearance of a face image varies with changes in expression, pose, and illumination. Savvides et al. demonstrated the superior face verification performance of MACE filters in the presence of expression changes. Pose-tolerant FR using correlation filters has not advanced as much; one practical way of achieving it appears to be to design and use multiple correlation filters for multiple (possibly overlapping) sectors of pose angles. In this section, we illustrate the use of correlation filters for FR by focusing on their performance under strong illumination variations. For this purpose, we use the illumination subset of the CMU pose, illumination, and expression (PIE) face database. This subset contains 65 subjects; the 21 images of one subject's face under different illuminations are shown in Fig. 2. All face images have been manually cropped to yield the images shown in Fig. 2.


Fig. 2. Images of one face under different illuminations in the CMU PIE database.

For each subject in this illumination subset, a MACE filter was designed from images numbered 3 (left half of the face in shadow), 7 (frontal illumination), and 16 (right half of the face in shadow), and the resulting correlation filter was tested against all 21 images of each of the 65 subjects in the database. We expect the correlation output to exhibit a sharp, high peak for authentic images and no such peak for impostors. We quantify peak sharpness by the peak-to-sidelobe ratio (PSR), defined as


                                             PSR = (peak - mean) / std                    (6)


where peak is the largest value in the correlation output, and mean and std are the average value and the standard deviation of the correlation outputs in an annular region (of size 20 x 20 for the PIE database images) centered on the peak but excluding the peak region (a 5 x 5 region). The PSR measures the height of the correlation peak relative to the background and is observed to be not very sensitive to the sizes of these regions. One benefit of the PSR definition in (6) is that it is unaffected by constant illumination changes in the input image. For well-designed MACE filters, the PSR should be large for authentic images and small for impostor images.
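As a concrete sketch, the PSR computation in (6) could be implemented as follows; the window sizes match those quoted above for the PIE images, while the function name and array layout are illustrative choices, not the authors' implementation:

```python
import numpy as np

def psr(corr, side=10, peak_excl=2):
    """Peak-to-sidelobe ratio of a 2-D correlation output.

    Sidelobe statistics come from a 20x20 window (side=10) centered on
    the peak, excluding the central 5x5 peak region (peak_excl=2),
    matching the region sizes quoted for the PIE experiments.
    """
    r, c = np.unravel_index(np.argmax(corr), corr.shape)
    peak = corr[r, c]
    r0, c0 = max(r - side, 0), max(c - side, 0)
    win = corr[r0:r + side, c0:c + side].astype(float)
    # mask out the peak region so only the sidelobes remain
    pr, pc = r - r0, c - c0
    win[max(pr - peak_excl, 0):pr + peak_excl + 1,
        max(pc - peak_excl, 0):pc + peak_excl + 1] = np.nan
    sidelobes = win[~np.isnan(win)]
    return (peak - sidelobes.mean()) / sidelobes.std()
```

A correlation plane consisting of low-level noise plus one sharp peak yields a large PSR, while a plane with no dominant peak yields a small one.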

                  As described earlier, a MACE filter for each subject was designed using only images numbered 3, 7, and 16 of that subject.





In Fig. 3, we show two correlation outputs produced by one such correlation filter in response to different input images. Fig. 3 (left) shows the correlation output in response to image 10 from the same subject; it exhibits a sharp peak, and the corresponding PSR value is 40.95. Fig. 3 (right) shows the correlation output in response to the face image of a different subject (i.e., an impostor) in this database; it has no discernible peak and yields a PSR of only 4.77.
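Correlation outputs like those in Fig. 3 are obtained by circularly cross-correlating the test image with the filter in the frequency domain; a minimal sketch, in which the function name and the centering via fftshift are illustrative choices:

```python
import numpy as np

def correlation_plane(test_image, h_freq):
    """Circular cross-correlation of a test image with a filter given in
    the frequency domain; fftshift moves the zero-shift point to the
    center of the output, so a centered face peaks near the center."""
    G = np.fft.fft2(test_image) * np.conj(h_freq)
    return np.fft.fftshift(np.real(np.fft.ifft2(G)))
```

For a matched filter (h_freq equal to the DFT of the target image itself), the output is the autocorrelation, whose maximum sits at the zero-shift point by the Cauchy-Schwarz inequality.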




Fig. 4 shows the PSR values of person 1's correlation filter when tested against all 21 illumination images of that person (solid curve) and when tested against the 21 images of each of the other 64 subjects (dashed curves at the bottom) of this database. The filter yields the highest PSR values for the three images (3, 7, and 16) used for training. Although the PSR values for nontraining authentic images are lower, they are still higher than any impostor PSR value, indicating that this filter can discriminate person 1 from the 64 others in the CMU PIE database. The filters designed for the other 64 subjects discriminated equally well, in that no verification errors were observed [22] from MACE filters designed from images numbered 3, 7, and 16. In contrast, individual PCA (IPCA) methods trained on the same three images yield an equal error rate of nearly 34%.
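The verification-error comparison can be reproduced in principle by sweeping an acceptance threshold over the authentic and impostor PSR scores; a hedged sketch, in which the function name and the simple exhaustive sweep are illustrative choices:

```python
import numpy as np

def equal_error_rate(authentic_psr, impostor_psr):
    """Sweep a PSR acceptance threshold and return the operating point
    (threshold, FAR, FRR) where the false-accept rate and the
    false-reject rate are closest to equal."""
    auth = np.asarray(authentic_psr, dtype=float)
    imp = np.asarray(impostor_psr, dtype=float)
    best = (np.inf, None)
    for t in np.unique(np.concatenate([auth, imp])):
        frr = np.mean(auth < t)    # authentics rejected at this threshold
        far = np.mean(imp >= t)    # impostors accepted at this threshold
        gap = abs(far - frr)
        if gap < best[0]:
            best = (gap, (float(t), float(far), float(frr)))
    return best[1]
```

When every authentic PSR exceeds every impostor PSR, as reported for the MACE filters above, some threshold separates the two score sets perfectly and both error rates reach zero.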







The contents of this webpage are copyrighted © 2008
 All Rights Reserved.