12/4/98 1
Automatic Target Recognition with Support Vector Machines
Automatic Target Recognition with Support Vector Machines
Qun Zhao, Jose Principe
Computational Neuro-Engineering LaboratoryDepartment of Electrical and Computer Engineering
University of Florida
December 4, 1998
Overview
Introduction to SAR ATR
Four classifiers

Experiment results

Conclusions
1. Introduction
Recognition of vehicles in synthetic aperture radar (SAR) imagery is a difficult problem due to the low resolution of the sensor (1 meter) and the speckle (noise) intrinsic to the image formation.

Another difficulty comes from the operating conditions: vehicles can be placed in high-clutter backgrounds, partially occluded, and NEW vehicles may be found that were not used in the training set.

Training data is always limited. We use here the MSTAR I and II database (Veda).
1. Data Examples

BMP2

BTR70

T72

2S1

D7
2. Four Classifiers

1). Perceptron with hard limiter (perceptron training)

$y_j = \mathrm{sgn}\Big(\sum_i w_{ij} x_i\Big)$

2). Perceptron with sigmoids (delta rule)

$f(x) = \frac{1}{1 + \exp(-x)}, \qquad y_j = f\Big(\sum_i w_{ij} x_i\Big)$
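As a rough illustration, the two perceptron forward passes above might be sketched in NumPy as follows (the names `W` and `x` are illustrative, not from the slides; `W` holds one column of weights per output unit):

```python
import numpy as np

def hard_limiter_output(W, x):
    # Perceptron with hard limiter: y_j = sgn(sum_i w_ij * x_i)
    return np.sign(W.T @ x)

def sigmoid(v):
    # Logistic nonlinearity: f(v) = 1 / (1 + exp(-v))
    return 1.0 / (1.0 + np.exp(-v))

def sigmoid_output(W, x):
    # Perceptron with sigmoid: y_j = f(sum_i w_ij * x_i)
    return sigmoid(W.T @ x)
```

The hard limiter gives binary decisions trained by the perceptron rule, while the sigmoid gives a differentiable output suitable for the delta rule.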
2. Four Classifiers

3). Optimal Separating Hyperplane

$f(x) = \sum_{i \in s.v.} \alpha_i y_i \,(x_i \cdot x) + b$
2. Four Classifiers

4). Support Vector Machine

Training: kernel-Adatron (Frieß, T., Cristianini, N., and Campbell, C., 1998), with a Gaussian kernel.

$\Phi: R^n \rightarrow \text{feature space} \quad (\text{input} \rightarrow \text{feature})$

$K(x_i, x) = \Phi(x_i) \cdot \Phi(x)$

$f(x) = \sum_{i \in s.v.} \alpha_i y_i \, K(x_i, x) + b$
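A minimal sketch of kernel-Adatron training with a Gaussian kernel, under simplifying assumptions not stated on the slides (zero bias, a fixed learning rate, and a fixed number of epochs; all function and parameter names are illustrative):

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # K(x, y) = exp(-||x - y||^2 / (2 sigma^2)), computed for all pairs
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d2 / (2.0 * sigma**2))

def kernel_adatron(X, y, sigma=1.0, eta=0.5, epochs=200):
    # Kernel-Adatron update (Friess, Cristianini & Campbell, 1998), sketched:
    # alpha_i <- max(0, alpha_i + eta * (1 - y_i * sum_j alpha_j y_j K(x_i, x_j)))
    n = len(y)
    K = gaussian_kernel(X, X, sigma)
    alpha = np.zeros(n)
    for _ in range(epochs):
        for i in range(n):
            z = np.sum(alpha * y * K[i])            # current margin of sample i
            alpha[i] = max(0.0, alpha[i] + eta * (1.0 - y[i] * z))
    return alpha

def decision(alpha, y, X_train, x, sigma=1.0, b=0.0):
    # f(x) = sum_{i in s.v.} alpha_i y_i K(x_i, x) + b
    k = gaussian_kernel(X_train, x[None, :], sigma).ravel()
    return np.sum(alpha * y * k) + b
```

Samples whose alpha stays at zero drop out of the sum, so only the support vectors contribute to the decision function.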
3. Experiments
3 Target classes:
T72, BTR70, and BMP2
Pairwise classification
Image sizes 80 x 80. Aspect 0 ~ 180 degrees.
Training: 17 degree depression
Number of Training samples: 406
Testing: 15 degree depression
Number of Testing samples: 724
3. Experiments

1. Classification

Table. Comparison of classification results between classifiers (error rate, %)

Classifier                  BMP2    BTR70   T72     Total
Perceptron (hard limiter)   56.77   32.71   45.93   48.62
Perceptron (sigmoid)        11.94    0       2.28    6.08
Optimal hyperplane           8.71    0       2.93    4.97
SVM                          7.42    0.93    5.86    5.80
3. Experiments - Recognition
We added two more vehicle types to the test set, called confusers.

Confusers: 2S1 and D7

Number of confuser images: 275

This turns the task into a recognition problem. The point PD = 0.9 on the receiver operating characteristic (ROC) is chosen for the comparison: the classifier outputs are thresholded to achieve PD = 0.9.

Performance is now measured by both the error rate and the false alarms.
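One hedged way to pick such a threshold from classifier scores, assuming a higher score means "more target-like" (the function and variable names here are made up for illustration, not taken from the slides):

```python
import numpy as np

def threshold_at_pd(target_scores, pd=0.9):
    # Pick the threshold such that a fraction `pd` of true-target scores
    # lie at or above it; lower-scoring outputs are rejected.
    s = np.sort(np.asarray(target_scores))[::-1]  # descending order
    k = int(np.ceil(pd * len(s))) - 1
    return s[k]

def false_alarm_rate(confuser_scores, threshold):
    # Pfa: fraction of confuser scores that still pass the threshold.
    c = np.asarray(confuser_scores)
    return float(np.mean(c >= threshold))
```

Sweeping `pd` over (0, 1) and plotting the resulting false-alarm rate against it would trace out the ROC curve the slide refers to.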
3. Experiments - Recognition

Table. Comparison of recognition results between classifiers (Pe: error rate, %; Pfa: probability of false alarm, %)

Classifier                  BMP2 Pe   BTR70 Pe   T72 Pe   Total Pe   Confuser Pfa
Perceptron (hard limiter)   39.41     30.84      49.68    42.54      85.48
Perceptron (sigmoid)         1.63      0          5.81     3.18      76.00
Optimal hyperplane           0.65      0          3.23     1.66      61.82
SVM                          0.98      0          3.23     1.80      55.27
4. Conclusion
Classification and recognition are different problems, and the latter is more realistic (and harder).
SVMs with the Gaussian kernel perform better for recognition. The local shape of the Gaussian kernel is very useful and should be utilized (samples that are far away from the class centers tend to have small feature values).
In our problem (large input space) the optimal separating hyperplane performs better for classification.
The kernel-Adatron offers easy and fast training.