
Co-operative Training in Classifier Ensembles
Rozita Dara, PAMI Lab, University of Waterloo



Page 1: Co-operative Training in Classifier Ensembles

Rozita Dara
PAMI Lab
University of Waterloo
IFusion 2004

Page 2: Outline

Introduction
Sharing Training Resources
  Sharing Training Patterns
  Sharing Training Algorithms
  Sharing Training Information
Sharing Training Information: An Algorithm
Experimental Study
Discussion and Conclusions

Page 3: Introduction

Multiple Classifier Systems provide:
  Improved performance
  Better reliability and generalization

Motivations for Multiple Classifier Systems include:
  Empirical observation
  Problems that decompose naturally when various sensors are used
  Avoiding commitments to arbitrary initial conditions or parameters

Page 4: Introduction (cont'd)

"Combining identical classifiers will not lead to improved performance."

Importance of creating diverse classifiers

How does the amount of "sharing" between classifiers affect the performance?

Page 5: Sharing Training Resources

A measure of the degree of co-operation between various classifiers:
  Sharing Training Patterns
  Sharing Training Algorithms
  Sharing Training Information

Page 6: Sharing Training Patterns

Page 7: Sharing Training Algorithms

Page 8: Sharing Training Information

Page 9: Training

Training each component independently:
  Optimizes individual components, but may not lead to overall improvement
  Collinearity, high correlation between classifiers
  Components may be under-trained or over-trained

Page 10: Training (cont'd)

Adaptive training:
  Selective: reduces correlation between components
  Focused: re-training focuses on misclassified patterns
  Efficient: determines the duration of training

Page 11: Adaptive Training: Main Loop

Share training information between members of the ensemble
Incremental learning
Evaluation of training to determine the re-training set

Flow chart: START -> Initialize -> repeat (Train -> Evaluate and compose training) until DONE = TRUE -> END.
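To make the control flow in the chart concrete, here is a minimal Python sketch. All names (`adaptive_training`, `train_step`, `evaluate_and_compose`, the `state` dictionary) are illustrative placeholders, not identifiers from the presentation; the two callbacks are sketched, with slightly fuller signatures, on the next two slides.

```python
def adaptive_training(classifiers, train_sets, eval_set, train_step,
                      evaluate_and_compose, max_rounds=50):
    """Alternate module training and ensemble evaluation until DONE = TRUE."""
    state = {"done": [False] * len(classifiers)}       # per-module DONE_i flags
    for _ in range(max_rounds):                        # incremental learning rounds
        # Train: one pass over every module that is not yet done
        state = train_step(classifiers, train_sets, eval_set, state)
        # Evaluate and compose: share results, rebuild the re-training sets
        done, train_sets = evaluate_and_compose(classifiers, train_sets, state)
        if done:                                       # DONE = TRUE -> END
            break
    return classifiers, state
```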

Page 12: Adaptive Training: Training

Save a classifier if it performs well on the evaluation set
Determine when to terminate training for each module

Flow chart: for each module i (i < k), skipping modules with DONE_i = TRUE: train C_i and evaluate it to obtain CF_i; if CF_i > CF_i_best, save C_i and set CF_i_best = CF_i; if the improvement in CF_i over the previous iteration falls below a threshold, set DONE_i = TRUE; then continue with i = i + 1.
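A hedged sketch of this per-module training step follows. CF_i is read as the module's score on an evaluation set, and `epsilon` stands in for the improvement threshold whose symbol is not legible on the slide; the fit/score interface and all names are illustrative, not the original code.

```python
import copy

def train_step(classifiers, train_sets, eval_set, state, epsilon=1e-3):
    """One training pass per module, saving the best version and flagging convergence."""
    k = len(classifiers)
    state.setdefault("cf_best", [float("-inf")] * k)
    state.setdefault("cf_prev", [float("-inf")] * k)
    state.setdefault("saved", [None] * k)
    for i in range(k):                               # i < k
        if state["done"][i]:                         # DONE_i = TRUE: skip this module
            continue
        X, y = train_sets[i]
        classifiers[i].fit(X, y)                     # Train C_i (one pass; warm-start in practice)
        cf = classifiers[i].score(*eval_set)         # Evaluate C_i -> CF_i
        if cf > state["cf_best"][i]:                 # performs well on the evaluation set
            state["saved"][i] = copy.deepcopy(classifiers[i])   # save C_i
            state["cf_best"][i] = cf                 # CF_i_best = CF_i
        if cf - state["cf_prev"][i] < epsilon:       # improvement below threshold
            state["done"][i] = True                  # DONE_i = TRUE
        state["cf_prev"][i] = cf
    return state
```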

Page 13: Adaptive Training: Evaluation

Train the aggregation modules
Evaluate the training sets for each classifier
Compose new training data

Flow chart: train the aggregation module, evaluate the system on each Train_i (for i < k, skipping modules with DONE_i = TRUE) and select new training data; when every DONE_i is TRUE, set DONE = TRUE.
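A sketch of this evaluation step under the same assumptions: `aggregator` is a hypothetical fusion module with fit/predict methods, and `compose_new_set` is the data-selection rule shown on the next slide; neither name comes from the presentation, and the exact ordering of the steps inside the original flow chart may differ.

```python
import numpy as np

def evaluate_and_compose(classifiers, train_sets, state, aggregator, compose_new_set):
    """Train the aggregation module, evaluate the system on each Train_i,
    and compose new training data for the modules that are not yet done."""
    X_all = np.vstack([X for X, _ in train_sets])
    y_all = np.concatenate([y for _, y in train_sets])
    aggregator.fit(classifiers, X_all, y_all)          # train aggregation module
    new_sets = []
    for i, (X, y) in enumerate(train_sets):            # i < k
        if state["done"][i]:                           # DONE_i = TRUE: keep data as is
            new_sets.append((X, y))
            continue
        y_hat = aggregator.predict(classifiers, X)     # evaluate system on Train_i
        new_sets.append(compose_new_set(X, y, y_hat))  # select new training data
    done = all(state["done"])                          # DONE = TRUE when all modules are done
    return done, new_sets
```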

Page 14: Adaptive Training: Data Selection

New training data are composed by concatenating:
  Error_i: misclassified entries of the training data for classifier i
  Correct_i: a random choice of R*(P*δ_i) correctly classified entries of the training data for classifier i
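The selection rule is simple enough to write down directly. A minimal sketch, assuming numpy arrays and treating R and δ_i as caller-supplied parameters (their definitions and values are not restated on this slide, so the defaults below are purely illustrative):

```python
import numpy as np

def compose_new_set(X, y, y_hat, R=0.5, delta_i=0.1, seed=0):
    """Concatenate Error_i with a random sample of R*(P*delta_i) correct entries."""
    rng = np.random.default_rng(seed)
    wrong = np.flatnonzero(y_hat != y)            # Error_i: misclassified entries
    right = np.flatnonzero(y_hat == y)            # candidates for Correct_i
    P = len(y)                                    # size of the training set
    n_correct = min(len(right), int(R * (P * delta_i)))
    keep = rng.choice(right, size=n_correct, replace=False)
    idx = np.concatenate([wrong, keep])           # Error_i followed by Correct_i
    return X[idx], y[idx]
```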

Page 15: Results

Five one-hidden-layer BP (backpropagation) classifiers
Training used partially disjoint data sets
No optimization is performed on the trained networks
The network parameters are maintained for all the classifiers that are trained
Three data sets:
  20 Class Gaussian
  Satimages
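For the experimental setup, the following sketch builds an ensemble of five one-hidden-layer backpropagation networks on partially disjoint subsets, assuming X and y are numpy arrays. scikit-learn's MLPClassifier stands in for the original BP implementation, and the hidden-layer size, iteration count, and overlap fraction are illustrative choices, not values reported in the presentation.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def build_ensemble(X, y, n_classifiers=5, overlap=0.2, seed=0):
    """Train n one-hidden-layer BP networks on partially disjoint subsets."""
    rng = np.random.default_rng(seed)
    n = len(y)
    cores = np.array_split(rng.permutation(n), n_classifiers)  # disjoint cores
    ensemble, subsets = [], []
    for core in cores:
        # Add a small random overlap so the subsets are only partially disjoint
        extra = rng.choice(n, size=int(overlap * n / n_classifiers), replace=False)
        idx = np.union1d(core, extra)
        clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=300,
                            random_state=seed)
        clf.fit(X[idx], y[idx])
        ensemble.append(clf)
        subsets.append(idx)
    return ensemble, subsets
```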

Page 16: Results (cont'd)

Error rates (mean ± standard deviation) for each combination method, with and without sharing:

                      20 Class Gaussian                 Satimages
                      Without Sharing  With Sharing     Without Sharing  With Sharing
  Majority            14.25 ± 0.51     13.48 ± 0.32     13.42 ± 0.88     13.39 ± 0.90
  Maximum             14.34 ± 0.99     14.00 ± 0.34     14.87 ± 1.90     14.59 ± 1.54
  Average             13.88 ± 0.41     13.23 ± 0.09     13.31 ± 1.01     13.26 ± 0.93
  Nash                13.75 ± 0.43     13.08 ± 0.19     17.36 ± 4.15     16.28 ± 4.44
  Borda               14.00 ± 0.62     13.06 ± 0.25     13.97 ± 1.33     13.90 ± 1.03
  Weighted Average    13.46 ± 0.56     12.84 ± 0.17     13.17 ± 0.94     13.09 ± 0.91
  Bayesian            13.12 ± 0.20     12.66 ± 0.10     13.63 ± 0.80     13.76 ± 1.03
  Choquet Integral    14.39 ± 0.97     14.12 ± 0.30     14.85 ± 2.07     14.57 ± 1.43
  Best Classifier     16.20 ± 4.03     15.57 ± 3.27     16.52 ± 2.80     17.13 ± 1.03
  Oracle               3.74 ± 0.32      3.88 ± 0.31      5.08 ± 0.13      5.41 ± 0.23

Page 17: Conclusions

Exchange of information during training would allow for a more informed fusion process
Enhances diversity amongst classifiers
Algorithms that share training information can improve overall classification accuracy

Page 18: Conclusions (cont'd)

Page 19: References