

Page 1

A Rate-Distortion One-Class Model and its Applications to Clustering

Koby Crammer, Partha Pratim Talukdar, Fernando Pereira¹

University of Pennsylvania

¹Currently at Google, Inc.

A Rate-Distortion One-Class Model and its Applications to Clustering (Crammer et al.)

Page 2

One Class Prediction

• Problem statement
  • Predict a coherent superset of a small set of positive instances.

• Applications
  • Document retrieval
  • Information extraction
  • Gene expression

• Prefer high precision over high recall.

Page 3

Previous Approaches

(Ester et al. 1996): Density-based non-exhaustive clustering algorithm. Unfortunately, density analysis is hard in high dimensions.

(Tax & Duin 1999): Find a small ball that contains as many of the seed examples as possible. Most of the points are considered relevant; a few outliers are dropped.

(Crammer & Chechik 2004): Identify a small subset of relevant examples, leaving out most less-relevant ones.

(Gupta & Ghosh 2006): A modified version of (Crammer & Chechik 2004).

Page 4

Our Approach: A Rate-Distortion One-Class Model

• Express the one-class problem as lossy coding of each instance into instance-dependent codewords (clusters).

• In contrast to previous methods, use more codewords than instances.

• Regularization via sparse coding: each instance has to be assigned to one of only two codewords.

Page 5

Coding Scheme

[Diagram: each point i (i = 1, ..., 5) can be coded either by its own codeword cw i or by the shared joint codeword cw 0.]

• Instances can be coded as themselves, or as a shared codeword (“0”) represented by the vector w.

Page 6

Notation

[Diagram: point x is coded by the joint codeword 0 with probability q(0|x), or by its own codeword with probability q(x|x).]

p(x)       Prior on point x.
q(0|x)     Probability of x being encoded by the joint code (“0”).
q(x|x)     Probability of self-coding point x.
v_x        Vector representation of point x.
w          Centroid vector of the single class.
D(v_x‖w)   Cost (distortion) suffered when point x is assigned to the one class whose centroid is w.

Page 7

Rate & Distortion Tradeoff

[Diagram: the two extremes of the coding scheme. Left, “All in one”: every point is coded by the joint codeword cw 0, giving high compression (low rate) and high distortion. Right, “All alone”: every point i is coded by its own codeword cw i, giving low compression (high rate) and low distortion.]

Page 8

Rate-Distortion Optimization

Random variables:
• X: instance to be coded;
• T: code for an instance, either T = 0 (shared codeword) or T = x > 0 (instance-specific codeword).

Rate: Amount of compression from the source X to the code T, measured by the mutual information I(T; X).

Distortion: How well, on average, the centroid w serves as a proxy for the instances v_x.

Objective (β > 0 tradeoff parameter):

min_{w, {q(0|x)}}  Rate + β × Distortion
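Spelling the objective out with the quantities defined above, under the convention (consistent with the centroid equation on the self-consistent-equations slide) that only points coded by the shared codeword incur distortion, the tradeoff reads:

```latex
\min_{w,\,\{q(0|x)\}} \; I(T;X) \;+\; \beta \sum_{x} p(x)\, q(0|x)\, D(v_x \,\|\, w)
```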

Page 9

Self-Consistent Equations

Solving the rate-distortion optimization in the one-class setting, we get the following three self-consistent equations, as in the Information Bottleneck (IB).

q(0)   = Σ_x p(x) q(0|x)                            (1)

q(0|x) = min{ q(0) e^{−β D(v_x‖w)} / p(x) , 1 }     (2)

w      = Σ_x q(x|0) v_x                             (3)

Page 10

One Class Rate Distortion Algorithm (OCRD)

We optimize the rate-distortion tradeoff following the Blahut-Arimoto and Information Bottleneck (IB) algorithms, alternating between the following two steps:

1. Compute the centroid location w as the weighted average of instances v_x with weights proportional to q(0|x)p(x):

   w = Σ_x q(x|0) v_x

2. Fix w and optimize for the coding policy q(0|x), q(0).
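The alternation can be sketched as a fixed-point iteration. This is a minimal illustration, not the authors' implementation: squared Euclidean distance stands in for the generic distortion D(v_x‖w), the prior defaults to uniform, and the soft update of equation (2) is applied directly rather than the exact thresholding procedure described on the next slide.

```python
import numpy as np

def ocrd(V, beta, p=None, n_iter=200):
    """One-Class Rate-Distortion sketch: alternate the centroid update
    and the coding-policy update from the self-consistent equations.
    V: (m, d) array of instance vectors v_x; p: prior over instances
    (uniform if None). Distortion is squared Euclidean (an assumption;
    the slides leave D(v_x || w) generic)."""
    m, _ = V.shape
    p = np.full(m, 1.0 / m) if p is None else p
    q0_given_x = np.full(m, 0.5)          # initial coding policy q(0|x)
    for _ in range(n_iter):
        # Step 1: centroid as weighted average, weights prop. to q(0|x) p(x)
        weights = q0_given_x * p
        w = weights @ V / weights.sum()   # w = sum_x q(x|0) v_x
        # Step 2: update the coding policy given w
        d = np.sum((V - w) ** 2, axis=1)  # distortion D(v_x || w)
        q0 = np.dot(p, q0_given_x)        # q(0) = sum_x p(x) q(0|x)
        q0_given_x = np.minimum(q0 * np.exp(-beta * d) / p, 1.0)
    return w, q0_given_x
```

With a small β (high compression pressure is weak), nearby points end up fully assigned to the class (q(0|x) = 1) while a far outlier is largely self-coded.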

Page 11

Step 2: Finding a Coding Policy

Let C = {x : q(0|x) = 1} be the set of points assigned to the one class.

Lemma. Let s(x) = β d_x + log p(x), where d_x = D(v_x‖w). Then there is a threshold θ such that x ∈ C if and only if s(x) < θ.

The lemma allows us to develop a deterministic algorithm that solves for q(0|x), x = 1, ..., m, simultaneously in time O(m log m).
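One way to realize the O(m log m) claim, reconstructed from equation (2) rather than taken from the slides: sort points by s(x), then scan candidate class sizes, using the fact that for a fixed class C the shared-codeword mass satisfies q(0) = Σ_{x∈C} p(x) / (1 − Σ_{x∉C} e^{−β d_x}).

```python
import numpy as np

def coding_policy(d, p, beta):
    """Given fixed distortions d_x = D(v_x || w) and prior p, solve for
    q(0|x) exactly. Per the lemma, q(0|x) = 1 iff
    s(x) = beta*d_x + log p(x) falls below theta = log q(0).
    One sort plus a linear scan: O(m log m). Reconstructed sketch;
    the slides do not spell out the scan."""
    s = beta * d + np.log(p)
    order = np.argsort(s)
    e = np.exp(-beta * d)
    for k in range(len(d), -1, -1):       # k = |C|; prefer the largest class
        C, rest = order[:k], order[k:]
        denom = 1.0 - e[rest].sum()
        if denom <= 0:
            continue
        q0 = p[C].sum() / denom           # fixed point of equation (1)+(2)
        theta = np.log(q0) if q0 > 0 else -np.inf
        # consistency: all of C below theta, everyone else above it
        ok_in = k == 0 or s[order[k - 1]] <= theta
        ok_out = k == len(d) or s[order[k]] > theta
        if ok_in and ok_out:
            q = np.minimum(q0 * e / p, 1.0)
            return q0, q
    return 0.0, np.zeros_like(d)
```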

Page 12

Phase Transitions in the Optimal Solution

[3-D plot: q(0|x) as a function of the temperature 1/β and the instance id x, showing phase transitions in the optimal solution.]

Page 13

Multiclass Extension

Page 14

Multiclass Coding Scheme

• We have m points and k centroids. The natural extension doesn’t work because 1 − q(x|x) does not specify which centroid x should be assigned to.

• Our multiclass coding scheme:

[Diagram: each point i can self-code via its own codeword cw i, or be coded through the shared codeword 0, which routes to one of the k cluster codewords cw k1, cw k2, ....]


Page 16

Multiclass Rate-Distortion Algorithm (MCRD)

MCRD alternates between the following two steps:

1. Use the OCRD algorithm to decide whether or not to self-code each point.

2. Use a hard clustering algorithm (sIB) to cluster the points that were not self-coded in the first step. Then iterate.
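A rough sketch of this alternation, for illustration only: the paper uses sIB for the clustering step, while this sketch substitutes squared-Euclidean k-means-style updates, and the self-coding decision applies equation (2) with the distortion to the nearest centroid. Names and the optional `W0` initializer are mine.

```python
import numpy as np

def mcrd(V, k, beta, W0=None, n_rounds=30, seed=0):
    """Multiclass rate-distortion (MCRD) sketch. Alternates:
    (1) an OCRD-style coding decision: keep point x (code it via a
        centroid) iff q(0) * exp(-beta * d_x) / p(x) >= 1, where d_x is
        the distortion to its nearest centroid; otherwise self-code x;
    (2) hard clustering of the kept points (k-means-style stand-in for
        the sIB step used in the paper)."""
    rng = np.random.default_rng(seed)
    m, _ = V.shape
    p = np.full(m, 1.0 / m)                   # uniform prior on points
    W = np.array(W0, dtype=float) if W0 is not None \
        else V[rng.choice(m, size=k, replace=False)].astype(float)
    q0 = 1.0                                  # mass of non-self-coded points
    assign = np.zeros(m, dtype=int)
    for _ in range(n_rounds):
        d = ((V[:, None, :] - W[None, :, :]) ** 2).sum(-1)
        assign, d_min = d.argmin(1), d.min(1)
        keep = q0 * np.exp(-beta * d_min) / p >= 1.0   # step 1
        q0 = p[keep].sum()
        for j in range(k):                    # step 2: re-estimate centroids
            members = keep & (assign == j)
            if members.any():
                W[j] = V[members].mean(0)
    return W, assign, keep
```

On two tight clusters plus a distant outlier, the outlier is self-coded (left out) while the clustered points keep their centroids, which is the behavior the slides describe.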


Page 18

Experimental Results

1 One Class Document Classification.

2 Multiclass Clustering of synthetic data.

3 Multiclass Clustering of real-world data.

Page 19

One Class Document Classification

[PR plots for two categories of the Reuters-21578 data set, crude (#578) and acq (#2369), comparing OCRD-BA with two previously proposed methods, OC-IB and OC-Convex. During training, each of the algorithms searched for a meaningful subset of the training data and generated a centroid. The centroid was then used to label the test data, and to compute recall and precision.]

Page 20

Multiclass: Synthetic Data Clustering

[Scatter plots of clusterings produced by MCRD on a synthetic data set for two values of β with k = 5: β = 100 (585/900 points coded; cluster sizes 45, 132, 132, 135, 141) and β = 140 (490/900 coded; cluster sizes 0, 119, 121, 122, 128). There were 900 points, 400 sampled from four Gaussian distributions and 500 from a uniform distribution. Self-coded points are marked by black dots, coded points by colored dots, and cluster centroids by bold circles.]

Page 21

Multiclass: Unsupervised Document Clustering

[PR plots for sIB and MCRD (β = 1.6) on the Multi5 1 data set (2000-word vocabulary; 500 points, 5 clusters). These plots show that better clustering can be obtained if the algorithm is allowed to selectively leave out data points (through self-coding).]

Page 22

Conclusion

• We have cast the problem of identifying a small coherent subset of data as an optimization problem that trades off class size (compression) against accuracy (distortion).

• We also show that our method allows us to move from one-class to standard clustering, but with background noise left out (the ability to “give up” some points).

• Extend to more general instance spaces and distortions: graphs, manifolds.



Page 25

Thanks!
