Something for almost nothing: Advances in sublinear time algorithms
Ronitt Rubinfeld, MIT and Tel Aviv U.


Page 1: Something for almost nothing:  Advances in  sublinear  time algorithms

Something for almost nothing: Advances in sublinear time algorithms

Ronitt Rubinfeld, MIT and Tel Aviv U.

Page 2: Something for almost nothing:  Advances in  sublinear  time algorithms

Algorithms for REALLY big data

Page 3: Something for almost nothing:  Advances in  sublinear  time algorithms

No time

What can we hope to do without viewing most of the data?

Page 4: Something for almost nothing:  Advances in  sublinear  time algorithms

Small world phenomenon

• The social network is a graph:
  – a “node” is a person
  – an “edge” connects people who know each other
• “6 degrees of separation”
• Are all pairs of people connected by a path of distance at most 6?

Page 5: Something for almost nothing:  Advances in  sublinear  time algorithms

Vast data

• Impossible to access all of it

• Accessible data is too enormous to be viewed by a single individual

• Once accessed, data can change

Page 6: Something for almost nothing:  Advances in  sublinear  time algorithms

The Gold Standard

• Linear time algorithms:
  – for inputs encoded by n bits/words, allow cn time steps (for a constant c)
• Inadequate…

Page 7: Something for almost nothing:  Advances in  sublinear  time algorithms

What can we hope to do without viewing most of the data?

• Can’t answer “for all” or “exactly” type statements:
  – are all individuals connected by at most 6 degrees of separation?
  – exactly how many individuals on earth are left-handed?
• Compromise?
  – is there a large group of individuals connected by at most 6 degrees of separation?
  – approximately how many individuals on earth are left-handed? (see the sampling sketch below)
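To make the second compromise concrete, here is a minimal sampling sketch (not from the talk): estimating a population fraction to additive error ε needs a number of samples that depends only on ε and the confidence, not on the population size. The `population_sampler` callback is a hypothetical stand-in for querying a uniformly random individual.

```python
import math
import random

def estimate_fraction(population_sampler, eps=0.01, delta=0.01):
    """Estimate a population fraction to within +/-eps with probability 1-delta.

    By the Hoeffding bound, m = ln(2/delta) / (2*eps^2) samples suffice,
    independent of the population size.  population_sampler() is assumed to
    report True/False (e.g., left-handed or not) for a random individual.
    """
    m = math.ceil(math.log(2 / delta) / (2 * eps ** 2))
    hits = sum(population_sampler() for _ in range(m))
    return hits / m

# Toy usage: a synthetic population in which roughly 10% are left-handed.
print(estimate_fraction(lambda: random.random() < 0.10))
```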

Page 8: Something for almost nothing:  Advances in  sublinear  time algorithms

What types of approximation?

Page 9: Something for almost nothing:  Advances in  sublinear  time algorithms

Property Testing

• Does the input object have crucial properties?
• Example properties:
  – clusterability
  – small diameter graph
  – close to a codeword
  – linear or low-degree polynomial function
  – increasing order
  – lots and lots more…

Page 10: Something for almost nothing:  Advances in  sublinear  time algorithms

“In the ballpark” vs. “out of the ballpark” tests

• Property testing: distinguish inputs that have a specific property from those that are far from having that property
• Benefits:
  – Can often answer such questions much faster
  – May be the natural question to ask
    • when some “noise” is always present
    • when data is constantly changing
    • gives a fast sanity check to rule out very “bad” inputs (e.g., restaurant bills)
    • model selection problem in machine learning

Page 11: Something for almost nothing:  Advances in  sublinear  time algorithms

Examples

• Can test if a function is a homomorphism in CONSTANT TIME [Blum Luby R.]

• Can test if the social network has 6 degrees of separation in CONSTANT TIME [Parnas Ron]

Page 12: Something for almost nothing:  Advances in  sublinear  time algorithms

Constructing a property tester:

• Find a characterization of the property that is
  – efficiently (locally) testable
  – robust (usually the bigger challenge):
    • objects that have the property satisfy the characterization,
    • and objects far from having the property are unlikely to PASS

Page 13: Something for almost nothing:  Advances in  sublinear  time algorithms

Example: Homomorphism property of functions

• A “bad” testing characterization: ∀x,y f(x)+f(y) = f(x+y)
• Another bad characterization: for most x, f(x)+f(1) = f(x+1)
• Good characterization: for most x,y, f(x)+f(y) = f(x+y) (sketched below)
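A minimal sketch of the test behind the good characterization, in the spirit of [Blum Luby R.]; the group Z_n, the example functions, and the constant number of trials are illustrative choices, not the talk's.

```python
import random

def homomorphism_test(f, n, trials=50):
    """Check f(x) + f(y) == f(x + y) over Z_n on random pairs.

    Homomorphisms always pass; functions far from every homomorphism are
    rejected with high probability (a constant number of trials suffices
    for constant distance).
    """
    for _ in range(trials):
        x, y = random.randrange(n), random.randrange(n)
        if (f(x) + f(y)) % n != f((x + y) % n):
            return False   # FAIL
    return True            # PASS

n = 97
print(homomorphism_test(lambda x: (5 * x) % n, n))       # a homomorphism: passes
print(homomorphism_test(lambda x: (x * x + 3) % n, n))   # far from linear: usually fails
```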

Page 14: Something for almost nothing:  Advances in  sublinear  time algorithms

Example: 6 degrees of separation

• A “bad” testing characterization: for every node, all other nodes are within distance 6.
• Another bad one: for most nodes, all other nodes are within distance 6.
• Good characterization: for most nodes, there are many other nodes within distance 6.

Page 15: Something for almost nothing:  Advances in  sublinear  time algorithms

An example in depth

Page 16: Something for almost nothing:  Advances in  sublinear  time algorithms

Monotonicity of a sequence

• Given: list y_1, y_2, …, y_n
• Question: is the list sorted?
• Clearly requires n steps – must look at each y_i

Page 17: Something for almost nothing:  Advances in  sublinear  time algorithms

Monotonicity of a sequence

• Given: list y_1, y_2, …, y_n
• Question: can we quickly test if the list is close to sorted?

Page 18: Something for almost nothing:  Advances in  sublinear  time algorithms

What do we mean by “quick”?

• Query complexity measured in terms of the list size n
• Our goal (if possible):
  – very small compared to n; we will go for c·log n

Page 19: Something for almost nothing:  Advances in  sublinear  time algorithms

What do we mean by “close”?

Definition: a list of size n is .99-close to sorted if one can delete at most .01n values to make it sorted. Otherwise, it is .99-far.

Sorted: 1 2 4 5 7 11 14 19 20 21 23 38 39 45
Close:  1 4 2 5 7 11 14 19 20 39 23 21 38 45
        (deleting a few values leaves 1 4 5 7 11 14 19 20 23 38 45)
Far:    45 39 23 1 38 4 5 21 20 19 2 7 11 14
        (at best, a sorted sublist such as 1 4 5 7 11 14 remains)

Requirements for the algorithm:
• pass sorted lists
• if a list passes the test, one can change at most a .01 fraction of the list to make it sorted (see the sketch below)
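To make the definition concrete, here is a small sketch (not part of the talk) that computes the exact fraction of entries one must delete, via the longest nondecreasing subsequence; running it on the example lists above shows why one is close and the other far.

```python
from bisect import bisect_right

def distance_from_sorted(values):
    """Fraction of entries that must be deleted to leave a sorted list.

    Equals 1 - LIS/n, where LIS is the length of the longest nondecreasing
    subsequence (patience-sorting style, O(n log n)).
    """
    tails = []  # tails[k] = smallest tail of a nondecreasing subsequence of length k+1
    for v in values:
        pos = bisect_right(tails, v)
        if pos == len(tails):
            tails.append(v)
        else:
            tails[pos] = v
    return 1.0 - len(tails) / len(values)

close = [1, 4, 2, 5, 7, 11, 14, 19, 20, 39, 23, 21, 38, 45]
far   = [45, 39, 23, 1, 38, 4, 5, 21, 20, 19, 2, 7, 11, 14]
print(distance_from_sorted(close))  # about 0.21: only a few deletions needed
print(distance_from_sorted(far))    # about 0.57: most of the list must go
```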

Page 20: Something for almost nothing:  Advances in  sublinear  time algorithms

An attempt:

• Proposed algorithm:
  – Pick a random i and test that y_i ≤ y_{i+1}
• Bad input type:
  – 1,2,3,…,j, 1,2,3,…,j, 1,2,3,…,j, 1,2,3,…,j
  – Difficult for this algorithm to find a “breakpoint”
  – But other algorithms work well

[Figure: plot of y_i against i for this input]

Page 21: Something for almost nothing:  Advances in  sublinear  time algorithms

A second attempt:

• Proposed algorithm:
  – Pick random i < j and test that y_i ≤ y_j
• Bad input type:
  – n/m groups of m elements: m, m-1, …, 1, 2m, 2m-1, …, m+1, 3m, 3m-1, …
  – must pick i, j in the same group
  – need at least (n/m)^{1/2} choices to do this (by a birthday-paradox argument)

[Figure: plot of y_i against i for this input]

Page 22: Something for almost nothing:  Advances in  sublinear  time algorithms

A test that works

• The test (for distinct y_i):
  – Repeat several times:
    • Pick a random i
    • Look at the value of y_i
    • Do a binary search for y_i
    • Does the binary search find any inconsistencies? If yes, FAIL
    • Do we end up at location i? If not, FAIL
  – PASS if never failed
• Running time: O(log n)
• Why does this work?
  – If the list is in order, then the test always passes
  – If the test passes on choices i and j, then y_i and y_j are in the correct order
  – Since the test usually passes, most y_i’s are in the right order (see the sketch below)
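A minimal sketch of this test; the number of repetitions is an illustrative constant, whereas in the analysis it depends on the distance parameter.

```python
import random

def sortedness_test(y, repetitions=30):
    """Spot-check whether a list of DISTINCT values is close to sorted.

    Each trial picks a random index i and binary-searches for y[i] as if the
    list were sorted; the trial FAILs if the search does not land back at
    position i.  A sorted list always passes; a list that passes with good
    probability can be made sorted by changing few entries.
    """
    n = len(y)
    for _ in range(repetitions):
        i = random.randrange(n)
        target, lo, hi, found_at = y[i], 0, n - 1, None
        while lo <= hi:
            mid = (lo + hi) // 2
            if y[mid] == target:
                found_at = mid
                break
            elif y[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        if found_at != i:   # search was inconsistent or ended elsewhere
            return False    # FAIL
    return True             # PASS
```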

Page 23: Something for almost nothing:  Advances in  sublinear  time algorithms

Many more properties studied!

• Graphs, functions, point sets, strings, …

• Amazing characterizations of problems testable in graph and function testing models!

Page 24: Something for almost nothing:  Advances in  sublinear  time algorithms

Properties of functions

• Linearity and low total degree polynomials

[Blum Luby R.] [Bellare Coppersmith Hastad Kiwi Sudan] [R. Sudan] [Arora Safra] [Arora Lund Motwani Sudan Szegedy] [Arora Sudan] ...

• Functions definable by functional equations – trigonometric, elliptic functions [R.]

• All locally characterized affine invariant function classes! [Bhattacharyya Fischer Hatami Hatami Lovett]

• Groups, Fields [Ergun Kannan Kumar R. Viswanathan]

• Monotonicity [EKKRV] [Goldreich Goldwasser Lehman Ron Samorodnitsky] [Dodis Goldreich Lehman Ron Raskhodnikova Samorodnitsky][Fischer Lehman Newman Raskhodnikova R. Samorodnitsky] [Chakrabarty Seshadri]…

• Convexity, submodularity [Parnas Ron R.] [Fattal Ron] [Seshadri Vondrak]…

• Low complexity functions, Juntas [Parnas Ron Samorodnitsky] [Fischer Kindler Ron Safra Samorodnitsky]

[Diakonikolas Lee Matulef Onak R. Servedio Wan]…

Page 25: Something for almost nothing:  Advances in  sublinear  time algorithms

Properties of sparse and general graphs

• Easily testable dense and hyperfinite graph properties are completely characterized! [Alon Fischer Newman Shapira][Newman Sohler]

• General Sparse graphs: bipartiteness, connectivity, diameter, colorability, rapid mixing, triangle free,… [Goldreich Ron] [Parnas Ron] [Batu Fortnow R. Smith White] [Kaufman Krivelevich Ron] [Alon Kaufman Krivelevich Ron]…

• Tools: Szemeredi regularity lemma, random walks, local search, simulate greedy, borrow from parallel algorithms

Page 26: Something for almost nothing:  Advances in  sublinear  time algorithms

Some other combinatorial properties

• Set properties – equality, distinctness, …
• String properties – edit distance, compressibility, …
• Metric properties – metrics, clustering, convex hulls, embeddability, …
• Membership in low complexity languages – regular languages, constant width branching programs, context-free languages, regular tree languages, …

• Codes – BCH and dual-BCH codes, Generalized Reed-Muller,…

Page 27: Something for almost nothing:  Advances in  sublinear  time algorithms

“Traditional” approximation

• Output number close to value of the optimal solution (not enough time to construct a solution)

• Some examples:
  – minimum spanning tree
  – vertex cover
  – max cut
  – positive linear program
  – edit distance, …

Page 28: Something for almost nothing:  Advances in  sublinear  time algorithms

Example: Vertex Cover

• Given a graph G(V,E), a vertex cover (VC) C is a subset of V that “touches” every edge.
• What is the minimum size of a vertex cover?
• NP-complete
• Poly-time multiplicative 2-approximation based on the relationship between VC and maximal matching

Page 29: Something for almost nothing:  Advances in  sublinear  time algorithms

Vertex Cover and Maximal Matching

• Maximal matching:
  – M ⊆ E is a matching if no node is in more than one edge
  – M is a maximal matching if adding any edge violates the matching property
• Note: the nodes matched by M form a pretty good vertex cover!
  – A vertex cover must include at least one node from each edge of M
  – The union of the nodes of M forms a vertex cover
  – So |M| ≤ VC ≤ 2|M| (see the sketch below)
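A tiny sketch of this observation; it assumes a maximal matching M is already in hand (the greedy construction sketched after a later slide produces one).

```python
def vertex_cover_from_matching(matching):
    """Return the endpoints of a maximal matching M as a vertex cover.

    Any cover must contain at least one endpoint of each edge of M, so
    |M| <= OPT; this cover has size at most 2|M| <= 2*OPT.
    """
    cover = set()
    for u, v in matching:
        cover.update((u, v))
    return cover
```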

Page 30: Something for almost nothing:  Advances in  sublinear  time algorithms

“Classical” approximation examples

• Can get a CONSTANT TIME approximation for vertex cover on sparse graphs!
  – Output y which is at most 2·OPT + ϵn
• How? Oracle reduction framework [Parnas Ron]
  – Construct an “oracle” that tells you whether node u is in a 2-approx vertex cover
  – Use the oracle + standard sampling to estimate the size of the cover (see the sketch below)
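A minimal sketch of the sampling step, treating the oracle as a black box; the oracle interface and sample count are illustrative, and the additive ϵn error follows from standard Chernoff bounds on the number of samples.

```python
import random

def estimate_cover_size(nodes, in_cover_oracle, samples=2000):
    """Estimate the size of the cover C defined implicitly by the oracle.

    in_cover_oracle(u) answers whether node u belongs to some fixed
    2-approximate vertex cover.  The fraction of sampled nodes that are in
    C, scaled by the number of nodes, estimates |C| up to an additive
    eps*n error.
    """
    hits = sum(in_cover_oracle(random.choice(nodes)) for _ in range(samples))
    return len(nodes) * hits / samples
```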

But how do you implement the oracle?

Page 31: Something for almost nothing:  Advances in  sublinear  time algorithms

Implementing the oracle – two approaches:

• Sequentially simulate computations of a local distributed algorithm [Parnas Ron]

• Figure out what greedy maximal matching algorithm would do on u [Nguyen Onak]

Page 32: Something for almost nothing:  Advances in  sublinear  time algorithms

Greedy algorithm for maximal matching

• Algorithm:
  – For every edge (u,v):
    • If neither u nor v is matched, add (u,v) to M
  – Output M
• Why is M maximal?
  – If e = (u,v) is not in M, then u or v was already matched by an earlier edge (see the sketch below)
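A direct sketch of the greedy algorithm; combined with vertex_cover_from_matching above, it gives the classical polynomial-time 2-approximation (the example graph is arbitrary).

```python
def greedy_maximal_matching(edges):
    """Scan the edges in the given order; add an edge if both endpoints are free."""
    matched, matching = set(), []
    for u, v in edges:
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
    return matching

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
M = greedy_maximal_matching(edges)
print(M, vertex_cover_from_matching(M))
```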

Page 33: Something for almost nothing:  Advances in  sublinear  time algorithms

Implementing the Oracle via Greedy

• To decide if edge e is in the matching:
  – Must know whether the adjacent edges that come before e in the ordering are in the matching
  – Do not need to know anything about edges coming after

• Arbitrary edge order can have long dependency chains!

[Figure: a path whose edges carry ranks 1 2 4 8 25 36 47 88 89 110 111 112 113; whether an edge is matched depends on whether it is an odd or even number of steps from the beginning]

Page 34: Something for almost nothing:  Advances in  sublinear  time algorithms

Breaking long dependency chains [Nguyen Onak]

• Assign a random ordering to the edges
  – Greedy works under any ordering
  – Important fact: a random order has short dependency chains (see the sketch below)
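A sketch in the spirit of [Nguyen Onak] (the function names, the memoization, and the node-level helper are illustrative): greedy matching under a random edge ranking, simulated locally by recursing only on adjacent edges of smaller rank. The node-level answer can feed the sampling estimator sketched earlier; recursing on the least-ranked adjacent edge first, as on the next slide, is the refinement that yields time polynomial in the degree.

```python
import random
from functools import lru_cache

def make_cover_oracle(adjacency, seed=0):
    """adjacency[u] lists the neighbors of u; returns a node-in-cover oracle."""
    rng = random.Random(seed)
    rank = {}

    def edge_rank(u, v):
        e = (min(u, v), max(u, v))
        if e not in rank:                  # assign random ranks lazily
            rank[e] = rng.random()
        return rank[e]

    @lru_cache(maxsize=None)
    def in_matching(u, v):
        """Edge (u,v) is matched iff no adjacent edge of smaller rank is matched."""
        u, v = min(u, v), max(u, v)
        r = edge_rank(u, v)
        for x in (u, v):
            for y in adjacency[x]:
                if {x, y} != {u, v} and edge_rank(x, y) < r and in_matching(x, y):
                    return False
        return True

    def node_in_cover(u):
        # u is in the (2-approximate) cover iff some incident edge is matched
        return any(in_matching(u, w) for w in adjacency[u])

    return node_in_cover
```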

Page 35: Something for almost nothing:  Advances in  sublinear  time algorithms

Better Complexity for VC

• Always recurse on the least-ranked edge first
  – Heuristic suggested by [Nguyen Onak]

• Yields time poly in degree [Yoshida Yamamoto Ito]

• Additional ideas yield query complexity nearly linear in average degree for general graphs [Onak Ron Rosen R.]

Page 36: Something for almost nothing:  Advances in  sublinear  time algorithms

Further work

• More complicated arguments for maximum matching, set cover, positive LP… [Parnas Ron + Kuhn Moscibroda Wattenhofer] [Nguyen Onak] [Yoshida Yamamoto Ito]

• Even better results for hyperfinite graphs [Hassidim Kelner Nguyen Onak][Newman Sohler]– e.g., planar

Can the dependence be made poly in the average degree?

Page 37: Something for almost nothing:  Advances in  sublinear  time algorithms

No samples

What if the data is only accessible via random samples?

Page 38: Something for almost nothing:  Advances in  sublinear  time algorithms

Distributions

Page 39: Something for almost nothing:  Advances in  sublinear  time algorithms

Play the lottery?

Page 40: Something for almost nothing:  Advances in  sublinear  time algorithms

Is the lottery unfair?

• From Hitlotto.com: Lottery experts agree, past number histories can be the key to predicting future winners.

Page 41: Something for almost nothing:  Advances in  sublinear  time algorithms

True Story!

• Polish lottery Multilotek
  – Choose 20 distinct numbers “uniformly” at random out of 1 to 80.
  – Initial machine was biased
    • e.g., the probability of numbers in 50-59 was too small

• Past results: http://serwis.lotto.pl:8080/archiwum/wyniki_wszystkie.php?id_gra=2

Page 42: Something for almost nothing:  Advances in  sublinear  time algorithms

Thanks to Krzysztof Onak (pointer) and Eric Price (graph)

Page 43: Something for almost nothing:  Advances in  sublinear  time algorithms

New Jersey Pick 3,4 Lottery

• New Jersey Pick k (k = 3, 4) Lottery
  – Pick k digits in order
  – 10^k possible values
  – Assume the lottery draws are i.i.d.
• Data:
  – Pick 3: 8522 results from 5/22/75 to 10/15/00
    • χ²-test gives 42% confidence
  – Pick 4: 6544 results from 9/1/77 to 10/15/00
    • fewer results than possible values
    • χ²-test gives no confidence (see the sketch below)
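A hedged illustration of the point, not the talk's actual analysis: with 10^k possible outcomes and only a few thousand draws, Pick 4 has fewer samples than domain elements, so a χ²-test against uniform becomes uninformative. The data below is synthetic; the real draws would replace it.

```python
import numpy as np
from scipy.stats import chisquare

k, n_draws = 4, 6544                       # Pick 4 figures quoted above
domain = 10 ** k
rng = np.random.default_rng(0)
draws = rng.integers(0, domain, size=n_draws)   # synthetic stand-in for real draws
counts = np.bincount(draws, minlength=domain)
stat, p_value = chisquare(counts)          # null hypothesis: uniform over 10^k values
print(stat, p_value)                       # expected counts < 1, so the test says little
```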

Page 44: Something for almost nothing:  Advances in  sublinear  time algorithms

Distributions on BIG domains

• Given samples of a distribution, we need to know, e.g.,
  – entropy
  – number of distinct elements
  – “shape” (monotone, bimodal, …)
  – closeness to uniform, Gaussian, Zipfian, …
• No assumptions on the shape of the distribution
  – e.g., no smoothness, monotonicity, or normality assumptions

• Considered in statistics, information theory, machine learning, databases, algorithms, physics, biology,…

Page 45: Something for almost nothing:  Advances in  sublinear  time algorithms

Key Question

• How many samples do you need in terms of the domain size?
• Do you need to estimate the probability of each domain item?
• Can the sample complexity be sublinear in the size of the domain?

Sublinear sample complexity rules out standard statistical techniques that first learn the distribution.

Page 46: Something for almost nothing:  Advances in  sublinear  time algorithms

Our Aim:

Algorithms with sublinear sample complexity

Page 47: Something for almost nothing:  Advances in  sublinear  time algorithms

Similarities of distributions

Are p and q close or far?
• p is given via samples
• q is either
  – known to the tester (e.g., uniform), or
  – given via samples

Page 48: Something for almost nothing:  Advances in  sublinear  time algorithms

Is p uniform?

• Theorem ([Goldreich Ron] [Batu Fortnow R. Smith White] [Paninski]): the sample complexity of distinguishing p = U from ||p − U||₁ > ε is Θ(n^{1/2})

• Nearly same complexity to test if p is any known distribution [Batu Fischer Fortnow Kumar R. White]: “Testing identity”

[Diagram: samples of p feed into a tester that outputs Pass/Fail]

||p − U||₁ = Σᵢ |pᵢ − 1/n|

Page 49: Something for almost nothing:  Advances in  sublinear  time algorithms

Upper bound for L2 distance [Goldreich Ron]

• L2 distance:
  ||p − U||₂² = Σᵢ (pᵢ − 1/n)²
              = Σᵢ pᵢ² − 2Σᵢ pᵢ/n + Σᵢ 1/n²
              = Σᵢ pᵢ² − 1/n
• Estimate the collision probability Σᵢ pᵢ² to estimate the L2 distance from uniform (see the sketch below)
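A minimal sketch of the collision-based estimator, using pairwise collision counting on toy data; the O(√n)-sample version with recycling and the L1/L2 relation from the next slide are omitted.

```python
import random
from itertools import combinations

def collision_probability(samples):
    """Unbiased estimate of sum_i p_i^2: the fraction of colliding sample pairs."""
    m = len(samples)
    collisions = sum(a == b for a, b in combinations(samples, 2))
    return collisions / (m * (m - 1) / 2)

def l2_from_uniform_squared(samples, n):
    """Estimate ||p - U||_2^2 = (sum_i p_i^2) - 1/n over a domain of size n."""
    return collision_probability(samples) - 1.0 / n

# Toy usage: a skewed distribution over n = 100 elements (element 0 is heavy).
n = 100
population = [0] * 50 + list(range(n))
samples = [random.choice(population) for _ in range(2000)]
print(l2_from_uniform_squared(samples, n))
```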

Page 50: Something for almost nothing:  Advances in  sublinear  time algorithms

Testing uniformity [GR, Batu et al.]

• Upper bound: estimate the collision probability and use the known relation between the L1 and L2 norms
  – Issues:
    • the collision probability of uniform is 1/n
    • use O(√n) samples via recycling
  – Comment: [P] uses a different estimator
• Easy lower bound: Ω(√n)
  – Can get Ω(√n/ε²) [P]

Page 51: Something for almost nothing:  Advances in  sublinear  time algorithms

Back to the lottery…

plenty of samples!

Page 52: Something for almost nothing:  Advances in  sublinear  time algorithms

Is p uniform?

• Theorem ([Goldreich Ron] [Batu Fortnow R. Smith White] [Paninski]): the sample complexity of distinguishing p = U from ||p − U||₁ > ε is Θ(n^{1/2})

• Nearly same complexity to test if p is any known distribution [Batu Fischer Fortnow Kumar R. White]: “Testing identity”


Page 53: Something for almost nothing:  Advances in  sublinear  time algorithms

Testing closeness of two distributions:

Transactions of 20-30 yr olds vs. transactions of 30-40 yr olds: trend change?

Page 54: Something for almost nothing:  Advances in  sublinear  time algorithms

Testing closeness

Theorem ([BFRSW] [P. Valiant]): the sample complexity of distinguishing p = q from ||p − q||₁ > ε is Θ̃(n^{2/3})

[Diagram: samples of p and q feed into a tester that outputs Pass/Fail]

Page 55: Something for almost nothing:  Advances in  sublinear  time algorithms

Why so different?

• Collision statistics are all that matter
• Collisions on “heavy” elements can hide the collision statistics of the rest of the domain
• Construct pairs of distributions where the heavy elements are identical, but the “light” elements are either identical or very different

Page 56: Something for almost nothing:  Advances in  sublinear  time algorithms

Additively estimate distance?

• To output ||p − q||₁ ± ε, one needs Θ(n/log n) samples [G. Valiant P. Valiant]

Page 57: Something for almost nothing:  Advances in  sublinear  time algorithms

Information theoretic quantities

• Entropy
• Support size

Page 58: Something for almost nothing:  Advances in  sublinear  time algorithms

Information in neural spike trains

• Each application of stimuli gives a sample of the signal (a spike train)
• The entropy of the (discretized) signal indicates which neurons respond to the stimuli

[Figure: neural signals over time]

[Strong, Koberle, de Ruyter van Steveninck, Bialek ’98]

Page 59: Something for almost nothing:  Advances in  sublinear  time algorithms

Compressibility of data

Page 60: Something for almost nothing:  Advances in  sublinear  time algorithms

Can we get multiplicative approximations?

• In general, no…
  – zero-entropy distributions are hard to distinguish
• What if the entropy is bigger?
  – Can γ-multiplicatively approximate the entropy with Õ(n^{1/γ²}) samples (when the entropy is > 2γ/ε) [Batu Dasgupta R. Kumar]
  – requires Ω(n^{1/γ²}) [Valiant]
  – better bounds when the support size is small [Brautbar Samorodnitsky]
  – similar bounds for estimating the support size [Raskhodnikova Ron R. Smith] [Raskhodnikova Ron Shpilka Smith]

Page 61: Something for almost nothing:  Advances in  sublinear  time algorithms

More properties:

• Independence [Batu Fischer Fortnow Kumar R. White]
• Limited independence [Alon Andoni Kaufman Matulef R. Xie] [Haviv Langberg]

• K-flat distributions [Levi Indyk R.]

• K-modal distributions [Daskalakis Diakonikolas Servedio]

• Poisson Binomial Distributions [Daskalakis Diakonikolas Servedio]

• Monotonicity over general posets [Batu Kumar R.] [Bhattacharyya Fischer R. P. Valiant]

• Properties of multiple distributions [Levi Ron R.]

Page 62: Something for almost nothing:  Advances in  sublinear  time algorithms

Conclusion:

• For many problems, we need a lot less time and far fewer samples than one might think!
• Many cool ideas and techniques have been developed
• Lots more to do!

Page 63: Something for almost nothing:  Advances in  sublinear  time algorithms

Thank you!