ccanonne/files/talks/isr-2015-01-12-itcs.pdf · 12/01/2015

Communication with Imperfect Shared Randomness

(Joint work with Venkatesan Guruswami (CMU), Raghu Meka (UCLA) and Madhu Sudan (MSR))

Who? Clément Canonne (Columbia University)

When? January 12, 2015

1 / 14


Communication & Complexity
There is a world outside of n

Context: there are Alice, Bob, what they communicate and what they don't have to. The function

f : {0,1}^n × {0,1}^n → {0,1}

they compute; the protocol Π they use; the distributions D_x, D_y their inputs come from; what blue means and what red means.

2 / 14



Communication & Complexity
But context is not perfect…

Noise, misunderstandings, false assumptions.

Context is almost never perfectly shared.

My periwinkle is your orchid.

The printer on the 5th floor of Columbia is not exactly the model my laptop has a driver for.

What on Earth is a "French baguette" in New York?!

This talk: Shared randomness in communication complexity as "context".

3 / 14



Communication & Complexity
Recall: Randomness in Communication Complexity

Equality testing: I have x ∈ {0,1}^n, you have y ∈ {0,1}^n; are they equal?

Complexity?

Deterministic: det(EQ) = Θ(n)

Private randomness: private(EQ) = Θ(log n)

Shared randomness: psr(EQ) = O(1)

(Recall Newman's Theorem: private(P) ≤ psr(P) + O(log n).)

4 / 14
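The psr(EQ) = O(1) bound comes from the classic random-parity protocol: Alice sends a few parities of her input against shared random strings, and Bob accepts iff his parities all match. A minimal sketch (illustrative only; function names and parameters are mine, not from the talk):

```python
import random

def inner_bit(x, r):
    # Parity of the bitwise AND, i.e. <x, r> mod 2 over GF(2).
    return bin(x & r).count("1") % 2

def equality_psr(x, y, n, t=32, seed=42):
    """One-way Equality test with perfectly shared randomness:
    Alice sends t parity bits; when x != y, each parity disagrees
    with probability 1/2, so the error probability is 2**-t."""
    rng = random.Random(seed)                    # the shared random tape
    rs = [rng.getrandbits(n) for _ in range(t)]
    alice_msg = [inner_bit(x, r) for r in rs]    # t bits of communication
    return all(inner_bit(y, r) == b for r, b in zip(rs, alice_msg))

assert equality_psr(0b1011, 0b1011, n=4)   # equal inputs: always accept
```

With t parities the communication is t bits regardless of n, which is exactly where shared randomness beats the Θ(log n) private-coin cost.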


This work
Randomness and uncertainty

ISR (Imperfectly Shared Randomness)

What if the randomness ("context") was not perfectly in sync?

To compute f(x, y):

Alice: has access to r ∈ {±1}^*, gets input x ∈ {0,1}^n

Bob: has access to s ∈ {±1}^*, gets input y ∈ {0,1}^n

with r ∼_ρ s: s is obtained by perturbing each bit of r independently (each bit of s agrees with the corresponding bit of r with probability (1+ρ)/2).

Studied (independently) by [BGI14] (different focus: "referee model"; more general correlations).

5 / 14
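The noise model r ∼_ρ s is easy to simulate directly; a small sketch under the standard interpretation above, that each bit of s agrees with the corresponding bit of r with probability (1+ρ)/2 (names are mine):

```python
import random

def correlated_tapes(m, rho, seed=None):
    """Sample (r, s) in {-1,+1}^m with each s_i independently equal
    to r_i with probability (1 + rho) / 2, so that E[r_i * s_i] = rho."""
    rng = random.Random(seed)
    r = [rng.choice((-1, 1)) for _ in range(m)]
    s = [ri if rng.random() < (1 + rho) / 2 else -ri for ri in r]
    return r, s

r, s = correlated_tapes(100_000, rho=0.8, seed=0)
empirical = sum(ri * si for ri, si in zip(r, s)) / len(r)
assert abs(empirical - 0.8) < 0.02   # empirical correlation close to rho
```

ρ = 1 recovers perfectly shared randomness and ρ = 0 gives independent private tapes, matching the chain of inequalities on the next slide.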



ISR: general relations

For every P with x, y ∈ {0,1}^n and 0 ≤ ρ ≤ ρ′ ≤ 1,

psr(P) ≤ isr_ρ′(P) ≤ isr_ρ(P) ≤ private(P) ≤ psr(P) + O(log n).

(Also true for one-way: psr^ow, isr^ow_ρ, private^ow.)

but for many problems, log n is already huge.

6 / 14


Rest of the talk

1 A first example: the Compression problem

2 General upper bound on ISR in terms of PSR

3 Strong lower bound: Alice, Bob, Charlie and Dana.

7 / 14


First result: Uncertain Compression

Compression with uncertain priors

Alice has P, gets m ∼ P; Bob knows Q ≃ P, wants m.

Previous work:

P = Q: H(P) (Huffman coding)

P ≃_∆ Q: H(P) + 2∆ [JKKS11] (w/ shared randomness)

P ≃_∆ Q: O(H(P) + ∆ + log log N) [HS14] (deterministic)

This work: For all ε > 0,

isr^ow_ρ(Compress_∆) ≤ (1+ε)/(1 − h((1−ρ)/2)) · (H(P) + 2∆ + O(1))

“natural protocol”

8 / 14
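The multiplicative factor 1/(1 − h((1−ρ)/2)) in the bound, where h is the binary entropy function, is easy to evaluate numerically; a quick sketch (helper names are mine):

```python
from math import log2

def h(p):
    # Binary entropy in bits; by convention h(0) = h(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def isr_overhead(rho):
    """Multiplicative blow-up 1 / (1 - h((1 - rho) / 2)) from the bound."""
    return 1.0 / (1.0 - h((1.0 - rho) / 2.0))

assert isr_overhead(1.0) == 1.0                  # perfect sharing: no overhead
assert isr_overhead(0.9) < isr_overhead(0.5)     # overhead grows as rho drops
```

At ρ = 1 the factor is 1, recovering the [JKKS11] bound up to (1+ε); as ρ → 0 it blows up, consistent with imperfectly shared randomness being strictly weaker.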



General upper bound
It's inner products all the way down!

Theorem: ∀ρ > 0, ∃c < ∞ such that ∀k, we have

PSR(k) ⊆ ISR^ow_ρ(c^k).

Proof. (Outline)

Define GapInnerProduct, "complete" for PSR(k) (view strategies as vectors X_R, Y_R ∈ [0,1]^{2^k}; use Newman's Theorem to bound the number of R's);

Show there exists a (Gaussian-based) isr protocol for GapInnerProduct, with O_ρ(4^k) bits of communication.

9 / 14
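A standard way to estimate inner products from very little communication is the Gaussian sign trick: with a shared Gaussian vector g, the probability that sign⟨g, x⟩ = sign⟨g, y⟩ equals 1 − arccos⟨x, y⟩/π for unit vectors x, y. The sketch below only illustrates this identity with perfectly shared g; the actual isr protocol in the paper is more involved, since it must tolerate Alice's and Bob's Gaussians being merely correlated. Names are mine:

```python
import math
import random

def estimate_inner_product(x, y, trials=20_000, seed=1):
    """Estimate <x, y> for unit vectors from one sign bit per round:
    with a shared Gaussian g, Pr[sign<g,x> = sign<g,y>] = 1 - arccos(<x,y>)/pi."""
    rng = random.Random(seed)
    agree = 0
    for _ in range(trials):
        g = [rng.gauss(0, 1) for _ in x]                  # shared Gaussian vector
        a = sum(gi * xi for gi, xi in zip(g, x)) >= 0     # Alice's 1-bit message
        b = sum(gi * yi for gi, yi in zip(g, y)) >= 0     # Bob's local sign
        agree += (a == b)
    return math.cos(math.pi * (1 - agree / trials))       # invert the identity

x = [1.0, 0.0]
y = [math.cos(0.5), math.sin(0.5)]   # angle 0.5 rad, so <x, y> = cos(0.5)
est = estimate_inner_product(x, y)
assert abs(est - math.cos(0.5)) < 0.05
```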


General upper bound
Can we do better?

For problems in PSR^ow(k)? PSR^ow(k) ⊆ ISR^ow_ρ(c^{o(k)})?

For ISR_ρ? PSR(ω(k)) ⊆ ISR_ρ(c^k)?

Answer: No.

10 / 14



Strong converse: lower bound
It's as good as it gets.

Theorem: ∀k, ∃P = (P_n)_{n∈N} s.t. psr^ow(P) ≤ k, yet ∀ρ < 1,

isr_ρ(P) = 2^{Ω_ρ(k)}.

Proof. (High-level)

Define SparseGapInnerProduct, a relaxation of GapInnerProduct.

Show it has an O(log q)-bit one-way psr protocol (Alice uses the shared randomness to send one coordinate to Bob);

isr lower bound: argue that for any (fixed)* strategy of Alice and Bob using fewer than √q bits, either (a) something impossible happens in the Boolean world, or (b) something impossible happens in the Gaussian world.

11 / 14


Strong converse: lower bound
Two-pronged impossibility: more on the prongs.

Case (a): The strategies (f_r, g_s)_{r,s} have a common high-influence variable: Charlie and Dana can use Alice and Bob's protocol to distill randomness.

Case (b): If no such variable: with a new Invariance Principle¹, Charlie and Dana can apply Alice and Bob's protocol "transferred to the Gaussian world" to solve (yet another) problem, Gaussian Inner Product.

¹ In the spirit of [Mos10].
12 / 14


Conclusions

Summary

Dealing with more realistic situations: Alice, Bob, and what they do not know about each other;

comes into play when n is huge (Newman's Theorem becomes loose);

show general and tight relations and reductions in this model, with both upper and lower bounds;

a new invariance theorem, and its use in communication complexity.

What about…

more general forms of correlations?

cases where even randomness is expensive? (minimize its use)

one-sided error?

13 / 14



Thank you. (Questions?)

14 / 14


[BGI14] Mohammad Bavarian, Dmitry Gavinsky, and Tsuyoshi Ito. On the role of shared randomness in simultaneous communication. In Javier Esparza, Pierre Fraigniaud, Thore Husfeldt, and Elias Koutsoupias, editors, ICALP (1), volume 8572 of Lecture Notes in Computer Science, pages 150–162. Springer, 2014.

[BM10] Andrej Bogdanov and Elchanan Mossel. On extracting common random bits from correlated sources. CoRR, abs/1007.2315, 2010.

[GHM+11] Venkatesan Guruswami, Johan Håstad, Rajsekar Manokaran, Prasad Raghavendra, and Moses Charikar. Beating the random ordering is hard: Every ordering CSP is approximation resistant. SIAM J. Comput., 40(3):878–914, 2011.

[GJS12] Oded Goldreich, Brendan Juba, and Madhu Sudan. A theory of goal-oriented communication. Journal of the ACM, 59(2):8:1–8:65, May 2012.

[HS14] Elad Haramaty and Madhu Sudan. Deterministic compression with uncertain priors. In Proceedings of the 5th Conference on Innovations in Theoretical Computer Science, ITCS '14, pages 377–386, New York, NY, USA, 2014. ACM.

[JKKS11] Brendan Juba, Adam Tauman Kalai, Sanjeev Khanna, and Madhu Sudan. Compression without a common prior: an information-theoretic justification for ambiguity in language. In Bernard Chazelle, editor, ICS, pages 79–86. Tsinghua University Press, 2011.

[Mos10] Elchanan Mossel. Gaussian bounds for noise correlation of functions. Geometric and Functional Analysis, 19(6):1713–1756, 2010.

1 / 4


Theorem (Our Invariance Principle)

Fix any two parameters p_1, p_2 ∈ (−1, 1). For all ε ∈ (0, 1], ℓ ∈ N, θ_0 ∈ [0, 1), and closed convex sets K_1, K_2 ⊆ [0, 1]^ℓ, there exist τ > 0 and mappings

T_1 : {f : {+1,−1}^n → K_1} → {F : R^n → K_1}
T_2 : {g : {+1,−1}^n → K_2} → {G : R^n → K_2}

such that for all θ ∈ [−θ_0, θ_0], if f, g satisfy

max_{i∈[n]} min( max_{j∈[ℓ]} Inf_i^{(d)}(f_j), max_{j∈[ℓ]} Inf_i^{(d)}(g_j) ) ≤ τ

then, for F = T_1(f) and G = T_2(g), we have

| E_{(x,y)∼N^⊗n}[⟨f(x), g(y)⟩] − E_{(X,Y)∼G^⊗n}[⟨F(X), G(Y)⟩] | ≤ ε,

where N = N_{p_1,p_2,θ} and G is the Gaussian distribution which matches the first and second-order moments of N.

1 / 4


Theorem (Invariance Theorem of [GHM+11])

Let (Ω, µ) be a finite probability space with each probability at least α ≤ 1/2. Let b = |Ω| and L = {χ_0 = 1, χ_1, χ_2, …, χ_{b−1}} be a basis for random variables over Ω. Let Υ = {ξ_0 = 1, ξ_1, …, ξ_{b−1}} be an ensemble of real-valued Gaussian random variables with first and second moments matching those of the χ_i's; and let h = (h_1, h_2, …, h_t) : Ω^n → R^t be such that

Inf_i(h_ℓ) ≤ τ,  Var(h_ℓ) ≤ 1

for all i ∈ [n] and ℓ ∈ [t]. For η ∈ (0, 1), let H_ℓ (ℓ ∈ [t]) be the multilinear polynomial associated with T_{1−η} h_ℓ with respect to L. If Ψ : R^t → R is Λ-Lipschitz (with respect to the ℓ_2-norm), then

| E[Ψ(H_1(L^n), …, H_t(L^n))] − E[Ψ(H_1(Υ^n), …, H_t(Υ^n))] | ≤ C(t) · Λ · τ^{η/(18 log(1/α))} = o_τ(1)

for some constant C = C(t).

2 / 4


Strong converse: lower bound
Two-pronged impossibility, first prong.

Case (a): The strategies (f_r, g_s)_{r,s} have a common high-influence variable (recall the one-way psr protocol). But then, two players Charlie and Dana can* leverage these strategies to win an agreement distillation game:

Definition (Agreement distillation): Charlie and Dana have no inputs. Their goal is to output w_C and w_D satisfying:

(i) Pr[w_C = w_D] ≥ γ;

(ii) H_∞(w_C), H_∞(w_D) ≥ κ.

But this requires Ω(κ) − log(1/γ) bits of communication (via [BM10, Theorem 1]).

3 / 4
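The tension in the distillation game shows up already in the trivial zero-communication strategy where Charlie and Dana each output the first κ bits of their ρ-correlated tapes: each output has min-entropy κ, but the agreement probability γ decays like ((1+ρ)/2)^κ. A small simulation of this baseline (illustrative only; names are mine):

```python
import random

def distill_attempt(kappa, rho, trials=20_000, seed=3):
    """Zero-communication baseline for agreement distillation:
    Charlie and Dana output the first kappa bits of their
    rho-correlated tapes (min-entropy kappa each) and we measure
    how often the two outputs coincide."""
    rng = random.Random(seed)
    p_agree_bit = (1 + rho) / 2
    hits = 0
    for _ in range(trials):
        ok = True
        for _ in range(kappa):
            c = rng.getrandbits(1)                          # Charlie's bit
            d = c if rng.random() < p_agree_bit else 1 - c  # Dana's noisy copy
            ok = ok and (c == d)
        hits += ok
    return hits / trials

gamma = distill_attempt(kappa=5, rho=0.8)
assert abs(gamma - 0.9 ** 5) < 0.03   # gamma ~ ((1+rho)/2)^kappa = 0.9^5
```

So without communication, γ is exponentially small in κ; the [BM10] bound says that beating this meaningfully costs roughly κ bits of communication.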


Strong converse: lower bound
Two-pronged impossibility, second prong.

Case (b): f_r : {0,1}^n → K_A ⊂ [0,1]^{2^k}, g_s : {0,1}^n → K_B ⊂ [0,1]^{2^k} have no common high-influence variable.

We then show that this implies k = 2^{Ω(√q)}, by using an Invariance Principle (in the spirit of [Mos10]) to "go to the Gaussian world": if f, g are low-degree polynomials with no common influential variable, then

E_{(x,y)∼N^⊗n}[⟨f(x), g(y)⟩] ≃ E_{(X,Y)∼G^⊗n}[⟨F(X), G(Y)⟩]

and Charlie and Dana can use this to solve (yet another) problem, the Gaussian Inner Product (GaussianCorrelation_ξ).

But… a reduction to Disjointness shows that (even with psr) this requires Ω(1/ξ) bits of communication.

4 / 4