How to Grow Your Lower Bounds. Mihai Pătrașcu. Tutorial, FOCS'10.

Slide 1
How to Grow Your Lower Bounds. Mihai Pătrașcu. Tutorial, FOCS'10.

Slide 2: A Personal Story
MIT freshman, 2002: "What problem could I work on?" "P vs. NP." Half a year and no solution later: "How far did you guys get, anyway?"

Slide 3: Partial Sums
Maintain an array A[1..n] under:
  update(k, Δ): A[k] = Δ
  query(k): return A[1] + … + A[k]
(Augmented) binary search trees give t_u = t_q = O(lg n). What lower bounds can we prove? Open problem: a bound of max{ t_u, t_q } = ω(lg n) is not known for any dynamic problem.

Slide 4: What kind of lower bound?
Memory: an array of S words; a word has w = Ω(lg S) bits. Unit-time operations: random access to memory (address → Mem[address]) and +, -, *, /, %, ==, >, ^, &, |, ~ on words. Lower bounds you can trust: the model subsumes a TM with O(w) bits of internal state, and TC0 hardware.

Slide 5: What kind of lower bound?
Same memory model, but now any function of two words takes unit time (nonuniform). Lower bounds you can trust: this is the cell-probe model.

Slide 6
Theorem [Pătrașcu, Demaine SODA'04]: for partial sums (update(k, Δ): A[k] = Δ; query(k): return A[1] + … + A[k]), max{ t_u, t_q } = Ω(lg n). I will give the full proof.

Slide 7
The hard instance:
  π = random permutation
  for t = 1 to n:
    query(π(t))
    Δ_t = random() mod 2^w
    update(π(t), Δ_t)
(Figure: the operations at times 1, 2, …, 16 plotted on a time axis against positions A[1], …, A[n].)

Slide 8
Split a window of time into two halves, e.g. t = 9, …, 12 versus the operations after it. How can Mac (running the first half) help PC (run the second half)? It suffices to send the addresses and contents of the cells in W ∩ R, where W = cells written in the first half and R = cells read in the second half.
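As an aside on the upper-bound side: the O(lg n) bound that augmented binary search trees give for partial sums (Slide 3) can also be achieved with a Fenwick (binary indexed) tree. The sketch below is my own illustration, not from the tutorial; the class name and the trick of turning the "set A[k] = Δ" update into an additive update via a shadow array are my choices.

```python
class PartialSums:
    """Partial sums via a Fenwick tree: O(lg n) word probes per operation.
    Illustrative sketch; the tutorial itself credits augmented BSTs."""

    def __init__(self, n):
        self.n = n
        self.a = [0] * (n + 1)     # current A[k], so "set" becomes "add a difference"
        self.tree = [0] * (n + 1)  # Fenwick tree over A[1..n], 1-indexed

    def update(self, k, delta):
        """update(k, Δ): A[k] = Δ."""
        diff = delta - self.a[k]
        self.a[k] = delta
        while k <= self.n:         # climb: add the lowest set bit of k
            self.tree[k] += diff
            k += k & (-k)

    def query(self, k):
        """query(k): return A[1] + ... + A[k]."""
        s = 0
        while k > 0:               # descend: strip the lowest set bit of k
            s += self.tree[k]
            k -= k & (-k)
        return s
```

Both loops touch one tree cell per bit of k, hence O(lg n) probes each; the lower bound proved in these slides shows this is optimal.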
Slide 9
How much information needs to be transferred? (Figure: the query answers in the second half are prefix sums such as Δ1+Δ5+Δ3, then Δ1+Δ5+Δ3+Δ7+Δ2, then Δ1+Δ5+Δ3+Δ7+Δ2+Δ6+Δ4.) Beyond what it can compute itself, PC learns Δ5, Δ5 + Δ7, Δ5 + Δ6 + Δ7: entropy ≥ 3 words.

Slide 10: The General Principle
Message entropy ≥ w · #down arrows, where a down arrow marks a cell * written during the mauve (first) period and read during the beige (second) period. With k updates and k queries interleaved at random, E[#down arrows] = Σ_* Pr[* written during mauve, read during beige] = Ω(k), so Ω(k) memory cells must be transferred.

Slide 11: Putting It All Together
Build a balanced binary hierarchy over the time axis. The levels contribute Ω(n/2), 2 · Ω(n/4), 4 · Ω(n/8), …, so the total over Θ(lg n) levels is Ω(n lg n). Every read instruction is counted exactly once, at lowest_common_ancestor(write time, read time).

Slide 12
Q.E.D. The optimal solution for maintaining partial sums = binary search trees.

Slide 13
What were people trying before? [Fredman, Saks STOC'89]: Ω(lg n / lglg n). The hard instance:
  π = random permutation
  for t = 1 to n:
    Δ_t = random() mod 2^w
    update(π(t), Δ_t)
  query(random() mod n)
(Figure: updates at times 1, …, 15 on positions A[1], …, A[n], followed by a query.)

Slide 14
Build epochs of (lg n)^i updates; W_i = cells last written in epoch i. Claim: E[#cells read by the query from W_i] = Ω(1). Since there are Θ(lg n / lglg n) epochs, E[t_q] = Ω(lg n / lglg n).

Slide 15
Focus on some epoch i. Let x = E[#cells read by the query from W_i]. Generate (lg n)^i random queries.

Slide 16
(Figure: the epoch structure with the batch of random queries.)

Slide 17
Entropy = Ω(w · (lg n)^i) bits. Possible message: W_0 ∪ W_1 ∪ … ∪ W_{i-1} …
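The epoch construction of Slide 14 can be made concrete with a short sketch. This is my own illustration under stated assumptions: the function name `build_epochs`, the growth base chosen as ⌊lg n⌋, and the rounding at the oldest epoch are not from the tutorial, which only specifies that epoch i holds (lg n)^i updates, counted backwards from the query.

```python
import math

def build_epochs(n):
    """Partition update times n, n-1, ..., 1 (newest first, as seen from a
    query issued after time n) into epochs, where epoch i holds about
    (lg n)^i updates. Returns [(i, [times...]), ...], newest epoch first.
    Illustrative sketch of the Fredman-Saks epoch scheme."""
    b = max(2, int(math.log2(n)))   # epoch growth factor, ~ lg n (my choice of rounding)
    epochs, t, i = [], n, 1
    while t >= 1:
        size = min(b ** i, t)       # epoch i gets (lg n)^i updates, fewer at the oldest end
        epochs.append((i, list(range(t, t - size, -1))))
        t -= size
        i += 1
    return epochs
```

Because epoch sizes grow geometrically with base ~lg n, only Θ(lg n / lglg n) epochs cover all n updates; the claim on Slide 14 that the query must read Ω(1) cells from each W_i in expectation then yields E[t_q] = Ω(lg n / lglg n).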