### FOCS 10 Tutorials: Structures in One's Mind

I missed Ketan's tutorial, but managed to attend the tutorials of Mihai and Tim, both gems.

Mihai Patrascu cast lower bounds for data structures in the cell probe model (and hence for general-purpose computation) in the context of P vs NP. He started with the simple \Omega(log n) lower bound on \max(t_{update}, t_{query}) for the dynamic partial sums problem, showing the whole proof via communication complexity, with a picture and a succinctly stated hard case. He then discussed extensions to the marked ancestors problem and, eventually, dynamic connectivity, which leads to non-deterministic communication complexity problems. The hard instances are now three-stage problems, and he showed a connection to the 3-SUM problem; the lower bound in this case is of the form \Omega(n^{1/\eps}). In the second half, he addressed static data structure problems, starting with asymmetric communication complexity. In particular, he defined the Lopsided Set Disjointness (LSD) problem and showed how it yields lower bounds for the partial match problem, 2d stabbing queries, range median queries, and so on. There were cute connections, like 2d stabbing to butterfly graph reachability, and static 2d stabbing to 1d persistent queries. As the talk's end neared, one sensed there was a huge dependency graph in Mihai's mind connecting the lower bounds of these problems together!
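As an aside (not from the tutorial itself): the \Omega(log n) bound for dynamic partial sums is tight, since the classic Fenwick (binary indexed) tree supports both operations in O(log n) word operations. A minimal sketch:

```python
class Fenwick:
    """Fenwick (binary indexed) tree for dynamic partial sums:
    point updates and prefix-sum queries, each touching O(log n) cells,
    matching the Omega(log n) cell-probe lower bound."""

    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)  # 1-indexed internal array

    def update(self, i, delta):
        """Add delta to a[i] (0-indexed)."""
        i += 1
        while i <= self.n:
            self.tree[i] += delta
            i += i & -i  # jump to the next node responsible for i

    def prefix_sum(self, i):
        """Return a[0] + ... + a[i] (0-indexed)."""
        i += 1
        s = 0
        while i > 0:
            s += self.tree[i]
            i -= i & -i  # strip the lowest set bit
        return s
```

For example, after `update(2, 5)` and `update(5, 3)` on a size-8 tree, `prefix_sum(4)` returns 5 and `prefix_sum(7)` returns 8.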

Tim Roughgarden chose to talk about algorithmic mechanism design. He started with truthful mechanisms and then presented his philosophy: designing truthful mechanisms is like working in a restricted computational model. He used the example of single-parameter problems to show that instead of thinking about truthfulness (which may be a new concept to grapple with for theory CS folks), one can equivalently think about mechanisms with monotone allocation (a more familiar concept for TCS). I thought this was a great message, phrasing this classical result as a transformation in thinking from economics to algorithms. His second example was revenue maximization for a k-item auction with n bidders. The Bayesian view is standard and interesting in itself, but TCS folks focus on worst-case equilibria; the challenge is to set up a benchmark. Instead of comparing against per-instance valuations, or max_i v_i, or a distribution F, he proposed the revenue benchmark max_{i\leq k} i v_i (there are some details). After the break, Tim spoke about multi-parameter problems and focused on combinatorial auctions. He mentioned Feige's STOC 06 result that gives a 2-approximation for optimizing total welfare under monotone subadditive valuations, but no good truthful approximation is known; also, very little is known about revenue optimization. Tim described the results known for combinatorial auctions, including the \sqrt{m}-approximation on welfare (run VCG on a limited set of outcomes: give all goods to one player, or one good to each player) and its tightness from DN STOC 07. He mentioned recent results on black-box reductions, connections between smoothed analysis and mechanism design, and so on. Q: David Johnson asked what is known about non-subadditive cases like spectrum auctions, e.g., with complements. Sanjeev Arora asked how economists respond to failures of VCG or to TCS insights. Yevgeniy Dodis asked about collusion and about differentially private mechanisms as alternate solutions.
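To make the "monotone allocation is all you need" message concrete, here is a small sketch (my own illustration, not from Tim's slides) of Myerson's lemma in a single-parameter setting: with a monotone allocation rule, charging each winner its critical value — the bid below which it would lose — yields a truthful mechanism. The example is a k-unit auction with unit-demand bidders:

```python
def k_unit_auction(bids, k):
    """Sell k identical items: the top-k bids win (a monotone
    allocation), and each winner pays the critical value, i.e. the
    highest losing bid. By Myerson's lemma this mechanism is truthful.
    Returns (list of winning bidder indices, dict of payments)."""
    order = sorted(range(len(bids)), key=lambda i: bids[i], reverse=True)
    winners = order[:k]
    # Critical value: the (k+1)-st highest bid, or 0 if fewer bidders.
    critical = bids[order[k]] if len(bids) > k else 0.0
    payments = {i: critical for i in winners}
    return winners, payments
```

For instance, with bids [10, 7, 5, 2] and k = 2, bidders 0 and 1 win and each pays 5 — no winner can lower its payment by shading its bid, and no loser wants to win at a price above its value.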
There were other questions about repeated games, richer interactions and models, etc. Tim is a powerful speaker with a richer working vocabulary than most TCS speakers; he distilled the maze of problems and metrics in this area into a few nuggets for TCS researchers. A curiosity: Tim hardly mentioned "Nash" at all (maybe just once, during the Q&A).

Both tutorials had something in common: one sensed that in each case the speaker had a vast structural zoo in his mind relating the various problems, and was expertly revealing only parts of it to the audience, the way a good tutorial should.

