Course Syllabus

For many real-world computational tasks there is a significant gap between our theoretical and practical understanding. While the theory of NP-completeness and worst-case analysis tells us that many interesting computational problems are NP-hard in the worst case, practitioners in areas like machine learning and computer vision have made significant progress in solving such theoretically hard problems. Bridging this gap is a central challenge for modern theoretical computer science.

In this course, we will systematically study alternatives to worst-case analysis that nevertheless enable rigorous and robust guarantees on algorithm performance. The course will explore recent attempts to bridge this disconnect between theory and practice using well-motivated non-worst-case approaches, and will also address some of the open research directions along these lines.

Some of the topics that will be covered include:

1. Average-case analysis and planted models for NP-hard optimization problems (a small illustrative sketch follows this list)
2. Semi-random models and robust probabilistic models
3. Smoothed analysis
4. Implicit structural assumptions for clustering and partitioning
5. Sparse recovery problems and incoherence assumptions
6. Success of iterative algorithms and heuristics
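
As a concrete illustration of the first topic, here is a minimal sketch in Python (not part of the course materials; function and parameter names are illustrative only) of the classic planted clique model: a clique on k vertices is planted inside an Erdős–Rényi graph G(n, 1/2), and when k is sufficiently larger than sqrt(n log n), even simple degree counting typically recovers most of the planted vertices.

```python
import numpy as np

def planted_clique_graph(n, k, p=0.5, seed=None):
    """Sample G(n, p) and plant a clique on k uniformly random vertices.

    Returns the symmetric 0/1 adjacency matrix and the planted vertex set.
    (Illustrative sketch; names are not taken from any specific paper.)
    """
    rng = np.random.default_rng(seed)
    # Background graph: each pair {i, j} is an edge independently with probability p.
    upper = np.triu((rng.random((n, n)) < p).astype(int), 1)
    A = upper + upper.T
    # Plant: choose k vertices and connect every pair among them.
    clique = rng.choice(n, size=k, replace=False)
    A[np.ix_(clique, clique)] = 1
    np.fill_diagonal(A, 0)
    return A, set(clique.tolist())

# Degree heuristic: planted vertices gain roughly (1 - p) * k extra edges, so when
# k is somewhat above sqrt(n log n) they tend to be the k highest-degree vertices.
A, planted = planted_clique_graph(n=2000, k=200, seed=0)
guess = set(np.argsort(A.sum(axis=1))[-200:].tolist())
print(f"recovered {len(guess & planted)} of {len(planted)} planted vertices")
```

More sophisticated spectral and semidefinite-programming methods recover much smaller planted cliques, but even this degree-count sketch shows why such average-case instances can be far easier than worst-case ones.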

 

Suggestions for student paper presentations and projects: 

Smoothed Analysis:

  1. Tensor decompositions: https://arxiv.org/abs/1311.3651
  2. Local Max-Cut: https://arxiv.org/abs/1610.04807

 

Average-case analysis:

  1. Semi-random graph partitioning: https://arxiv.org/abs/1205.2234 or https://arxiv.org/abs/1406.5665
  2. Semi-random correlation clustering: https://arxiv.org/abs/1406.5667
  3. Semi-random independent set / clique: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.35.6813

 

Average-case clustering:

  1. Method of moments for learning mixtures of Gaussians: https://arxiv.org/abs/1004.4223
  2. Isotropic PCA: https://arxiv.org/abs/0804.3575

 

Matrix Completion: https://arxiv.org/abs/1011.1518

Subspace clustering: https://statweb.stanford.edu/~candes/papers/RobustSubspaceClustering.pdf

Dictionary Learning: https://arxiv.org/pdf/1308.6273.pdf and https://arxiv.org/abs/1503.00778

 

High-dimensional robust estimators: https://arxiv.org/abs/1604.06443

Learning with untrusted data: https://arxiv.org/abs/1611.02315

You are also welcome to pick a different paper not in this list, after discussion with the instructor. 

 
