ELE538B: Sparsity, Structure and Inference

Yuxin Chen, Princeton University, Spring 2017

The term project can be either a literature review or an original research project:

  • Literature review. We will provide a list of related papers not covered in the lectures, and the literature review should involve in-depth summaries and exposition of one of these papers.

  • Original research. It can be either theoretical or experimental (ideally a mix of the two), with approval from the instructor. If you choose this option, you may work either individually or in a group of two. You are encouraged to connect your term project with your current research.

There are three milestones / deliverables to help you through the process.

  1. Proposal (due Mar. 17). Submit a short report (no more than 1 page) stating the papers you plan to survey or the research problems you plan to work on. Describe why they are important or interesting, and provide appropriate references. If you elect to do original research, please do not propose an overly ambitious project that cannot be completed by the end of the semester, and do not be lured by excessive generality. Focus on the simplest scenario that captures the issues you’d like to address.

  2. In-class presentation. Prepare an oral presentation with slides (the exact time will depend on the number of projects in the class). Focus on high-level ideas, and leave most technical details to your report.

  3. A written report (due May 14). You are expected to submit a final project report—up to 4 pages with an unlimited appendix—summarizing your findings / contributions. Please turn in a hard copy of your report to my office (you may slip it under the door) and email an electronic copy to me for our records.

A few suggested (theoretical) papers for literature review

  1. ‘‘Universality laws for randomized dimension reduction, with applications,’’ S. Oymak and J. A. Tropp, arXiv preprint arXiv:1511.09433, 2015.

  2. ‘‘Phase transitions in semidefinite relaxations,’’ A. Javanmard, A. Montanari, and F. Ricci-Tersenghi, Proceedings of the National Academy of Sciences, 2016.

  3. ‘‘The convex geometry of linear inverse problems,’’ V. Chandrasekaran, B. Recht, P. Parrilo, and A. Willsky, Foundations of Computational Mathematics, 2012.

  4. ‘‘Super-resolution of positive sources: the discrete setup,’’ V. Morgenshtern and E. J. Candes, SIAM Journal on Imaging Sciences, 2015.

  5. ‘‘Construction of a large class of deterministic sensing matrices that satisfy a statistical isometry property,’’ R. Calderbank, S. Howard, and S. Jafarpour, IEEE Journal of Selected Topics in Signal Processing, 2010.

  6. ‘‘One-bit compressed sensing by linear programming,’’ Y. Plan and R. Vershynin, Communications on Pure and Applied Mathematics, 2013.

  7. ‘‘The Dantzig selector: Statistical estimation when p is much larger than n,’’ E. Candes and T. Tao, The Annals of Statistics, 2007.

  8. ‘‘Square Deal: Lower Bounds and Improved Relaxations for Tensor Recovery,’’ C. Mu, B. Huang, J. Wright, D. Goldfarb, International Conference on Machine Learning, 2014.

  9. ‘‘On the Optimization Landscape of Tensor Decompositions,’’ R. Ge and T. Ma, 2016.

  10. ‘‘SLOPE is adaptive to unknown sparsity and asymptotically minimax,’’ W. Su and E. Candes, The Annals of Statistics, 2016.

  11. ‘‘Spectral methods meet EM: A provably optimal algorithm for crowdsourcing,’’ Y. Zhang, X. Chen, D. Zhou, and M. Jordan, Advances in Neural Information Processing Systems, 2014.

  12. ‘‘Budget-optimal crowdsourcing using low-rank matrix approximations,’’ D. Karger, S. Oh, and D. Shah, Allerton Conference on Communication, Control, and Computing, 2011.

  13. ‘‘Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima,’’ P. Loh and M. Wainwright, Advances in Neural Information Processing Systems, 2013.

  14. ‘‘Phase transitions of spectral initialization for high-dimensional nonconvex estimation,’’ Y. Lu and G. Li, arXiv preprint arXiv:1702.06435, 2017.

  15. ‘‘The landscape of empirical risk for non-convex losses,’’ S. Mei, Y. Bai, and A. Montanari, arXiv preprint arXiv:1607.06534, 2016.

You are also free to select a paper of your own interest (especially a more practical paper), subject to the instructor's approval.