**Instructor:** Ahmed El Alaoui

**Email:** elalaoui at cornell dot edu

**Meeting time:** Tuesday-Thursday 2:45pm-4pm (virtual)

**Office hours:** Wednesday 3-5pm, or by appointment.

**Description:** We will be interested in the mathematics behind the problem of detecting and extracting a low-dimensional structure embedded in large amounts of noise.
We will consider a few instructive models and try to understand when this problem is feasible, first information-theoretically, and then algorithmically, i.e., by means of an efficient procedure.
We will see that these questions admit sharp characterizations, often reflected in phase transition phenomena in the high-dimensional limit.
Our main goal will be to understand the mathematical techniques allowing such precise characterizations and get a view of the outstanding open problems.
We will devote the first half of the course to an exposition of the foundations.
We will cover some research papers in the second half through presentations by students.
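As a concrete instance of this setup, the central model studied below is the rank-one spiked Wigner model which, in one common normalization (the normalization and the choice of signal prior here are illustrative assumptions), takes the form

$$ Y = \frac{\lambda}{n}\, x x^{\top} + W, $$

where $x \in \mathbb{R}^n$ is the planted signal (e.g., with i.i.d. $\pm 1$ entries), $W$ is a symmetric noise matrix with $\mathcal{N}(0, 1/n)$ entries above the diagonal, and $\lambda \ge 0$ is the signal-to-noise ratio. The questions above ask when the presence of $x$ can be detected from $Y$, and when $x$ can be estimated nontrivially.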

The format of this class is inspired by a class taught by Nike Sun at MIT.

A tentative list of topics includes:

Estimation and detection in the rank-one spiked Wigner model:

The replica-symmetric formula for the mutual information. The optimal overlap. The reconstruction threshold.

The interpolation method, and the role of side information.

Behavior of the likelihood ratio. The second moment method. The cavity method.

Contiguity and Gaussian fluctuations below the reconstruction threshold.

Computationally efficient procedures:

Spectral methods, the BBP transition.

Approximate message passing.

Linear spectral statistics for hypothesis testing.

Variational inference. Accuracy of the mean-field approximation.

Algorithmic lower bounds / evidence of computational hardness:

The low-degree likelihood ratio method.

Lower bounds via the overlap gap property.

Sampling from high-dimensional distributions:

Log-concave measures and accompanying theory.

Ising measures and the SK measure.

Stochastic block models and reconstruction on trees.

Reconstruction on general graphs and information-percolation bounds.

...
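The BBP transition listed above can be seen in a few lines of simulation. The sketch below (an illustration, not course code; the normalization and the Rademacher spike are assumptions) samples $Y = (\lambda/n)\,xx^\top + W$ and compares the top eigenvalue and the squared overlap of the top eigenvector with the spike against the classical predictions $\lambda + 1/\lambda$ and $1 - 1/\lambda^2$ for $\lambda > 1$, versus $2$ and $0$ for $\lambda \le 1$.

```python
import numpy as np

# Rank-one spiked Wigner model: Y = (lam/n) x x^T + W, with W symmetric
# Gaussian noise scaled so that its bulk spectrum fills [-2, 2].
# BBP prediction: the top eigenvalue separates from the bulk iff lam > 1.

rng = np.random.default_rng(0)

def spiked_wigner(n, lam, rng):
    """Sample (Y, x) with x a Rademacher spike, ||x||^2 = n."""
    x = rng.choice([-1.0, 1.0], size=n)
    G = rng.standard_normal((n, n))
    W = (G + G.T) / np.sqrt(2 * n)      # symmetric, off-diagonal variance 1/n
    return (lam / n) * np.outer(x, x) + W, x

n = 1200
results = {}
for lam in [0.5, 3.0]:
    Y, x = spiked_wigner(n, lam, rng)
    evals, evecs = np.linalg.eigh(Y)
    v = evecs[:, -1]                    # top (unit-norm) eigenvector
    # Top eigenvalue -> lam + 1/lam if lam > 1, else -> 2 (bulk edge);
    # squared overlap (v.x)^2 / n -> max(0, 1 - 1/lam^2).
    results[lam] = (evals[-1], (v @ x) ** 2 / n)
    print(lam, results[lam])
```

For $\lambda = 3$ the top eigenvalue concentrates near $3 + 1/3$ with overlap near $8/9$, while for $\lambda = 0.5$ it sticks to the bulk edge $2$ and the eigenvector carries essentially no information about the spike.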

**Evaluation:** Students will be asked to work on a small project in groups of at most two, give a presentation in one of the lectures, and turn in a graded homework.
A successful project may consist in understanding and distilling the main ideas behind a particular result, applying a known method to a new problem, or attempting to push the limits of a particular technique.

**Prerequisites:** A graduate or advanced undergraduate level course in probability and real analysis is highly recommended.

Notes in TeX (so far).

Lecture 1 (02/09): Introduction. The additive Gaussian noise channel. Posterior measure. I-MMSE relation. Notes. Video.

Lecture 2 (02/11): Capacity of the Gaussian channel. The Gaussian mean location problem (a.k.a. the needle-in-a-haystack problem). Notes. Video.

Lecture 3 (02/16): The rank-one spiked Wigner model. The replica symmetric formula for the free energy. Notes. Video.

Lecture 5 (02/23): Proof of the RS formula II. Guerra's interpolation method + constrained overlap. Notes. Video.

03/09: No class (Wellness day).

Lecture 13 (03/25): Binary hypothesis testing. Likelihood ratio, contiguity and second moment method. Notes. Video.

Lecture 14 (03/30): Hypothesis testing cont'd. Spike detection in the spiked Wigner model. Notes. Video.

April 13: No class (ITW conference).

Student presentations:

Lecture 19 (04/20): “Optimality and suboptimality of PCA in spiked random matrix models” by Samriddha Lahiry.

Lecture 20 (04/22): “Asymptotic normality and analysis of variance of log-likelihood ratios in spiked random matrix models” by Ritwik Sadhu.

Lecture 21 (04/27): “Information-theoretic bounds and phase transitions in clustering, sparse PCA, and submatrix localization” by Xumei Xi.

Lecture 22 (04/29): “Computational hardness of certifying bounds on constrained PCA” by Liwei Jiang.

Lecture 23 (05/04): “The lasso risk for Gaussian matrices” by Lijun Ding.

Lecture 24 (05/06): “Average-case integrality gap for non-negative principal component analysis” by Yandi Shen.

You can use the following list to pick a project and a subject for your talk.

**The rank-one spiked Wigner model - estimation:**

Léo Miolane's PhD thesis.

*Fundamental limits of inference: A statistical physics approach*, 2019.

M. Lelarge, L. Miolane.

*Fundamental limits of symmetric low rank matrix estimation*, 2016.

Y. Deshpande, E. Abbe, A. Montanari.

*Asymptotic mutual information for the binary stochastic block model*, 2015.

J. Barbier, N. Macris.

*The adaptive interpolation method: a simple scheme to prove replica formulas in Bayesian inference*, 2017.

**The rank-one spiked Wigner model - detection:**

A. Perry, A. S. Wein, A. S. Bandeira, and A. Moitra.

*Optimality and sub-optimality of PCA in spiked random matrix models*, 2018.

A. Montanari, D. Reichman, and O. Zeitouni.

*On the limitation of spectral methods: from the Gaussian hidden clique problem to rank one perturbations of Gaussian tensors*, 2014.

D. Banerjee, Z. Ma.

*Asymptotic normality and analysis of variance of log-likelihood ratios in spiked random matrix models*, 2018.

J. Banks, C. Moore, N. Verzelen, R. Vershynin, J. Xu.

*Information-theoretic bounds and phase transitions in clustering, sparse PCA, and submatrix localization*, 2017.

A. El Alaoui, F. Krzakala, M. Jordan.

*Fundamental limits of detection in the spiked Wigner model*, 2018.

H. W. Chung, J.O. Lee.

*Weak detection of signal in the spiked Wigner model*, 2018.

**Behavior of eigenvalues and eigenvectors:**

F. Benaych-Georges, R. R. Nadakuditi.

*The eigenvalues and eigenvectors of finite, low rank perturbations of large random matrices*, 2010.

G. Anderson, A. Guionnet, O. Zeitouni.

*An introduction to random matrices*, 2010 (textbook).

**Approximate message passing:**

E. Bolthausen.

*An iterative construction of solutions of the TAP equations for the Sherrington-Kirkpatrick model*, 2012.

M. Bayati, A. Montanari.

*The dynamics of message passing on dense graphs, with applications to compressed sensing*, 2010.

Y. Deshpande, A. Montanari.

*Information-theoretically optimal sparse PCA*, 2014.

A. Javanmard, A. Montanari.

*State evolution for general approximate message passing algorithms, with applications to spatial coupling*, 2012.

**Pinning, simple decompositions of measures, and accuracy of the mean-field approximation:**

A. Montanari.

*Estimating random variables from random sparse observations*, 2007.

R. Eldan.

*Taming correlations through entropy-efficient measure decompositions with applications to mean-field approximation*, 2018.

V. Jain, F. Koehler, A. Risteski.

*Mean-field approximation, convex hierarchies, and the optimality of correlation rounding: a unified perspective*, 2018.

Z. Fan, S. Mei, A. Montanari.

*TAP free energy, spin glasses, and variational inference*, 2018.

**Sampling from log-concave measures:**

A. Dalalyan.

*Theoretical guarantees for approximate sampling from smooth and log-concave densities*, 2014.

A. Dalalyan.

*Further and stronger analogy between sampling and optimization: Langevin Monte Carlo and gradient descent*, 2017.

A. Durmus, E. Moulines.

*High-dimensional Bayesian inference via the unadjusted Langevin algorithm*, 2016.

X. Cheng, N. Chatterji, P. Bartlett, M. Jordan.

*Underdamped Langevin MCMC: A non-asymptotic analysis*, 2018.

S. Bubeck, R. Eldan, J. Lehec.

*Sampling from a log-concave distribution with projected Langevin Monte Carlo*, 2016.

**Sampling from the SK measure:**

R. Bauerschmidt, T. Bodineau.

*A very simple proof of the LSI for high temperature spin systems*, 2018.

R. Eldan, F. Koehler, O. Zeitouni.

*A spectral condition for spectral gap: fast mixing in high-temperature Ising models*, 2020.

**Evidence of hardness: the low-degree likelihood ratio method and overlap gaps:**

D. Kunisky, A. S. Wein, A. S. Bandeira.

*Notes on computational hardness of hypothesis testing: predictions using the low-degree likelihood ratio*, 2019.

T. Schramm, A. S. Wein.

*Computational barriers to estimation from low-degree polynomials*, 2020.

D. Gamarnik, A. Jagannath.

*The Overlap Gap Property and Approximate Message Passing Algorithms for p-spin models*, 2019.

D. Gamarnik, I. Zadik.

*High-dimensional regression with binary coefficients. Estimating squared error and a phase transition*, 2017.

A. S. Bandeira, D. Kunisky, A. S. Wein.

*Computational hardness of certifying bounds on constrained PCA problems*, 2019.

A. S. Bandeira, D. Kunisky, A. S. Wein.

*Average-case integrality gap for non-negative principal component analysis*, 2020.

G. Ben Arous, A. S. Wein, I. Zadik.

*Free energy wells and overlap gap property in sparse PCA*, 2020.

**The stochastic block model and broadcasting / reconstruction on trees:**

W. Evans, C. Kenyon, Y. Peres, and L. J. Schulman.

*Broadcasting on trees and the Ising model*, 2000.

M. Mézard, A. Montanari.

*Reconstruction on trees and spin glass transition*, 2006.

A. Sly.

*Reconstruction for the Potts model*, 2011.

E. Mossel, J. Neeman, A. Sly.

*Reconstruction and estimation in the planted partition model*, 2012. (Lower bound)

E. Mossel, J. Neeman, A. Sly.

*A proof of the block model threshold conjecture*, 2013. (Upper bound)

E. Mossel, J. Neeman, A. Sly.

*Belief propagation, robust reconstruction and optimal recovery of block models*, 2013. (Optimal overlap)

L. Massoulié.

*Community detection thresholds and the weak Ramanujan property*, 2013.

J. Banks, C. Moore, J. Neeman, P. Netrapalli.

*Information-theoretic thresholds for community detection in sparse networks*, 2016.

**Beyond locally tree-like graphs, and information-percolation bounds:**

E. Abbé, L. Massoulié, A. Montanari, A. Sly, N. Srivastava.

*Group synchronization on grids*, 2017.

E. Abbé, E. Boix.

*An information-percolation bound for spin synchronization on general graphs*, 2018.

Y. Polyanskiy and Y. Wu.

*Application of information-percolation method to reconstruction problems on graphs*, 2018.

A. El Alaoui, A. Montanari.

*On the computational tractability of statistical estimation on amenable graphs*, 2019.