Simple statistical tasks can be hard on average.
Consider the problem of recovering a rank-one tensor of order k that has been corrupted by Gaussian noise. We will begin by reviewing results on the statistical limits of maximum likelihood estimation for this problem and discuss a geometric analogue of the well-known BBP phase transition from the matrix setting. We then turn to recent analyses of this problem from an optimization perspective. While the threshold for estimation occurs at a finite signal-to-noise ratio, it is expected that a polynomially diverging signal-to-noise ratio is needed to estimate efficiently. We present a recent study of the thresholds for efficient recovery by a simple family of algorithms, Langevin dynamics and gradient descent, to better understand the mechanism behind this diverging statistical-to-computational gap. This is based on recent joint works with Ben Arous and Gheissari on the algorithmic threshold, and with Lopatto and Miolane on the statistical threshold.
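As a rough illustration of the setup (a sketch only: the dimension, the noise normalization, the step size, and the very large signal-to-noise ratio are my own choices, not taken from the works above), one can simulate the order-3 spiked tensor model Y = lam * v^(⊗3) + noise and run projected gradient ascent on the sphere for the maximum likelihood objective f(x) = ⟨Y, x⊗x⊗x⟩:

```python
import numpy as np

# Spiked tensor model of order 3: Y = lam * v⊗v⊗v + W/sqrt(n), with W iid standard
# Gaussian. The normalization of W and the value of lam are assumptions for this
# sketch; lam is set far above any recovery threshold so that gradient ascent from
# a random start succeeds.
rng = np.random.default_rng(0)
n, lam = 30, 100.0

v = rng.standard_normal(n)
v /= np.linalg.norm(v)                      # planted unit spike
W = rng.standard_normal((n, n, n))
Y = lam * np.einsum('i,j,k->ijk', v, v, v) + W / np.sqrt(n)

x = rng.standard_normal(n)
x /= np.linalg.norm(x)                      # random initialization on the sphere
for _ in range(200):
    grad = 3 * np.einsum('ijk,j,k->i', Y, x, x)  # gradient of <Y, x⊗x⊗x>
    x = x + 0.1 * grad                      # ascent step
    x /= np.linalg.norm(x)                  # project back onto the sphere

corr = abs(np.dot(x, v))
print(corr)  # overlap with the planted spike; close to 1 in this strong-signal regime
```

At realistic signal-to-noise ratios the landscape is riddled with spurious critical points, which is precisely the mechanism behind the statistical-to-computational gap discussed above; this toy run avoids that regime by taking lam very large.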