Location: MSRI, Simons Auditorium
Modern data analysis and signal-processing problems are often attacked with iterative algorithms that estimate the values of thousands to millions of unknown parameters. Understanding the dynamics of such algorithms, and designing optimal ones, is an ongoing effort across computer science, applied mathematics, and related fields.
Within classical optimization theory, the problem instance is regarded as deterministic or adversarial. In contrast, for modern applications it often makes sense to assume that the data are random. A precise understanding of the interplay between dynamics and randomness is largely lacking.
I will discuss some examples in this area, and then focus on a specific class of low-complexity algorithms for which interesting scaling limits seem to arise.
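The abstract does not name a specific algorithm. As an illustrative sketch only: approximate message passing (AMP) for sparse linear regression, developed in the cited joint work with Donoho, Maleki, and Bayati, is one such low-complexity iterative algorithm whose high-dimensional dynamics admit an exact scaling limit (state evolution). The threshold rule and parameter choices below are common conventions, not details taken from the talk.

```python
import numpy as np

def soft_threshold(v, theta):
    # Soft-thresholding denoiser: eta(v; theta) = sign(v) * max(|v| - theta, 0).
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp(y, A, n_iter=30, alpha=1.5):
    """Sketch of AMP for recovering a sparse x from y = A x + w.

    A is n-by-N with (roughly) i.i.d. entries of variance 1/n; alpha is a
    heuristic threshold multiplier (an assumption of this sketch).
    """
    n, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(n_iter):
        # Denoiser input: current estimate plus matched-filtered residual.
        v = x + A.T @ z
        # Threshold proportional to the empirical residual noise level.
        theta = alpha * np.linalg.norm(z) / np.sqrt(n)
        x_new = soft_threshold(v, theta)
        # Onsager correction term: (1/n) * (number of active coordinates) * z.
        # This memory term is what distinguishes AMP from plain iterative
        # soft thresholding and is key to its exact scaling limit.
        onsager = (np.count_nonzero(x_new) / n) * z
        z = y - A @ x_new + onsager
        x = x_new
    return x
```

In a typical synthetic test (e.g. n = 250 measurements, N = 500 unknowns, a few dozen nonzero coefficients, Gaussian A), a few dozen iterations of this loop recover the sparse signal accurately; dropping the Onsager term degrades convergence noticeably.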
[Based on joint work with David Donoho, Arian Maleki, Mohsen Bayati, Marc Lelarge, Adel Javanmard, Yash Deshpande]