Taming nonconvexity: from smooth to nonsmooth problems and beyond.

February 11, 2019 - 11:15am to 12:15pm


Ju Sun
Keller 3-180
Dan Boley

ABSTRACT: Most applied problems we encounter can be naturally formulated as nonconvex optimization problems, for which even obtaining a local minimizer is computationally hard in theory, never mind the global minimizer. In practice, however, simple numerical methods often work surprisingly well in finding high-quality solutions, e.g., when training deep neural networks.

In this talk, I will describe our recent efforts to bridge this mysterious theory-practice gap for nonconvex optimization, in the context of solving practical problems in signal processing, machine learning, and scientific imaging. 1) I will highlight a family of smooth nonconvex problems that can be solved to global optimality using simple numerical methods, independent of initialization. 2) This discovery, however, does not cover nonsmooth functions, which are frequently used to encode structured objects (e.g., sparsity) or achieve robustness. I will introduce tools from nonsmooth analysis and demonstrate how nonsmooth, nonconvex problems can also be analyzed and solved in a provable manner. 3) Toward the end, I will provide examples showing how innovative problem formulation and physical design can help to tame nonconvexity.

BIO: Ju Sun is a postdoctoral scholar at Stanford University, working with Professor Emmanuel Candès. Prior to this, he received his Ph.D. degree from the Department of Electrical Engineering at Columbia University in 2016 (2011--2016) and his B.Eng. degree in Computer Engineering (with a minor in Mathematics) from the National University of Singapore in 2008 (2004--2008). His research interests span computer vision, machine learning, numerical optimization, signal/image processing, and high-dimensional data analysis. Recently, he has been particularly fascinated by why simple numerical methods often solve nonconvex problems surprisingly well (on which he maintains a bibliographic webpage: http://sunju.org/research/nonconvex/ ) and by its implications for representation learning. He won the best student paper award at SPARS'15 and received an honorable mention for his doctoral thesis at the New World Mathematics Awards (NWMA) 2017.