Algorithms Seminar

Matrix Completion has No Spurious Local Minimum

Speaker: Rong Ge
Date: Thursday, September 22, 2016
Time: 12:00pm - 1:00pm
Location: North 311, Duke

Abstract

Matrix completion is a basic machine learning problem with wide applications, especially in collaborative filtering and recommender systems. Simple non-convex optimization algorithms are popular and effective in practice. Despite recent progress in proving that various non-convex algorithms converge from a good initial point, it remains unclear why random or arbitrary initialization suffices in practice. We prove that the commonly used non-convex objective function for matrix completion has no spurious local minima: all local minima must also be global. Therefore, many popular optimization algorithms such as (stochastic) gradient descent can provably solve matrix completion with arbitrary initialization in polynomial time. This is joint work with Jason D. Lee and Tengyu Ma.
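
For readers unfamiliar with the setup, the objective in question is the natural least-squares factorization: given a low-rank matrix M observed only on a subset of entries Omega, minimize the squared error of X X^T against M over those entries. The sketch below (a minimal NumPy illustration, not code from the talk; dimensions, observation probability, step size, and iteration count are illustrative choices, and the regularization term used in the paper's analysis is omitted) runs plain gradient descent on this objective from a random initialization.

    # Minimal sketch of symmetric matrix completion via gradient descent
    # from a random start; all constants below are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    d, r, p = 50, 3, 0.3                       # size, rank, observation probability

    X_true = rng.standard_normal((d, r)) / np.sqrt(d)
    M = X_true @ X_true.T                      # ground-truth low-rank matrix
    upper = np.triu(rng.random((d, d)) < p)    # sample entries above the diagonal
    mask = upper | upper.T                     # symmetric observation pattern (Omega)

    X = rng.standard_normal((d, r)) / np.sqrt(d)   # random initialization
    lr, steps = 0.02, 3000
    for _ in range(steps):
        R = mask * (X @ X.T - M)               # residual on observed entries only
        grad = 2 * (R + R.T) @ X               # gradient of sum_Omega ((XX^T)_ij - M_ij)^2
        X -= lr * grad

    err = np.linalg.norm(X @ X.T - M) / np.linalg.norm(M)
    print(f"relative recovery error: {err:.2e}")

The point of the talk's result is that runs like this one do not need a careful spectral initialization: because the objective has no spurious local minima, gradient descent from an arbitrary starting point can still reach a global minimum.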

Biography

Rong Ge is an assistant professor at Duke University. He obtained his Ph.D. at Princeton University, advised by Sanjeev Arora, and did a postdoc at Microsoft Research New England before coming to Duke. He is interested in applying techniques from theoretical computer science to problems in machine learning, with the hope of developing new algorithms with provable guarantees.