Algorithms Seminar
Sparse Newton's Method
Speaker: Yu Cheng

Date: Thursday, March 23, 2017
Time: 11:30am – 12:30pm
Location: D344 LSRC, Duke

Note Time Change 


Abstract
Newton's method provides a powerful framework for designing efficient and parallel algorithms. However, when applied to large-scale matrices, the intermediate matrices quickly become dense, even when the input matrix is sparse. In this talk, I will present an approach that uses spectral sparsification to keep all intermediate matrices sparse while preserving the effectiveness of Newton's method. As building blocks and applications of our sparse Newton's method, we give the first nearly-linear time algorithms for problems in machine learning, numerical analysis, and spectral graph theory: (1) sampling Gaussian graphical models with symmetric diagonally dominant precision matrices, (2) solving the matrix roots problem for graph Laplacians, and (3) spectral sparsification of random-walk matrix polynomials.
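The fill-in phenomenon the abstract refers to can be seen in a few lines of NumPy. The sketch below (an illustration, not the speaker's algorithm) runs the classical Newton–Schulz iteration X ← X(2I − AX), which converges to A⁻¹, on a sparse tridiagonal matrix: the input A has only a few nonzeros per row, but the iterates X quickly become dense. The matrix, iteration count, and density threshold are all assumed for illustration.

```python
import numpy as np

n = 64
# Sparse tridiagonal SPD matrix (a path-graph Laplacian shifted by I):
# 3 on the diagonal, -1 on the two off-diagonals.
A = 3 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

# Standard safe starting point for Newton-Schulz: X0 = A^T / (||A||_1 ||A||_inf),
# which guarantees the spectral radius of (I - A X0) is below 1.
X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))

def density(M, tol=1e-12):
    """Fraction of entries with magnitude above tol."""
    return np.count_nonzero(np.abs(M) > tol) / M.size

for _ in range(30):
    # Newton-Schulz update: quadratic convergence to A^{-1},
    # but each step multiplies matrices and fills in zeros.
    X = X @ (2 * np.eye(n) - A @ X)

print("residual ||I - A X||_F:", np.linalg.norm(np.eye(n) - A @ X))
print("density of A:", density(A))   # sparse input
print("density of X:", density(X))   # dense iterate
```

The iterate converges to A⁻¹ to machine precision, yet its density is an order of magnitude higher than the input's; keeping such iterates sparse without losing convergence is exactly where spectral sparsification enters.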
This is based on joint work with Dehua Cheng, Yan Liu, Richard Peng, and Shang-Hua Teng.
Biography
Yu Cheng is a final-year Ph.D. student in the Computer Science Department at the University of Southern California (USC), advised by Shang-Hua Teng. He is interested in theoretical computer science, focusing on algorithmic game theory and spectral graph theory.