Selected Publications of ADSI

2017 Surveys

The Many Faces of Degeneracy in Conic Optimization.
Dmitriy Drusvyatskiy, Henry Wolkowicz.
Foundations and Trends in Optimization, Vol. 3, No. 2, pp. 77–170, 2017.
This work surveys the roles of Slater's condition and the facial reduction technique in conic optimization.
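For orientation, the setting in question is the standard conic program together with Slater's condition (this is the textbook formulation, not text from the survey):

```latex
% Primal conic program over a closed convex cone K:
\[
\min_{x}\; \langle c, x \rangle
\quad \text{subject to} \quad Ax = b,\;\; x \in K.
\]
% Slater's condition (strict feasibility): there exists \bar{x} with
% A\bar{x} = b and \bar{x} \in \operatorname{ri}(K).
% When it fails, facial reduction replaces K by the minimal face of K
% containing the feasible region, after which strict feasibility holds.
```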

The proximal point method revisited.
Dmitriy Drusvyatskiy.
Survey submitted to SIAG/OPT Views and News.
This short survey revisits the role of the proximal point method in large-scale optimization, focusing on three recent examples: a proximally guided stochastic subgradient method, the prox-linear algorithm, and Catalyst generic acceleration.
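To fix ideas, the method iterates \(x_{k+1} = \operatorname{argmin}_y \{ f(y) + \tfrac{1}{2\lambda}\|y - x_k\|^2 \}\). A minimal sketch, with \(f(x) = |x|\) chosen purely because its proximal map (soft-thresholding) is available in closed form:

```python
import numpy as np

def prox_abs(v, lam):
    """Proximal map of f(x) = |x|: argmin_y |y| + (1/(2*lam)) * (y - v)**2.
    This is the soft-thresholding operator."""
    return np.sign(v) * max(abs(v) - lam, 0.0)

x, lam = 5.0, 1.0
for k in range(8):
    x = prox_abs(x, lam)   # one proximal point step
    print(f"iter {k}: x = {x:.1f}")
# x moves toward the minimizer 0 by lam per step, then stays there
```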

2017 Papers

Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice.
Hongzhou Lin, Julien Mairal, Zaid Harchaoui.
arXiv abs/1712.05654.
The paper presents a comprehensive treatment, in theory and in practice, of Catalyst, a generic scheme for accelerating gradient-based convex optimization methods.
Catalyst applies to a large class of optimization algorithms, including gradient descent, block coordinate descent, incremental algorithms such as SAG, SAGA, SDCA, SVRG, Finito/MISO, and their proximal variants.
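A schematic of the outer loop, offered as a minimal sketch only: the inner solver (plain gradient descent), the fixed extrapolation parameter beta, and all numerical settings below are illustrative stand-ins, not the paper's tuned choices.

```python
import numpy as np

def catalyst(grad_f, x0, kappa=1.0, beta=0.5, outer=20, inner=50, lr=0.01):
    """Catalyst-style acceleration: inexact proximal point + extrapolation.
    grad_f is a gradient oracle for the smooth convex objective f."""
    x_prev, y = x0.copy(), x0.copy()
    for _ in range(outer):
        # Approximately minimize f(x) + (kappa/2)*||x - y||^2; gradient
        # descent here stands in for SVRG, SAGA, MISO, etc.
        x = y.copy()
        for _ in range(inner):
            x -= lr * (grad_f(x) + kappa * (x - y))
        # Extrapolation step on the prox-center.
        y = x + beta * (x - x_prev)
        x_prev = x
    return x

# Toy usage: minimize f(x) = 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(20, 5)), rng.normal(size=20)
x = catalyst(lambda x: A.T @ (A @ x - b), np.zeros(5))
```

In the actual scheme, beta is computed from the problem's (strong) convexity parameters rather than held fixed.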

Invariances and Data Augmentation for Supervised Music Transcription.
John Thickstun, Zaid Harchaoui, Dean Foster, Sham M. Kakade.
arXiv abs/1711.04845.
This paper presents the winning entry to the 2017 MIREX Multiple Fundamental Frequency Estimation evaluation. The method is based on a translation-invariant convolutional architecture.
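The effect of translation invariance can be seen in a toy example (an illustration of the principle only, not the paper's architecture): a convolutional response followed by a global max-pool is unchanged under circular shifts of the input.

```python
import numpy as np

def conv_maxpool_feature(signal, filt):
    """Correlate filt with every circular shift of signal, then max-pool
    over positions; the result is invariant to circular translation."""
    return max(np.dot(np.roll(signal, -s), filt) for s in range(len(signal)))

rng = np.random.default_rng(0)
x, w = rng.normal(size=32), rng.normal(size=32)
shifted = np.roll(x, 7)  # the same "signal", translated
print(np.isclose(conv_maxpool_feature(x, w),
                 conv_maxpool_feature(shifted, w)))  # True
```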

The nonsmooth landscape of phase retrieval.
Damek Davis, Dmitriy Drusvyatskiy, Courtney Paquette.
arXiv abs/1711.03247.
This paper shows that the standard Polyak subgradient method converges linearly for the phase retrieval problem, when initialized within a constant relative distance of the underlying signal. The second part of the paper analyzes concentration of stationary points of the subsampled objective.
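Concretely, the method minimizes the nonsmooth objective \(f(x) = \tfrac{1}{m}\sum_{i=1}^{m} |\langle a_i, x\rangle^2 - b_i|\) with Polyak-scaled subgradient steps, using that the minimal value is zero for exact measurements. A numpy sketch; the random Gaussian data and real-valued model are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
m, d = 200, 10
A = rng.normal(size=(m, d))
x_true = rng.normal(size=d)
b = (A @ x_true) ** 2                 # noiseless phase-retrieval measurements

# Start within a small relative distance of the signal.
x = x_true + 0.1 * np.linalg.norm(x_true) * rng.normal(size=d) / np.sqrt(d)

for _ in range(500):
    r = (A @ x) ** 2 - b
    f = np.mean(np.abs(r))            # objective value
    if f == 0.0:
        break
    g = (2.0 / m) * A.T @ (np.sign(r) * (A @ x))   # a subgradient of f
    x -= (f / g.dot(g)) * g           # Polyak step: (f - f*)/||g||^2, f* = 0

print(min(np.linalg.norm(x - x_true), np.linalg.norm(x + x_true)))  # near 0
```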

An homotopy method for \(\ell_p\) regression provably beyond self-concordance and in input-sparsity time.
Sébastien Bubeck, Michael B. Cohen, Yin Tat Lee, Yuanzhi Li.
arXiv abs/1711.01328.
This paper solves two long-standing open problems about the $\ell_p$ ball (the regression problem is stated below for reference):
1. the existence of an input-sparsity-time algorithm for $\ell_p$ regression;
2. the nonexistence of a good self-concordant barrier for the $\ell_p$ ball.
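For reference, the standard formulation of \(\ell_p\) regression (stated generically; the notation is not taken from the paper):

```latex
\[
\min_{x \in \mathbb{R}^{d}} \; \|Ax - b\|_{p}
  = \Bigl( \sum_{i=1}^{n} \bigl|(Ax - b)_{i}\bigr|^{p} \Bigr)^{1/p},
\qquad A \in \mathbb{R}^{n \times d},\; b \in \mathbb{R}^{n},\; p \ge 1.
\]
```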

k-server via multiscale entropic regularization.
Sébastien Bubeck, Michael B. Cohen, James R. Lee, Yin Tat Lee, Aleksander Madry.
arXiv abs/1711.01085.
A breakthrough on the k-server problem: the first o(k)-competitive algorithm for hierarchically separated trees.

Convergence Rate of Riemannian Hamiltonian Monte Carlo and Faster Polytope Volume Computation.
Yin Tat Lee, Santosh S. Vempala.
arXiv abs/1710.06261.
A quadratic improvement in the complexity of polytope volume computation!

The Paulsen Problem, Continuous Operator Scaling, and Smoothed Analysis.
Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee, Akshay Ramachandran.
arXiv abs/1710.02587.
This paper resolves the Paulsen problem, a basic open problem in frame theory with applications in signal processing, communication theory, and theoretical computer science.
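For concreteness, the standard statement of the problem (this is the usual formulation; the quantitative bound proved in the paper is not reproduced here):

```latex
% A frame u_1, ..., u_n in R^d is an equal-norm Parseval frame if
%   \sum_i u_i u_i^\top = I_d  and  \|u_i\|^2 = d/n  for all i.
% Paulsen problem: if these two conditions hold only up to a factor
% (1 \pm \epsilon), bound the squared distance
\[
\operatorname{dist}^2
= \min_{\{v_i\}\ \text{equal-norm Parseval}} \; \sum_{i=1}^{n} \|u_i - v_i\|^2
\]
% by a function of \epsilon and d alone, independent of n.
```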

Catalyst Acceleration for Gradient-Based Non-Convex Optimization.
Hongzhou Lin, Julien Mairal, Zaid Harchaoui.
arXiv abs/1703.10993.
The paper presents an extension of the Catalyst acceleration scheme, called 4WD-Catalyst, to gradient-based non-convex optimization. When the objective is convex, 4WD-Catalyst enjoys the same properties as the regular Catalyst. When the objective is nonconvex, it achieves the best known convergence rate to stationary points for first-order methods.