Papers and talks
Grouped by topic
A consistently adaptive trust-region method. Fadi Hamad, Oliver Hinder. Advances in Neural Information Processing Systems, 2022.
DoG is SGD's Best Friend: A Parameter-Free Dynamic Step Size Schedule. Maor Ivgi, Oliver Hinder, Yair Carmon. ICML, 2023. [code]
Making SGD Parameter-Free. Yair Carmon, Oliver Hinder. Conference on Learning Theory, 2022. [COLT recorded talk: https://www.youtube.com/watch?v=aViqFWsrT2M] [OP23 slides]
Practical Large-Scale Linear Programming using Primal-Dual Hybrid Gradient. David Applegate, Mateo Díaz, Oliver Hinder, Haihao Lu, Miles Lubin, Brendan O'Donoghue, Warren Schudy. Advances in Neural Information Processing Systems, 2021. [code]
Faster First-Order Primal-Dual Methods for Linear Programming using Restarts and Sharpness. David Applegate, Oliver Hinder, Haihao Lu, Miles Lubin. Mathematical Programming, 2023.
Worst-case analysis of restarted primal-dual hybrid gradient on totally unimodular linear programs. Oliver Hinder, 2023.
Near-Optimal Methods for Minimizing Star-Convex Functions and Beyond. Oliver Hinder, Aaron Sidford, Nimit S. Sohoni. COLT, 2020.
An efficient nonconvex reformulation of stagewise convex optimization problems. Rudy Bunel, Oliver Hinder, Srinadh Bhojanapalli, Krishnamurthy Dvijotham. Advances in Neural Information Processing Systems, 2020. [code]
Slides from my 2019 ICCOPT talk summarizing this body of work.
Accelerated Methods for Non-Convex Optimization. Yair Carmon, John Duchi, Oliver Hinder, Aaron Sidford. SIAM Journal on Optimization, 2018. [slides] [video of talk at ICML]
‘Convex Until Proven Guilty’: Dimension-Free Acceleration of Gradient Descent on Non-Convex Functions. Yair Carmon, John Duchi, Oliver Hinder, Aaron Sidford. ICML, 2017.
Cutting plane methods can be extended into nonconvex optimization. Oliver Hinder. Conference on Learning Theory, 2018. [slides]
Lower Bounds for Finding Stationary Points I. Yair Carmon, John Duchi, Oliver Hinder, Aaron Sidford. Mathematical Programming, 2020.
Lower Bounds for Finding Stationary Points II: First-Order Methods. Yair Carmon, John Duchi, Oliver Hinder, Aaron Sidford. Mathematical Programming, 2020.
See a summary of both lower bounds in our NeurIPS workshop paper.
Worst-case iteration bounds for log barrier methods for problems with nonconvex constraints. Oliver Hinder, Yinyu Ye. [slides]
A one-phase interior point method for nonconvex optimization. Oliver Hinder, Yinyu Ye. [code] [slides]
On the behavior of Lagrange multipliers in convex and non-convex infeasible interior point methods. Gabriel Haeser, Oliver Hinder, Yinyu Ye.
A novel integer programming formulation for scheduling with family setup times on a single machine to minimize maximum lateness. Oliver Hinder, Andrew Mason. European Journal of Operational Research, 2017.