I am an Assistant Professor in the Industrial Engineering Department at the University of Pittsburgh. Before joining the University of Pittsburgh, I was a visiting postdoctoral researcher at Google in the Optimization and Algorithms group in New York. In 2019, I received my Ph.D. from the Department of Management Science and Engineering at Stanford University where Professor Yinyu Ye was my advisor.
My research focuses on continuous optimization, with a penchant for local optimization methods such as gradient descent. I aim to develop reliable and efficient algorithms built on solid mathematical foundations. This research is motivated by applications in network optimization and machine learning that push the limits of current computational capabilities.
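To illustrate the kind of local optimization method this research centers on, here is a minimal sketch of fixed-step gradient descent; the quadratic objective and step size are my own illustrative choices and are not drawn from any of the papers below:

```python
def gradient_descent(grad, x0, step_size=0.1, iterations=100):
    """Run fixed-step gradient descent from x0 using the gradient oracle `grad`."""
    x = x0
    for _ in range(iterations):
        x = x - step_size * grad(x)  # step in the direction of steepest descent
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
minimizer = gradient_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

With this step size the iterates contract toward the minimizer at x = 3 geometrically, which is the textbook behavior for a smooth, strongly convex objective.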
You can contact me at ohinder at pitt dot edu.
Please see my Google Scholar page for an up-to-date list of publications.
My Ph.D. thesis was titled Principled Algorithms for Finding Local Minimizers.
Papers and talks
Grouped by topic
First order methods for convex optimization
Practical Large-Scale Linear Programming using Primal-Dual Hybrid Gradient. David Applegate, Mateo Díaz, Oliver Hinder, Haihao Lu, Miles Lubin, Brendan O'Donoghue, Warren Schudy. Advances in Neural Information Processing Systems, 2021.
Faster First-Order Primal-Dual Methods for Linear Programming using Restarts and Sharpness. David Applegate, Oliver Hinder, Haihao Lu, Miles Lubin
Structured nonconvex optimization
Near-Optimal Methods for Minimizing Star-Convex Functions and Beyond. Oliver Hinder, Aaron Sidford, Nimit S. Sohoni. COLT, 2020.
An efficient nonconvex reformulation of stagewise convex optimization problems. Rudy Bunel, Oliver Hinder, Srinadh Bhojanapalli, Krishnamurthy Dvijotham. Advances in Neural Information Processing Systems, 2020.
‘Convex Until Proven Guilty’: Dimension-Free Acceleration of Gradient Descent on Non-Convex Functions. Yair Carmon, John Duchi, Oliver Hinder and Aaron Sidford. ICML 2017.
Cutting plane methods can be extended into nonconvex optimization. Oliver Hinder. Conference on Learning Theory, 2018. Slides.
Lower Bounds for Finding Stationary Points I. Yair Carmon, John Duchi, Oliver Hinder and Aaron Sidford. Mathematical Programming, 2020.
Lower Bounds for Finding Stationary Points II: First-Order Methods. Yair Carmon, John Duchi, Oliver Hinder and Aaron Sidford. Mathematical Programming, 2020.
See a summary of both lower-bound papers in our NeurIPS workshop paper.
Interior point methods for problems with nonconvex constraints
Worst-case iteration bounds for log barrier methods for problems with nonconvex constraints. Oliver Hinder, Yinyu Ye. Slides.
A one-phase interior point method for nonconvex optimization. Oliver Hinder, Yinyu Ye. Code. Slides.
On the behavior of Lagrange multipliers in convex and non-convex infeasible interior point methods. Gabriel Haeser, Oliver Hinder, Yinyu Ye.
Machine scheduling and integer programming
A novel integer programming formulation for scheduling with family setup times on a single machine to minimize maximum lateness. Oliver Hinder, Andrew Mason. European Journal of Operational Research, 2017.