I am an Assistant Professor in the Industrial Engineering Department at the University of Pittsburgh. Before joining the University of Pittsburgh, I was a visiting postdoctoral researcher in the Optimization and Algorithms group at Google in New York. In 2019, I received my Ph.D. from the Department of Management Science and Engineering at Stanford University, where Professor Yinyu Ye was my advisor.
My research focuses on continuous optimization, with a penchant for local optimization methods such as gradient descent. I aim to develop reliable and efficient algorithms built on solid mathematical foundations, with the goal of creating new optimization tools for operations research and machine learning. I typically aim to improve existing optimization software in terms of speed, scalability, accuracy, usability, reliability, and representability. My work spans the spectrum from fundamental optimization theory to software development: I build both general-purpose optimization solvers and solvers targeted at specific applications such as drinking water networks, electric grids, and certifiable deep learning.
You can contact me at ohinder at pitt dot edu.
Please see my scholar page for an up-to-date publications list.
My Ph.D. thesis was Principled Algorithms for Finding Local Minimizers.
My research has had direct impact on industry: my research on nonconvex interior point methods inspired LocalSolver's own implementation, my work on first-order methods for linear programming is soon to be part of Google's OR-Tools package, and my work with DeepMind has led to state-of-the-art techniques for neural network certification.
Papers and talks
Grouped by topic
First-order methods for convex optimization
Recorded talk: https://www.youtube.com/watch?v=aViqFWsrT2M
Practical Large-Scale Linear Programming using Primal-Dual Hybrid Gradient. David Applegate, Mateo Díaz, Oliver Hinder, Haihao Lu, Miles Lubin, Brendan O'Donoghue, Warren Schudy. Advances in Neural Information Processing Systems, 2021
Faster First-Order Primal-Dual Methods for Linear Programming using Restarts and Sharpness. David Applegate, Oliver Hinder, Haihao Lu, Miles Lubin
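At the core of the two papers above is the classic primal-dual hybrid gradient (PDHG) iteration applied to the saddle-point formulation of a linear program. As a rough illustration only (this is a minimal NumPy sketch of plain PDHG, without the restart and sharpness machinery the papers develop), for an LP in the standard form min c'x subject to Ax = b, x >= 0:

```python
import numpy as np

def pdhg_lp(c, A, b, iters=10000):
    """Plain PDHG for: min c@x  s.t.  A@x = b, x >= 0.

    Works on the saddle-point problem  min_{x>=0} max_y  c@x - y@(A@x - b).
    """
    m, n = A.shape
    x, y = np.zeros(n), np.zeros(m)
    # Step sizes must satisfy tau * sigma * ||A||^2 < 1 for convergence.
    op_norm = np.linalg.norm(A, 2)
    tau = sigma = 0.9 / op_norm
    for _ in range(iters):
        # Projected primal gradient step.
        x_new = np.maximum(0.0, x - tau * (c - A.T @ y))
        # Dual gradient step with extrapolation (2*x_new - x).
        y = y + sigma * (b - A @ (2 * x_new - x))
        x = x_new
    return x, y

# Tiny example LP: min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0 (optimum at x = (1, 0)).
x, y = pdhg_lp(np.array([1.0, 2.0]), np.array([[1.0, 1.0]]), np.array([1.0]))
```

The papers' contribution is, roughly, to restart this iteration from running averages at carefully chosen intervals, exploiting the sharpness of LP to obtain linear convergence; the sketch above omits all of that.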
Structured nonconvex optimization
Near-Optimal Methods for Minimizing Star-Convex Functions and Beyond. Oliver Hinder, Aaron Sidford, Nimit S. Sohoni. COLT, 2020.
An efficient nonconvex reformulation of stagewise convex optimization problems. Rudy Bunel, Oliver Hinder, Srinadh Bhojanapalli, Krishnamurthy Dvijotham. Advances in Neural Information Processing Systems, 2020.
‘Convex Until Proven Guilty’: Dimension-Free Acceleration of Gradient Descent on Non-Convex Functions. Yair Carmon, John Duchi, Oliver Hinder and Aaron Sidford. ICML 2017.
Cutting plane methods can be extended into nonconvex optimization. Oliver Hinder. COLT, 2018. Slides.
Lower Bounds for Finding Stationary Points I. Yair Carmon, John Duchi, Oliver Hinder and Aaron Sidford. Mathematical Programming, 2020.
Lower Bounds for Finding Stationary Points II: First-Order Methods. Yair Carmon, John Duchi, Oliver Hinder and Aaron Sidford. Mathematical Programming, 2020.
See a summary of both lower bounds in our NeurIPS workshop paper.
Interior point methods for problems with nonconvex constraints
Worst-case iteration bounds for log barrier methods for problems with nonconvex constraints. Oliver Hinder, Yinyu Ye. Slides.
A one-phase interior point method for nonconvex optimization. Oliver Hinder, Yinyu Ye. Code. Slides.
On the behavior of Lagrange multipliers in convex and non-convex infeasible interior point methods. Gabriel Haeser, Oliver Hinder, Yinyu Ye.
Machine scheduling and integer programming
A novel integer programming formulation for scheduling with family setup times on a single machine to minimize maximum lateness. Oliver Hinder, Andrew Mason. European Journal of Operational Research, 2017.