About me
I am an Assistant Professor in the Industrial Engineering Department at the University of Pittsburgh. Before joining Pitt, I was a visiting postdoctoral researcher in the Optimization and Algorithms group at Google in New York. In 2019, I received my Ph.D. from the Department of Management Science and Engineering at Stanford University, where Professor Yinyu Ye was my advisor.
Research
My research focuses on continuous optimization, with a penchant for local optimization methods such as gradient descent. I aim to develop reliable, efficient algorithms built on solid mathematical foundations, providing new optimization tools for operations research and machine learning. I typically aim to improve existing optimization software in terms of speed, scalability, accuracy, usability, reliability, and representability. My work spans the spectrum from fundamental optimization theory to software development: I build both general-purpose optimization solvers and solvers targeted at specific applications such as drinking water networks, electric grids, and certifiable deep learning.
Learn more about my research on the Optimizing You podcast.
Please see my Google Scholar page for an up-to-date list of my publications.
My Ph.D. thesis was titled Principled Algorithms for Finding Local Minimizers.
You can contact me at ohinder at pitt dot edu.
Courses
Introduction to optimization for machine learning - IE 1187
Operations research - IE 1081
Industry impact
My research has had direct impact on industry: my work on interior point methods for nonconvex optimization inspired LocalSolver's own implementation, my work on first-order methods for linear programming is soon to be part of the Google OR-Tools (Operations Research tools) package, and my work with DeepMind has led to state-of-the-art techniques for neural network certification.
Papers and talks
Grouped by topic
Reliable second-order methods
A consistently adaptive trust-region method. Fadi Hamad, Oliver Hinder. Advances in Neural Information Processing Systems, 2022.
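For orientation, trust-region methods minimize a local quadratic model of the objective inside a ball whose radius is adapted from iteration to iteration. The subproblem below is the classical textbook version in generic notation (f, B_k, and Delta_k are standard symbols, not necessarily the paper's); the paper's contribution is a consistent rule for adapting the radius, for which see the paper itself.

```latex
% Classical trust-region subproblem at iterate x_k:
% minimize a quadratic model of f within a ball of radius Delta_k.
\min_{d \in \mathbb{R}^n} \;\; f(x_k) + \nabla f(x_k)^\top d + \tfrac{1}{2}\, d^\top B_k\, d
\quad \text{subject to} \quad \|d\| \le \Delta_k
```

The step is accepted, and Delta_k enlarged or shrunk, according to how well the model predicted the actual decrease in f.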
Parameter-free machine learning
Recorded talk: https://slideslive.ch/38985633/making-sgd-parameterfree?ref=recommended
DoG is SGD's Best Friend: A Parameter-Free Dynamic Step Size Schedule. Maor Ivgi, Oliver Hinder, Yair Carmon. ICML, 2023. [code]
Making SGD Parameter-Free. Yair Carmon, Oliver Hinder. Conference on Learning Theory, 2022.
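As a rough illustration of what "parameter-free" means here, the sketch below implements the distance-over-gradients step size from the DoG paper in generic NumPy. The function and variable names are mine, as is the small epsilon guard; the authors' official implementation is linked from the [code] entry above.

```python
import numpy as np

def dog_sgd(grad, x0, steps=1000, r_eps=1e-6):
    """Sketch of DoG (Distance over Gradients): the step size is the largest
    distance travelled from x0 so far, divided by the square root of the
    running sum of squared gradient norms -- no tuned learning rate."""
    x = x0.astype(float)   # copy, so x0 stays fixed as the reference point
    r_bar = r_eps          # small initial movement scale so the first step is nonzero
    g_sq_sum = 0.0
    for _ in range(steps):
        g = grad(x)
        g_sq_sum += float(np.dot(g, g))
        r_bar = max(r_bar, float(np.linalg.norm(x - x0)))
        eta = r_bar / np.sqrt(g_sq_sum + 1e-12)  # epsilon guard is mine, not the paper's
        x = x - eta * g
    return x

# Example: minimize ||x||^2 from a cold start, with no learning rate supplied.
x_min = dog_sgd(lambda x: 2 * x, np.array([5.0, -3.0]))
```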
First-order methods for linear programming
Recorded talk: https://www.youtube.com/watch?v=aViqFWsrT2M
Practical Large-Scale Linear Programming using Primal-Dual Hybrid Gradient. David Applegate, Mateo Díaz, Oliver Hinder, Haihao Lu, Miles Lubin, Brendan O'Donoghue, Warren Schudy. Advances in Neural Information Processing Systems, 2021. [code]
Faster First-Order Primal-Dual Methods for Linear Programming using Restarts and Sharpness. David Applegate, Oliver Hinder, Haihao Lu, Miles Lubin. Mathematical Programming, 2022.
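For readers unfamiliar with PDHG, here is a minimal sketch of the base method applied to a standard-form LP, min c'x subject to Ax = b, x >= 0. This is only the vanilla iteration; the solver developed in the papers above adds presolve, preconditioning, adaptive step sizes, and the restart scheme analyzed in the Mathematical Programming paper.

```python
import numpy as np

def pdhg_lp(c, A, b, iters=5000):
    """Vanilla primal-dual hybrid gradient for  min c'x  s.t.  Ax = b, x >= 0.
    Each iteration needs only matrix-vector products with A and A',
    which is what makes the method attractive at very large scale."""
    m, n = A.shape
    x, y = np.zeros(n), np.zeros(m)
    op_norm = np.linalg.norm(A, 2)     # spectral norm of A
    tau = sigma = 0.9 / op_norm        # tau * sigma * ||A||^2 < 1 ensures convergence
    for _ in range(iters):
        x_next = np.maximum(0.0, x - tau * (c - A.T @ y))   # projected primal step
        y = y + sigma * (b - A @ (2 * x_next - x))          # dual step at extrapolated point
        x = x_next
    return x, y

# Example: min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0  (optimum at x = (1, 0)).
x_opt, y_opt = pdhg_lp(np.array([1.0, 2.0]), np.array([[1.0, 1.0]]), np.array([1.0]))
```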
Structured nonconvex optimization
Near-Optimal Methods for Minimizing Star-Convex Functions and Beyond. Oliver Hinder, Aaron Sidford, Nimit S. Sohoni. Conference on Learning Theory, 2020. (Star-convexity is recalled below.)
An efficient nonconvex reformulation of stagewise convex optimization problems. Rudy Bunel, Oliver Hinder, Srinadh Bhojanapalli, Krishnamurthy Dvijotham. Advances in Neural Information Processing Systems, 2020. [code]
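Since "star-convex" is less standard than the other terms on this page, here is the usual definition, in my notation rather than the paper's: a function f with a global minimizer x* is star-convex when it is convex along every segment joining a point to x*, even if it is nonconvex elsewhere.

```latex
% f is star-convex with respect to a global minimizer x* if
f(\alpha x^\star + (1-\alpha)x) \;\le\; \alpha f(x^\star) + (1-\alpha) f(x)
\qquad \text{for all } x \text{ and all } \alpha \in [0,1].
```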
The complexity of finding stationary points of nonconvex functions
Slides from my 2019 ICCOPT talk summarizing this body of work.
Accelerated Methods for Non-Convex Optimization. Yair Carmon, John Duchi, Oliver Hinder and Aaron Sidford. SIAM Journal on Optimization, 2018. Slides. Video of talk at ICML.
‘Convex Until Proven Guilty’: Dimension-Free Acceleration of Gradient Descent on Non-Convex Functions. Yair Carmon, John Duchi, Oliver Hinder and Aaron Sidford. ICML, 2017.
Cutting plane methods can be extended into nonconvex optimization. Oliver Hinder. Conference on Learning Theory, 2018. Slides.
Lower Bounds for Finding Stationary Points I. Yair Carmon, John Duchi, Oliver Hinder and Aaron Sidford. Mathematical Programming, 2020.
Lower Bounds for Finding Stationary Points II: First-Order Methods. Yair Carmon, John Duchi, Oliver Hinder and Aaron Sidford. Mathematical Programming, 2020.
See a summary of both lower bounds in our NeurIPS workshop paper.
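For orientation, the shared yardstick in this group is the epsilon-stationary point: the papers bound how many function, gradient, or higher-derivative evaluations a method needs before it can guarantee one. The definition below is in generic notation, not tied to any single paper above.

```latex
% x is an epsilon-stationary point of a differentiable function f when
\|\nabla f(x)\| \;\le\; \epsilon .
```

The accelerated methods above give upper bounds on this count, and the two lower-bound papers show limits that no method in the corresponding class can beat.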
Interior point methods for problems with nonconvex constraints
Worst-case iteration bounds for log barrier methods for problems with nonconvex constraints. Oliver Hinder, Yinyu Ye. Slides.
A one-phase interior point method for nonconvex optimization. Oliver Hinder, Yinyu Ye. Code. Slides.
On the behavior of Lagrange multipliers in convex and non-convex infeasible interior point methods. Gabriel Haeser, Oliver Hinder, Yinyu Ye.
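For context, a log barrier (interior point) method handles inequality constraints by keeping slack variables strictly positive and penalizing them logarithmically. The generic subproblem, in textbook notation rather than the papers' own, looks like this:

```latex
% Log-barrier subproblem for  min f(x)  s.t.  a(x) <= 0,
% solved for a decreasing sequence of barrier parameters mu > 0:
\min_{x,\; s > 0} \;\; f(x) - \mu \sum_{i=1}^{m} \log s_i
\quad \text{subject to} \quad a(x) + s = 0
```

As mu is driven toward zero, the subproblem solutions approach stationary points of the original problem; when f and a are nonconvex, bounding the number of iterations this takes is the subject of the first paper above.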
Machine scheduling and integer programming
A novel integer programming formulation for scheduling with family setup times on a single machine to minimize maximum lateness. Oliver Hinder, Andrew Mason. European Journal of Operational Research, 2017.