Ted Ralphs, Lehigh University
Parametric inequalities are valid inequalities that are parameterized so that they remain valid when (some of) the problem data are perturbed. The so-called Benders cuts that arise in Benders decomposition can be viewed as a kind of parametric inequality, but the notion arises in a number of other contexts, and such inequalities are useful, or even necessary, in a range of methodological applications, such as warm-starting, the solution of bilevel/multilevel optimization problems, multiobjective optimization, etc. In this talk, we present a brief overview of the theoretical underpinnings and discuss a general approach to generating such inequalities. The talk will focus primarily on the linear case, but possible extensions to the nonlinear case will be discussed.
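To make the connection between Benders cuts and parametric inequalities concrete, the following sketch uses standard (assumed, not from the abstract) notation for a two-stage linear problem in which the first-stage variables x perturb the right-hand side of the second-stage subproblem:

```latex
% Sketch (assumed notation): master problem in x, subproblem in y,
% with the subproblem's right-hand side perturbed by x.
\min_{x \in X} \; c^\top x + \phi(x),
\qquad
\phi(x) = \min_{y \ge 0} \bigl\{ f^\top y : B y \ge b - A x \bigr\}.
% By LP duality, any dual-feasible u (i.e., u^\top B \le f^\top, u \ge 0)
% yields the Benders cut
\phi(x) \;\ge\; u^\top (b - A x) \quad \text{for all } x,
% a single inequality that remains valid under any perturbation of x ---
% precisely the parametric property described above.
```

The key point is that the cut is derived from dual information that does not depend on x, so one inequality certifies a bound across the entire parameter space.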