Abstract
Most branch-and-bound algorithms in global optimization depend on convex underestimators to calculate lower bounds of a minimization objective function. The αBB methodology produces such underestimators for sufficiently smooth functions by analyzing interval Hessian approximations. Several methods to rigorously determine the αBB parameters have been proposed, varying in tightness and computational complexity. We present new polynomial-time methods and compare their properties to existing approaches. The new methods are based on classical eigenvalue bounds from linear algebra and a more recent result on interval matrices. We show how the parameters can be optimized with respect to the average underestimation error, in addition to the maximum error commonly used in αBB methods. Numerical comparisons are made, based on test functions and a set of randomly generated interval Hessians. The results illustrate the relative strengths of the methods, and exact results are proved for cases where one method dominates another.
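To make the setting concrete, the sketch below illustrates the classical αBB construction that the new methods are compared against, not the methods introduced in the paper. It is a minimal Python/NumPy example, assuming the interval Hessian is given as elementwise lower and upper bound matrices: the α parameters are obtained from the well-known (scaled) Gershgorin eigenvalue bound, and the underestimator takes the standard form L(x) = f(x) + Σᵢ αᵢ (xᵢᴸ − xᵢ)(xᵢᵁ − xᵢ). Function names and the example data are illustrative only.

```python
import numpy as np

def alpha_gershgorin(H_lo, H_hi, d=None):
    """Classical (scaled) Gershgorin rule for alpha-BB parameters.

    H_lo, H_hi : elementwise lower/upper bounds of the interval Hessian [H].
    d          : optional positive scaling vector; d = ones gives plain Gershgorin.
    Returns alpha >= 0 (componentwise) such that
    f(x) + sum_i alpha_i (x_i^L - x_i)(x_i^U - x_i) is convex on the box.
    """
    n = H_lo.shape[0]
    d = np.ones(n) if d is None else np.asarray(d, dtype=float)
    # Bound on |h_ij| over the interval entries.
    absmax = np.maximum(np.abs(H_lo), np.abs(H_hi))
    alpha = np.zeros(n)
    for i in range(n):
        off = sum(absmax[i, j] * d[j] / d[i] for j in range(n) if j != i)
        alpha[i] = max(0.0, -0.5 * (H_lo[i, i] - off))
    return alpha

def underestimator(f, alpha, x_lo, x_hi):
    """alpha-BB underestimator L(x) = f(x) + sum_i alpha_i (x_i^L - x_i)(x_i^U - x_i)."""
    return lambda x: f(x) + np.sum(alpha * (x_lo - x) * (x_hi - x))

# Illustrative interval Hessian on a 2-D box (values are made up).
H_lo = np.array([[-2.0, -1.0], [-1.0, 4.0]])
H_hi = np.array([[ 1.0,  3.0], [ 3.0, 6.0]])
alpha = alpha_gershgorin(H_lo, H_hi)   # -> [2.5, 0.0] for this data
```

A tighter (but more expensive) choice of α would use sharper bounds on the minimum eigenvalue of the interval Hessian; the trade-off between tightness and computational cost is exactly what the paper's comparison addresses.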
Original language | Undefined/Unknown |
---|---|
Pages (from-to) | 411–427 |
Number of pages | 17 |
Journal | Journal of Global Optimization |
Volume | 58 |
Issue number | 3 |
DOIs | |
Publication status | Published - 2014 |
MoE publication type | A1 Journal article-refereed |
Keywords
- alpha BB
- Convex relaxation
- Global optimization
- Nonconvex optimization