Computers & Chemical Engineering, Vol.22, No.9, 1159-1179, 1998
A global optimization method, alpha BB, for general twice-differentiable constrained NLPs - II. Implementation and computational results
Part I of this paper (Adjiman et al., 1998a) described the theoretical foundations of a global optimization algorithm, the alpha BB algorithm, which can be used to solve problems belonging to the broad class of twice-differentiable NLPs. For any such problem, the ability to automatically generate progressively tighter convex lower bounding problems at each iteration guarantees the convergence of the branch-and-bound alpha BB algorithm to within epsilon of the global optimum solution. Several methods were presented for the construction of valid convex underestimators for general nonconvex functions. In this second part, the performance of the proposed algorithm and its alternative underestimators is studied through their application to a variety of problems. An implementation of the alpha BB algorithm is described, and a number of rules for branching variable selection and variable bound updates are shown to enhance convergence rates. A user-friendly parser facilitates problem input and provides flexibility in the selection of an underestimating strategy. In addition, the package features both automatic differentiation and interval arithmetic capabilities. Making use of all the available options, the alpha BB algorithm successfully identifies the global optimum solution of small literature problems; of small- and medium-size chemical engineering problems in the areas of reactor network design, heat exchanger network design, and reactor-separator network design; of generalized geometric programming problems for design and control; and of batch process design problems under uncertainty. (C) 1998 Elsevier Science Ltd. All rights reserved.
Keywords: design
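For reference, the convex underestimators referred to in the abstract follow the standard alpha-based construction; the sketch below is written in generic notation and is not a verbatim quotation of Part I, where the precise variants and the rules for computing the alpha parameters (e.g., from interval Hessian bounds) are given. For a twice-differentiable nonconvex function f over the box [x^L, x^U], a valid convex underestimator can be written as

\[
  \mathcal{L}(x) \;=\; f(x) \;+\; \sum_{i=1}^{n} \alpha_i \,\bigl(x_i^{L} - x_i\bigr)\bigl(x_i^{U} - x_i\bigr),
\]

which underestimates f on the box for any \alpha_i \ge 0 and is convex whenever

\[
  \nabla^2 f(x) \;+\; 2\,\mathrm{diag}(\alpha_1,\dots,\alpha_n) \;\succeq\; 0
  \qquad \forall\, x \in [x^{L}, x^{U}],
\]

a condition that the interval arithmetic capabilities mentioned in the abstract allow to be verified rigorously over the entire box. Tightening the bounds [x^L, x^U] through branching shrinks the quadratic perturbation term, which is why progressively tighter lower bounding problems are obtained at each iteration of the branch-and-bound search.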