IEEE Transactions on Automatic Control, Vol.39, No.6, 1302-1305, 1994
An Input Normal-Form Homotopy for the L2 Optimal Model Order Reduction Problem
In control system analysis and design, finding a reduced-order model, optimal in the L2 sense, to a given system model is a fundamental problem. The problem is very difficult to solve without the global convergence provided by homotopy methods, and a homotopy-based approach has been proposed previously. The key issues are the number of degrees of freedom, the well-posedness of the finite-dimensional optimization problem, and the numerical robustness of the resulting homotopy algorithm. A homotopy algorithm based on the input normal form characterization of the reduced-order model is developed here and is compared with the homotopy algorithms based on Hyland and Bernstein's optimal projection equations. The main conclusions are that the input normal form algorithm can be very efficient, but can also be very ill-conditioned or even fail.
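The sketch below is not the paper's input-normal-form homotopy; it is only a minimal illustration of the underlying problem: the squared H2 (L2) error between a full-order model (A, B, C) and a candidate reduced model (Ar, Br, Cr), computed via a Lyapunov equation, together with a naive data homotopy that deforms an easy full-order model into the target one while warm-starting a local optimizer at each step. All function names, the step count, and the example system are hypothetical; the sketch ignores the Gramian constraint (controllability Gramian equal to the identity) that the input normal form uses to remove redundant degrees of freedom, and it glosses over stability of the deformed model along the path.

```
# Illustrative sketch only (assumed names and example data; not the paper's algorithm).
import numpy as np
from scipy.linalg import solve_continuous_lyapunov
from scipy.optimize import minimize

def h2_error_sq(A, B, C, Ar, Br, Cr):
    """Squared H2 norm of the error system G(s) - Gr(s), via one Lyapunov solve."""
    n, r = A.shape[0], Ar.shape[0]
    Ae = np.block([[A, np.zeros((n, r))], [np.zeros((r, n)), Ar]])
    Be = np.vstack([B, Br])
    Ce = np.hstack([C, -Cr])
    # Controllability Gramian of the error system: Ae P + P Ae^T + Be Be^T = 0
    P = solve_continuous_lyapunov(Ae, -Be @ Be.T)
    return float(np.trace(Ce @ P @ Ce.T))

def unpack(theta, r, m, p):
    Ar = theta[:r * r].reshape(r, r)
    Br = theta[r * r:r * r + r * m].reshape(r, m)
    Cr = theta[r * r + r * m:].reshape(p, r)
    return Ar, Br, Cr

def cost(theta, A, B, C, r, m, p):
    Ar, Br, Cr = unpack(theta, r, m, p)
    if np.max(np.linalg.eigvals(Ar).real) > -1e-8:
        return 1e6  # crude stability guard; the real formulations avoid this
    return h2_error_sq(A, B, C, Ar, Br, Cr)

def homotopy_reduce(A, B, C, r, steps=10):
    n, m, p = A.shape[0], B.shape[1], C.shape[0]
    # Easy starting problem: a full-order model whose last n-r states are
    # unreachable, so its L2-optimal r-th order model is (Ar0, Br0, Cr0) exactly.
    Ar0, Br0, Cr0 = A[:r, :r], B[:r, :], C[:, :r]   # assumes A[:r, :r] is stable
    A0 = np.block([[Ar0, np.zeros((r, n - r))],
                   [np.zeros((n - r, r)), A[r:, r:]]])
    B0 = np.vstack([Br0, np.zeros((n - r, m))])
    C0 = np.hstack([Cr0, np.zeros((p, n - r))])
    theta = np.concatenate([Ar0.ravel(), Br0.ravel(), Cr0.ravel()])
    for lam in np.linspace(0.0, 1.0, steps + 1)[1:]:
        Al = (1 - lam) * A0 + lam * A
        Bl = (1 - lam) * B0 + lam * B
        Cl = (1 - lam) * C0 + lam * C
        res = minimize(cost, theta, args=(Al, Bl, Cl, r, m, p), method="BFGS")
        theta = res.x  # warm start the next homotopy step with the current minimizer
    return unpack(theta, r, m, p)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, r = 6, 2
    # Hypothetical stable SISO example (diagonally dominant, negative diagonal).
    A = -np.diag(rng.uniform(1.0, 5.0, n)) + 0.1 * rng.standard_normal((n, n))
    B = rng.standard_normal((n, 1))
    C = rng.standard_normal((1, n))
    Ar, Br, Cr = homotopy_reduce(A, B, C, r)
    print("H2 error of reduced model:", np.sqrt(h2_error_sq(A, B, C, Ar, Br, Cr)))
```

This unconstrained parameterization carries the full n_r^2 + n_r(m + p) degrees of freedom; imposing the input normal form (or Hyland and Bernstein's optimal projection conditions) shrinks that number, which is exactly the trade-off between efficiency and conditioning the paper examines.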