IEE Proceedings-Control Theory & Applications, Vol.143, No.3, 247-254, 1996
Model-Reduction with Time-Delay Combining the Least-Squares Method with the Genetic Algorithm
The paper proposes a novel method of model reduction with time delay for single-input, single-output systems, combining the linear least-squares (LS) method with the genetic algorithm (GA). The denominator parameters and the time delay of the reduced-order model are coded into binary bit strings and searched by the GA, while the numerator parameters are calculated by the LS method for each candidate set of denominator parameters and time delay. Thus the best parameters and time delay of the reduced-order model can be obtained by repeating the genetic operations on the strings. The performance of the proposed method is demonstrated by two numerical examples.
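The hybrid scheme described above can be sketched in code. The following is a minimal illustrative sketch, not the paper's implementation: a discrete-time second-order model with integer delay is fitted to the impulse response of an assumed fourth-order "full" plant. The GA searches binary-coded denominator parameters and delay; for each candidate, the numerator is solved exactly by linear least squares. All numerical values (parameter ranges, bit lengths, GA settings, the example plant) are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def filter_ar(a, x):
    """Response of 1/A(q), A(q) = 1 + a1*q^-1 + ..., to input x."""
    y = np.zeros(len(x))
    for k in range(len(x)):
        acc = x[k]
        for i, ai in enumerate(a, start=1):
            if k - i >= 0:
                acc -= ai * y[k - i]
        y[k] = acc
    return y

def ls_numerator(a, d, u, target):
    """Given denominator a and delay d, solve the numerator b by linear LS.
    The output is linear in b, so each basis column is the response of
    q^-(d+j)/A(q) to the input u."""
    N = len(target)
    nb = len(a)  # numerator order taken equal to denominator order (assumption)
    G = np.zeros((N, nb))
    for j in range(nb):
        uj = np.zeros(N)
        if d + j < N:
            uj[d + j:] = u[:N - (d + j)]
        G[:, j] = filter_ar(a, uj)
    b, *_ = np.linalg.lstsq(G, target, rcond=None)
    err = np.linalg.norm(target - G @ b)
    return b, err if np.isfinite(err) else np.inf

def decode(bits, lo, hi, n):
    """Decode n bits to a real value in [lo, hi]."""
    v = int("".join(map(str, bits)), 2)
    return lo + (hi - lo) * v / (2 ** n - 1)

NBITS = 10                      # bits per denominator parameter (assumption)
DBITS = 4                       # bits for the integer delay, 0..15
CHROM = 2 * NBITS + DBITS

def fitness(chrom, u, target):
    a1 = decode(chrom[:NBITS], -2.0, 2.0, NBITS)
    a2 = decode(chrom[NBITS:2 * NBITS], -1.0, 1.0, NBITS)
    d = int("".join(map(str, chrom[2 * NBITS:])), 2)
    _, err = ls_numerator([a1, a2], d, u, target)
    return err

# Assumed "full" plant: stable 4th-order impulse response with 3-sample delay
N = 60
u = np.zeros(N); u[0] = 1.0                 # impulse input
a_true = [-1.4, 0.84, -0.22, 0.05]          # (1-1.2q^-1+0.5q^-2)(1-0.2q^-1+0.1q^-2)
h = filter_ar(a_true, np.r_[np.zeros(3), u[:N - 3]])

# Simple GA: elitism, tournament selection, one-point crossover, bit-flip mutation
pop = rng.integers(0, 2, size=(40, CHROM))
for gen in range(30):
    errs = np.array([fitness(c, u, h) for c in pop])
    new = [pop[errs.argmin()].copy()]        # keep the best string (elitism)
    while len(new) < len(pop):
        i, j = rng.integers(0, len(pop), 2)
        p1 = pop[i] if errs[i] < errs[j] else pop[j]
        i, j = rng.integers(0, len(pop), 2)
        p2 = pop[i] if errs[i] < errs[j] else pop[j]
        cut = rng.integers(1, CHROM)
        child = np.r_[p1[:cut], p2[cut:]]
        child[rng.random(CHROM) < 0.02] ^= 1  # mutation
        new.append(child)
    pop = np.array(new)

best = pop[np.array([fitness(c, u, h) for c in pop]).argmin()]
a1 = decode(best[:NBITS], -2.0, 2.0, NBITS)
a2 = decode(best[NBITS:2 * NBITS], -1.0, 1.0, NBITS)
d = int("".join(map(str, best[2 * NBITS:])), 2)
b, err = ls_numerator([a1, a2], d, u, h)
print(f"reduced model: a=[{a1:.3f}, {a2:.3f}], b={np.round(b, 3)}, delay={d}, err={err:.4f}")
```

The key point of the hybrid is visible in `fitness`: the GA only ever searches the nonlinearly-entering quantities (denominator and delay), while the linearly-entering numerator is obtained in closed form by `np.linalg.lstsq`, which shrinks the GA's search space.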
Keywords: Systems