Application of Genetic Algorithm to the Problem of Control System
Abstract
The genetic algorithm (GA) is a stochastic global adaptive search optimization technique based on the mechanisms of natural selection. The GA was first suggested by John Holland and his colleagues in 1975, and it mimics the metaphor of natural biological evolution [1]. The GA has been recognized as an effective and efficient technique for solving optimization problems. Compared with other optimization techniques, such as simulated annealing and random search methods, the GA is superior at avoiding local minima, which is a significant issue in the case of nonlinear systems. The GA operates on a population of potential solutions, applying the principle of survival of the fittest to produce successively better approximations to a solution. At each generation of a GA, a new set of approximations is created by selecting individuals according to their level of fitness in the problem domain and reproducing them using operators borrowed from natural genetics. This process leads to the evolution of populations of individuals that are better suited to their environment than the individuals from which they were created, just as in natural adaptation. The GA has been shown by a number of practitioners to be an effective strategy in the off-line design of control systems. For example, Krishna Kumar and Goldberg [2] and Bramlette and Cusin [3] have demonstrated how genetic optimization methods can be used to derive superior controller structures in aerospace applications in less time (in terms of function evaluations) than traditional methods such as LQR and Powell's gain set design. Porter and Mohamed [4] have presented schemes for the genetic design of multivariable flight control systems using eigenstructure assignment, whilst others have demonstrated how GAs can be used in the selection of controller structures [5].
The GA starts with an initial population containing a number of chromosomes, each of which represents a solution of the problem whose performance is evaluated by a fitness function. Basically, the GA consists of three main stages: selection, crossover and mutation. The application of these three basic operations allows the creation of new individuals, which may be better than their parents. From the above discussion, it can be seen that the GA differs substantially from more traditional search and optimization methods. The four most significant differences are: GAs search a population of points in parallel, not a single point; GAs do not require derivative information or other auxiliary knowledge, since only the objective function and the corresponding fitness levels influence the directions of search; GAs use probabilistic transition rules, not deterministic ones; and GAs work on an encoding of the parameter set rather than the parameter set itself (except where real-valued individuals are used). It is important to note that the GA provides a number of potential solutions to a given problem, and the choice of final solution is left to the user. In cases where a particular problem does not have one individual solution, for example a family of Pareto-optimal solutions, as is the case in multi-objective optimization and scheduling problems, then the GA is potentially
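The three stages above (selection, crossover, mutation) can be sketched in a minimal real-valued GA. The fitness function below is a hypothetical one-dimensional objective chosen purely for illustration (peak at x = 3); the tournament size, mutation rate, and population parameters are likewise assumptions, not values from the paper.

```python
import random

random.seed(0)

def fitness(x):
    # Hypothetical objective for illustration: maximized at x = 3.
    return -(x - 3.0) ** 2

def select(population, k=3):
    # Tournament selection: the fittest of k randomly sampled individuals wins.
    return max(random.sample(population, k), key=fitness)

def crossover(a, b):
    # Arithmetic crossover for real-valued individuals.
    alpha = random.random()
    return alpha * a + (1 - alpha) * b

def mutate(x, rate=0.1, scale=0.5):
    # With probability `rate`, perturb the individual with Gaussian noise.
    return x + random.gauss(0, scale) if random.random() < rate else x

def genetic_algorithm(pop_size=30, generations=60):
    # Initial population of candidate solutions (chromosomes).
    population = [random.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        # Each child is produced by selection, crossover, then mutation.
        population = [mutate(crossover(select(population), select(population)))
                      for _ in range(pop_size)]
    return max(population, key=fitness)

best = genetic_algorithm()
print(best)  # converges near x = 3
```

In a control application the fitness function would instead evaluate a simulated closed-loop response (e.g. penalizing overshoot and settling time), with each chromosome encoding the controller gains.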