Estimation of distribution algorithms (EDAs) address broad classes of optimization problems by learning explicit probabilistic models of promising solutions found so far and sampling the built models to generate new candidate solutions. By incorporating advanced machine learning techniques into genetic and evolutionary algorithms, EDAs can scalably solve many challenging problems, significantly outperforming standard genetic and evolutionary algorithms and other optimization techniques. Over the past decade, many impressive results have been produced in the design, theoretical investigation, and application of EDAs.
An EDA evolves a population of candidate solutions to the given problem. Each iteration starts by evaluating the candidate solutions and selecting promising solutions so that solutions of higher quality are given more copies than solutions of lower quality. EDAs can use any standard selection method of genetic and evolutionary algorithms, such as binary tournament selection. Next, a probabilistic model is built from the selected solutions, and new solutions are generated by sampling the built probabilistic model. New solutions are then incorporated into the original population using some replacement strategy, and the next iteration is executed unless the termination criteria have been met. EDAs usually differ in the representation of candidate solutions, the considered class of probabilistic models, or the procedures for learning and sampling probabilistic models.
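The iteration just described can be sketched with one of the simplest EDAs, the univariate marginal distribution algorithm (UMDA), which models each bit position independently. The sketch below runs on the OneMax problem (fitness is the number of ones); the function name, parameter values, and the use of truncation selection with full replacement are illustrative assumptions, not a prescribed configuration.

```python
import random

def umda_onemax(n_bits=20, pop_size=100, n_select=50, n_iters=50, seed=1):
    # Minimal UMDA sketch on OneMax; all parameters are illustrative.
    rng = random.Random(seed)
    # Initial population of random bit strings.
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(n_iters):
        # Evaluate and select promising solutions (truncation selection;
        # any standard selection method could be used instead).
        pop.sort(key=sum, reverse=True)
        selected = pop[:n_select]
        # Learn the probabilistic model: one marginal probability of a 1
        # per bit position, estimated from the selected solutions.
        probs = [sum(ind[i] for ind in selected) / n_select
                 for i in range(n_bits)]
        # Sample the model to generate the next population
        # (here, full replacement of the old population).
        pop = [[1 if rng.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
    return max(pop, key=sum)

best = umda_onemax()
```

More sophisticated EDAs replace the vector of independent marginals with richer model classes, such as Bayesian networks, while keeping this same evaluate-select-learn-sample loop.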