11 Search Metaheuristics
Many search methods have been encountered in the context of “tuning” the acquisition, classification, and clustering methods of the previous chapters (especially tuning of the SVM in Chapter 14). Tuning is a search for an optimal configuration. Methods and metaheuristics for performing such searches are now described in a more general context.
Numerous prior book, journal, and patent publications by the author are drawn upon extensively throughout the text [1–68]. Almost all of the journal publications are open access. These publications can typically be found online either at the author’s personal website (www.meta‐logos.com) or with one of the following online publishers: www.m‐hikari.com or bmcbioinformatics.biomedcentral.com.
11.1 Trajectory‐Based Search Metaheuristics
If you have a configuration that you need to optimize, and for any configuration you can evaluate its “score,” or “fitness,” then a variety of metaheuristics, devised both by Man and by Nature, are available for configuration selection or model tuning. If the configuration fitness is a differentiable function of the configuration parameters, then classic gradient ascent (or descent) can be used to optimize the configuration by taking learning steps that climb the fitness surface (for maximization‐type optimization, see Figure 11.1). If the fitness function also has a second derivative, then an improved version, Newton’s method, has been known for over 300 years; it involves calculation of the Hessian to get ...
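A minimal sketch of the two classic approaches on a one‐dimensional toy problem may help fix ideas. The fitness function f(x) = −(x − 3)², its step size, and iteration count below are illustrative choices, not from the text; the maximum is at x = 3.

```python
def grad_ascent(grad, x0, lr=0.1, steps=100):
    """Climb the fitness surface by repeated steps along the gradient."""
    x = x0
    for _ in range(steps):
        x = x + lr * grad(x)  # step uphill (ascent); use -lr for descent
    return x


def newton_step(grad, hess, x):
    """One Newton update: rescale the gradient step by the (1D) Hessian."""
    return x - grad(x) / hess(x)


# Toy fitness f(x) = -(x - 3)^2, so f'(x) = -2(x - 3) and f''(x) = -2.
fitness_grad = lambda x: -2.0 * (x - 3.0)
fitness_hess = lambda x: -2.0

x_ga = grad_ascent(fitness_grad, x0=0.0)      # approaches 3 over many steps
x_nm = newton_step(fitness_grad, fitness_hess, 0.0)  # exactly 3 in one step
```

For this quadratic fitness, Newton’s method lands on the optimum in a single step, while plain gradient ascent converges geometrically; the curvature information in the Hessian is what buys the faster convergence.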