Chapter 13. Grid Search
In Chapter 12 we demonstrated how users can mark or tag arguments in preprocessing recipes and/or model specifications for optimization using the tune() function. Once we know what to optimize, it's time to address the question of how to optimize the parameters. This chapter describes grid search methods that specify the possible values of the parameters a priori. (Chapter 14 will continue the discussion by describing iterative search methods.)
Let's start by looking at two main approaches for assembling a grid.
Regular and Nonregular Grids
There are two main types of grids. A regular grid combines each parameter (with its corresponding set of possible values) factorially, i.e., by using all combinations of the sets. Alternatively, a nonregular grid is one where the parameter combinations are not formed from a small set of points.
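To make the distinction concrete, here is a minimal sketch in base R. The parameter names and candidate values are illustrative choices, not taken from the text; tidymodels itself provides dedicated grid functions, but base R is enough to show the idea:

```r
# A regular grid: every factorial combination of each parameter's
# candidate values, so the grid size is the product of the set sizes.
regular_grid <- expand.grid(
  hidden_units = c(2, 5, 10),
  penalty = c(0.001, 0.01, 0.1)
)
nrow(regular_grid)  # 3 x 3 = 9 candidate combinations

# A nonregular (random) grid: each parameter is drawn independently,
# so the candidates do not line up on a small set of shared values.
set.seed(1)
random_grid <- data.frame(
  hidden_units = sample(1:10, 9, replace = TRUE),
  penalty = 10^runif(9, min = -3, max = -1)  # log-uniform draw
)
```

Note that the regular grid repeats each hidden_units value at every penalty value, while the random grid's nine rows are (almost surely) nine distinct points in the parameter space.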
Before we look at each type in more detail, let's consider an example model: the multilayer perceptron model (a.k.a. single-layer artificial neural network). The parameters marked for tuning are:
- The number of hidden units
- The number of fitting epochs/iterations in model training
- The amount of weight decay penalization
Using parsnip, the specification for a classification model fit using the nnet package is:
library(tidymodels)
tidymodels_prefer()

mlp_spec <-
  mlp(hidden_units = tune(), penalty = tune(), epochs = tune()) %>%
  set_engine("nnet", trace = 0) %>%
  set_mode("classification")
The argument trace = 0 prevents extra logging of the training process.
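With the three arguments tagged by tune(), a regular grid can be generated from their parameter objects. A hedged sketch using the dials package (part of tidymodels); the default ranges shown in the comments are dials defaults and the level count is an arbitrary choice for illustration:

```r
library(tidymodels)
tidymodels_prefer()

# grid_regular() crosses the candidate values of each parameter.
# hidden_units(), penalty(), and epochs() are dials parameter objects
# with default ranges; levels = 3 picks 3 values per parameter,
# giving 3 x 3 x 3 = 27 candidate combinations.
mlp_grid <- grid_regular(
  hidden_units(),  # default range: 1 to 10 units
  penalty(),       # default range: 10^-10 to 10^0, on the log scale
  epochs(),        # default range: 10 to 1000 iterations
  levels = 3
)
```

Because penalty() is defined on a log10 scale, its three levels are evenly spaced in log units rather than in raw units, which keeps small penalties represented in the grid.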