Hyperparameters are parameters that are passed as arguments to the estimator's constructor rather than learned from the data. For example, in Lasso regression, the parameter alpha is a hyperparameter. We can tune these hyperparameters to optimize the performance of our model.
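To make the distinction concrete, here is a minimal sketch (on synthetic data) showing that alpha is fixed at construction time, while the coefficients are the parameters that fitting actually learns:

```python
import numpy as np
from sklearn.linear_model import Lasso

# alpha is a hyperparameter: passed to the constructor, not learned from data.
model = Lasso(alpha=0.5)

# Synthetic regression data, purely for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = X @ np.array([3.0, -1.0]) + rng.normal(scale=0.1, size=50)
model.fit(X, y)

# coef_ and intercept_ are ordinary (learned) parameters, unlike alpha.
print(model.alpha)        # 0.5, unchanged by fitting
print(model.coef_.shape)  # (2,)
```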
In grid search, we perform an exhaustive search over specified parameter values of an estimator. For example, in Lasso regression, we can supply a set of alpha values and search exhaustively to find which alpha value gives the optimal result. If there is more than one parameter, the candidate values are arranged in a grid, and an exhaustive search is performed over every combination of the parameters to find the optimal values.
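The "every combination" point can be seen directly: with a 3-value grid for one parameter and a 2-value grid for another, the search evaluates 3 × 2 = 6 candidate models. A small sketch using ElasticNet (chosen here only because it has two natural hyperparameters) on synthetic data:

```python
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import GridSearchCV

# Synthetic regression data (illustrative stand-in for a real dataset).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

# Two hyperparameters: every combination (3 x 2 = 6) is evaluated.
params = {"alpha": [0.01, 0.1, 1.0], "l1_ratio": [0.2, 0.8]}
grid = GridSearchCV(ElasticNet(), param_grid=params, cv=5, scoring="r2")
grid.fit(X, y)

print(len(grid.cv_results_["params"]))  # 6 combinations
print(grid.best_params_)
```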
In Python, we can use the GridSearchCV class from sklearn.model_selection for grid search parameter tuning. We can use the following Python code to perform a grid search to find the optimal alpha value for Lasso regression.
import numpy
import seaborn
from sklearn.linear_model import Lasso
from sklearn.model_selection import GridSearchCV

# Load the mpg dataset and drop rows with missing values.
data = seaborn.load_dataset("mpg")
data.dropna(inplace=True)

# Features: all columns except the non-numeric ones and the target; target: mpg.
X = data.drop(labels=["origin", "name", "mpg"], axis=1).values
y = data.filter(items=["mpg"], axis=1).values.ravel()

# Candidate alpha values: 100 evenly spaced points in [0.01, 1].
alphas = numpy.linspace(0.01, 1, num=100, endpoint=True)
params = dict(alpha=alphas)

regressor = Lasso()
grid_search_cv = GridSearchCV(estimator=regressor, param_grid=params, cv=10, scoring="r2")
grid_search_cv.fit(X, y)

print(grid_search_cv.best_estimator_)
print(grid_search_cv.best_score_)
Here, we read the mpg dataset using the seaborn Python library. This dataset gives a set of car models along with their horsepower, weight, acceleration, mpg (miles driven per gallon of gasoline), etc. We want to create a Lasso …
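Once fitted, the GridSearchCV object refits the best estimator on the full data (refit=True by default), so it exposes the winning alpha via best_params_ and can be used for prediction directly. A minimal sketch, shown here on synthetic data rather than the mpg dataset:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import GridSearchCV

# Synthetic regression data (illustrative stand-in for the mpg dataset).
rng = np.random.default_rng(42)
X = rng.normal(size=(150, 4))
y = X @ np.array([2.0, 0.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=150)

alphas = np.linspace(0.01, 1, num=100, endpoint=True)
search = GridSearchCV(Lasso(), param_grid={"alpha": alphas}, cv=10, scoring="r2")
search.fit(X, y)

# best_params_ holds the winning alpha; predict() delegates to the refitted model.
print(search.best_params_["alpha"])
preds = search.predict(X[:5])
print(preds.shape)  # (5,)
```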