Randomized search parameter tuning performs a randomized search over hyperparameters. Hyperparameters are parameters that are passed as arguments to the constructor of the estimator, rather than learned from the data. For example, in Lasso regression, the parameter alpha is a hyperparameter. We can tune these hyperparameters to optimize the performance of our model.
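To make the distinction concrete, here is a minimal sketch showing that alpha is fixed at construction time (the specific alpha values below are illustrative, not recommendations):

```python
from sklearn.linear_model import Lasso

# alpha is a hyperparameter: it is passed to the constructor
# and is not learned from the data during fitting
model_weak = Lasso(alpha=0.01)   # weaker regularization
model_strong = Lasso(alpha=1.0)  # stronger regularization

print(model_weak.get_params()["alpha"])    # 0.01
print(model_strong.get_params()["alpha"])  # 1.0
```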
In our previous article, we discussed grid search parameter tuning. Unlike grid search, randomized search does not try out every combination of parameter values. Instead, a fixed number of parameter settings is sampled from specified distributions and tried out. We specify the number of settings to try through the n_iter argument of RandomizedSearchCV().
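As a sketch of this sampling behavior, a parameter distribution can be a list of discrete values or a continuous distribution from scipy.stats. The ParameterSampler utility below illustrates what RandomizedSearchCV does internally; it accepts the same dictionary through its param_distributions argument:

```python
from scipy.stats import uniform
from sklearn.model_selection import ParameterSampler

# A continuous distribution for alpha over [0, 1)
param_distributions = {"alpha": uniform(loc=0, scale=1)}

# Draw a fixed number (n_iter) of parameter settings from the
# distribution, regardless of how many values are possible
sampled = list(ParameterSampler(param_distributions, n_iter=5, random_state=1))
print(len(sampled))  # 5
```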
In this article, we will take the example of Lasso regression, where the parameter alpha is a hyperparameter. We can try out different values of alpha to find the one that gives the best performance. We will use randomized search parameter tuning here.
We can use the following Python code for hyperparameter tuning for Lasso regression using randomized search parameter tuning.
```python
import numpy
import seaborn
from sklearn.linear_model import Lasso
from sklearn.model_selection import RandomizedSearchCV

# Load the "mpg" dataset and drop rows with missing values
data = seaborn.load_dataset("mpg")
data.dropna(inplace=True)

# Features: drop the target ("mpg") and the non-numeric columns
X = data.drop(labels=["origin", "name", "mpg"], axis=1).values
# Target: flatten to a 1-D array, as scikit-learn expects
y = data.filter(items=["mpg"], axis=1).values.ravel()

# Candidate alpha values drawn uniformly from [0, 1)
alphas = [numpy.random.uniform() for _ in range(100)]
params = dict(alpha=alphas)

regressor = Lasso()
randomized_search_cv = RandomizedSearchCV(
    estimator=regressor,
    param_distributions=params,
    n_iter=50,
    cv=10,
    scoring="r2",
    random_state=1,
)
randomized_search_cv.fit(X, y)
print(randomized_search_cv.best_estimator_)
print(randomized_search_cv.best_score_)
```
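Beyond best_estimator_ and best_score_, the fitted search object also exposes best_params_ and cv_results_. A brief self-contained sketch on synthetic data (the dataset here is generated only for illustration):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.model_selection import RandomizedSearchCV

# Synthetic regression data, used only to demonstrate the attributes
X, y = make_regression(n_samples=100, n_features=4, noise=0.1, random_state=0)

search = RandomizedSearchCV(
    estimator=Lasso(),
    param_distributions={"alpha": [0.01, 0.1, 0.5, 1.0]},
    n_iter=4,
    cv=5,
    scoring="r2",
    random_state=1,
)
search.fit(X, y)

# best_params_ holds the sampled setting with the highest mean CV score
print(search.best_params_)
# cv_results_ records the cross-validation score of every sampled setting
print("mean_test_score" in search.cv_results_)  # True
```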
Here, we first read the “mpg” dataset using the seaborn Python library. In this dataset, we are given a set of car …





