ELASTICNET REGRESSION
ElasticNet regression is used when there is a large number of parameters (sometimes millions) and many of the features are likely to be useless for prediction. ElasticNet regression is a combination of Ridge and Lasso regression.
Note: the Lasso penalty and the Ridge penalty each get their own λ. The two values need not be equal. So the equation will be:
Cost = Σ (yᵢ − ŷᵢ)² + λ1 Σ |βj| + λ2 Σ βj²
Here, λ1 indicates the penalty for Lasso (L1) and λ2 indicates the penalty for Ridge (L2).
Note: when both λ1 and λ2 are greater than 0, the model is ElasticNet regression.
Note: when creating an object of the ElasticNet class, l1_ratio is the mixing parameter and its value lies between 0 and 1. When l1_ratio is 0, the penalty is pure L2 (Ridge); when it is 1, the penalty is pure L1 (Lasso); for values between 0 and 1, the penalty is a combination of L1 and L2.
Ex:
es = ElasticNet(l1_ratio=0.5)
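The l1_ratio extremes can be checked directly. A minimal sketch, assuming scikit-learn's parametrization (the penalty is alpha * l1_ratio * ||w||₁ + 0.5 * alpha * (1 − l1_ratio) * ||w||², so l1_ratio=1 reproduces Lasso exactly; the synthetic data below is only for illustration):

```python
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso

# Synthetic data for illustration: 5 features, two of them irrelevant.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([1.5, 0.0, -2.0, 0.0, 0.5]) + rng.normal(scale=0.1, size=100)

# ElasticNet with l1_ratio=1.0 has a pure L1 penalty, i.e. it is a Lasso.
en = ElasticNet(alpha=0.1, l1_ratio=1.0).fit(X, y)
la = Lasso(alpha=0.1).fit(X, y)
print(np.allclose(en.coef_, la.coef_, atol=1e-6))
```

(With l1_ratio=0 the penalty is pure L2; note that sklearn's Ridge class scales its alpha differently, so the coefficients match only after rescaling.)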
Elastic Net regression is fit with an internal loop over the features (coordinate descent) and, thanks to the L2 part of its penalty, it can keep a whole group of correlated features in the model, which often improves accuracy. Lasso regression, by contrast, tends to select only one feature out of a group of correlated features.
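This grouping behaviour can be seen with a small sketch (synthetic data, assumed only for illustration): when the same feature appears twice, the L2 part of the ElasticNet penalty makes the optimum share the weight equally between the copies.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

# One informative feature, duplicated so the two columns are perfectly correlated.
rng = np.random.default_rng(1)
x = rng.normal(size=200)
X = np.column_stack([x, x])
y = 3.0 * x + rng.normal(scale=0.1, size=200)

en = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)
print(en.coef_)  # the two coefficients come out (nearly) equal
```

A pure Lasso has no such preference and may put all of the weight on just one of the two columns.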
Problem: select the most appropriate features of the Boston housing dataset using Elastic Net regression.
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import ElasticNet

# load the Boston housing data (assumes a local boston.csv with a MEDV column)
boston = pd.read_csv('boston.csv')

# target: median house value; features: everything else
y = boston['MEDV'].values
x = boston.drop('MEDV', axis=1)
names = x.columns

# range of columns
rng = range(len(names))

# fit Elastic Net and take one coefficient per feature
es = ElasticNet(l1_ratio=0.5)
es_coef = es.fit(x, y).coef_
es_coef

# draw line plot between range and coefficients in elastic net formula
plt.plot(rng, es_coef)
plt.xticks(rng, names, rotation=60)
plt.ylabel('coefficient')
plt.show()
(Output: a line plot of the Elastic Net coefficients per feature; the features with the largest coefficients are the most relevant.)
Task using Elastic Net Regression: using the Boston housing data, predict the price of a house with 5 rooms (RM). Your result should be around $15,000 to $17,000 (MEDV is recorded in thousands of dollars).
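One way to approach the task, sketched as a small helper so the data source can vary. The column names RM and MEDV come from the problem statement; the file name boston.csv is an assumption:

```python
import pandas as pd
from sklearn.linear_model import ElasticNet

def predict_price(df, rooms, feature='RM', target='MEDV'):
    """Fit Elastic Net on a single feature and predict for one value of it."""
    model = ElasticNet(l1_ratio=0.5)
    model.fit(df[[feature]], df[target].values)
    return model.predict([[rooms]])[0]

# Usage (assuming the dataset is available locally; MEDV is in $1000s,
# so a value around 15-17 is expected):
# boston = pd.read_csv('boston.csv')
# print(predict_price(boston, 5))
```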