Lasso and Ridge are two regularization techniques used in machine learning, especially in linear regression, to prevent overfitting and improve the model's generalization performance. Both Lasso and Ridge introduce a regularization term to the cost function, but they differ in the type of regularization they apply.
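Concretely, both methods minimize the squared-error loss plus a penalty scaled by a strength parameter alpha; only the form of the penalty differs (the formulas below are the standard textbook versions, ignoring library-specific scaling constants):

Lasso (L1): J(w) = MSE(w) + alpha * sum_i |w_i|
Ridge (L2): J(w) = MSE(w) + alpha * sum_i w_i^2

The absolute-value penalty can push individual weights exactly to zero, while the squared penalty only shrinks them toward zero.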
In short, Lasso is suitable when you want to perform feature selection by driving some coefficients to zero, while Ridge is suitable when you want to prevent the model from relying too heavily on any particular feature while still keeping all features in the model. The choice between Lasso and Ridge depends on the characteristics of the dataset and the goals of the modeling task. There is also Elastic Net regularization, which combines the L1 and L2 penalties.
Lasso: L1 penalty; shrinks every weight by the same fixed amount, so small weights hit exactly zero > feature selection
Ridge: L2 penalty; shrinks weights in proportion to their size, so coefficients get small but rarely become exactly zero > all features stay in the model
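The code below assumes X_train and y_train are already defined. As a minimal sketch to make the snippet self-contained, a synthetic regression dataset could be prepared like this (the make_regression settings and variable names are illustrative assumptions, not part of the original notes):

from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

# Illustrative synthetic data: 20 features, only 8 of them informative
X, y = make_regression(n_samples=200, n_features=20, n_informative=8,
                       noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=42)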
from sklearn.linear_model import Lasso, Ridge

# Regularization strength is controlled by alpha:
#   higher alpha -> stronger regularization
#   lower alpha  -> weaker regularization; with alpha close to 0 the result is
#                   essentially the same as plain linear regression
lasso = Lasso(alpha=0.1)
ridge = Ridge(alpha=0.1)

lasso.fit(X_train, y_train)
ridge.fit(X_train, y_train)
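To see the feature-selection effect in practice, compare the fitted coefficients; the check below (assuming the fits above succeeded) relies on Lasso setting some coefficients exactly to zero, while Ridge only shrinks them:

import numpy as np

# Count coefficients that were driven exactly to zero by each model
print("Lasso zero coefficients:", int(np.sum(lasso.coef_ == 0)), "of", lasso.coef_.size)
print("Ridge zero coefficients:", int(np.sum(ridge.coef_ == 0)), "of", ridge.coef_.size)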
from sklearn.model_selection import cross_val_score

# Mean 5-fold cross-validated score (R^2 by default for regressors)
cross_val_score(lasso, X_train, y_train, cv=5).mean()
cross_val_score(ridge, X_train, y_train, cv=5).mean()
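Because the best alpha is data-dependent, a simple follow-up is to repeat the same cross-validation over a grid of candidate alphas (the grid below is an illustrative assumption); scikit-learn's LassoCV and RidgeCV classes perform this kind of search more conveniently:

# Compare mean CV scores over a grid of alpha values (grid is illustrative)
for alpha in [0.001, 0.01, 0.1, 1, 10]:
    lasso_score = cross_val_score(Lasso(alpha=alpha), X_train, y_train, cv=5).mean()
    ridge_score = cross_val_score(Ridge(alpha=alpha), X_train, y_train, cv=5).mean()
    print(f"alpha={alpha}: lasso={lasso_score:.3f}, ridge={ridge_score:.3f}")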