



[Figure: Steps 1-3 of Exclusive Feature Bundling, Step 2 merges exclusive features (images missing)]

[LightGBM Parameters]
Package : https://lightgbm.readthedocs.io/en/latest/Python-Intro.html
learning_rate : shrinkage factor, same role as shrinking in GBM
reg_lambda : L2 regularization term on weights (analogous to Ridge regression)
reg_alpha : L1 regularization term on weights (analogous to Lasso regression)
objective : default = regression, type = enum, options: regression, regression_l1, huber, fair, poisson, quantile, mape, gamma, tweedie, binary, multiclass, multiclassova, cross_entropy, cross_entropy_lambda, lambdarank, rank_xendcg; aliases: objective_type, app, application, loss
eval_metric [ default according to objective ]
The metric(s) to evaluate on the validation data.
By default the metric matching the objective is used (e.g. l2 for regression, binary_logloss for binary classification).
Typical values are:
- rmse – root mean squared error
- mae – mean absolute error
- binary_logloss – negative log-likelihood for binary classification
- binary_error – binary classification error rate (0.5 threshold)
- multi_error – multiclass classification error rate
- multi_logloss – multiclass logloss
- auc – area under the ROC curve
[LightGBM]
Hyperparameter tuning
n_estimators, learning_rate, max_depth, reg_alpha
LightGBM is one of the algorithms with a very large number of hyperparameters.
Tuning just the four above well is usually enough to get good results.