Reduce on Plateau Learning Rate Scheduler

박정재 · February 10, 2023

During deep-learning training, this scheduler reduces the learning rate when a monitored metric stops improving.

ReduceLROnPlateau
Reduce learning rate when a metric has stopped improving. Models often benefit from reducing the learning rate by a factor of 2-10 once learning stagnates. This scheduler reads a metrics quantity and if no improvement is seen for a ‘patience’ number of epochs, the learning rate is reduced.

# Example 1 (model and CFG["LEARNING_RATE"] are defined elsewhere)
optimizer = torch.optim.Adam(params=model.parameters(), lr=CFG["LEARNING_RATE"])
# mode='max': the monitored metric should increase; halve the LR (factor=0.5)
# when it has not improved for 2 epochs (patience=2), never going below min_lr.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode='max', factor=0.5, patience=2,
                                                       threshold_mode='abs', min_lr=1e-8, verbose=True)

# Example 2
from torch.optim.lr_scheduler import ReduceLROnPlateau

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
# mode='min': reduce the LR when the monitored quantity (e.g. validation loss) stops decreasing.
scheduler = ReduceLROnPlateau(optimizer, 'min')
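
Neither example shows the call that actually drives the scheduler: unlike most schedulers, ReduceLROnPlateau needs the monitored metric passed to `step()` after each validation pass. Below is a minimal sketch of that loop; `train_one_epoch`, `validate`, `model`, and `num_epochs` are hypothetical placeholders, not part of the examples above.

import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.1, patience=5)

for epoch in range(num_epochs):
    train_one_epoch(model, optimizer)   # hypothetical training helper
    val_loss = validate(model)          # hypothetical helper returning validation loss
    scheduler.step(val_loss)            # pass the monitored metric here
    # the current learning rate can be read from the optimizer's param_groups
    print(epoch, optimizer.param_groups[0]['lr'])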