251009 - Project (14)

TaeHyun · October 9, 2025

TIL


Starting Off

Model v4 has been training for about 30 hours now. It is running for 20 epochs, and while I haven't tested it yet, I have a feeling I should start preparing v5. The results looked fairly good through the middle and later stages, but at epoch 19 the accuracy suddenly dropped by 3 percentage points, possibly due to overfitting. I'll have to test it once training finishes, but if the plastic class accuracy still comes out low, I'm thinking of trying things like tuning the learning rate or changing the optimizer.
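The epoch-19 drop is the kind of thing that's easier to handle by tracking the best validation score and keeping that checkpoint, rather than always using the final epoch. A minimal sketch of the idea in plain Python (this is not the project's actual training code; in a real loop you would save the model's `state_dict` whenever a new best appears, and the accuracy values fed in below are just the validation numbers from the log):

```python
class BestCheckpointTracker:
    """Track the epoch with the best validation accuracy and
    optionally signal early stopping after `patience` epochs
    without improvement."""

    def __init__(self, patience=3):
        self.best_acc = 0.0
        self.best_epoch = None
        self.patience = patience
        self.epochs_since_best = 0

    def update(self, epoch, val_acc):
        """Record one epoch's result; returns True when training
        has gone `patience` epochs without a new best."""
        if val_acc > self.best_acc:
            self.best_acc = val_acc
            self.best_epoch = epoch
            self.epochs_since_best = 0
        else:
            self.epochs_since_best += 1
        return self.epochs_since_best >= self.patience


# Illustrative run over the last few validation accuracies from the log
tracker = BestCheckpointTracker(patience=3)
for epoch, acc in [(15, 0.880), (16, 0.868), (17, 0.872), (18, 0.873), (19, 0.840)]:
    stop = tracker.update(epoch, acc)

print(tracker.best_epoch, tracker.best_acc)  # → 15 0.88
```

With these numbers the tracker would have flagged a stop at epoch 18 and kept the epoch-15 weights, instead of ending on the degraded epoch-19 model.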

Model v4 Training

[Epoch] 1 / 20
[batch :  100], Loss : 0.362177 ( 6336 / 92661)
[batch :  200], Loss : 0.272911 (12736 / 92661)
[batch :  300], Loss : 0.253460 (19136 / 92661)
[batch :  400], Loss : 0.173520 (25536 / 92661)
[batch :  500], Loss : 0.160706 (31936 / 92661)
[batch :  600], Loss : 0.185602 (38336 / 92661)
[batch :  700], Loss : 0.072793 (44736 / 92661)
[batch :  800], Loss : 0.251362 (51136 / 92661)
[batch :  900], Loss : 0.112455 (57536 / 92661)
[batch :  1000], Loss : 0.115068 (63936 / 92661)
[batch :  1100], Loss : 0.273216 (70336 / 92661)
[batch :  1200], Loss : 0.223410 (76736 / 92661)
[batch :  1300], Loss : 0.115687 (83136 / 92661)
[batch :  1400], Loss : 0.103487 (89536 / 92661)
Validation Results: Accuracy: 0.807 (80.7%), Avg loss: 0.5895

[Epoch] 2 / 20
[batch :  100], Loss : 0.065409 ( 6336 / 92661)
[batch :  200], Loss : 0.085801 (12736 / 92661)
[batch :  300], Loss : 0.062291 (19136 / 92661)
[batch :  400], Loss : 0.126534 (25536 / 92661)
[batch :  500], Loss : 0.057509 (31936 / 92661)
[batch :  600], Loss : 0.072117 (38336 / 92661)
[batch :  700], Loss : 0.108745 (44736 / 92661)
[batch :  800], Loss : 0.057688 (51136 / 92661)
[batch :  900], Loss : 0.041158 (57536 / 92661)
[batch :  1000], Loss : 0.137404 (63936 / 92661)
[batch :  1100], Loss : 0.096874 (70336 / 92661)
[batch :  1200], Loss : 0.151995 (76736 / 92661)
[batch :  1300], Loss : 0.042159 (83136 / 92661)
[batch :  1400], Loss : 0.125823 (89536 / 92661)
Validation Results: Accuracy: 0.836 (83.6%), Avg loss: 0.5410

[Epoch] 3 / 20
[batch :  100], Loss : 0.064552 ( 6336 / 92661)
[batch :  200], Loss : 0.015281 (12736 / 92661)
[batch :  300], Loss : 0.013923 (19136 / 92661)
[batch :  400], Loss : 0.016526 (25536 / 92661)
[batch :  500], Loss : 0.108855 (31936 / 92661)
[batch :  600], Loss : 0.121082 (38336 / 92661)
[batch :  700], Loss : 0.038809 (44736 / 92661)
[batch :  800], Loss : 0.029595 (51136 / 92661)
[batch :  900], Loss : 0.054477 (57536 / 92661)
[batch :  1000], Loss : 0.066415 (63936 / 92661)
[batch :  1100], Loss : 0.026475 (70336 / 92661)
[batch :  1200], Loss : 0.061759 (76736 / 92661)
[batch :  1300], Loss : 0.158551 (83136 / 92661)
[batch :  1400], Loss : 0.044870 (89536 / 92661)
Validation Results: Accuracy: 0.819 (81.9%), Avg loss: 0.6091

[Epoch] 4 / 20
[batch :  100], Loss : 0.071000 ( 6336 / 92661)
[batch :  200], Loss : 0.049418 (12736 / 92661)
[batch :  300], Loss : 0.107890 (19136 / 92661)
[batch :  400], Loss : 0.008941 (25536 / 92661)
[batch :  500], Loss : 0.054232 (31936 / 92661)
[batch :  600], Loss : 0.027066 (38336 / 92661)
[batch :  700], Loss : 0.016244 (44736 / 92661)
[batch :  800], Loss : 0.029642 (51136 / 92661)
[batch :  900], Loss : 0.012548 (57536 / 92661)
[batch :  1000], Loss : 0.021828 (63936 / 92661)
[batch :  1100], Loss : 0.032511 (70336 / 92661)
[batch :  1200], Loss : 0.027823 (76736 / 92661)
[batch :  1300], Loss : 0.019019 (83136 / 92661)
[batch :  1400], Loss : 0.042878 (89536 / 92661)
Validation Results: Accuracy: 0.849 (84.9%), Avg loss: 0.5323

[Epoch] 5 / 20
[batch :  100], Loss : 0.018627 ( 6336 / 92661)
[batch :  200], Loss : 0.043771 (12736 / 92661)
[batch :  300], Loss : 0.035218 (19136 / 92661)
[batch :  400], Loss : 0.016184 (25536 / 92661)
[batch :  500], Loss : 0.025136 (31936 / 92661)
[batch :  600], Loss : 0.009468 (38336 / 92661)
[batch :  700], Loss : 0.011016 (44736 / 92661)
[batch :  800], Loss : 0.107133 (51136 / 92661)
[batch :  900], Loss : 0.021917 (57536 / 92661)
[batch :  1000], Loss : 0.021748 (63936 / 92661)
[batch :  1100], Loss : 0.016369 (70336 / 92661)
[batch :  1200], Loss : 0.034794 (76736 / 92661)
[batch :  1300], Loss : 0.014892 (83136 / 92661)
[batch :  1400], Loss : 0.131458 (89536 / 92661)
Validation Results: Accuracy: 0.830 (83.0%), Avg loss: 0.6251

[Epoch] 6 / 20
[batch :  100], Loss : 0.112842 ( 6336 / 92661)
[batch :  200], Loss : 0.018399 (12736 / 92661)
[batch :  300], Loss : 0.013338 (19136 / 92661)
[batch :  400], Loss : 0.006084 (25536 / 92661)
[batch :  500], Loss : 0.005993 (31936 / 92661)
[batch :  600], Loss : 0.040624 (38336 / 92661)
[batch :  700], Loss : 0.003857 (44736 / 92661)
[batch :  800], Loss : 0.028897 (51136 / 92661)
[batch :  900], Loss : 0.003008 (57536 / 92661)
[batch :  1000], Loss : 0.032947 (63936 / 92661)
[batch :  1100], Loss : 0.037042 (70336 / 92661)
[batch :  1200], Loss : 0.043529 (76736 / 92661)
[batch :  1300], Loss : 0.012494 (83136 / 92661)
[batch :  1400], Loss : 0.054238 (89536 / 92661)
Validation Results: Accuracy: 0.852 (85.2%), Avg loss: 0.5423

[Epoch] 7 / 20
[batch :  100], Loss : 0.046341 ( 6336 / 92661)
[batch :  200], Loss : 0.004629 (12736 / 92661)
[batch :  300], Loss : 0.059395 (19136 / 92661)
[batch :  400], Loss : 0.012804 (25536 / 92661)
[batch :  500], Loss : 0.017760 (31936 / 92661)
[batch :  600], Loss : 0.023058 (38336 / 92661)
[batch :  700], Loss : 0.039415 (44736 / 92661)
[batch :  800], Loss : 0.072791 (51136 / 92661)
[batch :  900], Loss : 0.085310 (57536 / 92661)
[batch :  1000], Loss : 0.225461 (63936 / 92661)
[batch :  1100], Loss : 0.005818 (70336 / 92661)
[batch :  1200], Loss : 0.021281 (76736 / 92661)
[batch :  1300], Loss : 0.021883 (83136 / 92661)
[batch :  1400], Loss : 0.083328 (89536 / 92661)
Validation Results: Accuracy: 0.872 (87.2%), Avg loss: 0.4974

[Epoch] 8 / 20
[batch :  100], Loss : 0.017529 ( 6336 / 92661)
[batch :  200], Loss : 0.008949 (12736 / 92661)
[batch :  300], Loss : 0.002719 (19136 / 92661)
[batch :  400], Loss : 0.063323 (25536 / 92661)
[batch :  500], Loss : 0.056177 (31936 / 92661)
[batch :  600], Loss : 0.013284 (38336 / 92661)
[batch :  700], Loss : 0.014647 (44736 / 92661)
[batch :  800], Loss : 0.085098 (51136 / 92661)
[batch :  900], Loss : 0.012870 (57536 / 92661)
[batch :  1000], Loss : 0.076708 (63936 / 92661)
[batch :  1100], Loss : 0.012906 (70336 / 92661)
[batch :  1200], Loss : 0.004933 (76736 / 92661)
[batch :  1300], Loss : 0.017708 (83136 / 92661)
[batch :  1400], Loss : 0.044334 (89536 / 92661)
Validation Results: Accuracy: 0.872 (87.2%), Avg loss: 0.5048

[Epoch] 9 / 20
[batch :  100], Loss : 0.026294 ( 6336 / 92661)
[batch :  200], Loss : 0.023511 (12736 / 92661)
[batch :  300], Loss : 0.086351 (19136 / 92661)
[batch :  400], Loss : 0.008884 (25536 / 92661)
[batch :  500], Loss : 0.100148 (31936 / 92661)
[batch :  600], Loss : 0.018110 (38336 / 92661)
[batch :  700], Loss : 0.107957 (44736 / 92661)
[batch :  800], Loss : 0.018386 (51136 / 92661)
[batch :  900], Loss : 0.001005 (57536 / 92661)
[batch :  1000], Loss : 0.007597 (63936 / 92661)
[batch :  1100], Loss : 0.014448 (70336 / 92661)
[batch :  1200], Loss : 0.022159 (76736 / 92661)
[batch :  1300], Loss : 0.016583 (83136 / 92661)
[batch :  1400], Loss : 0.033243 (89536 / 92661)
Validation Results: Accuracy: 0.827 (82.7%), Avg loss: 0.6587

[Epoch] 10 / 20
[batch :  100], Loss : 0.006628 ( 6336 / 92661)
[batch :  200], Loss : 0.035669 (12736 / 92661)
[batch :  300], Loss : 0.040949 (19136 / 92661)
[batch :  400], Loss : 0.015860 (25536 / 92661)
[batch :  500], Loss : 0.002217 (31936 / 92661)
[batch :  600], Loss : 0.001684 (38336 / 92661)
[batch :  700], Loss : 0.001312 (44736 / 92661)
[batch :  800], Loss : 0.009470 (51136 / 92661)
[batch :  900], Loss : 0.006152 (57536 / 92661)
[batch :  1000], Loss : 0.003599 (63936 / 92661)
[batch :  1100], Loss : 0.062997 (70336 / 92661)
[batch :  1200], Loss : 0.012266 (76736 / 92661)
[batch :  1300], Loss : 0.013659 (83136 / 92661)
[batch :  1400], Loss : 0.021846 (89536 / 92661)
Validation Results: Accuracy: 0.862 (86.2%), Avg loss: 0.5604

[Epoch] 11 / 20
[batch :  100], Loss : 0.010269 ( 6336 / 92661)
[batch :  200], Loss : 0.104387 (12736 / 92661)
[batch :  300], Loss : 0.023013 (19136 / 92661)
[batch :  400], Loss : 0.006130 (25536 / 92661)
[batch :  500], Loss : 0.042128 (31936 / 92661)
[batch :  600], Loss : 0.001363 (38336 / 92661)
[batch :  700], Loss : 0.007690 (44736 / 92661)
[batch :  800], Loss : 0.019090 (51136 / 92661)
[batch :  900], Loss : 0.073161 (57536 / 92661)
[batch :  1000], Loss : 0.038282 (63936 / 92661)
[batch :  1100], Loss : 0.060245 (70336 / 92661)
[batch :  1200], Loss : 0.058648 (76736 / 92661)
[batch :  1300], Loss : 0.138333 (83136 / 92661)
[batch :  1400], Loss : 0.007073 (89536 / 92661)
Validation Results: Accuracy: 0.876 (87.6%), Avg loss: 0.4720

[Epoch] 12 / 20
[batch :  100], Loss : 0.055030 ( 6336 / 92661)
[batch :  200], Loss : 0.002910 (12736 / 92661)
[batch :  300], Loss : 0.037388 (19136 / 92661)
[batch :  400], Loss : 0.003176 (25536 / 92661)
[batch :  500], Loss : 0.032907 (31936 / 92661)
[batch :  600], Loss : 0.027280 (38336 / 92661)
[batch :  700], Loss : 0.002279 (44736 / 92661)
[batch :  800], Loss : 0.016052 (51136 / 92661)
[batch :  900], Loss : 0.007381 (57536 / 92661)
[batch :  1000], Loss : 0.037064 (63936 / 92661)
[batch :  1100], Loss : 0.009007 (70336 / 92661)
[batch :  1200], Loss : 0.012664 (76736 / 92661)
[batch :  1300], Loss : 0.015841 (83136 / 92661)
[batch :  1400], Loss : 0.004255 (89536 / 92661)
Validation Results: Accuracy: 0.846 (84.6%), Avg loss: 0.6312

[Epoch] 13 / 20
[batch :  100], Loss : 0.034756 ( 6336 / 92661)
[batch :  200], Loss : 0.010978 (12736 / 92661)
[batch :  300], Loss : 0.033007 (19136 / 92661)
[batch :  400], Loss : 0.023683 (25536 / 92661)
[batch :  500], Loss : 0.003449 (31936 / 92661)
[batch :  600], Loss : 0.013240 (38336 / 92661)
[batch :  700], Loss : 0.028034 (44736 / 92661)
[batch :  800], Loss : 0.002425 (51136 / 92661)
[batch :  900], Loss : 0.019244 (57536 / 92661)
[batch :  1000], Loss : 0.026254 (63936 / 92661)
[batch :  1100], Loss : 0.085374 (70336 / 92661)
[batch :  1200], Loss : 0.001378 (76736 / 92661)
[batch :  1300], Loss : 0.016276 (83136 / 92661)
[batch :  1400], Loss : 0.000834 (89536 / 92661)
Validation Results: Accuracy: 0.855 (85.5%), Avg loss: 0.5965

[Epoch] 14 / 20
[batch :  100], Loss : 0.007191 ( 6336 / 92661)
[batch :  200], Loss : 0.023955 (12736 / 92661)
[batch :  300], Loss : 0.012830 (19136 / 92661)
[batch :  400], Loss : 0.014140 (25536 / 92661)
[batch :  500], Loss : 0.007067 (31936 / 92661)
[batch :  600], Loss : 0.013766 (38336 / 92661)
[batch :  700], Loss : 0.013444 (44736 / 92661)
[batch :  800], Loss : 0.005513 (51136 / 92661)
[batch :  900], Loss : 0.031444 (57536 / 92661)
[batch :  1000], Loss : 0.004867 (63936 / 92661)
[batch :  1100], Loss : 0.002546 (70336 / 92661)
[batch :  1200], Loss : 0.001847 (76736 / 92661)
[batch :  1300], Loss : 0.099273 (83136 / 92661)
[batch :  1400], Loss : 0.047551 (89536 / 92661)
Validation Results: Accuracy: 0.873 (87.3%), Avg loss: 0.5378

[Epoch] 15 / 20
[batch :  100], Loss : 0.029984 ( 6336 / 92661)
[batch :  200], Loss : 0.007948 (12736 / 92661)
[batch :  300], Loss : 0.000871 (19136 / 92661)
[batch :  400], Loss : 0.000664 (25536 / 92661)
[batch :  500], Loss : 0.000564 (31936 / 92661)
[batch :  600], Loss : 0.000964 (38336 / 92661)
[batch :  700], Loss : 0.019516 (44736 / 92661)
[batch :  800], Loss : 0.057788 (51136 / 92661)
[batch :  900], Loss : 0.004707 (57536 / 92661)
[batch :  1000], Loss : 0.005720 (63936 / 92661)
[batch :  1100], Loss : 0.004432 (70336 / 92661)
[batch :  1200], Loss : 0.001483 (76736 / 92661)
[batch :  1300], Loss : 0.001588 (83136 / 92661)
[batch :  1400], Loss : 0.014402 (89536 / 92661)
Validation Results: Accuracy: 0.880 (88.0%), Avg loss: 0.5195

[Epoch] 16 / 20
[batch :  100], Loss : 0.001642 ( 6336 / 92661)
[batch :  200], Loss : 0.011642 (12736 / 92661)
[batch :  300], Loss : 0.043895 (19136 / 92661)
[batch :  400], Loss : 0.133541 (25536 / 92661)
[batch :  500], Loss : 0.003559 (31936 / 92661)
[batch :  600], Loss : 0.017483 (38336 / 92661)
[batch :  700], Loss : 0.008289 (44736 / 92661)
[batch :  800], Loss : 0.001678 (51136 / 92661)
[batch :  900], Loss : 0.004521 (57536 / 92661)
[batch :  1000], Loss : 0.017863 (63936 / 92661)
[batch :  1100], Loss : 0.002319 (70336 / 92661)
[batch :  1200], Loss : 0.001716 (76736 / 92661)
[batch :  1300], Loss : 0.079114 (83136 / 92661)
[batch :  1400], Loss : 0.001684 (89536 / 92661)
Validation Results: Accuracy: 0.868 (86.8%), Avg loss: 0.5780

[Epoch] 17 / 20
[batch :  100], Loss : 0.006200 ( 6336 / 92661)
[batch :  200], Loss : 0.017370 (12736 / 92661)
[batch :  300], Loss : 0.006725 (19136 / 92661)
[batch :  400], Loss : 0.034498 (25536 / 92661)
[batch :  500], Loss : 0.163961 (31936 / 92661)
[batch :  600], Loss : 0.017790 (38336 / 92661)
[batch :  700], Loss : 0.003818 (44736 / 92661)
[batch :  800], Loss : 0.001123 (51136 / 92661)
[batch :  900], Loss : 0.040660 (57536 / 92661)
[batch :  1000], Loss : 0.001056 (63936 / 92661)
[batch :  1100], Loss : 0.021301 (70336 / 92661)
[batch :  1200], Loss : 0.004215 (76736 / 92661)
[batch :  1300], Loss : 0.017027 (83136 / 92661)
[batch :  1400], Loss : 0.060951 (89536 / 92661)
Validation Results: Accuracy: 0.872 (87.2%), Avg loss: 0.5641

[Epoch] 18 / 20
[batch :  100], Loss : 0.008092 ( 6336 / 92661)
[batch :  200], Loss : 0.001458 (12736 / 92661)
[batch :  300], Loss : 0.002501 (19136 / 92661)
[batch :  400], Loss : 0.003560 (25536 / 92661)
[batch :  500], Loss : 0.005780 (31936 / 92661)
[batch :  600], Loss : 0.042658 (38336 / 92661)
[batch :  700], Loss : 0.016040 (44736 / 92661)
[batch :  800], Loss : 0.040591 (51136 / 92661)
[batch :  900], Loss : 0.011695 (57536 / 92661)
[batch :  1000], Loss : 0.001914 (63936 / 92661)
[batch :  1100], Loss : 0.000117 (70336 / 92661)
[batch :  1200], Loss : 0.010791 (76736 / 92661)
[batch :  1300], Loss : 0.013822 (83136 / 92661)
[batch :  1400], Loss : 0.000674 (89536 / 92661)
Validation Results: Accuracy: 0.873 (87.3%), Avg loss: 0.5637

[Epoch] 19 / 20
[batch :  100], Loss : 0.007179 ( 6336 / 92661)
[batch :  200], Loss : 0.004101 (12736 / 92661)
[batch :  300], Loss : 0.032004 (19136 / 92661)
[batch :  400], Loss : 0.013676 (25536 / 92661)
[batch :  500], Loss : 0.005328 (31936 / 92661)
[batch :  600], Loss : 0.013028 (38336 / 92661)
[batch :  700], Loss : 0.004207 (44736 / 92661)
[batch :  800], Loss : 0.012788 (51136 / 92661)
[batch :  900], Loss : 0.006323 (57536 / 92661)
[batch :  1000], Loss : 0.000468 (63936 / 92661)
[batch :  1100], Loss : 0.010100 (70336 / 92661)
[batch :  1200], Loss : 0.044969 (76736 / 92661)
[batch :  1300], Loss : 0.018788 (83136 / 92661)
[batch :  1400], Loss : 0.005582 (89536 / 92661)
Validation Results: Accuracy: 0.840 (84.0%), Avg loss: 0.6608
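Since the overall accuracy above can hide a single weak class, the plastic-class issue is easier to diagnose with a per-class breakdown of the validation predictions. A minimal sketch in plain Python (the class names and the label/prediction lists here are made up for illustration and are not the project's actual validation output):

```python
from collections import defaultdict

def per_class_accuracy(labels, preds):
    """Compute accuracy separately for each ground-truth class."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for y, p in zip(labels, preds):
        total[y] += 1
        if y == p:
            correct[y] += 1
    return {cls: correct[cls] / total[cls] for cls in total}

# Hypothetical validation results for a few waste categories
labels = ["plastic", "plastic", "plastic", "plastic", "glass", "glass", "paper", "paper"]
preds  = ["plastic", "glass",   "paper",   "plastic", "glass", "glass", "paper", "paper"]

acc = per_class_accuracy(labels, preds)
print(acc)  # → {'plastic': 0.5, 'glass': 1.0, 'paper': 1.0}
```

Running this on the real validation set would show directly whether the plastic class is dragging the average down before deciding on learning-rate or optimizer changes.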

Wrapping Up

I probably won't start training Model v5 right away tomorrow; I expect to try v5 through v7 once most of the other required features are implemented. This project keeps reminding me just how hard it is to build a high-performing AI model.
