ntk1750.log
노태경 (Noh Tae-kyung)
Tag list
  • View all (7)
  • AI(2)
  • Basics (1)
  • cn231n(1)
  • gradient(1)
  • DeppLearning(1)
  • Backpropagation(1)
  • Object Detection(1)
  • instance(1)
  • ObjectDetection(1)
  • Regularization(1)
  • overfitting(1)
  • Autonomous driving (1)
  • Chain Rule(1)
  • Accuracy (1)
  • Classification problem (1)
  • lossfunction(1)
  • Process (1)
  • class(1)
  • model(1)
  • cs231n(1)
  • SVMloss(1)
  • FCLayer(1)
  • feature(1)
  • CAM(1)
  • Convolutional Neural Networks(1)
  • lecture3(1)
  • pooling(1)
  • Segmenation(1)
  • semantic(1)
  • Artificial intelligence (1)
  • layer(1)
  • Exploding(1)
  • Resnet(1)
  • Activation(1)
  • Vanishing(1)

Catch the loss_Fun Bug! Regularization

<Formula for computing SVM loss> Suppose we used the SVM loss and found a Weight with Loss = 0. Is that Weight the only value that achieves Loss = 0?! And what happens to the Loss if, at Loss = 0, we square the Weight ($Weight^2$)?! In other words, by squaring the weight
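The excerpt's question (is a zero-loss Weight unique?) can be sketched with the cs231n-style multiclass SVM loss. This is a minimal illustration, not the post's own code: the function name `svm_loss` and the toy data are assumptions, and it demonstrates the standard cs231n example of scaling the weights (e.g. 2W), which also leaves the hinge loss at 0 until a regularization term breaks the tie.

```python
import numpy as np

def svm_loss(W, X, y, reg=0.0, delta=1.0):
    """Multiclass SVM (hinge) loss with an optional L2 regularization term.

    W: (D, C) weights, X: (N, D) inputs, y: (N,) integer class labels.
    Illustrative sketch of the cs231n lecture-3 formulation.
    """
    scores = X @ W                                   # (N, C) class scores
    correct = scores[np.arange(len(y)), y][:, None]  # (N, 1) correct-class score
    margins = np.maximum(0, scores - correct + delta)
    margins[np.arange(len(y)), y] = 0                # the correct class contributes no margin
    data_loss = margins.sum() / len(y)
    return data_loss + reg * np.sum(W * W)           # L2 penalty distinguishes W from 2W

# Toy data: W already satisfies every margin, so the data loss is 0.
X = np.array([[1.0, 2.0]])
y = np.array([0])
W = np.array([[ 3.0, -3.0],
              [ 2.0, -2.0]])   # scores: class 0 = 7, class 1 = -7

print(svm_loss(W, X, y))       # 0.0 -> but W is not unique:
print(svm_loss(2 * W, X, y))   # 0.0 -> 2W also gives Loss = 0
print(svm_loss(2 * W, X, y, reg=0.1) > svm_loss(W, X, y, reg=0.1))  # True
```

With `reg > 0`, the L2 penalty `reg * sum(W**2)` grows as the weights grow, so among all weight matrices with zero data loss the regularized loss prefers the smaller one.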

Regularization · SVMloss · cn231n · lecture3 · lossfunction · overfitting
January 14, 2022
·
0 comments
·
0