ML 7: Cost Function for Logistic Regression
brandon · Aug 8, 2023
SupervisedML 7/27
1. Logistic Loss Function
With logistic regression, the squared-error cost function is non-convex: it has many local minima, which makes it a poor fit for gradient descent.
For y = 1, the loss is the upper branch, -log(f(x)): it falls toward 0 as the prediction f(x) approaches 1, and grows without bound (toward infinity) as f(x) approaches 0.
For y = 0, the branches are mirrored: the loss -log(1 - f(x)) is near 0 when f(x) is near 0 and goes to infinity as f(x) approaches 1.
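The two branches above can be sketched directly. This is a minimal illustration (the names `sigmoid` and `loss` are my own, not from the course):

```python
import numpy as np

def sigmoid(z):
    # Logistic function: maps any real z into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def loss(f, y):
    # Branch-wise logistic loss for a single prediction f = f(x):
    #   y = 1 -> -log(f):     near 0 when f -> 1, grows to infinity as f -> 0
    #   y = 0 -> -log(1 - f): near 0 when f -> 0, grows to infinity as f -> 1
    if y == 1:
        return -np.log(f)
    return -np.log(1.0 - f)
```

Evaluating `loss(0.99, 1)` versus `loss(0.01, 1)` shows the asymmetry: a confident correct prediction costs almost nothing, while a confident wrong one is punished heavily.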
2. Simplified Cost Function
Because y is always either 1 or 0, the two branches can be merged into a single expression, L(f(x), y) = -y log(f(x)) - (1 - y) log(1 - f(x)): whichever term is multiplied by 0 drops out.
The cost J(w, b) is then the average of this loss over all m training examples.
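The simplified cost can be written vectorized in a few lines. A minimal sketch, assuming NumPy arrays and the helper names `sigmoid` and `cost` (my own labels):

```python
import numpy as np

def sigmoid(z):
    # Logistic function: maps any real z into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def cost(X, y, w, b):
    # Simplified logistic cost: per example,
    #   -y*log(f) - (1 - y)*log(1 - f)
    # (one term vanishes since y is 0 or 1), averaged over m examples.
    m = X.shape[0]
    f = sigmoid(X @ w + b)                       # predictions, shape (m,)
    return -np.sum(y * np.log(f) + (1 - y) * np.log(1 - f)) / m

# Toy usage: 3 examples, 1 feature
X = np.array([[1.0], [2.0], [-1.0]])
y = np.array([1.0, 1.0, 0.0])
w = np.array([2.0])
b = 0.0
print(cost(X, y, w, b))  # small positive number: predictions match labels well
```

Writing both terms in one expression keeps the code branch-free, which is what makes the vectorized form over all m examples possible.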
Previous post: ML 6: Classification with Logistic Regression
Next post: ML 8: Gradient Descent for Logistic Regression