ML 6: Classification with Logistic Regression
brandon · August 7, 2023
1. Logistic Regression
The sigmoid function addresses the classification problem well: g(z) = 1 / (1 + e^-z) maps any real number into the interval (0, 1).
The output of logistic regression, f(x) = g(w * x + b), is the probability that the class is 1 (positive).
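This can be sketched in a few lines of NumPy (a minimal illustration; the names `sigmoid` and `predict_proba` are mine, not from the post):

```python
import numpy as np

def sigmoid(z):
    """Map any real number into the open interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(x, w, b):
    """Logistic regression output: the estimated probability that class is 1."""
    return sigmoid(np.dot(w, x) + b)
```

At z = 0 the sigmoid outputs exactly 0.5, and it saturates toward 1 for large positive inputs.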
2. Decision Boundary
Predictions above the threshold are classified as positive; those below, as negative.
The threshold does not need to be 0.5.
For high-stakes subjects like brain tumors, a lower threshold is safer, because a false positive (flagging a healthy patient for further review) is less harmful than a false negative (missing a tumor).
If w * x + b >= 0, the prediction is positive. The set of points where w * x + b = 0 forms the decision boundary.
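The thresholding rule above can be sketched as follows (a minimal example; the weights and the `predict` helper are hypothetical, chosen just for illustration):

```python
import numpy as np

def predict(x, w, b, threshold=0.5):
    """Classify x as positive (1) when the model's probability meets the threshold.

    With the default threshold of 0.5, sigmoid(w.x + b) >= 0.5 exactly when
    w.x + b >= 0, so the decision boundary is the hyperplane w.x + b = 0.
    """
    p = 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))
    return 1 if p >= threshold else 0

w, b = np.array([1.0, 1.0]), -3.0
borderline = np.array([1.0, 1.0])      # w.x + b = -1, probability ~0.27
print(predict(borderline, w, b))        # default threshold: negative
print(predict(borderline, w, b, 0.2))   # lower threshold: flagged positive
```

Lowering the threshold flags more borderline cases as positive, which is exactly the trade-off described above for tumor screening.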
3. Cost Function for Logistic Regression
The squared-error cost function for logistic regression is non-convex, with many local minima, which makes it unsuitable for gradient descent.
For y = 1, the loss is -log(f(x)): the cost approaches 0 as the prediction approaches 1.
As the prediction approaches 0, the cost goes to infinity.
For y = 0, the loss is -log(1 - f(x)), the mirror image: near-zero cost for predictions close to 0, unbounded cost for predictions close to 1.
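The two branches of the loss can be sketched directly (a minimal example; the `logistic_loss` name is mine, not from the post):

```python
import numpy as np

def logistic_loss(f, y):
    """Per-example log loss for logistic regression.

    f is the model's predicted probability that the class is 1;
    y is the true label (0 or 1).
    """
    return -np.log(f) if y == 1 else -np.log(1.0 - f)

print(logistic_loss(0.99, 1))  # confident and correct: tiny cost
print(logistic_loss(0.01, 1))  # confident and wrong: huge cost
```

A confident correct prediction costs almost nothing, while a confident wrong one is punished heavily, which is what makes this loss well suited to gradient descent.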
1 comment
happy · August 7, 2023
This was a great help, thank you.