[diary #6] cross entropy

kamchur · October 15, 2022

😁START

loss function

$$E(W, b) = -\sum_{i=1}^{n}\{\,t_i\log y_i + (1-t_i)\log(1-y_i)\,\}$$
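To get a feel for the numbers (a worked example I added, values chosen arbitrarily): for one sample with target $t_i = 1$, the loss reduces to $-\log y_i$, so a confident correct prediction $y_i = 0.9$ costs $-\log 0.9 \approx 0.105$, while a confident wrong prediction $y_i = 0.1$ costs $-\log 0.1 \approx 2.303$. Bad, confident predictions are punished hard.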

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def loss_function(x, t, w, b):
    # t : target labels (0 or 1), w : weight, b : bias
    delta_x = 1e-5              # small constant to avoid log(0)

    z = np.dot(x, w) + b        # linear model output
    y_hat = sigmoid(z)          # predicted probability y_i

    # binary cross-entropy: -sum{ t*log(y) + (1-t)*log(1-y) }
    cross_entropy = -np.sum(
        t * np.log(y_hat + delta_x) + (1 - t) * np.log((1 - y_hat) + delta_x)
    )

    return cross_entropy
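
A minimal usage sketch (the toy data and initial weights below are made up for illustration):

# hypothetical toy data: 4 samples, 2 features each
x = np.array([[0.1, 0.2],
              [0.4, 0.1],
              [0.6, 0.9],
              [0.8, 0.7]])
t = np.array([0, 0, 1, 1])          # binary target labels
w = np.array([0.5, 0.5])            # assumed initial weights
b = 0.0                             # assumed initial bias

print(loss_function(x, t, w, b))    # total loss over the 4 samples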

😂END

The loss function formula is still difficult for me.

2022.10.15. first commit
