loss function
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def loss_function(x, y):
    delta_x = 1e-5  # small constant to keep log() away from log(0)
    z = np.dot(x, w) + b  # w : weight, b : bias (defined outside this function)
    y_hat = sigmoid(z)
    cross_entropy = -np.sum(
        y * np.log(y_hat + delta_x) + (1 - y) * np.log((1 - y_hat) + delta_x)
    )
    return cross_entropy
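A minimal, self-contained sketch of calling this loss function. The weights `w`, bias `b`, and the sample data below are hypothetical values I made up for illustration; the real ones would come from training.

```python
import numpy as np

# Hypothetical parameters for illustration only
w = np.array([0.5, -0.3])
b = 0.1

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def loss_function(x, y):
    delta_x = 1e-5  # avoids log(0)
    z = np.dot(x, w) + b
    y_hat = sigmoid(z)
    return -np.sum(
        y * np.log(y_hat + delta_x) + (1 - y) * np.log((1 - y_hat) + delta_x)
    )

# Two hypothetical samples with labels 1 and 0
x = np.array([[1.0, 2.0], [0.5, -1.0]])
y = np.array([1, 0])
loss = loss_function(x, y)
print(loss)  # a single non-negative scalar; smaller means better predictions
```

The loss sums the cross-entropy over all samples, so it grows with batch size; averaging with `np.mean` instead of `np.sum` is a common alternative.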
The cross-entropy loss formula was difficult for me to understand.
2022.10.15. first commit