TIL(22.6.10)

윤승현 · June 18, 2022

Artificial Neural Network (ANN)

An ANN is a computational model that mimics the features of the brain's real neural network. A perceptron receives multiple input signals and produces a single output signal.

Input Layer + Hidden Layers + Output Layer
The difference between deep learning and traditional machine learning lies in representation learning: a deep network learns useful feature representations from the data itself instead of relying on hand-crafted features. A minimal forward-pass sketch of this layer structure follows below.
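A minimal sketch of a forward pass through a tiny 2-3-1 network, assuming arbitrary example weights and biases (none of these numbers come from the original post; the sigmoid activation is covered in detail below):

import numpy as np

def sigmoid(x):
	return 1 / (1 + np.exp(-x))  # activation function, defined again below

x = np.array([1.0, 0.5])              # input layer: 2 values
W1 = np.array([[0.1, 0.3, 0.5],
               [0.2, 0.4, 0.6]])      # input -> hidden weights
b1 = np.array([0.1, 0.2, 0.3])        # hidden-layer biases
W2 = np.array([[0.1], [0.2], [0.3]])  # hidden -> output weights
b2 = np.array([0.1])                  # output-layer bias

h = sigmoid(x @ W1 + b1)              # hidden layer: 3 nodes
y = h @ W2 + b2                       # output layer: 1 value
print(y)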

Activation Function

a = b + w1x1 + w2x2
y = h(a)
Each node of the neural network has an activation function; it determines whether the node transmits a signal or not.
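A minimal sketch of this single-node computation, assuming example values for the inputs, weights, and bias, and using the step function (defined next) as h:

import numpy as np

def h(a):
	return np.array(a > 0, dtype=int)  # step activation, covered below

x1, x2 = 1.0, 0.5   # example inputs
w1, w2 = 0.5, 0.25  # example weights
b = -0.5            # example bias

a = b + w1 * x1 + w2 * x2  # weighted sum
y = h(a)                   # a = 0.125, y = 1
print(a, y)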

Step Activation Function
If the input value exceeds 0, it outputs 1; otherwise 0.

import numpy as np

def step_function(x):
	# outputs 1 where x > 0, otherwise 0 (dtype=np.int was removed from recent NumPy; use int)
	return np.array(x > 0, dtype=int)

(Figure: graph of the step function)
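A short sketch that reproduces this graph, assuming matplotlib is available and reusing step_function from above:

import numpy as np
import matplotlib.pyplot as plt

x = np.arange(-5.0, 5.0, 0.1)
y = step_function(x)  # step_function as defined above
plt.plot(x, y)
plt.ylim(-0.1, 1.1)   # small margin around the 0 and 1 levels
plt.show()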

The Sigmoid Activation Function

def sigmoid(x):
	return 1 / (1 + np.exp(-x))

def sigmoid_deriv(x):
	# derivative of the sigmoid: sigmoid(x) * (1 - sigmoid(x)), used in backpropagation
	return sigmoid(x) * (1 - sigmoid(x))
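The sigmoid squashes any input into the range (0, 1), and its derivative is largest (0.25) at x = 0. A quick check, reusing the functions above:

print(sigmoid(np.array([-1.0, 0.0, 1.0])))        # ≈ [0.269 0.5 0.731]
print(sigmoid_deriv(np.array([-1.0, 0.0, 1.0])))  # ≈ [0.197 0.25 0.197]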

ReLU
If the input value exceeds 0, it outputs the input value itself; otherwise 0.

def relu(x):
	return np.maximum(0, x)

def relu_deriv(x):
	# gradient is 1 where x > 0, otherwise 0 (int(x > 0) fails on NumPy arrays)
	return np.where(x > 0, 1, 0)
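A quick comparison of the three activations on the same sample input, reusing the functions defined above:

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(step_function(x))  # [0 0 0 1 1]
print(sigmoid(x))        # ≈ [0.119 0.378 0.5 0.622 0.881]
print(relu(x))           # [0. 0. 0. 0.5 2.]
print(relu_deriv(x))     # [0 0 0 1 1]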