Advanced Learning Algorithms 11: Additional Neural Network Concepts
brandon · August 24, 2023
1. Advanced Optimization
The Adam optimizer makes gradient descent converge faster by adapting a separate learning rate α_j for each parameter.
If a parameter w_j (or b) keeps moving in the same direction, Adam increases its learning rate α_j so it takes bigger steps.
If w_j (or b) keeps oscillating back and forth, Adam reduces α_j.
In code, the optimizer is passed as another argument to model.compile.
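A minimal sketch of what this might look like in Keras; the layer sizes and the initial learning rate below are illustrative assumptions, not values from the course.

```python
import tensorflow as tf

# Illustrative model; the layer sizes are assumptions for this example.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(25, activation='relu'),
    tf.keras.layers.Dense(15, activation='relu'),
    tf.keras.layers.Dense(10, activation='linear'),
])

# Adam is specified as the optimizer argument to compile.
# The learning_rate here is only the initial value; Adam then
# adapts the effective step size for each parameter on its own.
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
```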
2. Additional Layer Types
In a convolutional layer, each neuron only looks at a small window of the previous layer's activations.
This makes computation faster and means the network can need less training data (it is also less prone to overfitting).
In this example, each input x is the height of a 1-D signal (such as an EKG trace) at a point in time.
In the first convolutional layer, each neuron only looks at a small part of the original input.
The second layer can also be convolutional, with each neuron looking at only a small number of neurons in the previous layer.
Only the last layer takes in all of the activations from the previous layer.
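A minimal sketch of this kind of architecture, assuming a Keras Conv1D stack over a 1-D signal; the signal length, filter counts, and window sizes are illustrative assumptions.

```python
import tensorflow as tf

# Illustrative convolutional network for a 1-D signal
# (here assumed to be 100 time steps with 1 channel).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(100, 1)),
    # First convolutional layer: each unit sees only a small
    # window (9 time steps here) of the raw input signal.
    tf.keras.layers.Conv1D(filters=16, kernel_size=9, activation='relu'),
    # Second convolutional layer: each unit sees only a small
    # window of the previous layer's activations.
    tf.keras.layers.Conv1D(filters=8, kernel_size=5, activation='relu'),
    tf.keras.layers.Flatten(),
    # Only this final dense layer takes in all activations
    # from the previous layer.
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.summary()
```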