Transformer / ViT / Attention and Convolution

1. [Paper Review] On the Relationship Between Self-Attention and Convolutional Layers

2. [Paper Review] TransPose: Keypoint Localization via Transformer

3. [Theory & Code] Transformer in PyTorch

4. [Concept Summary] Attention Mechanism

5. Is Dot-Product Self-Attention Lipschitz?

6. [Paper Review] Predicting Human Scanpaths in Visual Question Answering (CVPR 2021)
