NLP 

1. Word2vec, RNN, LSTM

2. Seq2Seq & Attention Model

3. Transformer

4. Transformer Code Implementation

5. GPT-1: Improving Language Understanding by Generative Pre-Training (2018) Paper Review

6. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (1)

7. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (2)

8. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension Paper Review (1)

9. [Paper Review] Language Models are Unsupervised Multitask Learners (GPT-2)

10. RLHF: Reinforcement Learning from Human Feedback

11. [Paper Review] Finetuned Language Models Are Zero-Shot Learners (2022)

12. [Paper Review] LLaMA: Open and Efficient Foundation Language Models (2023)

13. [Paper Review] TinyBERT: Distilling BERT for Natural Language Understanding

14. [Paper Review] Parameter-Efficient Transfer Learning for NLP (2019)
