NLP Models

1. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

2. 📜Transformer: Attention Is All You Need (2017)

3. GPT-1: Improving Language Understanding by Generative Pre-Training (2018)

4. RoBERTa: A Robustly Optimized BERT Pretraining Approach

5. Hands-On Binary Classification with BERT (Preprocessing and Model Training)
