NLP Paper Reviews

1. [Paper Review] ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators

2. [Paper Review] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

3. [Paper Review] RoBERTa: A Robustly Optimized BERT Pretraining Approach
