Paper Review - NLP

1. Efficient Estimation of Word Representations in Vector Space

2. Attention Is All You Need

3. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

4. BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

5. Big Bird: Transformers for Longer Sequences

6. Deep contextualized word representations

7. Improving Language Understanding by Generative Pre-Training