Paper Review

1. Attention Is All You Need

2. Dropout: A Simple Way to Prevent Neural Networks from Overfitting

3. Deep Learning of Representations: Looking Forward

4. Editing Factual Knowledge in Language Models

5. Generating Text with Recurrent Neural Networks

6. Language Models are Few-Shot Learners

7. PPT: Pre-trained Prompt Tuning for Few-shot Learning

8. Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting

9. Training Language Models to Follow Instructions with Human Feedback

10. Transformers: State-of-the-Art Natural Language Processing

11. Very Deep Convolutional Networks for Large-Scale Image Recognition (VGG)