# Continual Learning

[Paper Review] DKT: Diverse Knowledge Transfer Transformer for Class Incremental Learning
Venue: CVPR'23 · Authors: Xinyuan Gao et al. · Link: https://openaccess.thecvf.com/content/CVPR2023/papers/Gao_DKT_Diverse_Knowledge_Transfer_Transformer_for_Cl

Continual Pre-training of Language Models, ICLR 2023

Towards Continual Knowledge Learning of Language Models, ICLR 2022

Lifelong Pretraining: Continually Adapting Language Models to Emerging Corpora, NAACL 2022

Achieving Forgetting Prevention and Knowledge Transfer in Continual Learning, NeurIPS 2021

[Paper Review] Rainbow Memory: Continual Learning with a Memory of Diverse Samples
Venue: CVPR · Authors: Jihwan Bang and Heesu Kim et al. (NAVER) · Link: https://arxiv.org/pdf/2103.17230.pdf · Purpose: I have been reading papers on Continual Learning lately, including this one

Continual evaluation for lifelong learning: Identifying the stability gap, ICLR 2023

ELLE: Efficient Lifelong Pre-training for Emerging Data, ACL 2022

[Paper Review] Memory Aware Synapses: Learning what (not) to forget

Continual Sequence Generation with Adaptive Compositional Modules, ACL 2022

Adapting BERT for Continual Learning of a Sequence of Aspect Sentiment Classification Tasks, ACL 2021

Continual Learning for Text Classification with Information Disentanglement Based Regularization, NAACL 2021

Learn Continually, Generalize Rapidly: Lifelong Knowledge Accumulation for Few-shot Learning, EMNLP 2021

Continual Learning of Neural Machine Translation within Low Forgetting Risk Regions, EMNLP 2022

Total Recall: a Customized Continual Learning Method for Neural Semantic Parsers, EMNLP 2021

CLASSIC: Continual and Contrastive Learning of Aspect Sentiment Classification Tasks, EMNLP 2021

ConTinTin: Continual Learning from Task Instructions, ACL 2022

LFPT5: A Unified Framework for Lifelong Few-shot Language Learning Based on Prompt Tuning of T5, ICLR 2022

Continual Few-shot Relation Learning via Embedding Space Regularization and Data Augmentation, ACL 2022

Incremental Few-shot Text Classification with Multi-round New Classes: Formulation, Dataset & System, NAACL 2021