# Contrastive Learning
[NLP #3] SimCSE: Simple Contrastive Learning of Sentence Embeddings (EMNLP, 2021)
One-line summary: sentence embeddings can be extracted from both unlabeled and labeled data. Paper: https://aclanthology.org/2021.emnlp-main.552/ Code: https://github.co
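For reference, a minimal sketch of the unsupervised SimCSE objective: the same batch is encoded twice so that dropout produces two views of each sentence, and the views are contrasted with an in-batch InfoNCE loss. The `encoder` module here is a placeholder; 0.05 is the temperature the paper reports.

```python
import torch
import torch.nn.functional as F

def simcse_unsup_loss(encoder, sentences, temperature=0.05):
    """Unsupervised SimCSE sketch: two forward passes of the same batch
    give two views per sentence (different dropout masks); each sentence's
    second view is its positive, all other views are in-batch negatives."""
    z1 = encoder(sentences)              # (N, d), encoder in train mode so dropout is active
    z2 = encoder(sentences)              # (N, d), a different dropout mask
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    sim = z1 @ z2.T / temperature        # (N, N) cosine similarities
    labels = torch.arange(sim.size(0), device=sim.device)
    return F.cross_entropy(sim, labels)  # diagonal entries are the positives
```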

CLASSIC: Continual and Contrastive Learning of Aspect Sentiment Classification Tasks (EMNLP 2021)

The Inductive Bias of In-Context Learning: Rethinking Pretraining Example Design (ICLR 2022)

Text and Code Embeddings by Contrastive Pre-Training (OpenAI)

[Paper Review] Self-Supervised Contrastive Pre-Training for Time Series via Time-Frequency Consistency

[Paper Review] CoST: Contrastive Learning of Disentangled Seasonal-Trend Representations for Time Series Forecasting

[GDSC/ML] A Transformer & GPT review for understanding CLIP📎
Notes on Attention Is All You Need, and on Improving Language Understanding by Generative Pre-Training

[RS][Paper Review] Exploiting Negative Preference in Content-based Music Recommendation with Contrastive Learning
💡 Keywords: Negative Preference, Contrastive Learning. What is contrastive learning? Reference: "A contrast is a great difference between two or more things which is c
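To make the quoted definition concrete, here is a minimal sketch of the classic pairwise contrastive loss (Hadsell et al. style): positive pairs are pulled together, negative pairs are pushed apart up to a margin. All names are illustrative.

```python
import torch
import torch.nn.functional as F

def pairwise_contrastive_loss(emb_a, emb_b, is_positive, margin=1.0):
    """Classic pairwise contrastive loss: pull positive pairs together,
    push negative pairs apart until they are at least `margin` away.
    `is_positive` is a float tensor of 1s (similar) and 0s (dissimilar)."""
    dist = F.pairwise_distance(emb_a, emb_b)                 # Euclidean distance per pair
    pos = is_positive * dist.pow(2)                          # positives: minimize distance
    neg = (1 - is_positive) * F.relu(margin - dist).pow(2)   # negatives: enforce the margin
    return (pos + neg).mean()
```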

TS-TCC Summary and Analysis - 3
6. Experimental Setup: this part covers the fourth section of TS-TCC, the experiments. Translated passages from the paper appear in bold. 4.1 **To evaluate our model, we use three public datasets: a HAR dataset, the Epilepsy Seizure Predictio

[Paper Review] ConSERT: A Contrastive Framework for Self-Supervised Sentence Representation Transfer - 1

[Story Generation #2] Genre-Controllable Story Generation via Supervised Contrastive Learning (WWW, 2022)
Challenge: with the progress of pretrained language models, controllable text generation is drawing attention, but it still falls short on story-specific controllability!

TS-TCC Summary and Analysis - 1
While working on a capstone design project for graduation, I ended up studying contrastive learning; I am writing this post to organize the TS-TCC paper I used in that study. Code: https://github.com/emadeldeen24/TS-TCC Paper: ht

Incremental False Negative Detection for Contrastive Learning (ICLR / 2022)
Improves the false negative problem of contrastive learning by incrementally detecting and removing false negatives
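A rough sketch of the idea, not the paper's exact algorithm: negatives whose similarity to the anchor exceeds a threshold are masked out of the InfoNCE denominator as suspected false negatives; in the paper's spirit, the threshold would be tightened incrementally over training.

```python
import torch
import torch.nn.functional as F

def infonce_with_fn_filtering(z_anchor, z_pos, z_neg, threshold, temperature=0.1):
    """InfoNCE where negatives that look too similar to the anchor are
    masked out as suspected false negatives. `threshold` is assumed to
    follow some schedule over training."""
    z_anchor = F.normalize(z_anchor, dim=-1)            # (N, d)
    z_pos    = F.normalize(z_pos, dim=-1)               # (N, d)
    z_neg    = F.normalize(z_neg, dim=-1)               # (M, d)
    pos_sim = (z_anchor * z_pos).sum(-1, keepdim=True)  # (N, 1)
    neg_sim = z_anchor @ z_neg.T                        # (N, M)
    mask = neg_sim > threshold                          # suspected false negatives
    neg_sim = neg_sim.masked_fill(mask, float('-inf'))  # drop them from the denominator
    logits = torch.cat([pos_sim, neg_sim], dim=1) / temperature
    labels = torch.zeros(logits.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)
```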

Boosting Contrastive Learning with Relation Knowledge Distillation (AAAI / 2022)
Proposes a relation knowledge distillation method that links cluster-based and contrast-based approaches
[Quick Summary] Adversarial Self-Supervised Contrastive Learning (NeurIPS 2020)
Some notes on Adversarial Self-Supervised Contrastive Learning

[Quick Summary] MixCo: Mix-up Contrastive Learning for Visual Representation (arXiv 2020)
Some notes on MixCo
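A hedged sketch of MixCo's core idea: each input is mixed with a shuffled partner, and the mixed embedding is contrasted against the in-batch source embeddings with soft targets proportional to the mixing ratio. `encoder` and the hyperparameters are assumptions, not the paper's exact setup.

```python
import torch
import torch.nn.functional as F

def mixco_loss(encoder, x, temperature=0.1, alpha=1.0):
    """MixCo-style loss sketch: mix each image with a shuffled partner,
    then ask the mixed embedding to match both sources' embeddings with
    weights lam and (1 - lam) over the in-batch candidates."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0), device=x.device)
    x_mix = lam * x + (1 - lam) * x[perm]          # mixed inputs
    z = F.normalize(encoder(x), dim=-1)            # (N, d) source embeddings
    z_mix = F.normalize(encoder(x_mix), dim=-1)    # (N, d) mixed embeddings
    logits = z_mix @ z.T / temperature             # (N, N)
    log_prob = F.log_softmax(logits, dim=1)
    idx = torch.arange(x.size(0), device=x.device)
    # soft targets: weight lam on the sample itself, (1 - lam) on its mixing partner
    return -(lam * log_prob[idx, idx] + (1 - lam) * log_prob[idx, perm]).mean()
```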

CoDA: Contrast-Enhanced and Diversity-Promoting Data Augmentation for Natural Language Understanding (ICLR / 2021)
Proposes an augmentation method that sequentially stacks adversarial training on top of back-translation and combines a consistency loss with a contrastive loss to produce informative augmented data
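A loss-level sketch of the stacking described above; `back_translate` and `adversarial_perturb` are hypothetical stand-ins for the paper's augmentation components, and the model is assumed to return both logits and an embedding.

```python
import torch
import torch.nn.functional as F

def coda_step(model, x, y, back_translate, adversarial_perturb, tau=0.1):
    """CoDA-style sketch: augment via back-translation followed by an
    adversarial perturbation, then combine task, consistency, and
    contrastive losses. Helper functions are hypothetical placeholders."""
    x_aug = adversarial_perturb(model, back_translate(x))  # sequentially stacked augmentation
    logits, z = model(x)                # model assumed to return (logits, embedding)
    logits_aug, z_aug = model(x_aug)
    task = F.cross_entropy(logits, y)
    # consistency: predictions on the augmentation should match the original
    consistency = F.kl_div(F.log_softmax(logits_aug, -1),
                           F.softmax(logits, -1), reduction='batchmean')
    # contrastive: each example's augmentation is its positive, the rest are negatives
    z, z_aug = F.normalize(z, dim=-1), F.normalize(z_aug, dim=-1)
    sim = z @ z_aug.T / tau
    labels = torch.arange(sim.size(0), device=sim.device)
    contrastive = F.cross_entropy(sim, labels)
    return task + consistency + contrastive
```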

Similarity Learning & Contrastive Learning (1)

Neighborhood Contrastive Learning for Novel Class Discovery (CVPR / 2021) paper review
Proposes a Neighborhood Contrastive Learning + Hard Negative Generation method for the Novel Class Discovery task
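A rough sketch of the neighborhood idea only (the hard negative generation part is omitted): besides its own augmented view, each sample treats its k nearest neighbors in embedding space as extra positives in an InfoNCE-style loss.

```python
import torch
import torch.nn.functional as F

def neighborhood_contrastive_loss(z, z_aug, k=5, temperature=0.1):
    """Sketch: in addition to its own augmented view, each sample counts
    its k most similar augmented-view embeddings as positives; all other
    in-batch embeddings act as negatives."""
    z = F.normalize(z, dim=-1)           # (N, d) anchor embeddings
    z_aug = F.normalize(z_aug, dim=-1)   # (N, d) augmented-view embeddings
    sim = z @ z_aug.T / temperature      # (N, N)
    n = sim.size(0)
    pos_mask = torch.eye(n, dtype=torch.bool, device=z.device)  # own view is always positive
    _, nn_idx = sim.topk(k + 1, dim=1)   # own view plus k nearest neighbors
    pos_mask.scatter_(1, nn_idx, True)
    log_prob = F.log_softmax(sim, dim=1)
    # average log-likelihood over each anchor's positive set
    return -(log_prob * pos_mask).sum(1).div(pos_mask.sum(1)).mean()
```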

Large-Margin Contrastive Learning with Distance Polarization Regularizer (ICML / 2021) paper review
Proposes a method that improves standard contrastive learning with a Distance Polarization Regularizer
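My reading of the regularizer, as a sketch: pairwise distances that land in an ambiguous middle band are penalized, polarizing the distance distribution toward the clearly-similar and clearly-dissimilar poles. The band endpoints `d_low` and `d_high` are assumptions, not the paper's values.

```python
import torch
import torch.nn.functional as F

def distance_polarization_reg(z, d_low=0.5, d_high=1.5):
    """Sketch of a distance polarization regularizer: penalize pairwise
    distances inside the ambiguous band (d_low, d_high), so pairs are
    pushed toward clearly-similar or clearly-dissimilar poles."""
    z = F.normalize(z, dim=-1)
    dist = torch.cdist(z, z)                                # (N, N) pairwise distances in [0, 2]
    in_band = F.relu(dist - d_low) * F.relu(d_high - dist)  # > 0 only inside the band
    n = dist.size(0)
    off_diag = ~torch.eye(n, dtype=torch.bool, device=z.device)  # ignore self-distances
    return in_band[off_diag].mean()
```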