Hugging Face offline mode

ODD · December 12, 2024

Source: https://huggingface.co/docs/transformers/main/en/installation

export HF_HUB_OFFLINE=1
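The same switch can be flipped from Python instead of the shell; a minimal sketch (the variable must be set before `import transformers`, since the library reads it at import time):

```python
import os

# Equivalent to `export HF_HUB_OFFLINE=1`: set it before importing
# transformers, which reads the variable at import time.
os.environ["HF_HUB_OFFLINE"] = "1"

# from transformers import AutoModel  # subsequent loads now skip network calls
```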

Fetch models and tokenizers to use offline

a) Download directly from https://huggingface.co/models

b) Use from_pretrained
Download your files ahead of time with PreTrainedModel.from_pretrained():

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("bigscience/T0_3B")
model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/T0_3B")

Save your files to a specified directory with PreTrainedModel.save_pretrained():

tokenizer.save_pretrained("./your/path/bigscience_t0")
model.save_pretrained("./your/path/bigscience_t0")
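Before disconnecting, it can be worth confirming that save_pretrained() actually wrote the files; a quick stdlib check (the path is the hypothetical one from the example above, and the exact file names vary by model):

```python
import os

save_dir = "./your/path/bigscience_t0"  # hypothetical path from the example above

# List whatever save_pretrained() wrote; empty if the directory is missing.
saved_files = os.listdir(save_dir) if os.path.isdir(save_dir) else []
print(saved_files)
```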

Now when you’re offline, reload your files with PreTrainedModel.from_pretrained() from the specified directory:

tokenizer = AutoTokenizer.from_pretrained("./your/path/bigscience_t0")
model = AutoModelForSeq2SeqLM.from_pretrained("./your/path/bigscience_t0")

c) Programmatically download files with the huggingface_hub library
python -m pip install huggingface_hub

from huggingface_hub import hf_hub_download

hf_hub_download(repo_id="bigscience/T0_3B", filename="config.json", cache_dir="./your/path/bigscience_t0")

Once your file is downloaded and locally cached, specify its local path to load and use it:

from transformers import AutoConfig

config = AutoConfig.from_pretrained("./your/path/bigscience_t0/config.json")
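Since a model config is plain JSON, you can also inspect the downloaded file without transformers at all; a small sketch assuming the hf_hub_download step above has been run (the path is the hypothetical one from the example):

```python
import json
import os

config_path = "./your/path/bigscience_t0/config.json"  # hypothetical path from above

# A Hugging Face config.json is ordinary JSON, so the stdlib can read it;
# cfg stays None if the download step has not been run yet.
cfg = None
if os.path.exists(config_path):
    with open(config_path) as f:
        cfg = json.load(f)
print(cfg.get("model_type") if cfg else "config not downloaded yet")
```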
