[host]
needed: nvidia-driver & nvidia-container-toolkit (i.e., CUDA support for Docker)
(already set on 4090 server, do not modify)
[docker]
Installing CUDA directly inside a container tends not to work well,
so use the tensorflow/tensorflow, nvidia/cuda, or pytorch/pytorch images from Docker Hub.
No extra setup is needed inside the container to use the GPU.
[tip for checking cuda in docker container]
CUDA is normally installed under /usr/local (e.g., /usr/local/cuda-XX.X, with a /usr/local/cuda symlink). To check the CUDA installation in a container (or on the local PC), simply go to that folder and look for cuda*. Do not rely on commands to check CUDA (e.g., nvcc may not be on PATH in runtime images even when CUDA is present); inspect the folder directly.
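The folder check above can be sketched as a one-liner (paths are the usual CUDA install locations; adjust if your image differs):

```shell
# List CUDA installations under /usr/local; prints nothing if CUDA is absent.
ls -d /usr/local/cuda* 2>/dev/null
# Optionally inspect the bundled runtime libraries as well:
ls /usr/local/cuda/lib64 2>/dev/null | head
```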
[making container from image]
sudo docker run -it --gpus all --runtime=nvidia --name [container_name] [maintainer]/[image]:[tag]
The command above enables GPU access. It is saved in the Desktop folder on our server; use this command line when creating containers.
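As a concrete sketch of the template above (container name and image tag are placeholders, not the ones saved on the server), followed by a quick GPU check:

```shell
# Create a GPU-enabled container from the official CUDA runtime image (tag assumed).
sudo docker run -it --gpus all --runtime=nvidia \
    --name my_cuda_dev nvidia/cuda:12.2.0-runtime-ubuntu22.04

# Inside the container, verify that the GPU is visible:
nvidia-smi
```

If `nvidia-smi` lists the 4090, GPU passthrough is working. Note that with a recent nvidia-container-toolkit, `--gpus all` alone is typically sufficient; `--runtime=nvidia` is kept here to match the server's saved command.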
Learn more about nvidia-container-toolkit:
https://github.com/NVIDIA/nvidia-container-toolkit
If you use TensorFlow:
https://www.tensorflow.org/install/docker?hl=ko (last checked: Feb 28, 2024, 4:28 PM)
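For the TensorFlow case, a quick sanity check can be run as a throwaway container (image tag assumed to be the standard GPU tag from Docker Hub):

```shell
# Run the TensorFlow GPU image once and print the visible GPUs, then remove it.
sudo docker run --rm --gpus all tensorflow/tensorflow:latest-gpu \
    python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```

A non-empty list of `PhysicalDevice` entries confirms TensorFlow can see the GPU from inside the container.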