# How to create a Docker container for machine learning that supports TensorBoard and Jupyter

When creating the container, forward three ports: 22 (for SSH), 6006 (for TensorBoard), and 8888 (for Jupyter).

```
docker run -ti --runtime=nvidia --gpus all -p 8082:22 -p 8083:6006 -p 8084:8888 nvidia/cuda:10.2-cudnn7-devel-ubuntu18.04 /bin/bash
```

Inside the container, you may need to install these packages and start the SSH server:

```
apt update
apt install net-tools
apt install openssh-server
service ssh start
```

You can then access the container over SSH:

```
ssh <user>@<server-ip> -p 8082
```

## Access through localhost

In the remote container, run TensorBoard on 6006 (the default port).

```
tensorboard --logdir lightning_logs
```

```
TensorFlow installation not found - running with reduced feature set.
Serving TensorBoard on localhost; to expose to the network, use a proxy or pass --bind_all
TensorBoard 2.2.2 at http://localhost:6006/ (Press CTRL+C to quit)
```

On the local machine, bind the local port 8123 to the remote port 6006.

```
ssh -L 8123:127.0.0.1:6006 <user>@<server-ip> -p 8082
```

Now we can open the TensorBoard web interface at `localhost:8123` on the local machine.

You can use Jupyter in the same way; just change 6006 to 8888 (the default port in Jupyter). A short sketch is given at the end of this post.

## Access through IP address or domain name

If you want to access TensorBoard through the IP address or domain name of the server, add `--host 0.0.0.0` to the tensorboard command.

```
tensorboard --logdir lightning_logs --host 0.0.0.0
```

You can then access the TensorBoard page at http://<server-ip>:8083 from any machine connected to the internet.
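## Jupyter over the same tunnel (sketch)

As a concrete version of the "same way" note above, here is a minimal sketch. It assumes Jupyter is already installed in the container (e.g., via `pip install notebook`), and the local port 8124 is an arbitrary choice, not something fixed by the setup above.

```
# Inside the remote container: start Jupyter on the default port 8888.
# --no-browser skips launching a browser on the headless server,
# --allow-root is needed because the container shell runs as root.
jupyter notebook --port 8888 --no-browser --allow-root

# On the local machine: bind local port 8124 (arbitrary) to the remote 8888
# through the container's SSH port.
ssh -L 8124:127.0.0.1:8888 <user>@<server-ip> -p 8082
```

After that, open `http://localhost:8124` on the local machine and paste the token that Jupyter prints in the container's terminal.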