• The original address: Build Personal Deep Learning Rig: GTX 1080 + Ubuntu 16.04 + CUDA 8.0 RC + CuDNN 7 + TensorFlow/MXNet/Caffe/Darknet

  • Originally published by Guanghan Ning

  • The Nuggets translation Project

  • Permanent link to this article: github.com/xitu/gold-m…

  • Translator: RichardLeeH

  • Proofread by: TobiasLee, FGHPDF

My internship at TCL is coming to an end. Before going back to school for the graduation ceremony, I decided to build my own personal deep learning platform. I don’t think I can really rely on a company or lab machine; after all, I don’t own the workstation, and the development environment can become a mess (it has already happened once). With a personal platform, I can easily log into my deep learning workstation at any time through TeamViewer, and I got the chance to build the platform from scratch.

In this article, I will walk through the whole process of building a deep learning platform on a PC, covering both hardware and software. I share it here in the hope that it helps other researchers and engineers with similar needs. I built the platform with a GTX 1080, Ubuntu 16.04, CUDA 8.0 RC, and CuDNN 7, which were the latest releases at the time. Here’s an overview of the article:

Hardware

  1. Accessories to choose

  2. Set up workstation

Software

  1. OS Installation

  • Prepare a bootable installation USB drive

  • Install the system

  2. Deep learning environment installation

  • Remote control: TeamViewer

  • Development package management: Anaconda

  • Development environment: Python IDE

  • GPU optimization environments: CUDA and CuDNN

  • Deep learning frameworks: Tensorflow & Mxnet & Caffe & Darknet

  3. Out-of-the-box deep learning environment: Docker

  • Install the Docker

  • Install nvidia-docker

  • Download the deep learning Docker image

  • Data is shared between hosts and containers

  • Learn about simple Docker commands

Hardware:

Accessories to choose

I recommend using PCPartPicker to pick out parts. It helps you buy each part at the lowest price and checks the compatibility of the selected parts for you. They have also launched a YouTube channel with videos showing the build process.

For my build, I used their build articles as a reference and created a build list, which can be found here. The following are the parts I used to build my workstation.

Since we are doing deep learning research, a good GPU is essential, so I chose the newly released GTX 1080. It’s hard to get, but if you watch the bundles on Newegg, some sellers are already stocking up and bundling [GPU + motherboard] or [GPU + power supply]. That’s the market: buying a bundle is still better than paying an inflated price for the card alone. Either way, a good GPU will speed up training and the later tuning process. Here are some of the GTX 1080’s advantages over other GPUs in terms of performance, price, and power consumption (which saves on daily electricity and lets you get by with a more modest PC power supply).

Note: The GTX 1080 has only 8 GB of RAM, compared to 12 GB on the TITAN X. If you are richer or more generous, you may opt for the TITAN X or for stacking multiple GPUs; in that case, remember to choose a motherboard with more PCIe slots.

Parts assembly

Platform building starts with assembling the parts, for which I referenced this video tutorial (on YouTube, so it requires a VPN to access from within China). While the parts are slightly different, the assembly process is very similar. I had no assembly experience, but with this tutorial I managed to finish in 3 hours. (You might spend less time, but you know, I’m very cautious.)

Software:

OS Installation

Ubuntu is often used for deep learning research. But sometimes you need to work with another operating system. For example, if you use GTX 1080 and are a VR developer, you may need to use Win10 for VR development based on Unity or another framework. Here I will introduce Win10 and Ubuntu installation. If you’re only interested in installing Ubuntu, you can skip the Windows installation.

Prepare a bootable installation USB drive

Using a USB drive to install the operating system is very convenient. Since the drive will be formatted, you don’t want that to happen to a removable hard disk you still need. Alternatively, if you have writable DVDs, you can use them to install the operating system and keep them for future use, assuming you can find them again when the time comes.

As explained on the official website, you can visit the Windows 10 page to learn how to make a bootable USB drive. For Ubuntu, you can likewise download the ISO and create USB installation media or burn it to a DVD. If you are doing this from Ubuntu, check out the tutorial on the Ubuntu website; if you are doing it from Windows, refer to this tutorial.

System installation

If you plan to dual-boot, it is strongly recommended to install Windows first. I will skip the installation of Windows 10, as detailed instructions can be found on the Windows 10 home page. One thing to note is that you need an activation code. If your laptop came with Windows 7 or Windows 10, you can find the activation code on the sticker on the bottom of the laptop.

Installing Ubuntu 16.04 gave me a bit of unexpected trouble, mainly because I hadn’t installed the GTX 1080 driver first. I’m sharing this in case you run into the same problem.

Install Ubuntu:

First, insert the bootable USB drive used to install the system. Nothing showed up on my LG monitor except a message that the input frequency was too high. The monitor itself was fine, since it worked when tested with another laptop. I then connected the PC to a TV; the TV displayed the desktop, but without the toolbar. I found this to be an NVIDIA driver issue, so I entered the BIOS, set the integrated graphics as the default, and restarted. Remember to switch the HDMI cable from the GTX 1080 port to the motherboard. After that the monitor worked fine, and I installed Ubuntu successfully by following the instructions.

To use the GTX 1080, visit this page to get the NVIDIA graphics driver for Ubuntu. After downloading the driver (with the monitor still connected to the motherboard), running the installer produced the message “You appear to be running an X server…”. I referred to this link to fix the problem and install the driver; the quoted steps are below, followed by a consolidated sketch:

  • Make sure you are logged out of the system.

  • Press Ctrl+Alt+F1 and log in with your credentials.

  • Kill the current X server session by running sudo service lightdm stop or sudo stop lightdm.

  • Enter runlevel 3 by running sudo init 3, then install the *.run driver file.

  • You may be asked to reboot when the installation completes. If not, run sudo service lightdm start or sudo start lightdm to restart the X server.
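Put together, the driver installation looks roughly like the following sketch (the .run filename is hypothetical; use whatever version you actually downloaded):

    # from a text console (Ctrl+Alt+F1), after logging out of the desktop session
    sudo service lightdm stop                  # stop the X server (or: sudo stop lightdm)
    sudo init 3                                # switch to runlevel 3
    sudo sh NVIDIA-Linux-x86_64-xxx.xx.run     # run the downloaded driver installer
    sudo service lightdm start                 # restart X if you are not rebooting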

After the driver is installed, reboot and set the GTX 1080 as the default in the BIOS. At this point, we’re ready.

Some other minor issues I encountered, noted here for future reference:

  • Problem: When I reboot, I can’t find the option to boot into Windows.

  • Solution: In Ubuntu, run sudo gedit /boot/grub/grub.cfg and add the following entry:

    menuentry 'Windows' {
        set root='hd0,msdos1'
        chainloader +1
    }
  • Problem: Ubuntu doesn’t support the Belkin N300 wireless adapter commonly sold at Best Buy.

  • Solution: Refer to this linked guide and the problem will be solved.

  • Problem: Installing TeamViewer fails because of unmet dependencies.

  • Solution: Refer to this link.

Deep learning environment

Remote control software installation (TeamViewer):

sudo dpkg -i teamviewer_11.0.xxxxx_i386.deb

Package management tool installation (Anaconda):

Anaconda is a free, easy-to-install package manager, environment manager, and Python distribution that bundles more than 720 open source packages and comes with free community support. It can create isolated virtual environments, which is useful when working with several deep learning frameworks that need different configurations. It also makes installing packages very convenient. For installation instructions, see here.

Some commands for using virtual environments (a fuller sketch follows the list):

  • source activate virtualenv

  • source deactivate
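As a minimal sketch (the environment name "dl" and the package choices are only examples), creating and using an isolated environment looks like this:

    conda create -n dl python=2.7 numpy scipy   # create an environment named "dl"
    source activate dl                          # enter the environment
    conda install matplotlib                    # packages installed here stay inside "dl"
    source deactivate                           # leave the environment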

Development environment installation (Python IDE):

Spyder vs Pycharm?

Spyder:

  • Advantages: MATLAB-like; easy to inspect intermediate results.

Pycharm:

  • Advantages: Modular coding, a more complete Web development framework, and a cross-platform IDE.

In my personal philosophy, these are just tools; each comes in handy in its own situation. I use an IDE to set up the main project, for example building the framework in PyCharm, and afterwards I simply use Vim to modify the code. That is not because Vim is especially powerful or fancy, but because it is the one text editor I want to truly master; for text editors, we don’t need to master more than one. In special cases where we need to frequently inspect I/O, directories, and so on, Spyder may be the better choice.

Installation:

  1. Spyder:

  • You don’t need to install spyder because it already comes with Anaconda

  2. PyCharm

  • Download it from the official website. Just unzip.

  • Set PyCharm’s project interpreter to Anaconda so that it manages the packages; see here.

  3. Vim

  • sudo apt-get install vim

  • The configuration I used: Github

  4. Git

  • sudo apt install git

  • git config --global user.name "Guanghan Ning"

  • git config --global user.email "[email protected]"

  • git config --global core.editor vim

  • git config --list

GPU Optimized Computing Environment Installation (CUDA and CuDNN)

CUDA

Installing CUDA 8.0 RC: There are two reasons for choosing 8.0 over 7.5:
  • CUDA 8.0 will improve GTX1080 (Pascal) performance compared to CUDA 7.5.

  • Ubuntu 16.04 doesn’t seem to support CUDA 7.5, as you can’t find it on the official website. So CUDA 8.0 was the only choice.

CUDA Getting started Guide
CUDA Installation Guide
  1. sudo sh cuda_8.0.27_linux.run

  2. Follow the command prompt

  3. As part of your CUDA environment, you will need to add the following to the ~/.bashrc file in your home directory.

  • export CUDA_HOME=/usr/local/cuda-8.0

  • export LD_LIBRARY_PATH=${CUDA_HOME}/lib64

  • PATH=${CUDA_HOME}/bin:${PATH}

  • export PATH

  4. Verify that CUDA is installed (remember to restart the terminal):

  • nvcc --version
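To double-check the toolkit, a quick sketch (this assumes the CUDA samples were installed to the default location during setup):

    nvcc --version                                     # should report release 8.0
    cd /usr/local/cuda-8.0/samples/1_Utilities/deviceQuery
    sudo make
    ./deviceQuery                                      # should list the GTX 1080 and end with Result = PASS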

CuDNN (CUDA Deep Learning Library)

Install CuDNN
  • CuDNN V5.0 for CUDA 8.0RC

The user guide
The installation guide
  1. Method 1 (add the cuDNN path to an environment variable):

  • Extract the folder “cuda”

  • cd into the extracted folder

  • export LD_LIBRARY_PATH=$(pwd):$LD_LIBRARY_PATH

  2. Method 2 (copy the cuDNN files into the CUDA folder; if CUDA works, it will find cuDNN through the relative path):

  • tar xvzf cudnn-8.0.tgz

  • cd cuda (the folder extracted from the archive)

  • sudo cp include/cudnn.h /usr/local/cuda/include

  • sudo cp lib64/libcudnn* /usr/local/cuda/lib64

  • sudo chmod a+r /usr/local/cuda/include/cudnn.h /usr/local/cuda/lib64/libcudnn*
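To confirm which cuDNN version CUDA will pick up, a small sketch (assuming Method 2 above, with the headers copied into /usr/local/cuda):

    # the version macros live in cudnn.h (expect CUDNN_MAJOR 5 for this release)
    grep -A 2 "define CUDNN_MAJOR" /usr/local/cuda/include/cudnn.h
    # refresh the linker cache so libcudnn is found at runtime
    sudo ldconfig /usr/local/cuda/lib64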

Install the Deep learning framework:

TensorFlow / Keras

First, install TensorFlow
  1. Install using Anaconda

  • conda create -n tensorflow python=3.5

  2. Install TensorFlow in the environment using pip (the prebuilt binaries did not yet support CUDA 8.0; I will update when the CUDA 8.0 binaries are released.)

  • source activate tensorflow

  • sudo apt install python3-pip

  • export TF_BINARY_URL=https://storage.googleapis.com/tensorflow/linux/gpu/tensorflow-0.9.0-cp35-cp35m-linux_x86_64.whl

  • pip3 install --upgrade $TF_BINARY_URL

  3. Install TensorFlow directly from source

  • install bazel: install jdk 8, uninstall jdk 9.

  • sudo apt-get install python-numpy swig python-dev

  • ./configure

  • build with bazel: bazel build -c opt --config=cuda //tensorflow/cc:tutorials_example_trainer, then run bazel-bin/tensorflow/cc/tutorials_example_trainer --use_gpu.
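After either install method, a quick sanity check that TensorFlow can see the GPU (a sketch written against the TensorFlow 0.9 API used above):

    source activate tensorflow
    # log_device_placement prints which device (e.g. /gpu:0) each op runs on
    python -c "import tensorflow as tf; a = tf.constant([[1.0, 2.0]]); b = tf.constant([[3.0], [4.0]]); sess = tf.Session(config=tf.ConfigProto(log_device_placement=True)); print(sess.run(tf.matmul(a, b)))"
    # expect the output [[ 11.]]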

Install keras
  1. Download: github.com/fchollet/ke…

  2. Locate the Keras directory and run the install command:

  • sudo python setup.py install

  3. Change the default backend from Theano to TensorFlow (see the sketch below)
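The backend setting lives in ~/.keras/keras.json, which Keras creates the first time it is imported. A minimal sketch (field names are those of Keras 1.x; you can also just edit the file by hand):

    python -c "import keras"                         # the first import creates ~/.keras/keras.json
    sed -i 's/theano/tensorflow/' ~/.keras/keras.json
    python -c "import keras"                         # should now print "Using TensorFlow backend."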

Use Conda to switch between virtual environments
  1. source activate tensorflow

  2. source deactivate

Mxnet

Create a virtual environment for Mxnet
  1. conda create -n mxnet python=2.7

  2. source activate mxnet

Install MXNet by following the official website
  1. sudo apt-get update

  2. sudo apt-get install -y build-essential git libatlas-base-dev libopencv-dev

  3. git clone --recursive https://github.com/dmlc/mxnet

  4. edit make/config.mk

  5. set USE_CUDA = 1, set USE_CUDNN = 1, and point USE_CUDA_PATH to your CUDA install (e.g. /usr/local/cuda)

  6. cd mxnet

  7. make clean_all

  8. make -j4

  • One problem I ran into was the error “GCC above 5.3 is not supported!”. My GCC was 5.4, so I had to remove it and install an older version:

  • sudo apt-get remove gcc g++

  • conda install -c anaconda gcc=4.8.5

  • gcc --version

MXNet Python package installation
  1. conda install -c anaconda numpy=1.11.1

  2. Method 1:

  • cd python; sudo python setup.py install

  • sudo apt-get install python-setuptools

  3. Method 2:

  • cd mxnet

  • cp -r ../mxnet/python/mxnet .

  • cp ../mxnet/lib/libmxnet.so mxnet/

  4. Quick test:

  • python example/image-classification/train_mnist.py

  5. GPU test (see also the sketch after this list):

  • python example/image-classification/train_mnist.py --network lenet --gpus 0
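Beyond the MNIST example, a one-line sketch to confirm that the GPU build itself works (NDArray API names are from the 2016-era MXNet):

    # creates a 2x3 array on the GTX 1080, doubles it, and copies it back to the CPU
    python -c "import mxnet as mx; a = mx.nd.ones((2, 3), ctx=mx.gpu(0)); print((a * 2).asnumpy())"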

Caffe

  1. For details, see the Caffe Ubuntu 16.04 or 15.10 Installation Guide

  2. OpenCV needs to be installed. For OpenCV 3.1 installation, see the following link: Ubuntu 16.04 or 15.10 OpenCV 3.1 Installation Guide

Darknet

  • This is the easiest of them all to install. Just run the make command and you’re done (a GPU-enabled build sketch follows).
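A minimal sketch of a GPU-enabled Darknet build (GPU and CUDNN are standard flags in Darknet’s Makefile; enabling them assumes CUDA and cuDNN are already installed as above):

    git clone https://github.com/pjreddie/darknet
    cd darknet
    # edit the Makefile and set GPU=1 (and CUDNN=1) before building
    make
    ./darknet              # prints a usage message if the build succeeded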

Out of the box deep learning environment: Docker

I had previously installed Caffe, Darknet, MXNet, and TensorFlow correctly on Ubuntu 14.04 with a TITAN X (CUDA 7.5). I have completed projects on those frameworks and everything went well. So if you want to focus on deep learning research rather than fighting the peripheral problems you might encounter, it is safer to use these pre-built environments than to chase the latest versions. In that case, you should consider using Docker to isolate each framework in its own environment. These Docker images can be found on Docker Hub.

Install the Docker

Unlike virtual machine images, Docker images are built from layers, and the same component can be shared between different images. When we download a new image, there is no need to re-download components we already have, which is much more efficient and convenient than replacing an entire virtual machine image. A Docker container is a running instance of a Docker image. Images can be committed and updated, much like Git.

To install Docker on Ubuntu 16.04, we can refer to the guide on the official website.
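Once Docker is installed, a quick check that the daemon works (hello-world is Docker’s standard test image):

    sudo docker run hello-world    # pulls a tiny test image and prints a confirmation message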

Install nvidia-docker

Docker containers are hardware- and platform-independent, but Docker does not support NVIDIA GPUs inside containers out of the box (the hardware is specialized and needs its driver). To solve this, nvidia-docker mounts the GPU devices and driver files when starting a container on a specific machine, so the image itself stays agnostic of the NVIDIA driver version.

The nvidia-docker installation guide can be found here.
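To verify that containers can actually see the GPU, the usual smoke test is to run nvidia-smi inside a CUDA image (the image tag below is an assumption; any CUDA base image should do):

    nvidia-docker run --rm nvidia/cuda nvidia-smi    # should list the GTX 1080 from inside the container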

Download the deep learning Docker image

I collected some pre-built images from Docker Hub. The list of these images is as follows:

  • cuda-caffe

  • cuda-mxnet

  • cuda-keras-tensorflow-jupyter

You can find more images on Docker Hub.

Share data between hosts and containers

For computer vision researchers, it would be awkward not to be able to see the results. For example, when applying a Picasso style to an image, we want to view the output from different epochs. See this page to quickly share data between host and container. In the shared directory, we can create our projects; on the host, we write code with a text editor or our favorite IDE, and then run the program inside the container. The data in the shared directory can then be viewed and processed through the GUI on the Ubuntu host. A quick sketch of such a setup follows.
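Mounting a shared directory when starting a container might look like this (the host path, container path, and image name are all placeholders):

    mkdir -p ~/shared_projects
    nvidia-docker run -it -v ~/shared_projects:/workspace your/cuda-image bash
    # files written to /workspace inside the container appear in ~/shared_projects on the host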

Learn some simple Docker commands

If you’re new to Docker, don’t be overwhelmed. You don’t need to study it systematically unless you will use it heavily later. If you treat Docker simply as a tool for deep learning, the commands below are sufficient.

How do I check docker images?
  • docker images: list all installed Docker images.

How do I check docker containers?
  • docker ps -a: list all containers.

  • docker ps: list the currently running containers.

How do I exit a Docker container?
  1. (Method 1) In the terminal attached to the container, type:

  • exit

  2. (Method 2) Use [Ctrl + Alt + T] to open a new terminal window, or [Ctrl + Shift + T] to open a new terminal tab:

  • docker ps -a: list all containers.

  • docker ps: find the running container.

  • docker stop [container ID]: stop the running container.

How do I delete a Docker image?

  • docker rmi [docker_image_name]

How do I delete a Docker container?

  • docker rm [docker_container_name]

How do I make my own Docker image based on an existing one? Update a container created from the image and commit the result as a new image.

  • Load the image and open a container

  • Make some changes in the container.

  • Commit the image: docker commit -m "Added changes" -a "Guanghan" 0b2616b0e5a8 ning/cuda-mxnet

How do I copy data between the host and a Docker container?

  • docker cp foo.txt mycontainer:/foo.txt

  • docker cp mycontainer:/foo.txt foo.txt

How do I open a container from a Docker image?

  • If you need to keep the container (so that it can be committed later): docker run -it [image_name]

  • If the container is only needed temporarily: docker run --rm -it [image_name]

Feel free to comment

The Nuggets Translation Project is a community that translates quality technical articles from across the Internet and shares them on Juejin (Nuggets). The content covers Android, iOS, React, front-end, back-end, product, design, and other fields. If you want to see more high-quality translations, please keep following the project, its official Weibo account, and its Zhihu column.