This project adapts the MXNet code from the original book, Dive into Deep Learning, to PyTorch. Original book authors: Aston Zhang, Mu Li, Zachary C. Lipton, Alexander J. Smola, and other community contributors. GitHub: github.com/d2l-ai/d2l-…

There are some differences between the Chinese and English versions of this book. For a PyTorch adaptation of the English version, please refer to this project.

Introduction

This repository mainly contains two folders, code and docs (plus some data stored in data). The code folder holds the Jupyter Notebook code for each chapter (based on PyTorch); the docs folder holds the Markdown version of the Dive into Deep Learning book, which is deployed as web documentation to GitHub Pages using docsify. The docs content may differ slightly from the original book, which uses the MXNet framework, but the overall content is the same. Contributions and issues are welcome.
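For orientation, the top-level layout looks roughly like this (a sketch based on the description above; the chapter subfolders inside code are omitted):

Dive-into-DL-PyTorch/
├── code/   # Jupyter Notebook code for each chapter (based on PyTorch)
├── data/   # data files used by the notebooks
└── docs/   # Markdown documentation, deployed to GitHub Pages via docsify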

Who this is for

This project is aimed at anyone interested in deep learning, especially those who want to use PyTorch for deep learning. It does not require any deep learning or machine learning background; you only need basic mathematics and programming, i.e., basic linear algebra, calculus, and probability, plus basic Python programming.

How to use

Method 1

The repository contains some LaTeX formulas, which GitHub’s native Markdown rendering does not display. Since the docs folder has been deployed to GitHub Pages using docsify, the easiest way to read the documentation is to visit the web version of this project directly. Of course, if you want to run the code, you still need to clone the project and run the notebooks in the code folder.
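For example, a minimal sketch (assuming Git and Jupyter are already installed):

# clone the repository, then start Jupyter rooted at the code folder
git clone https://github.com/ShusenTang/Dive-into-DL-PyTorch.git
cd Dive-into-DL-PyTorch
jupyter notebook code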

Method 2

You can also preview the documentation locally. First install the docsify-cli tool:

npm i docsify-cli -g

Then clone the project locally:

git clone https://github.com/ShusenTang/Dive-into-DL-PyTorch.git
cd Dive-into-DL-PyTorch

Then run a local server, and you can conveniently preview the rendered documentation in real time at http://localhost:3000:

docsify serve docs

Table of contents

  • Introduction
  • Reading guide
  • 1. Introduction to deep learning
  • 2. Preliminaries
    • 2.1 Environment Configuration
    • 2.2 Data Operation
    • 2.3 Automatic gradient calculation
  • 3. Fundamentals of deep learning
    • 3.1 Linear regression
    • 3.2 Implementation of Linear Regression from Scratch
    • 3.3 Concise Implementation of Linear Regression
    • 3.4 Softmax Regression
    • 3.5 Image Classification Dataset (Fashion-MNIST)
    • 3.6 Implementation of Softmax Regression from Scratch
    • 3.7 Concise Implementation of Softmax Regression
    • 3.8 Multilayer Perceptron
    • 3.9 Implementation of Multilayer Perceptron from Scratch
    • 3.10 Concise Implementation of Multilayer Perceptron
    • 3.11 Model Selection, Underfitting and Overfitting
    • 3.12 Weight Decay
    • 3.13 Dropout
    • 3.14 Forward Propagation, Backpropagation and Computational Graphs
    • 3.15 Numerical Stability and Model Initialization
    • 3.16 Kaggle Competition in Practice: House Price Prediction
  • 4. Deep Learning Computation
    • 4.1 Model Construction
    • 4.2 Access, Initialization, and Sharing of Model Parameters
    • 4.3 Deferred Initialization of Model Parameters
    • 4.4 Custom Layers
    • 4.5 Reading and Storing
    • 4.6 GPU Computing
  • 5. Convolutional Neural Networks
    • 5.1 Two-Dimensional Convolutional Layer
    • 5.2 Padding and Stride
    • 5.3 Multiple Input and Output Channels
    • 5.4 Pooling Layer
    • 5.5 Convolutional Neural Network (LeNet)
    • 5.6 Deep Convolutional Neural Network (AlexNet)
    • 5.7 Networks Using Repeating Elements (VGG)
    • 5.8 Network in Network (NiN)
    • 5.9 Networks with Parallel Concatenations (GoogLeNet)
    • 5.10 Batch Normalization
    • 5.11 Residual Networks (ResNet)
    • 5.12 Densely Connected Networks (DenseNet)
  • 6. Recurrent Neural Networks
    • 6.1 Language Models
    • 6.2 Recurrent Neural Networks
    • 6.3 Language Model Dataset (Lyrics from Jay Chou’s Albums)
    • 6.4 Implementation of Recurrent Neural Networks from Scratch
    • 6.5 Concise Implementation of Recurrent Neural Networks
    • 6.6 Backpropagation Through Time
    • 6.7 Gated Recurrent Unit (GRU)
    • 6.8 Long Short-Term Memory (LSTM)
    • 6.9 Deep Recurrent Neural Networks
    • 6.10 Bidirectional Recurrent Neural Networks
  • 7. Optimization Algorithms
    • 7.1 Optimization and Deep Learning
    • 7.2 Gradient Descent and Stochastic Gradient Descent
    • 7.3 Mini-Batch Stochastic Gradient Descent
    • 7.4 Momentum
    • 7.5 AdaGrad algorithm
    • 7.6 RMSProp algorithm
    • 7.7 AdaDelta algorithm
    • 7.8 Adam algorithm
  • 8. Computational performance
    • 8.1 Mixed imperative and symbolic programming
    • 8.2 Asynchronous Computing
    • 8.3 Automatic Parallel computing
    • 8.4 Multi-GPU Computing
  • 9. Computer Vision
    • 9.1 Image Augmentation
    • 9.2 Fine-Tuning
    • 9.3 Object Detection and Bounding Boxes
    • 9.4 Anchor Boxes
    • 9.5 Multiscale Object Detection
    • 9.6 Object Detection Dataset (Pikachu)
    • To be updated…
  • 10. Natural language processing
    • 10.1 Word Embedding (word2vec)
    • 10.2 Approximate training
    • 10.3 Implementation of word2vec
    • 10.4 Subword Embedding (fastText)
    • 10.5 Word Embedding with Global Vectors (GloVe)
    • 10.6 Finding Synonyms and Analogies
    • 10.7 Text Sentiment Classification: Using Recurrent Neural Networks
    • 10.8 Text Sentiment Classification: Using Convolutional Neural Networks (textCNN)
    • 10.9 Encoder-Decoder (seq2seq)
    • 10.10 Beam Search
    • 10.11 Attention Mechanism
    • 10.12 Machine translation

Continuously updated…

Original book

Chinese version: 动手学深度学习 (Dive into Deep Learning) | GitHub repo
English version: Dive into Deep Learning | GitHub repo

Citation

If you use this project in your research, please cite the original book:

@book{zhang2019dive,
    title={Dive into Deep Learning},
    author={Aston Zhang and Zachary C. Lipton and Mu Li and Alexander J. Smola},
    note={\url{http://www.d2l.ai}},
    year={2019}
}