To use Deep Network Designer, open MATLAB, find Deep Network Designer on the Apps tab, and click to open it. The workflow is described in my introduction to LeNet-5, so it will not be repeated here. Address: www.jianshu.com/p/86f591c44…
Deep Network Designer
AlexNet
Network flowchart
Network parameter details
AlexNet is the cornerstone of the VGG and ResNet network families. The novel features of its architecture are described below.
1. The ReLU activation replaces the sigmoid and tanh functions. Practice has shown that this makes the network converge faster.
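As a quick illustration of why ReLU converges faster, here is a minimal sketch (plain numpy, not tied to any framework) comparing ReLU with sigmoid: for large inputs the sigmoid saturates and its gradient vanishes, while the ReLU gradient stays at 1 for any positive input.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x). Gradient is 1 wherever x > 0, so deep
    # networks avoid the vanishing gradients of saturating activations.
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-2.0, 0.0, 5.0])
print(relu(x))                          # negative inputs clamp to 0

# Sigmoid gradient s*(1-s) nearly vanishes for large |x|,
# while the ReLU gradient for x = 5 is simply 1.
s = sigmoid(5.0)
print(s * (1.0 - s))                    # a very small number
```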
2. AlexNet also employs max pooling: within each window of adjacent pixels, the maximum value is taken as that window's output. In LeNet the pooling windows do not overlap; in AlexNet they do. (PS: I also adopted max pooling in LeNet.) Extensive practice shows that overlapping max pooling helps mitigate over-fitting and improves performance.
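The difference between the two pooling schemes comes down to window size versus stride. The sketch below (a hypothetical numpy helper, written for illustration) shows both: non-overlapping pooling uses stride equal to the window size, while AlexNet-style overlapping pooling uses a 3x3 window with stride 2, so adjacent windows share a row and column of pixels.

```python
import numpy as np

def max_pool2d(x, k, s):
    """Max pooling over a 2-D array with a k x k window and stride s.
    When s < k (e.g. AlexNet's k=3, s=2), the windows overlap."""
    h, w = x.shape
    oh, ow = (h - k) // s + 1, (w - k) // s + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = x[i*s:i*s+k, j*s:j*s+k].max()
    return out

x = np.arange(25, dtype=float).reshape(5, 5)
# Non-overlapping (LeNet-style): 2x2 window, stride 2
print(max_pool2d(x, 2, 2))   # → [[6, 8], [16, 18]]
# Overlapping (AlexNet-style): 3x3 window, stride 2
print(max_pool2d(x, 3, 2))   # → [[12, 14], [22, 24]]
```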
3. To avoid the over-fitting caused by rapid updates of the network parameters, a fixed proportion of neurons is randomly "dropped" each time the training samples are used to update the parameters. Dropped neurons no longer participate in that training pass, and the weights of their input and output connections are not updated. Each training pass therefore uses a different network architecture, and these architectures share the jointly trained weights. Practice shows that this random dropping (dropout) slows convergence somewhat but avoids over-fitting with high probability.
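The dropping step can be sketched in a few lines. Note this uses "inverted" dropout, which rescales the surviving activations by 1/(1-p) during training so the expected activation is unchanged at inference time; the original AlexNet paper instead scaled outputs at test time, but the idea is the same. The function below is an illustrative helper, not any library's API.

```python
import numpy as np

def dropout(a, p, rng, train=True):
    # Inverted dropout: during training, zero each activation with
    # probability p and rescale survivors by 1/(1-p); at inference,
    # pass activations through unchanged.
    if not train:
        return a
    mask = rng.random(a.shape) >= p     # True = neuron survives this pass
    return a * mask / (1.0 - p)

rng = np.random.default_rng(0)
a = np.ones(10)
# With p = 0.5 (AlexNet's setting), each value is either dropped to 0
# or rescaled to 2.0; a fresh random mask is drawn every training pass.
print(dropout(a, 0.5, rng))
print(dropout(a, 0.5, rng, train=False))   # inference: unchanged
```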
4. Training on multiple GPUs. A single GPU has limited memory, so two GPUs are used, each storing half of the kernels. The two GPUs communicate with each other only at specific layers.
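This kernel split is a form of model parallelism, and the arithmetic behind it can be shown with a toy example. The sketch below simulates the idea in plain numpy: a layer's kernels (rows of a weight matrix here, for simplicity) are divided between two "devices", each half computes its feature maps independently, and the results are concatenated at a synchronization layer. The shapes are illustrative, not AlexNet's actual 96-kernel first layer.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(8)          # one flattened input patch
W = rng.standard_normal((6, 8))     # 6 kernels, one per row

# Split the kernels across two simulated devices.
W_gpu0, W_gpu1 = W[:3], W[3:]       # 3 kernels per "GPU"
y0 = W_gpu0 @ x                     # computed on "GPU 0"
y1 = W_gpu1 @ x                     # computed on "GPU 1"

# At a communicating layer, the halves are exchanged and concatenated
# along the channel axis, reproducing the unsplit layer's output.
y = np.concatenate([y0, y1])
print(np.allclose(y, W @ x))        # the split changes nothing numerically
```

Because each half only needs its own kernels in memory, the per-device storage is halved, which is exactly why Krizhevsky et al. spread AlexNet over two GPUs.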