Title: Neural Networks and Deep Learning
In Course 1, we covered the basic concepts of deep learning. The machine learning course I took earlier already included material on neural networks, and I had a basic grasp of the forward propagation and backpropagation algorithms, so this was relatively easy to pick up. But there are also differences:
- There are some changes in notation: the parameters are w and b, and b is kept as a separate bias term that cannot be omitted (rather than being folded into the weight vector).
- The ReLU activation function is introduced (the earlier ML course only covered sigmoid). In deep neural networks, sigmoid is no longer used as the hidden-layer activation and is typically reserved for the output layer (a minimal sketch of both activations follows this list).
- The concept of hyperparameters is introduced: the model itself is the set of W and b, and hyperparameters control how W and b are learned. One of the most important is the learning_rate. Deep learning has many hyperparameters, and their selection and tuning are very important (see the update-rule sketch after this list).
- We have long drawn the analogy between neural networks and the brain. Andrew Ng said that after so many years of development, he is no longer inclined to make this simple comparison, because the link between the two may not be as close as once thought. Although the analogy inspired many deep learning methods, on the whole deep learning has gone its own way, and the complexity and secrets of the brain's neurons are still far from being revealed.
- Python and Jupyter Notebook are introduced as the environment for the homework (and of course the TensorFlow framework, which is exciting). The computation flow charts drawn here are very clear: forward propagation and backpropagation are really just forward computation and reverse differentiation, and understood through the chain rule they become clear and easy (I used to find backpropagation a little hard to follow). A small forward/backward sketch follows this list.
- A simple random parameter initialization method is introduced (see the initialization sketch below).
- The basic structure of the deep neural network propagation algorithm is explained at the implementation level (an L-layer forward-pass sketch closes this section).
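
To make the activation point concrete, here is a minimal NumPy sketch of the two functions. The names `relu` and `sigmoid` are mine, for illustration:

```python
import numpy as np

def relu(z):
    # ReLU: the usual choice for hidden layers in deep networks
    return np.maximum(0, z)

def sigmoid(z):
    # sigmoid: squashes values into (0, 1); typically used only at the output layer
    return 1.0 / (1.0 + np.exp(-z))
```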
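
As a sketch of what the learning_rate hyperparameter controls, here is a plain gradient-descent update step. The dictionary layout (`params` holding `W1`, `b1`, ... and `grads` holding `dW1`, `db1`, ...) is an assumption for illustration:

```python
def update_parameters(params, grads, learning_rate=0.01):
    # gradient descent: the learning_rate hyperparameter scales every step
    for key in params:                      # keys like "W1", "b1", "W2", ...
        params[key] = params[key] - learning_rate * grads["d" + key]
    return params
```

A learning rate that is too large can make the loss diverge, while one that is too small makes training slow, which is why it is among the first hyperparameters to tune.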
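
The chain-rule view of forward and backward propagation fits in a few lines. Here is a minimal sketch for a single sigmoid unit with cross-entropy loss, assuming `X` has shape (features, examples) and `Y` holds the labels; this is my own illustration, not the course's exact code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W, b):
    # forward pass: compute and return the activation
    Z = W @ X + b            # linear step
    A = sigmoid(Z)           # activation step
    return A

def backward(X, Y, A):
    # backward pass via the chain rule for cross-entropy loss:
    # dL/dZ = A - Y, then propagate back to the parameters
    m = X.shape[1]
    dZ = A - Y
    dW = (dZ @ X.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    return dW, db
```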
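
A minimal sketch of the kind of simple random initialization described: small random weights to break symmetry between units, zero biases. The `0.01` scale and the `layer_dims` argument (a list of layer sizes) are illustrative assumptions:

```python
import numpy as np

def initialize_parameters(layer_dims):
    # small random weights break symmetry; biases can safely start at zero
    params = {}
    for l in range(1, len(layer_dims)):
        params["W" + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        params["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return params
```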
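
Finally, a sketch of how the forward pass of a deep network can be structured at the implementation level: a loop over the hidden layers with ReLU, and sigmoid at the output. The parameter-dictionary convention (`W1`, `b1`, ...) is carried over from the initialization sketch above and is an assumption of mine:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def model_forward(X, params):
    # forward pass through an L-layer network:
    # ReLU on the hidden layers, sigmoid on the output layer
    L = len(params) // 2                 # each layer contributes a W and a b
    A = X
    for l in range(1, L):
        A = relu(params["W" + str(l)] @ A + params["b" + str(l)])
    return sigmoid(params["W" + str(L)] @ A + params["b" + str(L)])
```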