Li Hang’s Statistical Learning Methods is widely regarded as a treasured introduction to machine learning; many machine learning training courses and the interview and written-exam questions of Internet companies draw on this book. On May 1, 2019, the second edition of Statistical Learning Methods was published, and my GitHub repository has been updated accordingly.
GitHub (code implementation of Statistical Learning Methods by Li Hang):
Github.com/fengdu78/li…
This update corrects several errors and adds an overview of each chapter. The first 12 chapters have been updated; content from the second edition will be added later.
Major errors fixed:
Chapter 3: the max_count error in the k-nearest neighbor method
Chapter 10: the Viterbi index error in the hidden Markov model
Content added:
A summary of each chapter
Introduction to Statistical Learning Methods
Statistical Learning Methods comprehensively and systematically introduces the main methods of statistical learning, especially supervised learning methods, including the perceptron, the k-nearest neighbor method, the naive Bayes method, decision trees, logistic regression, support vector machines, boosting methods, the EM algorithm, hidden Markov models, and conditional random fields. Apart from the introduction (Chapter 1) and the concluding summary, each chapter introduces one method. The narration starts from concrete problems or examples, proceeds from the simple to the deep, clarifies the ideas, and gives the necessary mathematical derivations, so that readers can grasp the essence of statistical learning methods and learn to use them.
First edition table of contents:
Chapter 1 Introduction to Statistical Learning Methods
Chapter 2 Perceptron
Chapter 3 k-Nearest Neighbor Method
Chapter 4 Naive Bayes Method
Chapter 5 Decision Tree
Chapter 6 Logistic Regression and Maximum Entropy Model
Chapter 7 Support Vector Machine
Chapter 8 Boosting Methods
Chapter 9 EM Algorithm and Its Generalizations
Chapter 10 Hidden Markov Model
Chapter 11 Conditional Random Field
Chapter 12 Summary of Statistical Learning Methods
Second edition table of contents:
Part 1 Supervised Learning
Chapter 1 Introduction to Statistical Learning and Supervised Learning
Chapter 2 Perceptron
Chapter 3 k-Nearest Neighbor Method
Chapter 4 Naive Bayes Method
Chapter 5 Decision Tree
Chapter 6 Logistic Regression and Maximum Entropy Model
Chapter 7 Support Vector Machine
Chapter 8 Boosting Methods
Chapter 9 EM Algorithm and Its Generalizations
Chapter 10 Hidden Markov Model
Chapter 11 Conditional Random Field
Chapter 12 Summary of Supervised Learning Methods
Part 2 Unsupervised Learning
Chapter 13 Introduction to Unsupervised Learning
Chapter 14 Clustering Methods
Chapter 15 Singular Value Decomposition
Chapter 16 Principal Component Analysis
Chapter 17 Latent Semantic Analysis
Chapter 18 Probabilistic Latent Semantic Analysis
Chapter 19 Markov Chain Monte Carlo Method
Chapter 20 Latent Dirichlet Allocation
Chapter 21 PageRank Algorithm
Chapter 22 Summary of Unsupervised Learning Methods
Appendix A Gradient descent method
Appendix B Newton’s method and quasi-Newton’s method
Appendix C Lagrange duality
Appendix D Basic subspaces of matrices
Appendix E Definition of KL divergence and properties of the Dirichlet distribution
I have compared the two editions: the first 12 chapters of the second edition are fully consistent with the first edition, with the first edition's errors corrected, so I suggest buying the second edition directly. Purchase link:
Code implementation of Statistical Learning Methods
Statistical Learning Methods does not officially provide a code implementation, but many machine learning enthusiasts online have tried to implement the code for each chapter.
This site collected some of that code from GitHub, made some modifications, and implemented the code for Chapters 1-12 in Python 3.6.
Code directory and screenshots:
Figure: code directory (ipynb format)
import math
import matplotlib.pyplot as plt
import numpy as np
# SimHei lets matplotlib render Chinese labels; keep minus signs displayed correctly
plt.rcParams['font.sans-serif'] = ['SimHei']
plt.rcParams['axes.unicode_minus'] = False
plt.figure(figsize=(10, 8))
x = np.linspace(start=-1, stop=2, num=1001, dtype=float)
logi = np.log(1 + np.exp(-x)) / math.log(2)  # logistic loss (log base 2)
boost = np.exp(-x)                           # exponential (AdaBoost) loss
y_01 = x < 0                                 # 0/1 loss
y_hinge = 1.0 - x                            # hinge loss: max(0, 1 - yf(x))
y_hinge[y_hinge < 0] = 0
plt.plot(x, y_01, 'g-', mec='k', label='0/1 Loss', lw=2)
plt.plot(x, y_hinge, 'b-', mec='k', label='Hinge Loss', lw=2)
plt.plot(x, boost, 'm--', mec='k', label='Adaboost (Exponential) Loss', lw=2)
plt.plot(x, logi, 'r-', mec='k', label='Logistic Loss', lw=2)
plt.grid(True, ls='--')
plt.legend(loc='upper right', fontsize=15)
plt.xlabel('$yf(x)$', fontsize=20)
plt.title('Loss function', fontsize=20)
plt.show()
Figure: code screenshot (Chapter 12, plotting the loss functions; ipynb format)
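For reference, the four curves in this figure are the loss functions that the script above computes, written as functions of the margin $yf(x)$ (these expressions simply restate what the code does; they are not the book's exact notation):
0/1 loss: $L = 1$ if $yf(x) < 0$, otherwise $L = 0$
Hinge loss: $L = \max(0, 1 - yf(x))$
Exponential (AdaBoost) loss: $L = e^{-yf(x)}$
Logistic loss: $L = \log_2(1 + e^{-yf(x)})$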
Courseware
“Statistical Learning Methods” courseware
Author: Yuan Chun of the Tsinghua University Shenzhen Graduate School, who provided PPT slides for 12 chapters.
Figure: courseware screenshot
Conclusion
Statistical Learning Methods is a must-have treasure book for machine learning. This article has collected its code implementation, courseware, and e-book for download.
GitHub (code implementation of Statistical Learning Methods by Li Hang):
Github.com/fengdu78/li…
Note: the materials on Dr. Huang Haiguang's GitHub can also be downloaded directly from Baidu Cloud.
Site introduction ↓↓↓
“Machine Learning Beginners” is a personal official account that helps artificial intelligence enthusiasts get started (founder: Huang Haiguang).
What beginners most need on the road to getting started is “a helping hand”, not “icing on the cake”.
ID: 92416895
Its Knowledge Planet community currently ranks No. 1 in the machine learning category.
Highlights from past posts
- Conscience recommendation: a machine learning beginner's guide and learning resources (2018 edition)
- GitHub mirror download from Dr. Huang Haiguang (machine learning and deep learning resources)
- Printable version of the machine learning and deep learning course notes
- Machine Learning Cheat Sheet: understand machine learning like memorizing TOEFL vocabulary
- Getting started with deep learning: Python Deep Learning, with the original code annotated in Chinese, plus the e-book
- The mathematical foundations of machine learning
- A must-have machine learning treasure book: Python code implementation, e-book, and courseware for “Statistical Learning Methods”
- Strongly recommended collection: dissertation typesetting tutorial (complete version)
- Installing the Python environment (Anaconda + Jupyter Notebook + PyCharm)
- What to do if your Python code is ugly? A few handy tools to save you
- Blockbuster | A complete AI learning path with the most detailed resource guide!
Note: this site's QQ group is 865189078 (8 groups in total; please do not join more than one).
To join this site's WeChat group, please add Dr. Huang's assistant on WeChat, with the note: official account user group.