K nearest neighbor algorithm
1. K nearest neighbor classification
Is the fruit an orange or a grapefruit? Classify fruits by their characteristics: size and color.
2. K nearest neighbor feature extraction
In the example above, size and color are the features used for comparison. Each feature can be plotted as an axis, so every fruit becomes a point, and the distance between points measures their similarity. The distance can be computed with the Pythagorean formula.
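A minimal sketch of that distance computation, treating each fruit as a point with made-up (size, redness) values:

```python
import math

def distance(a, b):
    """Pythagorean (Euclidean) distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Each fruit is a point (size, redness); the values here are illustrative only.
print(distance((6, 3), (8, 6)))  # sqrt(2**2 + 3**2) ≈ 3.61
```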
There can be more than two features; with multi-dimensional features the distance is still computed with the Pythagorean formula. When choosing how many nearest neighbors to look at, you can pick 2, 10, or 10,000 instead of 3. That is why the algorithm is called k-nearest neighbors rather than 3-nearest neighbors!
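A minimal classification sketch, assuming the same made-up (size, redness) features and a small hand-written training set; the k nearest training points vote on the label:

```python
import math
from collections import Counter

def knn_classify(query, examples, k=3):
    """examples: list of (feature_vector, label) pairs.
    Return the majority label among the k nearest training examples."""
    nearest = sorted(examples, key=lambda ex: math.dist(query, ex[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Made-up training data: (size, redness) -> fruit type.
training = [((6, 3), "orange"), ((5, 4), "orange"),
            ((8, 6), "grapefruit"), ((9, 7), "grapefruit")]
print(knn_classify((8, 6.5), training, k=3))  # likely "grapefruit"
```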
3. K nearest neighbor regression (predicting a value)
The prediction is obtained by averaging the values of the K nearest neighbors.
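A minimal regression sketch; the scenario (predicting a numeric value from two features) and all numbers are made up for illustration:

```python
import math

def knn_regress(query, examples, k=3):
    """examples: list of (feature_vector, numeric_value) pairs.
    Predict by averaging the values of the k nearest neighbors."""
    nearest = sorted(examples, key=lambda ex: math.dist(query, ex[0]))[:k]
    return sum(value for _, value in nearest) / len(nearest)

# Made-up history: (feature_1, feature_2) -> observed value.
history = [((5, 1), 300), ((3, 1), 225), ((1, 1), 75), ((4, 0), 200), ((4, 0), 150)]
print(knn_regress((4, 1), history, k=3))  # average of the 3 nearest values
```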
Machine learning
OCR: recognizing digits from images. KNN can be used (see the sketch after this list).
- Go through a large number of digit images and extract features from those digits. (Training)
- When a new image comes in, extract its features and find its nearest neighbors among the training examples; they determine the predicted digit.
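A rough sketch of that idea, assuming an image is just a small grid of pixel intensities (real OCR systems extract richer features):

```python
import math
from collections import Counter

def extract_features(image):
    """Flatten a 2D grid of pixel intensities into a feature vector."""
    return tuple(pixel for row in image for pixel in row)

def train(labeled_images):
    """Training a KNN model is simply storing (features, label) pairs."""
    return [(extract_features(img), digit) for img, digit in labeled_images]

def classify(model, image, k=5):
    """Extract the new image's features and let its k nearest neighbors vote."""
    features = extract_features(image)
    nearest = sorted(model, key=lambda ex: math.dist(features, ex[0]))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```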
Naive Bayes classifier
Given an item's features, it calculates the probability that the item belongs to class X and assigns the item to the most probable class.
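A minimal sketch of that calculation, with made-up word probabilities: multiply each class's prior by the probability of every observed feature given that class, then pick the class with the highest score.

```python
def naive_bayes_classify(features, priors, likelihoods):
    """priors: {class: P(class)}.
    likelihoods: {class: {feature: P(feature | class)}}.
    Return the class whose prior * product of feature likelihoods is largest."""
    best_class, best_score = None, 0.0
    for cls, prior in priors.items():
        score = prior
        for f in features:
            score *= likelihoods[cls].get(f, 1e-6)  # tiny default for unseen features
        if score > best_score:
            best_class, best_score = cls, score
    return best_class

# Made-up probabilities for a spam-vs-not-spam example.
priors = {"spam": 0.4, "not spam": 0.6}
likelihoods = {
    "spam": {"win": 0.3, "money": 0.25, "meeting": 0.01},
    "not spam": {"win": 0.02, "money": 0.05, "meeting": 0.2},
}
print(naive_bayes_classify(["win", "money"], priors, likelihoods))  # "spam"
```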