I. Algorithm overview
1 The past and present of the Haar classifier
Face detection belongs to the field of computer vision. Early on, the main research direction was face recognition, that is, identifying a person from a face image. Later, as the demand for detecting faces against complex backgrounds grew, face detection gradually developed into a separate research direction.
1.1 Current face detection methods fall into two main categories: knowledge-based and statistics-based.
ø Knowledge-based methods: prior knowledge is used to treat the face as a combination of organ features; faces are detected from the features of the eyes, eyebrows, mouth, nose and other organs and their geometric relationships to one another.
ø Statistics-based methods: the face is treated as a whole pattern, a two-dimensional pixel matrix. A face pattern space is built statistically from a large number of face image samples, and the presence of a face is judged by a similarity measure.
1.2 Knowledge-based face detection methods: template matching, facial features, shape and edges, texture features, color features.
1.3 Statistics-based face detection methods: principal component analysis and eigenfaces, neural networks, support vector machines, hidden Markov models, and the AdaBoost algorithm.

This paper introduces the Haar classifier method, which includes the AdaBoost algorithm; that algorithm will be described in detail later. "Classifier" here refers to an algorithm that classifies faces versus non-faces. In machine learning, many algorithms are processes of classifying or clustering things, and the ML module in OpenCV provides many algorithms for both.

Note: what is the difference between classification and clustering?
ø Classification: recognition methods in which the total number of object categories is known and the training data is labeled; for example, each sample is explicitly marked as face or non-face. This is supervised learning.
ø Clustering: methods that can deal with an unknown total number of categories, or with unlabeled training data. Clustering requires no category information during the learning stage and is unsupervised learning.
These algorithms include Mahalanobis distance, K-means, naive Bayes classifiers, decision trees, Boosting, random forests, Haar classifiers, expectation maximization, K-nearest neighbors, neural networks, and support vector machines.

The Haar classifier we are going to discuss is actually an application of Boosting: it uses the AdaBoost algorithm from the Boosting family and cascades the strong classifiers trained by AdaBoost. At the feature-extraction level it uses efficient rectangular features and the integral image method; the terms involved here will be discussed in detail below.
In 2001, Viola and Jones published the classic papers Rapid Object Detection using a Boosted Cascade of Simple Features and Robust Real-Time Face Detection, using the AdaBoost algorithm, Haar-like wavelet features, and the integral image method for face detection. They were not the first to propose wavelet features, but they designed features more effective for face detection and cascaded the strong classifiers trained by AdaBoost. This can be called a milestone in the history of face detection, and the proposed algorithm became known as the Viola-Jones detector. Some time later, Rainer Lienhart and Jochen Maydt extended this detector [3], which finally became OpenCV's current Haar classifier. AdaBoost, an algorithm proposed by Freund and Schapire in 1995, is a big improvement over the traditional Boosting algorithm. The core idea of Boosting is to boost a weak learning method into a strong learning algorithm; as the saying goes, "three cobblers with their wits combined equal one Zhuge Liang".

Haar classifier = Haar-like features + integral image + AdaBoost + cascade. The main points of the Haar classifier are:
① Use Haar-like features for detection.
② Use the integral image to accelerate the evaluation of Haar-like features.
③ Use the AdaBoost algorithm to train strong classifiers that distinguish faces from non-faces.
④ Use a screening cascade to chain the strong classifiers together, improving accuracy.
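Point ② is what makes feature evaluation fast: once the integral image is computed, the sum of pixels in any rectangle takes only four lookups, regardless of the rectangle's size. A minimal sketch in Python with NumPy (the function names are my own, not OpenCV's API):

```python
import numpy as np

def integral_image(img):
    # Prefix sums over rows then columns, padded with a zero border
    # so that rectangle sums need no boundary checks.
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, x, y, w, h):
    # Sum of pixels in the w-by-h rectangle with top-left corner (x, y),
    # using only four lookups in the integral image.
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]
```

Because every rectangle sum is O(1), the cost of evaluating a Haar-like feature no longer depends on the size of the sub-window, which is what makes exhaustive multi-scale scanning feasible.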
2 A shallow dive into the Haar classifier
2.1 Haar-like features: what are you?
What is a feature? Let me describe it in the following setting. In face detection, a sub-window slides continuously across the image to be detected. At each position of the sub-window, we compute the features of that region, then filter those features with our trained cascade classifier. Once the features pass the screening of every strong classifier, the region is judged to be a face. So how do we represent such a feature? Well, that is what the experts worked out, and they later called the things they devised Haar-like features.
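The sliding-window procedure just described can be sketched as follows (a Python toy; the `classify` callback is a stand-in for a real trained cascade, and the window size and step are arbitrary choices of mine):

```python
import numpy as np

def sliding_windows(image, win, step):
    # Yield every win-by-win sub-window as it slides across the
    # image with the given step in pixels, plus its top-left corner.
    H, W = image.shape
    for y in range(0, H - win + 1, step):
        for x in range(0, W - win + 1, step):
            yield x, y, image[y:y + win, x:x + win]

def detect(image, classify, win=24, step=4):
    # Keep every position whose sub-window the classifier accepts.
    return [(x, y) for x, y, patch in sliding_windows(image, win, step)
            if classify(patch)]
```

A real detector also rescales the window (or the image) to catch faces of different sizes; the loop structure stays the same.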
Here are the haar-like features proposed by Viola and Jones.
Here are the haar-like features proposed by Lienhart et al.
What are these so-called features but a bunch of striped rectangles? Let me explain. Place any one of the rectangles above on a face region, then subtract the sum of the pixels under the black area from the sum of the pixels under the white area; the resulting value is what we call the face feature value. If you place the same rectangle on a non-face region, the computed feature value should differ from the face feature value, and the larger the difference the better. So the purpose of these rectangles is to quantify facial features in order to distinguish faces from non-faces.
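The white-minus-black computation can be written down directly. Below is a toy sketch of one horizontal two-rectangle feature (my own function name and layout, not OpenCV's implementation):

```python
import numpy as np

def two_rect_feature(window, x, y, w, h):
    # Value of a horizontal two-rectangle Haar-like feature placed at
    # (x, y) inside the window: the pixel sum under the white (left)
    # rectangle minus the pixel sum under the black (right) one.
    white = window[y:y + h, x:x + w].sum()
    black = window[y:y + h, x + w:x + 2 * w].sum()
    return int(white - black)
```

On a region with a strong vertical edge the two halves differ and the value is large in magnitude; on a flat region the halves cancel and the value is near zero, which is exactly the discrimination the text describes.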
To increase discrimination, multiple rectangular features can be combined to obtain feature values with higher discriminative power. Which rectangular features to use, and how to combine them, so as to best distinguish faces from non-faces is exactly what the AdaBoost algorithm decides. So let us leave the integral image aside for a moment and, for continuity, go straight to AdaBoost.
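To make that division of labor concrete, here is a minimal AdaBoost sketch over decision stumps, one stump per rectangular feature. This illustrates only the re-weighting idea on hypothetical data; it is not the training procedure OpenCV actually uses:

```python
import numpy as np

def train_adaboost(features, labels, rounds):
    # features: (n_samples, n_features) matrix of rectangular feature
    # values; labels: +1 for face, -1 for non-face.
    n = len(labels)
    w = np.full(n, 1.0 / n)                 # sample weights
    learners = []
    for _ in range(rounds):
        best = None
        # Exhaustively pick the decision stump (feature, threshold,
        # polarity) with the lowest weighted error.
        for j in range(features.shape[1]):
            for thr in np.unique(features[:, j]):
                for pol in (1, -1):
                    pred = np.where(pol * (features[:, j] - thr) >= 0, 1, -1)
                    err = w[pred != labels].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol)
        err, j, thr, pol = best
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))  # stump weight
        pred = np.where(pol * (features[:, j] - thr) >= 0, 1, -1)
        w = w * np.exp(-alpha * labels * pred)  # boost misclassified samples
        w = w / w.sum()
        learners.append((alpha, j, thr, pol))
    return learners

def predict(learners, x):
    # Weighted vote of all stumps: this is the "strong classifier".
    s = sum(a * (1 if p * (x[j] - t) >= 0 else -1)
            for a, j, t, p in learners)
    return 1 if s >= 0 else -1
```

Each round re-weights the samples so the next stump concentrates on what the previous ones got wrong; the weighted vote of the selected stumps is the strong classifier that the cascade later chains together.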
II. Source code
function varargout = gui(varargin)
% GUI MATLAB code for gui.fig
% GUI, by itself, creates a new GUI or raises the existing
% singleton*.
%
% H = GUI returns the handle to a new GUI or the handle to
% the existing singleton*.
%
% GUI('CALLBACK',hObject,eventData,handles,...) calls the local
% function named CALLBACK in GUI.M with the given input arguments.
%
% GUI('Property','Value',...) creates a new GUI or raises the
% existing singleton*. Starting from the left, property value pairs are
% applied to the GUI before gui_OpeningFcn gets called. An
% unrecognized property name or invalid value makes property application
% stop. All inputs are passed to gui_OpeningFcn via varargin.
%
% *See GUI Options on GUIDE's Tools menu.  Choose "GUI allows only one
% instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES
% Edit the above text to modify the response to help gui
% Last Modified by GUIDE v2.5 19-Dec-2016 02:28:04
% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name', mfilename, ...
'gui_Singleton', gui_Singleton, ...
'gui_OpeningFcn', @gui_OpeningFcn, ...
'gui_OutputFcn', @gui_OutputFcn, ...
'gui_LayoutFcn', [], ...
'gui_Callback', []);
if nargin && ischar(varargin{1})
gui_State.gui_Callback = str2func(varargin{1});
end
if nargout
[varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT
% --- Executes just before gui is made visible.
function gui_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject handle to figure
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% varargin command line arguments to gui (see VARARGIN)
% Choose default command line output for gui
handles.output = hObject;
% Update handles structure
guidata(hObject, handles);
% UIWAIT makes gui wait for user response (see UIRESUME)
% uiwait(handles.figure1);
% --- Outputs from this function are returned to the command line.
function varargout = gui_OutputFcn(hObject, eventdata, handles)
% varargout cell array for returning output args (see VARARGOUT);
% hObject handle to figure
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% Get default command line output from handles structure
varargout{1} = handles.output;

%%%%%%%%%%%%%%%%%%%%%%%%%%%%% button functions %%%%%%%%%%%%%%%%%%%%%%%%%%%%

% --- Executes on button press in pushbutton1.
function pushbutton1_Callback(hObject, eventdata, handles)
% Read image button
[filename, pathname] = uigetfile( ...
    {'*.jpg;*.jpeg;*.tif;*.png;*.gif', 'All Image Files'; '*.*', 'All Files'}, ...
    'Please select a picture');
str=[pathname filename];
global im;
im = imread(str);
axes(handles.axes1);
imshow(im);
title(filename);
set(handles.edit1, 'string', sprintf('%d x %d', size(im,1), size(im,2)));
% --- Executes on button press in pushbutton2.
function pushbutton2_Callback(hObject, eventdata, handles)
% Exit program button
close;
% --- Executes on button press in pushbutton3.
function pushbutton3_Callback(hObject, eventdata, handles)
% Face recognition button
mergeThreshold = str2num(get(handles.edit2, 'String'));
maxS = str2num(get(handles.edit3, 'String'));
minS = str2num(get(handles.edit4, 'String'));
isPrint = 0;  % 0 means do not print the facial features
isTest  = 0;  % 1 means run the facial-feature test
if get(handles.radiobutton1, 'value')
    isPrint = 1;
end
if get(handles.radiobutton2, 'value')
    isTest = 1;
end
% Build the detector
detector = buildDetector(mergeThreshold, 1, maxS, minS);
global im;
[bbimg, faces] = detectFaceParts(detector, im, 5, isPrint, isTest);
axes(handles.axes2);
imshow(bbimg);
title('Recognition result');
for i = 1:7
    % Clear the previous result images
    str = ['cla(handles.axes' num2str(i+2) ')'];
    eval(str);
end
for i=1:7
if i > size(faces, 1)
break;
end
str = ['axes(handles.axes' num2str(i+2) ') '];
eval(str);
imshow(faces{i});
title(sprintf('face %d', i));
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% --- Executes during object creation, after setting all properties.
function axes1_CreateFcn(hObject, eventdata, handles)
% hObject handle to axes1 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called
set(hObject, 'xtick', []);  % Hide the x-axis ticks
set(hObject, 'ytick', []);  % Hide the y-axis ticks
set(hObject, 'box', 'on');
% Hint: place code in OpeningFcn to populate axes1

% --- Executes during object creation, after setting all properties.
function axes2_CreateFcn(hObject, eventdata, handles)
% hObject handle to axes2 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called
set(hObject, 'xtick', []);  % Hide the x-axis ticks
set(hObject, 'ytick', []);  % Hide the y-axis ticks
set(hObject, 'box', 'on');
% Hint: place code in OpeningFcn to populate axes2

% --- Executes during object creation, after setting all properties.
function axes3_CreateFcn(hObject, eventdata, handles)
% hObject handle to axes3 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called
set(hObject, 'xtick', []);  % Hide the x-axis ticks
set(hObject, 'ytick', []);  % Hide the y-axis ticks
set(hObject, 'box', 'on');
% Hint: place code in OpeningFcn to populate axes3

% --- Executes during object creation, after setting all properties.
function axes4_CreateFcn(hObject, eventdata, handles)
% hObject handle to axes4 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called
set(hObject, 'xtick', []);  % Hide the x-axis ticks
set(hObject, 'ytick', []);  % Hide the y-axis ticks
set(hObject, 'box', 'on');
% Hint: place code in OpeningFcn to populate axes4

% --- Executes during object creation, after setting all properties.
function axes5_CreateFcn(hObject, eventdata, handles)
% hObject handle to axes5 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called
set(hObject, 'xtick', []);  % Hide the x-axis ticks
set(hObject, 'ytick', []);  % Hide the y-axis ticks
set(hObject, 'box', 'on');
% Hint: place code in OpeningFcn to populate axes5

% --- Executes during object creation, after setting all properties.
function axes6_CreateFcn(hObject, eventdata, handles)
% hObject handle to axes6 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called
set(hObject, 'xtick', []);  % Hide the x-axis ticks
set(hObject, 'ytick', []);  % Hide the y-axis ticks
set(hObject, 'box', 'on');
% Hint: place code in OpeningFcn to populate axes6
III. Operation results
IV. Note
Version: MATLAB R2014a