Neural network – support vector machine

The support vector machine (SVM) was first proposed by Cortes and Vapnik in 1995. It offers many unique advantages for small-sample, nonlinear, and high-dimensional pattern recognition, and it generalizes to other machine learning problems such as function fitting.

2. Flower pollination algorithm

The flower pollination algorithm (FPA) was proposed in 2012 by Yang, a scholar at the University of Cambridge. Its basic idea comes from simulating the self-pollination and cross-pollination of flowers in nature, and it is a new meta-heuristic swarm-intelligence stochastic optimization technique. To simplify the calculation, it is assumed that each plant has only one flower and each flower produces only one pollen gamete. Each pollen gamete can then be regarded as a candidate solution in the solution space.

Yang abstracted the following four rules from his study of flower pollination:

1) Biotic cross-pollination is treated as the global exploration behavior of the algorithm (global pollination), carried out by pollinators via the Lévy flight mechanism.

2) Abiotic self-pollination is treated as the local exploitation behavior of the algorithm (local pollination).

3) Flower constancy can be regarded as the reproduction probability, which is proportional to the similarity of the two flowers involved in pollination.

4) Switching between global and local pollination is governed by a switch probability p ∈ [0,1]. Because of physical proximity, wind, and other factors, p is a very important parameter in the whole pollination process. According to the experimental study of this parameter in reference [1], p = 0.8 works better for optimization.
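Rule 4 can be sketched in a few lines of Python (a hypothetical illustration, not part of the original implementation): draw a uniform random number each step and compare it with the switch probability p.

```python
import random

def pollination_mode(p, rng):
    """Rule 4: choose global or local pollination with switch probability p."""
    return "global" if rng.random() < p else "local"

rng = random.Random(0)
modes = [pollination_mode(0.8, rng) for _ in range(10000)]
frac_global = modes.count("global") / len(modes)
# With p = 0.8, roughly 80% of the steps end up as global pollination.
```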

Algorithm steps (taking multivariate function minimization as an example):

min g = f(x1, x2, x3, …, xd)

Set the parameters: N (number of candidate solutions), iter (maximum number of iterations), p (switch probability), λ (Lévy flight parameter).

Initialize the flowers: randomly generate an N×d matrix;

Calculate the fitness, i.e., the objective function values;

Record the optimal solution and its position;
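These initialization steps can be sketched in pure Python (a hypothetical sphere function stands in for the objective f; in practice the N×d matrix would usually be a NumPy array):

```python
import random

def f(x):
    # Hypothetical objective: sphere function, minimum 0 at the origin.
    return sum(xi * xi for xi in x)

def init_population(N, d, lb, ub, seed=0):
    """Randomly fill an N x d matrix of candidate flowers and evaluate fitness."""
    rng = random.Random(seed)
    flowers = [[rng.uniform(lb, ub) for _ in range(d)] for _ in range(N)]
    fitness = [f(x) for x in flowers]
    best = min(range(N), key=lambda i: fitness[i])
    # Record the optimal solution and its position (index) in the population.
    return flowers, fitness, flowers[best], fitness[best]

flowers, fitness, x_best, f_best = init_population(N=20, d=3, lb=-5.0, ub=5.0)
```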

for t = 1 : iter                    % outer loop A
    for i = 1 : N                   % inner loop B
        if rand < p
            global pollination;
        else
            local pollination;
        end
        update the new generation of flowers and their fitness
        (function variables and function values);
    end
    record the optimal solution of the new generation and its position;
end

Global update formula: x_i(t+1) = x_i(t) + L·(x_i(t) − x_best(t)), where L is a step drawn from a Lévy distribution; see the cuckoo search algorithm for details.
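A common way to draw the Lévy-distributed step L (the same trick used in cuckoo search) is Mantegna's algorithm, with the typical exponent λ = 1.5. A Python sketch, applied element-wise to the global update formula:

```python
import math
import random

def levy_step(rng, lam=1.5):
    """Mantegna's algorithm: one step from an approximate Levy(lam) distribution."""
    sigma = (math.gamma(1 + lam) * math.sin(math.pi * lam / 2)
             / (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    u = rng.gauss(0, sigma)   # numerator ~ N(0, sigma^2)
    v = rng.gauss(0, 1)       # denominator ~ N(0, 1)
    return u / abs(v) ** (1 / lam)

def global_pollination(x, x_best, rng):
    """Global update: x_i(t+1) = x_i(t) + L * (x_i(t) - x_best(t)), element-wise."""
    return [xi + levy_step(rng) * (xi - bi) for xi, bi in zip(x, x_best)]

rng = random.Random(0)
x_new = global_pollination([0.2, 0.9], [0.5, 0.5], rng)
```

The heavy tail of the Lévy distribution produces occasional long jumps, which is what gives global pollination its exploration power.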

Local update formula: x_i(t+1) = x_i(t) + ε·(x_j(t) − x_k(t)), where ε is a random number drawn uniformly from [0,1], and x_j and x_k are two different individuals from the population.
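Putting the two update formulas and the switch rule together, a minimal pure-Python FPA loop might look like the sketch below. This is a simplified stand-in, not the memory-based multimodal MATLAB implementation later in the post; the greedy replacement step is an assumption added to make "update the new generation" concrete.

```python
import math
import random

def fpa_minimize(f, d, lb, ub, n=25, iters=200, p=0.8, lam=1.5, seed=0):
    """Minimal flower pollination algorithm sketch (minimization)."""
    rng = random.Random(seed)

    def levy():
        # Mantegna's algorithm for a Levy-distributed step
        sigma = (math.gamma(1 + lam) * math.sin(math.pi * lam / 2)
                 / (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
        return rng.gauss(0, sigma) / abs(rng.gauss(0, 1)) ** (1 / lam)

    clip = lambda v: min(max(v, lb), ub)
    pop = [[rng.uniform(lb, ub) for _ in range(d)] for _ in range(n)]
    fit = [f(x) for x in pop]
    best = min(range(n), key=lambda i: fit[i])
    x_best, f_best = pop[best][:], fit[best]

    for _ in range(iters):
        for i in range(n):
            if rng.random() < p:              # global pollination (Levy flight)
                cand = [clip(xi + levy() * (xi - bi))
                        for xi, bi in zip(pop[i], x_best)]
            else:                             # local pollination
                eps = rng.random()
                j, k = rng.sample(range(n), 2)
                cand = [clip(xi + eps * (xj - xk))
                        for xi, xj, xk in zip(pop[i], pop[j], pop[k])]
            fc = f(cand)
            if fc < fit[i]:                   # greedy replacement (assumed)
                pop[i], fit[i] = cand, fc
                if fc < f_best:
                    x_best, f_best = cand[:], fc
    return x_best, f_best

# Example: minimize the sphere function in 2-D
x_best, f_best = fpa_minimize(lambda x: sum(v * v for v in x), d=2, lb=-5, ub=5)
```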

3. Code

% Note: fitFunc, memUpdate, Levy, simplebounds, get_best_nest and
% cleanMemory are helper functions defined elsewhere.
function [mem, bestSol, bestFit, optima, FunctionCalls] = FPA(para)
% Default parameters
if nargin < 1
    para = [50 0.25 500];
end
n = para(1);        % Population size
p = para(2);        % Probability switch
N_iter = para(3);   % Number of iterations
phase = 1;                      % First state
phaseIte = [0.5, 0.9, 1.01];    % State vector

% Deb function
d = 1;
Lb = 0;
Ub = 1;
optima = [.1; .3; .5; .7; .9];

% Initialize the population
for i = 1:n
    Sol(i,:) = Lb + (Ub - Lb).*rand(1,d);
    Fitness(i) = fitFunc(Sol(i,:));   % Evaluate fitness function
end

% Initialize the memory
[mem, bestSol, bestFit, worstF] = memUpdate(Sol, Fitness, [], zeros(1,d), ...
    100000000, 0, phase, d, Ub, Lb);
S = Sol;
FunctionCalls = 0;

% Main loop
for ite = 1:N_iter
    % For each pollen gamete, modify each position according
    % to local or global pollination
    for i = 1:n
        % Switch probability (note: here rand > p selects global pollination)
        if rand > p
            L = Levy(d);
            dS = L.*(Sol(i,:) - bestSol);
            S(i,:) = Sol(i,:) + dS;
            S(i,:) = simplebounds(S(i,:), Lb, Ub);
        else
            epsilon = rand;
            % Find random flowers in the neighbourhood
            JK = randperm(n);
            % As they are random, the first two entries are also random.
            % If the flowers are the same or similar species, they can
            % be pollinated; otherwise, no action.
            % Formula: x_i^{t+1} = x_i^t + epsilon*(x_j^t - x_k^t)
            S(i,:) = S(i,:) + epsilon*(Sol(JK(1),:) - Sol(JK(2),:));
            % Check that the simple limits/bounds are OK
            S(i,:) = simplebounds(S(i,:), Lb, Ub);
        end
        Fitness(i) = fitFunc(S(i,:));
    end
    % Update the memory
    [mem, bestSol, bestFit, worstF] = memUpdate(S, Fitness, mem, bestSol, ...
        bestFit, worstF, phase, d, Ub, Lb);
    Sol = get_best_nest(S, mem, p);
    FunctionCalls = FunctionCalls + n;
    if ite/N_iter > phaseIte(phase)
        % Next evolutionary process stage
        phase = phase + 1;
        [m, ~] = size(mem);
        % Depurate the memory for each stage
        mem = cleanMemory(mem);
        FunctionCalls = FunctionCalls + m;
    end
end

% Plot the solutions (mem) found by the multimodal framework
x = 0:.01:1;
y = (sin(5.*pi.*x)).^6;
plot(x, y)
hold on
plot(mem(:,1), -mem(:,2), 'r*');

4. References

[1] MATLAB Neural Network 43 Case Analysis (book).