Neural network – support vector machine
1. Support vector machine

The Support Vector Machine (SVM) was first proposed by Cortes and Vapnik in 1995. It offers distinct advantages for small-sample, nonlinear, and high-dimensional pattern recognition problems, and it extends naturally to other machine learning tasks such as function fitting (regression).
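Before moving to the optimizer, here is a minimal SVM classification sketch in MATLAB. It assumes the Statistics and Machine Learning Toolbox (for fitcsvm and predict); the two-cluster data set is synthetic and only for illustration.

% Minimal SVM classification sketch (assumes the Statistics and
% Machine Learning Toolbox; the data is synthetic, for illustration)
rng(1);                                  % reproducibility
X = [randn(50,2)+1.5; randn(50,2)-1.5];  % two Gaussian clusters
Y = [ones(50,1); -ones(50,1)];           % class labels

mdl = fitcsvm(X, Y, 'KernelFunction', 'rbf', ...
              'BoxConstraint', 1, 'KernelScale', 'auto');

Yhat = predict(mdl, X);                  % predictions on the training data
fprintf('Training accuracy: %.2f%%\n', 100*mean(Yhat == Y));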
2. The lion swarm algorithm

The lion swarm optimization (LSO) algorithm is a swarm intelligence method modeled on the division of labor within a lion pride. The population is split into one lion king (the current global best individual), a group of adult lionesses (a fraction beta of the pride), and cubs. In each generation a lioness hunts cooperatively with a randomly chosen partner; each cub, with equal probability, either feeds near the lion king, follows a randomly chosen companion, or is driven from the pride toward the mirror position of the current best solution. The random disturbance applied to every move decays with the iteration count, so the search shifts gradually from exploration to exploitation. The update rules, as implemented in the code of section 3, are summarized below.
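In symbols (my notation; the rules are read directly off the code in section 3, where $g$ is the lion king's position, $p_i$ is lion $i$'s historical best, $x_m$ is a randomly chosen companion, $\delta \sim N(0,1)$, and $T$ is the iteration budget):

Lioness $i$, paired with a random partner $r \ne i$:

$$x_i \leftarrow \frac{x_i + x_r}{2}\left(1 + \alpha_f\,\delta\right), \qquad \alpha_f = 0.1\left(\overline{ub} - \overline{lb}\right)e^{-300\,t/T}$$

Cub $i$, with $q$ drawn uniformly from $[0,1]$:

$$x_i \leftarrow \frac{1 + \alpha\,\delta}{2}\times\begin{cases} g + p_i, & q \le 1/3 \\ x_m + p_i, & 1/3 < q < 2/3 \\ (ub + lb - g) + p_i, & \text{otherwise} \end{cases} \qquad \alpha = \frac{T - t}{T}$$

The mirror point $ub + lb - g$ sends the driven-away cub to the opposite side of the search space from the lion king, which preserves diversity late in the run.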
3. Code
%_________________________________________________________________%
% Lion swarm optimization (LSO)                                    %
%_________________________________________________________________%
function [Best_pos, Best_score, curve] = LSO(pop, Max_iter, lb, ub, dim, fobj)
beta = 0.5;              % proportion of adult lions in the pride
Nc = round(pop*beta);    % number of adult lions (lionesses)
Np = pop - Nc;           % number of cubs
if (max(size(ub)) == 1)  % expand scalar bounds to 1 x dim vectors
    ub = ub.*ones(1, dim);
    lb = lb.*ones(1, dim);
end
X0 = initialization(pop, dim, ub, lb);   % random initial population
X = X0;
% Compute the initial fitness values
fitness = zeros(1, pop);
for i = 1:pop
    fitness(i) = fobj(X(i,:));
end
[value, index] = min(fitness);
GBestF = value;          % global best fitness
GBestX = X(index,:);     % global best position
curve = zeros(1, Max_iter);
XhisBest = X;            % historical best position of each lion
fithisBest = fitness;    % historical best fitness of each lion
indexBest = index;
gbest = GBestX;
for t = 1:Max_iter
    % Disturbance factor for the lionesses
    stepf = 0.1*(mean(ub) - mean(lb));
    alphaf = stepf*exp(-30*t/Max_iter)^10;
    % Disturbance factor for the cubs' movement range
    alpha = (Max_iter - t)/Max_iter;
    % Lioness update: cooperate with a randomly chosen partner
    for i = 1:Nc
        index = i;
        while (index == i)
            index = randi(Nc);           % randomly select another lioness
        end
        X(i,:) = (X(i,:) + X(index,:)).*(1 + alphaf.*randn())./2;
    end
    % Cub update: three behaviors with equal probability
    for i = Nc+1:pop
        q = rand;
        if q <= 1/3                      % feed near the lion king
            X(i,:) = (gbest + XhisBest(i,:)).*(1 + alpha.*randn())/2;
        elseif q > 1/3 && q < 2/3        % follow a random companion
            indexT = i;
            while indexT == i
                indexT = randi(Nc) + pop - Nc;   % drawn from the last Nc lions
            end
            X(i,:) = (X(indexT,:) + XhisBest(i,:)).*(1 + alpha.*randn())/2;
        else                             % driven away: mirror position of gbest
            gbestT = ub + lb - gbest;
            X(i,:) = (gbestT + XhisBest(i,:)).*(1 + alpha.*randn())/2;
        end
    end
    % Boundary control
    for j = 1:pop
        for a = 1:dim
            if (X(j,a) > ub(a))
                X(j,a) = ub(a);
            end
            if (X(j,a) < lb(a))
                X(j,a) = lb(a);
            end
        end
    end
    % Recompute fitness
    for j = 1:pop
        fitness(j) = fobj(X(j,:));
    end
    % Update historical bests and the global best
    for j = 1:pop
        if (fitness(j) < fithisBest(j))
            XhisBest(j,:) = X(j,:);
            fithisBest(j) = fitness(j);
        end
        if (fitness(j) < GBestF)
            GBestF = fitness(j);
            GBestX = X(j,:);
            indexBest = j;
        end
    end
    % Lion king update: small random walk around the current best
    Temp = gbest.*(1 + randn().*abs(XhisBest(indexBest,:) - gbest));
    Temp(Temp > ub) = ub(Temp > ub);
    Temp(Temp < lb) = lb(Temp < lb);
    fitTemp = fobj(Temp);
    if (fitTemp < GBestF)
        GBestF = fitTemp;
        GBestX = Temp;
        X(indexBest,:) = Temp;
        fitness(indexBest) = fitTemp;
    end
    [value, index] = min(fitness);
    gbest = X(index,:);   % best individual of the current generation
    curve(t) = GBestF;
end
Best_pos = GBestX;
Best_score = curve(end);
end
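The function above calls initialization(pop, dim, ub, lb), a helper that is not listed in the post. Below is a sketch of that helper (only the signature comes from the call site; the uniform-random body is my assumption), together with a small driver that runs LSO on the sphere function. In the spirit of the article's title, fobj could just as well be the cross-validation loss of an SVM whose hyperparameters (e.g. BoxConstraint and KernelScale) are being tuned.

% --- initialization.m ---------------------------------------------------
% Sketch of the missing helper: uniform random sampling inside the bounds.
function X = initialization(pop, dim, ub, lb)
X = rand(pop, dim).*(ub - lb) + lb;   % implicit expansion, R2016b or later
end

% --- demo script (my test setup, not from the post) ----------------------
pop      = 30;                % pride size
Max_iter = 500;               % iteration budget
dim      = 10;                % problem dimension
lb       = -100;              % scalar bounds; LSO expands them to vectors
ub       = 100;
fobj     = @(x) sum(x.^2);    % sphere function, minimum 0 at the origin

[Best_pos, Best_score, curve] = LSO(pop, Max_iter, lb, ub, dim, fobj);
fprintf('Best fitness found: %g\n', Best_score);
semilogy(curve), xlabel('Iteration'), ylabel('Best fitness')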
5. References:
The book “MATLAB Neural Network 43 Case Analysis”