Neural network – support vector machine
The Support Vector Machine (SVM) was first proposed by Cortes and Vapnik in 1995. It offers unique advantages for small-sample, nonlinear, and high-dimensional pattern recognition, and it generalizes to other machine learning problems such as function fitting.
First, the artificial bee colony algorithm
Inspired by the organized foraging behavior of honeybee colonies, Karaboga proposed the Artificial Bee Colony (ABC) algorithm, which mimics the foraging process of bees to solve multi-dimensional, multi-modal optimization problems. The algorithm was originally used to find the minima of the Sphere, Rosenbrock, and Rastrigin functions. It is based on the behavioral characteristics of a bee swarm during foraging. In Figure 1 there are two discovered food sources, A and B. At the start, a potential worker bee begins searching as an unemployed bee; it knows nothing about nectar sources near the hive, so it has two possible choices: (1) become a scout bee and spontaneously search the area around the hive, driven by internal or external cues (see S in Figure 1); (2) after watching a waggle dance, become a recruit and start searching for a nectar source (see R in Figure 1). After locating a nectar source, the bee memorizes its location and immediately begins exploiting it; it has now become an employed bee. After collecting nectar, the employed bee returns from the source to the hive and unloads the nectar into the honey chamber. After unloading, the employed bee has three options: (1) abandon the source it has been exploiting and become an uncommitted follower recruited by other waggle dancers (UF); (2) perform the waggle dance to recruit hive-mates and return to the same food source (EF1); (3) continue exploiting the source without recruiting other bees (EF2).
Second, the algorithm process
The artificial bee colony algorithm consists of four successive phases: the initialization phase, the employed (leading) bee phase, the onlooker (following) bee phase, and the scout bee phase. The algorithm divides the artificial colony into three categories: employed bees, onlooker bees, and scout bees. In each search cycle, the employed bees and then the onlooker bees exploit the food sources, i.e., search for the optimal solution, while the scout bees watch for stagnation in a local optimum; if a source is trapped in a local optimum, they randomly search for other possible food sources. Each food source represents a candidate solution to the problem, and the nectar amount of a food source corresponds to the quality of that solution (its fitness value fit_i).
1. Initialization phase
2. Employed (leading) bee phase
3. Onlooker (following) bee phase
4. Scout bee phase
5. Food sources
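As a sketch, the update rules behind these phases (following Karaboga's standard formulation, with symbols matching the variable names used in the code section) are:

Initialization:    x_ij = VarMin + rand(0,1) * (VarMax - VarMin)
Candidate source:  v_ij = x_ij + phi_ij * (x_ij - x_kj),  phi_ij ~ U(-a, a),  k ≠ i
Selection prob.:   P_i = fit_i / sum_n(fit_n),  where this implementation uses fit_i = exp(-f_i / mean(f))
Scout rule:        if the trial counter C_i exceeds the abandonment limit L, re-initialize x_i at random

Here f_i is the cost of source i, and the candidate v_i replaces x_i only if it is no worse (greedy selection).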
Third, the code
clc;
clear;
close all;
%% Problem Definition
CostFunction=@(x) Sphere(x); % Cost Function (Sphere.m must be available on the MATLAB path)
nVar=5; % Number of Decision Variables
VarSize=[1 nVar]; % Decision Variables Matrix Size
VarMin=-10; % Decision Variables Lower Bound
VarMax= 10; % Decision Variables Upper Bound
%% ABC Settings
MaxIt=200; % Maximum Number of Iterations
nPop=100; % Population Size (Colony Size)
nOnlooker=nPop; % Number of Onlooker Bees
L=round(0.6*nVar*nPop); % Abandonment Limit Parameter (Trial Limit)
a=1; % Acceleration Coefficient Upper Bound
%% Initialization
% Empty Bee Structure
empty_bee.Position=[];
empty_bee.Cost=[];
% Initialize Population Array
pop=repmat(empty_bee,nPop,1);
% Initialize Best Solution Ever Found
BestSol.Cost=inf;
% Create Initial Population
for i=1:nPop
pop(i).Position=unifrnd(VarMin,VarMax,VarSize);
pop(i).Cost=CostFunction(pop(i).Position);
if pop(i).Cost<=BestSol.Cost
BestSol=pop(i);
end
end
% Abandonment Counter
C=zeros(nPop,1);
% Array to Hold Best Cost Values
BestCost=zeros(MaxIt,1);
%% ABC Main Loop
for it=1:MaxIt
% Recruited Bees
for i=1:nPop
% Choose k randomly, not equal to i
K=[1:i-1 i+1:nPop];
k=K(randi([1 numel(K)]));
% Define Acceleration Coeff.
phi=a*unifrnd(-1,+1,VarSize);
% New Bee Position
newbee.Position=pop(i).Position+phi.*(pop(i).Position-pop(k).Position);
% Evaluation
newbee.Cost=CostFunction(newbee.Position);
% Comparison
if newbee.Cost<=pop(i).Cost
pop(i)=newbee;
else
C(i)=C(i)+1;
end
end
% Calculate Fitness Values and Selection Probabilities
F=zeros(nPop,1);
MeanCost = mean([pop.Cost]);
for i=1:nPop
F(i) = exp(-pop(i).Cost/MeanCost); % Convert Cost to Fitness
end
P=F/sum(F);
% Onlooker Bees
for m=1:nOnlooker
% Select Source Site
i=RouletteWheelSelection(P); % RouletteWheelSelection.m must be available on the MATLAB path
% Choose k randomly, not equal to i
K=[1:i-1 i+1:nPop];
k=K(randi([1 numel(K)]));
% Define Acceleration Coeff.
phi=a*unifrnd(-1,+1,VarSize);
% New Bee Position
newbee.Position=pop(i).Position+phi.*(pop(i).Position-pop(k).Position);
% Evaluation
newbee.Cost=CostFunction(newbee.Position);
% Comparison
if newbee.Cost<=pop(i).Cost
pop(i)=newbee;
else
C(i)=C(i)+1;
end
end
% Scout Bees
for i=1:nPop
if C(i)>=L
pop(i).Position=unifrnd(VarMin,VarMax,VarSize);
pop(i).Cost=CostFunction(pop(i).Position);
C(i)=0;
end
end
% Update Best Solution Ever Found
for i=1:nPop
if pop(i).Cost<=BestSol.Cost
BestSol=pop(i);
end
end
% Store Best Cost Ever Found
BestCost(it)=BestSol.Cost;
% Display Iteration Information
disp(['Iteration ' num2str(it) ': Best Cost = ' num2str(BestCost(it))]);
end
%% Results
figure;
%plot(BestCost,'LineWidth',2);
semilogy(BestCost,'LineWidth',2);
xlabel('Iteration');
ylabel('Best Cost');
grid on;
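The script above calls two helper functions that are not listed in the post: Sphere and RouletteWheelSelection. A minimal sketch of each, assuming the conventional definitions used with this style of ABC demo (save them as Sphere.m and RouletteWheelSelection.m, one function per file):

```matlab
% Sphere.m - sum-of-squares benchmark function
function z = Sphere(x)
    z = sum(x.^2);
end

% RouletteWheelSelection.m - pick an index i with probability P(i)
function i = RouletteWheelSelection(P)
    r = rand;             % uniform random number in (0,1)
    C = cumsum(P);        % cumulative probabilities
    i = find(r <= C, 1);  % first bin whose cumulative sum covers r
end
```

Note that unifrnd used in the main script requires the Statistics and Machine Learning Toolbox; VarMin + (VarMax-VarMin)*rand(VarSize) is an equivalent base-MATLAB replacement.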
Fourth, references:
The book “MATLAB Neural Network 43 Case Analysis”