I. Introduction
Particle swarm optimization is rooted in the theory of Complex Adaptive Systems (CAS), which was formally put forward in 1994. Members of a CAS are called agents; in a bird flock, for example, each bird is an agent. An agent is adaptive: it communicates with the environment and with other agents, and through that communication it "learns" or "accumulates experience" and changes its structure and behavior accordingly. The evolution of the whole system includes the creation of new levels (the birth of new agents), the emergence of differentiation and diversity (a flock breaking up into many small groups), and the emergence of new themes (birds constantly discovering new food sources as they search). The agents in a CAS therefore have four basic characteristics, and these characteristics underpin the development of the particle swarm optimization algorithm. First, agents are active and autonomous. Second, agents influence and interact with the environment and with one another, and this interaction is the main driving force of the system's development and change. Third, the influence of the environment is macroscopic while the influence between agents is microscopic, and the two must be combined organically. Finally, the whole system may be affected by random factors. Particle swarm optimization grew out of the study of one such CAS: the social system of bird flocks. Particle Swarm Optimization (PSO) was first proposed by Eberhart and Kennedy in 1995, and its basic concept derives from the study of the foraging behavior of birds. Consider this scenario: a flock of birds searches randomly for food, and there is only one piece of food in the area. None of the birds know where the food is, but each bird knows how far its current position is from the food. What, then, is the optimal strategy for finding the food? The simplest and most effective method is to search the area around the bird that is currently closest to the food.
The PSO algorithm is inspired by these behavioral characteristics and is used to solve optimization problems. In PSO, each potential solution of an optimization problem can be imagined as a point in a D-dimensional search space, called a particle. Every particle has a fitness value determined by the objective function, as well as a velocity that determines the direction and distance of its flight. The particles then follow the current optimal particle through the solution space. Reynolds' study of bird flight found that each bird tracks only a limited number of its neighbors, yet the overall result is that the whole flock appears to be under the control of a single center: complex global behavior arises from the interaction of simple rules.
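The individual-best and global-best update described above can be sketched in a few lines. The following is a minimal Python illustration of a standard PSO velocity and position update; the objective function, bounds, and coefficients (inertia weight w, learning factors c1 and c2) are chosen for the example and are not taken from this article:

```python
import random

random.seed(1)  # for reproducibility of the example

def pso(f, dim=2, n=20, iters=100, lo=-50.0, hi=50.0, w=0.7, c1=1.5, c2=1.5):
    """Minimize f over [lo, hi]^dim with a basic particle swarm."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]                    # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # velocity: inertia + pull toward own best + pull toward global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # position update, clamped to the search bounds
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:                 # individual extremum update
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:                # group extremum update
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Usage: minimize the sphere function; the swarm converges toward the origin.
best, best_val = pso(lambda p: sum(x * x for x in p))
```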
II. Source code
%% Dynamic particle swarm optimization
%% Clear the environment
clear
clc
%% Set bimodal landscape parameters
% Peak 1 parameters
X1=25;
Y1=25;
H1=410;
% Peak 2 parameters (height and position change over time)
H2=zeros(1,1200);
i=1:200;
H2(i)=450-fix(i/5);
i=201:700;
H2(i)=410;
i=701:1000;
H2(i)=350+fix((i-500)/10)*3;
i=1001:1200;
H2(i)=500;
X2=zeros(1,1200);
Y2=zeros(1,1200);
i=1:1200;
Y2(i)=-25;
i=1:500;
X2(i)=-25+(i-1)*25/500;
i=501:1000;
X2(i)=0;
i=1001:1200;
X2(i)=(i-1000)*25/200;
%% Initialize particles and sensitive particles
% Population size
n=20;
% Particles and sensitive particles
pop=unidrnd(501,[n,2]);
popTest=unidrnd(501,[5*n,2]);
% DF1function (defined elsewhere) returns a handle h(x,y) to the current
% bimodal landscape
h=DF1function(X1,Y1,H1,X2(1),Y2(1),H2(1));
V=unidrnd(100,[n,2])-50;
Vmax=25;
Vmin=-25;
% Fitness values of particles and sensitive particles
fitness=zeros(1,n);
fitnessTest=zeros(1,5*n);    % sized 5*n to match popTest
for i=1:n
    fitness(i)=h(pop(i,1),pop(i,2));
end
for i=1:5*n
    fitnessTest(i)=h(popTest(i,1),popTest(i,2));
end
oFitness=sum(fitnessTest);   % sensitive-particle baseline
[value,index]=max(fitness);
popgbest=pop;                % individual best positions
popzbest=pop(index,:);       % global best position
fitnessgbest=fitness;
fitnesszbest=fitness(index);
%% Algorithm parameters
vmax=10;
vmin=-10;
popMax=501;
popMin=1;
m=2;
nFitness=oFitness;
Tmax=100;                    % iterations per environment step
fitnessRecord=zeros(1,1200);
%% Optimization loop
for k=1:1200
    h=DF1function(X1,Y1,H1,X2(k),Y2(k),H2(k));
    % Re-evaluate the sensitive particles
    for i=1:5*n
        fitnessTest(i)=h(popTest(i,1),popTest(i,2));
    end
    oFitness=sum(fitnessTest);
    % If the change exceeds the threshold, re-initialize half the swarm
    if abs(oFitness-nFitness)>1
        index=randperm(n);
        pop(index(1:10),:)=unidrnd(501,[10,2]);
        V(index(1:10),:)=unidrnd(100,[10,2])-50;
    end
    % Particle search
    for i=1:Tmax
        for j=1:n
            % Velocity update
            V(j,:)=V(j,:)+floor(rand*(popgbest(j,:)-pop(j,:)))+floor(rand*(popzbest-pop(j,:)));
            index1=find(V(j,:)>Vmax);
            V(j,index1)=Vmax;
            index2=find(V(j,:)<Vmin);
            V(j,index2)=Vmin;
            % Position update
            pop(j,:)=pop(j,:)+V(j,:);
            index1=find(pop(j,:)>popMax);
            pop(j,index1)=popMax;
            index2=find(pop(j,:)<popMin);
            pop(j,index2)=popMin;
            fitness(j)=h(pop(j,1),pop(j,2));
            % Individual extremum update
            if fitness(j)>fitnessgbest(j)
                popgbest(j,:)=pop(j,:);
                fitnessgbest(j)=fitness(j);
            end
            % Group extremum update
            if fitness(j)>fitnesszbest
                popzbest=pop(j,:);
                fitnesszbest=fitness(j);
            end
        end
    end
    fitnessRecord(k)=fitnesszbest;
    % Reset the extrema so the next environment step starts fresh
    fitnesszbest=0;
    fitnessgbest=zeros(1,n);
end
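The dynamic-response mechanism in the listing, re-evaluating a fixed set of "sensitive" probe points each environment step and re-initializing half the swarm when their total fitness shifts past a threshold, can be sketched independently of MATLAB. This is a minimal Python illustration; the function names, the toy landscapes, and the threshold are all chosen for the example:

```python
import random

def detect_change(f, probes, baseline, threshold=1.0):
    """Re-evaluate fixed probe points; report whether the landscape moved."""
    total = sum(f(p) for p in probes)
    return abs(total - baseline) > threshold, total

def reinitialize_half(pop, vel, lo, hi):
    """Randomly re-seed half of the swarm, as the listing does on a change."""
    idx = random.sample(range(len(pop)), len(pop) // 2)
    for i in idx:
        pop[i] = [random.uniform(lo, hi), random.uniform(lo, hi)]
        vel[i] = [0.0, 0.0]
    return pop, vel

# Usage: a toy landscape whose peak jumps between environment steps.
random.seed(2)
probes = [[random.uniform(0, 500), random.uniform(0, 500)] for _ in range(100)]
f_old = lambda p: -((p[0] - 100) ** 2 + (p[1] - 100) ** 2)   # peak at (100,100)
f_new = lambda p: -((p[0] - 400) ** 2 + (p[1] - 400) ** 2)   # peak at (400,400)
_, baseline = detect_change(f_old, probes, 0.0)
changed, _ = detect_change(f_new, probes, baseline)

pop = [[random.uniform(0, 500), random.uniform(0, 500)] for _ in range(20)]
vel = [[0.0, 0.0] for _ in range(20)]
if changed:
    pop, vel = reinitialize_half(pop, vel, 0, 500)
```

Because the probes stay fixed while the landscape moves underneath them, a change in their summed fitness signals the move without spending any extra evaluations on the swarm itself.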
III. Operation results
IV. Notes
MATLAB version: R2014a