Characteristics of LSSVM

1) Like SVM, it solves the dual problem, but the QP problem of standard SVM is replaced by a set of linear equations (a consequence of the equality constraints in the optimization problem), which simplifies the solution process; it applies to both classification and regression in high-dimensional input spaces. 2) Training is essentially the solution of a linear matrix equation, which connects LSSVM with Gaussian processes, regularization networks, and Fisher discriminant analysis. 3) Sparse approximation (to overcome the loss of sparseness in the LSSVM solution) and robust regression (robust statistics) can be applied. 4) Bayesian inference can be applied. 5) It extends to unsupervised learning, e.g. kernel PCA or density-based clustering. 6) It extends to recurrent neural networks.
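Point 1) can be made concrete: training an LSSVM for regression reduces to solving one square linear (KKT) system instead of a QP. Below is a minimal NumPy sketch of that system; the function names, the toy data, and the parameter values are illustrative, not taken from the post's code.

```python
import numpy as np

def rbf_kernel(X1, X2, sig2=0.5):
    """RBF kernel matrix: K[i, j] = exp(-||x1_i - x2_j||^2 / (2*sig2))."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sig2))

def lssvm_fit(X, y, gam=10.0, sig2=0.5):
    """Solve the LSSVM KKT system (a linear system, not a QP):
        [ 0   1^T        ] [ b     ]   [ 0 ]
        [ 1   K + I/gam  ] [ alpha ] = [ y ]
    gam is the regularization constant, sig2 the RBF bandwidth."""
    n = len(y)
    K = rbf_kernel(X, X, sig2)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gam
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)      # one linear solve = the whole training step
    return sol[0], sol[1:]             # bias b, dual weights alpha

def lssvm_predict(X_new, X_train, b, alpha, sig2=0.5):
    """f(x) = sum_j alpha_j * K(x, x_j) + b."""
    return rbf_kernel(X_new, X_train, sig2) @ alpha + b
```

Note that, unlike standard SVM, every training point gets a nonzero `alpha` here, which is exactly the loss of sparseness that the sparse-approximation techniques in point 3) address.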


%=====================================================================
%% Initialization
clc
close all
clear
format long
tic
%=====================================================================
%% Import data
data = xlsread('1.xlsx');
[row, col] = size(data);
x = data(:, 1:col-1);
y = data(:, col);
set = 1;                        % number of test samples
row1 = row - set;               % number of training samples
train_x = x(1:row1, :);
train_y = y(1:row1, :);
test_x = x(row1+1:row, :);
test_y = y(row1+1:row, :);      % expected (forecast) output
train_x = train_x';
train_y = train_y';
test_x = test_x';
test_y = test_y';

%% Data normalization
[train_x, minx, maxx, train_yy, miny, maxy] = premnmx(train_x, train_y);
test_x = tramnmx(test_x, minx, maxx);
train_x = train_x';
train_yy = train_yy';
train_y = train_y';
test_x = test_x';
test_y = test_y';

%% Parameter initialization
eps = 10^(-6);

%% LSSVM parameters
type = 'f';                     % 'f' = function estimation (regression)
kernel = 'RBF_kernel';
proprecess = 'proprecess';
lb = [0.01 0.02];               % lower bounds of the optimized parameters
ub = [1000 100];                % upper bounds
dim = 2;                        % dimension = number of optimized parameters
SearchAgents_no = 20;           % number of search agents
Max_iter = 100;                 % maximum number of iterations
n = 10;                         % number of samples
A = 0.5;                        % loudness (constant or decreasing)
r = 0.5;                        % pulse rate (constant or decreasing)
% This frequency range determines the scalings
Qmin = 0;                       % frequency minimum
Qmax = 2;                       % frequency maximum
% Iteration parameters
tol = 10^(-10);                 % stop tolerance
Leader_pos = zeros(1, dim);
Leader_score = inf;             % change this to -inf for maximization problems

% Initialize the positions of the search agents
for i = 1:SearchAgents_no
    Positions(i, 1) = ceil(rand(1)*(ub(1)-lb(1)) + lb(1));
    Positions(i, 2) = ceil(rand(1)*(ub(2)-lb(2)) + lb(2));
    Fitness(i) = Fun(Positions(i, :), train_x, train_yy, type, kernel, ...
                     proprecess, miny, maxy, train_y, test_x, test_y);
    v(i, :) = rand(1, dim);
end
[fmin, I] = min(Fitness);
best = Positions(I, :);
Convergence_curve = zeros(1, Max_iter);
t = 0;                          % loop counter

% Start the iterations -- bat algorithm
% (the main optimization loop is omitted in the source listing)

%% Result analysis
plot(Convergence_curve, 'LineWidth', 2);
title(['Optimization algorithm fitness curve (c1 = ', num2str(Leader_pos(1)), ...
       ', c2 = ', num2str(Leader_pos(2)), ...
       ', termination generation = ', num2str(Max_iter), ')'], 'FontSize', 13);
xlabel('Generation');
ylabel('Error fitness');
bestc = Leader_pos(1);
bestg = Leader_pos(2);

RD = RD'
disp(['SVM prediction error = ', num2str(D)])
% figure
% plot(test_predict, ':og')
% hold on
% plot(test_y, '-*')
% legend('Predicted output', 'Expected output')
% title('Network forecast output', 'fontsize', 12)
% ylabel('Function output', 'fontsize', 12)
% xlabel('Sample', 'fontsize', 12)
figure
plot(train_predict, ':og')
hold on
plot(train_y, '-*')
ylabel('Function output', 'fontsize', 12)
xlabel('Sample', 'fontsize', 12)
toc                             % report elapsed time
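The listing above relies on MATLAB's legacy premnmx/tramnmx functions for min-max scaling to [-1, 1] (with postmnmx to map predictions back to original units). A minimal NumPy equivalent, assuming features are stored row-wise as in the MATLAB code, might look like this; the function names simply mirror the MATLAB ones.

```python
import numpy as np

def premnmx(p):
    """Scale each row of p to [-1, 1] (rows = features), like MATLAB's premnmx.
    Returns the scaled array plus the per-row min/max needed to rescale later."""
    minp = p.min(axis=1, keepdims=True)
    maxp = p.max(axis=1, keepdims=True)
    pn = 2 * (p - minp) / (maxp - minp) - 1
    return pn, minp, maxp

def tramnmx(p, minp, maxp):
    """Apply a previously computed scaling to new data, like tramnmx.
    Used on the test set so it shares the training set's scaling."""
    return 2 * (p - minp) / (maxp - minp) - 1

def postmnmx(pn, minp, maxp):
    """Undo the scaling, like postmnmx (e.g. to report predictions in original units)."""
    return (pn + 1) * (maxp - minp) / 2 + minp
```

Using the training set's `minp`/`maxp` for the test data (rather than recomputing them) is what prevents information from the test set leaking into preprocessing.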