I. Introduction to EMD and SVM

Time series prediction arranges the historical data of a prediction target into a time series in chronological order, analyzes its trend over time, and extrapolates future values. It has important applications in economics, finance, industry, biomedicine and other fields, especially for non-stationary and nonlinear series. At present, regression models and neural networks are commonly used for time series prediction, but such single prediction methods struggle to produce accurate and effective predictions under conditions of scarce and uncertain information. This makes it necessary to find a prediction method with strong robustness, high prediction accuracy and practical value that matches the characteristics of the series' fluctuation trend. Empirical Mode Decomposition (EMD) can adaptively decompose a signal into different Intrinsic Mode Functions (IMFs) according to the signal's own intrinsic characteristics, making it an effective analysis method for non-stationary and nonlinear signals. In this paper, a new hybrid intelligent prediction model based on EMD and Support Vector Regression (SVR) is proposed, which provides a good solution to the prediction of non-stationary and nonlinear time series. The accuracy and validity of the model are demonstrated with an example of textile material price prediction.

2 Theoretical basis

2.1 Empirical mode decomposition

EMD can decompose any signal into the sum of several IMFs and a residue [3-4]. An IMF is a function or signal that satisfies two conditions: (1) over the whole data series, the number of extreme points (maxima and minima) and the number of zero crossings must be equal, or differ by at most one; (2) at any point, the mean of the upper envelope determined by the local maxima and the lower envelope determined by the local minima is zero. The steps of EMD are as follows:

(1) Suppose the signal is x(t), and let m(t) be the sequence formed by the mean of its upper and lower envelopes; then

h(t) = x(t) - m(t). (1)

For nonlinear and non-stationary data, a single sifting pass is generally not enough to produce an IMF, and some asymmetric waves still remain. Take h(t) as the data to be processed and repeat the above operation k times, obtaining

h_k(t) = h_{k-1}(t) - m_{k-1}(t). (2)

When h_k(t) satisfies the IMF conditions, the first IMF is obtained, denoted c_1(t) = h_k(t).

(2) Separate the first IMF from the signal to obtain the residual signal r_1(t):

r_1(t) = x(t) - c_1(t). (3)

Treating r_1(t) as a new signal and repeating the procedure yields the remaining IMFs, until the residue becomes monotonic.

2.2 Support vector regression

SVR was introduced by Vapnik et al. in 1995. The algorithm has many advantages, such as small-sample learning, global optimization and strong generalization ability. It has been successfully applied in many fields, such as traffic-flow and wind-speed prediction, where it shows better performance than traditional artificial neural networks such as the multilayer perceptron.

3 Hybrid intelligent prediction model

When a single SVR is used for prediction, the same time series yields different prediction results with different kernel functions. This paper therefore proposes a hybrid intelligent prediction model based on EMD and SVR, which markedly improves the prediction results. Its basic calculation flow is: decompose the original series with EMD into several IMFs and a residue, predict each component with its own SVR, and sum the component predictions to obtain the final forecast.
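The sifting procedure above can be sketched in Python. This is a minimal illustration under stated assumptions, not a production decomposition: scipy cubic splines through the extrema form the envelopes, and a fixed sift count stands in for a formal IMF stopping criterion.

```python
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import CubicSpline

def sift_once(x, t):
    """One sifting step: h(t) = x(t) - m(t), with m(t) the envelope mean."""
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    if len(maxima) < 2 or len(minima) < 2:
        return None  # too few extrema to build spline envelopes
    upper = CubicSpline(t[maxima], x[maxima])(t)  # upper envelope
    lower = CubicSpline(t[minima], x[minima])(t)  # lower envelope
    m = (upper + lower) / 2.0                     # local mean m(t)
    return x - m

def emd(x, t, max_imfs=4, n_sifts=10):
    """Crude EMD: extract IMFs one by one, leaving a residue."""
    imfs, r = [], x.copy()
    for _ in range(max_imfs):
        if sift_once(r, t) is None:  # residue (near-)monotonic: stop
            break
        h = r.copy()
        for _ in range(n_sifts):     # fixed sift count, not a formal criterion
            h_new = sift_once(h, t)
            if h_new is None:
                break
            h = h_new
        imfs.append(h)               # c_k(t)
        r = r - h                    # r_k(t) = r_{k-1}(t) - c_k(t)
    return imfs, r

t = np.linspace(0, 1, 500)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 25 * t)
imfs, residue = emd(x, t)
print(len(imfs), np.allclose(sum(imfs) + residue, x))
```

By construction the IMFs and residue sum back exactly to the original signal, which is what makes the component-wise prediction of the hybrid model possible.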

II. Some source code

%% Clear the environment
tic; close all; clear; clc; format compact;

%% Data extraction and preprocessing
tsx = xlsread('001.xlsx','A1:H406'); % input features
ts  = xlsread('001.xlsx','J1:J406'); % prediction target
ts  = ts';  % transpose the data into row format
tsx = tsx';

% Normalize the data to the range [1,2]
[TS,TSps]   = mapminmax(ts,1,2);
[TSX,TSXps] = mapminmax(tsx,1,2);

% Shuffle the samples and split into training (350) and test sets
rand('seed',0);
[~,n] = sort(rand(1,length(TS)));
m = 350;
TSX1 = TSX(:,n(1:m))';     % training inputs
TS1  = TS(:,n(1:m))';      % training targets
TSX2 = TSX(:,n(m+1:end))'; % test inputs
TS2  = TS(:,n(m+1:end))';  % test targets

%% Select the best SVM parameters c and g for regression prediction
kerneltype = 1; % 0 = linear, 1 = polynomial, 2 = RBF, 3 = sigmoid
[bestmse,bestc,bestg] = SVMcgForRegress(TS1,TSX1,-4,4,-4,4,3,0.5,0.5,0.05,kerneltype);
disp('Print selection results');
str = sprintf('Best Cross Validation MSE = %g Best c = %g Best g = %g',bestmse,bestc,bestg);
disp(str);

%% Train the SVM using the best parameters from the grid search
cmd = ['-c ',num2str(bestc),' -g ',num2str(bestg),' -s 3 -p 0.01 -d 1'];
model = svmtrain(TS1,TSX1,cmd);

function [mse,bestc,bestg] = SVMcgForRegress(train_label,train,cmin,cmax,gmin,gmax,v,cstep,gstep,msestep,kerneltype)
% Select the penalty parameter c and kernel parameter g by grid search
% with cross validation.
% cmin,cmax   : range of c (as powers of 2)
% gmin,gmax   : range of g (as powers of 2)
% v           : number of cross-validation folds
% cstep,gstep : grid step sizes
% msestep     : error step, used only to make the final contour plot look better
% kerneltype  : 0 = linear, 1 = polynomial, 2 = RBF, 3 = sigmoid
[X,Y] = meshgrid(cmin:cstep:cmax, gmin:gstep:gmax);
[m,n] = size(X);
cg = zeros(m,n);
eps = 10^(-4);

bestc = 0;
bestg = 0;
mse = Inf;
basenum = 2;
for i = 1:m
    for j = 1:n
        % For each (c,g) pair run v-fold cross validation and keep the
        % pair with the lowest MSE
        cmd = ['-v ',num2str(v),' -c ',num2str(basenum^X(i,j)),' -g ',num2str(basenum^Y(i,j)),' -s 3 -t ',num2str(kerneltype),' -p 0.1 -d 1'];
        cg(i,j) = svmtrain(train_label, train, cmd);

        if cg(i,j) < mse
            mse = cg(i,j);
            bestc = basenum^X(i,j);
            bestg = basenum^Y(i,j);
        end

        % Prefer a smaller c when the MSE is (almost) tied
        if abs(cg(i,j)-mse) <= eps && bestc > basenum^X(i,j)
            mse = cg(i,j);
            bestc = basenum^X(i,j);
            bestg = basenum^Y(i,j);
        end
    end
end
end
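The same powers-of-2 grid search that SVMcgForRegress performs can be sketched in Python with scikit-learn. The toy data below is an assumption standing in for the normalized price series, and a unit exponent step (instead of the 0.5 step used above) keeps the run short.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Toy regression data standing in for the normalized textile-price series
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 8))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

# Powers-of-2 grid for C and gamma, as in SVMcgForRegress (basenum = 2)
param_grid = {
    "C": 2.0 ** np.arange(-4, 5),
    "gamma": 2.0 ** np.arange(-4, 5),
}

# 3-fold cross validation scored by (negative) MSE, mirroring the '-v 3' call
search = GridSearchCV(SVR(kernel="rbf", epsilon=0.1), param_grid,
                      cv=3, scoring="neg_mean_squared_error")
search.fit(X, y)
print(search.best_params_, -search.best_score_)
```

As in the MATLAB version, the model is then refit on the full training set with the winning (C, gamma) pair, which GridSearchCV does automatically (`search.best_estimator_`).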

III. Operation results

IV. MATLAB version and references

MATLAB version: R2014a

[1] Yang Baoyang, Yu Jizhou, Yang Shan. Intelligent Optimization Algorithm and Its MATLAB Example (2nd Edition) [M]. Publishing House of Electronics Industry, 2016.
[2] Zhang Yan, Wu Shuigen. MATLAB Optimization Algorithm Source Code [M]. Tsinghua University Press, 2017.
[3] Zhou Pin. Design and Application of MATLAB Neural Network [M]. Tsinghua University Press, 2013.
[4] Chen Ming. MATLAB Neural Network Principle and Examples of Fine Solution [M]. Tsinghua University Press, 2013.
[5] Fang Qingcheng. MATLAB R2016a Neural Network Design and Application of 28 Cases Analysis [M]. Tsinghua University Press, 2018.
[6] Wang Wei, Zhao Hong, Liang Zhaohui, Ma Tao. Hybrid intelligent forecasting model based on EMD and SVR and its empirical research [J]. Computer Engineering and Applications, 2012, 48(04).