(1)newrb()

www.mathworks.com/help/deeple…

This function designs an approximate radial basis function (RBF) network. The call format is:

[net,tr]=newrb(P,T,GOAL,SPREAD,MN,DF)

Here P is the R*Q dimensional matrix formed by the Q input vectors, and T is the S*Q dimensional matrix formed by the Q target vectors. GOAL is the mean squared error goal, 0.0 by default; SPREAD is the spread (expansion speed) of the radial basis function, 1 by default; MN is the maximum number of neurons, Q by default; DF is the number of neurons added between two progress displays, 25 by default. The return value net is the resulting RBF network, and tr is the training record.

Creating an RBF network with newrb() is a process of repeated trials (as can be seen while the program runs). During creation, the number of neurons in the hidden layer is increased step by step until the output error of the network meets the preset goal.

(2)newrbe()

www.mathworks.com/help/deeple…

This function is used to design an exact RBF network. The call format is:

net=newrbe(P,T,SPREAD)

Here P is the R*Q dimensional matrix formed by the Q input vectors, T is the S*Q dimensional matrix formed by the Q target vectors, and SPREAD is the spread of the radial basis function, 1 by default.

Unlike newrb(), newrbe() designs a radial basis network quickly and with zero error on the design vectors.
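The reason the exact design has zero error on the design vectors is that it places one neuron on every training sample and solves a square linear system for the output weights, so the network interpolates the targets exactly. A minimal Python/NumPy sketch of this idea (the name `newrbe_sketch` is hypothetical and this omits the linear/bias layer that the toolbox adds, but the interpolation core is the same):

```python
import numpy as np

def radbas(n):
    # Gaussian radial basis transfer function: a = exp(-n^2)
    return np.exp(-n ** 2)

def newrbe_sketch(P, T, spread=1.0):
    """Exact RBF design: one neuron per sample, weights from a square solve."""
    P = np.atleast_2d(np.asarray(P, float))          # R x Q input matrix
    T = np.asarray(T, float).ravel()                 # Q targets
    b = 0.8326 / spread                              # same bias rule as newrb
    # Q x Q matrix of pairwise distances between training samples
    dist = np.sqrt(((P[:, :, None] - P[:, None, :]) ** 2).sum(axis=0))
    G = radbas(b * dist)                             # hidden-layer output matrix
    w = np.linalg.solve(G, T)                        # exact interpolation weights
    def net(X):
        # evaluate the network at new column vectors X
        X = np.atleast_2d(np.asarray(X, float))
        d = np.sqrt(((P[:, :, None] - X[:, None, :]) ** 2).sum(axis=0))
        return w @ radbas(b * d)
    return net
```

For distinct sample points the Gaussian kernel matrix G is positive definite, so the solve always succeeds; the trade-off versus newrb() is that the network uses as many neurons as there are samples.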

(3)radbas()

This function is the radial basis transfer function. The call formats are

A=radbas(N)

info=radbas(code)

Here N is the S*Q matrix of input (column) vectors and A is the returned matrix, corresponding to N element by element: each element of N is passed through the radial basis function to produce the corresponding element of A. info = radbas(code) returns information about the function depending on the value of code, including:

deriv — returns the name of the derivative function

name — returns the full name of the function

output — returns the output range

active — returns the active input range
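The transfer function itself is just the Gaussian a = exp(-n^2), applied element-wise. A one-line Python/NumPy equivalent (illustrative, not the toolbox function) makes its two key properties easy to check: it peaks at 1 when the input is 0, and falls to 0.5 at n = ±0.8326 — which is where the 0.8326/SPREAD bias rule in newrb/newrbe comes from:

```python
import numpy as np

def radbas(n):
    # a = radbas(n) = exp(-n^2); element-wise, peak 1 at n = 0,
    # and radbas(+-0.8326) is approximately 0.5
    return np.exp(-n ** 2)
```

So a neuron with bias b = 0.8326/spread responds at half strength to an input exactly `spread` away from its center.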

Example: using an exact radial basis network for nonlinear function regression:

```matlab
%% Clear environment variables
clc
clear

%% Generate input and output data
interval = 0.01;                 % sampling step
x1 = -1.5:interval:1.5;
x2 = -1.5:interval:1.5;
F = 20 + x1.^2 - 10*cos(2*pi*x1) + x2.^2 - 10*cos(2*pi*x2);

%% Build and train the network
% input [x1; x2], output F
net = newrbe([x1; x2], F);
ty = sim(net, [x1; x2]);

%% Plot the result
figure
plot3(x1, x2, F, 'rd');
hold on;
plot3(x1, x2, ty, 'b-.');
view(113, 4);
title('Visual check of the fitting effect of the exact RBF network');
xlabel('x1')
ylabel('x2')
zlabel('F')
grid on
```

Example: using an approximate RBF network to fit the same function:

```matlab
%% Clear environment variables
clc
clear

%% Generate training samples
ld = 400;                        % number of training samples
x = rand(2, ld);                 % 2*ld matrix of random inputs
x = (x - 0.5) * 1.5 * 2;         % map x into [-1.5, 1.5]
x1 = x(1, :);
x2 = x(2, :);
% network target output
F = 20 + x1.^2 - 10*cos(2*pi*x1) + x2.^2 - 10*cos(2*pi*x2);

%% Build the RBF network
% approximate RBF network; spread takes its default value
net = newrb(x, F);

%% Build test samples
interval = 0.1;
[i, j] = meshgrid(-1.5:interval:1.5);
row = size(i);
tx1 = i(:); tx1 = tx1';
tx2 = j(:); tx2 = tx2';
tx = [tx1; tx2];

%% Simulate with the trained RBF network
ty = sim(net, tx);

%% Draw 3-D comparison plots
% true function surface
interval = 0.1;
[x1, x2] = meshgrid(-1.5:interval:1.5);
F = 20 + x1.^2 - 10*cos(2*pi*x1) + x2.^2 - 10*cos(2*pi*x2);
subplot(1, 3, 1);
mesh(x1, x2, F);
zlim([0, 60]);
title('True function');
% network output surface
v = reshape(ty, row);
subplot(1, 3, 2);
mesh(i, j, v);
zlim([0, 60]);
title('RBF network result');
% error surface
subplot(1, 3, 3);
mesh(x1, x2, F - v);
zlim([0, 60]);
title('Error');
set(gcf, 'position', [300, 250, 900, 400])
```

Conclusion: the trained network approximates the nonlinear function F well. The error plot shows that the prediction error is larger near the edges of the data range. The book further points out that the fit is very good: with 100 hidden-layer neurons, the difference between the network output and the function values at the interpolation points is close to zero, showing that the network output approximates the function very well.