How to get the predicted values on the training data set for Least Squares Support Vector Regression
I am making predictions with least squares support vector machine regression, as proposed by Suykens et al., using LS-SVMlab (the MATLAB toolbox can be found here). Let's say we have an independent variable x and a dependent variable y, both simulated. Following the instructions in the tutorial:
>>x = linspace(-1,1,50)';
>>y = (15*(x.^2-1).^2.*x.^4).*exp(-x)+normrnd(0,0.1,length(x),1);
>>type = 'function estimation';
>>[gam,sig2] = tunelssvm({x,y,type,[],[],'RBF_kernel'},'simplex',...
      'leaveoneoutlssvm',{'mse'});
>>[alpha,b] = trainlssvm({x,y,type,gam,sig2,'RBF_kernel'});
>>plotlssvm({x,y,type,gam,sig2,'RBF_kernel'},{alpha,b});
The code above finds the best parameters using the simplex method and leave-one-out cross-validation, trains the model, and gives me the alphas (the support values for the data points in the training set) and the b coefficient. However, it does not give me the predictions of the variable y; it only draws a plot. In articles I have seen plots like the one below.
As I said before, the LS-SVM toolbox does not give me the predicted values of y; it draws the plot but leaves no values in the workspace. How can I get these values and draw a graph of the predicted values against the actual values?
There is one solution I can think of: using the x values in the training set, re-run the model and obtain the predictions of y with the simlssvm command. That does not seem reasonable to me, though. Is there another solution you can offer? Thanks in advance.
I am afraid I have answered my own question. The way to obtain predictions for the training points in LS-SVMlab is to simulate the training points after training the model.
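For reference, here is a minimal sketch of that approach, assuming the toolbox is on the path and that x, y, type, gam, sig2, alpha and b from the training step above are still in the workspace; simlssvm evaluates the trained model at whatever inputs you pass in:

>>Yhat = simlssvm({x,y,type,gam,sig2,'RBF_kernel'},{alpha,b},x); % predictions at the training inputs
>>plot(x,y,'bo'); hold on;                                       % actual observations
>>plot(x,Yhat,'r-');                                             % predicted values
>>legend('actual y','predicted y');

The same simlssvm call with a matrix of new inputs in place of x gives out-of-sample predictions.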