

Support Vector Machine Classification in MATLAB: Examples and Theory

A linear support vector machine can classify a linearly separable sample set; in that case the problem can be solved quite well without a kernel function. A nonlinear support vector machine converts a low-dimensional nonlinear classification problem into a high-dimensional linear one and then solves it with the same techniques used for linear SVMs. Here a kernel function is needed: it lets the high-dimensional inner products be computed implicitly, avoiding the explosion in the number of dimensions that would otherwise occur when the nonlinear problem is mapped into a high-dimensional space and make the problem impossible to solve.
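
To make the kernel idea concrete, the short sketch below (illustrative only; the feature map phi and the two sample points are made up for this note) verifies numerically that the quadratic kernel k(x,z) = (x*z')^2 gives the same value as an ordinary inner product after explicitly mapping 2-D points into a 3-D quadratic feature space, so the high-dimensional product can be computed without ever constructing the high-dimensional vectors:

x = [1, 2];  z = [3, 0.5];                       % two arbitrary 2-D points
phi = @(v) [v(1)^2, sqrt(2)*v(1)*v(2), v(2)^2];  % explicit quadratic feature map R^2 -> R^3
k_explicit = dot(phi(x), phi(z))                 % inner product in the 3-D feature space
k_kernel   = (x*z')^2                            % same value from the kernel, no mapping needed

Both expressions evaluate to 16 for these points, which is what lets the kernel-based formulations below work entirely in the original two dimensions.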

Level 0: Examples of solving classification problems with MATLAB's SVM functions

0.1 Linear Classification

% Two-dimensional linear SVM problem, two classes, separable situation
% Method from Christopher J. C. Burges,
% "A Tutorial on Support Vector Machines for Pattern Recognition"
% Optimizing ||W|| directly:
% Objective:   min (1/2)*||W||^2
% Subject to:  yi*(xi*W + b) - 1 >= 0,   equation (12)

clear all;
close all;
clc;

sp = [3,7; 6,6; 4,6; 5,6.5]          % positive sample points
nsp = size(sp,1);                    % number of positive samples
sn = [1,2; 3,5; 7,3; 3,4; 6,2.7]     % negative sample points
nsn = size(sn,1)                     % number of negative samples
sd = [sp; sn]                        % all training data
lsd = [true true true true false false false false false]   % labels: true = positive class
Y = nominal(lsd)                     % grouping variable for svmtrain

figure(1);
subplot(1,2,1)
plot(sp(1:nsp,1), sp(1:nsp,2), 'm+');
hold on
plot(sn(1:nsn,1), sn(1:nsn,2), 'c*');
subplot(1,2,2)
svmStruct = svmtrain(sd, Y, 'showplot', true);
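
Note that svmtrain and svmclassify belong to the older Statistics Toolbox interface; in newer MATLAB releases they were superseded by fitcsvm and predict and eventually removed. A rough sketch of the same linear example with the newer interface, reusing sd and lsd from the code above (an assumption-based translation, not part of the original text):

mdl  = fitcsvm(sd, lsd(:));     % linear kernel is the default for fitcsvm
pred = predict(mdl, sd);        % classify the training points themselves
nCorrect = sum(pred == lsd(:))  % number of correctly classified training points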

0.2 Nonlinear Classification (Quadratic Kernel)

clear all;
close all;
clc;

sp = [3,7; 6,6; 4,6; 5,6.5]                   % positive sample points
nsp = size(sp,1);                             % number of positive samples
sn = [1,2; 3,5; 7,3; 3,4; 6,2.7; 4,3; 2,7]    % negative sample points
nsn = size(sn,1)                              % number of negative samples
sd = [sp; sn]                                 % all training data
lsd = [true true true true false false false false false false false]   % 4 positive, 7 negative labels
Y = nominal(lsd)

figure(1);
subplot(1,2,1)
plot(sp(1:nsp,1), sp(1:nsp,2), 'm+');
hold on
plot(sn(1:nsn,1), sn(1:nsn,2), 'c*');
subplot(1,2,2)
% svmStruct = svmtrain(sd, Y, 'Kernel_Function', 'linear', 'showplot', true);
svmStruct = svmtrain(sd, Y, 'Kernel_Function', 'quadratic', 'showplot', true);

% use the trained SVM (svmStruct) to classify the data
RD = svmclassify(svmStruct, sd, 'showplot', true)   % RD is the classification result vector
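
As a quick sanity check (not part of the original example), the predicted labels in RD can be compared with the true labels; RD is a nominal vector like Y, so the comparison can be done elementwise:

err = sum(RD ~= Y(:)) / numel(Y)   % resubstitution error: fraction of training points misclassified

For this small, cleanly separated data set, err should come out as 0 when the quadratic boundary separates the two groups.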

0.3 Gaussian Kernel Classification

clear all;
close all;
clc;

sp = [5,4.5; 3,7; 6,6; 4,6; 5,6.5]            % positive sample points
nsp = size(sp,1);                             % number of positive samples
sn = [1,2; 3,5; 7,3; 3,4; 6,2.7; 4,3; 2,7]    % negative sample points
nsn = size(sn,1)                              % number of negative samples
sd = [sp; sn]                                 % all training data
lsd = [true true true true true false false false false false false false]   % 5 positive, 7 negative labels
Y = nominal(lsd)

figure(1);
subplot(1,2,1)
plot(sp(1:nsp,1), sp(1:nsp,2), 'm+');
hold on
plot(sn(1:nsn,1), sn(1:nsn,2), 'c*');
subplot(1,2,2)
svmStruct = svmtrain(sd, Y, 'Kernel_Function', 'rbf', 'rbf_sigma', 0.6, 'method', 'SMO', 'showplot', true);
% svmStruct = svmtrain(sd, Y, 'Kernel_Function', 'quadratic', 'showplot', true);

% use the trained SVM (svmStruct) to classify the data
RD = svmclassify(svmStruct, sd, 'showplot', true)   % RD is the classification result vector

% retrain with a smaller kernel width: a smaller rbf_sigma gives a tighter, more curved boundary
svmtrain(sd, Y, 'Kernel_Function', 'rbf', 'rbf_sigma', 0.2, 'method', 'SMO', 'showplot', true);
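
The two svmtrain calls above differ only in rbf_sigma, the width of the Gaussian kernel. As an illustration of how the kernel width affects the fit (a sketch; the candidate sigma values are arbitrary and not from the original text), one can loop over a few widths and report the resubstitution error of each model:

for sigma = [0.2 0.6 1.0 2.0]
    s  = svmtrain(sd, Y, 'Kernel_Function', 'rbf', 'rbf_sigma', sigma, 'method', 'SMO');
    rd = svmclassify(s, sd);
    fprintf('rbf_sigma = %.1f  training error = %.2f\n', sigma, sum(rd ~= Y(:))/numel(Y));
end

Training error alone always favors small sigma, so a fair comparison of kernel widths would use a held-out test set or cross-validation instead.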

0.4 The svmtrain Function

svmtrain Train a support vector machine classifier

SVMSTRUCT = svmtrain(TRAINING, Y) trains a support vector machine (SVM) classifier on data taken from two groups. TRAINING is a numeric matrix of predictor data. Rows of TRAINING correspond to observations; columns correspond to features. Y is a column vector that contains the known class labels for TRAINING.
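
As a minimal illustration of this calling convention (the data below is made up for this note), Y can be given, for example, as a cell array of strings with one label per row of TRAINING:

X = [0 0; 0 1; 1 0; 1 1; 2 2; 2 3];   % 6 observations, 2 features
y = {'a'; 'a'; 'a'; 'b'; 'b'; 'b'};   % one class label per observation
s = svmtrain(X, y);                   % linear kernel by default
svmclassify(s, [0.2 0.1; 2.5 2.5])    % should come back as 'a' then 'b' for this toy data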