Linear SVM decision function

In this case the decision function is a simple linear function. I am not sure how to obtain the hyperplane normal from the training results.

Hello,

The linear decision function of an SVM model is defined as D(x) = w* ∙ x + b, and the hyperplane normal w* for a linear SVM is computed using the formula below:

                                w* = ∑_k y_k a_k x_k,

where the x_k are the support vectors and the c_k = y_k a_k are the classification coefficients, both computed during training of the SVM model.

You can read them using the methods svm::Model::getSupportVectors() and svm::Model::getClassificationCoefficients(), respectively.
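If it helps, here is a minimal sketch (in the DAAL C++ interface) of assembling w* from those two tables. The getSupportVectors(), getClassificationCoefficients() and NumericTable/BlockDescriptor calls are the API mentioned above; the helper function itself and the dense, row-major access are my own assumptions, so please adapt it to your data layout.

#include <vector>
#include "daal.h"

using namespace daal::algorithms;
using namespace daal::data_management;

// Sketch: accumulate the hyperplane normal w* = sum_k c_k * x_k from a trained
// linear SVM model, where c_k = y_k a_k are the classification coefficients and
// x_k the support vectors. Dense, row-major storage is assumed here.
std::vector<double> computeNormal(const svm::ModelPtr & svModel)
{
    NumericTablePtr sv    = svModel->getSupportVectors();             // nSV x nFeatures
    NumericTablePtr coeff = svModel->getClassificationCoefficients(); // nSV x 1

    const size_t nSV       = sv->getNumberOfRows();
    const size_t nFeatures = sv->getNumberOfColumns();

    BlockDescriptor<double> svBlock, coeffBlock;
    sv->getBlockOfRows(0, nSV, readOnly, svBlock);
    coeff->getBlockOfRows(0, nSV, readOnly, coeffBlock);
    const double * svData    = svBlock.getBlockPtr();
    const double * coeffData = coeffBlock.getBlockPtr();

    std::vector<double> w(nFeatures, 0.0);
    for (size_t k = 0; k < nSV; ++k)
        for (size_t j = 0; j < nFeatures; ++j)
            w[j] += coeffData[k] * svData[k * nFeatures + j];

    sv->releaseBlockOfRows(svBlock);
    coeff->releaseBlockOfRows(coeffBlock);
    return w;
}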

For additional details, please see Section 2.1 of B. E. Boser, I. Guyon, and V. N. Vapnik, "A Training Algorithm for Optimal Margin Classifiers," Proceedings of the Fifth Annual Workshop on Computational Learning Theory (COLT '92), 1992, pp. 144–152.

Best regards,

Victoriya


Thanks for explaining.

There is one more question I would like to ask.

When using a linear SVM there is no need to append a constant bias term to every feature vector, i.e. the bias term is taken care of by the algorithm itself.

Is that right?

Yes, there is no need to add a bias term to each feature.

The SVM training algorithm computes the bias term as part of the model; the svm::Model::getBias() method returns its value.
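Putting the two posts together, evaluating D(x) = w* ∙ x + b on a single dense feature vector could look roughly like this; computeNormal() is the hypothetical helper sketched in my earlier reply, not part of the library, and getBias() is the accessor just mentioned.

#include <numeric>
#include <vector>
#include "daal.h"

// Sketch: evaluate the decision function D(x) = w* . x + b for one dense
// feature vector x, using the bias stored in the trained model.
double decisionValue(const daal::algorithms::svm::ModelPtr & svModel,
                     const std::vector<double> & x)
{
    const std::vector<double> w = computeNormal(svModel); // helper from the earlier sketch
    const double b = svModel->getBias();                  // bias computed during training
    return std::inner_product(w.begin(), w.end(), x.begin(), b);
}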
