SVM with radial basis kernel: how to generate the equation for the hyperplane?

I need to generate an equation for the hyperplane. I have two independent variables and one binary dependent variable.

I am working from the following SVM decision function: $f(x)=\operatorname{sgn}\left(\sum_i \alpha_i K(sv_i, x) + b\right)$

I have two independent variables (say P and Q) with 130 observations each. I trained an SVM with the radial basis function (RBF) kernel for binary classification (0 and 1), and I now have:

  • One column of 51 values of $y_i \alpha_i$ (the dual coefficients).

  • Two columns of 51 support vectors, one for P and one for Q.

  • A single value for $b$.

I obtained these using scikit-learn's SVC.
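For reference, this is roughly how I extracted those quantities (the data here is synthetic, only to illustrate the shapes; in my case X holds the 130 values of P and Q and y holds the 0/1 labels):

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic stand-in for my data: 130 rows, columns P and Q, binary labels
rng = np.random.default_rng(0)
X = rng.normal(size=(130, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

clf = SVC(kernel="rbf", gamma=0.5)  # RBF-kernel SVM
clf.fit(X, y)

dual = clf.dual_coef_.ravel()   # the y_i * alpha_i values, one per support vector
svs = clf.support_vectors_      # support vectors, columns = (P, Q)
b = clf.intercept_[0]           # the single value b
```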

So, how can I generate the equation now?

Can I multiply the 51 dual coefficients $y_i \alpha_i$ with the 51 support-vector values for each variable P and Q, so that I end up with two coefficients and the equation becomes $f(x)=\operatorname{sgn}(mP + nQ + b)$, where $m$ is the sum of the products of the 51 support-vector values of P with the 51 dual coefficients, and $n$ is the corresponding sum for Q?
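In code, the simplification I have in mind (continuing the sketch above, and not something I have verified is mathematically valid) would be:

```python
# Proposed collapse of the kernelized sum into two coefficients m and n
m = np.sum(dual * svs[:, 0])  # dual coefficients times the P components of the support vectors
n = np.sum(dual * svs[:, 1])  # dual coefficients times the Q components of the support vectors
# proposed equation: f(x) = sgn(m * P + n * Q + b)
```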

Tags: machine-learning-model, scikit-learn, svm, python, machine-learning

Category: Data Science


I'm not sure I've fully understood you. The radial basis kernel implicitly transforms your features into an infinite-dimensional space, and the dot product of the transformed vectors is exactly the radial basis kernel:

$k(x,y)=\phi(x)\cdot \phi(y)$

where $\phi(x)$ is the (implicit) feature mapping.
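For the RBF kernel specifically (the one used by SVC with kernel="rbf"), the kernel function is

$k(x,y)=\exp\left(-\gamma \lVert x-y\rVert^2\right)$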

The main reason for using the kernel trick is the ability to transform features into higher dimensions without knowing the mapping explicitly. As a consequence, your hyperplane has an infinite number of coefficients, so it cannot be collapsed into a simple linear equation like $f(x)=\operatorname{sgn}(mP + nQ + b)$; that only works for a linear kernel. You can always expand the radial basis kernel into a Taylor series and recover some of the leading coefficients.
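To make that concrete, here is a minimal sketch (assuming a fitted SVC named clf as in the question, and that gamma was set explicitly to 0.5 when fitting) of how the decision function can be evaluated directly from the dual coefficients, support vectors, and intercept. Note that a new point always enters through the kernel; the sum never collapses into a single pair of coefficients for P and Q:

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

GAMMA = 0.5  # must match the gamma the model was fitted with

def decision_values(clf, X_new):
    # K[j, i] = K(sv_i, x_j): RBF kernel between each new point and each support vector
    K = rbf_kernel(X_new, clf.support_vectors_, gamma=GAMMA)
    # sum_i (y_i * alpha_i) * K(sv_i, x_j) + b, for every new point x_j
    return K @ clf.dual_coef_.ravel() + clf.intercept_[0]

# Sanity check: this should agree with clf.decision_function(X_new), and
# np.sign(decision_values(clf, X_new)) with the predicted class (up to the 0/1 vs -1/+1 labelling).
```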
