How does a trained SVR model predict values?

Question

I've been trying to understand how a model trained with support vector machines for regression predicts values. I have trained a model with sklearn.svm.SVR, and now I'm wondering how to "manually" predict the outcome of an input.
Some background: the model is trained with kernel SVR, using the RBF function and the dual formulation. So now I have arrays of the dual coefficients, the indices of the support vectors, and the support vectors themselves.
I found the function that is used to fit the hyperplane, but I've been unsuccessful in applying it to "manually" predict outcomes without the .predict function.
The few things I tried all involve dot products of the input (feature) array with all the support vectors.
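
For reference, a minimal sketch of the kind of setup described above (the data and parameter values here are illustrative assumptions, not the actual model):

import numpy as np
from sklearn.svm import SVR

# Illustrative data only; the real model was trained on a different dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))   # 40 features, matching the answer below
y = rng.normal(size=200)

model = SVR(kernel='rbf', gamma=0.1, C=1.0).fit(X, y)

# Attributes mentioned in the question:
model.dual_coef_        # dual coefficients, shape (1, n_SV)
model.support_          # indices of the support vectors in X
model.support_vectors_  # the support vectors themselves, shape (n_SV, 40)
model.intercept_        # the bias term b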

Answer 1

Score: 1

If anyone ever needs this, I've managed to understand the equation and code it in Python.
The following is the equation used in the dual formulation:

f(x) = Σᵢ (αᵢ yᵢ) xᵢᵀ x + b,    i = 1, …, N

where N is the number of observations, and αᵢ multiplied by yᵢ are the dual coefficients found in the model's attribute model.dual_coef_ (for SVR these are the differences αᵢ − αᵢ* of the Lagrange multipliers). The xᵢᵀ are some of the observations used for training (the support vectors), accessed through the attribute model.support_vectors_ (transposed to allow multiplication of the two matrices), x is the input vector containing a value for each feature (it's the one observation for which we want a prediction), and b is the intercept accessed via model.intercept_.
The xᵢᵀ and x, however, are the observations transformed into a higher-dimensional space, as explained by mery in this post.
The RBF transformation can either be applied manually, step by step, or by using sklearn.metrics.pairwise.rbf_kernel, which computes K(xᵢ, x) = exp(−γ‖xᵢ − x‖²) for each pair of rows.
With the latter, the code would look like this (in my case there are 589 support vectors and 40 features).
First we access the coefficients and vectors:

import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

support_vectors = model.support_vectors_  # shape (589, 40)
dual_coefs = model.dual_coef_[0]          # shape (589,)

Then:

pred = (np.matmul(dual_coefs.reshape(1,589), 
                  rbf_kernel(support_vectors.reshape(589,40),
                             Y=input_array.reshape(1,40), 
                             gamma=model.get_params()['gamma']
                             )
                 )
        + model.intercept_
       )
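
The same computation can also be wrapped in a small helper that works for any number of input rows and does not hard-code the shapes (the name manual_svr_predict is just an illustrative choice, not part of sklearn):

def manual_svr_predict(model, X):
    # X has shape (n_samples, n_features); returns predictions of shape (n_samples,)
    K = rbf_kernel(X, model.support_vectors_, gamma=model.get_params()['gamma'])
    return K @ model.dual_coef_[0] + model.intercept_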

If the RBF function needs to be applied manually, step by step, then:

vrbf = support_vectors.reshape(589,40) - input_array.reshape(1,40)
pred = (np.matmul(dual_coefs.reshape(1,589), 
                  np.diag(np.exp(-model.get_params()['gamma'] * 
                                 np.matmul(vrbf, vrbf.T)
                                )
                         ).reshape(589,1)
                 )
        + model.intercept_
       )
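
The 589x589 matrix product above is only needed for its diagonal, so an equivalent and lighter way to get the same squared distances (assuming the same variable names as above) is:

sq_dists = np.sum(vrbf**2, axis=1)   # ||x_i - x||^2 for each support vector
pred = dual_coefs @ np.exp(-model.get_params()['gamma'] * sq_dists) + model.intercept_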

I placed the .reshape() calls even where they are not necessary, just to emphasize the shapes for the matrix operations.
Both approaches give the same result as model.predict(input_array).
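
As a quick sanity check (assuming input_array is a single observation with 40 features), the manual result can be compared with the built-in prediction:

manual = float(np.ravel(pred)[0])
builtin = float(model.predict(input_array.reshape(1, 40))[0])
assert np.isclose(manual, builtin)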
