How to fix ConvergenceWarning in Gaussian process regression in sklearn?

Question
I am trying to fit a sklearn Gaussian process regressor to my data. The data has periodicity but no mean trend, so I defined a kernel similar to the [tutorial on the Mauna Loa data](https://scikit-learn.org/stable/auto_examples/gaussian_process/plot_gpr_co2.html), without the long-term trend, as follows:
```python
from sklearn.gaussian_process.kernels import (RBF, ExpSineSquared,
                                              RationalQuadratic, WhiteKernel)
from sklearn.gaussian_process import GaussianProcessRegressor as GPR
import numpy as np

# Models the periodicity
seasonal_kernel = (
    2.0**2
    * RBF(length_scale=100.0, length_scale_bounds=(1e-2, 1e7))
    * ExpSineSquared(length_scale=1.0, length_scale_bounds=(1e-2, 1e7),
                     periodicity=1.0, periodicity_bounds="fixed")
)

# Models small variations
irregularities_kernel = 0.5**2 * RationalQuadratic(
    length_scale=1.0, length_scale_bounds=(1e-2, 1e7), alpha=1.0)

# Models noise
noise_kernel = 0.1**2 * RBF(length_scale=0.1, length_scale_bounds=(1e-2, 1e7)) \
    + WhiteKernel(noise_level=0.1**2, noise_level_bounds=(1e-5, 1e5))

co2_kernel = seasonal_kernel + irregularities_kernel + noise_kernel
```
Then I use the kernel to define a regressor and fit the data:
```python
gpr = GPR(n_restarts_optimizer=10, kernel=co2_kernel, alpha=150, normalize_y=False)
for x, y in zip(x_list, y_list):
    gpr.fit(x, y)
```
However, during fit I get multiple `ConvergenceWarning`s. They all look like the following:
```
C:\Users\user\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.10_qbz5n2kfra8p0\LocalCache\local-packages\Python310\site-packages\sklearn\gaussian_process\kernels.py:430: ConvergenceWarning: The optimal value found for dimension 0 of parameter k1__k2__k1__constant_value is close to the specified upper bound 100000.0. Increasing the bound and calling fit again may find a better value.
C:\Users\user\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.10_qbz5n2kfra8p0\LocalCache\local-packages\Python310\site-packages\sklearn\gaussian_process\kernels.py:430: ConvergenceWarning: The optimal value found for dimension 0 of parameter k2__k1__k1__constant_value is close to the specified upper bound 100000.0. Increasing the bound and calling fit again may find a better value.
C:\Users\user\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.10_qbz5n2kfra8p0\LocalCache\local-packages\Python310\site-packages\sklearn\gaussian_process\kernels.py:430: ConvergenceWarning: The optimal value found for dimension 0 of parameter k1__k2__k2__alpha is close to the specified upper bound 100000.0. Increasing the bound and calling fit again may find a better value.
C:\Users\user\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.10_qbz5n2kfra8p0\LocalCache\local-packages\Python310\site-packages\sklearn\gaussian_process\kernels.py:430: ConvergenceWarning: The optimal value found for dimension 0 of parameter k1__k1__k1__k1__constant_value is close to the specified upper bound 100000.0. Increasing the bound and calling fit again may find a better value.
C:\Users\user\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.10_qbz5n2kfra8p0\LocalCache\local-packages\Python310\site-packages\sklearn\gaussian_process\kernels.py:420: ConvergenceWarning: The optimal value found for dimension 0 of parameter k1__k1__k1__k2__length_scale is close to the specified lower bound 0.01. Decreasing the bound and calling fit again may find a better value.
C:\Users\user\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.10_qbz5n2kfra8p0\LocalCache\local-packages\Python310\site-packages\sklearn\gaussian_process\kernels.py:430: ConvergenceWarning: The optimal value found for dimension 0 of parameter k1__k2__k1__constant_value is close to the specified upper bound 100000.0. Increasing the bound and calling fit again may find a better value.
```
I managed to fix some of them by blanket-adding the `length_scale_bounds` argument to all of the functions within the kernel, but I'm not sure whether I've set overextended bounds that needlessly degrade execution time for parts of the kernel that were running just fine, and I don't know how to remediate the problem with `alpha` or the constant values. Looking up the errors online does not provide any help.
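As a minimal, self-contained sketch (synthetic data and a cut-down kernel standing in for the real `x_list`/`y_list` and `co2_kernel`, both assumptions for illustration), the warnings can be recorded instead of printed, so that the parameter name embedded in each message can be inspected programmatically:

```python
import warnings

import numpy as np
from sklearn.exceptions import ConvergenceWarning
from sklearn.gaussian_process import GaussianProcessRegressor as GPR
from sklearn.gaussian_process.kernels import ExpSineSquared, WhiteKernel

# Tiny synthetic periodic dataset standing in for the real x_list / y_list
rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 40).reshape(-1, 1)
y = np.sin(2 * np.pi * x).ravel() + 0.1 * rng.standard_normal(40)

kernel = ExpSineSquared(length_scale=1.0, periodicity=1.0) \
    + WhiteKernel(noise_level=0.01)
gpr = GPR(kernel=kernel, n_restarts_optimizer=2, random_state=0)

# Record ConvergenceWarnings instead of letting them print, so each
# message (and the k1__/k2__ parameter name inside it) can be examined
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always", ConvergenceWarning)
    gpr.fit(x, y)

for w in caught:
    print(str(w.message))
```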
I know that the model is not being fitted properly, because the Gaussian process regressor is performing far worse than a simple SVR, despite the latter being much faster. Does anybody know how I can:

- associate each warning with a specific subkernel within the wider kernel?
- fix the warnings for `alpha` and the constant values?
Answer 1

Score: 0
It took me a while, but I found the solution in the scikit-learn documentation on the kernel hyperparameter API. The hyperparameter set for the whole kernel can be shown as follows:
```python
for hp in co2_kernel.hyperparameters:
    print('co2', hp)
```
which outputs the following:
```
co2 Hyperparameter(name='k1__k1__k1__k1__constant_value', value_type='numeric', bounds=array([[1.e-05, 1.e+05]]), n_elements=1, fixed=False)
co2 Hyperparameter(name='k1__k1__k1__k2__length_scale', value_type='numeric', bounds=array([[1.e-05, 1.e+05]]), n_elements=1, fixed=False)
co2 Hyperparameter(name='k1__k1__k2__length_scale', value_type='numeric', bounds=array([[1.e-05, 1.e+05]]), n_elements=1, fixed=False)
co2 Hyperparameter(name='k1__k1__k2__periodicity', value_type='numeric', bounds='fixed', n_elements=1, fixed=True)
co2 Hyperparameter(name='k1__k2__k1__constant_value', value_type='numeric', bounds=array([[1.e-05, 1.e+05]]), n_elements=1, fixed=False)
co2 Hyperparameter(name='k1__k2__k2__alpha', value_type='numeric', bounds=array([[1.e+02, 1.e+07]]), n_elements=1, fixed=False)
co2 Hyperparameter(name='k1__k2__k2__length_scale', value_type='numeric', bounds=array([[1.e-05, 1.e+05]]), n_elements=1, fixed=False)
co2 Hyperparameter(name='k2__k1__k1__constant_value', value_type='numeric', bounds=array([[1.e-05, 1.e+05]]), n_elements=1, fixed=False)
co2 Hyperparameter(name='k2__k1__k2__length_scale', value_type='numeric', bounds=array([[1.e-05, 1.e+05]]), n_elements=1, fixed=False)
co2 Hyperparameter(name='k2__k2__noise_level', value_type='numeric', bounds=array([[1.e-09, 1.e+01]]), n_elements=1, fixed=False)
```
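This naming makes it possible to map a warning directly back to the subkernel that owns the offending parameter. A small sketch (rebuilding the kernel from the question, with bounds left at their defaults for brevity; the warned parameter name is taken from the first warning above): stripping the final `__<leaf>` segment from the parameter name leaves a path that `get_params()` resolves to a kernel object.

```python
from sklearn.gaussian_process.kernels import (RBF, ExpSineSquared,
                                              RationalQuadratic, WhiteKernel)

# Rebuild the composite kernel from the question (default bounds for brevity)
seasonal_kernel = (2.0**2
                   * RBF(length_scale=100.0)
                   * ExpSineSquared(length_scale=1.0, periodicity=1.0,
                                    periodicity_bounds="fixed"))
irregularities_kernel = 0.5**2 * RationalQuadratic(length_scale=1.0, alpha=1.0)
noise_kernel = 0.1**2 * RBF(length_scale=0.1) + WhiteKernel(noise_level=0.1**2)
co2_kernel = seasonal_kernel + irregularities_kernel + noise_kernel

# A warning names e.g. 'k1__k2__k1__constant_value'; stripping the final
# '__<leaf>' segment leaves the path of the subkernel that owns it
warned_param = "k1__k2__k1__constant_value"
subkernel = co2_kernel.get_params()[warned_param.rsplit("__", 1)[0]]
print(subkernel)   # the ConstantKernel (0.5**2) inside irregularities_kernel
```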
The parameters relate to arguments of the various pieces of the kernel. As the documentation points out, "Note that due to the nested structure of kernels (by applying kernel operators, see below), the names of kernel parameters might become relatively complicated. In general, for a binary kernel operator, parameters of the left operand are prefixed with `k1__` and parameters of the right operand with `k2__`." The bifurcations are read starting from the outermost operation, following the order of precedence of the operations.
For example, the hyperparameters for the seasonal kernel start with `k1__k1__` because to get there we need to take the left operand of both the outer additions: first the one between `(seasonal_kernel + irregularities_kernel)` and `noise_kernel`, and then the one between `seasonal_kernel` and `irregularities_kernel`. From here we can take the left operand both times to get to `2.0**2` (which gets transformed into a `ConstantKernel`), which has the single hyperparameter `k1__k1__k1__k1__constant_value`, or take first the left operand and then the right one to get to the `RBF` kernel, which has the parameter `k1__k1__k1__k2__length_scale`. Another example: the parameter `k2__k2__noise_level` is the noise level of the `WhiteKernel` within `noise_kernel`, because you get there by first taking the right operand of the addition between `(seasonal_kernel + irregularities_kernel)` and `noise_kernel`, and then the right operand again of the addition within `noise_kernel`.
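The same walk can be done by attribute access, since every `Sum`/`Product` kernel exposes its operands as `.k1` and `.k2`. A small sketch (rebuilding the kernel from the question, default bounds for brevity):

```python
from sklearn.gaussian_process.kernels import (RBF, ExpSineSquared,
                                              RationalQuadratic, WhiteKernel)

seasonal_kernel = (2.0**2
                   * RBF(length_scale=100.0)
                   * ExpSineSquared(length_scale=1.0, periodicity=1.0,
                                    periodicity_bounds="fixed"))
irregularities_kernel = 0.5**2 * RationalQuadratic(length_scale=1.0, alpha=1.0)
noise_kernel = 0.1**2 * RBF(length_scale=0.1) + WhiteKernel(noise_level=0.1**2)
co2_kernel = seasonal_kernel + irregularities_kernel + noise_kernel

# k1__k1__... -> left operand of both outer additions: the seasonal kernel
# k2__k2__... -> right operand twice: the WhiteKernel inside noise_kernel
print(type(co2_kernel.k1.k1).__name__)   # Product (the seasonal kernel)
print(type(co2_kernel.k2.k2).__name__)   # WhiteKernel
```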
This feels impossibly complicated at first, but it gets easier pretty quickly. Once we know which parameters within which kernels are problematic, we can sort the problem out by extending the corresponding `_bounds` argument accordingly. For example, I could solve the first error by replacing `0.5**2` with `ConstantKernel(constant_value=1, constant_value_bounds=(1e3, 1e6))`.
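Sketching that fix (with one deviation from the line above: the initial `constant_value` here is set to `1e4`, an assumed starting value chosen to lie inside the new bounds rather than below them):

```python
from sklearn.gaussian_process.kernels import ConstantKernel, RationalQuadratic

# Explicit ConstantKernel instead of the bare scalar 0.5**2, so that
# constant_value_bounds can be widened past the offending 1e5 upper bound.
# 1e4 is an assumed starting value inside the new bounds.
irregularities_kernel = (
    ConstantKernel(constant_value=1e4, constant_value_bounds=(1e3, 1e6))
    * RationalQuadratic(length_scale=1.0, alpha=1.0)
)

# Inspect the resulting hyperparameters and their bounds
for hp in irregularities_kernel.hyperparameters:
    print(hp.name, hp.bounds)
```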