
Scipy Optimize unable to optimize even with strong starting parameters

Question

I have a fairly simple function whose parameters I would like to optimize, but I cannot get `scipy.optimize.minimize` to succeed.

Here's a simplified version of the data and the problem:

```python
import numpy as np
from scipy.optimize import minimize

ref = np.array([0.586, 0.659, 0.73, 0.799, 0.865, 0.929, 0.991, 1.05, 1.107, 1.162])
input = np.array([70.0, 77.0, 82.0, 87.0, 93.0, 98.0, 98.0, 102.0, 106.0, 109.0])
x = np.array([6.96, 9.24, 10.92, 12.24, 13.92, 15.24, 15.24, 16.32, 17.64, 18.96])

## Function (note: it reads the global `input` array directly)
def fun(beta, x):
    return ((input**beta[0]) * beta[2]) * (x**beta[1])

## Starting parameters
initial_guess = [0.15, 0.9475, 0.0427]

## Objective to be minimized (sum of squared log-residuals)
def objective(beta, model, x, ref):
    return sum((np.log(model(beta, x)) - np.log(ref))**2)

minimize(objective, initial_guess, args=(fun, x, ref))
```
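For reference, "succeed" here means the solver's own status report, which can be inspected on the returned result object (a small sketch, `res` being just a local name):

```python
res = minimize(objective, initial_guess, args=(fun, x, ref))
print(res.success)   # False for this setup, per the problem described here
print(res.message)   # the solver's explanation of why it stopped
```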

I know that these starting parameters are almost correct, as `print(fun(initial_guess, x))` returns estimates close to the reference data (and they're much closer in my real situation than in this minimal reproducible example).

I've tried many combinations of starting parameters and cannot find any that lead to successful optimization.

I've tried making the function more basic (e.g., removing the additional beta terms and x, leaving only beta[0]). This successfully optimizes (`success: True`), but the predictions are inadequate (presumably because the reduced function is not complex enough to map the input to a desirable output with respect to the reference).
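Roughly, that reduced variant looked like the sketch below (illustrative, since the exact reduced form isn't shown; only `beta[0]` is kept, as an exponent on `input`):

```python
## Illustrative reduced model: no x term, no scale factor
def fun_basic(beta, x):
    return input**beta[0]

res_basic = minimize(objective, [0.15], args=(fun_basic, x, ref))
print(res_basic.success)  # True, but the resulting predictions are poor
```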

I've recently minimized functions apparently more complex than this one (using the same approach there as here), so I am confused as to why this one is not working.


Answer 1

Score: 1


`minimize` is not the right function call. Use `curve_fit`, and it works fine even without your log step. Further, always give `minimize` (or `curve_fit`) sane bounds; if you've "used the same approach" and it worked without bounds in the past, that's only by coincidence.
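For comparison, if you do stay with `minimize`, bounds can be passed directly. This is a sketch reusing the question's `objective`, `fun`, and `initial_guess`; `L-BFGS-B` is one of the methods that accepts bounds, and the bound values are illustrative (they mirror the `curve_fit` call below, with a small positive floor on `beta[2]` so the log stays defined):

```python
from scipy.optimize import minimize

res = minimize(
    objective, initial_guess, args=(fun, x, ref),
    method='L-BFGS-B',
    bounds=[(-1, 10), (-1, 10), (1e-6, 10)],  # floor keeps beta[2] > 0
)
print(res.success, res.x)
```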

In one sense this is really a surface fit over three dimensions, and interpreted as such it doesn't have enough input data. I would expect multiple non-monotonic jumps in one of `x` or `input` for such a scheme. What this _should_ look like (with different values in `ix`):

```python
import numpy as np
from matplotlib import pyplot as plt
from scipy.optimize import curve_fit


def fun(ix: np.ndarray, b0: float, b1: float, b2: float) -> np.ndarray:
    input_, x = ix
    return input_**b0 * b2 * x**b1


ref = np.array([0.586, 0.659, 0.73, 0.799, 0.865, 0.929, 0.991, 1.05, 1.107, 1.162])
ix = np.array((
    [70.0, 77.0, 82.0, 87.0, 93.0, 98.0, 98.0, 102.0, 106.0, 109.0],
    [6.96, 9.24, 10.92, 12.24, 13.92, 15.24, 15.24, 16.32, 17.64, 18.96],
))
initial_guess = (2, -0.5, 4e-4)
fit_param, _ = curve_fit(
    f=fun, xdata=ix, ydata=ref, p0=initial_guess,
    bounds=((-1,-1,0), (10, 10, 10)),
)
print(fit_param)

fig, ax = plt.subplots()
ax.plot(ix[0], ref, label='experiment')
ax.plot(ix[0], fun(ix, *initial_guess), label='guess')
ax.plot(ix[0], fun(ix, *fit_param), label='fit')
ax.legend()
plt.show()

```

[Plot: experiment, guess, and fit curves]
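As a side note, the second value `curve_fit` returns (discarded as `_` above) is the covariance matrix of the fitted parameters; a common idiom is to convert its diagonal into one-sigma uncertainties:

```python
fit_param, pcov = curve_fit(
    f=fun, xdata=ix, ydata=ref, p0=initial_guess,
    bounds=((-1, -1, 0), (10, 10, 10)),
)
perr = np.sqrt(np.diag(pcov))  # one-standard-deviation error for b0, b1, b2
print(perr)
```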
