Maximum likelihood estimation of parameters following a polynomial logistic regression


Question

This is the dataset: `library(frair)`, `data(gammarus)`.

I want to estimate the parameters p0, p1, p2 and p3 in the formula

`NA/No = exp(p0 + p1*density + p2*density^2 + p3*density^3) / (1 + exp(p0 + p1*density + p2*density^2 + p3*density^3))`

where Na is the number of prey eaten and No is the number of prey offered.


Answer 1

Score: 0

## Setup

```R
library(dplyr)
library(frair)

d <- gammarus %>% mutate(y = eaten / (eaten + alive))
```

## Step 1: Regression

You can estimate the coefficients of the linear equation `y = p0 + p1*density` with the `lm` (linear model) function:

```R
lm(y ~ density, data = d)
```

## Step 2: Polynomial regression

To use a polynomial functional form instead, you can use the `poly` function. The first argument is the variable, the second is the degree of the polynomial, and you must then specify whether you want a raw or an orthogonal polynomial. In our case it is a raw polynomial; see [this post](https://stats.stackexchange.com/questions/258307/raw-or-orthogonal-polynomial-regression) for more detail.

You can estimate the four coefficients of `y = p0 + p1*density + p2*density^2 + p3*density^3` by replacing `density` with a third-degree raw polynomial of density:

```R
lm(y ~ poly(density, 3, raw = TRUE), data = d)
```
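As a quick sanity check (a sketch, assuming the `d` data frame from the setup above), `poly(density, 3, raw = TRUE)` builds exactly the columns `density`, `density^2`, `density^3`, so it fits the same model as writing the power terms out with `I()`:

```r
library(dplyr)
library(frair)

d <- gammarus %>% mutate(y = eaten / (eaten + alive))

# Raw polynomial vs. explicit power terms: identical design matrix,
# hence identical coefficient estimates
m_poly   <- lm(y ~ poly(density, 3, raw = TRUE), data = d)
m_manual <- lm(y ~ density + I(density^2) + I(density^3), data = d)

all.equal(unname(coef(m_poly)), unname(coef(m_manual)))  # TRUE
```

An orthogonal polynomial (`raw = FALSE`, the default) would give the same fitted values but differently parameterized coefficients, which is why `raw = TRUE` is needed here to read off p0 through p3 directly.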

## Step 3: Logistic regression

The final step is to switch from the linear `y = p0 + p1*density + p2*density^2 + p3*density^3` to the logistic `y = exp(p0 + p1*density + p2*density^2 + p3*density^3) / (1 + exp(p0 + p1*density + p2*density^2 + p3*density^3))`. For this you need the `glm` function (generalized linear model), and you must specify that you want a logit specification (and not, for instance, a probit; cf. [this post](https://stats.stackexchange.com/questions/20523/difference-between-logit-and-probit-models)) with `family = binomial(link = "logit")`:

```R
glm(y ~ poly(density, 3, raw = TRUE), data = d, family = binomial(link = "logit"))
```
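To read the estimated p0 through p3 off the fitted model and plug them back into the question's formula, a sketch (assuming `d` from the setup; note `glm` warns about non-integer successes when given a proportion response):

```r
library(dplyr)
library(frair)

d <- gammarus %>% mutate(y = eaten / (eaten + alive))

m <- glm(y ~ poly(density, 3, raw = TRUE), data = d,
         family = binomial(link = "logit"))

p <- unname(coef(m))  # p[1] = p0, p[2] = p1, p[3] = p2, p[4] = p3

# Predicted proportion eaten at a chosen density, via the logistic formula
dens <- 30
eta  <- p[1] + p[2] * dens + p[3] * dens^2 + p[4] * dens^3
exp(eta) / (1 + exp(eta))

# The same number via predict() on the response scale
predict(m, newdata = data.frame(density = dens), type = "response")
```

A standard warning-free alternative is to give `glm` the counts directly, `glm(cbind(eaten, alive) ~ poly(density, 3, raw = TRUE), family = binomial, data = d)`, which weights each observation by the number of prey offered.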




huangapple
  • Published 2023-02-19 18:33:59
  • Please retain this link when reposting: https://go.coder-hub.com/75499497.html