maximum likelihood estimation of parameters following polynomial logistic regression
Question

The dataset is the `gammarus` data from the `frair` package (`library(frair)`, `data = gammarus`).

I want to estimate the parameters P0, P1, P2 and P3 in the formula

Na/No = exp(P0 + P1*density + P2*density^2 + P3*density^3) / (1 + exp(P0 + P1*density + P2*density^2 + P3*density^3))

where Na is the number of prey eaten and No is the number of prey offered.
Answer 1

Score: 0
## Setup

```R
library(dplyr)
library(frair)
d <- gammarus %>% mutate(y = eaten/(eaten + alive))
```
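As a quick sanity check on this setup (a minimal sketch, not part of the original answer), you can look at the prepared data and the mean proportion eaten at each density; only the `density`, `eaten` and `alive` columns used above are assumed:

```R
# First rows of the prepared data: the new proportion column y
# sits next to the raw eaten/alive counts.
head(d)

# Mean proportion eaten at each prey density, plus the number of trials,
# to eyeball the shape the polynomial logistic model should capture.
d %>%
  group_by(density) %>%
  summarise(mean_y = mean(y), trials = n())
```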
## Step 1: Regression

You can estimate the coefficients of a linear equation of the form Na/No = P0 + P1*density with the `lm` (linear model) function:

```R
lm(y ~ density, data = d)
```
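If you want the estimated P0 and P1 from this first model rather than just the printed fit, a minimal sketch (the object name `fit1` is only illustrative) is to store the model and use `coef()`:

```R
# Store the linear fit and read off its coefficients:
# the intercept plays the role of P0 and the density slope of P1.
fit1 <- lm(y ~ density, data = d)
coef(fit1)

# summary() adds standard errors and t-tests for both coefficients.
summary(fit1)
```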
## Step 2: Polynomial regression

To get a polynomial functional form instead, you can use the `poly` function. Its first argument is the variable, the second is the degree of the polynomial, and you must then specify whether you want a raw or an orthogonal polynomial. In our case it should be a raw polynomial; see [this post](https://stats.stackexchange.com/questions/258307/raw-or-orthogonal-polynomial-regression) for more detail.

You can estimate the four coefficients of Na/No = P0 + P1*density + P2*density^2 + P3*density^3 by replacing density with a third-degree raw polynomial of density:

```R
lm(y ~ poly(density, 3, raw = T), data = d)
```
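As an aside (not stated in the answer, but a standard property of raw polynomials), the call above should be equivalent to spelling the powers of density out by hand with `I()`; a small sketch of that equivalence:

```R
# The same cubic model written two ways; the estimated coefficients
# should agree, only the automatically generated term names differ.
fit_poly <- lm(y ~ poly(density, 3, raw = TRUE), data = d)
fit_hand <- lm(y ~ density + I(density^2) + I(density^3), data = d)

coef(fit_poly)
coef(fit_hand)
```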
## Step 3: Logistic regression

The final step is to switch from the linear specification to the logistic one given in the question. For this you need the `glm` function (generalized linear model), and you must specify a logit link (rather than, say, a probit; cf. [this post](https://stats.stackexchange.com/questions/20523/difference-between-logit-and-probit-models)) with `family = binomial(link = "logit")`:

```R
glm(y ~ poly(density, 3, raw = T), data = d, family = binomial(link = "logit"))
```
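Two hedged notes that go slightly beyond the answer: a binomial `glm` is fitted by maximum likelihood, so its coefficients are the maximum-likelihood estimates of P0 through P3 in the question's formula; and because `y` is a proportion rather than a count, R will typically warn about non-integer successes unless you pass the eaten/alive counts directly with `cbind()`. A sketch of that counts-based version (the object name `fit_logit` is only illustrative):

```R
# Binomial GLM on the raw counts: cbind(successes, failures) tells glm()
# how many prey were eaten and how many survived in each trial.
fit_logit <- glm(cbind(eaten, alive) ~ poly(density, 3, raw = TRUE),
                 data = d, family = binomial(link = "logit"))

# Maximum-likelihood estimates of P0, P1, P2 and P3 (intercept first).
coef(fit_logit)

# Fitted Na/No at each observed density, back on the probability scale.
head(predict(fit_logit, type = "response"))
```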