Cloud Run Golang container issue/misunderstanding

Question

I'm trying to do a report of all the objects in all the projects we have in Cloud Storage of our Org. I'm using this repo from the Google Professionnal Services as it's doing exactly what we want: https://github.com/GoogleCloudPlatform/professional-services/tree/main/tools/gcs2bq

We want to use containers instead of just the Go code on a Cloud Function, mainly for portability.

Locally everything is good and the program behaves as expected, but when I try it in Cloud Run things get tricky. From what I understand, the Go part needs to listen on a port, so I added this at the beginning of main so the container can be deployed, which it is:

// Determine port for HTTP service
port := os.Getenv("PORT")
if port == "" {
        port = "8080"
        log.Printf("defaulting to port %s", port)
}

// Start HTTP server.
log.Printf("listening on port %s", port)
if err := http.ListenAndServe(":"+port, nil); err != nil {
        log.Fatal(err)
}

But as you can see in the repo, the first file called is run.sh. It sets environment variables and then calls the binary built from main.go. That part successfully completes its task, which is to get the sizes of all the different files. But after that, run.sh doesn't "resume": it never reaches the part that uploads the data to a BigQuery table, which works locally.

Here is the part of the run.sh file where I have the problem. Note: I get no errors from executing ./gcs2bq. Note 2: every environment variable has a correct value.

./gcs2bq $GCS2BQ_FLAGS || error "Export failed!" 2    <- doesn't get past this line


gsutil mb -p "${GCS2BQ_PROJECT}" -c standard -l "${GCS2BQ_LOCATION}" -b on "gs://${GCS2BQ_BUCKET}" || echo "Info: Storage bucket already exists: ${GCS2BQ_BUCKET}"

gsutil cp "${GCS2BQ_FILE}" "gs://${GCS2BQ_BUCKET}/${GCS2BQ_FILENAME}" || error "Failed copying ${GCS2BQ_FILE} to gs://${GCS2BQ_BUCKET}/${GCS2BQ_FILENAME}!" 3

bq mk --project_id="${GCS2BQ_PROJECT}" --location="${GCS2BQ_LOCATION}" "${GCS2BQ_DATASET}" || echo "Info: BigQuery dataset already exists: ${GCS2BQ_DATASET}"

bq load --project_id="${GCS2BQ_PROJECT}" --location="${GCS2BQ_LOCATION}" --schema bigquery.schema --source_format=AVRO --use_avro_logical_types --replace=true "${GCS2BQ_DATASET}.${GCS2BQ_TABLE}" "gs://${GCS2BQ_BUCKET}/${GCS2BQ_FILENAME}" || \
  error "Failed to load gs://${GCS2BQ_BUCKET}/${GCS2BQ_FILENAME} to BigQuery table ${GCS2BQ_DATASET}.${GCS2BQ_TABLE}!" 4

gsutil rm "gs://${GCS2BQ_BUCKET}/${GCS2BQ_FILENAME}" || error "Failed deleting gs://${GCS2BQ_BUCKET}/${GCS2BQ_FILENAME}!" 5

rm -f "${GCS2BQ_FILE}"

I'm kinda new to containers and Cloud Run and even after reading projects and documentation, I'm not sure what I'm doing wrong, Is it normal that the .sh is "stuck" when calling the main.go? I can provide more details/explaination if needed.

Answer 1

Score: 0

Okay, so for anyone who encounters a similar situation, this is how I made it work for me.

The container isn't supposed to stop so no exit, it will just go back to the main function.

That means that when run.sh called the executable, the executable just kept running and never exited, so the script never resumed and the remaining steps never ran. The solution is to recode everything past that call directly into main.go, in Go.
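
In Go, the gsutil and bq steps after that call map onto the official Cloud Storage and BigQuery client libraries. Here is a minimal sketch of that porting, not the repo's actual code: the uploadAndLoad helper and its wiring are illustrative, it assumes the bucket and dataset already exist (run.sh creates them when missing), and it relies on the schema embedded in the Avro file rather than the explicit bigquery.schema file.

package main

import (
        "context"
        "fmt"
        "io"
        "log"
        "os"

        "cloud.google.com/go/bigquery"
        "cloud.google.com/go/storage"
)

func main() {
        ctx := context.Background()
        if err := uploadAndLoad(ctx,
                os.Getenv("GCS2BQ_FILE"),
                os.Getenv("GCS2BQ_PROJECT"),
                os.Getenv("GCS2BQ_BUCKET"),
                os.Getenv("GCS2BQ_FILENAME"),
                os.Getenv("GCS2BQ_DATASET"),
                os.Getenv("GCS2BQ_TABLE")); err != nil {
                log.Fatal(err)
        }
}

// uploadAndLoad ports the "gsutil cp" and "bq load" steps of run.sh to the
// Go client libraries. The bucket/dataset creation and the final cleanup
// steps from run.sh are omitted for brevity.
func uploadAndLoad(ctx context.Context, localFile, project, bucket, object, dataset, table string) error {
        // Equivalent of: gsutil cp "${GCS2BQ_FILE}" "gs://${GCS2BQ_BUCKET}/${GCS2BQ_FILENAME}"
        sc, err := storage.NewClient(ctx)
        if err != nil {
                return err
        }
        defer sc.Close()

        f, err := os.Open(localFile)
        if err != nil {
                return err
        }
        defer f.Close()

        w := sc.Bucket(bucket).Object(object).NewWriter(ctx)
        if _, err := io.Copy(w, f); err != nil {
                return err
        }
        if err := w.Close(); err != nil {
                return err
        }

        // Equivalent of: bq load --source_format=AVRO --use_avro_logical_types --replace=true ...
        bc, err := bigquery.NewClient(ctx, project)
        if err != nil {
                return err
        }
        defer bc.Close()

        gcsRef := bigquery.NewGCSReference(fmt.Sprintf("gs://%s/%s", bucket, object))
        gcsRef.SourceFormat = bigquery.Avro
        gcsRef.AvroOptions = &bigquery.AvroOptions{UseAvroLogicalTypes: true}

        loader := bc.Dataset(dataset).Table(table).LoaderFrom(gcsRef)
        loader.WriteDisposition = bigquery.WriteTruncate // --replace=true

        job, err := loader.Run(ctx)
        if err != nil {
                return err
        }
        status, err := job.Wait(ctx)
        if err != nil {
                return err
        }
        return status.Err()
}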

That makes run.sh useless, so I used another .go file that listens for HTTP requests and then calls the code that gathers the data and sends it to BigQuery.
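
A minimal sketch of that wiring, assuming the collection logic has been refactored into a single function the handler can call (runExport and the route are illustrative names, not the author's code):

package main

import (
        "context"
        "fmt"
        "log"
        "net/http"
        "os"
)

// runExport is a hypothetical stand-in for the refactored gcs2bq logic:
// gather the object metadata (the original main.go work), write the Avro
// file, then upload and load it as sketched above.
func runExport(ctx context.Context) error {
        // ... collect sizes, write Avro, call uploadAndLoad ...
        return nil
}

func main() {
        port := os.Getenv("PORT")
        if port == "" {
                port = "8080"
        }

        // http.ListenAndServe blocks for the container's lifetime, so the
        // actual work must be reachable from a handler rather than placed
        // after this call.
        http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
                if err := runExport(r.Context()); err != nil {
                        http.Error(w, err.Error(), http.StatusInternalServerError)
                        return
                }
                fmt.Fprintln(w, "export complete")
        })
        log.Printf("listening on port %s", port)
        log.Fatal(http.ListenAndServe(":"+port, nil))
}

With this shape, the container stays up between requests and the export can be triggered by sending an HTTP request to the service, for example on a schedule from Cloud Scheduler.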
