GCP pipeline between Storage and BigQuery


Question

How can I process data whenever it is added to a GCP storage bucket? I need to process each file with my own API and then add it to BigQuery. I'm looking for a way to trigger an endpoint each time data is added to the storage bucket.

Answer 1

Score: 1

You can use a Cloud Function with a Cloud Storage trigger, specifically the finalize event type. This event fires when a write or an overwrite of an object in your bucket completes successfully. When one of these operations occurs, the function is executed, and within it you can perform whatever processing you need, such as sending the data to BigQuery.
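
As a minimal sketch, assuming the uploaded files are newline-delimited JSON and using a hypothetical destination table ID (my-project.my_dataset.my_table), a first-generation Python Cloud Function for the finalize event could look like this:

```python
# Minimal sketch of a background Cloud Function for the
# google.storage.object.finalize event. The table ID and file format
# are assumptions; adjust both to match your data.
from google.cloud import bigquery

BQ_TABLE = "my-project.my_dataset.my_table"  # hypothetical table ID

def process_file(event, context):
    """Runs each time an object is written or overwritten in the bucket."""
    uri = f"gs://{event['bucket']}/{event['name']}"

    # This is the place to call your own API on the file before loading it.

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,  # let BigQuery infer the schema
    )
    # Load the new object straight from Cloud Storage into BigQuery.
    load_job = client.load_table_from_uri(uri, BQ_TABLE, job_config=job_config)
    load_job.result()  # block until the load job finishes
    print(f"Loaded {uri} into {BQ_TABLE}")
```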

Here is a tutorial so you can implement and deploy your trigger.
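
For deployment, a first-generation function with this trigger can be deployed with a command along these lines, where the bucket name and runtime are placeholders: gcloud functions deploy process_file --runtime python310 --trigger-resource YOUR_BUCKET --trigger-event google.storage.object.finalize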
