GCS cloud storage standard pricing
Question

I am using a GCS standard bucket. If I write 100 GB of data from multiple processes and delete it after fetching it once, how does the storage cost work? Say I have 5 processes that each write 100 GB and delete the data after using it once. At a given point in time, the maximum bucket size will be 250 GB, for 30 minutes in a 24-hour period. How does the pricing work in this situation?

Answer 1 (score: 2)


According to the pricing tables, it looks like you would just get charged for:

  • operations (e.g. storage.buckets.list, etc.), which would depend on how many operations your processes call for that 100 GB of data
  • any inter-region replication (unless you have a single-region bucket; replication in dual/multi-region buckets costs $0.02/GB, so 100 GB would cost you $2)
  • egress costs (which vary depending on your use case and where the data egresses to)
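As a rough illustration of the replication component above, here is a minimal sketch. It assumes all five 100 GB writes land in a dual-region bucket at the $0.02/GB replication rate quoted in the answer; operation and egress charges are omitted because they depend entirely on the workload:

```python
# Rough sketch of the replication cost for the scenario in the question:
# 5 processes, each writing 100 GB into a dual-region bucket.
# The $0.02/GB rate is the one quoted above; a single-region bucket
# would incur no replication charge at all.

REPLICATION_RATE_PER_GB = 0.02  # USD per GB replicated (dual/multi-region)

processes = 5
gb_per_process = 100

total_gb_written = processes * gb_per_process            # 500 GB
replication_cost = total_gb_written * REPLICATION_RATE_PER_GB

print(f"Data replicated:  {total_gb_written} GB")
print(f"Replication cost: ${replication_cost:.2f}")      # $10.00
```

Note that replication is charged on every gigabyte written, so deleting the data afterwards does not reduce this component.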

Answer 2 (score: 1)

For a standard bucket, it's prorated. So if we store 100 GB for 1 hour, we are charged 100 * $0.026 * 1/730 (1 hour out of the roughly 730 hours in an average month). Refer to https://cloud.google.com/storage/pricing-examples#prorate
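Applying that proration to the scenario in the question gives a negligible storage charge. A minimal sketch, assuming the $0.026/GB-month standard rate from the answer and a 730-hour averaged billing month (an assumption; actual metering is finer-grained):

```python
# Prorated storage cost for the question's scenario: the bucket peaks
# at 250 GB for only 30 minutes in a 24-hour period.
# Rate ($0.026/GB-month) is from the answer above; the 730-hour month
# is an assumed averaged month length for the proration.

RATE_PER_GB_MONTH = 0.026   # USD per GB per month, standard storage
HOURS_PER_MONTH = 730       # averaged month length used for proration

def prorated_storage_cost(gb: float, hours: float) -> float:
    """Cost of holding `gb` gigabytes of standard storage for `hours` hours."""
    return gb * RATE_PER_GB_MONTH * hours / HOURS_PER_MONTH

# 250 GB held for 0.5 hours (the 30-minute peak):
cost = prorated_storage_cost(250, 0.5)
print(f"Storage cost for the 30-minute peak: ${cost:.4f}")   # ≈ $0.0045
```

So for short-lived data, the at-rest storage component is tiny; the operation, replication, and egress charges from the first answer dominate.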

huangapple
  • Posted on 2023-04-17 11:11:02
  • When reposting, please retain the original link: https://go.coder-hub.com/76031476.html