Set a limit on the maximum size of a bucket in GCP
Question
I am using Google Cloud Storage (GCS) to store some images, and I want to set a limit on the maximum size the bucket can reach.
The actual use case is that I will be using Google Container Registry (GCR), which in turn uses GCS to store the images.
I don't want the bucket size, or the total repository, to exceed 100 GB (or X GB).
I have been looking for a way to set this limit through some GCP configuration, but to no avail.
I know I can keep checking the bucket size at regular intervals to keep it in check, but is there a way to set a hard limit on the maximum size of a GCS bucket?
Answer 1
Score: 1
You cannot set a limit on the size of a Cloud Storage bucket using Google Cloud management controls or constraints.
Additionally, the buckets used by Container Registry and Artifact Registry are best left alone. If you want to control their size, manually delete the images you no longer use; do not delete individual objects, because an image consists of multiple objects (image layers) and a container image is usually not a single file (object). Poor management will result in orphaned objects and broken services that depend on those images.
Technically, you can write your own function to monitor those buckets. Please reread the previous paragraph to understand the challenges in managing the total used space.
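The monitoring approach mentioned above might look like the following minimal sketch using the google-cloud-storage Python client; the bucket name and the 100 GB limit are placeholder assumptions, and the function only reports the total size rather than deleting anything, for the reasons given in the previous paragraph.

```python
# Minimal sketch: report the total size of a bucket and warn when it crosses a limit.
# Assumes the google-cloud-storage client library and application default credentials;
# the bucket name and limit below are hypothetical placeholders.
from google.cloud import storage

BUCKET_NAME = "artifacts.my-project.appspot.com"  # hypothetical registry-backing bucket
SIZE_LIMIT_BYTES = 100 * 1024 ** 3                # the 100 GB (X GB) limit from the question


def check_bucket_size(bucket_name: str, limit_bytes: int) -> int:
    """Sum the sizes of all objects in the bucket and warn if the total exceeds the limit."""
    client = storage.Client()
    total_bytes = sum(blob.size or 0 for blob in client.list_blobs(bucket_name))
    if total_bytes > limit_bytes:
        # Alert only; deleting objects here would orphan image layers, as noted above.
        print(f"{bucket_name} holds {total_bytes} bytes, exceeding the limit of {limit_bytes}")
    return total_bytes


if __name__ == "__main__":
    check_bucket_size(BUCKET_NAME, SIZE_LIMIT_BYTES)
```

Run on a schedule (for example from Cloud Scheduler or a cron job), this only alerts; acting on the alert should be done by removing unused images through the registry, not by deleting objects directly.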
Comments