Restricting file size in Snowflake multipart unload to a specific value

Question

I am trying to unload some data from Snowflake to AWS S3 using the COPY command. I want each individual file to be less than 1MB. I tried using the MAX_FILE_SIZE attribute, but some of my files are still greater than 1MB. Is there a way I can force Snowflake to stick to the size limit?
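For reference, a COPY INTO <location> unload with the file size capped at roughly 1 MB might look like the sketch below. The stage path and table name are placeholders, not taken from the question; MAX_FILE_SIZE is specified in bytes.

```sql
-- Hypothetical unload to an external S3 stage; @my_s3_stage and my_table
-- are placeholder names, not from the original post.
COPY INTO @my_s3_stage/unload/data_
  FROM my_table
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
  MAX_FILE_SIZE = 1000000   -- target size in bytes (~1 MB)
  OVERWRITE = TRUE;
```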

Answer 1

Score: 1

This is not guaranteed, especially for very small file sizes (the default is 16MB), and it is documented in our docs via the following note:

Note

The COPY command unloads one set of table rows at a time. If you set a very small MAX_FILE_SIZE value, the amount of data in a set of rows could exceed the specified size.

In general, the actual file size and number of files unloaded are determined by the total amount of data and number of nodes available for parallel processing.

You can read more about this in the Snowflake documentation for the COPY INTO <location> command.
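If it helps, one way to see how far the actual files deviate from the target is to list the unloaded files and their sizes after the COPY completes. The stage path below is a placeholder; LIST reports sizes in bytes.

```sql
-- List the files produced by the unload (placeholder stage path).
LIST @my_s3_stage/unload/;

-- Query the LIST output to find files that exceeded the ~1 MB target.
SELECT "name", "size"
FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()))
WHERE "size" > 1000000;
```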
