S3 Presigned URL for large files not working - python boto3
Question
I am using Cloudflare's R2 storage (S3-compatible), and the presigned URL for downloading files from a private bucket works well for files smaller than 1 GB. When the file size is more than 1 GB, it returns Access Denied.
The code that generates the presigned URL is here:
temp_link = s3.generate_presigned_url('get_object', Params={'Bucket': bucketName, 'Key': File_key, 'ResponseContentDisposition': 'attachment'}, ExpiresIn=3600*8)
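For context, here is a minimal sketch of how such a client might be configured against an R2 endpoint before calling `generate_presigned_url`. The endpoint URL, account ID, and credential variables are placeholders I am assuming, not part of the original question:

```python
import boto3
from botocore.config import Config

# Hypothetical R2 client setup; the account ID and credentials are placeholders.
s3 = boto3.client(
    's3',
    endpoint_url='https://<account_id>.r2.cloudflarestorage.com',
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
    config=Config(signature_version='s3v4'),
)

# Same call as in the question: presign a GET that forces a download.
temp_link = s3.generate_presigned_url(
    'get_object',
    Params={
        'Bucket': bucketName,
        'Key': File_key,
        'ResponseContentDisposition': 'attachment',
    },
    ExpiresIn=3600 * 8,  # 8 hours
)
```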
Answer 1
Score: 1
Presigned URLs are not valid once the credentials that generated them are no longer valid. So in the case of a role, when the assumed-role credentials expire, the presigned URL stops working.
You can also try using multipart upload for uploading large files (> 5 GB) in multiple parts.
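To illustrate the multipart suggestion, here is a minimal sketch using boto3's managed transfer, which automatically splits large uploads into parts. It assumes the `s3` client from the sketch above; the threshold and chunk size values are illustrative assumptions:

```python
from boto3.s3.transfer import TransferConfig

# Files larger than multipart_threshold are uploaded in multipart_chunksize parts.
config = TransferConfig(
    multipart_threshold=100 * 1024 * 1024,   # 100 MB
    multipart_chunksize=100 * 1024 * 1024,   # 100 MB per part
)

# Managed upload: boto3 handles initiating, uploading, and completing the parts.
s3.upload_file('large_file.bin', bucketName, File_key, Config=config)
```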
Answer 2
Score: 0
After spending an hour, I finally fixed it... hope this helps others.
The issue was not with the file or its size: my file name contained "&" character(s). Because of this, the presigned URL was breaking (without escaping the "&", the GET parameters are not parsed correctly).
So the moral of the story is: never use "&" characters in file names (or implement proper escaping).
It was a nightmare.
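A minimal sketch of the "proper escaping" option mentioned above, assuming the problematic characters can either be replaced before the name is used as an object key or percent-encoded when a URL is built by hand; the `sanitize_key` helper is hypothetical:

```python
import re
from urllib.parse import quote

def sanitize_key(filename: str) -> str:
    """Hypothetical helper: replace characters that commonly break
    URL/query-string parsing before the name is used as an object key."""
    return re.sub(r'[&?#]', '_', filename)

# Alternative: percent-encode an existing name when building a URL manually.
print(sanitize_key('report & summary.pdf'))        # report _ summary.pdf
print(quote('report & summary.pdf', safe='/'))     # report%20%26%20summary.pdf
```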