Upload big file to S3 without passing through server
Question
I'm developing a web app that allows users to upload big files to the server, and I'm using S3 to store those files. In order to be able to upload files of any size, I'm using s3manager.Uploader
to simplify the multipart upload.
Because some of the files are going to be big (~15 GB), I wanted to know whether it's possible to do something like presigned URLs, but for multipart uploads. I want to avoid having to upload the data to my server and then upload it to S3. Ideally I would want a URL that I can point a POST
at directly from an HTML form, avoiding the use of my web server as a proxy for the files.
Answer 1
Score: 1
@michael-sqlbot was right: with a plain presigned POST there's a limit of 5 GB per object. You can do a multipart upload if you presign each request before uploading, which means you need to manage splitting the file yourself and ask the server to sign each part's request before it is uploaded.
I ended up following @fl0cke's advice and using https://github.com/TTLabs/EvaporateJS to manage the upload. It does all the file handling; you only have to implement the server-side function that signs the request for each part.
Answer 2
Score: 0
The limit for each object stored is 5 TB; you can find the reference link here.