http Request.FormFile: handle zip files?
Question
I'm writing a web server in Go.
On one of the pages, the user can upload a file.
I would like to be able to handle zip files.
In the archive/zip package, I only see two functions that allow me to read from a zip archive:
func OpenReader(name string) (*ReadCloser, error)
func NewReader(r io.ReaderAt, size int64) (*Reader, error)
If I want to use the second function, I need to know the size of the uploaded file before calling it, and I would like to avoid writing to disk and reading back.
Question
I will split my question in two parts:
1. What would be the idiomatic way to read the unzipped content of a zip file uploaded through a standard multipart/form-data HTML form?
2. How can I get the actual size of a file uploaded through an HTML form?
func(req *http.Request) {
	f, h, err := req.FormFile("fileTag")
	if err != nil {
		panic(err)
	}
	var fileSize int = ??
	unzipper, err := zip.NewReader(f, fileSize)
}
Answer 1
Score: 3
You can look for the file size in the FormFile's Header (which is a MIMEHeader):
h.Header.Get("Content-Length")
If there is no content length for the file, you can read it into a buffer first to get the size.
var buff bytes.Buffer
fileSize, err := buff.ReadFrom(f)
Other options are to seek to the end, as you put in your answer, or get the concrete Reader out of the interface. A multipart File will be an *io.SectionReader if it's in memory, or an *os.File if it was written to a temp file:
switch f := f.(type) {
case *io.SectionReader:
	fileSize = f.Size()
case *os.File:
	if s, err := f.Stat(); err == nil {
		fileSize = s.Size()
	}
}
Answer 2
Score: 2
Here is a way I found to get the size:
func(req *http.Request) {
	f, h, err := req.FormFile("fileTag")
	if err != nil {
		panic(err)
	}
	fileSize, err := f.Seek(0, io.SeekEnd) // io.SeekEnd == 2
	if err != nil {
		panic(err)
	}
	_, err = f.Seek(0, io.SeekStart)
	if err != nil {
		panic(err)
	}
	unzipper, err := zip.NewReader(f, fileSize)
}
I don't find this solution very elegant or idiomatic.
Isn't there some cleaner way to handle this case ?
Answer 3
Score: 1
I would use an in-memory buffer and make sure to limit the max upload size of a file (~100MB?). Here it is using io.Copy:
import (
	"archive/zip"
	"bytes"
	"io"
	"net/http"
)

func HttpHandler(req *http.Request) {
	f, _, err := req.FormFile("fileTag")
	if err != nil {
		panic(err)
	}
	buf := new(bytes.Buffer)
	fileSize, err := io.Copy(buf, f)
	if err != nil {
		panic(err)
	}
	zip.NewReader(bytes.NewReader(buf.Bytes()), fileSize)
}