How to limit download speed with Go?
Question
I'm currently developing a download server in Go. I need to limit the download speed of users to 100KB/s.
This was my code:
func serveFile(w http.ResponseWriter, r *http.Request) {
    fileID := r.URL.Query().Get("fileID")
    if len(fileID) != 0 {
        w.Header().Set("Content-Disposition", "attachment; filename=filename.txt")
        w.Header().Set("Content-Type", r.Header.Get("Content-Type"))
        w.Header().Set("Content-Length", r.Header.Get("Content-Length"))
        file, err := os.Open(fmt.Sprintf("../../bin/files/test.txt"))
        defer file.Close()
        if err != nil {
            http.NotFound(w, r)
            return
        }
        io.Copy(w, file)
    } else {
        io.WriteString(w, "Invalid request.")
    }
}
Then I found a package on GitHub and my code became the following:
func serveFile(w http.ResponseWriter, r *http.Request) {
    fileID := r.URL.Query().Get("fileID")
    if len(fileID) != 0 {
        w.Header().Set("Content-Disposition", "attachment; filename=Wiki.png")
        w.Header().Set("Content-Type", r.Header.Get("Content-Type"))
        w.Header().Set("Content-Length", r.Header.Get("Content-Length"))
        file, err := os.Open(fmt.Sprintf("../../bin/files/test.txt"))
        defer file.Close()
        if err != nil {
            http.NotFound(w, r)
            return
        }
        bucket := ratelimit.NewBucketWithRate(100*1024, 100*1024)
        reader := bufio.NewReader(file)
        io.Copy(w, ratelimit.Reader(reader, bucket))
    } else {
        io.WriteString(w, "Invalid request.")
    }
}
But I'm getting this error:
> Corrupted Content Error
>
> The page you are trying to view cannot be shown because an error in
> the data transmission was detected.
Here's my code on the Go playground: http://play.golang.org/p/ulgXQl4eQO
Answer 1
Score: 2
Rather than mucking around with getting the correct content type and length headers yourself, it'd probably be much better to use http.ServeContent, which will do that for you (as well as support "If-Modified-Since", range requests, etc.). If you can supply an "ETag" header it can also handle "If-Range" and "If-None-Match" requests as well.
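For instance, here is a minimal sketch (mine, not part of the original answer) of supplying such an ETag, derived here from the file's size and modification time so that ServeContent can answer conditional requests on its own; the helper name and the ETag format are illustrative choices:

func serveWithETag(w http.ResponseWriter, req *http.Request, file *os.File) {
    fi, err := file.Stat()
    if err != nil {
        http.Error(w, "stat failed", http.StatusInternalServerError)
        return
    }
    // Illustrative ETag built from size and mod time; any value that changes
    // whenever the file's contents change would do.
    w.Header().Set("ETag", fmt.Sprintf(`"%x-%x"`, fi.Size(), fi.ModTime().UnixNano()))
    http.ServeContent(w, req, fi.Name(), fi.ModTime(), file)
}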
As mentioned previously, it's often preferable to limit on the write side, but it's awkward to wrap an http.ResponseWriter since various http functions also check for optional interfaces such as http.Flusher and http.Hijacker. It's much easier to wrap the io.ReadSeeker that ServeContent needs.
For example, something like this perhaps:
func pathFromID(fileID string) string {
    // replace with whatever logic you need
    return "../../bin/files/test.txt"
}

// or more verbosely you could call this a "limitedReadSeeker"
type lrs struct {
    io.ReadSeeker
    // This reader must not buffer but just do something simple
    // while passing through Read calls to the ReadSeeker
    r io.Reader
}

func (r lrs) Read(p []byte) (int, error) {
    return r.r.Read(p)
}

func newLRS(r io.ReadSeeker, bucket *ratelimit.Bucket) io.ReadSeeker {
    // Here we know/expect that a ratelimit.Reader does nothing
    // to the Read calls other than add delays so it won't break
    // any io.Seeker calls.
    return lrs{r, ratelimit.Reader(r, bucket)}
}

func serveFile(w http.ResponseWriter, req *http.Request) {
    fileID := req.URL.Query().Get("fileID")
    if len(fileID) == 0 {
        http.Error(w, "invalid request", http.StatusBadRequest)
        return
    }
    path := pathFromID(fileID)
    file, err := os.Open(path)
    if err != nil {
        http.NotFound(w, req)
        return
    }
    defer file.Close()
    fi, err := file.Stat()
    if err != nil {
        http.Error(w, "blah", 500) // XXX fixme
        return
    }
    const (
        rate     = 100 << 10
        capacity = 100 << 10
    )
    // Normally we'd prefer to limit the writer but it's awkward to wrap
    // an http.ResponseWriter since it may optionally also implement
    // http.Flusher, or http.Hijacker.
    bucket := ratelimit.NewBucketWithRate(rate, capacity)
    lr := newLRS(file, bucket)
    http.ServeContent(w, req, path, fi.ModTime(), lr)
}
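Not shown in the answer: the handler still has to be registered somewhere. A minimal wiring sketch might look like this (the route and port are arbitrary choices of mine):

package main

import (
    "log"
    "net/http"
    // The handler above additionally needs "io", "os", and the rate-limit
    // package, presumably "github.com/juju/ratelimit" judging by the
    // NewBucketWithRate and Reader calls in the question.
)

func main() {
    http.HandleFunc("/download", serveFile)
    log.Fatal(http.ListenAndServe(":8080", nil))
}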
Answer 2
Score: 1
I'm not seeing the error, but I did notice some issues with the code. For this:
w.Header().Set("Content-Type", r.Header.Get("Content-Type"))
You should use the mime package's:
func TypeByExtension(ext string) string
to determine the content type (if you end up with the empty string, default to application/octet-stream).
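A small sketch of that lookup (the helper name and the fallback wiring are mine, not the answer's; it uses the standard "mime" and "path/filepath" packages):

// contentTypeFor guesses a Content-Type from the file name's extension and
// falls back to application/octet-stream when the extension is unknown.
func contentTypeFor(name string) string {
    if ct := mime.TypeByExtension(filepath.Ext(name)); ct != "" {
        return ct
    }
    return "application/octet-stream"
}

// e.g. w.Header().Set("Content-Type", contentTypeFor("test.txt"))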
For:
w.Header().Set("Content-Length", r.Header.Get("Content-Length"))
You need to get the content length from the file itself. By using the request content length, for a GET this basically ends up as a no-op, but for a POST you're sending back the wrong length, which might explain the error you're seeing. After you open the file, do this:
fi, err := file.Stat()
if err != nil {
    http.Error(w, err.Error(), 500)
    return
}
w.Header().Set("Content-Length", fmt.Sprint(fi.Size()))
One final thing: when you open the file and there's an error, you don't need to close the file handle. Do it like this instead:
file, err := os.Open(...)
if err != nil {
    http.NotFound(w, r)
    return
}
defer file.Close()
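Putting these three fixes together, the asker's handler might end up looking roughly like this sketch; the github.com/juju/ratelimit import path is my assumption based on the NewBucketWithRate and Reader calls in the question, and the hard-coded path is kept from the original code:

package main

import (
    "fmt"
    "io"
    "mime"
    "net/http"
    "os"
    "path/filepath"

    "github.com/juju/ratelimit" // assumed import path for the package the question uses
)

func serveFile(w http.ResponseWriter, r *http.Request) {
    fileID := r.URL.Query().Get("fileID")
    if len(fileID) == 0 {
        io.WriteString(w, "Invalid request.")
        return
    }

    path := "../../bin/files/test.txt" // hard-coded, as in the question

    file, err := os.Open(path)
    if err != nil {
        http.NotFound(w, r)
        return
    }
    defer file.Close() // deferred only after Open has succeeded

    fi, err := file.Stat()
    if err != nil {
        http.Error(w, err.Error(), 500)
        return
    }

    // Content type from the file's own extension, length from the file itself.
    ct := mime.TypeByExtension(filepath.Ext(path))
    if ct == "" {
        ct = "application/octet-stream"
    }
    w.Header().Set("Content-Disposition", "attachment; filename="+filepath.Base(path))
    w.Header().Set("Content-Type", ct)
    w.Header().Set("Content-Length", fmt.Sprint(fi.Size()))

    // Per-request token bucket, ~100 KB/s per download, as in the question.
    bucket := ratelimit.NewBucketWithRate(100*1024, 100*1024)
    io.Copy(w, ratelimit.Reader(file, bucket))
}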