Golang processing images via multipart and streaming to Azure


Question

In the process of learning golang, I'm trying to write a web app with multiple image upload functionality.

I'm using Azure Blob Storage to store images, but I'm having trouble streaming the images from the multipart request to Blob Storage.

Here's the handler I've written so far:

func (imgc *ImageController) UploadInstanceImageHandler(w http.ResponseWriter, r *http.Request, p httprouter.Params) {
    reader, err := r.MultipartReader()

    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }

    for {
        part, partErr := reader.NextPart()

        // No more parts to process
        if partErr == io.EOF {
            break
        }

        // If part.FileName() is empty, skip this iteration.
        if part.FileName() == "" {
            continue
        }

        // Check the file type
        if part.Header["Content-Type"][0] != "image/jpeg" {
            fmt.Printf("\nNot image/jpeg!")
            break
        }

        var read uint64
        fileName := uuid.NewV4().String() + ".jpg"
        buffer := make([]byte, 100000000)

        // Get the size
        for {
            cBytes, err := part.Read(buffer)

            if err == io.EOF {
                fmt.Printf("\nLast buffer read!")
                break
            }

            read = read + uint64(cBytes)
        }

        stream := bytes.NewReader(buffer[0:read])
        err = imgc.blobClient.CreateBlockBlobFromReader(imgc.imageContainer, fileName, read, stream, nil)

        if err != nil {
            fmt.Println(err)
            break
        }
    }

    w.WriteHeader(http.StatusOK)
}

In the course of my research, I read about using r.FormFile and ParseMultipartForm, but decided to try to learn how to use MultipartReader.

I was able to upload an image to the golang backend and save the file to my machine using MultipartReader.

At the moment, I'm able to upload files to Azure, but they end up corrupted. The file sizes seem right, but clearly something is not working.

Am I misunderstanding how to create an io.Reader for CreateBlockBlobFromReader?

Any help is much appreciated!


Answer 1

Score: 4

As @Mark said, you can use ioutil.ReadAll to read the content into a byte array, with code like the following.

import (
	"bytes"
	"io/ioutil"
)

partBytes, err := ioutil.ReadAll(part)
if err != nil {
	// handle the read error
}
size := uint64(len(partBytes))
blob := bytes.NewReader(partBytes)
err = blobClient.CreateBlockBlobFromReader(container, fileName, size, blob, nil)

According to the godoc for CreateBlockBlobFromReader:

> The API rejects requests with size > 64 MiB (but this limit is not checked by the SDK). To write a larger blob, use CreateBlockBlob, PutBlock, and PutBlockList.

So if the size is larger than 64 MiB, the code should look like the following.

import (
	"encoding/base64"
	"fmt"
)

const BLOB_LENGTH_LIMITS uint64 = 64 * 1024 * 1024

partBytes, _ := ioutil.ReadAll(part)
size := uint64(len(partBytes))
if size <= BLOB_LENGTH_LIMITS {
	blob := bytes.NewReader(partBytes)
	err := blobClient.CreateBlockBlobFromReader(container, fileName, size, blob, nil)
} else {
	// Create an empty blob
	blobClient.CreateBlockBlob(container, fileName)
	// Create a block list, and upload each block
	length := size / BLOB_LENGTH_LIMITS
	if size%BLOB_LENGTH_LIMITS != 0 {
		length++
	}
	blocks := make([]storage.Block, length)
	for i := uint64(0); i < length; i++ {
		start := i * BLOB_LENGTH_LIMITS
		end := (i + 1) * BLOB_LENGTH_LIMITS
		if end > size {
			end = size
		}
		chunk := partBytes[start:end]
		// Block IDs must be short and equal-length; encode the index, not the chunk data.
		blockID := base64.StdEncoding.EncodeToString([]byte(fmt.Sprintf("%08d", i)))
		blocks[i] = storage.Block{ID: blockID, Status: storage.BlockStatusUncommitted}
		err = blobClient.PutBlock(container, fileName, blockID, chunk)
		if err != nil {
			// handle the error
		}
	}
	err = blobClient.PutBlockList(container, fileName, blocks)
	if err != nil {
		// handle the error
	}
}
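The chunk-boundary arithmetic in the loop above is easy to get subtly wrong (rounding the block count up, clamping the last block's end). It can be isolated and checked on its own; this is a sketch, and chunkRanges is a name introduced purely for illustration:

```go
package main

import "fmt"

// chunkRanges splits a payload of `size` bytes into [start, end) ranges of
// at most `limit` bytes each, exactly as the PutBlock loop above does:
// round the block count up when size is not a multiple of limit, and clamp
// the final end to size.
func chunkRanges(size, limit uint64) [][2]uint64 {
	if size == 0 || limit == 0 {
		return nil
	}
	n := size / limit
	if size%limit != 0 {
		n++
	}
	ranges := make([][2]uint64, 0, n)
	for i := uint64(0); i < n; i++ {
		start := i * limit
		end := start + limit
		if end > size {
			end = size
		}
		ranges = append(ranges, [2]uint64{start, end})
	}
	return ranges
}

func main() {
	// 150 bytes with a 64-byte limit: two full blocks and a 22-byte tail.
	fmt.Println(chunkRanges(150, 64)) // prints: [[0 64] [64 128] [128 150]]
}
```

With the real 64 MiB limit, a 100 MB image would yield two ranges: one full 64 MiB block and one tail block.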

Hope it helps.


Answer 2

Score: 0

A Reader can return both io.EOF and a valid final chunk of bytes in the same call; it looks like that final chunk (cBytes) is never added to read, the running total. Also be careful: if part.Read(buffer) returns an error other than io.EOF, the read loop will never exit. Consider ioutil.ReadAll instead.

CreateBlockBlobFromReader takes a Reader, and part is also a Reader, so you may be able to pass part in directly.

You may also want to consider that Azure's block size limit may be smaller than the image; see Azure blobs.


huangapple
  • Published on 2017-04-03 at 22:30:43
  • Please retain this link when reposting: https://go.coder-hub.com/43187362.html