Display image from requests via framebuf

Question
I've built an HTTP API returning an image (`HTTP GET http://myRaspi/api/latestImage` → returns image with content-type `image/jpeg`).
On my Raspberry Pi Pico, I load the image with the [`urequests` library](https://docs.python-requests.org/en/latest/user/quickstart/#binary-response-content):
```python
import network
import urequests
wifi = network.WLAN(network.STA_IF)
wifi.active(True)
wifi.connect('<<SSID>>', '<<Password>>')
response = urequests.get("http://myRaspi/api/latestImage")
imageBytes = response.content
```

I've attached a Waveshare ePaper display to the Pico, which can be controlled via this API based on `framebuf`.

Drawing primitives works like a charm, but I want to display the retrieved image, so I somehow have to translate between the image's byte array and `framebuf`. I'm struggling to find the easiest way. I came up with the following code:
```python
from PIL import Image
from io import BytesIO
import network
import urequests

wifi = network.WLAN(network.STA_IF)
wifi.active(True)
wifi.connect('<<SSID>>', '<<Password>>')

response = urequests.get("http://myRaspi/api/latestImage")
imageBytes = response.content

i = Image.open(BytesIO(imageBytes))
for x in range(i.width):
    for y in range(i.height):
        pixelValue = i.getpixel((x, y))
        myFrameBuffer.pixel(x, y, pixelValue)
```
...where `myFrameBuffer` would be Waveshare's `self.imageblack` `FrameBuffer`.
But so far, this does nothing but fail, because Pillow is not supported on MicroPython 😅
Answer 1

Score: 1
For this to work, you would need to find (or probably write) a JPEG decoder that runs on MicroPython. That's probably a non-starter (unless you're feeling especially motivated).

A much simpler solution is to have the HTTP API running on your Raspberry Pi return an image that doesn't require further processing. That is, run the PIL code (or whatever) on the Pi, and have the `/latestImage` endpoint return a block of bytes (and a width and height) that you can write directly to the framebuffer.
Maybe your API would return something like:
```
+--------------------------------------------------------------------------+
| width (2 bytes) | height (2 bytes) | image content (width * height bits) |
+--------------------------------------------------------------------------+
```
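As a minimal sketch of how the Pi side might build such a payload (the `pack_payload` helper and the exact bit layout are my assumptions, not part of the original answer: I pin the header to little-endian with `'<h'` and pad each bitmap row to a whole byte, leftmost pixel in the most significant bit, which matches MicroPython's `framebuf.MONO_HLSB` layout):

```python
import struct

def pack_payload(width, height, pixels):
    # Hypothetical server-side helper: pixels is a row-major list of
    # 0/1 values. Each bitmap row is padded to a whole byte, with the
    # leftmost pixel in the most significant bit (the same layout
    # MicroPython's framebuf.MONO_HLSB uses).
    row_bytes = (width + 7) // 8
    bitmap = bytearray(row_bytes * height)
    for y in range(height):
        for x in range(width):
            if pixels[y * width + x]:
                bitmap[y * row_bytes + x // 8] |= 0x80 >> (x % 8)
    # Pin the header to little-endian so both ends agree.
    return struct.pack('<hh', width, height) + bytes(bitmap)
```

The endpoint would then serve this byte string with a content type like `application/octet-stream` instead of `image/jpeg`.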
You could read the width and height like this:
```python
import struct
import urequests

...

res = urequests.get("http://myRaspi/api/latestImage")
# struct.unpack always returns a tuple, even for a single value
(width,) = struct.unpack('h', res.raw.read(2))
(height,) = struct.unpack('h', res.raw.read(2))
```
The remainder is just reading bytes from the response and writing the corresponding pixel values to your frame buffer.