
add support for "multipart/x-mixed-replace" (for streaming video) #95

Open
aguaviva opened this issue Jun 22, 2024 · 17 comments
@aguaviva

I presume this has already been considered. If so, it'd be great to document why it hasn't been implemented (technical limitations, memory, ...), so that if others decide to implement it, they can use this information to make better decisions.

Thanks for your awesome work.

@michalpokusa
Contributor

Hi, could you share some of your code so I can see how you want to use it?

Maybe using a ChunkedResponse class would be sufficient. I am not sure how well (or whether) it will work, as streaming video will likely require lots of memory, and considering that many boards can barely load the library itself, this use case might require using e.g. a Feather ESP32-S3.

I may be wrong, but I believe that even if your goal is possible at all, it will likely require specific hardware with enough memory.

@aguaviva
Author

Each frame is a few KB, and I was streaming video with MicroPython before (note: I am migrating to CircuitPython because the MicroPython platform is a total mess). So it should be possible; all we need is the functionality in the server.

I don't have any decent code to share yet because I am still learning, so I am probably making lots of shameful mistakes :)

@michalpokusa
Contributor

I will try to make a simple video streaming example using on-disk images, but I would really appreciate some code, even work in progress. I currently do not have any camera that I can connect to a microcontroller, so I want to understand the workflow of capturing frames from a camera, which could later be adapted to an "x-mixed-replace" response.
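One way to keep the example adaptable is to decouple the frame source from the HTTP side with a generator that yields JPEG bytes, so the same streaming loop works with on-disk images now and a camera later. This is only an illustrative sketch; the `disk_frames`/`camera_frames` helpers are hypothetical, not part of adafruit_httpserver:

```python
def disk_frames(paths):
    """Yield each file's contents as one 'frame', looping forever."""
    while True:
        for path in paths:
            with open(path, "rb") as f:
                yield f.read()

def camera_frames(camera):
    """Yield frames from a camera object assumed to have a .capture() method."""
    while True:
        yield camera.capture()
```

The streaming handler can then iterate over either generator without caring where the frames come from.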

@aguaviva
Author

In CircuitPython I tried using the following code:

BOUNDARY = "FRAME"

@server.route("/")
def base(request):
    response = Response(request)
    response._send_headers(
        content_type="multipart/x-mixed-replace; boundary=%s" % BOUNDARY
    )
    for i in range(10):
        jpeg = b"cacacacacacacacacaca"  # placeholder for cam.take()
        # bytes %-formatting cannot take a str argument, so encode first
        response._send_bytes(request.connection, ("--%s\r\n" % BOUNDARY).encode())
        response._send_bytes(
            request.connection,
            b"Content-Type: text/plain\r\nContent-Length: %d\r\n\r\n" % len(jpeg),
        )
        response._send_bytes(request.connection, jpeg)
        response._send_bytes(request.connection, b"\r\n")
    return response

As you can see, I am only streaming some text here.

And with MicroPython I used https://github.com/wybiral/micropython-aioweb with this code:

@app.route('/vid')
async def vid_handler(r, w):
    PART_BOUNDARY = "123456789000000000000987654321"
    STREAM_CONTENT_TYPE = "Content-Type: multipart/x-mixed-replace;boundary=" + PART_BOUNDARY + "\r\n"
    STREAM_PART = "Content-Type: image/jpeg\r\nContent-Length: %u\r\n\r\n"
    STREAM_BOUNDARY = "\r\n--" + PART_BOUNDARY + "\r\n"

    w.write(b'HTTP/1.0 200 OK\r\n')
    w.write(STREAM_CONTENT_TYPE)
    while True:
        w.write(STREAM_BOUNDARY)
        f = camera.capture()
        w.write(STREAM_PART % len(f))
        w.write(f)
        await w.drain()

@michalpokusa
Contributor

Thanks, I will try making it work and will come back to you with the results.

@michalpokusa
Contributor

I managed to make a working example: https://github.com/michalpokusa/Adafruit_CircuitPython_HTTPServer/blob/x-mixed-replace-example/x_mixed_replace_example.py

Make sure to also download the "frames" folder, or change the code to work with a camera from the start.

I am not sure whether it should be a feature in the lib itself - it is already quite big. Maybe I will make a PR based on the example above, as your use case seems like it might be common.

Please try the code above; I will be happy to help with any problems you encounter.
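For anyone reading along, the on-wire shape of a multipart/x-mixed-replace stream is simple: one set of response headers declaring the boundary, then an endless series of parts, each with its own boundary line, part headers, and payload. The helpers below are a hypothetical sketch of that wire format, not the code from the linked example:

```python
BOUNDARY = "FRAME"

def stream_headers(boundary=BOUNDARY):
    # Response headers: the browser keeps the connection open and
    # replaces the displayed part each time a new one arrives.
    return (
        b"HTTP/1.1 200 OK\r\n"
        b"Content-Type: multipart/x-mixed-replace; boundary="
        + boundary.encode() + b"\r\n\r\n"
    )

def frame_part(jpeg, boundary=BOUNDARY):
    # One part: boundary line, part headers, payload, trailing CRLF.
    return (
        b"--" + boundary.encode() + b"\r\n"
        + b"Content-Type: image/jpeg\r\n"
        + b"Content-Length: " + str(len(jpeg)).encode() + b"\r\n\r\n"
        + jpeg + b"\r\n"
    )
```

Sending `stream_headers()` once and then `frame_part(frame)` in a loop is the whole protocol; the server never signals an end to the body.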

@aguaviva
Author

Thanks!!! I'll check it out later today.

@aguaviva
Author

aguaviva commented Jun 24, 2024

This is awesome, and it works like a charm :) I have some minor comments:

I tried to combine the two calls below into one:

        self._send_bytes(
            self._request.connection, 
            bytes(f"{self._boundary}\r\n", "utf-8")
        )
        self._send_bytes(
            self._request.connection,
            bytes(f"Content-Type: {self._frame_content_type}\r\n\r\n", "utf-8"),
        )

but for some reason that caused a seemingly unrelated error:

Traceback (most recent call last):
  File "adafruit_httpserver/request.py", line 349, in __init__
  File "adafruit_httpserver/request.py", line 480, in _parse_request_header
  File "adafruit_httpserver/headers.py", line 59, in __init__
ValueError: need more than 1 values to unpack

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "adafruit_httpserver/server.py", line 408, in poll
  File "adafruit_httpserver/server.py", line 295, in _receive_request
  File "adafruit_httpserver/request.py", line 351, in __init__
ValueError: ('Unparseable raw_request: ', b'GET /live-feed HTTP/1.1\r\nHost: 192.168.1.57:5000\r\nUser-Agent: Wget/1.21.2\r\nAccept: */*\r\nAccept-Encoding: identity\r\nConnection: Keep-Alive\r\n\r\n')

Also, it'd be great to minimize copies by moving the b"\r\n" into the preceding _send_bytes call (only for the second call onward):

self._send_bytes(self._request.connection, bytes(encoded_frame) + b"\r\n")

(note that the trailing b"\r\n" is a bytes literal, so it needs no UTF-8 encoding)

Also, this bit is not needed:

content_type="video/mp4"

Do you think the suggested changes would help? Otherwise this is great and gets a decent frame rate (140 KB/s).

@aguaviva
Author

173 KB/s by batching the b"\r\n" like this:

        if self.frame_idx > 0:
            self._send_bytes(
                self._request.connection,
                bytes(f"\r\n{self._boundary}\r\n", "utf-8")
            )
        else:
            self._send_bytes(
                self._request.connection,
                bytes(f"{self._boundary}\r\n", "utf-8")
            )

        self.frame_idx += 1

For some reason, merging the two _send_bytes calls doesn't work.

@michalpokusa
Contributor

  1. You are right about content_type="video/mp4". I don't remember putting it there; it indeed doesn't make sense, as this is not mp4. It's probably a leftover from an example I copied, or Copilot added it and I didn't spot it.

  2. I think wget-ing the live feed wouldn't work, as wget would have to keep downloading the file, and considering the file claims to be a JPEG, it probably does not know what to do with the subsequent frames.

  3. You are right about minimizing copies of the encoded frame, although I wouldn't keep track of the current frame; you could simply do:

self._send_bytes(self._request.connection, bytes(encoded_frame))
self._send_bytes(self._request.connection, bytes("\r\n", "utf-8"))

It keeps the code clear and does not really impact performance. ChunkedResponse already does it this way; the overhead is very, very minimal.

I could improve that example, but it was mainly a quickly written proof-of-concept.

  4. Regarding the ValueError, I believe it is caused by the doubled \r\n at the end of the request:
    ...Connection: Keep-Alive\r\n\r\n

The headers parser uses splitlines and then splits each line to get the header name and value. I suspect it tried to split the empty line between the "\r\n" and "\r\n", which resulted in the ValueError; the question is why the request contained a doubled newline at the end.

I might add a check for that to the parser in the PR, thanks for spotting it.
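A tolerant parser would simply skip blank lines before unpacking the name/value pair. The function below is a hypothetical sketch of that check, not the library's actual headers.py:

```python
def parse_headers(raw: bytes) -> dict:
    """Parse 'Name: value' header lines, skipping blank lines so a
    trailing \\r\\n\\r\\n does not raise ValueError on unpacking."""
    headers = {}
    for line in raw.decode("utf-8").splitlines():
        if not line.strip():
            continue  # skip the empty line produced by the final \r\n\r\n
        name, value = line.split(":", 1)
        headers[name.strip().lower()] = value.strip()
    return headers
```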

When you finish your project, please share a link if you can; I always like to see new ways people use adafruit_httpserver. 👍

@aguaviva
Author

Your code and CircuitPython made this small setup work :)
[image: the assembled camera setup]
The external antenna is highly recommended, as otherwise the camera causes interference with the onboard antenna.

BTW, I can't believe how much better CircuitPython is compared to MicroPython. I love the web interface, and its software stack is great.

Oh, and it'd be great to have a way to know when the connection got closed (try/except maybe? or a callback?), so that we could deinit the camera or whatever we are streaming.

@michalpokusa
Contributor

I do not have experience with MicroPython, but I agree that CircuitPython is very good and easy to use for the most part.

It might be worth presenting your setup on the Show and Tell that Adafruit hosts on Wednesdays; that way more people can see what you created.

When it comes to knowing when the connection is closed, I encourage you to check out the Websocket example from the docs: it stores the connection in global scope and uses async code to handle requests between sending frames. It might take some work to implement this behaviour, but it is not very hard; the examples show most of the functionality.

@aguaviva
Author

I looked at the Websocket example, but I couldn't figure out how to detect when the connection closes. What did you have in mind?

@michalpokusa
Contributor

michalpokusa commented Jun 28, 2024

What I meant is that when the client disconnects, further sent frames should result in a BrokenPipeError, which you can catch to mark the connection as closed; look at the fail_silently argument in Websocket.

I will have some time during the weekend, so I will try to make an async/await example that detects the closed connection. It is kind of necessary, as without it you would be stuck sending the one response, and the MCU wouldn't be able to do anything else until the client disconnects. I think I should have a working example by Monday.
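The disconnect detection described above boils down to catching the send error. The helper below is a hypothetical sketch (any object with a .send() method works, e.g. a socket), not code from the library:

```python
def send_frame(connection, payload) -> bool:
    """Try to send one frame; return False if the client is gone."""
    try:
        connection.send(payload)
        return True
    except (BrokenPipeError, OSError):
        # Client disconnected mid-stream: stop sending and clean up
        # (deinit the camera, drop the stored connection, etc.).
        return False
```

The streaming loop can then break out and release resources as soon as `send_frame` returns False.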

@michalpokusa
Contributor

michalpokusa commented Jun 29, 2024

I am nearly done: I managed to make a live video feed to multiple devices at the same time and to detect when the connection is dropped. This seems like a perfect example of how to create custom response types.

I will make a PR later with the polished code. For now, a little preview:

Recording.2024-06-29.182853.mp4

@aguaviva
Author

Wow, that is pretty cool! And there is no lag. Are you using an ESP32?

@michalpokusa
Contributor

I used an ESP32-S2 TFT for this.

The lag will probably be more noticeable with bigger frames. I suspect that under the hood the transfer is optimized somehow, as it seems too good to be true...

Recording.2024-06-29.191759.mp4
