Jetson Nano is an edge computing platform designed for low-power, unattended, standalone use, typically without peripherals such as a display monitor or keyboard attached. Even so, it is often necessary to view the real-time camera feed, and the manipulations the software is making to it, without tethering a display monitor to the board.
This article shows how to live stream a video from a Raspberry Pi camera to a web browser and access the stream on any other device connected on the same network.
- Jetson Nano Developer Kit
- Raspberry Pi Camera v2
- Power Source for Developer Kit
The steps to connect the Raspberry Pi camera to the board, confirm its operation, and set up OpenCV are covered in the previous article on Real-time Face Detection on Jetson Nano using OpenCV.
Once everything is set up, we create a simple Python application that uses OpenCV to capture the video feed from the camera, resize each frame, and stream it to an HTML webpage using the Python Flask framework.
What is Flask?
Python has several web-development frameworks used to create reliable and high-performance web applications. Django, Flask, and Pyramid are the most common frameworks.
Flask is a microframework that doesn’t require external libraries or tools. It is more flexible than Django and allows users to include different plug-ins and/or extensions. It lacks the data-abstraction and data-validation layers found in heavier frameworks; instead, users can deploy third-party extensions for authentication, data format validation, SQL management, request handling, and user permissions.
How to Install Flask on Jetson Nano
Install Flask on Jetson Nano and confirm its version:
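One straightforward way to do this (assuming pip for Python 3 is already available on the board) is:

```shell
# Install Flask via pip
python3 -m pip install flask

# Confirm the installed version
python3 -c "import flask; print(flask.__version__)"
```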
Flask’s Basic Tools and Terminologies
- app = Flask(__name__) creates the Flask web application.
- @app.route("/") binds a URL path to a function; the string argument is the route, and “/” refers to the homepage. When a user visits that path, the function this decorator precedes is executed, and its return value is sent to the browser.
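A minimal sketch tying these two pieces together (the route body and greeting text here are purely illustrative):

```python
from flask import Flask

# Create the Flask web application object
app = Flask(__name__)

# "/" maps this function to the homepage; visiting it runs home()
@app.route("/")
def home():
    return "Hello from Jetson Nano!"

# Uncomment to serve the app on the local network:
# app.run("0.0.0.0", port=8000)
```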
Simple Video Stream to Web Browser Application
As mentioned in the previous article, the Jetson Nano board uses the GStreamer pipeline to handle media applications. The GStreamer pipeline utilizes the appsink sink plugin to access the raw buffer data.
When a user sends a kill command (Ctrl + C), GStreamer by default waits for all data buffers to be processed before exiting, which blocks the streaming thread in our application. So, the GStreamer pipeline created here is additionally given wait-on-eos=false and a single-buffer limit (max-buffers=1) to avoid a dirty kill and a hung pipeline.
```python
import cv2
import threading

from flask import Response, Flask

# Image frame sent to the Flask object
video_frame = None

# Use a lock for thread-safe viewing of frames in multiple browsers
thread_lock = threading.Lock()

# GStreamer pipeline to access the Raspberry Pi camera
GSTREAMER_PIPELINE = (
    'nvarguscamerasrc ! video/x-raw(memory:NVMM), width=3280, height=2464, '
    'format=(string)NV12, framerate=21/1 ! nvvidconv flip-method=0 ! '
    'video/x-raw, width=960, height=616, format=(string)BGRx ! videoconvert ! '
    'video/x-raw, format=(string)BGR ! '
    'appsink wait-on-eos=false max-buffers=1 drop=true'
)

# Create the Flask object for the application
app = Flask(__name__)

def captureFrames():
    global video_frame

    # Video capturing from OpenCV through the GStreamer pipeline
    video_capture = cv2.VideoCapture(GSTREAMER_PIPELINE, cv2.CAP_GSTREAMER)

    while video_capture.isOpened():
        return_key, frame = video_capture.read()
        if not return_key:
            break

        # Store a copy of the frame in the global variable,
        # with thread-safe access
        with thread_lock:
            video_frame = frame.copy()

        key = cv2.waitKey(30) & 0xFF
        if key == 27:  # Esc key stops the capture loop
            break

    video_capture.release()

def encodeFrame():
    global video_frame

    while True:
        # Acquire thread_lock to access the global video_frame object
        with thread_lock:
            if video_frame is None:
                continue
            return_key, encoded_image = cv2.imencode(".jpg", video_frame)
            if not return_key:
                continue

        # Output the image as a byte array in a multipart response chunk
        yield (b'--frame\r\n'
               b'Content-Type: image/jpeg\r\n\r\n' +
               bytearray(encoded_image) + b'\r\n')

@app.route("/")
def streamFrames():
    return Response(encodeFrame(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")

# Check to see if this is the main thread of execution
if __name__ == '__main__':

    # Create a daemon thread that captures the image frames
    process_thread = threading.Thread(target=captureFrames)
    process_thread.daemon = True

    # Start the capture thread
    process_thread.start()

    # Start the Flask web application.
    # While it can be bound to any feasible IP, 0.0.0.0 serves the web app on
    # all interfaces, making it discoverable by other machines on the same network
    app.run("0.0.0.0", port=8000)
```
The important sections of the code are commented with explanations. The application is written with thread-safe implementations of video frame capture and display to avoid data corruption.
- Save the application shown above as web_streaming.py and run it as python web_streaming.py. This launches the Flask app and starts the video stream.
- Run ifconfig to obtain the IP address of the Jetson Nano Developer Kit.
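If you prefer to find the address programmatically rather than reading ifconfig output, a common best-effort trick (the 8.8.8.8 address is only used to pick the outbound interface; no packets are sent) is:

```python
import socket

def get_local_ip():
    """Best-effort lookup of this machine's LAN IP address."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # connect() on a UDP socket sends nothing; it only selects
        # the interface that would route to the given address
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        # Fall back to loopback if no route is available
        return "127.0.0.1"
    finally:
        s.close()

print(get_local_ip())
```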
You can access the video feed in a browser window on any device connected to the same network by entering the Jetson Nano's IP address and port. For example, typing 192.168.1.2:8000 into the address bar shows the live stream.
Results and Conclusions
Seeing the camera operations in real time is often critical when the Jetson Nano board is deployed on remotely operated platforms such as robots or monitoring sites. This is a very simple technique to stream a real-time camera feed over the network and view it on other machines. It can easily be combined with the previous article on Real-time Face Detection on Jetson Nano using OpenCV to stream the processed frames, with detected faces and overlaid shapes, and view them remotely.