Let me start by saying that I am a new Linux and GStreamer
user and I am completely stuck.
My goal is to get up to 4 cameras displayed simultaneously in a single window, as close to real time as possible. There is no audio involved.
I am using a Raspberry Pi 4 with 4 GB of RAM running the “2020-12-02-raspios-buster-armhf - Desktop.img” OS image, with a 1920 x 1080 LCD monitor. I have 512 MB of RAM dedicated to the GPU.
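In case the exact setting matters, the GPU memory split is the gpu_mem option in /boot/config.txt:
gpu_mem=512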
In GStreamer I am trying to get 4 USB cameras to display in a 2 x 2 window. Currently I am testing with only 2 cameras and 2 videotestsrc elements. My issue is that this configuration only works when the video size is set to 320 by 240: I get very responsive image updates on both cameras with the pipeline below, and the RPi Task Manager shows the Pi running at about 23% CPU with 184 MB of RAM used.
gst-launch-1.0 videomixer name=mix ! videoconvert ! ximagesink sync=false \
v4l2src device=/dev/video0 ! videoconvert ! video/x-raw, format=AYUV, width=320, height=240 ! videobox border-alpha=0 top=0 left=0 ! mix. \
v4l2src device=/dev/video2 ! videoconvert ! video/x-raw, format=AYUV, width=320, height=240 ! videobox border-alpha=0 top=0 left=-320 ! mix. \
videotestsrc pattern=5 ! videoconvert ! video/x-raw, format=AYUV, width=320, height=240 ! videobox border-alpha=0 top=-240 left=0 ! mix. \
videotestsrc pattern=6 ! videoconvert ! video/x-raw, format=AYUV, width=320, height=240 ! videobox border-alpha=0 top=-240 left=-320 ! mix.
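For what it is worth, my understanding of the videobox offsets (please correct me if I have this wrong) is that the negative top/left values shift each 320 x 240 stream into one quadrant of the 640 x 480 mixer output, roughly:
(0,0)   /dev/video0       (320,0)   /dev/video2
(0,240) videotestsrc 5    (320,240) videotestsrc 6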
When I double the video size to 640 by 480, the main window opens just like with the smaller 320 by 240 size, BUT the camera video never updates after that. It seems to be frozen with the image of whatever was in view when it started. The RPi Task Manager shows the Pi running at about 2% CPU with 201 MB of RAM used, and gst-launch-1.0 itself shows 0% CPU.
gst-launch-1.0 videomixer name=mix ! videoconvert ! ximagesink sync=false \
v4l2src device=/dev/video0 ! videoconvert ! video/x-raw, format=AYUV, width=640, height=480 ! videobox border-alpha=0 top=0 left=0 ! mix. \
v4l2src device=/dev/video2 ! videoconvert ! video/x-raw, format=AYUV, width=640, height=480 ! videobox border-alpha=0 top=0 left=-640 ! mix. \
videotestsrc pattern=5 ! videoconvert ! video/x-raw, format=AYUV, width=640, height=480 ! videobox border-alpha=0 top=-480 left=0 ! mix. \
videotestsrc pattern=6 ! videoconvert ! video/x-raw, format=AYUV, width=640, height=480 ! videobox border-alpha=0 top=-480 left=-640 ! mix.
Questions:
Am I configuring the 4 cameras the correct way using videomixer, or is there a more responsive way to do this?
gst-inspect-1.0 indicates that videomixer can accept the following inputs (sinks):
video/x-raw format: { (string)AYUV, (string)BGRA, (string)ARGB, (string)RGBA, (string)ABGR, (string)Y444, (string)Y42B, (string)YUY2, (string)UYVY, (string)YVYU, (string)I420, (string)YV12, (string)NV12, (string)NV21, (string)Y41B, (string)RGB, (string)BGR, (string)xRGB, (string)xBGR, (string)RGBx, (string)BGRx }
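For reference, that list comes from running:
gst-inspect-1.0 videomixer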
The GStreamer web site says videomixer accepts only three input formats; I quote: “Videomixer can accept AYUV, ARGB and BGRA video streams”. My cameras output YUY2 natively.
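In case it is relevant, I believe the cameras' native formats can be confirmed with v4l2-ctl from the v4l-utils package, e.g.:
v4l2-ctl -d /dev/video0 --list-formats-ext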
I’m not sure which of these three formats I should use for best performance, but I have found that if I don’t use one of them, only the last video source specified is displayed when the window opens, with the other three blacked out.
For example, when I try the following command using the cameras’ native YUY2 output, I only get the last video displayed.
gst-launch-1.0 -v videomixer name=mix ! videoconvert ! xvimagesink sync=false \
v4l2src device=/dev/video0 ! video/x-raw, format=YUY2, width=640, height=480, framerate=15/1 ! videobox border-alpha=0 top=0 left=0 ! mix. \
v4l2src device=/dev/video2 ! video/x-raw, format=YUY2, width=640, height=480, framerate=15/1 ! videobox border-alpha=0 top=0 left=-640 ! mix. \
videotestsrc pattern=5 ! video/x-raw, format=YUY2, width=640, height=480, framerate=15/1 ! videobox border-alpha=0 top=-480 left=0 ! mix. \
videotestsrc pattern=6 ! video/x-raw, format=YUY2, width=640, height=480, framerate=15/1 ! videobox border-alpha=0 top=-480 left=-640 ! mix.
When tested individually, the cameras each worked fine at higher resolutions.
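Those individual tests were simple single-camera pipelines along these lines (from memory, so the exact caps may differ):
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw, format=YUY2, width=640, height=480 ! videoconvert ! ximagesink sync=false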
I have no idea where to look or what to try to debug this.
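If it would help, I can rerun the pipeline with GStreamer debug logging enabled, e.g.:
GST_DEBUG=3 gst-launch-1.0 videomixer name=mix ! videoconvert ! ximagesink sync=false ...
but I do not know what to look for in the output.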
Any help would be greatly appreciated.