Capturing an H.264 stream from a camera with GStreamer

I'm trying to capture an H.264 stream from a locally installed Logitech C920 camera at /dev/video0 with the GStreamer 1.0 v4l2src element.


v4l2-ctl --list-formats shows that the camera is capable of providing the H264 video format:


# v4l2-ctl --list-formats

        Index       : 1
        Type        : Video Capture
        Pixel Format: 'H264' (compressed)
        Name        : H.264


But pipeline


# gst-launch-1.0 -vvv v4l2src device=/dev/video0 ! video/x-h264, width=800, height=448, framerate=30/1 ! fakesink


keeps giving me a not-negotiated (-4) error:


/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-h264, width=(int)800, height=(int)448, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, width=(int)800, height=(int)448, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-h264, width=(int)800, height=(int)448, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-h264, width=(int)800, height=(int)448, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2809): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 67687169 ns.

Any help would be appreciated!


5 Answers



Is gstreamer mandatory for your needs? I also have lots of problems with the Logitech C920 in H264 mode and gstreamer. But I managed to use VLC as an RTSP server to use the C920 with H264:


$ cvlc -v v4l2:///dev/video0:chroma="H264":width=1024:height=570:fps=30 \
       --sout="#rtp{sdp=rtsp://:8554/live}"

Then I can connect with another VLC to the URI rtsp://localhost:8554/live
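The client side could be as simple as the following sketch (this invocation is my illustration, not from the original post; it is echoed rather than executed, since actually running it needs the RTSP server above and a display):

```shell
# Hypothetical client-side command: point a second VLC instance at the
# RTSP URI served by the cvlc pipeline above.
URI="rtsp://localhost:8554/live"
echo "vlc $URI"
```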


If GStreamer is mandatory for you, I only managed to use it with a capture utility that you can find here: https://github.com/csete/bonecam - directory "capture"


You have to compile it, but if you have some programming skills it should be very easy, as there is only one C file and a script to help. Just pass "host" as a parameter to the script:


# Get the bonecam/capture content or git clone the directory, and then
$ cd bonecam/capture
$ ./build host

You can use the "capture" utility with something like that :


$ v4l2-ctl -d /dev/video0 --set-fmt-video=width=1024,height=570,pixelformat=1
$ v4l2-ctl -d /dev/video0 --set-parm=30
$ ./bonecam/capture/capture -d /dev/video0 -c 100000 -o | \
      gst-launch -e filesrc location=/dev/fd/0 ! legacyh264parse ! rtph264pay ! udpsink host= port=5000

If you do not want to specify the number of frames to grab (the "-c" parameter of "capture"), there is a fork of this utility that you can find here: https://github.com/DeLaGuardo/bonecam


I know there is a plugin categorized as "bad", called uvch264, for gstreamer 0.10 that should work with the C920. But I do not know about gstreamer 1.0, and I could not test it.




Don't forget to add --rtsp-timeout=-1 to the cvlc command line, like:


$ cvlc -v v4l2:///dev/video0:chroma="H264":width=1024:height=570:fps=30 \
       --sout="#rtp{sdp=rtsp://:8554/live}" --rtsp-timeout=-1

Without this option streaming only lasts for 60 seconds by default.




I have been trying to do the same thing and I got the same error. I believe I was using GStreamer 1.0.6.


What I found, possibly even thanks to Fergal Butler's answer, was the following page:




Here Youness Alaoui describes the uvch264_src element he made to bring H264 camera support to GStreamer.


He describes the port to GStreamer 1.0 as pending in his article. So over the last week I've been looking into this. It turns out that it has now been ported to GStreamer 1.0, but only in a developer release (Version 1.1.2).


You can get version 1.1.2 here:




It's called "uvch264src" now, and it's a part of gst-plugins-bad. I think it is also present in version 1.1.1 but I haven't really looked into that.


I had a bit of a hard time getting it installed, I think mostly due to conflicts with GST 1.0 packages installed on my PC (so my own fault). But note that it has dependencies on libgudev-1.0-dev and libusb-1.0-0-dev, so install these packages first - it took me a while to work out it was those two I was missing.


Here is a pipeline I got to work which uses uvch264:


gst-launch-1.0 uvch264src device=/dev/video0 name=src auto-start=true src.vfsrc ! video/x-raw, format=YUY2, width=160, height=90, framerate=5/1 ! xvimagesink src.vidsrc ! queue ! video/x-h264, width=800, height=448, framerate=30/1 ! h264parse ! avdec_h264 ! xvimagesink


If you don't want to use the preview video (from the vfsrc pad) just hook src.vfsrc straight up to a fakesink. I should also mention that even though this pipeline is working for me, I get a lot of warnings about "Got data flow before segment event". So obviously I'm not doing something right, but I'm not sure what.
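That variant might look like the following sketch (my assumption based on the pipeline above, with the vfsrc branch routed to fakesink; the command is echoed rather than executed, since running it needs the C920 and GStreamer >= 1.1.2):

```shell
# Same uvch264src pipeline as above, but the preview (vfsrc) branch goes to
# fakesink, so only the H.264 (vidsrc) branch is decoded and displayed.
CMD='gst-launch-1.0 uvch264src device=/dev/video0 name=src auto-start=true \
  src.vfsrc ! fakesink \
  src.vidsrc ! queue ! video/x-h264, width=800, height=448, framerate=30/1 \
  ! h264parse ! avdec_h264 ! xvimagesink'
echo "$CMD"
```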


Anyway, after all of that messing about getting 1.1.2 and uvch264src completely installed and working, I decided to give v4l2src a quick go again. And it turns out that v4l2src supports H264 properly after all :/. (See the short answer.)


Short Answer:

So the short answer to your question is that if you are happy to install 1.1.2 from source you'll be able to do exactly what you want in the same way you've been trying to do it. You shouldn't need uvch264src. I've tested your pipeline and it worked fine with my installation. I've also tried this simple pipeline, to display the video on-screen, and it worked fine for me as well:


gst-launch-1.0 -e v4l2src device=/dev/video0 ! video/x-h264, width=800, height=448, framerate=30/1 ! avdec_h264 ! xvimagesink sync=false




I don't believe v4l2src supports h264 at the moment. See here:




and here:






Try using videoconvert to automatically convert the video to a format understood by the video sink:


gst-launch-1.0 -vvv v4l2src device=/dev/video0 ! videoconvert ! ...



I've also got a Logitech C920 camera, and have used the following pipeline to record H.264 video from the camera:


gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-h264,width=1280,height=720,framerate=30/1 ! mpegtsmux ! filesink location=output.ts

This asks the camera to produce H.264 data, which I then mux into an MPEG transport stream container and write to disk. I can play the resulting file successfully with Totem.


The above pipeline records at 720p. The camera can also record at 1080p if you change the requested format to width=1920,height=1080.
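The 1080p variant would then be the same pipeline with only the caps changed (sketched here with the command echoed rather than executed, since running it needs the C920 on /dev/video0):

```shell
# 1080p version of the recording pipeline above: only width/height differ.
WIDTH=1920
HEIGHT=1080
echo "gst-launch-1.0 v4l2src device=/dev/video0 ! \
video/x-h264,width=$WIDTH,height=$HEIGHT,framerate=30/1 ! \
mpegtsmux ! filesink location=output.ts"
```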



